
Monday, 7 January 2013

CSS is the reason your website sucks

 CSS (Cascading Style Sheets) has been something I have been putting off learning while building and designing websites so far. Honestly, that has probably been the biggest error of judgement on my part yet. I spent about two weeks fiddling around trying to make semi-cool-looking frames before I realised that anything not relating directly to content is really the sole responsibility of CSS. If anyone is thinking of taking up website design (I highly recommend it), start with the basics of HTML, then move on to learning CSS as quickly as possible; it may save you about two weeks of your life.



 I will be posting how to make a basic CSS sheet that lets you create a visually impressive menu bar and a main frame on which to place your content. It took me far too long messing around in HTML to get anywhere close to what would be considered a pretty old-school-looking site; then I spent one hour this morning reading about CSS on the train into work and figured out how to easily do what I had been racking my brain over for the last few weeks.

 Even though computing is frustrating, when you reach those 'aha!' moments it all becomes worth it, so hang in there and keep fighting the good fight.



Sunday, 28 October 2012

API - Application Programming Interface

API is a term I have seen and heard a million times throughout my travels into the world of computer science, and I have always had only a vague idea of its definition. I understood it mostly to be a kind of go-between for the user and a specific type of program.

 Turns out I wasn't too far from the truth. An article I was reading recently summarised what an API is really well, so I thought I'd share it with you.

 Essentially, an API is a set of pre-defined rules that programmers have to follow if they want their code to work with a certain application. For instance, if you were on the web and wanted to view a PDF file, you would need a 'plugin' that allows you to read PDFs within your browser. The programmer constructing that plugin, however, would have had to adhere to the rules set by that browser's API to create a functioning plugin.

 APIs are literally everywhere. If you're using Windows right now, you may notice how all the menus and boxes that come up have pretty much the same style of interface. That is because Windows itself has a general API, and again, programmers wanting to create programs for Windows have to adhere to the pre-defined rules of that API, which gives all programs developed under it a similar feel.
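To make the idea concrete, here's a rough Python sketch of an API as a contract; the names BrowserPlugin, PdfPlugin and render are invented for illustration and don't belong to any real browser:

```python
from abc import ABC, abstractmethod

# Hypothetical sketch: a browser's "plugin API" as a set of rules.
class BrowserPlugin(ABC):
    """The 'API': every plugin must provide a render method."""

    @abstractmethod
    def render(self, data: bytes) -> str:
        """Turn raw file bytes into something displayable."""

class PdfPlugin(BrowserPlugin):
    """A plugin author adheres to the rules to get working code."""

    def render(self, data: bytes) -> str:
        return f"rendered {len(data)} bytes of PDF"

plugin = PdfPlugin()
print(plugin.render(b"%PDF-1.4"))
```

The point is that any plugin which doesn't follow the contract (here, implementing render) simply won't work with the application.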





 So the next time you are considering writing code, make sure you are aware of the API of the particular system you are trying to manipulate.

Tuesday, 23 October 2012

Facebook Group for all self-teachers

There has been talk for quite a while about the need for a centralized group where everyone who wants to get involved in teaching themselves computer science can meet and discuss progress, tips and tricks, etc.

 So I took the liberty of setting up a quick Facebook group that serves that very function.




 Join up, post articles, suggestions, wisdom.

 Share the wealth.



Monday, 22 October 2012

How To Become a Hacker

An amazing article was posted on the 420chan thread; it discusses the mindset behind being a hacker and gives you a good insight into why most people bothered with computing in the first place. This has definitely rekindled the fires of learning and has given me another positive reason to get heavily involved in the computing community.

Hopefully this will do the same for you.



As far as personal progress goes, I have started to put together a diet and fitness regime that creates optimal conditions for the brain to function effectively. I am writing up the best tips and tricks I have learned recently and will be posting them here soon enough.

 On a final note, there has been a lot of talk regarding an online meeting place where we can also go and share info, progress, etc.; in essence, a hacking circle. I will be creating a Facebook group and posting the link here and on the 420 thread, so if you are interested in joining up and sharing your experience, watch this space.
 (Just Posted a link in the latest article, in case you missed it: https://www.facebook.com/groups/360250364063011/)


Wednesday, 15 August 2012

Introduction to Computer Memory

 Got an interesting link sent to me from a friend concerning computer memory; apparently this text is THE text for understanding computer memory physically and theoretically. Obviously, with memory being one of the most integral parts of programming, I feel this has to jump to the top of my reading list.






 As I am currently following the lecture series, I will start posting updates on what I am picking up from this text. From the first few pages I have read already, just 'flicking through', I know it is definitely worth my while and will give me a great advantage in the future when I begin to practice assembly language and any other low-level tinkering I may have planned.

:)

First On-line Lecture. (9,996.5)

Managed to get my discipline together a bit more today and watched the first lecture in the series I promised I would watch daily. If you don't know what I am talking about, (click here).


 Okay, so firstly, the lectures are a little bit dated but have a very cool old-school feel to them that makes you feel as if you were there near the beginning of the birth of the modern-modern computer (I'm talking post Windows 95). The first lecture concentrates on introducing you to Lisp, not in a traditional 'declarative' way (picked that word up from the lectures; Gates watch out..) whereby the lecturer basically runs through the various commands and how you link them up; rather, the theme of this lecture is understanding what's inside each process that each operator calls.

 The lectures are filled with useful quotes that help put computing into perspective for the programmer. For instance, the lecturer makes very clear at the beginning that computer science shouldn't really be considered a science, or even considered to be about computers, as that takes away from the processes involved and focuses too heavily on the tools you are using. I feel this point does have some weight to it, as he elaborates on it throughout the duration of the lecture and basically leaves you understanding that computing is not really about the fancy programs you get to play with right now; computing for the computer scientist is about engineering an art form: taking something that is already magical and beautiful and making it even more so.

 Anyway, enough about the lecturer's philosophical input about what he feels computing to be; what is there to learn from the first lecture in this series? Well, enough to realise that if I stick with this and complete the series I will understand computing and programming in a seriously deep way. From all the dabbling I have done before in programming, I understood more about what was actually going on from this hour-long introductory lecture than I have done in a whole year of faffing around.



 The main concept the lecture introduced was this idea of 'abstraction': the way you express something in programming has to be understood fully by you for you to be able to do it efficiently. The lecturer went on to explain what he meant by this by creating a squaring function from scratch in Lisp. The details of how to do this are not important, as he shows you many methods of producing the same result, and even a mini program designed to 'guess' the square root of any number.

 The key point to take away is the basic structure of the code.
 He gives this example and then elaborates on it, as I will with a running commentary.

(+ 3 17.4 5)
Lisp uses a system known as prefix notation, which means the operator is written to the left of the operands.
 A simple mathematical operation in Lisp: the plus sign (+) is the operator, the numbers to be added are the operands (3, 17.4 and 5), and the parentheses enclosing the operator and operands form what is known as a combination.
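Just to illustrate the idea (this is a Python sketch I put together, not real Lisp), prefix notation can be mimicked with nested tuples where the operator always comes first:

```python
import operator

# A rough Python sketch of prefix notation: the operator comes first,
# followed by any number of operands, and the whole parenthesised
# group is one combination.
OPS = {"+": operator.add, "*": operator.mul}

def evaluate(expr):
    """Evaluate a nested tuple such as ("+", 3, 17.4, 5)."""
    if not isinstance(expr, tuple):
        return expr                      # a bare number is its own value
    op, first, *rest = expr
    result = evaluate(first)
    for operand in rest:                 # fold the operator across the operands
        result = OPS[op](result, evaluate(operand))
    return result

print(evaluate(("+", 3, 17.4, 5)))
```

Combinations can nest, just as in Lisp: evaluate(("*", 2, ("+", 1, 2))) works the inner combination out first.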

He then explains the method of developing your own code, or in this case a squaring function.

(DEFINE SQUARE (LAMBDA (x) (* x x)))

Now, I'm not 100% sure on the definition of LAMBDA, but I have made a note to re-edit this section for clarity; until then I will just explain the point the lecturer was making by showing us this operation.

The initial call, 'DEFINE', requires us to create a symbol we want defining; in this case, as we are squaring numbers, the symbol was 'SQUARE'. The next stage of this definition required us to call a procedure that allowed us to define, using an 'argument', what we wanted our definition to stand for. This is where the term 'LAMBDA' comes in, and not knowing it did stump me a little bit, but it is not impossible to understand the essence of what is going on, which was the purpose of this lecture. So, with a procedure called with an argument named 'x' (what this essentially means is that a value known as x will be entered, and when it is entered something must happen to this value due to the call SQUARE being used), a result has to be 'returned' to give purpose to the new definition. The final section of the code, '(* x x)', requires x to be multiplied by itself (or squared, as it is known) to complete the process.
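For anyone more comfortable with Python, here is a rough translation of what I understand DEFINE and LAMBDA to be doing; Python's lambda also builds a nameless procedure, and the assignment then binds a name to it:

```python
# A rough Python translation (not Lisp): lambda constructs an anonymous
# procedure taking x, and the assignment binds the name 'square' to it,
# just as DEFINE binds the symbol SQUARE to the LAMBDA procedure.
square = lambda x: x * x

# The equivalent longhand definition:
def square_longhand(x):
    return x * x

print(square(5))
```

Both forms return the same result; the lambda version just makes the "build a procedure, then name it" two-step explicit.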


:)


 Again, as with all articles, I will touch them up to make them as perfect as possible as my understanding of the concepts matures. Thanks again for your patience. Also, if you can help clarify any of the information I am trying to re-teach people, please leave me a comment and I will be more than happy to fix the areas where I have gone wrong; it's the reason I'm here.

 I did about 1.5 hours of actual constructive work today, hence why I have moved from 9,998 to 9,996.5. It seems this process is going to balance on my fight with my discipline: my mind is filled all the time with the want to learn computing, but there is a huge part of me that hides away from tackling it, who knows for what reason. Probably fear of not understanding something, of being a failure unto myself and shattering the image I hold of myself as able to tackle anything with enough effort, which, truth being told, would be a devastating blow. It is, however, my sincerest intention to reach my goal of hitting a PhD in 10 years. I'm just off to a slow start. Again, thank you for your patience.



The Von Neumann Model


The man you see before you is John von Neumann, an early computing pioneer. From what I've read so far I can establish that this guy is a pretty big cheese and is credited with the general model that computers are constructed on today. I plan to do an outline history of computing soon with all the major players and their contributions, so if you were expecting an interesting article on all of von Neumann's contributions, you will have to wait till I write that, as this article is actually just an introduction to his 'model'.


John von Neumann

As far as writing about von Neumann's life goes, my knowledge of him is pretty slim at this moment in time; I do plan to do an in-depth piece on him in the future. For now, I am more concerned with his contribution to computer science and what that contribution means to me and you.

 Essentially, von Neumann established a logical and effective model that computers should follow. This 5-part model describes the form your computer takes underneath all of its hardware and shiny cases.

 Von Neumann states:

 There are five major components within the computer system:

 The first is an Input Device
              - This element sends data and information to the system, which is then stored in the next component.

 The Memory Unit
              - The instructions and data stored here are then processed by the next component.

The Arithmetic Logic Unit (ALU)
             - The operations carried out within the ALU are carefully guided by the next component.

The Control Unit
             - Which directs the ALU's work and sends the results on to the final component.

The Output Unit
             - Which would be your monitor or your printer.
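To picture the five components cooperating, here is a toy Python sketch I put together; the three-instruction 'machine' (LOAD/ADD/OUT) is invented purely for illustration, not any real instruction set:

```python
# A toy sketch of the von Neumann components working together.
memory = [("LOAD", 3), ("ADD", 4), ("OUT", None)]  # memory unit holds the program
accumulator = 0                                     # a register the ALU works on
output = []                                         # stands in for the output unit

pc = 0                                              # the control unit's program counter
while pc < len(memory):
    op, arg = memory[pc]                            # control unit fetches an instruction
    if op == "LOAD":
        accumulator = arg                           # data enters from the input side
    elif op == "ADD":
        accumulator += arg                          # the ALU does the arithmetic
    elif op == "OUT":
        output.append(accumulator)                  # the result goes to the output unit
    pc += 1                                         # control unit steps onward

print(output)
```

Instructions and data live together in the memory unit, and the control unit walks through them one at a time, which is exactly the shape of the model above.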


 This simplified breakdown of what a computer is made up of makes it a bit easier to grasp harder concepts in computer architecture, as it allows you to picture in your head the basic outline of the computer's structure. This is, however, an unfinished model, and if we were to add a bit of extra detail to make it a little more complete and a little more accurate to today's computing systems, we would have to mention the System Bus Model.

  As mentioned previously in another article (here), a bus is basically a method of quickly transporting data from one component to another. The system bus model essentially partitions a computer into three sub-units:
 CPU / MEMORY / I/O
These three sub-units are the five units established in the von Neumann model, only grouped by their purpose. This refinement of the von Neumann model combines the ALU and Control Unit into one functional unit, the CPU. It also combines the Input and Output units into a single I/O unit.

 The system bus links all these components on a shared pathway made up of:
The Data Bus - Carries the information being transmitted
The Address Bus - Identifies where the information is being sent
The Control Bus - Describes aspects of how the information is being sent and in what manner
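A rough Python sketch of a single transfer, with made-up addresses and values, just to show what each of the three buses contributes:

```python
# One transfer over the three buses: the address bus says where, the
# data bus carries what, and the control bus says how (read or write).
memory = {}

def bus_transfer(address_bus, data_bus, control_bus):
    if control_bus == "WRITE":
        memory[address_bus] = data_bus   # store the data at that address
    elif control_bus == "READ":
        return memory[address_bus]       # fetch the data back out

bus_transfer(0x10, 42, "WRITE")          # the CPU writes 42 to address 0x10
print(bus_transfer(0x10, None, "READ"))
```

The dictionary stands in for the memory sub-unit; the point is only that every transfer needs all three pieces of information at once.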

:)


-As I have mentioned previously, I will update older articles as my understanding of the concepts within them develops, so if you find this a rather brief introduction to von Neumann and the basic form of the computer, do not worry; I am learning all the time and will update when I have found anything considerable to add. If you have anything else you believe to be worthwhile, drop us a comment and I'll add it in. Thanks.

Tuesday, 14 August 2012

The Article that started it all.



If you enjoy what is being attempted, the link below is the initial page where I got the idea and the logic behind the 10,000 hours.





In case anyone was wondering what the hell has happened, I've hit a major discipline low and have spent the last three days literally fucking around. I'm still on my 9,998 hours and still struggling to understand that floating point definition.

 I am currently sucking hard. The intention is strong the discipline is not. 

I want this, I really do.




I'm going to start listening to this wolf.

Friday, 10 August 2012

Advice from a Computing Professional


Not much to report by way of personal progress; I've been giving in to my weak discipline again and playing Starcraft and DayZ all day. I wasted a potentially good day of learning, so my technical hours still stand at 9,998/10,000, which after three days is terrible and a testament to the dangers of not heavily disciplining your mind and your body when attempting any kind of 'me vs. the odds' type quest. Quite frankly, it's depressing to go from being so eager to, for no conceivable reason, being a lazy waste of time. Oh well, this day is gone and I have tomorrow to redeem myself. Gotta be up really early for work, so I plan to get some crunching of computer architecture in before work and then hit it hard after work; maybe start exercising as well, as I can't be ignorant of the facts surrounding the saying 'healthy body, healthy mind'.

 Sorry to disappoint you if you came today expecting something worth your while from me; I have such terrible discipline. I suppose my crap discipline makes this quest all the more epic, if I ever finish it. For now, anyway, some advice from someone who managed to get his head down and work his way up to a good position in a company.

 Valuable advice if you are planning on entering into a computing career.

Enjoy -

"I've been hanging around here for a while and it seems to me like there's a lot of people on here who are still in school or are in the early stages of a career as a developer. I thought it would help some of you guys to have a thread where you can get the perspective of a long time software development leader about what we look for when hiring, promoting, etc.

As far as my credentials go, I won't say who I work for just that it's a massive company. I manage a team of 105 programmers working across ~40 project teams. Based on lines of code written my teams work in HTML/CSS/JavaScript, PHP, C#, Java and Python most often, with a bit of F#, Ruby and a few others I'm probably forgetting in there. I'm a 15 year vet, the majority of my team are guys who are just out of college or have a few years experience.

That said, here's my top 3 things you can do to get and keep a job:

1) Be Language Agnostic
When I'm hiring there's a 50% chance that I don't REALLY care what languages you've written in before, just that you're familiar with the language I need you in and can get up and running in general. Since most of our projects are short turn around items, onboarding takes a long time relative to how long the project will last (e.g. 3 weeks of onboarding on a 6 month project). Also, be flexible... I can't tell you how many college kids I just fucking walk out of my office because they tell me all about how Lisp is the greatest language ever invented and we're wrong to be using anything else, which brings me to point 2

2) Be Humble
That kid who tells me we should be using Lisp is wrong. You know how I know he's wrong? Because MY TEAM IS SUCCESSFUL. Again, I can't tell you how shockingly shitty most young guys act in that first interview. Obviously once you're on the team if you think we should switch something I ABSOLUTELY want to hear your idea but make sure it makes sense (and is demonstrably better) and don't get all butthurt if I don't agree. We work based on what the developers pitch to me and we decide as a group is the right play, which backs me into point 3

3) Remember that you're a fucking unicorn
You are the aberration here, your non technical managers, bosses, finance people, HR people, NOBODY in the company understands what the fuck it is you do. You may as well be named Merlin to these people. My job (to crib a line from Jay Mohr) is to not let management spook the thoroughbred. Your part in this is to be that thoroughbred AT ALL TIMES and to remember that a thoroughbred just KNOWS that it's a thoroughbred, when that belief is strong enough, other people will get it naturally. Carry yourself like a boss and you'll be a boss."



Remember, be the Unicorn :)

Thursday, 9 August 2012

A little list (9,998)

Okay, so about 2 'official' hours in, hence the 9,998. Still feeling good about doing this.

 I'm currently coming to the end of How Computers Work by Ron White, so I decided to create a little list of a few terms that have always confused me when messing with my PC.

 As I haven't ever really got into the nitty-gritty of my computer before, a lot of these terms are internet-based, as that's where I've had the most problems and run-ins with requests asking for things I didn't even realise my computer had.

(Also if any of these are wrong or just explained incorrectly do let me know and I'll do my best to fix them, enjoy)

 Firstly something so important to the performance of your pc.

 RAM

 *N.B. there is RAM and ROM (RAM is writeable, ROM is not; I'll explain later)
 So yes, RAM: Random Access Memory. RAM is a collection of microchips which the computer uses to store data and programs whilst it uses them. Writing is the process by which the computer stores data in RAM (reading, out of interest, is when the computer transfers data or software code out of RAM). The bigger the RAM capacity, the better.



A picture of a ram, not a picture of ram

Clock speed and 'Overclocking'

 Having browsed YouTube numerous times, I occasionally stumble across random videos of some computer guy overclocking his PC and showing various results, none of which I appreciated at the time. Essentially, what 'overclockers' are doing is increasing the computer's speed by 'hacking' its clock. The computer's clock is a microchip that regulates the timing and the speed of all computer functions. It does this by passing an alternating current through a quartz crystal, which responds with its own natural resonance (vibration), creating in turn another alternating current that is then sent through the computer, allowing processes to occur to the quartz rhythm. Where it gets cool is that these crystals oscillate at a frequency measured in MHz (my physics knowledge is very sub-par at the moment), which I had to find out means a million times a second. Now, if you look at your computer's specs, you will find you have a processor with a speed above 300 MHz, sometimes into the GHz range, which means your processor can work at a speed of up to whatever frequency you have. Again, the higher the number, the better.
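The arithmetic behind that is simple enough to sketch in Python: one megahertz is a million cycles per second, so the time one cycle takes is just the reciprocal of the frequency:

```python
# Nanoseconds per clock tick at a given frequency (in Hz).
def cycle_time_ns(frequency_hz):
    return 1e9 / frequency_hz

print(cycle_time_ns(300e6))   # a 300 MHz processor
print(cycle_time_ns(3e9))     # a 3 GHz processor
```

So a 3 GHz chip fits ten clock ticks into the time a 300 MHz chip takes for one, which is why the higher number is better.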



Bus

 I've found with computing that the really complicated things that seem to be named really badly are actually named really well, sometimes quite comically, by the people that created them; you just have to understand what they do. (This reminds me, I'll do a post on getting to know computing jargon soon, as it's a very jargon-heavy subject.) One such computing component is the bus. There are buses everywhere in your computer, and they are what they sound like: a passenger vehicle, only the passengers in this instance are bits of data, not people. Essentially, buses are the circuitry and chips that manage the transfer of data from one device to another. The USB (Universal Serial Bus) drive you use is a bus; well, it has a bus on it that connects whatever memory device is attached to it to the computer hardware you just plugged it into.




DSL

 Considering you are most probably connected via DSL right now, or definitely have been in the past, this is one of those things you come across every day without having any idea what it is. Luckily, it's not complicated at all; it's more of a business acronym than a computing one. Digital Subscriber Line: what this means is that the 'line' you are using to connect to the internet right now is part of a digital circuit between your residence and a telephone company's central office, allowing you access to high-speed data transportation via existing twisted copper telephone lines. (Technology has moved on a little bit since DSL, but it's still very relevant to most of us, including me.)


A Port

Similar again to a bus (in that it's a metaphor), a port means any place in the computer where data is transferred; generally, a place where one device, usually represented by a cable or wire, is physically connected to another device. Imagine boats in a port: they bring something when they dock. There are two types of port, a serial port and a parallel port. The serial port only allows one bit of information to be sent at a time, because only one wire or path is used for data, whereas parallel ports allow for several bits of data, usually at least 8 bits, to be sent simultaneously.
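A little Python sketch of the difference (simulated, obviously, not real hardware): the same byte takes eight transfers down a serial line but only one transfer across a parallel one:

```python
# Break a byte into its 8 bits, most significant first.
def to_bits(byte):
    return [(byte >> i) & 1 for i in range(7, -1, -1)]

def serial_send(byte):
    return [[bit] for bit in to_bits(byte)]   # 8 transfers, one wire each

def parallel_send(byte):
    return [to_bits(byte)]                    # 1 transfer across 8 wires

print(len(serial_send(0b01000001)))           # transfers needed serially
print(len(parallel_send(0b01000001)))         # transfers needed in parallel
```

The same bits arrive either way; the parallel port just trades extra wires for fewer transfers.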


TCP/IP

 I must have seen this acronym a million times and never paid it any attention, not realising that the only way I'm actually connected to the net right now is through TCP/IP. Transmission Control Protocol/Internet Protocol: it is actually a collection of methods used to connect servers on the internet and to exchange data. TCP/IP is a universal standard for connecting to the net.

IP address

This goes hand in hand with TCP/IP; your IP address is an identifier for a computing device on a TCP/IP network. Networks using TCP/IP route messages based on the IP address of the destination. The format of an IP address is a 32-bit numeric address written as four numbers separated by periods. The numbers are between 0 and 255.
 Looking something like this.

XXX.XXX.XXX.XXX

 There are a million and one ways to find out your own IP address; (click here) to let a website tell you yours.
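If you're curious about the 32-bit structure behind the dotted format, here is a small Python sketch showing how the four numbers pack into one 32-bit value; the example address is arbitrary:

```python
# Each of the four dotted fields is one byte (0-255), so the whole
# address is four bytes packed into a single 32-bit number.
def ip_to_int(ip):
    a, b, c, d = (int(part) for part in ip.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

def int_to_ip(n):
    return ".".join(str((n >> shift) & 0xFF) for shift in (24, 16, 8, 0))

print(ip_to_int("192.168.0.1"))
print(int_to_ip(ip_to_int("192.168.0.1")))
```

This is also why each field tops out at 255: that's the biggest value one byte can hold.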





Rasterizer

 Something I've always wondered about but again never bothered to find out; a little bit unrelated to the general theme, but interesting nonetheless. A rasterizer is the software used in games that translates the 3D geometry of 3D objects into a two-dimensional bitmap that can be displayed on your 2-D screen.
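Just to sketch the first step of that translation in Python (a bare-bones perspective projection, nothing like a full rasterizer, and the focal length is an arbitrary assumption):

```python
# Project a 3D point onto a 2D screen plane: divide by depth, so
# farther objects land closer to the centre. Real rasterizers then
# fill whole triangles of such points into the bitmap.
def project(x, y, z, focal_length=1.0):
    return (focal_length * x / z, focal_length * y / z)

print(project(2.0, 4.0, 2.0))
```

Doubling z (moving the point twice as far away) halves both screen coordinates, which is the 3D-to-2D effect the rasterizer starts from.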




Wednesday, 8 August 2012

The Art of Assembly Language

As I mentioned previously, I am going to start at, well, the start, which in this instance happens to be assembly language. After having a snoop around the net for some direction, it seems that the holy grail of assembly language knowledge is a massive book called 'The Art of Assembly Language'. (Link below.)

 I had a quick flick through and have determined that it is in fact exactly what I was looking for: massively out of my depth, but introduced well, and it progresses relatively fast. It seems well written and understandable enough to really push myself and break those barriers of mental stagnation. Currently I am still finishing Ron White's book, as I detailed in a previous post, but once that is done I am going to start some practical HTML learning and spend at least 3 hours every day making my way through The Art of Assembly Language.



 Wish Me Luck.





Link To the Art of Assembly Language website:
 http://www.plantation-productions.com/Webster/www.artofasm.com/index.html


10,000 Hours left to go


"Since most of you are unemployed or students, you can easily spend 50 hours a week working on your skills in the field of your choice. If you did this, four years from now (e.g. the length of a college degree) you would be an expert in your chosen field. Literally, you would be ready to step up and become the next Steve Jobs, or Bill Gates, or crap out a PhD in mathematics, or be an awesome musician, or whatever.

Instead, you will spend the next 4 years racking up 10,000 hours of practice browsing internet forums. (Some of you will get the hours done in only two years). Think about what the last four years of that got you.

Now, what's your excuse for wasting your human potential?"



 An epic post I found the other day that really encouraged me to do this.



 As of today I will set a 10,000 hours milestone, tallying my accumulated hours each day, and put to the test the authenticity of the 10,000 hours mastery theory.


Tuesday, 7 August 2012

Day 1: Setting up a PoA (Plan of Action)

 Had a few very informative responses to the posts on 420chan about recommendations for someone wanting to pursue a life mastering computing. Much appreciation for those responses.

 I found a timeless advice piece from a computing blogger who used to be a student of computing at Yale. He offers a top-7 list of personal recommendations for anyone wanting to study computer science at a top level.

http://www.joelonsoftware.com/articles/CollegeAdvice.html

His site is also very interesting to have a look around; I will probably have another, more serious look at it when I have a bit more time.

 I'm still making my way through 'How Computers Work' by Ron White but have begun mapping out my PoA. Based on advice given in the posts (thanks again for that, guys; especially big thanks to 'Basil Chuttingneg'), I have decided to start my focused learning with assembly language.

 The reasoning behind this is :

  • It gets to grips with the 'real' computer part of computing
  • I will eventually be hindered by not knowing enough about assembly language and how compilers interact with low-level computer processes, so I am going to hit this hard now to make it a smoother ride later
  • It will eventually give me a massive advantage when developing my skills in 'C' and similar languages, due to a proper understanding of how their functions interact with the I/O devices of the computer


Obviously, nobody learns an assembly language overnight, so instead of focusing solely on assembly language I've decided to make a 5-hour 'practice' period out of every day. Within these 5 hours I intend to spend 3 hours learning assembly language from the ground up and 2 hours gaining practical experience in something instantly usable, such as XHTML and CSS.

 I will be starting out by re-doing the Bucky tutorials on XHTML and CSS, so expect a few posts about that coming soon.



Again guys, thank you very much for the pointers so far. If you have any more advice or any personal experience you'd like to share, I'll post the links to the 420chan threads below.
http://boards.420chan.org/prog/res/24050.php   /PROG/
http://boards.420chan.org/tech/res/89403.php    /TECH/
http://boards.420chan.org/howto/res/30980.php /HOWTO/

Friday, 6 January 2012

Human reasoning errors

 Having been quite an introspective person, I naturally began reading into how to improve cognition; basically, how to make my mind more efficient. The main method, through both experience and 'fact', is meditation (I use the term fact loosely, as it is in part based on borderline 'pseudo-scientific' claims and theories). I could rant and rave all day about the therapeutic benefits of meditation, but this is a blog about computing, so I'd recommend you have a look yourself; I think it helps sort your head out in all kinds of good ways.

 Back to the topic at hand: human reasoning errors. What I mean by this is how we make those super-fast judgements and find a lot of them to be massively wrong and some of them to be completely correct, with no seeming order to this process. So I went on an internet adventure exploring what is really going on when these weird processes happen with no apparent conscious control. It turns out, conveniently enough, that the mind is very similar to a computer in that it stores data based on past experience and replays it automatically to fit similar situations in the present. Sometimes this saves time; sometimes it doesn't. This process is studied under a subject known as heuristics.

An incredibly interesting article explaining the reasoning behind heuristics and cognitive biases:



 A quick summary of this article: essentially, the human brain has so much sensory input to handle that it has its own coping mechanisms which aid it in its daily function. The mechanisms are based, so we believe, on a kind of complex association system in which a certain action (e.g. opening a door) is stored in long-term memory, and when a similar scenario is encountered again, for example reaching another door, the mind automatically assumes that this door must be like the 'door' object of previous encounters, so the same principles are assumed to apply and the same action is therefore executed. Whilst this system is very economical and efficient, it obviously has numerous limitations.


 An example demonstrated in the text is a group of people asked to work out a sum by way of a 5-second guesstimate. The example shows just how readily we engage our brains to apply previous rules and assumptions to a problem.

 The Question:
    Consider the product of the series:

9 x 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 = ?
vs.
1 x 2 x 3 x 4 x 5 x 6 x 7 x 8 x 9 = ?

 The typical answers: 

9*8*7*6*5*4*3*2*1 'usually comes out at an average of 4000'
however,
1*2*3*4*5*6*7*8*9 'usually comes out at around 500'

The actual answer is 362,880.
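For anyone who wants to check that answer, the product is quick to verify in Python, and it comes out the same whichever direction you multiply:

```python
import math

# The product of 1 through 9 does not depend on the direction of
# multiplication; only our 5-second guesses do.
descending = 9 * 8 * 7 * 6 * 5 * 4 * 3 * 2 * 1
ascending = 1 * 2 * 3 * 4 * 5 * 6 * 7 * 8 * 9

print(descending, ascending, math.factorial(9))
```

All three come out equal, which makes the gap between the two guessed averages purely a quirk of how the question is presented.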

This very basic example shows just how the association function is sometimes limiting and how easy it is to engage it.
The article also explains the different categories of heuristic failure, and when and how they are limiting: examples such as logical assumptions and ignorance of statistics due to emotive association (terrorism vs. cancer).

 Worth a re-read every now and then to avoid genericism in problem solving, as it will occasionally defeat you.
Shown below is the cognitive process by which people are encouraged to avoid bias.



 If this doesn't make immediate sense, consult your mind in an introspective manner, or read the article.

Thursday, 5 January 2012

Creating the foundations

I want to become a certified Computing Master.

I want to become a certified Computing Master because I am currently a very underachieving, undisciplined, average Joe who has a decent enough mind to understand everything I apply it to but consistently wastes his potential, and has no shortage of people telling him he is doing so. I decided I need to break my pointless, cyclical lifestyle and have made the choice to do something that will make me or break me as a human. (Well, maybe not break me physically, but you get the idea: something that, if I don't reach it, will make me a serious failure to myself.)

So I had a look around the world I live in, picked one of the hardest yet most useful skills available to me, and decided, as both a personal and spiritual quest, to accomplish something that right now would seem impossible: earning a PhD in Computer Science within the next decade. Whilst in ten years this may not seem like an impossible challenge, for me, being a lazy, normal guy who was never considered anything special academically, this is a big mountain to climb.

I feel I have the motivation, in that I have been feeling pumped to do this for at least a year (an example of how bad my discipline is: thinking about something and wanting something for a whole year without ever properly acting on it). I have decided now to kick myself into gear and start doing it.

Obviously I am aware of time and realise that to achieve something like this I need to put every spare hour under the sun to productive use if I am to be anywhere near my goal within the time limit set. I need a big plan, like a checklist of everything developed so far in the field, so that I don't just 'get a PhD' but really contribute something incredibly useful to the human race as a whole.

To be a certified Computing Master.




So I am going to develop this blog as a progress blog, documenting where I am at and where I am going, and summarizing all the knowledge gained so far on my journey. I also want it to be another example to people that extraordinary people are just normal people who never gave up.

 I like that last line; think I might get it tattooed on my face.