Wednesday, 15 August 2012

Introduction to Computer Memory

 Got an interesting link sent to me by a friend concerning computer memory; apparently this text is THE text for understanding computer memory, both physically and theoretically. With memory being one of the most integral parts of programming, I feel this has to jump to the top of my reading list.






 As I am currently following the lecture series, I will start posting what I pick up from this text as well. From the first few pages I've read already, just flicking through, I know it is definitely worth my while and will give me a great advantage in the future when I begin to practice assembly language and any other low-level tinkering I may have planned.

:)

First On-line Lecture. (9,996.5)

Managed to get my discipline in check a bit more today and watched the first lecture in the series I promised I would watch daily. If you don't know what I am talking about (click here).


 Okay, so firstly the lectures are a little bit dated, but they have a very cool old-school feel to them that makes you feel as if you were there near the beginning of the birth of the modern computer (I'm talking post-Windows 95). The first lecture concentrates on introducing you to Lisp, not in a traditional 'declarative' way (picked that word up from the lectures, Gates watch out..) whereby the lecturer simply runs through the various commands and how you link them up; rather, the theme of this lecture is understanding what's inside each process that each operator calls.

 The lectures are filled with useful quotes that help to put computing into perspective for the programmer. For instance, the lecturer makes very clear at the beginning that computer science is not really a science, and not really about computers either, as those labels focus heavily on the tools you are using and take away from the processes involved. I feel this point has some weight to it, as he elaborates on it throughout the lecture and basically leaves you understanding that computing is not really about the fancy programs you get to play with right now; for the computer scientist, it is about engineering an art form: taking something that is already magical and beautiful and making it even more so.

 Anyway, enough about the lecturer's philosophical take on what computing is; what is there to learn from the first lecture in this series? Well, enough to realise that if I stick with this and complete the series I will understand computing and programming in a seriously deep way. From all the dabbling I have done before in programming, I understood more about what was actually going on from this hour-long introductory lecture than I have from a whole year of faffing around.



 The main concept the lecture introduced was the idea of 'abstraction': to express something in programming efficiently, you have to fully understand how it can be expressed. The lecturer illustrated this by creating a squaring function from scratch in Lisp. The details are less important than the fact that he shows you many methods of producing the same result, including a mini program designed to 'guess' the square root of any number by successive approximation.
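As a rough idea of what that guessing program does, here is my own Python sketch of square root by successive approximation (Heron's method). The function names and tolerance are my choices, not the lecture's:

```python
# Guess a square root by repeatedly improving an initial guess:
# average the guess with x/guess until the guess is good enough.

def improve(guess, x):
    return (guess + x / guess) / 2

def good_enough(guess, x, tolerance=1e-6):
    return abs(guess * guess - x) < tolerance

def sqrt_guess(x):
    guess = 1.0
    while not good_enough(guess, x):
        guess = improve(guess, x)
    return guess

print(sqrt_guess(2.0))  # roughly 1.41421...
```

Each pass through the loop roughly doubles the number of correct digits, which is why the 'guessing' converges so quickly.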

 The key point to take away is the basic structure of the code.
 He gives this example and then elaborates on it, as I will, giving a running commentary.

(+ 3 17.4 5)
Lisp uses a system known as prefix notation, which means the operator is written to the left of the operands.
 This is a simple mathematical operation in Lisp: the plus sign (+) is the operator, the numbers to be added are the operands (3, 17.4 and 5), and the parentheses, together with everything inside them, form what is known as a combination.
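If it helps, here is my own rough Python sketch (not code from the lectures) of how such a combination gets evaluated: the operator comes first and is applied across all of the operands.

```python
# Evaluate a prefix combination represented as a list: the first
# element is the operator, the rest are the operands it applies to.
import operator as op

def evaluate(combination):
    operator, *operands = combination
    result = operands[0]
    for operand in operands[1:]:
        result = operator(result, operand)
    return result

# The equivalent of (+ 3 17.4 5):
print(evaluate([op.add, 3, 17.4, 5]))  # 25.4
```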

He then explains the method of developing your own code or in this case a squaring function.

(DEFINE SQUARE (LAMBDA (x) (* x x)))

Now I'm not 100% sure on the definition of LAMBDA, but I have made a note to re-edit this section for clarity; until then I will just explain the point the lecturer was making by showing us this operation.

The initial call, DEFINE, requires us to create a symbol we want defined; in this case, as we are squaring numbers, the symbol was SQUARE. The next stage of the definition requires us to call a procedure that takes an 'argument' and describes what we want our definition to stand for. This is where the term LAMBDA comes in, and not knowing it stumped me a little, but it is still possible to grasp the essence of what is going on, which was the purpose of this lecture. So: a procedure is called with an argument named x (essentially meaning that a value known as x will be entered, and when it is, something must happen to that value whenever SQUARE is used), and a result has to be 'returned' to give purpose to the new definition. The final section of the code, (* x x), requires x to be multiplied by itself (or squared, as it is known) to complete the process.
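For anyone who, like me, is still fuzzy on LAMBDA: Python has a near-direct parallel, which is how I currently picture it. This is a sketch of my understanding, not the lecturer's explanation:

```python
# (DEFINE SQUARE (LAMBDA (x) (* x x))) in two Python steps:
# lambda builds a nameless procedure of one argument x,
# and the assignment is the DEFINE that gives it the name "square".
square = lambda x: x * x

print(square(5))  # 25

# The more usual Python spelling does both steps at once:
def square_def(x):
    return x * x
```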


:)


 Again, as with all articles, I will touch them up to make them as perfect as possible as my understanding of the concepts matures. Thanks again for your patience. Also, if you can help clarify any of the information I am trying to re-teach, please leave me a comment and I will be more than happy to fix the areas where I have gone wrong; it's the reason I'm here. I did about 1.5 hours of actual constructive work today, hence why I have moved from 9,998 to 9,996.5. It seems this process is going to hinge on my fight with my discipline: my mind is filled all the time with the want to learn computing, but there is a huge part of me that hides away from tackling it. Who knows for what reason; probably fear of not understanding something, of being a failure unto myself and shattering the image I hold of myself as someone able to tackle anything with enough effort, which, truth be told, would be a devastating blow. It is, however, my sincerest intention to reach my goal of hitting a PhD in 10 years. I'm just off to a slow start. Again, thank you for your patience.



The Von Neumann Model


The man you see before you is John von Neumann, an early computing pioneer. From what I've read so far I can establish that this guy is a pretty big cheese and is credited with the general model that computers are constructed by today. I plan to do an outline history of computing soon, with all the major players and their contributions, so if you were expecting an interesting article on all of von Neumann's contributions you will have to wait until I write that; this article is just an introduction to his 'model'.


John von Neumann

As far as writing about von Neumann's life goes, my knowledge of him is pretty slim at this moment in time. I do plan to do an in-depth piece on him in the future; for now, I am more concerned with his contribution to computer science and what that contribution means to me and you.

 Essentially, von Neumann established a model for computers to follow that was logical and effective in its design. This five-part model describes the form your computer takes underneath all of its hardware and shiny casing.

 Von Neumann's model states that there are five major components within the computer system:

 The first is the Input Device
              - This component sends data and instructions into the system, where they are stored by the next component.

 The Memory Unit
              - Holds the instructions and data, which are then processed by the next component.

The Arithmetic Logic Unit (ALU)
             - The operations carried out within the ALU are carefully guided by the next component.

The Control Unit
             - The results of the ALU and Control Unit's work are then sent to the final component.

The Output Unit
             - Which would be your monitor or your printer.


 This simplified breakdown of what a computer is made up of makes it a bit easier to grasp harder concepts in computer architecture, as it allows you to picture in your head the basic outline of the computer's structure. It is, however, an unfinished model, and if we were to add a bit of extra detail to make it a little more complete and a little more accurate to today's computing systems, we would have to mention the System Bus Model.
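To picture the five components cooperating, here is a purely illustrative toy program loop I wrote in Python; the instruction names are made up, not from any real machine:

```python
# Toy von Neumann cycle: memory holds both instructions and data,
# the control unit fetches and decodes each instruction in turn,
# the ALU does the arithmetic, and the result reaches the output unit.
memory = [("LOAD", 7), ("ADD", 5), ("PRINT", None)]  # input already stored

accumulator = 0
for instruction, operand in memory:          # control unit: fetch/decode
    if instruction == "LOAD":
        accumulator = operand
    elif instruction == "ADD":
        accumulator = accumulator + operand  # ALU does the arithmetic
    elif instruction == "PRINT":
        print(accumulator)                   # output unit shows the result
```

The key von Neumann idea on display is that the program itself lives in the same memory as the data it works on.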

  As mentioned previously in another article (here), a bus is basically a means of quickly transporting data from one component to another. The system bus model essentially partitions a computer into three sub-units:
 CPU / MEMORY / I/O
These three sub-units are the five units established in the von Neumann model, grouped by purpose. This refinement combines the ALU and Control Unit into one functional unit, the CPU, and combines the Input and Output units into a single I/O unit.

 The system bus links all these components on a shared pathway made up of:
The Data Bus - carries the information being transmitted
The Address Bus - identifies where the information is being sent
The Control Bus - describes how and in what manner the information is being sent
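To see the three bus lines working together, here is a toy Python model I made up (the addresses and values are arbitrary):

```python
# One bus transfer: the address bus says where, the control bus says
# what kind of operation, and the data bus carries the value itself.
memory = {0x10: 42}

def bus_transfer(address, control, data=None):
    if control == "READ":
        return memory[address]    # value comes back on the data bus
    elif control == "WRITE":
        memory[address] = data    # value goes out on the data bus

bus_transfer(0x20, "WRITE", 99)
print(bus_transfer(0x10, "READ"))  # 42
print(bus_transfer(0x20, "READ"))  # 99
```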

:)


-As I have mentioned previously, I will update older articles as my understanding of the concepts within them develops. So if you find this a rather brief introduction to von Neumann and the basic form of the computer, do not worry; I am learning all the time and will update when I have found anything considerable to add. If you have anything else you believe to be worthwhile, drop us a comment and I'll add it in. Thanks.

Tuesday, 14 August 2012

Time to tick off those hours

I've decided that trying to tackle all of computer science in one go, with no real structure, is incredibly futile for someone like me, so I am going to structure my learning around completing a series of online lectures. Obviously, biting off more than I could chew left me escaping my responsibilities in a bubble of gaming and procrastination, so I am going to ease myself into a good cycle by forcing myself to watch just one lecture a day from a selected series. As I get more confident I will up the pace, but for now I can manage just one video a day.

 As I have stated previously, I am going to start with computer architecture and pad out the theoretical learning with practical experience in assembly language. I have a layman's understanding of computer architecture, but I would still consider myself a complete newbie to this topic area. With everything being essentially new, I am going to take this slow: do one lecture, make sure I fully get it, summarise what I know into an interesting post, then tackle the next lecture.


 The lecture series I am going to follow is an old one from 1986, but it is recommended as a great introduction. It follows the book 'Structure and Interpretation of Computer Programs' (the full book can be found here, for free). The site's description of the lectures is as follows;
- 'These twenty video lectures by Hal Abelson and Gerald Jay Sussman are a complete presentation of the course, given in July 1986 for Hewlett-Packard employees, and professionally produced by Hewlett-Packard Television. '





http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-001-structure-and-interpretation-of-computer-programs-spring-2005/video-lectures/



The Spirit of Computing


An amazing quote about what computing should be for programmers;

``I think that it's extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don't think we are. I think we're responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. Above all, I hope we don't become missionaries. Don't feel as if you're Bible salesmen. The world has too many of those already. What you know about computing other people will learn. Don't feel as if the key to successful computing is only in your hands. What's in your hands, I think and hope, is intelligence: the ability to see the machine as more than when you were first led up to it, that you can make it more.''




Alan J. Perlis (April 1, 1922-February 7, 1990)



The Article that started it all.



If you enjoy what is being attempted the link below is the initial page where I got the idea and the logic behind the 10,000 hours.





In case anyone was wondering what the hell has happened: I've hit a major discipline low and have spent the last three days literally fucking around. I'm still on my 9,998 hours and still struggling to understand that floating-point definition.

 I am currently sucking hard. The intention is strong; the discipline is not.

I want this, I really do.




I'm going to start listening to this wolf.

Friday, 10 August 2012

Advice from a Computing Professional


Not much to report by way of personal progress; I've been giving in to my weak discipline again and playing Starcraft and DayZ all day. I wasted a potentially good day of learning, so my technical hours still stand at 9,998/10,000, which after three days is terrible, and testament to the dangers of not heavily disciplining your mind and body when attempting any kind of 'me vs. the odds' quest. Quite frankly it's depressing to go from being so eager to, for no conceivable reason, being a lazy waste of time. Oh well, this day is gone and I have tomorrow to redeem myself. I've got to be up really early for work, so I plan to get some crunching of computer architecture in before work and then hit it hard after work. Maybe I'll start exercising as well, as I can't be ignorant of the facts behind the saying 'healthy body, healthy mind'.

 Sorry to disappoint you if you came here expecting something worth your while from me today; I have such terrible discipline. I suppose my crap discipline makes this quest all the more epic, if I ever finish it. For now, anyway, some advice from someone who managed to get his head down and work his way up to a good position in a company.

 Valuable advice if you are planning on entering into a computing career.

Enjoy -

"I've been hanging around here for a while and it seems to me like there's a lot of people on here who are still in school or are in the early stages of a career as a developer. I thought it would help some of you guys to have a thread where you can get the perspective of a long time software development leader about what we look for when hiring, promoting, etc.

As far as my credentials go, I won't say who I work for just that it's a massive company. I manage a team of 105 programmers working across ~40 project teams. Based on lines of code written my teams work in HTML/CSS/JavaScript, PHP, C#, Java and Python most often, with a bit of F#, Ruby and a few others I'm probably forgetting in there. I'm a 15 year vet, the majority of my team are guys who are just out of college or have a few years experience.

That said, here's my top 3 things you can do to get and keep a job:

1) Be Language Agnostic
When I'm hiring there's a 50% chance that I don't REALLY care what languages you've written in before, just that you're familiar with the language I need you in and can get up and running in general. Since most of our projects are short turn around items, onboarding takes a long time relative to how long the project will last (e.g. 3 weeks of onboarding on a 6 month project). Also, be flexible... I can't tell you how many college kids I just fucking walk out of my office because they tell me all about how Lisp is the greatest language ever invented and we're wrong to be using anything else, which brings me to point 2

2) Be Humble
That kid who tells me we should be using Lisp is wrong. You know how I know he's wrong? Because MY TEAM IS SUCCESSFUL. Again, I can't tell you how shockingly shitty most young guys act in that first interview. Obviously once you're on the team if you think we should switch something I ABSOLUTELY want to hear your idea but make sure it makes sense (and is demonstrably better) and don't get all butthurt if I don't agree. We work based on what the developers pitch to me and we decide as a group is the right play, which backs me into point 3

3) Remember that you're a fucking unicorn
You are the aberration here, your non technical managers, bosses, finance people, HR people, NOBODY in the company understands what the fuck it is you do. You may as well be named Merlin to these people. My job (to crib a line from Jay Mohr) is to not let management spook the thoroughbred. Your part in this is to be that thoroughbred AT ALL TIMES and to remember that a thoroughbred just KNOWS that it's a thoroughbred, when that belief is strong enough, other people will get it naturally. Carry yourself like a boss and you'll be a boss."



Remember, be the Unicorn :)

Thursday, 9 August 2012

Gentoomen - Giant book torrent

I did mention I needed to build up some relevant literature to start growing my knowledge base, so I found out about this 36G torrent that holds an impossibly massive library of computer science texts covering every relevant subject imaginable. It truly is epic. I have had it downloading all day today, a total of about 12 hours, and it still has seven left, with an average download speed of 1.1M; I appreciate that isn't massive, but it's not bad.






As far as general progress on my quest goes, it is still very early days and I am currently finishing the How Computers Work book, so expect a full review shortly, plus some extra recap notes and something a little more in-depth and technical.

A little list (9,998)

Okay, So about 2 'official' hours in, hence the 9,998. Still feeling good about doing this.

 I'm currently coming to the end of the How Computers Work book by Ron White, so I decided to create a little list of a few terms that have always confused me when messing with my PC.

 As I haven't ever really got into the nitty-gritty of my computer before, a lot of these terms are internet-based, as that's where I've had the most problems and run-ins with requests asking for things I didn't even realise my computer had.

(Also if any of these are wrong or just explained incorrectly do let me know and I'll do my best to fix them, enjoy)

 Firstly, something very important to the performance of your PC.

 RAM

 *N.B. there is RAM and there is ROM (RAM is writeable, ROM is not; I'll explain later).
 So, yes, RAM: Random Access Memory. RAM is a collection of microchips which the computer uses to store data and programs while it is using them. Writing is the process by which the computer stores data in RAM (reading, out of interest, is when the computer transfers data or software code out of RAM). The bigger the RAM capacity, the better.
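A way I find helpful to picture reading and writing RAM, as my own little Python sketch (the sizes and values are arbitrary):

```python
# RAM pictured as a big array of bytes: writing stores a value at an
# address, reading copies it back out without destroying it.
ram = bytearray(16)   # 16 bytes of "RAM", all zeroed to start

ram[3] = 200          # writing: store a value at address 3
value = ram[3]        # reading: copy it back out
print(value)          # 200
print(ram[3])         # 200 - reading didn't erase it
```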



A picture of a ram, not a picture of ram

Clock speed and 'Overclocking'

 Having browsed YouTube numerous times, I occasionally stumble across random videos of some computer guy overclocking his PC and showing various results, none of which I appreciated at the time. Essentially, what 'overclockers' are doing is increasing the computer's speed by pushing its clock beyond its rated frequency. The computer's clock is a circuit that regulates the timing and speed of all computer functions. It does this by passing an alternating current through a quartz crystal, which responds with its own natural resonance (vibration), creating in turn another alternating current that is sent through the computer, allowing processes to occur to the quartz rhythm. Where it gets cool is that these crystals oscillate at a frequency measured in MHz (my physics knowledge is very sub-par at the moment), which I had to find out means a million cycles per second. Now, if you look at your computer's specs, you will find you have a processor with a speed above 300MHz, sometimes into the GHz range, which means your processor can work at up to whatever frequency you have. Again, the higher the number, the better.
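Just to check the arithmetic on what those frequencies mean, here is my own quick calculation in Python:

```python
# Frequency in hertz is cycles per second, so the time each cycle
# takes is simply the reciprocal of the frequency.
mhz = 300
cycles_per_second = mhz * 1_000_000          # 300 MHz = 300 million cycles/s
nanoseconds_per_cycle = 1e9 / cycles_per_second
print(nanoseconds_per_cycle)                 # about 3.33 ns per cycle
```

So a 3 GHz processor gets ten times as many clock ticks per second as a 300 MHz one, which is why the higher number is better.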



Bus

 I've found with computing that the really complicated things that seem to be named really badly are actually named really well, sometimes quite comically, by the people that created them; you just have to understand what they do. (Reminds me, I'll do a post on getting to know computing jargon soon, as it's a very jargon-heavy subject.) One such computing component is the bus. There are buses everywhere in your computer, and they are what they sound like: a passenger vehicle, only the passengers in this instance are bits of data, not people. Essentially, buses are the circuitry and chips that manage the transfer of data from one device to another. The USB (Universal Serial Bus) drive you use is named after one: the bus standard that connects whatever memory device is attached to it to the computer you just plugged it into.




DSL

 Considering you are most probably connected via DSL right now, or definitely have been in the past, this is one of those things you come across every day without having any idea what it is. Luckily, it's not complicated at all; it's more of a business acronym than a computing one. Digital Subscriber Line: the 'line' you are using to connect to the internet is part of a digital circuit between your residence and a telephone company's central office, giving you access to high-speed data transport over existing twisted copper telephone lines. (Technology has moved on a little since DSL, but it's still very relevant to most of us, including me.)


A Port

Similar again to a bus (in that it's a metaphor): a port is any place in the computer where data is transferred; generally, a place where one device, usually represented by a cable or wire, is physically connected to another device. Imagine boats in a port: they bring something when they dock. There are two types of port, a serial port and a parallel port. The serial port only allows one bit of information to be sent at a time, because only one wire or path is used for data, whereas parallel ports allow several bits of data, usually at least 8 bits, to be sent simultaneously.


TCP/IP

 I must have seen this acronym a million times and never paid it any attention, not realising that the only way I'm actually connected to the net right now is through TCP/IP. Transmission Control Protocol/Internet Protocol: it is actually a collection of methods used to connect computers over the internet and to exchange data. TCP/IP is the universal standard for connecting to the net.

IP address

This goes hand in hand with TCP/IP; your IP address is an identifier for a computing device on a TCP/IP network. Networks using TCP/IP route messages based on the IP address of the destination. The format of an IP address (in IPv4) is a 32-bit numeric address written as four numbers separated by periods, each between 0 and 255.
 It looks something like this.

XXX.XXX.XXX.XXX

 There are a million and one ways to find your own IP address out, (click here) to let a website tell you yours.
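Out of curiosity, here is a small Python sketch I wrote showing how the four dotted numbers pack into that single 32-bit address (the example address is arbitrary):

```python
# Each of the four dotted numbers is one byte (0-255), so the whole
# address packs into 32 bits: shift each byte into its position.
def ip_to_int(dotted):
    a, b, c, d = (int(part) for part in dotted.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

print(ip_to_int("192.168.0.1"))  # 3232235521
```

This is also why no part of an address can exceed 255: each slot is a single byte, and a byte tops out at 255.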





Rasterizer

 Something I've always wondered about but, again, never bothered to find out; a little unrelated to the general theme, but interesting nonetheless. A rasterizer is the software used in games that translates the 3D geometry of objects into a two-dimensional bitmap that can be displayed on your 2D screen.




Wednesday, 8 August 2012

Someone has 'got the t-shirt'

Well, I was looking around for more advice and I found this 'reading list' set up by a guy who had a whole summer before starting university in which to study and really accomplish something. He went on a marathon reading session and essentially grounded himself in all areas of computer science (exactly what I am planning to do). The thread below is his recommendations; it eventually branches out into a huge 'which book for this subject, which book for that'. This guy is a true autodidact.

 I do intend to copy exactly what he has done, but I also intend to summarise what is to be gained from each book and try to teach you guys what I've learnt.

This thread is huge, so click to expand it and have a good read; definitely worth it. Inspirational too.
 I will post links to all these books in another post. For now back to Ron White.
Peace.


-Edit: it seems there is a cap on the file size for one image, so I will post the whole picture, page length by page length, below. Sorry about that.

Start- Enjoy












END- :)

The Art of Assembly Language

As I mentioned previously, I am going to start at, well, the start, which in this instance happens to be assembly language. After having a snoop around the net for some direction, it seems that the holy grail of assembly language knowledge is a massive book called 'The Art of Assembly Language'. (Link below.)

 I had a quick flick through and have determined that it is in fact exactly what I was looking for: massively out of my depth, but introduced well, and it progresses relatively fast. It seems well written and understandable enough to really push myself and break those barriers of mental stagnation. Currently I am still finishing Ron White's book, as I detailed in a previous post, but once that is done I am going to start some practical HTML learning and spend at least 3 hours every day making my way through The Art of Assembly Language.



 Wish Me Luck.





Link To the Art of Assembly Language website:
 http://www.plantation-productions.com/Webster/www.artofasm.com/index.html


10,000 Hours left to go


"Since most of you are unemployed or students, you can easily spend 50 hours a week working on your skills in the field of your choice. If you did this, four years from now (e.g. the length of a college degree) you would be an expert in your chosen field. Literally, you would be ready to step up and become the next Steve Jobs, or Bill Gates, or crap out a PhD in mathematics, or be an awesome musician, or whatever.

Instead, you will spend the next 4 years racking up 10,000 hours of practice browsing internet forums. (Some of you will get the hours done in only two years). Think about what the last four years of that got you.

Now, what's your excuse for wasting your human potential?"



 An epic post I found the other day that really encouraged me to do this.



 As of today I will set a 10,000 Hours milestone, tallying each day my accumulated hours, and put to the test the authenticity of the 10,000-hours mastery theory.


Tuesday, 7 August 2012

Day 1: Setting up a PoA (Plan of Action)

 Had a few very informative responses to the posts on 420chan with recommendations for someone wanting to pursue a life mastering computing. Much appreciation for those responses.

 I found a timeless piece of advice from a computing blogger who used to be a student of computer science at Yale. He offers a top-7 list of personal recommendations for anyone wanting to study computer science at a top level.

http://www.joelonsoftware.com/articles/CollegeAdvice.html

His site is also very interesting to have a look around; I will probably have another, more serious look at it when I have a bit more time.

 I'm still making my way through 'How Computers Work' by Ron White, but I have begun mapping out my PoA. Based on advice given in the posts (thanks again for that, guys; an especially big thanks to 'Basil Chuttingneg'), I have decided to start my focused learning with assembly language.

 The reasoning behind this is :

  • It gets to grips with the 'real' computer part of computing
  • I will eventually be hindered by not knowing enough about assembly language and how compilers interact with low-level computer processes, so I am going to hit this hard now to make for a smoother ride later
  • It will eventually give me a massive advantage when developing my skills in C and similar languages, thanks to a proper understanding of how their functions interact with the I/O devices of the computer


Obviously, nobody learns an assembly language overnight, so instead of focusing solely on assembly language I've decided to make every day include a 5-hour 'practice' time. Within those 5 hours I intend to spend 3 learning assembly language from the ground up and 2 gaining practical experience in something instantly usable, such as XHTML and CSS.

 I will be starting out by re-doing the Bucky Tutorials on XHTML and CSS, so expect a few posts about that coming soon.



Again, guys, thank you very much for the pointers so far. If you have any more advice or any personal experience you'd like to share, the links to the 420chan posts are below.
http://boards.420chan.org/prog/res/24050.php   /PROG/
http://boards.420chan.org/tech/res/89403.php    /TECH/
http://boards.420chan.org/howto/res/30980.php /HOWTO/

Monday, 6 August 2012

How Computers Work; Ron White

I recently purchased the highly recommended 'How Computers Work' by Ron White. Having only read about 100 pages in, I am currently in no position to write a full book review; I can, however, recommend this title to anyone interested in any element of computing, from hardcore low-level binary operations to how a mouse works. Every area is tackled, and it is approached in a heavily illustrated manner, making it a gentle ease into the world of computing.



How Computers Work, Ron White

How to Become a Computing Master

I have recently been developing a personal 'life plan/challenge' in which I am going to devote my life to becoming an absolute master of computing, willing to put in over the recommended 10,000 hours required to become an expert in any field.

 I have decided that, as a kind of lasting mark for society, I will document fully what it takes, almost day-in day-out, to go from being a normal human being with hardly any discipline to a human being that can be considered a true master of an art. Now, I obviously understand how absolutely immense the field of computing is, so this is no small feat, but I will stick at this for an entire lifetime, and if it turns out to be proof that consistent hard work and persistence can truly create greatness regardless of natural talent, then we will all have learnt or re-confirmed something.


The original post on 420chan.

If you are interested in supporting me on this journey post some useful information in the comments. 
Thanks.