As promised, my progress blog.
http://10000hourcountdown.blogspot.co.uk/
For those that do not know, read my first article on the 10,000-hour theory and then my second one on getting started again. I essentially have an 'outline plan' to become a master of computing by spending 10,000 hours of my free time studying Computer Science. To make those 10,000 hours measurable, and to contribute the record to human knowledge as a whole, I set up the 'sister blog' linked above. It is essentially a work diary detailing what I studied, for how long, and a brief overview of what I learnt in that time.
Thursday, 24 January 2013
Tuesday, 8 January 2013
What is a float? - A float is a term that is constantly over-complicated.
A float was something that used to really puzzle me. It was never properly explained in the video tutorials I followed when trying to pick up C, and every definition I ever looked at just over-complicated the matter even further. For anyone else who is baffled by floats, your misery ends below.
A float is literally just a number with a decimal point... That's it.
The term float is an abbreviation of 'floating-point number'. Floating point refers to the decimal point being able to 'float' between the digits of the number being defined. Simple really, but if you follow the Wikipedia link you will see how over-the-top their definition becomes... well, I take that back: their definition is very precise and proper, it's just very hard to understand if you don't have a good grounding in mathematics, which we all (aspiring computer scientists) should really have.
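To make that concrete, here is a tiny C sketch of my own (nothing from the tutorials I mentioned) showing that a float really is just a number carrying a decimal point, and that the point really does 'float' to handle very different magnitudes:

#include <stdio.h>

int main(void)
{
    int   whole   = 10;      /* an integer: whole values only          */
    float decimal = 10.75f;  /* a float: a number with a decimal point */

    printf("%d\n", whole);
    printf("%f\n", decimal);

    /* the same float type copes with very different magnitudes,
       because the decimal point is free to 'float' */
    printf("%f\n", 12345.6f);
    printf("%f\n", 0.000123f);
    return 0;
}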
yep...
Just a little bit more information from me that I believe will help you: an integer can only hold whole values, and the way an integer is stored in your computer's memory is different from how a decimal number is stored. A number with a decimal point is more complicated to store (it is kept as a sign, an exponent and a fraction), and working with it generally costs the processor more effort than a plain integer. If you are trying to make efficient programs, a little tip I learnt is that floating-point values can take up more memory and more processing time than integers, so if every byte counts and whole numbers will do, try to use integers rather than floats. Peace out and best of luck for your studying this year.
Labels:
code,
coding,
computer,
computer book,
computer memory,
computer science,
computers,
computing,
computing book,
float,
floating point,
floating point number,
how to program,
Learn computing
Wednesday, 12 December 2012
Everyone knows we all got into computing to make games
READ THIS:
http://news.quelsolaar.com/#post45
Awesome read from a one-man army who built an MMO from scratch in his spare time, ON HIS OWN. The linked post discusses some of his tips from his personal blog. This guy is an absolute boss and, as with most people who perform incredible feats of personal achievement, happens to be a wise character.
Sunday, 28 October 2012
API - Application Programming Interface
API is a term I have seen and heard a million times throughout my travels into the world of computer science, and I always had only a vague idea of its definition. I understood it mostly to be a kind of go-between for the user and a specific type of program.
Turns out I wasn't too far from the truth. An article I was recently reading summarised what an API is really well, so I thought I'd share it with you.
Essentially, an API is a set of pre-defined rules that programmers have to follow if they want their code to work with a certain application. For instance, if you were on the web and wanted to view a PDF file, you would need a 'plugin' that allows you to read PDFs within that browser. The programmer constructing that plugin, however, would have had to adhere to the rules set by that browser's API to create a functioning plugin.
APIs are literally everywhere. If you're using Windows right now you may notice that the menus and dialog boxes that come up all look and behave in pretty much the same way. That is because Windows itself exposes a general API, and programmers wanting to create programs for Windows have to adhere to the pre-defined rules of that API, which gives all programs developed against it a similar feel.
So the next time you are considering writing code make sure you are aware of the API for the particular system you are trying to manipulate.
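As a toy illustration of what 'pre-defined rules' means in practice (this is my own hypothetical example, not any real browser's API), you can think of an API as a contract the application publishes; a plugin only works if it fills in exactly the shape that contract demands:

#include <stdio.h>
#include <string.h>

/* A hypothetical 'browser API': the browser publishes this contract,
   and every plugin must provide functions of exactly this shape. */
struct plugin_api {
    int  (*can_handle)(const char *mime_type);
    void (*render)(const char *data);
};

/* Our pretend PDF plugin, written to obey the rules above. */
static int pdf_can_handle(const char *mime_type)
{
    return strcmp(mime_type, "application/pdf") == 0;
}
static void pdf_render(const char *data)
{
    printf("pretending to draw a PDF: %s\n", data);
}

int main(void)
{
    /* The 'browser' only ever talks to the plugin through the API. */
    struct plugin_api pdf = { pdf_can_handle, pdf_render };

    if (pdf.can_handle("application/pdf"))
        pdf.render("%PDF-1.4 ...");
    return 0;
}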
Wednesday, 24 October 2012
To Catch A Shooting Star
As I start to get back into computing after my break from it to learn about and solidify personal discipline, I came across an article that clarified something that used to puzzle me so much when 'playing around' with coding.
I always struggled with the fact that once a process produced its outcome, say 5+5, the answer would appear and the program would either close or move on to its next process, seemingly destroying the outcome of the previous one. Well, it turns out this is because computers only do what they are told: if you told them to add '5+5', that's exactly what they are going to do. You didn't ask them to save the outcome, so they are not going to do that for you. What you have to do, however, is 'catch a shooting star'. The shooting star in this instance is the outcome of the process, which is going to blink up on your screen and then just disappear. To save this shooting star you have to hold on to it, and you hold on to it by storing it as a variable.
Hopefully this produced one of those 'ohhhhhh yeah, of course' moments for you as it did for me. Knowing that if I want anything to continue existing in computing it has to be stored makes a huge difference to the way I view programming, and I feel far more confident now that I am no longer confused about why my 'programs' would just blink up and then disappear. One thing that still puzzles me, however, is where that '10' (from the '5+5' example) goes: is it just erased? Was it ever stored?
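A minimal C sketch of the difference (my own example, not from the article): the first sum is worked out, printed and immediately lost, while the second is 'caught' in a variable so later processes can still use it:

#include <stdio.h>

int main(void)
{
    /* The shooting star: 5 + 5 is worked out, printed, then gone. */
    printf("%d\n", 5 + 5);

    /* Catching it: store the result in a variable... */
    int result = 5 + 5;

    /* ...and now it still exists for later processes to use. */
    printf("%d\n", result);
    printf("%d\n", result * 2);

    return 0;
}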
Anyway, I hope this brings a little more clarity to understanding programming, as it did for me.
Monday, 22 October 2012
How To Become a Hacker
Amazing article posted in the 420chan thread; it discusses the mindset behind being a hacker and gives you a good insight into why most people bothered with computing in the first place. This has definitely rekindled the fires of learning and given me another positive reason to get heavily involved in the computing community.
Hopefully this will do the same for you.
As far as personal progress goes, I have started to create a diet and fitness regime that creates optimal conditions for the brain to function effectively. I am writing up the best tips and tricks that I have learned recently and will be posting them on here soon enough.
On a final note, there has been a lot of talk regarding an online meeting place where we can go and share info, progress and so on; in essence, a hacking circle. I will be creating a Facebook group and posting the link on here and in the 420 thread, so if you are interested in joining up and sharing your experience, watch this space.
(Just Posted a link in the latest article, in case you missed it: https://www.facebook.com/groups/360250364063011/)
Wednesday, 15 August 2012
The Von Neumann Model
The man you see before you is John von Neumann, an early computing pioneer. From what I've read so far I can establish that this guy is a pretty big cheese and is credited with the general model that computers are still constructed around today. I plan to do an outline history of computing soon, with all the major players and their contributions, so if you were expecting an interesting article on all of von Neumann's achievements you will have to wait until I write that; this article is just an introduction to his 'model'.
John von Neumann
As far as writing about von Neumann's life goes, my knowledge of him is pretty slim at this moment in time. I do plan to do an in-depth piece on him in the future, but for now I am more concerned with his contribution to computer science and what that contribution means to me and you.
Essentially, von Neumann established a model for how computers should be organised that was logical and effective in its design. This five-part model describes the form your computer takes underneath its shiny case.
Von Neumann states:
There are five major components within the computer system:
The first is an Input Device
- This element sends data and information to the system. That information and data is then stored in the next component.
The Memory Unit
- The instructions and data are then processed by the next component.
The Arithmetic Logic Unit (ALU)
- The operations carried out within the ALU are carefully guided by the next component.
The Control Unit
- The results of the ALU and Control Unit's work are then sent to the final component.
The Output Unit
- Which would be your monitor or your printer.
This simplified breakdown of what a computer is made up of makes it a bit easier to grasp harder concepts in computer architecture, as it allows you to picture in your head the basic outline of the computer's structure. This is, however, an unfinished model, and if we were to add a bit of extra detail to make it a little more complete and a little more accurate to today's computing systems, we would have to mention the System Bus Model.
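To help the five units sink in, here is a toy fetch-and-execute loop in C, entirely my own sketch rather than anything from a textbook: the arrays play the part of the memory unit, the loop is the control unit, the addition is the ALU, and printf stands in for the output unit (the input unit is skipped; the data is simply pre-loaded).

#include <stdio.h>

/* Toy instruction set: each instruction names an operation
   and the memory addresses it works on. */
enum op { ADD, PRINT, HALT };
struct instr { enum op op; int a; int b; };

int main(void)
{
    int memory[8] = { 5, 7, 0 };            /* memory unit: the data     */

    struct instr program[] = {              /* memory unit: the program  */
        { ADD,   0, 1 },   /* memory[2] = memory[0] + memory[1]          */
        { PRINT, 2, 0 },   /* send memory[2] to the output unit          */
        { HALT,  0, 0 },
    };

    for (int pc = 0; ; pc++) {              /* control unit: fetch, decode */
        struct instr i = program[pc];
        if (i.op == HALT)
            break;
        if (i.op == ADD)
            memory[2] = memory[i.a] + memory[i.b];   /* ALU: arithmetic  */
        if (i.op == PRINT)
            printf("%d\n", memory[i.a]);             /* output unit      */
    }
    return 0;
}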
As mentioned previously in another article (here), a bus is basically a method of quickly transporting data from one component to another. The system bus model essentially partitions a computer into three sub-units:
CPU / MEMORY / I/O
These three sub-units are the five units established in the von Neumann model, only grouped by their purpose. This refinement combines the ALU and the Control Unit into one functional unit, the CPU, and combines the Input and Output units into a single I/O unit.
The system bus links all these components on a shared pathway made up of:
The Data Bus - Carries the information being transmitted
The Address Bus - Identifies where the information is being sent
The Control Bus - Describes how the information is being sent and in what manner
:)
- As I have mentioned previously, I will update older articles as my understanding of the concepts within them develops, so if you find this a rather brief introduction to von Neumann and the basic form of the computer, do not worry: I am learning all the time and will update it when I find anything considerable to add. If you have anything else you believe to be worthwhile, drop us a comment and I'll add it in. Thanks.
Tuesday, 14 August 2012
Time to tick off those hours
I've decided that, for someone like me, trying to tackle all of computer science in one go with no real structure is incredibly futile, so I am going to structure my learning around completing a series of online lectures. Biting off more than I can chew left me escaping my responsibilities in a bubble of gaming and procrastination, so I am going to ease myself into a good cycle by forcing myself to watch just one lecture a day from a selected series. As I get more confident I will up the pace, but for now I can manage just one video a day.
As I have stated previously, I am going to start with computer architecture and pad out the theoretical learning with practical experience in assembly language. I have a layman's understanding of computer architecture, but I would still consider myself a complete newbie to this topic area. With everything being essentially new, I am going to take this slowly: do one lecture, make sure I fully get it, summarise what I know into an interesting post, then tackle the next lecture.
The lecture series I am going to follow is an old one from 1986, but it is recommended as a great introduction. It follows the book 'Structure and Interpretation of Computer Programs' (the full book can be found here, for free). The site's description of the lectures is as follows:
- 'These twenty video lectures by Hal Abelson and Gerald Jay Sussman are a complete presentation of the course, given in July 1986 for Hewlett-Packard employees, and professionally produced by Hewlett-Packard Television. '
http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-001-structure-and-interpretation-of-computer-programs-spring-2005/video-lectures/
The Spirit of Computing
An amazing quote about what computing should be for programmers:
``I think that it's extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don't think we are. I think we're responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. Above all, I hope we don't become missionaries. Don't feel as if you're Bible salesmen. The world has too many of those already. What you know about computing other people will learn. Don't feel as if the key to successful computing is only in your hands. What's in your hands, I think and hope, is intelligence: the ability to see the machine as more than when you were first led up to it, that you can make it more.''
Alan J. Perlis (April 1, 1922-February 7, 1990)
The Article that started it all.
If you enjoy what is being attempted, the link below is the initial page where I got the idea and the logic behind the 10,000 hours.
In case anyone was wondering what the hell has happened: I've hit a major discipline low and have spent the last three days literally fucking around. I'm still on my 9,998 hours and still struggling to understand that floating-point definition.
I am currently sucking hard. The intention is strong; the discipline is not.
I want this, I really do.
I'm going to start listening to this wolf.
Friday, 10 August 2012
Advice from a Computing Professional
Not much to report by way of personal progress; I've been giving in to my weak discipline again and playing Starcraft and DayZ all day. I wasted a potentially good day of learning, so my technical hours still stand at 9,998/10,000, which after three days is terrible and a testament to the dangers of not heavily disciplining your mind and body when attempting any kind of 'me vs. the odds' quest. Quite frankly it's depressing to go from being so eager to, for no conceivable reason, being a lazy waste of time. Oh well, this day is gone and I have tomorrow to redeem myself. I've got to be up really early for work, so I plan to get some crunching of computer architecture in before work and then hit it hard after work, and maybe start exercising as well, as I can't be ignorant of the saying 'healthy body, healthy mind'.
Sorry to disappoint you if you came here expecting something worth your while from me today; I have such terrible discipline. I suppose my crap discipline makes this quest all the more epic, if I ever finish it. For now, anyway, some advice from someone who managed to get his head down and work his way up to a good position in a company.
Valuable advice if you are planning on entering into a computing career.
Enjoy -
"I've been hanging around here for a while and it seems to me like there's a lot of people on here who are still in school or are in the early stages of a career as a developer. I thought it would help some of you guys to have a thread where you can get the perspective of a long time software development leader about what we look for when hiring, promoting, etc.
As far as my credentials go, I won't say who I work for just that it's a massive company. I manage a team of 105 programmers working across ~40 project teams. Based on lines of code written my teams work in HTML/CSS/JavaScript, PHP, C#, Java and Python most often, with a bit of F#, Ruby and a few others I'm probably forgetting in there. I'm a 15 year vet, the majority of my team are guys who are just out of college or have a few years experience.
That said, here's my top 3 things you can do to get and keep a job:
1) Be Language Agnostic
When I'm hiring there's a 50% chance that I don't REALLY care what languages you've written in before, just that you're familiar with the language I need you in and can get up and running in general. Since most of our projects are short turn around items, onboarding takes a long time relative to how long the project will last (e.g. 3 weeks of onboarding on a 6 month project). Also, be flexible... I can't tell you how many college kids I just fucking walk out of my office because they tell me all about how Lisp is the greatest language ever invented and we're wrong to be using anything else, which brings me to point 2
2) Be Humble
That kid who tells me we should be using Lisp is wrong. You know how I know he's wrong? Because MY TEAM IS SUCCESSFUL. Again, I can't tell you how shockingly shitty most young guys act in that first interview. Obviously once you're on the team if you think we should switch something I ABSOLUTELY want to hear your idea but make sure it makes sense (and is demonstrably better) and don't get all butthurt if I don't agree. We work based on what the developers pitch to me and we decide as a group is the right play, which backs me into point 3
3) Remember that you're a fucking unicorn
You are the aberration here, your non technical managers, bosses, finance people, HR people, NOBODY in the company understands what the fuck it is you do. You may as well be named Merlin to these people. My job (to crib a line from Jay Mohr) is to not let management spook the thoroughbred. Your part in this is to be that thoroughbred AT ALL TIMES and to remember that a thoroughbred just KNOWS that it's a thoroughbred, when that belief is strong enough, other people will get it naturally. Carry yourself like a boss and you'll be a boss."
Remember, be the Unicorn :)
Thursday, 9 August 2012
A little list (9,998)
I'm currently coming to the end of the How Computers Work book by Ron White, so I decided to create a little list of a few terms that have always confused me when messing with my PC.
As I haven't ever really got into the nitty-gritty of my computer before, a lot of these terms are internet-based, as that's where I've had the most problems and run-ins with requests asking for things I didn't even realise my computer had.
(Also, if any of these are wrong or just explained incorrectly, do let me know and I'll do my best to fix them. Enjoy.)
Firstly, something very important to the performance of your PC.
RAM
*N.B. there is RAM and there is ROM (RAM is writeable, ROM is not; I'll explain later).
So yes, RAM, Random Access Memory: RAM is a collection of microchips which the computer uses to store data and programs while it is using them. Writing is the process by which the computer stores data in RAM (reading, out of interest, is when the computer transfers data or software code out of RAM). The bigger the RAM capacity the better.
A picture of a ram, not a picture of ram
Clock speed and 'Overclocking'
Having browsed YouTube numerous times I occasionally stumble across random videos of some computer guy overclocking his PC and showing various results, none of which I appreciated at the time. Essentially what 'overclockers' are doing is increasing the computer's speed by 'hacking' its clock. The computer's clock is a microchip that regulates the timing and the speed of all computer functions. It does this by passing an alternating current through a quartz crystal, which responds with its own natural resonance (vibration), creating in turn another alternating current that is sent through the computer and allows processes to occur to the quartz rhythm. Where it gets cool is that these crystals oscillate at a frequency measured in MHz (my physics knowledge is very sub-par at the moment), which I had to find out means millions of cycles per second. Now if you look at your computer's specs you will find you have a processor with a speed above 300 MHz, sometimes into the GHz range, which means your processor can work at up to whatever frequency you have. Again, the higher the number the better.
Bus
I've found with computing that the really complicated things that seem to be named really badly are actually named really well, sometimes quite comically, by the people that created them; you just have to understand what they do. (Reminds me, I'll do a post on getting to know computing jargon soon, as it's a very jargon-heavy subject.) One such computing component is the bus. There are buses everywhere in your computer, and they are what they sound like: a passenger vehicle, only the passengers in this instance are bits of data, not people. Essentially buses are the circuitry and chips that manage the transfer of data from one device to another. The USB (Universal Serial Bus) drive you use is a bus, well, it has a bus on it that connects whatever memory device is attached to it to the computer hardware you just plugged it into.
DSL
Considering you are most probably connected via DSL right now, or definitely have been in the past, this is one of those things you come across every day without having any idea what it is. Luckily it's not complicated at all; it's more of a business acronym than a computing one. Digital Subscriber Line: what this means is that the 'line' you are using to connect to the internet right now is part of a digital circuit between your residence and a telephone company's central office, allowing you access to high-speed data transport over existing twisted copper telephone lines. (Technology has moved on a little since DSL, but it's still very relevant to most of us, including me.)
A Port
Similar again to a bus (in that it's a metaphor), meaning any place in the computer where data is transferred: generally, a place where one device, usually represented by a cable or wire, is physically connected to another device. Imagine boats in a port; they bring something when they dock. There are two types of port, a serial port and a parallel port. The serial port only allows one bit of information to be sent at a time, because only one wire or path is used for data, whereas parallel ports allow several bits of data, usually at least 8 bits, to be sent simultaneously.
TCP/IP
I must have seen this acronym a million times and never paid it any attention, not realising that the only way I'm actually connected to the net right now is through TCP/IP. Transmission Control Protocol/Internet Protocol: it is actually a collection of methods used to connect servers on the internet and to exchange data. TCP/IP is a universal standard for connecting to the net.
IP address
This goes hand in hand with TCP/IP; your IP address is an identifier for a computing device on a TCP/IP network. Networks using TCP/IP route messages based on the IP address of the destination. The format of an IP address is a 32-bit numeric address written as four numbers separated by periods, each between 0 and 255. Looking something like this:
XXX.XXX.XXX.XXX
There are a million and one ways to find out your own IP address; (click here) to let a website tell you yours.
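To see that the dotted form really is just one 32-bit number in disguise, here is a small C sketch of my own that splits a 32-bit value into its four 0-255 parts (the value used is just a common home-network address):

#include <stdio.h>

int main(void)
{
    unsigned int ip = 3232235777u;   /* one 32-bit number...           */

    /* ...pulled apart into four 8-bit pieces, each between 0 and 255 */
    printf("%u.%u.%u.%u\n",
           (ip >> 24) & 255,
           (ip >> 16) & 255,
           (ip >>  8) & 255,
            ip        & 255);        /* prints 192.168.1.1             */
    return 0;
}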
Rasterizer
Something I've always wondered about but again never bothered to find out; a little bit unrelated to the general theme, but interesting nonetheless. A rasterizer is the software used in games that translates the geometry of 3D objects into a two-dimensional bitmap that can be displayed on your 2D screen.
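As a rough illustration of the idea (a one-line perspective projection of my own, not how any real game engine does it), turning a 3D point into a 2D screen position largely boils down to dividing by depth:

#include <stdio.h>

int main(void)
{
    /* A point in 3D space (x, y, z) and a simple 'camera' distance. */
    float x = 2.0f, y = 1.0f, z = 4.0f;
    float d = 1.0f;                 /* distance from eye to screen */

    /* Perspective projection: the further away a point is (bigger z),
       the closer it ends up to the centre of the 2D screen. */
    float screen_x = d * x / z;
    float screen_y = d * y / z;

    printf("3D (%.1f, %.1f, %.1f) -> 2D (%.2f, %.2f)\n",
           x, y, z, screen_x, screen_y);
    return 0;
}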
Wednesday, 8 August 2012
Someone has 'got the t-shirt'
Well, I was looking around for more advice and I found this 'reading list' set up by a guy who had a whole summer before starting university in which to study and really accomplish something. He went on a marathon reading session and essentially grounded himself in all areas of computer science (exactly what I am planning to do). The thread below is his recommendations to people; it eventually branches out into a huge 'which book for this subject, which book for that'. This guy is a true autodidact.
I do intend to copy exactly what he has done, but I also intend to summarise what is to be gained from each book and try to teach you guys what I've learnt.
This thread is huge, so click to expand it and have a good read; definitely worth it. Inspirational too.
I will post links to all these books in another post. For now back to Ron White.
Peace.
-Edit, seems there is a cap on the file size for one image, so I will post the whole picture, page length by page length below. Sorry about that.
Start- Enjoy
END- :)
Friday, 6 January 2012
Human reasoning errors
Having always been quite an introspective person, I naturally began reading into how to improve cognition, basically how to make my mind more efficient. The main method, through both experience and 'fact', is meditation (I use the term fact loosely, as it is in part borderline 'pseudo-scientific' claims and theories). I could rant and rave all day about the therapeutic benefits of meditation, but this is a blog about computing, so I'd recommend you look it up yourself; I think it helps sort your head out in all kinds of good ways.
Back to the topic at hand, human reasoning errors. What I mean by this is how we make those super-fast judgements and find a lot of them to be massively wrong and some of them to be completely correct, with no apparent order to the process. So I went on an internet adventure exploring what is really going on when these weird processes happen with no apparent conscious control. It turns out, conveniently enough, that the mind is very similar to a computer in that it stores data based on past experience and replays it automatically to fit similar situations in the present. Sometimes this saves time, sometimes it doesn't. This process is studied under a subject known as heuristics.
Incredibly interesting article explaining the reasoning behind heuristics and cognitive biases:
A quick summary from this article: essentially the human brain has so much sensory input to handle that it has its own coping mechanisms which aid it in its daily function. The mechanisms are based, so we believe, on a kind of complex association system in which a certain action (e.g. opening a door) is stored in long-term memory, and when a similar scenario is encountered again, for example reaching another door, the mind automatically assumes that this door must be like the 'door' object of previous encounters, so the same principles are assumed to apply and the same action is executed. Whilst this system is very economical and efficient, it obviously has numerous limitations.
An example demonstrated in the text is a group of people asked to work out a sum by way of a five-second guesstimate. The example shows just how readily our brains apply previous rules and assumptions to the problem.
The Question:
Consider the product of the series:
9 x 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 = ?
vs.
1 x 2 x 3 x 4 x 5 x 6 x 7 x 8 x 9 = ?
The typical answers:
9*8*7*6*5*4*3*2*1 'usually comes out at an average of 4000'
however,
1*2*3*4*5*6*7*8*9 'usually comes out at around 500'
The actual answer is 362,880.
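(A quick sanity check of that number in a few lines of C, just a loop multiplying 1 through 9:)

#include <stdio.h>

int main(void)
{
    int product = 1;
    for (int n = 1; n <= 9; n++)
        product *= n;              /* 1 x 2 x ... x 9 */
    printf("%d\n", product);       /* prints 362880   */
    return 0;
}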
This very basic example shows just how the association function is sometimes limiting and how easy it is to engage it.
The article also explains the different categories of heuristic failure and when and how they are limiting, with examples such as faulty logical assumptions and ignoring statistics because of emotive associations (terrorism vs. cancer).
Worth a re-read every now and then to avoid genericism in your problem solving, as it will occasionally work against you.
Shown below is the cognitive process by which people are encouraged to avoid bias.
If this doesn't make immediate sense, consult your mind in an introspective manner, or read the article.
Thursday, 5 January 2012
Creating the foundations
I want to become a certified Computing Master.
I want to become a certified Computing Master because I am currently a very underachieving, undisciplined average Joe who has a decent enough mind to understand everything he applies it to, but who consistently wastes his potential and has no shortage of people telling him so. I decided I need to break my pointless, cyclical lifestyle and have made the choice to do something that will make me or break me as a human. (Well, maybe not break me physically, but you get the idea: something that, if I don't reach it, will make me a serious failure to myself.) So I had a look around the world I live in, picked one of the hardest yet most useful skills available to me, and decided, as both a personal and spiritual quest, to accomplish something that right now seems impossible: earning a PhD in Computer Science within the next decade. Whilst in ten years this may not seem like an impossible challenge, for me, a lazy, normal guy who was never considered anything special academically, it is a big mountain to climb.
I feel I have the motivation, in that I have been feeling pumped to do this for at least a year (an example of how bad my discipline is: thinking about something and wanting something for a whole year without ever properly acting on it). I have decided now to kick myself into gear and start doing it.
Obviously I am aware of time, and I realise that to achieve something like this I need to put every spare hour under the sun to productive use if I am to be anywhere near my goal within the time limit set. I need a big plan, like a checklist of everything developed so far in the field, so that I don't just 'get a PhD' but really contribute something incredibly useful to the human race as a whole.
To be a certified Computing Master.
So I am going to develop this blog as a progress blog, documenting where I am, where I am going, and summarising all the knowledge gained so far on my journey. I also want it to be another example to people that extraordinary people are just normal people who never gave up.
I like that last line, think I might get it tattooed on my face.