Chapter 0: Why Do Programmers Start Counting at Zero?

[Note: This is a sample of my upcoming book Brown Dogs and Barbers. Please be aware that this text is subject to change and that diagrams are only placeholders.

If you'd like to see this book become a fully illustrated and professional book, why not consider donating?]

I’d like to begin this book about computer science by asking you about your toaster. If I asked you to tell me how your toaster worked, I bet you’d have no trouble coming up with a decent explanation. Initially, you might claim you have no idea, but I’m sure a moment’s thought would yield a good description. Even in the worst case, you could actually look inside a toaster and deduce what was happening. Perhaps then you’d be able to tell me all about how electricity causes the filaments to heat up and that heat radiates onto the bread or the crumpet or whatever, causing it to cook.

If I were to ask about how a car worked, that might be more challenging. Again, you might instinctively feel that a car’s workings are a mystery to you. But even then, if you stop and think about it, you might recall a few vague terms that help out. Perhaps you could tell me about how petrol is stored in the car’s tank and, when you press the accelerator, the fuel is drawn into the engine where it’s ignited. Then you’d go on and tell me that this action drives the pistons… or something… and they turn the… I think it’s called the crankshaft… which is connected to the wheels and makes them turn. That’s what I would say anyway, and I know virtually nothing about how cars actually work.

I’m guessing all this without even knowing you, your occupation or your interests. True, you might be an engineer or a physicist for all I know, and able to give better explanations, but the chances are that you’re not. My point is, even if you have only the merest passing interest in science and technology, I’m confident that you comprehend things like toasters and cars enough to give half-decent explanations of them. Understanding things like these comes partly from school learning where, even if you sat spaced out during physics lessons, you still picked up some of that stuff about electricity and internal combustion engines. And let’s not underestimate how ingrained in our popular consciousness these concepts are. The people around us talk about the workings of everyday technical items all the time, so some of it is bound to stick with us whether we realise it or not.

But computers are different. Many of us haven’t got the first clue how computers work. Think about it. Could you tell me how the individual components in your computer work together? Could you even name any of the components? I’m certain some of you could, but I’m just as sure that a lot more people couldn’t even begin to explain a computer. To some, it’s a kind of magic box that sits under the desk and somehow draws letters and images on the monitor screen at breathtaking speed.

Let’s get one thing straight: I wouldn’t blame you for being unable to offer an explanation, because there are several reasons why you shouldn’t be expected to know about computers. One very important reason, again, is schooling. In many countries, computer science is not taught as part of general education. In my own country of birth (the United Kingdom), computing education has for many years meant nothing more than learning how to use word processors and spreadsheets; important skills to be sure, but this is definitely not computer science, a discipline that studies, at a fundamental level, how to use mathematical principles to solve problems. The great majority of children leave school having learned, at most, to be passive users of computers, and many people are currently asking why such an important area of knowledge is absent from the curriculum.

The mystery surrounding computers is a problem that’s only becoming worse over time. When computers first arrived they were monstrous things bigger than a family-sized fridge and kept in huge, environmentally-controlled rooms. Their job was usually to carry out boring tasks like processing tax returns and payrolls; tasks that anyone could do by hand, albeit a lot slower. They had banks of flickering lights that lit up when the machines were “thinking”; spools of tape mounted on the front spun around, indicating that the computer was looking in its databank; some were even partly mechanical, clicking and tapping uproariously when the numbers were being crunched. Yes, they were still mysterious — but today it’s even worse.

Computers are no longer just mysterious — they’re magical.

Today’s computers are a million light years ahead of their early ancestors. Nowadays they’re small, sometimes able to fit into the palm of your hand. How can something so tiny do such impressive things? They’re also ubiquitous, having gone far beyond their original, humble number-crunching duties until they organise every aspect of our lives. As a result they’ve become utterly unknowable. Today’s computer is an impersonal black box that gives no hint as to its workings. Of course, there’s a user interface that allows us mere humans to operate the computer, but one main purpose of a modern user interface is actually to hide the internal workings of the machine as much as possible. There are few external indicators about what’s really happening inside. Without moving parts (apart from the cooling fan, which I assure you performs no calculations) and with internal components that give no visible clue as to what they’re doing, it’s become impossible to try and deduce how a computer works by examining it. So advanced and unknowable have computers become, they may as well operate on principles of magic.

But there are genuinely knowable principles upon which computers operate. We find things that pump, rotate or burn easier to understand, because physical principles are more intuitive to us. In contrast, the driving principles behind computers are mathematical, and thinking in those terms comes less naturally to humans. There are some physical principles involved, of course. Your computer contains various things — circuit boards, wires and chips — which all function according to good old-fashioned physics. But (and I don’t mean this to sound dismissive), those are merely the computer’s hardware. In computer science, there is a sharp and critical distinction between the physical machinery that performs the work (the hardware) and the mathematical principles which allow it to do anything meaningful. These principles make up the field of computer science. In theory, you can build computers out of all sorts of weird and wonderful parts, be they mechanical, electronic, or even water-powered. Yet, however a computer is implemented, it must work according to the principles of computer science, just as every car engine, varied as they are, works according to the relevant laws of physics.

Hardware often gets mixed up with the field of computer science. I’m pretty laid back about that, but some purists like to emphasise the strict division between the machinery and the principles. Roughly speaking, this corresponds to the separation between hardware and software. Software, a word I’m sure you’ve heard before, is the collection of programs which computers run, and the concept of a program goes to the heart of computer science. Unfortunately, programs are a little hard to define, but rest assured that you’ll come to understand what a program is over the course of this book. What makes them tough to penetrate is that they’re nebulous, abstract things rooted in mathematics, a subject that’s a sort of parent to computer science. Programs inherit numerous traits from this parentage. Like mathematics, programs don’t really exist in a physical sense. They’re conceptual things, ideas that exist in programmers’ minds which are only given substance once they’re written down.

This inheritance from mathematics explains many things. It explains why programs look like jumbles of mathematical formulae. It explains why computer science attracts so many nerdy folks who are good with numbers. And it explains why programmers count up from 0 instead of 1 like the rest of the human race. Maybe you’ve noticed that? You might look through some of the programs on your computer and find one labelled version 1.0. Why 1.0?

OK, you might say: after a program is updated, the author increments the version number to make the change clear. After the initial version is updated several times, we progress through versions like 1.4 to 1.5 to 1.6 and so on. I get that. But why start at 1.0? Why not 1.1? And why, when I upgrade to the second version, is that called version 1.1?

You’d also find this peculiarity were you to read through the contents of a computer program. If you watch a race on TV, then at the end you’d say that the winner came in position 1, the runner-up in position 2 and so on. If you ask a programmer to write a program for processing the race, the results would begin with the winner assigned position 0 instead and the runner-up in position 1. To a programmer, the hero is a zero.

Counting up from zero, which instinctively seems unnatural, actually simplifies matters when you deal with lists of things. In these cases, counting up from 1 can cause confusion. For instance, have you ever stopped to think why the years of the twentieth century all began with 19 and not 20? It’s something that often trips up little kids (and occasionally big ones too). Why was the year 1066 part of the eleventh century and not the tenth?


Figure 1: Two buildings with different floor numbering schemes.

To explain, let’s look at an example of counting up from 0, because we all do that occasionally whether we realise it or not. In some parts of the world, the bottom floor of a building is called the ground floor and the next one up is the first floor. In this case, the ground floor could just as easily be called the zeroth floor. Similarly, when programmers refer to specific items in a list (which they do a heck of a lot), they often need to calculate the position of an item in that list by offsetting it from a base position. This base item is labelled number 0. Working out a position when a list is arranged like the floors in a building makes things a little simpler. Floor 3 (or the third item) is three above the ground floor (or zeroth item). If the ground floor were floor 1, then the third floor would be two above the ground floor. This is visualised in Figure 1. We count centuries similarly to the left-hand building. Because we count centuries up from the one (the years 1 to 100 were the first century, not the zeroth century), we then have to remember that centuries don’t match with the years within them. It’s only a small confusion, but working out positions in a list is done so often that little hiccups like this can actually cause more problems than you think.
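The offset idea above is exactly how programming languages index lists. Here’s a minimal sketch in Python (the list of floors is just for illustration, mirroring the buildings in Figure 1):

```python
# A list of floors, counted from zero the way programmers count.
floors = ["ground", "first", "second", "third"]

# The item at position 0 is the base — the "ground floor".
print(floors[0])  # -> ground

# With zero-based counting, an item's position IS its offset from
# the base: floor 3 sits exactly 3 above the ground floor.
offset = 3
print(floors[offset])  # -> third

# With one-based counting we would need an awkward "- 1" everywhere,
# just like remembering that 1066 belongs to the eleventh century.
position = 4                  # "the fourth floor" in everyday speech
print(floors[position - 1])   # -> third
```

Notice how the zero-based version needs no correction at all: the offset and the index are the same number, which is precisely why programmers prefer it.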

With this explanation, you’ve hopefully just learned something new about computer science. I know it’s only trivial, but nevertheless it shows you something about the subject and explains why that something is the way it is. This example is just the tip of the iceberg, so there’s much more complex and interesting stuff still to come. Computers are complex things, more so than any other machine we’re likely to use on a daily basis. Unfortunately, they remain mysterious to many people. For many of us, our relationship with computers is one of bemusement, frustration, and fascination, all experienced at arm’s length. We sometimes even find ourselves as the servile member in the relationship, desperately reacting to the unfathomable whims of our computer trying to make it happy. This is not the best state of affairs to be in if we’re going to be so reliant on them in our everyday lives. It doesn’t have to be this way. If our relationship with computers is sullied by their mysteriousness, the answer is simple: learn more about them. And I don’t mean learn how to make spreadsheets.

To understand what’s going on in that magic box beneath your desk, we’ll look in this book at the science behind it.

This book presents you with the core ideas of computer science. From them you will learn about the subject’s history, its fundamentals and a few things about its most pertinent protagonists. Understanding them will help to demystify the machine. Each chapter can be read as a self-contained unit, but they have all been written so that reading from start to finish is like following a story: the chapters roughly follow a chronology, and each builds gently on the preceding ones. How you read it is your choice.

However you choose to read it, this book will take you from the earliest beginnings of mechanical computation and show you how we arrived at today’s world of the magical and ubiquitous electronic computer. You will learn of the monumental problems that faced computer scientists at every stage. You will see how they developed ingenious solutions which allowed the field to progress. And you will observe how progress leads to both new opportunities and new problems.


The ideal gift for your computer-illiterate loved one

I’m running a crowdfunding campaign right now to raise money for the publication of my book, Brown Dogs and Barbers. It is a popular science title that explains the fundamentals of computer science so that anyone can understand them.

Although it’s proven informative even to IT veterans (read what Professor Cornelia Boldyreff kindly wrote about it), it’s particularly aimed at beginners. I’ve shared drafts with readers who are firmly outside the computing sphere and the response has been very encouraging. Despite the topic being new to them, my test audience got the hang of the concepts I discuss, concepts that go to the heart of computer science.

So, if you already work in IT, there’s every reason to be interested in it. Furthermore, if you have friends or relatives who puzzle over what exactly you do for a living and pester you to explain what you do all day long, you might consider Brown Dogs and Barbers an ideal gift. After they’ve read it, the recipient will have gained an understanding of the fundamentals of your subject and won’t harass you any longer… either that, or they’ll have a hundred more questions for you, their appetite suitably whetted.

In fact, my crowdfunding campaign has the ideal perk for you. If you contribute €60 (that’s about $80 US or £50 UK), you’ll get two signed advance copies of the book, one of them already gift-wrapped ready for you to give as a present. The book is scheduled to be ready in June, so it would arrive just in time to supply some summer holiday reading.

Go over to the crowdfunding page and contribute today.

Learning about computers – The motivation behind Brown Dogs and Barbers

In the opening chapter of my new book, Brown Dogs and Barbers (which explains how computers fundamentally work in a way anyone can understand), I talk about part of my motivation for writing it. After pointing out that most people can quite easily understand many forms of technology (toasters, cars etc.), I contrast this with computers:

“For many of us, our relationship with computers is one of bemusement, frustration, and fascination, all experienced at arm’s length. We sometimes even find ourselves as the servile member in the relationship, desperately reacting to the unfathomable whims of our computer trying to make it happy. This is not the best state of affairs to be in if we’re going to be so reliant on them in our everyday lives. It doesn’t have to be this way. If our relationship with computers is sullied by their mysteriousness, the answer is simple: learn more about them… To understand what’s going on in that magic box beneath your desk, we’ll look in this book at the science behind it.”

I believe that by learning about the scientific principles behind computers, we put ourselves in a much stronger position: informed, confident, and empowered.

While reading one of my favourite authors, Ben Goldacre, I found we share similar sentiments in this regard. In his excellent book Bad Science, Ben explains how an ignorance of science can have negative impacts.

“Fifty years ago you could sketch out a full explanation of how an AM radio worked on the back of a napkin, using basic school-level knowledge of science… When your parents were young they could fix their own car, and understand the science behind most of the everyday technology they encountered, but this is no longer the case. Even a geek today would struggle to give an explanation of how his mobile phone works because technology has become more difficult to understand and explain, and everyday gadgets have taken on a ‘black box’ complexity that can feel sinister, as well as intellectually undermining.”

Today’s mobile phones are not phones – they’re computers with an antenna attached to them. And it’s not just phones; computers have crept into most modern technology, rendering them much harder to understand. This is not going to go away. If anything, it’s going to intensify with some truly staggering applications of computers on the horizon (self-driving cars, anyone?).

By making sure people have a basic understanding of computing principles, we can dispel the ignorance, the suspicion and the frustration.

I offer my book as one place to start. Please help me crowdfund the publication process so I can make it available to everyone.


Brown Dogs and Barbers – A computer science book for everyone

I’m very excited about my latest project. I’ve written a book explaining the fundamentals of computer science that requires no IT expertise at all to understand. My main motivation is to provide an easy to read work of popular science for an everyday reader who’s interested in computer science, although I’m confident that experienced computing folk will find it interesting too.

I’m going to self-publish it and for that I need several things to make it a professional piece of work. These all need paying for, so I’ve launched a crowdfunding project at Indiegogo to cover the costs. Time for the hard sell…

What’s this all about?

Computers are a huge part of our lives. They are everywhere, powering so much of what we do.

And yet, how well do we understand them or how they became so ubiquitous? We take computers for granted but many of us don’t appreciate the fascinating ideas behind them. If you look closely, there is a rich trail of puzzles that had to be solved to make them what they are now.

I’ve written a book, Brown Dogs and Barbers, which explains how the ideas of computer science developed throughout history.

When you read this book, you will join me on a journey through the story of computing, discovering the basic principles of what makes the machines tick and learning why computers work the way they do.

Who is the book for?

I would like to make computer science accessible to all. Brown Dogs and Barbers is a work of popular science aimed at beginners and experts alike; no expertise is required, and it contains as little in the way of formulas and code as possible.

If you are a beginner you will get an introduction to the fascinating world of computer science. If you are experienced you can enjoy reading about your field from a different perspective and perhaps learn a new thing or two. It would also make a great gift for an IT worker’s friends and family who haven’t got a clue what it is they do all day.

In any case, you will develop an understanding of the puzzles and theories behind computers, and meet some of the characters who have steered computing over the centuries.

Why me?

I’m a big fan of reading about science. Whenever I go into a bookshop, I’m dismayed to see that the popular science section hardly ever seems to carry titles explaining my subject – computer science – to the masses.

I’m trying to fill this gap with my book. Brown Dogs and Barbers examines some of the foundational concepts of computing. I can still remember the stumbling blocks I encountered when I first learned about these fascinating ideas, so my book strives to light the path so you may avoid them. I’m also a PhD-level computer scientist, an experienced teacher and a published writer on IT and computing topics.

What’s the current status?

All the text is written and a collection of placeholder diagrams and illustrations is in place. It now needs some polish, formatting and professionally designed images to make it a kick-ass publication.

The book has 38 chapters. That might sound like a lot, but each chapter deals primarily with one idea and in the final product I estimate chapters will be around 5-6 pages long on average. That’s about 220-230 pages.

What do I need funds for?

To polish the book, I need three things:

    • A professional proofreader to fix any mistakes, inconsistencies and grammatical errors.
    • An illustrator who can produce an awesome-looking front cover, and also take my placeholder diagrams and illustrations and make them look beautiful.
    • A copy editor to give the book a professional finish.

I already have estimates for each of these services.

Want to read a sample?

Go here.

You might also be interested to know I’ve contributed several articles in the past to Linux User and Developer magazine. Some of them are available online (e.g. “Wikimedia: Wikipedia’s Game Changer” and “Kolab: David and Goliath”).

Other ways you can help

Don’t forget, you can contribute in ways other than donating funds. Tell your friends, share this page and tweet about it to the world. Help me get the word out!

Please visit the project’s Indiegogo page to find out more and, more importantly, to contribute!


Software development and your natural rhythms

One of the lessons from Joel Spolsky I took to heart right from the start of my career as a software developer was to schedule tasks honestly. Scheduling dishonestly means estimating a task when you don’t really know what work is involved. As a result, you end up putting an incorrect time estimate on it.

To combat this, you should schedule honestly and Joel proposes this simple rule: tasks should be measured in hours not days, with a realistic estimate being no larger than 16 hours. If your estimate is larger than that, then you probably don’t appreciate the true size of the task. Experience tells us that programmers usually underestimate tasks they don’t fully understand, so estimating in terms of days or weeks risks schedule overrun. Tasks which will take more than a day or two should therefore be broken down into smaller sub-tasks, because this forces you to think about the work in more detail and gain a better understanding of it.
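Joel’s 16-hour rule is simple enough to mechanise. Here’s a minimal sketch in Python of a checker that flags estimates needing a breakdown (the function name, task list and hours are all mine, purely for illustration):

```python
# Joel's rule of thumb: any estimate over 16 hours means the task
# isn't understood well enough yet and should be broken down.
MAX_HOURS = 16

def needs_breakdown(estimate_hours):
    """Return True if a task's estimate is too coarse to trust."""
    return estimate_hours > MAX_HOURS

# Hypothetical tasks with their time estimates in hours.
tasks = {
    "add login screen": 6,
    "rewrite billing module": 40,   # far too big: split it up
    "fix date-parsing bug": 3,
}

for name, hours in tasks.items():
    if needs_breakdown(hours):
        print(f"'{name}' ({hours}h): break into smaller sub-tasks")
```

Running this flags only the 40-hour task, which is the point: the act of splitting it forces you to think through the work in enough detail to estimate it honestly.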

But how much time is in a day?

So an 8-hour task will take about a day, yes? Well, maybe not. One of the other things I recall Joel saying is that there is less time in the day than you think. Yes, we ‘lose’ time to lunch breaks and meetings, but that’s quite easily taken account of in a schedule.

What about that other ‘lost time’? You know what I’m talking about: all that web-surfing, checking email, chatting at the water-cooler. How much time every day is ‘lost’ to these? And how guilty do they make us feel when work ends, because we know we spent a cumulative hour on them instead of completing that task which should have taken only a day?

In actuality, we probably shouldn’t be feeling guilty about this at all. We should just be scheduling better.

Scheduling in cycles

I recently read this article: “The origin of the 8 hour work day and why we should rethink it”. It presents interesting arguments for switching away from a traditional 8-hour work day and has several tips for working more effectively. The one that really intrigued me was the argument that human concentration follows a natural rhythm whereby we can focus on a task for a certain amount of time before we start to lose it and need a break. The article claims that most of us can hold focus for about 90 minutes before requiring a break of around 20 minutes, and so advocates planning work time in 90-minute windows (let’s call them cycles).

Figure: The ultradian rhythm of life.

So what happens when we combine this with Joel’s advice? A task estimated to take 6 hours of effort (360 minutes) would require 4 cycles (4 x 90 minutes). When you factor in the 20-minute breaks between cycles, that’s an extra hour.

If you can manage more than four cycles in a day then good for you, but four is enough for me. Under this cycle-scheduling system, then, “one day” (if we stick to an 8-hour day) gives you six hours of work time, a one-hour mealtime, and recognises that we all give in to the temptation to take little breaks now and again.
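The arithmetic above is easy to capture in a few lines. Here’s a sketch in Python — the 90-minute and 20-minute figures come from the article, while the helper name and its shape are my own:

```python
import math

CYCLE_MIN = 90   # one window of focused work
BREAK_MIN = 20   # rest between consecutive cycles

def schedule(estimate_hours):
    """Convert an honest hours estimate into cycles and elapsed minutes."""
    work_min = estimate_hours * 60
    cycles = math.ceil(work_min / CYCLE_MIN)
    breaks = max(cycles - 1, 0)   # no break needed after the final cycle
    elapsed_min = cycles * CYCLE_MIN + breaks * BREAK_MIN
    return cycles, elapsed_min

# The 6-hour task from the text: 4 cycles plus 3 breaks.
cycles, elapsed = schedule(6)
print(cycles, elapsed)   # -> 4 420  (i.e. 4 cycles, 7 hours elapsed)
```

The rounding up matters: a 6.5-hour estimate would spill into a fifth cycle, making it obvious that the task won’t fit in my four-cycle day.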

Can it work for programmers?

Let’s consider a few things.

  • It factors in those more unpredictable distractions (Facebook, email, coffee etc.), which are actually symptoms of us losing focus. Using cycles allows you to actually allocate time for them, which recognises that we need them and stops us feeling any guilt about ‘wasting’ time.
  • It can be encouraging to know that your current cycle of work will soon be followed by a scheduled break, especially when the work is intense.
  • You can more easily track the work you did. At the end of a day, can I really be sure I put in X hours of work? How much cumulative time was lost to those little breaks? If you follow the cycles, you have more confidence in how much time you really spent on tasks.
  • Granularity. Most of the time, programmers can’t measure progress with the granularity of certain other professions (like a bricklayer who can measure bricks laid per hour). But there are still activities we can plan which fit into a 90-minute cycle. Think along the lines of: lay out one basic screen design, implement that setter function, write the code that makes TestX pass. And speaking of testing…
  • Cycles can also help to make the distinction between coding and testing clearer. When you estimate a task as 8 hours, it’s tempting to lump the two together in a schedule, only to find that coding takes up all the allocated time and testing gets pushed back or even dropped. Using 90-minute cycles encourages the approach of spending one cycle getting something working and then spending the following cycle reviewing and testing what you’ve done. Most programmers can relate to how different code looks when it’s examined again after a break and how this makes the flaws easier to find.
  • Interrupting the flow. One objection I’ve heard goes along the line of: “It can take programmers several hours just to get into a task, so you shouldn’t interrupt the flow.” This is true, but getting into a task still needs factoring into a schedule somehow. Just like cycles force you to think about the actual small details of a task, they also force you to plan your research or preparation.
    • Try reducing the breaks in-between preparation cycles to 5 minutes.
    • Be more detailed about what “getting into a task” means: How many pages of a resource can you read in one cycle? How much of a prototype/test code can you write in one cycle? etc.


So, instead of estimating a task or feature in terms of the number of hours it will take, why not estimate in terms of the number of cycles instead? Personally, I like the approach and I already work in a similar pattern anyway. If it is indeed effective and natural to all of us, then why not make scheduling explicitly take it into account and give yourself a more accurate project plan?

Edit: One reader seems to have the impression that I’m saying non-peak time is exclusively time to slack off, even though I never stated such a thing and even included work tasks in the examples of non-peak time. Let me make it clear: non-peak time includes unscheduled stuff that doesn’t contribute to the development project you’re working on and billing for. It could be slacking-off time, but it could also be reading your email, checking in with work colleagues, looking at or updating the project plan, and so on… anything that takes your focus away from the scheduled development work.