If you have started to learn any of the mainstream web programming languages (JavaScript, Ruby, PHP, or Python) you have no doubt been introduced to something called an array.

An array is a collection of values, organized in a specific order. The values can be text, numbers, or booleans (true or false), and each value in the array has a number (its subscript) that corresponds to its position in the array.

Let’s look at this sample array of my favorite colors:

`['blue', 'yellow', 'red', 'purple', 'white']`

So if I were to ask you what numerical position in the array ‘blue’ has, what would you say?

“1 silly! It’s the first number!”

In life, you would be correct. In JavaScript, PHP, Python, or Ruby, you would not. The right answer is zero, because arrays are counted not from 1 but from 0.

In other words, another way to explain how the array works is this:

`['blue'0, 'yellow'1, 'red'2, 'purple'3, 'white'4]`

I have 5 favorite colors in my array, but their respective subscripts are 0-4.
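To see this in a real language, here is the same array as a Python list (Python being one of the zero-indexed languages mentioned above; JavaScript, PHP, and Ruby behave the same way):

```python
# The favorite-colors array from the text, as a Python list.
colors = ['blue', 'yellow', 'red', 'purple', 'white']

colors[0]      # 'blue'  -- the first value sits at subscript 0
colors[4]      # 'white' -- the fifth (and last) value sits at subscript 4
len(colors)    # 5       -- five values, with subscripts 0 through 4
```

Notice the mismatch the article is describing: the list's length is 5, but its highest subscript is 4.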

Think about it like this: let’s say instead of asking you what number ‘blue’ is, I asked you how many positions you would need to move in order to get to ‘blue’?

Since ‘blue’ is the first value in the array, and since you are starting at the beginning, you have to move zero times to get to ‘blue’, right?

Ah ha!
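That "how many moves" idea translates directly into code. Here is a small Python sketch (the function name `moves_to` is mine, purely for illustration):

```python
def moves_to(values, target):
    """Count how many positions we step from the start before reaching target."""
    moves = 0
    for value in values:
        if value == target:
            return moves
        moves += 1

colors = ['blue', 'yellow', 'red', 'purple', 'white']
moves_to(colors, 'blue')    # 0 -- no moves needed; we start on 'blue'
moves_to(colors, 'purple')  # 3 -- three steps from the start
```

Conveniently, this "number of moves" is exactly what Python's built-in `colors.index('purple')` returns: the subscript *is* the offset from the start.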

**Why do programmers count from zero, when everyone else starts at 1?**

Computer programming is all about efficiency, and even small improvements in efficiency can make big differences at scale.

And yes, counting from zero is slightly more efficient than starting at 1.

Let’s explore a simple mathematical equation to understand why:

If we count from zero, every value in an array of length **N** can be represented by the following equation, where **i** represents the numerical position of each value:

`0 ≤ i < N`

Our color array from before has 5 total values. If we were to take each value’s subscript (its numerical position in the array) and slot it into this equation:

`'blue'`
`0 ≤ 0 < 5` // true!
`'yellow'`
`0 ≤ 1 < 5` // also true!
`'white'`
`0 ≤ 4 < 5` // yep, yep, true too!

Can we all agree that each of these equations is true?
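We can even let the computer check the claim for us. A quick Python sketch of the `0 ≤ i < N` bound:

```python
colors = ['blue', 'yellow', 'red', 'purple', 'white']
N = len(colors)  # 5

# Every zero-based subscript i satisfies 0 <= i < N ...
all(0 <= i < N for i in range(N))   # True

# ... and a subscript outside that range, like 5, fails the test,
# just as colors[5] would be out of bounds.
0 <= 5 < N                          # False
```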

Now, if we were to count from 1, every value in an array of length **N** could be represented by the following equation, where **i** represents the numerical position of each value:

`1 ≤ i < N + 1`

So for a moment, let’s consider this alternative array of colors I don’t like, indexed from 1:

`['beige'1, 'orange'2, 'green'3]`

The equations now look like this:

`'beige'`
`1 ≤ 1 < 4` // true!
`'orange'`
`1 ≤ 2 < 4` // huh, also true!
`'green'`
`1 ≤ 3 < 4` // too true!

Those are also true, right? So what’s the problem?

The problem is found in the **N + 1** part of the equation.

What that means is that, in order to process the equation, the computer has to find the length of the array and then add 1 to it. Sure, adding 1 is not a hard task, but it is extra work that the computer doesn’t have to do when processing the zero-based equation, and therefore starting the count at zero wins!
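To make the difference concrete, here is the same bounds computation written both ways in Python (the variable names are mine, just for illustration):

```python
disliked = ['beige', 'orange', 'green']
N = len(disliked)

# Zero-based: the upper bound of the half-open range is just N.
zero_based_subscripts = range(0, N)      # 0, 1, 2

# One-based: the computer must first compute N + 1 to get the upper bound.
one_based_subscripts = range(1, N + 1)   # 1, 2, 3
```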

*This article is heavily indebted to, and inspired by, Edsger W. Dijkstra, Avi Flombaum, and Emily Davis.*