Computing is famous for having many languages begin their enumerations from 0. Many humans, on the other hand, begin enumerating from 1. Arguments over whether 0 or 1 is better often draw examples from the real world and claim that one or the other is more natural. There are, I suppose, qualities that distinguish the natural arguments for 0-based indexing from the natural arguments for 1-based indexing.

1-based indexing seems natural for counting. When we count apples, we go 1 apple, 2 apples, 3 apples; it makes sense to assign the number 1 to the first apple we point to. I like to think of this as the discrete cardinality heuristic, because the number assigned to each element is the size of the set counted so far.

0-based indexing follows what I like to think of as the distance heuristic. We can define ordinals as the number of units from a base element, which we call the first element. When we work with large numbers and think in orders of magnitude, we tend to think in terms of distance from a center rather than counting every discrete unit from it, picturing the real number line in our minds.
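The distance heuristic can be sketched in code. In a 0-based language like Python, an index is literally the offset from the start of a sequence; the variable names below are my own illustration, not anything from the essay:

```python
# Distance heuristic: a 0-based index is the number of steps
# from the base (first) element to the element in question.
letters = ["a", "b", "c", "d"]

assert letters[0] == "a"  # 0 steps from the start: the base element itself
assert letters[3] == "d"  # 3 steps from the start

# One payoff: the length of a half-open slice is simply end - start,
# with no +1/-1 adjustment.
assert len(letters[1:3]) == 3 - 1
```

Nothing here proves 0-based indexing superior; it just shows how the "index as distance" view makes the arithmetic come out cleanly.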

When we started counting the years, we began from year 1. As we came to think of time as a measurable, continuous quantity, our perception shifted from discrete years to years as containers for 12 months, much as tens are containers for ten ones. We have the unfortunate convention today that there is no year 0; hence there are only 12 months between 1 BCE and 1 CE. We also have the unfortunate convention that 20xx refers to the 21st century, not the 20th. We should rightly have celebrated the commencement of the 21st century in 2001, not 2000; otherwise one of the centuries counted would be missing a year. If we enumerated from 0 rather than 1, I would not have struggled with century numbers as a boy.
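The century arithmetic above can be made concrete. This is a sketch with helper names of my own invention: under the 1-based convention (no year 0), century N spans years 100(N−1)+1 through 100N, which forces an off-by-one correction that a 0-based convention would not need:

```python
def century_1based(year: int) -> int:
    # With years starting at 1, century N covers years 100*(N-1)+1 .. 100*N,
    # so the 20th century ends in 2000 and the 21st begins in 2001.
    return (year - 1) // 100 + 1

def century_0based(year: int) -> int:
    # If years and centuries were both enumerated from 0, the mapping
    # would be plain truncating division: year 20xx -> century 20.
    return year // 100

assert century_1based(2000) == 20  # 2000 still belongs to the 20th century
assert century_1based(2001) == 21  # the 21st century begins in 2001
assert century_0based(2000) == 20  # 20xx maps to century 20, no adjustment
```

The `- 1 ... + 1` dance in the first function is exactly the childhood confusion the paragraph describes.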

TL;DR because Arabic digits.