Divide the year by 100, drop the decimals and add one. Adding one is necessary because the year 1 falls in the first century, so the year 101 is in the second, and so on.
For example, to calculate the century in which the year 1955 falls, divide 1955 by 100. You will get 19.55. Drop the decimals and add one (19 + 1). The result is 20. Thus, 1955 is in the 20th century. This simple rule lines up with the astronomical year numbering, which includes a year 0; under the traditional numbering, years divisible by 100, such as 1900, belong to the previous century.
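The rule above can be sketched as a short Python function (the name `century` is just illustrative; it follows the article's divide-drop-add method):

```python
def century(year):
    """Return the century a positive year falls in: divide by 100,
    drop the decimals (integer division), and add one."""
    return year // 100 + 1

print(century(1955))  # → 20
print(century(101))   # → 2
```

Note that for years divisible by 100 (such as 1900), this formula gives the century under the astronomical numbering described below; traditional reckoning places such years in the previous century.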
Europeans used a calendar that split time into the periods before and after Christ. This calendar has the year 1 BC followed immediately by the year 1 AD, with no year 0. Counting the first 100 years from 1 AD, the first century starts in the year 1 AD and ends in 100 AD. BC and AD are sometimes replaced with BCE and CE, meaning "before common era" and "common era," respectively. This year numbering is the most common today. However, astronomers developed the astronomical year numbering, in which the first century starts in the year 0 and ends in the year 99. Thus, the astronomical year numbering eliminates the confusion of having 1 AD and 100 AD in the same century.