The expanding universe, seen through redshifted galaxies and the afterglow of the Big Bang, tells us when time began. (AI image via DALL·E)

For most of human history, time was measured in days and seasons — not billions of years. The idea that the universe itself had an age is astonishingly recent. It’s a tale that begins not with stars or galaxies, but with the Earth beneath our feet.
In the late 1700s, Scottish geologist James Hutton watched rivers carve valleys and sediments harden into rock. “No vestige of a beginning, no prospect of an end,” he wrote. That was the first inkling that Earth’s history stretched far beyond biblical times. By the 19th century, geologists like Charles Lyell had extended this view: Earth’s slow processes implied ages of millions of years.
The true age of the planet, however, remained uncertain until the discovery of radioactivity at the turn of the 20th century. Scientists realised that certain elements decay at fixed, measurable rates — ticking atomic clocks buried in stone. By comparing ratios of uranium to lead in ancient rocks and meteorites, geophysicists such as Clair Patterson deduced that Earth and the entire solar system formed 4.54 billion years ago. These radioactive isotopes became the first universal chronometers, allowing us to synchronise the ages of rocks, planets, and stars.
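The arithmetic behind these atomic clocks is simple exponential decay. Here is a minimal sketch in Python, using the standard decay law and the well-established half-life of uranium-238; the daughter-to-parent ratio is an illustrative number, not a measurement from Patterson’s samples.

```python
import math

# Half-life of U-238 in years (a well-established laboratory value).
HALF_LIFE_U238 = 4.468e9
DECAY_CONST = math.log(2) / HALF_LIFE_U238  # decay rate, per year

def age_from_ratio(daughter_to_parent):
    """Age of a closed-system sample from its daughter/parent ratio.

    Solves N_daughter / N_parent = e^(lambda * t) - 1 for t, assuming
    no initial daughter isotope (a simplification of real U-Pb dating,
    which uses two uranium decay chains and lead isotope ratios).
    """
    return math.log(1.0 + daughter_to_parent) / DECAY_CONST

# Illustrative ratio: roughly what a 4.5-billion-year-old sample implies.
print(f"{age_from_ratio(1.02):.2e} years")  # ~4.5e9 years
```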
Physicists meanwhile faced a different puzzle: how could the Sun shine for billions of years? In the 1860s, Lord Kelvin calculated that the Sun could only radiate for about 30 million years if powered by gravitational contraction alone — far too short to accommodate geological and biological history.
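Kelvin’s bound can be reproduced on the back of an envelope: divide the Sun’s gravitational binding energy, roughly GM²/R, by the rate at which it radiates. A rough sketch with modern values for the solar constants:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
R_SUN = 6.96e8     # solar radius, m
L_SUN = 3.828e26   # solar luminosity, W
YEAR = 3.156e7     # seconds per year

# Kelvin-Helmholtz timescale: gravitational energy / radiated power.
t_kh = (G * M_SUN**2 / R_SUN) / L_SUN
print(f"{t_kh / YEAR / 1e6:.0f} million years")  # ~31 million years
```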
The problem lingered until Hans Bethe, while riding a train through upstate New York in 1938, realised that in the Sun’s core, hydrogen nuclei fuse into helium, releasing enormous amounts of energy. That insight not only explained the Sun’s endurance but linked our star’s lifetime to the age of the Earth, a shared chronology written in the language of physics.
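The fusion bookkeeping is equally compact: turning hydrogen into helium releases about 0.7% of the fuel’s rest mass as energy, via E = mc². A rough sketch, assuming (as textbook estimates do) that the Sun can burn roughly a tenth of its hydrogen in its core:

```python
C = 3.0e8            # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
L_SUN = 3.828e26     # solar luminosity, W
YEAR = 3.156e7       # seconds per year

efficiency = 0.007   # fraction of rest mass released by H -> He fusion
core_fraction = 0.1  # rough fraction of the Sun's mass available to burn

energy = efficiency * core_fraction * M_SUN * C**2
lifetime = energy / L_SUN
print(f"{lifetime / YEAR / 1e9:.0f} billion years")  # ~10 billion years
```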
In 1929, Edwin Hubble, perched atop Mount Wilson with the 100-inch Hooker Telescope, measured the speeds and distances of galaxies. Their light was redshifted — stretched to longer wavelengths — showing that the universe was expanding. Hubble once remarked that “equipped with his five senses, man explores the universe around him and calls the adventure Science,” a sentiment that perfectly captured his discovery’s scale.
If galaxies are moving apart today, they must once have been closer together. Run the expansion backward far enough, and you arrive at a beginning.
Early estimates of the expansion rate gave an impossible result: a universe younger than the Earth itself. The error lay in mismeasured galaxy distances. Astronomers spent decades refining their data, slowly recalibrating the scale of the cosmos.
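The simplest age estimate is the “Hubble time”, 1/H₀: how long galaxies would need to reach their present separations at constant speed. The sketch below shows why Hubble’s own overestimated expansion rate (around 500 km/s per megaparsec) implied an impossibly young universe, while the modern value of roughly 70 gives about 14 billion years:

```python
MPC_IN_KM = 3.086e19   # kilometres per megaparsec
YEAR = 3.156e7         # seconds per year

def hubble_time_gyr(h0_km_s_mpc):
    """Naive age 1/H0 in billions of years, ignoring early deceleration
    and dark energy (which roughly cancel in the real universe)."""
    h0_per_sec = h0_km_s_mpc / MPC_IN_KM
    return 1.0 / h0_per_sec / YEAR / 1e9

print(f"{hubble_time_gyr(500):.1f} Gyr")  # ~2 Gyr: younger than Earth!
print(f"{hubble_time_gyr(70):.1f} Gyr")   # ~14 Gyr: close to 13.8
```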
By the mid-20th century, astronomers found a new clock in globular clusters — dense spheres of ancient stars that orbit our galaxy. Using the physics of stellar evolution, they estimated the clusters’ ages by observing when stars begin to leave the main sequence, the “turnoff point” where they run out of hydrogen fuel in their cores. At this stage, stars begin to expand and cool, entering the red giant phase.
Early estimates in the 1950s and 60s gave ages of 10–15 billion years. With space-based telescopes like Hubble, those figures became more precise: 12–13 billion years, setting a firm lower bound on the age of the universe.
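The logic of the turnoff clock rests on a crude scaling: a star’s fuel supply grows with its mass, but its luminosity grows much faster (roughly as mass to the power 3.5 for Sun-like stars), so main-sequence lifetime falls steeply with mass. A hedged, order-of-magnitude sketch; the exponent, the Sun’s ten-billion-year lifetime, and the turnoff mass are textbook approximations, not cluster measurements:

```python
def ms_lifetime_gyr(mass_in_suns, sun_lifetime_gyr=10.0):
    """Main-sequence lifetime ~ fuel / luminosity ~ M / M^3.5 = M^-2.5.

    A rough scaling law only; real cluster dating relies on detailed
    stellar evolution models, chemical composition, and distance."""
    return sun_lifetime_gyr * mass_in_suns ** -2.5

# Stars just leaving the main sequence in old globular clusters have
# roughly 0.8 solar masses, implying ages above ten billion years.
print(f"{ms_lifetime_gyr(0.8):.0f} Gyr")  # ~17 Gyr at this crude level
```

The crude power law overshoots (careful models give the 12–13 billion years quoted above), but it captures the essential point: a turnoff below one solar mass signals a very old cluster.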
Then came one of science’s great accidents. In 1964, engineers Arno Penzias and Robert Wilson noticed a faint hiss in their radio antenna — a noise that persisted no matter how much they cleaned or adjusted it. That “noise” turned out to be the Cosmic Microwave Background (CMB), the cooled afterglow of the Big Bang itself, a relic from when the universe was just 380,000 years old.
Subsequent satellites — COBE, WMAP, and Planck — mapped this radiation in exquisite detail, transforming that hiss into a cosmic baby picture. From its faint ripples, cosmologists determined the universe’s precise age: 13.80 ± 0.02 billion years.
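One way to see why the CMB is a “cooled afterglow”: as space stretches by a factor of (1 + z), the radiation’s temperature falls by the same factor. A short sketch using the measured present-day temperature and the standard approximate redshift at which the CMB was released:

```python
T_NOW = 2.725           # measured CMB temperature today, in kelvin
Z_RECOMBINATION = 1100  # approximate redshift when the CMB was released

# Radiation temperature scales as (1 + z): hotter in the younger universe.
T_then = T_NOW * (1 + Z_RECOMBINATION)
print(f"{T_then:.0f} K")  # ~3000 K, about the surface of a cool star
```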
“The most incomprehensible thing about the universe is that it is comprehensible,” Einstein once wrote. The CMB proved just that — that light from the dawn of time could be measured, modelled, and understood.
Each method cross-checks the others. Type Ia supernovae, exploding stars with near-uniform brightness, act as “standard candles,” allowing distances — and thus expansion rates — to be measured precisely. Combined with the CMB and stellar ages, these data reveal that expansion has not been steady: gravity slowed it early on, and dark energy is now accelerating it.
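The standard-candle trick reduces to one formula: if you know a source’s true (absolute) brightness, its apparent brightness tells you its distance. A minimal sketch using the astronomer’s distance modulus; the magnitudes are illustrative, not data from any real supernova survey:

```python
def distance_mpc(apparent_mag, absolute_mag):
    """Distance from the distance modulus m - M = 5 * log10(d / 10 pc).

    Ignores dust extinction and cosmological corrections that real
    supernova analyses must include."""
    d_parsec = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return d_parsec / 1e6

# Type Ia supernovae peak near absolute magnitude -19.3; one observed
# at apparent magnitude 18 would then lie roughly 290 Mpc away.
print(f"{distance_mpc(18.0, -19.3):.0f} Mpc")
```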
A common misunderstanding is that the universe expands into something. It doesn’t. Space itself stretches — like raisins in rising dough, galaxies move apart because the dough (space-time) expands between them. The farther away a galaxy is, the more its light has been redshifted during its journey, showing us not just distance, but time itself unfolding.
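Redshift and stretching space are two sides of one relation: light’s wavelength grows in proportion to the universe’s scale factor, so 1 + z records how much smaller the universe was at emission. A brief sketch:

```python
def scale_factor_at_emission(z):
    """1 + z = a(now) / a(then): wavelengths stretch along with space.

    Returns the universe's relative size when the light left its galaxy
    (taking the present scale factor as 1)."""
    return 1.0 / (1.0 + z)

# A galaxy at redshift z = 1 emitted its light when distances between
# galaxies were half what they are today.
print(f"{scale_factor_at_emission(1.0):.2f}")  # 0.50
```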
Today, three independent clocks — stellar evolution, supernovae, and the cosmic background — all point to the same answer: the universe is 13.8 billion years old.
That number, so casually cited now, is the product of centuries of discovery — from Hutton’s rocks to Patterson’s isotopes, from Bethe’s equations to Hubble’s redshifts. The cosmos began in a hot, dense state, and has been expanding, cooling, and creating ever since.
And in this vast, ancient universe, we have managed something extraordinary: to measure not just our own history, but the age of time itself.
Shravan Hanasoge is an astrophysicist at the Tata Institute of Fundamental Research.