Or were all the old second-counting systems wrong?
Many people have pointed out that if you start a timer on your phone and count out 45 Mississippis, hippopotamuses, or one-one-thousands, it now comes out to a minute.
I have tried it a dozen times myself, and after 45 counts I get anywhere from 54 seconds to a minute and 4 seconds.
I specifically remember counting chunks of time as long as 15 minutes and not being off by a minute.


They’re not wrong, but they are inaccurate and unreliable. Clocks, on the other hand, are pretty accurate and reliable, and atomic clocks are even more so. Most digital clocks are now synchronized to atomic clock standards in some form, over the internet or wirelessly. The definition of time is standardized to an extremely high level of precision and has been for a very long time. The human brain is not standardized like this, and hopefully it never will be, because that’s a gross and scary idea.
The definition of a length of time has been maintained with levels of precision that have increased dramatically since ancient times, but at no point in the last, oh, say, at least 1,000 years has the measurement of time changed by anywhere close to 25% (which is the size of the gap between 45 counts and 60 seconds).
The Antikythera mechanism is believed to be at least 2,000 years old and was able to calculate the passage of time and the motion of the planets far more accurately than Mississippis could ever hope to. The passage of time has not changed the accuracy of that device; only our understanding of the motion of the planets has, and again that’s a human-brain problem, not a time or motion-of-the-planets problem.
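
If you want to test your own counting pace a bit more carefully than squinting at a phone timer, here is a minimal sketch in Python (my own illustration, not something from the original post): it times how long your 45 counts actually take and works out roughly how many of your counts fit in a real minute.

```python
import time

# Hypothetical self-test: time your own counting pace against the system clock.
COUNTS = 45  # how many "Mississippis" you plan to count

input(f"Press Enter, then start counting out {COUNTS} Mississippis...")
start = time.monotonic()
input("Press Enter again the instant you finish counting.")
elapsed = time.monotonic() - start

per_count = elapsed / COUNTS
print(f"{COUNTS} counts took {elapsed:.1f} s "
      f"({per_count:.2f} s per count, so a real minute is about "
      f"{60 / per_count:.0f} of your counts).")
```

Run it a few times; you will almost certainly see the same kind of spread I describe above, which is the whole point: the clock is consistent and you are not.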