Perhaps you’ve experienced this before: You arrive for an important appointment, and your watch says you’re precisely on time. But much to your dismay, the receptionist remarks that they’ve been wondering whether you were going to show; she points to the wall clock, which indicates that you’re five minutes late.
After you recover from feeling flustered, you probably wonder: Just how in the heck did that happen? After all, you're the super-conscientious sort — the kind of person who buys a watch and then promptly calls the U.S. Naval Observatory's "correct time" number (202-762-1401) to set it precisely, down to the second. So how in the world does your timepiece always wind up being a little bit off? And sometimes it's more than just a little bit — or at least that's what your friends insist. Which leads to another question: Whose timepiece is correct — yours or theirs? And furthermore, just how accurate can anyone expect his or her watch to be? It's enough to make you want to throw up your hands and sing Chicago's 1969 hit "Does Anybody Really Know What Time It Is?" (No? You don't feel like singing?)
Well, if it’s any consolation, you’re hardly the first person to feel bedeviled by personal timekeeping. It’s a bewildering thing. But we’re going to get to the bottom of it. First, let’s take a look at the history of the watch.
A Brief History of (Correct) Time
As you might expect, early humans didn't wear watches. And they didn't really need them, either, since the nomadic hunter-gatherer lifestyle didn't require them to catch commuter trains or keep track of billable hours for clients. But the development of civilization and the division of labor put more pressure on humans to function together efficiently. Sundials, which measured shadows cast by the sun, were an early innovation. The Egyptians, who were concerned with keeping time at night so their priests would know when to perform rituals, invented the water clock — basically, a giant vase with a hole in the bottom, which measured hours in drips. In medieval Europe in the 1300s, the advent of mechanical clocks made more precise timekeeping feasible. The first mechanical clocks were only accurate to within 15 minutes, but advances came in the late 1600s, when the Dutch scientist Christiaan Huygens developed a pendulum clock that lost just 10 seconds of time each day. In the 1850s, the American Watch Co. in Waltham, Mass., marketed the first mass-produced spring-powered pocket watches, which enabled people to keep track of time wherever they went.
But once everyone had clocks and watches, there was another, trickier dilemma: What time should everyone set them to? In 19th-century America, there were hundreds of local times, each determined by the big clocks at local courthouses or city halls, which in turn were set to solar noon at each location. That meant that when it was noon in Chicago, it was 11:40 a.m. in St. Louis and 12:18 p.m. in Detroit. This posed a problem for the then-growing railroad industry, which needed a reliable standard for train schedules. The railroads themselves set their clocks to celestial observations made at the Harvard College Observatory, which they obtained via telegraph. To eliminate the discrepancy between local and railroad time, in 1883, railroad companies divided the U.S. into four time zones, each with a standard time, and compelled cities to adjust to them or face economic isolation. People in Maine bristled at having to reset their clocks 25 minutes to what they derided as "Philadelphia Time," but eventually the whole nation was synchronized.
In the 20th century, scientists developed clocks set to the vibrations of crystals and even individual atoms, which made it possible to measure time in units so tiny — down to the trillionths of a second — that they were beyond normal, unaided human perception. That's why we all have the exact same time on our watches, and we're all precisely on time for our appointments today. Except that we're not. So, what's up with that?
How Much Does Watch Accuracy Vary, and Why?
At least in theory, we all should be Johnny-on-the-spot synchronized. Starting in the early 1970s, the advent of battery-powered quartz wristwatches gave ordinary folks access to a timekeeping technology that once was available only to scientists and technicians. Basically, if you apply electricity to a tiny piece of quartz, the crystal will bend and vibrate at a steady rate, giving off a relatively constant electrical signal that can be used to operate an electronic clock face. By the early 2000s, quartz watches had become so popular that mechanical watches had been reduced to just 13 percent of the global watch market.
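To make that concrete, here's a minimal Python sketch of the arithmetic inside a typical quartz movement. It assumes the standard watch-crystal frequency of 32,768 Hz (a widely used figure, though not one quoted above); because that number is exactly 2 to the 15th power, a chain of 15 simple divide-by-two circuits turns the crystal's vibration into one pulse per second:

    # A watch crystal typically oscillates at 32,768 Hz -- exactly 2**15 --
    # so fifteen divide-by-two stages reduce it to one pulse per second.
    CRYSTAL_HZ = 32_768

    freq = CRYSTAL_HZ
    for stage in range(1, 16):
        freq //= 2  # each flip-flop stage halves the frequency
        print(f"after divider stage {stage:2d}: {freq:6d} Hz")

    # The final stage outputs 1 Hz: the once-per-second tick that steps the
    # watch's hands or display. Any error in the crystal's vibration rate
    # passes straight through to that tick.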
But consumer-grade quartz watches aren't totally precise. Remember, we're talking about relatively cheap miniature devices that are churned out rapidly in vast quantities in factories — not some multi-million-dollar gadget custom built for a lab. Even the most expensive quartz-crystal watch in the jewelry store still relies on a mechanical vibration whose frequency can be affected by a variety of factors, including a crystal's size and shape. No two quartz crystals are exactly alike, which can lead to at least a slight discrepancy between two watches from the same assembly line. Additionally, watches' precision can be affected by external factors, such as temperature and humidity, and by wear and tear that affects the stability of the tiny motors inside them, which generate the electric field to which the crystals are exposed.
The upshot is that quartz watches tend to become slightly less accurate over time — with a great deal of emphasis on "slightly." Chronocentric.com, a website for timepiece enthusiasts, estimates that consumer-grade quartz watches typically lose between a tenth of a second and two seconds per day — a discrepancy that, if left uncorrected over long periods, could leave a watch off by a few minutes. A 2008 study published in the Horological Journal, however, suggests that at least a few cheap watches are vastly more accurate. Researchers, who looked at humble timepieces that included a counterfeit Rolex purchased from a street vendor for $15 and a $30 discount-store Timex, found they were all accurate to within a few thousandths of a second per day. It would take years for such a drift to become noticeable to their owners.
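If you want to see how those daily errors compound, the back-of-the-envelope arithmetic is simple. Here's a short Python sketch using the figures quoted above; the 0.005-seconds-per-day value is an assumed stand-in for "a few thousandths of a second":

    # How long until a given daily drift adds up to a full minute of error?
    daily_drift_seconds = {
        "consumer quartz, worst case": 2.0,        # Chronocentric.com estimate
        "consumer quartz, best case": 0.1,
        "2008 study watches": 0.005,               # assumed: "a few thousandths"
    }

    NOTICEABLE_ERROR = 60.0  # one full minute

    for label, drift in daily_drift_seconds.items():
        days = NOTICEABLE_ERROR / drift
        print(f"{label}: about {days:,.0f} days "
              f"({days / 365:.1f} years) to drift one minute")

A worst-case consumer watch hits a minute of error in about a month, while the watches in the 2008 study would take roughly three decades — which squares with the claim that their owners would never notice.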
Setting Your Watch for Accuracy
So, if even cheap quartz watches are accurate to within less than a second per day, why was your watch so far off when you walked into that office for your appointment? The likely reason is that you either didn’t set it to the correct time in the first place, or you’ve been wearing it nonstop for ages, and possibly subjected it repeatedly to humidity and temperature extremes that affected its operation. But you probably don’t need to buy a new watch. Instead, it’s easier just to check it every few months against a reliable reference, and reset it if necessary.
If you're in the U.S., check with the National Institute of Standards and Technology, which has two radio stations, one in Colorado and the other in Hawaii, that provide a continuous time signal. You can access the Colorado station by phone at (303) 499-7111 and the one in Hawaii at (808) 335-4363. The time provided by telephone is accurate to within 30 milliseconds, which is the maximum delay caused by cross-country telephone lines.
The official U.S. government time, which is based on NIST and U.S. Naval Observatory clocks, is available online at time.gov. NIST also provides a free program that will synchronize your Windows computer with the government's official clock.
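If you'd rather script the check yourself, NIST also runs an Internet Time Service at time.nist.gov that speaks the standard NTP protocol. Here's a minimal, illustrative Python sketch of a simple SNTP query — a bare-bones version of what a synchronization program does, not NIST's own software (and NIST asks that you query no more than once every few seconds):

    import socket
    import struct
    import time

    NTP_SERVER = "time.nist.gov"   # NIST Internet Time Service
    NTP_TO_UNIX = 2_208_988_800    # seconds from the 1900 NTP epoch to the 1970 Unix epoch

    def nist_unix_time(server=NTP_SERVER):
        # A 48-byte SNTP request: leap indicator 0, version 3, mode 3 (client).
        request = b"\x1b" + 47 * b"\x00"
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.settimeout(5)
            sock.sendto(request, (server, 123))
            reply, _ = sock.recvfrom(48)
        # The server's transmit timestamp (whole seconds) sits at bytes 40-43.
        ntp_seconds = struct.unpack("!I", reply[40:44])[0]
        return ntp_seconds - NTP_TO_UNIX

    offset = nist_unix_time() - time.time()
    print(f"Your computer's clock appears to be off by about {offset:+.1f} seconds.")

This ignores network delay, so it's good to within a fraction of a second — plenty for setting a wristwatch, if not for a physics lab.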
By syncing up with these timekeeping bodies every so often, perhaps you’ll never be late for an important appointment again.
Credited to: https://electronics.howstuffworks.com/