It allowed the interpretation of dates as year, month, day, hour, minute, and second values.
Nearly all modern operating systems assume that 1 day = 24 × 60 × 60 = 86400 seconds in all cases. In UTC, however, about once every year or two there is an extra second, called a "leap second." The leap second is always added as the last second of the day, and always on December 31 or June 30. For example, the last minute of the year 1995 was 61 seconds long, thanks to an added leap second. Most computer clocks are not accurate enough to be able to reflect the leap-second distinction.

Some computer standards are defined in terms of Greenwich mean time (GMT), which is equivalent to universal time (UT).
GMT is the "civil" name for the standard; UT is the "scientific" name for the same standard.
The distinction between UTC and UT is that UTC is based on an atomic clock and UT is based on astronomical observations, which for all practical purposes is an invisibly fine hair to split.
Because the earth's rotation is not uniform (it slows down and speeds up in complicated ways), UT does not always flow uniformly.
Leap seconds are introduced as needed into UTC so as to keep UTC within 0.9 seconds of UT1, which is a version of UT with certain corrections applied.
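Because nearly all computer time scales assume exactly 86400 seconds per day, the leap second added at the end of 1995 is invisible to them. A minimal sketch in Java (the class names `java.time.Instant` and `java.time.Duration` are standard; the `DayLength` class is illustrative) shows that the platform reports the day as 86400 seconds long even though, in UTC, it actually contained 86401:

```java
import java.time.Duration;
import java.time.Instant;

public class DayLength {
    public static void main(String[] args) {
        // 1995-12-31 actually lasted 86401 seconds in UTC (a leap second was
        // added as its last second), but the java.time scale treats every day
        // as exactly 24 * 60 * 60 = 86400 seconds.
        Instant startOfDay = Instant.parse("1995-12-31T00:00:00Z");
        Instant startOfNextDay = Instant.parse("1996-01-01T00:00:00Z");
        Duration dayLength = Duration.between(startOfDay, startOfNextDay);
        System.out.println(dayLength.getSeconds()); // prints 86400, not 86401
    }
}
```

This is why durations computed from ordinary computer clocks can be off by a second or two from true elapsed UTC time across a leap-second boundary.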
There are other time and date systems as well; for example, the time scale used by the satellite-based global positioning system (GPS) is synchronized to UTC but is not adjusted for leap seconds.
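Because GPS time is not adjusted for leap seconds, it runs ahead of UTC by the number of leap seconds inserted since the GPS epoch. The sketch below assumes that offset is 18 seconds, which has been the value since the start of 2017; real software must look it up in a current leap-second table rather than hard-coding it, and the `GpsToUtc` class and its method are illustrative, not a standard API:

```java
import java.time.Duration;
import java.time.Instant;

public class GpsToUtc {
    // GPS time accumulates no leap seconds, so it leads UTC by the count of
    // leap seconds inserted since the GPS epoch (1980-01-06). The constant
    // below (18 s) has been correct since 2017-01-01, but in real code it
    // must come from an up-to-date leap-second table, not a literal.
    private static final int GPS_UTC_OFFSET_SECONDS = 18;

    // Converts a GPS-scale timestamp (represented here, loosely, as an
    // Instant) to the corresponding UTC instant by subtracting the offset.
    static Instant gpsToUtc(Instant gpsTime) {
        return gpsTime.minus(Duration.ofSeconds(GPS_UTC_OFFSET_SECONDS));
    }

    public static void main(String[] args) {
        Instant gps = Instant.parse("2024-06-01T00:00:18Z");
        System.out.println(gpsToUtc(gps)); // prints 2024-06-01T00:00:00Z
    }
}
```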
An interesting source of further information is the U.S. Naval Observatory, particularly the Directorate of Time at:

Determines the date and time based on the arguments.