What is Unix Time?
Unix time (also known as POSIX time or UNIX Epoch time) is a system for describing a point in time. It is the number of seconds that have elapsed since the **Unix Epoch**, not counting leap seconds. The Unix Epoch is **00:00:00 UTC on 1 January 1970**.
Because Unix time is a single integer, it is incredibly efficient for computers to store and compare. Whether you are developing a distributed database or a simple mobile app, Unix timestamps are the universal language of temporal data.
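The definition above can be checked directly: a Unix timestamp is just the elapsed seconds between a moment and the epoch. A minimal sketch in Python (the date 2024-01-01 is an arbitrary example):

```python
from datetime import datetime, timezone

# Unix time counts seconds elapsed since 1970-01-01 00:00:00 UTC.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
moment = datetime(2024, 1, 1, tzinfo=timezone.utc)

timestamp = int((moment - epoch).total_seconds())
print(timestamp)  # 1704067200
```

Because the result is a plain integer, comparing two moments is ordinary integer comparison, which is what makes the format so cheap for databases and indexes.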
The "Year 2038" Problem
At 03:14:07 UTC on January 19, 2038, signed 32-bit Unix timestamps will overflow and wrap to a negative value. This is often compared to the Y2K bug. Most modern systems have already migrated to 64-bit integers, which can represent time for roughly the next 292 billion years.
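The overflow moment follows directly from the largest value a signed 32-bit integer can hold. A quick check:

```python
from datetime import datetime, timezone

# Largest value representable by a signed 32-bit integer.
max_int32 = 2**31 - 1  # 2147483647

# The last instant a 32-bit Unix timestamp can describe.
overflow = datetime.fromtimestamp(max_int32, tz=timezone.utc)
print(overflow)  # 2038-01-19 03:14:07+00:00
```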
Milliseconds vs Seconds
While the standard Unix timestamp is in seconds, many platforms (like JavaScript and Java) use **milliseconds**. If your timestamp has 13 digits instead of 10, it is likely in milliseconds.
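The digit-count rule of thumb can be turned into a small guard. A sketch, where `normalize_to_seconds` is a hypothetical helper and the `10**12` cutoff is a heuristic (any 13-digit value is at least 10^12):

```python
def normalize_to_seconds(ts: int) -> float:
    """Return a timestamp in seconds, treating 13-digit values as milliseconds."""
    if ts >= 10**12:  # 13 or more digits: almost certainly milliseconds
        return ts / 1000
    return float(ts)

print(normalize_to_seconds(1700000000000))  # 1700000000.0 (milliseconds input)
print(normalize_to_seconds(1700000000))     # 1700000000.0 (seconds input)
```

Note the heuristic only holds for contemporary dates; a genuine seconds value would not reach 13 digits until the year 33658.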
Programming Examples

JavaScript:

```javascript
// Current timestamp in seconds
const seconds = Math.floor(Date.now() / 1000);

// Convert back to a Date object (JavaScript dates use milliseconds)
const date = new Date(seconds * 1000);
```

Python:

```python
import time
from datetime import datetime

# Current timestamp
ts = time.time()

# Convert to readable format
dt = datetime.fromtimestamp(ts)
print(dt.strftime('%Y-%m-%d %H:%M:%S'))
```

Common Mistakes
- Treating 13-digit values as seconds instead of milliseconds.
- Displaying local server time when the requirement is UTC.
- Serializing timestamps without documenting whether they are integer seconds, integer milliseconds, or ISO strings.
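The local-time mistake in particular is easy to demonstrate. A minimal sketch (the timestamp 1700000000 is an arbitrary example):

```python
from datetime import datetime, timezone

ts = 1700000000

# Risky when UTC is required: interprets ts in the server's local time zone.
local_dt = datetime.fromtimestamp(ts)

# Explicit and portable: pass a UTC time zone.
utc_dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(utc_dt.isoformat())  # 2023-11-14T22:13:20+00:00
```

The two results only agree on servers whose local time zone happens to be UTC, which is why the explicit form is safer.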
Frequently Asked Questions
Does Unix time include leap seconds?
No. Unix time ignores leap seconds, which is why it is not a perfect linear representation of time, but it is extremely practical for computing.
What happens at timestamp 0?
Timestamp 0 corresponds to exactly January 1, 1970, at 00:00:00 UTC.
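This is easy to verify, and negative timestamps work the same way, describing moments before the epoch:

```python
from datetime import datetime, timezone

# Timestamp 0 is the epoch itself.
zero = datetime.fromtimestamp(0, tz=timezone.utc)
print(zero.isoformat())  # 1970-01-01T00:00:00+00:00

# Negative timestamps count backwards from the epoch.
before = datetime.fromtimestamp(-86400, tz=timezone.utc)
print(before.isoformat())  # 1969-12-31T00:00:00+00:00
```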