
Unix Time Explained: What Every Developer Should Know

Every time your computer stores a timestamp, it's almost certainly a Unix timestamp under the hood. Understanding epoch time is fundamental to writing correct date/time code — and to understanding a looming overflow problem in 2038.

What Is Unix Time?

Unix time (also called epoch time, POSIX time, or Unix timestamp) is the number of seconds that have elapsed since January 1, 1970 at 00:00:00 UTC. This reference point is called the Unix epoch.

The Unix epoch dates to the original Unix operating system developed at Bell Labs in the early 1970s. Early versions of Unix actually counted time in sixtieths of a second from January 1, 1971 — a scheme that would have overflowed a 32-bit value in under three years — so the epoch was soon moved back to 1970-01-01 and the unit changed to whole seconds. A round date just before Unix's own creation made a convenient, memorable zero point: far enough back to predate any production deployment, recent enough to fit decades of future dates in available integer storage.

As of early 2026, the current Unix timestamp is approximately 1,770,000,000 seconds. You can verify this on our Unix Timestamp Converter.

// Example Unix timestamps:
1970-01-01 00:00:00 UTC = 0
2000-01-01 00:00:00 UTC = 946684800
2024-01-01 00:00:00 UTC = 1704067200
2026-01-01 00:00:00 UTC = 1767225600

Why Unix Time Exists

Storing time as a single integer is dramatically simpler than storing a date/time string. Consider what it would take to compare two calendar timestamps: you'd need to account for month lengths, leap years, leap seconds, timezone offsets, and DST rules. Unix time reduces all of this to a simple integer comparison.

Unix timestamps have three key advantages:

  • Timezone-agnostic: Unix time is always in UTC. Converting to a local time is a display concern, not a storage concern. This means a timestamp written in Tokyo and read in New York refers to the same absolute moment.
  • Mathematically simple: Adding 86,400 seconds to a Unix timestamp advances it by exactly one day. Computing the difference between two timestamps is a single subtraction. No calendrical arithmetic required.
  • Universally supported: Every programming language, database, and operating system understands Unix time. The POSIX standard mandates it. Interoperability between systems is trivial.
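Both of the first two properties can be seen in a few lines of Python. This is a minimal sketch; the timestamp value comes from the example list above.

```python
from datetime import datetime, timezone

jan1 = 1704067200        # 2024-01-01 00:00:00 UTC, same value on any machine
jan2 = jan1 + 86_400     # advancing one day is a single addition

# No calendrical arithmetic was needed, yet the result is the next day.
print(datetime.fromtimestamp(jan2, tz=timezone.utc))  # 2024-01-02 00:00:00+00:00

# Comparison is plain integer comparison, regardless of where the values
# were produced or what local timezone the reader is in.
print(jan2 > jan1)  # True
```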

Seconds vs Milliseconds: The 10-Digit vs 13-Digit Problem

The original Unix timestamp counts seconds. In 2026, a seconds-based Unix timestamp is a 10-digit number (e.g., 1760000000).

However, many modern systems — particularly JavaScript, Java, and databases like MongoDB — store time in milliseconds. A milliseconds-based timestamp is a 13-digit number (e.g., 1760000000000).

// JavaScript:
Date.now() // milliseconds → 13 digits
Math.floor(Date.now() / 1000) // seconds → 10 digits
# Python:
import time; time.time()  # seconds (float)
int(time.time())          # seconds (integer)

A common bug: receiving a 13-digit milliseconds timestamp and treating it as seconds, which produces dates in the year 57,000+. Always check the digit count when working with unknown timestamp sources. If you see a 13-digit value, divide by 1000 to get seconds.
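The digit-count check can be captured in a small normalizing function. This is a sketch — the helper name to_seconds and the cutoff are illustrative, not a standard API:

```python
def to_seconds(ts: int) -> int:
    """Normalize a timestamp of unknown precision to seconds.

    Illustrative helper: values of 12+ digits are treated as milliseconds,
    per the digit-count heuristic described above.
    """
    if ts >= 100_000_000_000:  # 12+ digits: almost certainly milliseconds
        return ts // 1000
    return ts

print(to_seconds(1_760_000_000_000))  # 1760000000 (was milliseconds)
print(to_seconds(1_760_000_000))      # 1760000000 (already seconds)
```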

How to Convert Unix Timestamps

Converting between Unix time and human-readable dates in common languages:

JavaScript
// Unix timestamp → Date
const date = new Date(1704067200 * 1000); // multiply by 1000 for ms
console.log(date.toISOString()); // "2024-01-01T00:00:00.000Z"

// Date → Unix timestamp (seconds)
const ts = Math.floor(new Date('2024-01-01').getTime() / 1000);
console.log(ts); // 1704067200
Python
from datetime import datetime, timezone

# Unix timestamp → datetime (UTC-aware)
dt = datetime.fromtimestamp(1704067200, tz=timezone.utc)
print(dt)  # 2024-01-01 00:00:00+00:00

# datetime → Unix timestamp
ts = int(datetime(2024, 1, 1, tzinfo=timezone.utc).timestamp())
print(ts)  # 1704067200
SQL (PostgreSQL)
-- Unix timestamp → timestamptz
SELECT to_timestamp(1704067200);
-- Returns: 2024-01-01 00:00:00+00

-- Current timestamptz → Unix timestamp
SELECT EXTRACT(EPOCH FROM NOW());

The Y2K38 Problem

The year 2038 problem (also called Y2K38 or the Unix millennium bug) is a time representation issue that will affect systems using a signed 32-bit integer to store Unix timestamps.

A signed 32-bit integer can hold values from −2,147,483,648 to 2,147,483,647. The maximum value, 2,147,483,647, in seconds after the Unix epoch, corresponds to January 19, 2038 at 03:14:07 UTC. One second after that, the integer overflows to the most negative value, −2,147,483,648, which represents December 13, 1901 — flipping time backward by 136 years.
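The wraparound can be simulated by forcing the value through a signed 32-bit representation. A sketch using Python's struct module — actual behavior on a real 32-bit system depends on the platform's time_t:

```python
import struct

max_32bit = 2_147_483_647      # 2038-01-19 03:14:07 UTC
one_later = max_32bit + 1

# Pack the value into 4 bytes, then reinterpret them as a signed 32-bit
# integer, reproducing the overflow a 32-bit time_t would suffer.
wrapped = struct.unpack("<i", struct.pack("<I", one_later & 0xFFFFFFFF))[0]
print(wrapped)  # -2147483648, i.e. 1901-12-13 20:45:52 UTC
```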

Systems affected include any software compiled for 32-bit architectures that stores time as a 32-bit signed integer: older embedded systems, legacy SCADA and industrial control systems, some file systems, and certain database implementations. Modern 64-bit systems are not affected — a 64-bit signed integer can hold Unix timestamps until approximately the year 292 billion.

The good news: most consumer software and cloud infrastructure migrated to 64-bit architectures years ago. The risk area is embedded systems with decades-long deployment lifespans — medical devices, automotive control systems, infrastructure monitoring, and industrial equipment installed in the early 2000s that may still be running in 2038.

// 32-bit max:
2147483647 seconds = 2038-01-19 03:14:07 UTC
// 64-bit max:
9223372036854775807 seconds = ~292 billion years

Negative Unix Timestamps

Dates before January 1, 1970 are represented as negative Unix timestamps. For example:

1969-12-31 23:59:59 UTC = -1
1960-01-01 00:00:00 UTC = -315619200
1900-01-01 00:00:00 UTC = -2208988800

Not all systems support negative Unix timestamps. Windows historically had issues with pre-1970 dates. When working with historical dates in software, verify that your runtime and database correctly handle negative epoch values before relying on them.
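One portable approach in Python is to compute pre-1970 dates as the epoch plus a timedelta, sidestepping platform limits in fromtimestamp(). The helper name from_unix is illustrative:

```python
from datetime import datetime, timedelta, timezone

def from_unix(ts: int) -> datetime:
    """Convert any Unix timestamp, including negative ones, to a UTC datetime.

    Uses epoch + timedelta rather than fromtimestamp(), which can raise
    OSError for pre-1970 values on some platforms (notably Windows).
    """
    return datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=ts)

print(from_unix(-1))             # 1969-12-31 23:59:59+00:00
print(from_unix(-2208988800))    # 1900-01-01 00:00:00+00:00
```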

Notable Unix Timestamps Reference

Timestamp     Date (UTC)             Note
0             1970-01-01 00:00:00    The Unix Epoch
1000000000    2001-09-09 01:46:40    One billion seconds
1234567890    2009-02-13 23:31:30    Pop culture milestone
1500000000    2017-07-14 02:40:00    1.5 billion seconds
1700000000    2023-11-14 22:13:20    1.7 billion seconds
2000000000    2033-05-18 03:33:20    Two billion seconds
2147483647    2038-01-19 03:14:07    32-bit overflow point
9999999999    2286-11-20 17:46:39    Max 10-digit timestamp
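Any row of the table can be reproduced with a short loop — a quick sketch for spot-checking values:

```python
from datetime import datetime, timezone

# Reproduce a few reference timestamps as UTC dates.
for ts in (0, 1_000_000_000, 1_234_567_890, 2_147_483_647, 9_999_999_999):
    utc = datetime.fromtimestamp(ts, tz=timezone.utc)
    print(f"{ts:>10} = {utc:%Y-%m-%d %H:%M:%S} UTC")
```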

Frequently Asked Questions

How do I know if a timestamp is in seconds or milliseconds?

Count the digits. In 2026, a seconds-based Unix timestamp has 10 digits (currently around 1,770,000,000). A milliseconds-based timestamp has 13 digits (1,770,000,000,000). If you see a 13-digit number, divide by 1,000 to get seconds. If you see a 10-digit number, it's almost certainly seconds. Values outside these ranges likely indicate a different format.

Does Unix time account for leap seconds?

Technically, no. The POSIX standard defines Unix time as ignoring leap seconds — every day is exactly 86,400 seconds, even when UTC inserts one. In practice this means Unix time is not a true count of elapsed SI seconds since the epoch (it is 27 seconds short as of 2026, one for each leap second inserted since 1972), and around a leap second the Unix clock must repeat or smear a timestamp. Systems that need leap-second accuracy (GPS receivers, scientific instruments) use special handling. For most applications, the difference is irrelevant.
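A quick way to see this in Python: compare the timestamps of the two midnights surrounding the leap second inserted at the end of 2016.

```python
from datetime import datetime, timezone

# A leap second was inserted into UTC at the end of 2016-12-31, yet the
# Unix timestamps of the surrounding midnights differ by exactly 86,400
# seconds: POSIX time pretends every day has the same length.
before = datetime(2016, 12, 31, tzinfo=timezone.utc).timestamp()
after = datetime(2017, 1, 1, tzinfo=timezone.utc).timestamp()
print(after - before)  # 86400.0
```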

Why 1970? Why not a rounder date like 1900 or 2000?

The Unix epoch was chosen because 1970 was close to when Unix was being developed (late 1960s–early 1970s). Using 1900 would have required larger integers to store dates at the time (memory was expensive). Using 2000 would have made negative timestamps unavoidable for historical data in the system's early years. 1970-01-01 was a practical compromise that placed the epoch at a round number close to the software's creation date.

Is Unix time the same everywhere, regardless of timezone?

Yes — this is its key property. Unix time is always measured in seconds since 1970-01-01 00:00:00 UTC. It has no timezone. The value 1704067200 represents the same absolute moment in time whether read by a server in Tokyo or New York. Converting to a local time is a separate step done only for display purposes.

Related Tools

Use our Unix Timestamp Converter to convert any epoch value to a human-readable date, or our Timezone Converter to see what a UTC timestamp looks like in any local timezone.