Unix Timestamp
Convert between Unix timestamps and human-readable dates.
What is a Unix Timestamp?
A Unix timestamp (also called epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC — the Unix epoch. It is a single integer that precisely identifies any point in time, independent of timezone or locale.
Millisecond timestamps (13 digits) are simply the epoch in milliseconds rather than seconds. JavaScript's Date.now() returns milliseconds; most Unix system calls return seconds. This converter automatically detects which format you've entered.
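The detection described above can be sketched with a simple digit-count heuristic — a minimal example, assuming the converter treats 13-digit values as milliseconds and 10-digit values as seconds (the tool's exact rule is not specified here):

```python
from datetime import datetime, timezone

def parse_epoch(value: int) -> datetime:
    # Heuristic: values of 13+ digits are assumed to be milliseconds,
    # shorter values seconds. 1_000_000_000_000 is the smallest 13-digit number.
    if abs(value) >= 1_000_000_000_000:
        value = value / 1000
    return datetime.fromtimestamp(value, tz=timezone.utc)

parse_epoch(1_700_000_000)      # seconds  -> 2023-11-14 22:13:20 UTC
parse_epoch(1_700_000_000_000)  # millis   -> same instant
```

Both calls above resolve to the same UTC instant, which is exactly why auto-detection is safe for dates in the current era: second and millisecond timestamps differ by three orders of magnitude.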
Why Developers Use Unix Time
- Timezone-safe — A Unix timestamp always refers to the same instant globally, with no ambiguity from DST or regional offsets.
- Compact storage — A single integer (traditionally 32-bit, now usually 64-bit) stores a full date and time, ideal for databases and APIs.
- Easy arithmetic — Subtracting two timestamps gives an exact duration in seconds. No calendar logic required.
- Universal support — Every major programming language, database, and operating system supports Unix time natively.
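The "easy arithmetic" point can be shown in two lines — a sketch using illustrative timestamp values, not output from any particular system:

```python
start = 1_700_000_000  # an event time, in seconds since the epoch
end = 1_700_003_600    # a later event

# Plain subtraction yields an exact duration in seconds --
# no months, leap years, or DST rules involved.
elapsed = end - start
print(elapsed)  # 3600 seconds, i.e. exactly one hour
```

Compare this with calendar-based date math, where computing the interval between two local datetimes requires timezone and DST handling.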
The Year 2038 Problem
Systems that store Unix timestamps in a signed 32-bit integer will overflow on January 19, 2038 at 03:14:07 UTC — when the value reaches 2,147,483,647 and wraps to a large negative number. This is the Y2K38 problem.
Modern 64-bit systems are not affected — a 64-bit signed integer can represent dates hundreds of billions of years into the future. However, legacy embedded systems and older databases using 32-bit timestamps should be migrated before 2038.
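The overflow moment is easy to verify directly. The sketch below uses Python's `ctypes` to emulate a signed 32-bit counter wrapping, illustrating the failure mode rather than any specific system's behavior:

```python
import ctypes
from datetime import datetime, timezone

# The largest value a signed 32-bit integer can hold:
max32 = 2**31 - 1  # 2,147,483,647
print(datetime.fromtimestamp(max32, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00 -- the last representable second

# One second later, a 32-bit counter wraps to a large negative number:
wrapped = ctypes.c_int32(max32 + 1).value
print(wrapped)  # -2147483648, which reads as a date in December 1901
```

A 64-bit `time_t` sidesteps this entirely: 2^63 seconds is roughly 292 billion years, far beyond any practical horizon.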