Unix Timestamp / Epoch Converter

Convert Unix timestamps to human-readable dates and back. Enter a Unix timestamp (in seconds or milliseconds) to get the corresponding UTC date and time, or fill in year, month, day, hour, minute, and second fields to get the equivalent epoch timestamp. Results include both seconds and milliseconds representations.

Enter a timestamp in seconds (10 digits) or milliseconds (13 digits)

Results

Each conversion displays the following fields:

Unix Timestamp (seconds)

Unix Timestamp (milliseconds)

UTC Date & Time

ISO 8601

Relative Time

Frequently Asked Questions

What is epoch time (Unix time)?

Unix epoch time, also called Unix time or POSIX time, is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC — a moment known as the Unix epoch. It does not count leap seconds. It provides a simple, language-agnostic way to represent any point in time as a single integer.
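
For instance, in JavaScript (whose `Date` APIs count in milliseconds rather than seconds) the Unix time of a known moment can be computed from its UTC calendar fields. This is a minimal sketch for illustration, not part of the tool itself:

```javascript
// Unix time of 2023-11-14 22:13:20 UTC, via Date.UTC (months are 0-based,
// so 10 means November). Date.UTC returns milliseconds; divide by 1000
// to get the epoch value in seconds.
const epochSeconds = Date.UTC(2023, 10, 14, 22, 13, 20) / 1000;
console.log(epochSeconds); // 1700000000

// The current Unix time, likewise in whole seconds:
const nowSeconds = Math.floor(Date.now() / 1000);
```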

Is there a difference between epoch time and Unix time?

In everyday usage, epoch time and Unix time mean the same thing: the number of seconds since January 1, 1970, 00:00:00 UTC. Strictly speaking, 'the epoch' refers to that specific starting moment (Unix time 0), while 'Unix time' refers to any elapsed-seconds count from that origin. The terms are used interchangeably in most contexts.

What is the Year 2038 Problem?

Some older systems store Unix timestamps as signed 32-bit integers, which can only hold values up to 2,147,483,647. That maximum corresponds to January 19, 2038, at 03:14:07 UTC. One second later, a 32-bit counter overflows and wraps around to a large negative value, potentially causing software errors, similar to the Y2K problem. Modern 64-bit systems are not affected.
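
The wrap-around can be reproduced in JavaScript, whose bitwise operators coerce numbers to signed 32-bit integers. A sketch for illustration only:

```javascript
const INT32_MAX = 2147483647; // 2^31 - 1, the last representable second

// The moment of the limit:
console.log(new Date(INT32_MAX * 1000).toUTCString());
// "Tue, 19 Jan 2038 03:14:07 GMT"

// Bitwise OR with 0 coerces the result to a signed 32-bit integer, so
// adding one second past the limit wraps to the most negative value:
const wrapped = (INT32_MAX + 1) | 0;
console.log(wrapped); // -2147483648
```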

Why can Unix time only represent dates between 1901 and 2038 on some systems?

On systems using a signed 32-bit integer, the timestamp ranges from −2,147,483,648 (representing December 13, 1901) to 2,147,483,647 (representing January 19, 2038). Dates outside this window cannot be stored. 64-bit systems extend this range to hundreds of billions of years in both directions.
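
Both bounds can be checked directly in JavaScript. A quick sketch, not the tool's own code:

```javascript
const INT32_MIN = -2147483648; // -(2^31)
const INT32_MAX = 2147483647;  //   2^31 - 1

// The earliest and latest moments a signed 32-bit timestamp can hold:
console.log(new Date(INT32_MIN * 1000).toUTCString());
// "Fri, 13 Dec 1901 20:45:52 GMT"
console.log(new Date(INT32_MAX * 1000).toUTCString());
// "Tue, 19 Jan 2038 03:14:07 GMT"
```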

How do I convert a Unix timestamp to a human-readable date?

Select 'Timestamp → Date', enter your Unix timestamp in seconds (10 digits) or milliseconds (13 digits), and the tool will display the corresponding UTC date, ISO 8601 string, and relative time. In JavaScript you can do the same for a seconds timestamp with `new Date(timestamp * 1000).toUTCString()`.

What are the common time units in epoch time?

The three most common representations are seconds (10-digit timestamps, e.g. 1700000000), milliseconds (13-digit, multiply seconds by 1,000), and microseconds (16-digit, multiply seconds by 1,000,000). Most Unix APIs and databases use seconds; JavaScript's Date object natively uses milliseconds.
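
The scaling between the three units, sketched in JavaScript (microsecond values for present-day dates still fit exactly within Number's 2^53 safe-integer range):

```javascript
const seconds = 1700000000;            // 10 digits
const millis  = seconds * 1000;        // 13 digits
const micros  = seconds * 1000 * 1000; // 16 digits

console.log(String(millis).length); // 13
console.log(String(micros).length); // 16
```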

What is the difference between UTC and epoch time?

UTC (Coordinated Universal Time) is a human-readable time standard that includes hours, minutes, seconds, and a date — it is the basis for world time zones. Epoch time is a machine-friendly integer counting seconds from a UTC reference point. They represent the same moments in time but in different formats: UTC is for humans, epoch is for computers.
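
In JavaScript the two representations round-trip through the same `Date` object. A brief sketch:

```javascript
// Human-readable UTC fields to epoch seconds (months are 0-based):
const epoch = Date.UTC(2038, 0, 19, 3, 14, 7) / 1000;
console.log(epoch); // 2147483647

// Epoch seconds back to a human-readable UTC string:
console.log(new Date(epoch * 1000).toISOString());
// "2038-01-19T03:14:07.000Z"
```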

What are common use cases for Unix timestamps?

Unix timestamps are widely used in software development for logging events, scheduling tasks, measuring durations, storing dates in databases, and synchronising time across distributed systems. Because they are timezone-agnostic integers, they simplify sorting, comparison, and arithmetic on dates and times.
