Unix Timestamp Converter

Enter a Unix timestamp (in seconds or milliseconds) to convert it to a human-readable date and time, or enter a date and time to get back the corresponding Unix epoch value. Results include the UTC date, local date, timestamps in seconds and milliseconds, the day of the week, and the relative time.

Enter a 10-digit timestamp (seconds) or 13-digit timestamp (milliseconds).

Results

- Result
- UTC Date & Time
- Local Date & Time
- Timestamp (seconds)
- Timestamp (milliseconds)
- Day of Week
- Relative Time

Frequently Asked Questions

What is a Unix timestamp (epoch time)?

A Unix timestamp, also called epoch time or POSIX time, is the number of seconds that have elapsed since January 1, 1970 at 00:00:00 UTC. It is a universal, timezone-independent way for computer systems to represent a specific moment in time. The term 'epoch' technically refers to Unix time 0 — midnight on January 1, 1970.
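This definition can be checked directly in JavaScript, whose Date values are built on the same epoch:

```javascript
// Unix time 0 is the epoch: midnight UTC on January 1, 1970.
// The Date constructor takes milliseconds since the epoch.
const epoch = new Date(0);
console.log(epoch.toISOString()); // "1970-01-01T00:00:00.000Z"
```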

Is there a difference between epoch time and Unix time?

No — epoch time, Unix time, POSIX time, and Unix timestamp all refer to the same thing: the number of seconds elapsed since 00:00:00 UTC on January 1, 1970. The terms are used interchangeably across different systems and programming languages.

How do I know if my timestamp is in seconds or milliseconds?

A 10-digit timestamp (e.g. 1700000000) is in seconds. A 13-digit timestamp (e.g. 1700000000000) is in milliseconds. This converter automatically detects which format you've entered based on the number of digits and handles the conversion accordingly.
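A detection routine along these lines could be sketched as follows; `detectUnit` is a hypothetical helper written for illustration, not part of any library or necessarily how this converter is implemented:

```javascript
// Hypothetical helper: guess the time unit from the digit count
// of the integer part of the input.
function detectUnit(input) {
  const digits = String(Math.trunc(Math.abs(Number(input)))).length;
  if (digits <= 10) return 'seconds';       // e.g. 1700000000
  if (digits <= 13) return 'milliseconds';  // e.g. 1700000000000
  if (digits <= 16) return 'microseconds';
  return 'nanoseconds';
}

console.log(detectUnit(1700000000));    // "seconds"
console.log(detectUnit(1700000000000)); // "milliseconds"
```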

Why can Unix time only represent dates between 1901 and 2038 on some systems?

Some older systems store Unix timestamps as a signed 32-bit integer, which can hold values from roughly −2.1 billion to +2.1 billion. On those systems the representable range runs from December 13, 1901 to January 19, 2038; a timestamp past the upper bound (the 'Year 2038 problem', or Y2038) overflows and wraps around to a negative value. Modern 64-bit systems are not affected and can represent dates billions of years into the future.
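The wraparound can be simulated in JavaScript with a typed array, which stores signed 32-bit integers just like an affected 32-bit time_t:

```javascript
// Simulate a signed 32-bit timestamp counter.
const t = new Int32Array([2147483647]);  // 2^31 - 1 → 2038-01-19T03:14:07Z
t[0] += 1;                               // one more second wraps the counter
console.log(t[0]);                       // -2147483648
console.log(new Date(t[0] * 1000).toISOString());
// "1901-12-13T20:45:52.000Z" — the lower bound of the 32-bit range
```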

What are the common time units used with epoch time?

Epoch time is most commonly expressed in seconds (10 digits), milliseconds (13 digits), microseconds (16 digits), or nanoseconds (19 digits). JavaScript uses milliseconds by default, while most Unix/Linux systems and APIs use seconds.
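As a quick sanity check, the same instant scales by factors of 1,000 between units. One caveat worth noting: 19-digit nanosecond values exceed JavaScript's safe integer range, so BigInt is needed to keep them exact (the variable names below are illustrative):

```javascript
const seconds = 1700000000;        // 10 digits
const millis  = seconds * 1000;    // 13 digits — what JavaScript's Date uses
const micros  = seconds * 1e6;     // 16 digits
// 19-digit nanosecond values exceed Number.MAX_SAFE_INTEGER (~9.0e15),
// so use BigInt to avoid losing precision:
const nanos = BigInt(seconds) * 1_000_000_000n;
console.log(nanos.toString()); // "1700000000000000000"
```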

How do I convert a Unix timestamp in JavaScript?

You can convert a Unix timestamp to a date in JavaScript using: new Date(timestamp * 1000) for second-based timestamps, or new Date(timestamp) for millisecond-based timestamps. To get the current Unix timestamp in seconds, use: Math.floor(Date.now() / 1000).
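Putting those snippets together (1700000000 is just a sample timestamp):

```javascript
// Seconds-based timestamp: multiply by 1,000 first.
const fromSeconds = new Date(1700000000 * 1000);
// Millisecond-based timestamp: pass it directly.
const fromMillis = new Date(1700000000000);

console.log(fromSeconds.toISOString());                  // "2023-11-14T22:13:20.000Z"
console.log(fromSeconds.getTime() === fromMillis.getTime()); // true — same instant

// Current Unix timestamp in seconds:
const nowSeconds = Math.floor(Date.now() / 1000);
```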

Does Unix time account for time zones?

Unix timestamps themselves are always in UTC and are completely timezone-independent — the same timestamp represents the exact same moment everywhere in the world. The conversion to a human-readable local date and time is where timezone differences come into play.
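This is easy to demonstrate: one Date object renders differently per time zone, but the underlying timestamp never changes (the locales and zones below are arbitrary examples):

```javascript
const d = new Date(1700000000 * 1000);
console.log(d.getTime());     // 1700000000000 — identical everywhere in the world
console.log(d.toISOString()); // "2023-11-14T22:13:20.000Z" — always UTC
// Only the human-readable rendering depends on the zone:
console.log(d.toLocaleString('en-US', { timeZone: 'America/New_York' }));
console.log(d.toLocaleString('ja-JP', { timeZone: 'Asia/Tokyo' }));
```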

What is the maximum Unix timestamp value?

On 64-bit systems, the maximum Unix timestamp is 9,223,372,036,854,775,807 seconds, corresponding to a date roughly 292 billion years in the future. On older systems with a signed 32-bit time_t, the maximum is 2,147,483,647 seconds, which is reached on January 19, 2038 at 03:14:07 UTC.
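The 32-bit limit can be verified directly. Note that JavaScript's own Date type caps out far earlier than a 64-bit time_t: the ECMAScript spec limits time values to ±8,640,000,000,000,000 milliseconds (about the year 275760):

```javascript
const MAX_INT32 = 2147483647; // 2^31 - 1, the 32-bit time_t ceiling
console.log(new Date(MAX_INT32 * 1000).toISOString());
// "2038-01-19T03:14:07.000Z"

// JavaScript Dates are limited to ±8.64e15 ms; beyond that they are invalid:
console.log(Number.isNaN(new Date(8640000000000001).getTime())); // true
```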
