Epoch/Timestamp List (Historical Dates)

Look up the Unix timestamp for any historical date. Enter a year, month, and day (plus an optional time), choose your preferred timezone (UTC or local), and get back the exact Unix epoch timestamp in seconds — plus the millisecond value and a human-readable GMT date string. Handy for developers, data analysts, and anyone cross-referencing historical records.

Enter any year from 1 AD to 2038

24-hour format

Choose whether the date/time above is in UTC or your local timezone

Results

Unix Timestamp (seconds)

--

Unix Timestamp (milliseconds)

--

Days Since Unix Epoch (Jan 1, 1970)

--

Years Since Unix Epoch

--

Results Table

Frequently Asked Questions

What is a Unix timestamp (epoch timestamp)?

A Unix timestamp is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC — a moment known as the Unix epoch. It is a universal, timezone-independent way to represent a specific point in time, widely used in programming, databases, and log files.
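In JavaScript, for example, such a timestamp can be computed from UTC calendar fields. The helper name `epochSeconds` below is just for illustration; this is a minimal sketch:

```javascript
// Unix timestamp (in seconds) for a given UTC calendar date.
// Date.UTC returns milliseconds since the epoch for the given UTC fields.
// Caveat: Date.UTC maps years 0–99 to 1900–1999, so very early years
// (e.g. 1 AD) need Date.prototype.setUTCFullYear instead.
function epochSeconds(year, month, day, hour = 0, min = 0, sec = 0) {
  // JavaScript months are 0-based (0 = January), hence month - 1.
  return Math.floor(Date.UTC(year, month - 1, day, hour, min, sec) / 1000);
}

epochSeconds(1970, 1, 1);           // → 0 (the epoch itself)
epochSeconds(2001, 9, 9, 1, 46, 40); // → 1000000000 (the "billionth second")
```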

How many seconds are in a day, month, or year?

There are 86,400 seconds in a day (60 × 60 × 24). A standard 30-day month contains 2,592,000 seconds, and a common (non-leap) year contains 31,536,000 seconds. A leap year has 31,622,400 seconds due to the extra day in February.
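The arithmetic above can be checked directly:

```javascript
// Seconds per calendar unit, built up from 60 s × 60 min × 24 h.
const SECONDS_PER_DAY = 60 * 60 * 24;                   // 86,400
const SECONDS_PER_30_DAY_MONTH = SECONDS_PER_DAY * 30;  // 2,592,000
const SECONDS_PER_COMMON_YEAR = SECONDS_PER_DAY * 365;  // 31,536,000
const SECONDS_PER_LEAP_YEAR = SECONDS_PER_DAY * 366;    // 31,622,400
```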

Can Unix timestamps represent dates before 1970?

Yes — dates before January 1, 1970 are represented as negative Unix timestamps. For example, January 1, 1960 (00:00 UTC) has a timestamp of exactly −315,619,200. Most modern systems handle negative timestamps correctly, though very old or limited implementations may not.
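JavaScript's `Date.UTC`, for one, handles pre-1970 dates without any special casing:

```javascript
// Dates before the epoch simply come out negative.
const jan1960 = Date.UTC(1960, 0, 1) / 1000; // months are 0-based
// jan1960 === -315619200
// (3,653 days × 86,400 seconds: ten 365-day years plus 3 leap days)
```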

What is the maximum date a Unix timestamp can represent?

On 32-bit systems, the maximum Unix timestamp is 2,147,483,647, which corresponds to January 19, 2038 at 03:14:07 UTC. This is known as the Year 2038 problem. On 64-bit systems, timestamps can represent dates billions of years into the future.
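The rollover can be demonstrated by round-tripping that maximum value (JavaScript's 32-bit integer conversion is used here to simulate what a signed 32-bit `time_t` would do):

```javascript
// Largest value a signed 32-bit integer can hold:
const MAX_INT32 = 2 ** 31 - 1; // 2,147,483,647

// Converting it back to a date lands on the 2038 rollover moment:
new Date(MAX_INT32 * 1000).toISOString(); // "2038-01-19T03:14:07.000Z"

// One second later, a 32-bit time_t wraps to a large negative value
// (simulated here with JavaScript's | 0 conversion), which maps back
// to December 13, 1901:
const wrapped = (MAX_INT32 + 1) | 0; // -2147483648
```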

What is the difference between UTC and local time for timestamps?

Unix timestamps are always based on UTC (Coordinated Universal Time). If you enter a local time, the tool converts it to UTC before computing the timestamp. Two people in different timezones who both refer to the same instant in history will get the same Unix timestamp.
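The difference is visible in JavaScript, where `Date.UTC` interprets its arguments as UTC fields while the `Date` constructor interprets them in the runtime's local timezone — a small sketch:

```javascript
// The same calendar fields, read two different ways:
const fromUtc = Date.UTC(2000, 0, 1, 12, 0, 0) / 1000;             // noon UTC
const fromLocal = new Date(2000, 0, 1, 12, 0, 0).getTime() / 1000; // noon local

// The two timestamps differ by exactly the local UTC offset at that
// date (getTimezoneOffset reports UTC minus local, in minutes).
const offsetSeconds = fromLocal - fromUtc;
```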

How do I convert a Unix timestamp to milliseconds?

Simply multiply the Unix timestamp in seconds by 1,000. For example, a timestamp of 1,000,000,000 seconds equals 1,000,000,000,000 milliseconds. Many programming languages and APIs (such as JavaScript's Date.now()) work in milliseconds by default.
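In code, the conversion is a single multiplication (and a truncating division going the other way):

```javascript
const seconds = 1000000000;
const millis = seconds * 1000; // 1,000,000,000,000

// JavaScript's Date APIs work in milliseconds natively:
new Date(millis).toISOString(); // "2001-09-09T01:46:40.000Z"

// Going back to whole seconds, divide and truncate:
const nowSeconds = Math.floor(Date.now() / 1000);
```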

Why do developers use Unix timestamps instead of regular date strings?

Unix timestamps are language-agnostic, timezone-neutral, and sortable as plain integers — making them ideal for databases, APIs, and log systems. Date strings can vary by locale and format (MM/DD/YYYY vs DD/MM/YYYY), introducing ambiguity that timestamps avoid entirely.

What notable historical dates have well-known Unix timestamps?

Some memorable epoch milestones include: 1,000,000,000 (September 9, 2001), 1,234,567,890 (February 13, 2009), and 1,500,000,000 (July 14, 2017). Events like the Apollo 11 moon landing (July 20, 1969) yield a negative timestamp of approximately −14,182,940 because they occurred before 1970.
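Each of these milestones can be verified by round-tripping it through `Date`:

```javascript
// Milestone timestamps mapped back to UTC date strings:
new Date(1000000000 * 1000).toISOString(); // "2001-09-09T01:46:40.000Z"
new Date(1234567890 * 1000).toISOString(); // "2009-02-13T23:31:30.000Z"
new Date(1500000000 * 1000).toISOString(); // "2017-07-14T02:40:00.000Z"

// The Apollo 11 touchdown (July 20, 1969, 20:17:40 UTC) predates the
// epoch, so its timestamp is negative:
const apollo11 = Date.UTC(1969, 6, 20, 20, 17, 40) / 1000; // -14182940
```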

More Time & Date Tools