WebKit / Chrome Timestamp Converter

Enter a WebKit/Chrome timestamp (typically a 17-digit count of microseconds since January 1, 1601 UTC) and convert it to a human-readable date and time, a Unix timestamp, and Unix milliseconds. You can also reverse-convert — enter a Unix timestamp to get the equivalent WebKit timestamp. Useful for analyzing browser cookies, Chrome storage, and WebKit-based browser debugging.

A 17-digit integer representing microseconds since January 1, 1601 00:00 UTC. Found in Chrome cookies, local storage, and browser debug tools.

Optional: enter a Unix timestamp (seconds since Jan 1, 1970) to calculate the equivalent WebKit timestamp.

Results

Human-Readable Date & Time

--

Unix Timestamp (seconds)

--

Unix Timestamp (milliseconds)

--

WebKit Timestamp (from Unix input)

--

Microseconds Since WebKit Epoch (1601-01-01)

--

Frequently Asked Questions

What is a WebKit / Chrome timestamp?

A WebKit/Chrome timestamp is a 64-bit integer representing the number of microseconds elapsed since January 1, 1601 00:00:00 UTC. It is used internally by browsers like Google Chrome, Apple Safari, and other WebKit/Chromium-based browsers to store time values in cookies, local storage, and browser internals.

Why do browsers use this timestamp format?

Chrome and WebKit-based browsers use microseconds since January 1, 1601 UTC to maintain compatibility with the Windows FILETIME format, which also uses that epoch. The microsecond precision provides high-resolution timing needed for browser operations, cookie expiration, and cache management.
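Since the two formats share the 1601 epoch, converting between Windows FILETIME (100-nanosecond intervals) and WebKit timestamps (microseconds) is just a factor-of-ten scaling. A minimal sketch:

```python
def filetime_to_webkit(filetime: int) -> int:
    # FILETIME counts 100-nanosecond ticks since 1601-01-01 UTC;
    # WebKit counts microseconds since the same epoch, so divide by 10.
    return filetime // 10

def webkit_to_filetime(webkit_us: int) -> int:
    # One microsecond is ten 100-nanosecond ticks.
    return webkit_us * 10
```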

How is a WebKit timestamp different from a Unix timestamp?

A Unix timestamp counts seconds (or milliseconds) since January 1, 1970 UTC, while a WebKit timestamp counts microseconds since January 1, 1601 UTC. To convert a WebKit timestamp to Unix time, subtract the offset between the two epochs (11644473600 seconds, or 11644473600000000 microseconds) and divide to adjust the units; to go the other way, multiply to microseconds and add the offset.
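The two conversions described above can be sketched as follows; the only constant involved is the 11644473600000000-microsecond epoch offset quoted in the answer:

```python
# Offset between the WebKit epoch (1601-01-01) and the Unix epoch (1970-01-01).
EPOCH_DIFF_US = 11_644_473_600_000_000  # microseconds

def webkit_to_unix_seconds(webkit_us: int) -> float:
    """Convert a WebKit/Chrome timestamp (microseconds since 1601) to Unix seconds."""
    return (webkit_us - EPOCH_DIFF_US) / 1_000_000

def unix_seconds_to_webkit(unix_s: float) -> int:
    """Convert Unix seconds to a WebKit/Chrome timestamp."""
    return int(unix_s * 1_000_000) + EPOCH_DIFF_US
```

For example, the sample value 13417621812000000 used later on this page converts to Unix timestamp 1773148212.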

How do I find a Chrome timestamp?

Chrome timestamps appear in browser cookies, local storage databases (like the SQLite files in the Chrome profile folder), and browser debugging tools such as Chrome DevTools. Forensic tools and SQLite browsers used in digital investigations commonly surface these values.

Are WebKit timestamps timezone-aware?

WebKit timestamps are stored in UTC and carry no built-in timezone information. When displaying one as a human-readable date, a timezone offset must be applied at display time. This converter lets you view the result in either UTC or your local browser timezone.
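Rendering a WebKit timestamp as a timezone-aware UTC date can be sketched like this, reusing the epoch offset from the conversion FAQ above:

```python
from datetime import datetime, timezone

EPOCH_DIFF_US = 11_644_473_600_000_000  # microseconds between 1601 and 1970 epochs

def webkit_to_datetime_utc(webkit_us: int) -> datetime:
    """Render a WebKit timestamp as a timezone-aware UTC datetime."""
    unix_s = (webkit_us - EPOCH_DIFF_US) / 1_000_000
    return datetime.fromtimestamp(unix_s, tz=timezone.utc)
```

Passing the WebKit value of the Unix epoch itself (11644473600000000) yields 1970-01-01T00:00:00+00:00, a handy sanity check.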

What does a typical WebKit timestamp look like?

A WebKit timestamp is typically a 17-digit integer, for example 13417621812000000. If the value you have is shorter or longer, it may be a different timestamp format such as Unix seconds (10 digits), Unix milliseconds (13 digits), or Windows FILETIME (which shares the same epoch but uses 100-nanosecond intervals).
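The digit-count rule of thumb above can be expressed as a small heuristic. This is an illustration of the heuristic only — digit counts overlap for very old or far-future dates, so treat the result as a guess:

```python
def guess_timestamp_format(value: int) -> str:
    """Classify a positive integer timestamp by digit count (heuristic only)."""
    digits = len(str(value))
    if digits == 10:
        return "unix-seconds"
    if digits == 13:
        return "unix-milliseconds"
    if digits == 17:
        return "webkit-microseconds"
    if digits == 18:
        return "windows-filetime"
    return "unknown"
```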

Can I convert a regular date back to a WebKit timestamp?

Yes. You can enter a Unix timestamp (seconds since 1970) into the reverse-convert field on this tool and it will calculate the equivalent WebKit timestamp. You can obtain a Unix timestamp from any standard date by using a Unix epoch converter.

What is the current WebKit timestamp?

The current WebKit timestamp is calculated by taking the current Unix time in microseconds and adding the offset between the WebKit epoch (1601-01-01) and the Unix epoch (1970-01-01), which is 11644473600000000 microseconds. The value grows continuously and is currently around 13.4 quadrillion.
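The calculation described above amounts to one addition on top of the system clock:

```python
import time

EPOCH_DIFF_US = 11_644_473_600_000_000  # microseconds between 1601 and 1970 epochs

def current_webkit_timestamp() -> int:
    """Current WebKit/Chrome timestamp: Unix microseconds plus the epoch offset."""
    return int(time.time() * 1_000_000) + EPOCH_DIFF_US
```

At the time of writing, the result is a 17-digit number in the 13-quadrillion range.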

More Time & Date Tools