Everything you need to know about Unix timestamps — what they are, how to work with them in any language, and common pitfalls.
A Unix timestamp is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC — known as the Unix epoch.
In JavaScript:

```js
// Current Unix timestamp (seconds); Date.now() returns milliseconds
const now = Math.floor(Date.now() / 1000);

// Convert a Unix timestamp to a Date (Date expects milliseconds, so multiply by 1000)
const date = new Date(1708560000 * 1000);
console.log(date.toISOString()); // '2024-02-22T00:00:00.000Z'
```
Why 1970? When Unix was developed, its creators picked January 1, 1970 as an arbitrary but convenient starting point. Unix time counts seconds elapsed, making date arithmetic trivially simple:
```js
const ONE_DAY = 60 * 60 * 24; // 86400 seconds
const thirtyDaysAgo = Math.floor(Date.now() / 1000) - 30 * ONE_DAY;
```
| System | Unit | Example |
|---|---|---|
| Unix | Seconds | 1708560000 (10 digits) |
| JavaScript | Milliseconds | 1708560000000 (13 digits) |
| Python | Seconds | time.time() |
| Java | Milliseconds | System.currentTimeMillis() |
Quick check: 13 digits = milliseconds, 10 digits = seconds.
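If you handle timestamps from mixed sources, that digit-count rule translates directly into code. Here is a minimal JavaScript sketch; the helper name `toSeconds` and the `1e12` cutoff are illustrative assumptions, not part of any standard API:

```js
// Hypothetical helper: normalize a timestamp to seconds.
// Assumes values of 1e12 or more (13 digits) are milliseconds; smaller values are seconds.
function toSeconds(ts) {
  return ts >= 1e12 ? Math.floor(ts / 1000) : Math.floor(ts);
}

toSeconds(1708560000);    // 1708560000 (already seconds)
toSeconds(1708560000000); // 1708560000 (milliseconds, converted)
```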
32-bit systems store Unix timestamps as a signed 32-bit integer, which overflows at 2,147,483,647 — January 19, 2038. Modern 64-bit systems won't have this problem for ~292 billion years.
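As a quick sanity check, you can map the maximum signed 32-bit value to a date. This sketch uses JavaScript, whose numbers are 64-bit floats and so are unaffected by the 32-bit limit:

```js
// 2^31 - 1 is the last second a signed 32-bit time_t can represent
const max32 = 2 ** 31 - 1; // 2147483647
console.log(new Date(max32 * 1000).toISOString()); // '2038-01-19T03:14:07.000Z'
```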
Getting the current Unix timestamp in other languages:

```python
import time
now = int(time.time())  # seconds (time.time() returns a float)
```

```go
now := time.Now().Unix() // seconds (requires the "time" package)
```

```rust
use std::time::{SystemTime, UNIX_EPOCH};
let now = SystemTime::now().duration_since(UNIX_EPOCH).unwrap().as_secs();
```
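Going the other direction, from a human-readable date back to a Unix timestamp, is a one-liner in most languages. A JavaScript sketch (the ISO string is just the example date from above):

```js
// Parse an ISO 8601 UTC string, then convert milliseconds to seconds
const ts = Math.floor(new Date('2024-02-22T00:00:00Z').getTime() / 1000);
console.log(ts); // 1708560000
```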
Use the Timestamp Converter to convert any Unix timestamp to a human-readable date.