Unix Timestamps: The 10 vs 13 Digit Problem That Breaks More APIs Than You Think


via Dev.to WebdevHAU

There's a bug pattern I've seen in production more times than I can count. It goes like this:

```javascript
const timestamp = Date.now(); // Returns something like 1742812800000
fetch(`/api/events?since=${timestamp}`);
```

The backend receives 1742812800000 and interprets it as a Unix timestamp in seconds, which translates to the year 57,212. Cue the confused "why are no events showing up" support ticket. This is the 10-digit vs 13-digit timestamp problem, and it trips up even experienced developers.

What Is a Unix Timestamp?

A Unix timestamp counts the number of seconds (or milliseconds) elapsed since January 1, 1970, 00:00:00 UTC, also called the Unix epoch.

- 1742812800 (10 digits) → seconds since epoch
- 1742812800000 (13 digits) → milliseconds since epoch

The same moment in time, just different units. The problem is that different languages and APIs use different conventions:

| Language/Platform | Default Unit |
| --- | --- |
| JavaScript `Date.now()` | Milliseconds |
| JavaScript `new Date().getTime()` | Milliseconds |

Pyth
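One common defense against this mismatch (not shown in the excerpt above, so treat it as a sketch) is for the backend to normalize whatever it receives into one unit. The helper name and the cutoff below are my own choices: any value under 10^12 is assumed to be seconds, since 10^12 milliseconds is only September 2001 while 10^12 seconds is tens of thousands of years away, so contemporary timestamps in the two units never overlap.

```javascript
// Hypothetical helper: accept a Unix timestamp in either seconds
// (10 digits) or milliseconds (13 digits) and return milliseconds.
function normalizeToMillis(timestamp) {
  const n = Number(timestamp);
  if (!Number.isFinite(n) || n < 0) {
    throw new RangeError(`not a valid Unix timestamp: ${timestamp}`);
  }
  // Values below 1e12 are far too small to be modern millisecond
  // timestamps, so treat them as seconds and scale up.
  return n < 1e12 ? n * 1000 : n;
}

// Both encodings of the same instant come out identical:
// normalizeToMillis(1742812800)    -> 1742812800000
// normalizeToMillis(1742812800000) -> 1742812800000
```

The trade-off of a heuristic like this is that it silently misreads genuinely pre-2001 millisecond values; if your API can carry historical timestamps, an explicit unit in the contract beats guessing.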
