
Unix Timestamp Converter

Convert epoch time to dates and vice versa. Supports timezones, milliseconds, and batch conversion.

About the Calculator

Unix time (also called epoch time) is the number of seconds elapsed since January 1, 1970 00:00:00 UTC. This converter turns timestamps into human-readable dates and back. Use it when working with APIs, databases, or logs that store times as numbers. It handles seconds, milliseconds, microseconds, and nanoseconds, and shows the result across common timezones.

The Converter


Example output for Unix timestamp 1735689600 (seconds):

  • UTC → Wednesday, January 1, 2025 at 12:00:00 AM
  • New York (EST/EDT) → Tuesday, December 31, 2024 at 7:00:00 PM
  • Los Angeles (PST/PDT) → Tuesday, December 31, 2024 at 4:00:00 PM
  • London (GMT/BST) → Wednesday, January 1, 2025 at 12:00:00 AM
  • Tokyo (JST) → Wednesday, January 1, 2025 at 9:00:00 AM
  • Sydney (AEDT/AEST) → Wednesday, January 1, 2025 at 11:00:00 AM
  • ISO 8601 → 2025-01-01T00:00:00.000Z
  • RFC 2822 → Wed, 01 Jan 2025 00:00:00 GMT

Definition

What is a Unix timestamp?

A Unix timestamp is the number of seconds since Jan 1, 1970 (UTC).

  • Also called: Epoch time, POSIX time
  • Example: 1715990400 = May 18, 2024 (UTC midnight)

Unix Timestamp Cheat Sheet

  • Seconds → 10 digits
  • Milliseconds → 13 digits
  • Microseconds → 16 digits
  • Nanoseconds → 19 digits
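The digit counts above give a quick heuristic for detecting the unit of an unfamiliar timestamp. A minimal JavaScript sketch (the helper name detectUnit is ours, not part of the converter):

```javascript
// Guess the unit of a Unix timestamp from its digit count.
function detectUnit(ts) {
  const digits = String(Math.trunc(Math.abs(ts))).length;
  if (digits <= 10) return "seconds";
  if (digits <= 13) return "milliseconds";
  if (digits <= 16) return "microseconds";
  return "nanoseconds";
}

console.log(detectUnit(1716153600));     // seconds (10 digits)
console.log(detectUnit(1716153600000));  // milliseconds (13 digits)
```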

The Formula

A Unix timestamp is the number of seconds since Jan 1, 1970 00:00:00 UTC.

  • To convert to a date (JavaScript): new Date(timestamp * 1000)
  • To get the current timestamp: Math.floor(Date.now() / 1000)
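The formula above as a runnable round trip:

```javascript
const ts = 1716153600;                     // seconds since the epoch
const date = new Date(ts * 1000);          // JavaScript's Date takes milliseconds
console.log(date.toISOString());           // 2024-05-19T21:20:00.000Z

const back = Math.floor(date.getTime() / 1000);
console.log(back === ts);                  // true
```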

The 1969 Bug

Why does my timestamp show 1969 or 1970?

If your date appears as 1969 or 1970, it usually means your timestamp is incorrect. This commonly happens when:

  • You used seconds instead of milliseconds (or vice versa)
  • The timestamp value is missing or invalid
  • Your code failed to parse the date correctly

Deeper explanation

If you ever see a date like:

  • December 31, 1969
  • January 1, 1970
  • Or something very close to those

It almost always means your Unix timestamp is wrong (or missing).

Why this happens

Unix time starts at January 1, 1970 (00:00:00 UTC) - the Unix epoch.

When something goes wrong, your code often falls back to:

  • 0 → Jan 1, 1970
  • -1 → Dec 31, 1969
  • null / undefined → often coerced to 0
  • Failed parse → defaults to epoch
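These fallbacks are easy to reproduce in JavaScript:

```javascript
console.log(new Date(0).toISOString());       // 1970-01-01T00:00:00.000Z
console.log(new Date(-1).toISOString());      // 1969-12-31T23:59:59.999Z
console.log(new Date(null).toISOString());    // null coerces to 0 → 1970-01-01T00:00:00.000Z
console.log(new Date(undefined).toString());  // Invalid Date
```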

The most common causes

1. Seconds vs milliseconds mismatch (most common)

new Date(1716153600) // Wrong - treated as milliseconds

This produces a date near 1970.

Correct:

new Date(1716153600 * 1000)

JavaScript's Date expects milliseconds, not seconds.

2. Null / undefined values

new Date(undefined) // "Invalid Date"
new Date(null)      // null coerces to 0 → Jan 1, 1970 (UTC)

In JavaScript, undefined yields "Invalid Date", while null silently coerces to 0 and lands on the epoch.

3. Parsing failure

new Date("bad-data")

In JavaScript this yields "Invalid Date"; in other languages and frameworks, a failed parse may silently default to the epoch.

4. Backend returning wrong unit

For example: the API sends seconds but the frontend expects milliseconds - dates land near 1970.

5. Negative timestamps

new Date(-1000)

Returns Dec 31, 1969 - valid for pre-epoch times, but often unintended.

Quick debugging rule

If you see 1969 or 1970, check this immediately:

  • Is the value in seconds vs milliseconds?
  • Is the value null, undefined, or NaN?
  • Is the API returning the unit you expect?
  • Are you multiplying or dividing by 1000 correctly?
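The checklist above can be folded into a small defensive helper. A sketch, assuming the usual 10-vs-13-digit convention (the name normalizeToMillis and the 1e12 cutoff are ours):

```javascript
// Normalize a suspect timestamp to milliseconds, or return null if unusable.
// Heuristic: values below 1e12 are assumed to be seconds (10 digits),
// values at or above 1e12 are assumed to be milliseconds (13 digits).
function normalizeToMillis(ts) {
  if (ts == null) return null;             // catches null and undefined
  const n = Number(ts);
  if (Number.isNaN(n)) return null;        // catches failed parses
  return Math.abs(n) < 1e12 ? n * 1000 : n;
}

console.log(normalizeToMillis(1716153600));     // 1716153600000 (seconds → ms)
console.log(normalizeToMillis(1716153600000));  // 1716153600000 (already ms)
console.log(normalizeToMillis("bad-data"));     // null
```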

Developer shortcut

  • 1970 → often a "zero" or default epoch problem
  • 1969 → often negative or broken input

Real-world example

You expect 1716153600 → 2024, but you write:

new Date(1716153600)

Result: around January 20, 1970 (UTC) - because the value was treated as milliseconds, not seconds. Fix: multiply by 1000 when building the Date.

Convert Unix Timestamp in Popular Programming Languages

Example timestamp used: 1716153600 → May 19, 2024 (UTC)

JavaScript (Node.js / Browser)

const timestamp = 1716153600;
const date = new Date(timestamp * 1000);
console.log(date.toUTCString());

Python

from datetime import datetime, timezone

timestamp = 1716153600
date = datetime.fromtimestamp(timestamp, tz=timezone.utc)
print(date)

Java

import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

long timestamp = 1716153600L;
ZonedDateTime date = Instant.ofEpochSecond(timestamp).atZone(ZoneId.of("UTC"));
System.out.println(date);

C#

using System;

long timestamp = 1716153600;
DateTime date = DateTimeOffset.FromUnixTimeSeconds(timestamp).UtcDateTime;
Console.WriteLine(date);

Go

package main
import (
    "fmt"
    "time"
)

func main() {
    timestamp := int64(1716153600)
    date := time.Unix(timestamp, 0).UTC()
    fmt.Println(date)
}

PHP

<?php
$timestamp = 1716153600;
echo gmdate("Y-m-d H:i:s", $timestamp);
?>

Ruby

timestamp = 1716153600
date = Time.at(timestamp).utc
puts date

Swift

import Foundation

let timestamp: TimeInterval = 1716153600
let date = Date(timeIntervalSince1970: timestamp)
print(date)

Kotlin

import java.time.Instant
import java.time.ZoneOffset

fun main() {
    val timestamp = 1716153600L
    val date = Instant.ofEpochSecond(timestamp).atOffset(ZoneOffset.UTC)
    println(date)
}

Rust

use chrono::{TimeZone, Utc};

fn main() {
    let timestamp = 1716153600;
    let date = Utc.timestamp_opt(timestamp, 0).unwrap();
    println!("{}", date);
}

C

#include <stdio.h>
#include <time.h>

int main() {
    time_t timestamp = 1716153600;
    printf("%s", asctime(gmtime(&timestamp)));
    return 0;
}

C++

#include <iostream>
#include <ctime>

int main() {
    std::time_t timestamp = 1716153600;
    std::cout << std::asctime(std::gmtime(&timestamp));
    return 0;
}

Bash

date -u -d @1716153600    # GNU date (Linux); on macOS/BSD use: date -u -r 1716153600

PowerShell

$timestamp = 1716153600
$date = [DateTimeOffset]::FromUnixTimeSeconds($timestamp).UtcDateTime
Write-Output $date

SQL (PostgreSQL)

SELECT to_timestamp(1716153600);

SQL (MySQL)

SELECT FROM_UNIXTIME(1716153600);

R

timestamp <- 1716153600
date <- as.POSIXct(timestamp, origin="1970-01-01", tz="UTC")
print(date)

MATLAB

timestamp = 1716153600;
date = datetime(timestamp, 'ConvertFrom', 'posixtime', 'TimeZone', 'UTC');
disp(date);

Dart

void main() {
  int timestamp = 1716153600;
  DateTime date = DateTime.fromMillisecondsSinceEpoch(timestamp * 1000, isUtc: true);
  print(date);
}

TypeScript

const timestamp: number = 1716153600;
const date: Date = new Date(timestamp * 1000);
console.log(date.toUTCString());

🔥 Pro Tips

Common Gotcha

  • JavaScript uses milliseconds, so multiply by 1000
  • Most other languages use seconds

Quick Rule

  • 10 digits → seconds
  • 13 digits → milliseconds

FAQ

What is a Unix timestamp?

A Unix timestamp is the number of seconds that have elapsed since January 1, 1970 (00:00:00 UTC), also known as the Unix epoch. It is used by computers to store and track time in a simple numeric format that is easy to calculate and compare.


Why are Unix timestamps used?

Unix timestamps are widely used because they are:

  • Simple (just a number)
  • Timezone-independent (always based on UTC)
  • Easy to compare and sort
  • Efficient for storage in databases and APIs

They are commonly used in programming, databases, logging systems, and APIs.
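Because timestamps are plain numbers, comparing and sorting needs no date parsing at all:

```javascript
const events = [
  { name: "login",  ts: 1716153600 },  // May 19, 2024
  { name: "signup", ts: 1715990400 },  // May 18, 2024
];

// Chronological order is just numeric order.
events.sort((a, b) => a.ts - b.ts);
console.log(events.map(e => e.name));  // ["signup", "login"]
```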


How do I convert a Unix timestamp to a readable date?

To convert a Unix timestamp:

  1. Take the timestamp value (e.g., 1716153600)
  2. Interpret it as seconds since Jan 1, 1970 (UTC)
  3. Convert it into a human-readable date using a tool or programming language

Example: 1716153600 → May 19, 2024 (UTC)

You can use the converter on this page to instantly convert timestamps in both directions.


What is the difference between seconds and milliseconds in timestamps?

Unix timestamps can be stored in different units:

  • Seconds (10 digits) → 1716153600
  • Milliseconds (13 digits) → 1716153600000

Key difference:

  • JavaScript typically uses milliseconds
  • Most other systems use seconds

If your timestamp has 13 digits, divide by 1000 to convert to seconds.


Are Unix timestamps always in UTC?

Yes. Unix timestamps are always based on UTC (Coordinated Universal Time).

  • They do not include timezone information
  • Timezone is only applied when displaying the date

This ensures consistency across systems worldwide.
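You can see this in JavaScript: the instant is fixed, and the timezone is applied only when formatting (the exact output strings depend on the runtime's locale data):

```javascript
const date = new Date(1735689600 * 1000);  // Jan 1, 2025 00:00:00 UTC

// Same timestamp, three different renderings.
console.log(date.toLocaleString("en-US", { timeZone: "UTC" }));
console.log(date.toLocaleString("en-US", { timeZone: "America/New_York" }));
console.log(date.toLocaleString("en-US", { timeZone: "Asia/Tokyo" }));
```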


Can Unix timestamps be negative?

Yes. Negative Unix timestamps represent dates before January 1, 1970.

Example: -1 → December 31, 1969 (UTC)

This is useful for historical dates and certain database systems.


What is the 2038 problem?

The 2038 problem is a limitation in older 32-bit systems where Unix timestamps will overflow on:

January 19, 2038 at 03:14:07 UTC

This happens because the timestamp exceeds the maximum value a 32-bit signed integer can store.

Modern systems use 64-bit integers, which solve this issue.
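JavaScript numbers are not limited to 32 bits, so the boundary can be illustrated directly (the `| 0` trick simulates 32-bit signed overflow):

```javascript
const max32 = 2 ** 31 - 1;  // 2147483647, largest 32-bit signed integer
console.log(new Date(max32 * 1000).toISOString());    // 2038-01-19T03:14:07.000Z

// One second later wraps to a large negative value on 32-bit systems:
const wrapped = (max32 + 1) | 0;  // | 0 truncates to 32 bits → -2147483648
console.log(new Date(wrapped * 1000).toISOString());  // 1901-12-13T20:45:52.000Z
```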


Why is my Unix timestamp wrong?

Common causes include:

  • Using milliseconds instead of seconds (or vice versa)
  • Timezone confusion (timestamps are always UTC)
  • Forgetting to multiply by 1000 in JavaScript
  • Parsing errors in code

Always check the number of digits:

  • 10 digits = seconds
  • 13 digits = milliseconds

What are Unix timestamps used for in real applications?

Unix timestamps are commonly used in:

  • APIs (sending and receiving time data)
  • Databases (sorting and indexing records)
  • Logging systems (tracking events)
  • Authentication (JWT expiration times)
  • Scheduling systems

They provide a consistent and efficient way to handle time across systems.


What is the difference between Unix time and ISO 8601?

  • Unix timestamp → Numeric format (e.g., 1716153600)
  • ISO 8601 → Human-readable format (e.g., 2024-05-19T21:20:00Z for the same instant as 1716153600)

Unix timestamps are better for computation, while ISO 8601 is better for readability.
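Converting between the two formats in JavaScript is a one-liner in each direction:

```javascript
const ts = 1716153600;

// Unix → ISO 8601
const iso = new Date(ts * 1000).toISOString();
console.log(iso);           // 2024-05-19T21:20:00.000Z

// ISO 8601 → Unix
const back = Math.floor(Date.parse(iso) / 1000);
console.log(back === ts);   // true
```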


Does daylight saving time affect Unix timestamps?

No. Unix timestamps are based entirely on UTC, so they are not affected by daylight saving time (DST).

DST only affects how timestamps are displayed in local timezones.


How accurate are Unix timestamps?

Standard Unix timestamps are accurate to seconds, but they can also be stored in:

  • Milliseconds
  • Microseconds
  • Nanoseconds

Higher precision is used in systems that require exact timing, such as financial systems or high-performance computing.
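In JavaScript, milliseconds are the native resolution of Date; the digit counts for the current instant line up with the cheat sheet above:

```javascript
const ms = Date.now();             // milliseconds: 13 digits for current dates
const s = Math.floor(ms / 1000);   // seconds: 10 digits for current dates
console.log(String(s).length, String(ms).length);
```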

🎉 Fun Facts

  • Unix time started on January 1, 1970 at midnight UTC, chosen when Unix was developed at Bell Labs.
  • The Year 2038 problem: 32-bit systems will overflow at timestamp 2,147,483,647 (Jan 19, 2038). Most modern systems use 64-bit.
  • Unix time ignores leap seconds. Every day is treated as exactly 86,400 seconds.
  • JavaScript uses milliseconds (13 digits), while many APIs use seconds (10 digits). Always check the format.