How can I get UTC time in milliseconds since January 1, 1970 in the C language
Is there any way to get the milliseconds (and their fractional part) elapsed since 1970 using time.h in C?

Mccreery answered 23/12, 2009 at 11:40 Comment(2)
You can't get the fraction part in a platform-independent way. Which platform are you focusing on? – Wurth
I am following the ANSI C standard so that my application will be platform independent. Currently I am on the Windows platform. – Mccreery
This works on Ubuntu Linux:

#include <stdio.h>
#include <sys/time.h>

...

struct timeval tv;
gettimeofday(&tv, NULL);

unsigned long long millisecondsSinceEpoch =
    (unsigned long long)(tv.tv_sec) * 1000 +
    (unsigned long long)(tv.tv_usec) / 1000;

printf("%llu\n", millisecondsSinceEpoch);

At the time of this writing, the printf() above is giving me 1338850197035. You can do a sanity check at the TimestampConvert.com website where you can enter the value to get back the equivalent human-readable time (albeit without millisecond precision).

Obe answered 4/6, 2012 at 22:52 Comment(0)
If you want millisecond resolution, you can use gettimeofday() in POSIX. For a Windows implementation see gettimeofday function for Windows.

#include <sys/time.h>

...

struct timeval tp;
gettimeofday(&tp, NULL);
long long ms = (long long)tp.tv_sec * 1000 + tp.tv_usec / 1000;
Pteryla answered 23/12, 2009 at 12:7 Comment(2)
It should probably be gettimeofday(&tv, NULL) – Savior
The link in the answer is dead. The domain doesn't exist. – Dishrag
It's not standard C, but gettimeofday() is present in both SysV and BSD derived systems, and is in POSIX. It returns the time since the epoch in a struct timeval:

struct timeval {
    time_t      tv_sec;     /* seconds */
    suseconds_t tv_usec;    /* microseconds */
};
Kelikeligot answered 23/12, 2009 at 12:4 Comment(0)
For Unix and Linux you could use gettimeofday.

For Win32 you could use GetSystemTimeAsFileTime and then convert it to time_t + milliseconds:

void FileTimeToUnixTime(FILETIME ft, time_t* t, int* ms)
{
  LONGLONG ll = ft.dwLowDateTime | ((LONGLONG)ft.dwHighDateTime << 32);
  ll -= 116444736000000000LL;            /* offset between the 1601 and 1970 epochs, in 100 ns ticks */
  *ms = (int)((ll % 10000000) / 10000);  /* leftover 100 ns ticks -> milliseconds */
  ll /= 10000000;                        /* 100 ns ticks -> seconds */
  *t = (time_t)ll;
}
Wurth answered 23/12, 2009 at 12:4 Comment(4)
gettimeofday() does what is needed. Here's a code example: docs.hp.com/en/B9106-90009/gettimeofday.2.html So what OSes is Aman trying to support? – Eadie
First link is dead. – Chong
@Chong Hence why it's site policy to quote the relevant passage of (or summarize) any link you include. – Farlay
The second link is dead – Degreeday
ULONGLONG PosixTimeMilliseconds(void)
{
    // the system time
    SYSTEMTIME systemTime;
    GetSystemTime(&systemTime);

    // the current file time
    FILETIME fileTime;
    SystemTimeToFileTime(&systemTime, &fileTime);

    // file time in 100-nanosecond resolution
    ULONGLONG fileTimeNano100 =
        (((ULONGLONG)fileTime.dwHighDateTime) << 32) + fileTime.dwLowDateTime;

    // convert to milliseconds and remove the Unix/Windows epoch offset
    ULONGLONG posixTime = fileTimeNano100 / 10000 - 11644473600000ULL;
    return posixTime;
}
Hazen answered 21/10, 2014 at 12:0 Comment(1)
How do I write a good answer to a question? - Meta Stack Exchange. Your answer seems to be heavily reliant on a specific OS/environment (what about non-POSIX environments?), and lacks context. For example, POSIX cstdlib does not use all-capital type names for any of the datatypes it specifies--those had to be defined elsewhere. You appear to be using a library. It's fine to mention that "this answer pertains to X class of systems with Y library, I am not sure how to do it with Z system, though", but code-only answers like this are hardly useful – Farlay
Unix time or Posix time is the time in seconds since the epoch you mentioned.

bzabhi's answer is correct: you simply multiply the Unix timestamp by 1000 to get milliseconds.

Be aware that any millisecond value obtained by relying on the Unix timestamp alone will be a multiple of 1000 (like 12345678000); the resolution is still only 1 second.

You can't get the fraction part

The comment from Pavel is correct also. The Unix timestamp does not take into account leap seconds. This makes it even less wise to rely on a conversion to milliseconds.

Expansion answered 23/12, 2009 at 11:52 Comment(2)
Is there any other library which can get the exact millisecond, including its fractional part? – Mccreery
The Unix timestamp is about as fundamental as we can go. It must be that the designers of Unix thought one-second resolution was enough. Then again, the overhead of maintaining 1 ms resolution was probably beyond early Unix systems. – Expansion