How to safely import timestamps with nanosecond precision
I discovered this morning that the bulk of timestamp formats in R seem to be based on the POSIXct class, which seems risky for use with nanosecond timestamps due to rounding and accumulation errors. Is this true?

If so, what packages and processing steps are needed to safely import timestamps with nanosecond precision, probably from CSV files? (Preferably staying with packages within the tidyverse.)

The visualization tools currently used for output are ggplot2, plotly, and d3.

Ricebird answered 23/10, 2020 at 20:32 Comment(1)
Have a look at nanotime as used in the answer here. – Hudspeth
We wrote a package for that: nanotime

It relies on the standard "number of nanoseconds since the epoch, stored in an int64" representation; package bit64 supplies the integer64 type. Internally, package RcppCCTZ is used for some of the parsing, formatting, and more. One package that already works well with integer64, and hence with our nanotime objects, is data.table.
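A minimal sketch of working with nanotime, assuming a CSV whose timestamp column holds ISO-8601 strings; the file and column names in the commented import step are hypothetical:

```r
library(nanotime)   # nanosecond-resolution timestamp class
library(bit64)      # integer64 backing type
library(data.table) # fread() preserves integer64 columns

# Parse an ISO-8601 string with nanosecond precision (UTC offset required)
x <- as.nanotime("2020-10-23T20:32:15.123456789+00:00")

# Or construct directly from a count of nanoseconds since the epoch
n <- nanotime(as.integer64("1603485135123456789"))  # the same instant as x

# Hypothetical CSV import: read the column as character, then convert,
# so the nanosecond digits never pass through a double
# dt <- fread("ticks.csv")
# dt[, ts := as.nanotime(ts)]
```

Reading the column as numeric instead would silently round, since a double cannot represent every nanosecond count once the integer exceeds 2^53.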

Stonge answered 23/10, 2020 at 21:4 Comment(2)
Thanks! Are these objects convertible to and from tibbles, or will that destroy the timestamp precision? Any practice-oriented examples using nanotime, including data cleaning? – Ricebird
That may be a good exercise for you to actually try, check, and work out. I happen to like data.table a lot, and there it works. The key is to not lose the specific attribute, which tibbles may well get right---I simply never tested that. But there is nothing else in R for nanosecond resolution, so it's your call. Oh, and none of the plotting packages will know what to do. You will want to downsample, or cast the indices of whatever you try to plot to POSIXct. – Stonge
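As a sketch of that last point, a nanotime index can be cast down to POSIXct before handing it to a plotting package; the object names here are illustrative:

```r
library(nanotime)

x <- as.nanotime("2020-10-23T20:32:15.123456789+00:00")

# nanotime supplies an as.POSIXct() method; precision drops to what a
# double-backed POSIXct can hold (roughly microseconds for current dates),
# so the sub-microsecond digits are lost in the cast
p <- as.POSIXct(x)

# ggplot2, plotly, etc. understand POSIXct axes, so a downsampled or
# cast index can be plotted directly
```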
