Import multiple JSON files from a directory and attach the data
I am trying to read multiple JSON files from a working directory so that I can convert them into a dataset. I have files text1, text2, text3 in the directory json. Here is the code I wrote:

setwd("Users/Desktop/json")
temp = list.files(pattern="text*.")
myfiles = lapply(temp, read.delim)
library("rjson")
json_file <- "myfiles"
library(jsonlite)
out <- jsonlite::fromJSON(json_file)
out[vapply(out, is.null, logical(1))] <- "none"
data.frame(out, stringsAsFactors = FALSE)[,1:5]
View(out)

I have about 200 files, so I was wondering if there is a way to import all the JSON files at once.

Thanks

Wrack answered 14/11, 2014 at 20:49 Comment(1)
Take a look at this solution as well. – Budgerigar

I think I had a similar problem when working with Twitter data: I had a directory containing a separate file for each user name, and I wanted to import and analyze them as a group. This worked for me:

library(rjson)

# list.files() interprets `pattern` as a regular expression, not a glob,
# so match files ending in ".json"; full.names=TRUE returns full paths.
# The result is a character vector with one entry per matching file.
filenames <- list.files("Users/Desktop/json", pattern = "\\.json$", full.names = TRUE)

# a list in which each element is the parsed contents of one JSON file
myJSON <- lapply(filenames, function(x) fromJSON(file = x))

If this doesn't work, then I need a bit more information to understand your problem.
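Since you said you want a dataset, here is a sketch of one way to flatten the parsed files into a single data frame. It assumes each file holds a single flat JSON object with the same fields, and it uses jsonlite rather than rjson (jsonlite's `fromJSON` also accepts a file path); the `read_json_dir` helper name is my own invention.

```r
library(jsonlite)

# read_json_dir(): gather all *.json files under `path` into one data frame,
# assuming each file holds one flat JSON object with the same fields
read_json_dir <- function(path) {
  filenames <- list.files(path, pattern = "\\.json$", full.names = TRUE)
  records <- lapply(filenames, function(f) {
    rec <- fromJSON(f)
    # replace missing (null) values with "none", as in the question
    rec[vapply(rec, is.null, logical(1))] <- "none"
    as.data.frame(rec, stringsAsFactors = FALSE)
  })
  # stack the per-file one-row data frames into one dataset
  do.call(rbind, records)
}
```

Usage would then be something like `dataset <- read_json_dir("Users/Desktop/json")`. If the files do not all share the same fields, `do.call(rbind, ...)` will fail, and you would need a fill-aware binder instead.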

Asphalt answered 9/4, 2015 at 17:51 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.