httr GET operation unable to access JSON response

I am trying to access the JSON response from an API call in my R script. The API call is successful, and I can view the JSON response in the console. However, I am unable to access any data from it.
A sample code segment is:
require(httr)
require(urltools)   # provides url_encode(); base R's URLencode() would also work

target <- '#trump'
sentence <- 'Donald trump has a wonderful toupe, it really is quite stunning that a man can be so refined and elegant'
query <- url_encode(sentence)
target <- gsub('#', '', target)
endpoint <- "https://alchemy.p.mashape.com/text/TextGetTargetedSentiment?outputMode=json&target="
apiCall <- paste(endpoint, target, '&text=', query, sep = '')
resp <- GET(apiCall, add_headers("X-Mashape-Key" = sentimentKey, "Accept" = "application/json"))
stop_for_status(resp)
headers(resp)
str(content(resp))
content(resp, "text")
I followed the examples in the httr quickstart guide from CRAN (here) as well as a related Stack Overflow answer.
Unfortunately, I keep getting either "unused parameters 'text' in content()" or "no definition exists for content() accepting a class of 'response'". Does anyone have any advice? PS: the headers will print, and resp$content will print the raw byte stream.

Expanding on the comment, you need to set the content type explicitly in the call to content(...). Since your code is not reproducible, here is an example using the Census Bureau's geocoder (which returns a json response).
library(httr)
url <- "http://geocoding.geo.census.gov/geocoder/locations/onelineaddress"
resp <- GET(url, query = list(address = "1600 Pennsylvania Avenue, Washington DC",
                              benchmark = 9,
                              format = "json"))
json <- content(resp, type="application/json")
json$result$addressMatches[[1]]$coordinates
# $x
# [1] -77.038025
#
# $y
# [1] 38.898735
Assuming you are actually getting a JSON response, and that it is well-formed, simply using content(resp, type="application/json") should work.
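Applied back to the original call, a minimal sketch (assuming sentimentKey holds a valid Mashape key and the Alchemy endpoint from the question is still reachable) would be:
resp <- GET(apiCall, add_headers("X-Mashape-Key" = sentimentKey,
                                 "Accept" = "application/json"))
stop_for_status(resp)

# parse the body as JSON; the result is a plain R list you can index into
parsed <- content(resp, type = "application/json")
str(parsed)

# or take the raw text and parse it yourself with jsonlite
txt <- content(resp, as = "text", encoding = "UTF-8")
parsed2 <- jsonlite::fromJSON(txt)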

Related

Strange behaviour. `read` works when issuing commands, but not inside a function

I am just getting restarted with Julia (I made some attempts a couple of years ago, but the libraries were still missing too much stuff).
I am now trying something really simple and can't figure out why it doesn't work.
If I run these very same commands directly outside a function, I get what I want, but if I put them inside a function, I get an error when calling read inside my read_datafile function:
using ArgParse, ZipFile, CSV, DataFrames

function read_datafile(fp)
    z = ZipFile.Reader(fp)
    a = z.files[1]
    df = DataFrame(CSV.File(read(a)))
    return df
end

read_datafile("./folder1/test.zip")
SystemError: seek: Bad file descriptor
Stacktrace:
 [1] #systemerror#48 at ./error.jl:167 [inlined]
 [2] systemerror at ./error.jl:167 [inlined]
 [3] seek at ./iostream.jl:129 [inlined]
 [4] read(::ZipFile.ReadableFile, ::Int64) at /home/morgado/.julia/packages/ZipFile/fdYkP/src/ZipFile.jl:508
 [5] read at /home/morgado/.julia/packages/ZipFile/fdYkP/src/ZipFile.jl:504 [inlined]
 [6] read_datafile(::String) at ./In[14]:4
 [7] top-level scope at In[15]:1
EDIT:
Added more info.
using Pkg; Pkg.status()
Status `~/.julia/environments/v1.5/Project.toml`
[c7e460c6] ArgParse v1.1.1
[336ed68f] CSV v0.8.3
[a93c6f00] DataFrames v0.21.8
[92fee26a] GZip v0.5.1
[7073ff75] IJulia v1.23.1
[6f49c342] RCall v0.13.10
[fd094767] Suppressor v0.2.0
[70df011a] TableReader v0.4.0
[a5390f91] ZipFile v0.9.3
I found the answer: it's a five-year-old unsolved bug in the ZipFile package :( https://github.com/fhs/ZipFile.jl/issues/14
The workaround is to write the function with a global variable:
function read_datafile(fp)
    global z = ZipFile.Reader(fp)
    a = z.files[1]
    df = DataFrame(CSV.File(read(a)))
    return df
end

How Do I Consume an Array of JSON Objects using Plumber in R

I have been experimenting with Plumber in R recently, and am having success when I pass the following data using a POST request;
{"Gender": "F", "State": "AZ"}
This allows me to write a function like the following to return the data.
#* @post /score
score <- function(Gender, State){
  data <- list(
    Gender = as.factor(Gender)
    , State = as.factor(State)
  )
  return(data)
}
However, when I try to POST an array of JSON objects, I can't seem to access the data through the function
[{"Gender":"F","State":"AZ"},{"Gender":"F","State":"NY"},{"Gender":"M","State":"DC"}]
I get the following error
{
"error": [
"500 - Internal server error"
],
"message": [
"Error in is.factor(x): argument \"Gender\" is missing, with no default\n"
]
}
Does anyone have an idea of how Plumber parses JSON? I'm not sure how to access and assign the fields to vectors to score the data.
Thanks in advance
I see two possible solutions here. The first is a command-line-based approach, which I assume you were attempting. I tested this on Windows and used column-based data.frame encoding, which I prefer because of the shorter JSON strings. Make sure to escape quotation marks correctly to avoid 'argument "..." is missing, with no default' errors:
curl -H "Content-Type: application/json" --data "{\"Gender\":[\"F\",\"F\",\"M\"],\"State\":[\"AZ\",\"NY\",\"DC\"]}" http://localhost:8000/score
# [["F","F","M"],["AZ","NY","DC"]]
The second approach is R native and has the advantage of having everything in one place:
library(jsonlite)
library(httr)
## sample data
lst = list(
  Gender = c("F", "F", "M")
  , State = c("AZ", "NY", "DC")
)

## jsonify
jsn = lapply(
  lst
  , toJSON
)

## query
request = POST(
  url = "http://localhost:8000/score?"
  , query = jsn # values must be length 1
)
response = content(
  request
  , as = "text"
  , encoding = "UTF-8"
)
fromJSON(
  response
)
#      [,1]
# [1,] "[\"F\",\"F\",\"M\"]"
# [2,] "[\"AZ\",\"NY\",\"DC\"]"
Be aware that httr::POST() expects a list of length-1 values as query input, so the array data should be jsonified beforehand. If you want to avoid the additional package imports altogether, some system(), sprintf(), etc. magic should do the trick.
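For illustration, a rough sketch of that base-R-only variant (assuming curl is on the PATH and a Unix-like shell, so the single-quoted JSON body passes through unchanged):
# build the JSON body and shell out to curl; intern = TRUE captures the response
body <- '{"Gender":["F","F","M"],"State":["AZ","NY","DC"]}'
cmd <- sprintf("curl -s -H 'Content-Type: application/json' --data '%s' http://localhost:8000/score", body)
system(cmd, intern = TRUE)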
Finally, here is my plumber endpoint (living in R/plumber.R and condensed a little bit):
#* @post /score
score = function(Gender, State){
  lapply(
    list(Gender, State)
    , as.factor
  )
}
and code to fire up the API:
pr = plumber::plumb("R/plumber.R")
pr$run(port = 8000)
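If you do want to POST a raw JSON array of objects as in the original question, one possible sketch (a hypothetical endpoint name; it assumes a plumber version that exposes the raw request body as req$postBody) is to parse the body yourself instead of relying on named parameters:
#* @post /score_array
score_array = function(req){
  # parse the posted JSON array into a data.frame, then factor each column
  df = jsonlite::fromJSON(req$postBody)
  lapply(df, as.factor)
}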

Error while trying to parse json into R

I have recently started using R and have a task that involves parsing JSON in R into a non-JSON format. For this, I am using the fromJSON() function. I have tried to parse the JSON as a text file. It runs successfully when I do it with just a single row entry, but when I try it with multiple row entries, I get the following errors:
fromJSON("D:/Eclairs/Printing/test3.txt")
Error in feed_push_parser(readBin(con, raw(), n), reset = TRUE) :
lexical error: invalid char in json text.
[{'CategoryType':'dining','City':
(right here) ------^
> fromJSON("D:/Eclairs/Printing/test3.txt")
Error in feed_push_parser(readBin(con, raw(), n), reset = TRUE) :
parse error: trailing garbage
"mumbai","Location":"all"}] [{"JourneyType":"Return","Origi
(right here) ------^
> fromJSON("D:/Eclairs/Printing/test3.txt")
Error in feed_push_parser(readBin(con, raw(), n), reset = TRUE) :
parse error: after array element, I expect ',' or ']'
:"mumbai","Location":"all"} {"JourneyType":"Return","Origin
(right here) ------^
The above errors come from three different formats in which I tried to parse the JSON text, but the result was the same; only the location pointed to by the error changed.
Please help me identify the cause of this error, or suggest a more efficient way of performing the task.
The original file I have is an Excel sheet with multiple columns, one of which consists of JSON text. What I have tried so far is extracting just the JSON column, converting it to tab-separated text, and then parsing it as:
fromJSON("D:/Eclairs/Printing/test3.txt")
Please also suggest if this can be done more efficiently. I need to map all the columns in the Excel sheet to the non-JSON output as well.
Example:
[{"CategoryType":"dining","City":"mumbai","Location":"all"}]
[{"CategoryType":"reserve-a-table","City":"pune","Location":"Kothrud,West Pune"}]
[{"Destination":"Mumbai","CheckInDate":"14-Oct-2016","CheckOutDate":"15-Oct-2016","Rooms":"1","NoOfPax":"3","NoOfAdult":"3","NoOfChildren":"0"}]
Consider reading the text in line by line with readLines(), iteratively saving the parsed JSON data frames to a growing list:
library(jsonlite)

con <- file("C:/Path/To/Jsons.txt", open = "r")
jsonlist <- list()

while (length(line <- readLines(con, n = 1, warn = FALSE)) > 0) {
  jsonlist <- append(jsonlist, list(fromJSON(line)))
}
close(con)

jsonlist
# [[1]]
#   CategoryType   City Location
# 1       dining mumbai      all
#
# [[2]]
#      CategoryType City          Location
# 1 reserve-a-table pune Kothrud,West Pune
#
# [[3]]
#   Destination CheckInDate CheckOutDate Rooms NoOfPax NoOfAdult NoOfChildren
# 1      Mumbai 14-Oct-2016  15-Oct-2016     1       3         3            0
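As a small follow-up to the answer above: if you then want a single data frame, dplyr's bind_rows() (an addition of mine, not part of the original answer) stacks the list and fills columns that appear only in some lines with NA:
library(dplyr)
combined <- bind_rows(jsonlist)  # one row per JSON line, NA where a field is absent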

Convert tweets into BSON using the twitteR and rmongodb libraries

Since the streamR connection API doesn't work on Twitter anymore, I am trying to convert the output of the searchTwitter() function (from the twitteR package) into BSON before inserting it into a MongoDB database.
library(twitteR)
library(plyr)      # laply()
library(jsonlite)  # toJSON()
library(rmongodb)  # mongo.bson.from.JSON()

test.tweets <- searchTwitter("mongodb", n = 10, lang = "en")
class(test.tweets)
test.text <- laply(test.tweets, function(t) t$getText())
class(toJSON(test.text))
bson <- mongo.bson.from.JSON(test.text)   # this line throws the error
R returns an error: "Error in mongo.bson.from.JSON(test.text) : Not a valid JSON content:..."
How can I resolve this conversion, or is there another solution?
Thank you
mongo.bson.from.JSON() expects a JSON string, not a raw character vector, so pass the output of toJSON() instead of test.text itself. This works:
library(rmongodb)
library(jsonlite)
test.text <- c("A tweet", "Another tweet")
(bson <- mongo.bson.from.JSON(toJSON(test.text)))
# 1 : 2 A tweet
# 2 : 2 Another tweet
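Applied back to the tweet example, a hedged sketch (it assumes a local mongod is listening on the default port and uses a made-up namespace "twitter_db.tweets"; mongo.create(), mongo.is.connected() and mongo.insert() come from rmongodb):
mongo <- mongo.create()                  # connect to localhost:27017
if (mongo.is.connected(mongo)) {
  for (txt in test.text) {
    # one BSON document per tweet, built from a JSON object string
    doc <- mongo.bson.from.JSON(toJSON(list(text = txt), auto_unbox = TRUE))
    mongo.insert(mongo, "twitter_db.tweets", doc)
  }
}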

Getting data from JSON file in R

Let's say that I have the following JSON file:
{
"id": "000018ac-04ef-4270-81e6-9e3cb8274d31",
"currentCompany": "",
"currentTitle": "--",
"currentPosition": ""
}
I use the following code:
Usersfile <- 'trial.json'   # where trial.json is the JSON file above
library('rjson')
c <- file(Usersfile, 'r')
l <- readLines(c, -71L)
json <- lapply(X = l, fromJSON)
and I have the following error:
Error: parse error: premature EOF
{
(right here) ------^
But when I open the JSON file (with Notepad) and put the data on one line:
{"id": "000018ac-04ef-4270-81e6-9e3cb8274d31","currentCompany": "","currentTitle": "--","currentPosition": ""}
The code works fine. (In reality the file is far too big to do this manually for each line.) Why is this happening? How can I overcome it?
This one doesn't work either:
{ "id": "000018ac-04ef-4270-81e6-9e3cb8274d31","currentCompany": "","currentTitle": "--","currentPosition": ""
}
EDIT: Using the following code, I could read only the first value:
library('rjson')
c <- file.path(Usersfile)
data <- fromJSON(file=c)
Surprised this was never answered! Using the jsonlite package, you can collapse your JSON data into one character element with paste(x, collapse = ""), removing the line breaks so it imports properly into an R data frame. I, too, faced a pretty-printed JSON with the exact same error:
library(jsonlite)

json <- do.call(rbind,
                lapply(paste(readLines(Usersfile, warn = FALSE), collapse = ""),
                       jsonlite::fromJSON))
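As an even simpler sketch (assuming trial.json contains only the single pretty-printed object shown above), jsonlite::fromJSON() can also read the file path directly, and the result converts to a one-row data frame:
library(jsonlite)
json <- fromJSON("trial.json")                        # handles pretty-printed JSON
df <- as.data.frame(json, stringsAsFactors = FALSE)   # one row, one column per field
df$id
# [1] "000018ac-04ef-4270-81e6-9e3cb8274d31"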