I'm currently trying to learn ReactJS, and I'm trying to read in a malformed JSON file.
E.g.:
{
Item : "text",
Desc : "example"
}
Now my understanding is that typically I can use
import data from "data.json"
which would build an object I could use; however, that won't work with malformed JSON.
I think my solution would be to read the file in as one big string, use a regex to clean up the JSON, and then attempt to parse it, but I'm not really sure how to go about reading the file in.
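Something along these lines is roughly what I'm picturing, assuming the file sits in public/ as data.json and only the bare keys need quoting (I know the regex is naive and would break on values that contain colons):

// fetch the raw text instead of importing it, so it can be fixed before parsing
fetch("/data.json")
  .then(res => res.text())
  .then(text => {
    // quote bare keys like `Item :` -> `"Item":`
    const fixed = text.replace(/(\w+)\s*:/g, '"$1":');
    const data = JSON.parse(fixed);
    console.log(data);
  });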
I have a RedPanda message queue that NiFi pulls JSON data from. That JSON data is then reduced to only the fields I want using JOLTTransform.
The result is individual JSON objects:
{
"foo": "bar",
"message": "<Litteral copy of 1 syslog line>"
}
I want to use a processor like ParseSyslog or ExtractGrok to parse the content of the message field and add the result of a successful parse back into the JSON content.
So far I have been able to extract the message field into an attribute called message using EvaluateJSONPath, but this is where I get lost. I can find no Grok or Syslog processor that can read from a specific field in the JSON content or from an attribute on the flowfile.
The question is: what can I use/do to Grok the message field and put the result back into the flowfile content? I am pretty new to NiFi, so there might be an obvious answer to this, but I cannot find it.
Cheers.
I have been using NiFi in a way it was not designed for. I was essentially taking an axe to NiFi to make it do what I believed it was designed for.
Instead of trying to work on the flowfile content, I should be routing the event to other services that do the work I want. In other words, make NiFi a routing service instead of a "parsing" service.
I have some data in CSV format. However, it is already a string, since I got it from an HTTP request.
I would like to use DataFrames in order to view the data.
However, I don't know how to parse it, because the CSV package only accepts files, not strings.
One solution would be to write the content of the string to a file and then read it back in, but there has to be a better way!
Use IOBuffer(your_string):
using CSV, DataFrames
CSV.read(IOBuffer(your_string), DataFrame)
I'm new to the world of triples :-) I'm trying to use dotNetRDF to load a Solr search result into a Graph.
The URL I'm getting data from is:
https://nvv.entryscape.net/store/search?type=solr&query=rdfType:https%5C%3A%2F%2Fnvv.entryscape.net%2Fns%2FDocument+AND+context:https%5C%3A%2F%2Fnvv.entryscape.net%2Fstore%2F1
The format is supposed to be "RDF/JSON". No matter what parser or what I try, I only get "invalid URI". I have tried loading from the URL and also downloading the result to a file and loading from the file; same error.
I'm using VS2017 and have "nugetted" the latest version of dotNetRDF.
Please help me, what am I missing?
Regards,
Lars Siden
It looks like the JSON being returned by that endpoint is not valid RDF/JSON. It does appear to contain some RDF/JSON fragments but they are wrapped up inside another JSON structure. The RDFJSONParser in dotNetRDF requires that your entire JSON document be a single, valid chunk of RDF/JSON.
The value at resource.children[*].metadata is an RDF/JSON object. So is the value at resource.children[*].info. The rest is a wrapper that uses property names which are not valid IRIs (hence the parser error message).
Unfortunately, there is no easy way to skip over the rest of the JSON document and parse only the valid bits. To do that, you will need to load the JSON document using Newtonsoft.Json, then serialize each valid RDF/JSON object you are interested in back to a string, and load that using the RDFJSONParser's Load(IGraph, TextReader) or Parse(IRdfHandler, TextReader) method.
I have the following JSON and need to parse it into a C# object.
But I am getting a message that the JSON is not well formed.
{
[
"<a target="\"WindowA1\"" href="%5C%22http: //www.google.com/test%5C%22">1234568</a>",
"<a target="\"WindowA1\"" href="%5C%22http: //www.google.com/test%5C%22">1234568</a>"
]
}
How do I handle the " inside the " while parsing the JSON file?
That "JSON" is totally broken. You don't parse broken JSON. You go to the amateurs who tried to produce JSON and failed miserably doing it, and tell them off and make them fix it.
If that doesn't work then you tell your boss to make them fix it.
I'll repeat that: You never, ever parse broken JSON data.
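For what it's worth, once the producer fixes it, a well-formed version of that payload would presumably look something like this (a plain top-level array, with the inner quotes escaped as \"), which any JSON parser will accept:

[
"<a target=\"WindowA1\" href=\"http://www.google.com/test\">1234568</a>",
"<a target=\"WindowA1\" href=\"http://www.google.com/test\">1234568</a>"
]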
I am loading data from a MongoDB collection into a MySQL table through a Kettle transformation.
First I extract the data using MongoDB Input, and then I use the JSON Input step.
But since the JSON Input step has very low performance, I wanted to replace it with a JavaScript script.
I am a beginner in JavaScript, and even though I tried some things, the Kettle JavaScript step is not recognizing any keywords.
Can anyone give me sample code to convert JSON data into different columns using JavaScript?
To solve your problem you need to consider three aspects:
Reading from MongoDB
Reading from JSON
Reading from (probably) String
Reading from MongoDB: Unless you have changed the interface, MongoDB returns not JSON but BSON (binary JSON). You need to check the MongoDB documentation about reading and writing BSON: probably something like BSON.to() and BSON.from(), but I don't know it by heart.
Reading from JSON: Once you have your BSON in JSON format, you can serialize it using JSON.stringify(), which returns a String.
Reading from (probably) String: If you want to use the capabilities of JSON (why else would you use JSON?), you also want to use JSON.parse(), which returns a JSON object.
My experience is that to send a JSON object from one step to the next, using a String is not a bad idea: at the end of a JavaScript step you write your JSON object out as a String, and at the beginning of the next JavaScript step (which can be further down the stream) you parse it back into a JSON object to work with it. A rough sketch follows below.
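As a rough illustration only, a Modified Java Script Value step could look something like this; the field name json_string and the properties foo and message are made up, so substitute your own:

// json_string is the incoming field that holds the raw JSON text (assumed name)
var obj = JSON.parse(json_string);

// pull individual values into new variables; add these as output fields
// in the step's Fields grid so they become columns downstream
var foo = obj.foo;
var message = obj.message;

// to hand the object to a later JavaScript step, turn it back into a String
var json_out = JSON.stringify(obj);

Depending on your Kettle version, the embedded Rhino engine may not ship a global JSON object; if JSON.parse is undefined you will need an alternative such as a bundled JSON library.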
I hope this answers your question.
PS: writing JavaScript steps requires you to learn JavaScript. You don't have to be a master, but the basics are required. There is no way around it.
You could use the JSON Input step to get the values of this JSON and put them into regular rows.