RAML - JSON schema error

I have included a JSON schema file in a RAML definition.
In the RAML file, I'm getting an error saying "The included resource schemas/createRecord.json contains an error".
I have validated the JSON file and it has no errors.
When I replaced the file contents with the contents of a known-working file, I still got the same error.
It seems to be a problem with the file format rather than the content.
Can someone help me identify the issue?

This depends on how you validated the JSON schema. Not all validators are created equal, and the trick is to use one that matches MuleSoft's behaviour. I have found http://jsonschemalint.com/#/version/draft-05/markup/json to be the best: when I get an error in Mule, I get the same error in jsonschemalint.
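For an offline cross-check, here is a minimal sketch using Python's jsonschema package that confirms the file both parses as JSON and is a structurally valid schema, plus a guess at the "format, not content" suspicion by checking for a UTF-8 BOM. The draft-04 choice and the BOM check are assumptions, not something stated in the question.

```python
# Sketch: validate schemas/createRecord.json locally. Assumes the `jsonschema`
# package (pip install jsonschema); the draft version your RAML/Mule setup expects
# may differ, so Draft4Validator here is an assumption.
import json

from jsonschema import Draft4Validator
from jsonschema.exceptions import SchemaError

path = "schemas/createRecord.json"

# Guess at the "file format, not content" suspicion: a UTF-8 BOM can trip up some parsers.
with open(path, "rb") as f:
    if f.read(3) == b"\xef\xbb\xbf":
        print("Warning: file starts with a UTF-8 BOM")

with open(path, encoding="utf-8-sig") as f:
    schema = json.load(f)              # fails here if the file is not valid JSON at all

try:
    Draft4Validator.check_schema(schema)   # fails here if it is JSON but not a valid schema
    print("Schema is structurally valid (draft-04)")
except SchemaError as e:
    print("Schema error:", e.message)
```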

Related

Schema.org in JSON LD format recognized by Google, but shows error in website console

I recently created my own JSON-LD schema and tested it through Google's Structured Data Testing Tool, and no errors were detected. But when I checked the browser console for the website (URL: https://www.policyhouse.com/uae/en), I could see the following error. Can someone please help me resolve the issue?
[Facebook Pixel] - Unable to parse JSON-LD tag. Malformed JSON found: '
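No answer was posted for this one; as a way to reproduce the Facebook Pixel complaint outside the browser, a small sketch (assuming the requests and beautifulsoup4 packages) that pulls every application/ld+json block from the page and runs it through a strict JSON parser, which should point at the malformed character.

```python
# Sketch: fetch the page and strictly parse each JSON-LD block to find the malformed one.
# Assumes `requests` and `beautifulsoup4` are installed.
import json

import requests
from bs4 import BeautifulSoup

html = requests.get("https://www.policyhouse.com/uae/en").text
soup = BeautifulSoup(html, "html.parser")

for i, tag in enumerate(soup.find_all("script", type="application/ld+json")):
    try:
        json.loads(tag.string or "")
        print(f"JSON-LD block {i}: OK")
    except json.JSONDecodeError as e:
        print(f"JSON-LD block {i}: malformed at line {e.lineno}, col {e.colno}: {e.msg}")
```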

JSONs with (supposedly) same format treated differently by BigQuery - one accepted, one rejected

I am trying to upload JSON files to BigQuery. The JSON files are outputs from the Lighthouse auditing tool. I have made some changes to them in Python to make the field names acceptable to BigQuery and converted them to newline-delimited JSON.
I am now testing this process, and I have found that while the upload runs without issue for many web pages, BigQuery rejects some of the JSON files. The rejected JSONs always seem to be from the same website; for example, many of the audit JSONs from Topshop have failed on upload (the manipulations in Python run without issue). What confuses me is that I can see no difference in the formatting or structure of the JSONs that succeed and those that fail.
I have included some examples here of the JSON files: https://drive.google.com/open?id=1x66PoDeQGfOCTEj4l3VqMIjjhdrjqs9w
The error I get from BigQuery when a JSON fails to load is this:
Error while reading table: build_test_2f38f439_7e9a_4206_ada6_ac393e55b8ec4_source, error message: Failed to parse JSON: No active field found.; ParsedString returned false; Could not parse value; Could not parse value; Could not parse value; Could not parse value; Could not parse value; Could not parse value; Parser terminated before end of string
I have also attempted to upload the failed JSONs to a new table through the interface using the autodetect feature (in an attempt to discover whether the schema was at fault), and these uploads fail too, with the same error.
This makes me think the JSON files must be wrong, but I have copied them into several different JSON validators, all of which accept them as one row of valid JSON.
Any help understanding this issue would be much appreciated, thank you!
When you load JSON files into BigQuery, it's good to remember that there are some limitations associated with this format. You can find them here. Even though your files might be valid JSON, some of them may not comply with BigQuery's limitations, so I would recommend double-checking that they are actually acceptable to BigQuery.
I hope that helps.
I eventually found the error through a long trial-and-error process in which I uploaded first the first half and then the second half of the JSON file to BigQuery. The second half failed, so I split that in half again to see which part the error occurred in, and continued until I found the offending line.
At a deep level of nesting there was a field that was always supposed to be a list of strings, but when there were no values associated with it, it appeared as an empty string rather than an empty list. This inconsistency was causing the error. The trial-and-error process was long, but given the vague error message and a JSON file thousands of lines long, it seemed like the most efficient way to get there.
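A sketch of what the corresponding fix could look like in the Python pre-processing step: walk the nested structure and replace an empty string with an empty list for the affected field. The field name "items" and the file names are placeholders, since the answer doesn't name them.

```python
# Sketch: normalize a field that should always be a list of strings but sometimes
# appears as "" in the Lighthouse output. The field name "items" is hypothetical.
import json

def normalize(node, field="items"):
    """Recursively replace an empty-string value of `field` with an empty list."""
    if isinstance(node, dict):
        for key, value in node.items():
            if key == field and value == "":
                node[key] = []
            else:
                normalize(value, field)
    elif isinstance(node, list):
        for item in node:
            normalize(item, field)
    return node

with open("lighthouse_audit.json") as f:
    report = json.load(f)

with open("lighthouse_audit.ndjson", "w") as f:
    # BigQuery expects newline-delimited JSON: one complete document per line.
    f.write(json.dumps(normalize(report)) + "\n")
```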

JMeter shows "500 Internal Server Error" with UnmarshalException in View Results Tree

Out of nowhere, JMeter started failing 100% of the time for my normal web service requests. Upon further analysis I found that an empty request is arriving at the server. I attached a View Results Tree listener with all flags on and found the error below:
<?xml version='1.0' encoding='UTF-8'?>
<ns2:Fault xmlns:ns2="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ns3="http://www.w3.org/2003/05/soap-envelope">
  <faultcode>ns2:Server</faultcode>
  <faultstring>javax.xml.bind.UnmarshalException - with linked exception:
    [com.ctc.wstx.exc.WstxParsingException: Unexpected close tag </Client>; expected </EOF>.
    at [row,col {unknown-source}]: [24,16]]</faultstring>
</ns2:Fault>
I can say that there is no XML parsing problem on my side. Can someone please help me with this weird JMeter behaviour?
Found the issue: it was with the input file. Somehow the input file had become empty, which supplied an EOF for the input parameter, so the request was treated as ended as soon as that first EOF reached the server's HTTP endpoint.
Fixing the input data file resolved the issue for me. :)

Load RDF/JSON from a URL using DotNetRDF

I'm new to the world of triples :-) I'm trying to use DotNetRDF to load a Solr search result into a Graph.
The URL I'm getting data from is:
https://nvv.entryscape.net/store/search?type=solr&query=rdfType:https%5C%3A%2F%2Fnvv.entryscape.net%2Fns%2FDocument+AND+context:https%5C%3A%2F%2Fnvv.entryscape.net%2Fstore%2F1
The format is supposed to be "RDF/JSON". No matter what parser or approach I try, I only get "invalid URI". I have tried loading from the URL and also tried downloading the result to a file and loading from the file; same error.
I'm using VS2017 and have "nugetted" the latest version of DotNetRdf.
Please help me, what am I missing?
Regards,
Lars Siden
It looks like the JSON being returned by that endpoint is not valid RDF/JSON. It does appear to contain some RDF/JSON fragments but they are wrapped up inside another JSON structure. The RDFJSONParser in dotNetRDF requires that your entire JSON document be a single, valid chunk of RDF/JSON.
The value at resource.children[*].metadata is an RDF/JSON object. So is the value at resource.children[*].info. The rest is wrapper using property names that are not valid IRIs (hence the parser error message).
Unfortunately, there is no easy way to skip over the rest of the JSON document and only parse the valid bits. To do that you will need to load the JSON document using Newtonsoft.Json, serialize each valid RDF/JSON object you are interested in back to a string, and load that string using the RDFJSONParser's Load(IGraph, TextReader) or Parse(IRdfHandler, TextReader) method.
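The answer's approach is .NET-specific (Newtonsoft.Json feeding the RDFJSONParser); purely to illustrate which parts of the response are usable, here is a sketch in Python that extracts the resource.children[*].metadata and resource.children[*].info fragments named above. Each extracted fragment is the kind of standalone RDF/JSON document you would hand to the parser via a StringReader in .NET.

```python
# Sketch: pull the embedded RDF/JSON fragments out of the EntryScape search wrapper.
# The paths resource.children[*].metadata and resource.children[*].info come from the
# answer above; everything else in the response is wrapper and must be discarded.
import json

import requests

url = ("https://nvv.entryscape.net/store/search?type=solr&query="
       "rdfType:https%5C%3A%2F%2Fnvv.entryscape.net%2Fns%2FDocument"
       "+AND+context:https%5C%3A%2F%2Fnvv.entryscape.net%2Fstore%2F1")
wrapper = requests.get(url).json()

fragments = []
for child in wrapper.get("resource", {}).get("children", []):
    for key in ("metadata", "info"):
        if key in child:
            fragments.append(json.dumps(child[key]))  # one self-contained RDF/JSON document

print(f"Extracted {len(fragments)} RDF/JSON fragments")
```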

How to resolve a JSON validation error in mLab?

I have a document that uploads nicely through Robomongo, but in mLab (mlab.com) it shows a JSON validation error.
Specifically, "We encountered an error while parsing your JSON. Please check your syntax (e.g. ensure you are using double quotes around both your field names and values) and try again." is making me nervous.
Please check the document here.
That appears to be an array of JSON documents, not a single JSON document, which is what the mLab JSON document editor expects. In other words, an array is not a valid JSON document, even though its elements may be valid JSON documents.
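If the goal is simply to get those elements into the collection, the practical workaround is to insert them as separate documents. A minimal sketch with pymongo, assuming a placeholder mLab connection string, database, collection, and file name:

```python
# Sketch: insert each element of a JSON array as its own MongoDB document,
# since the mLab editor expects a single document at a time. The connection
# string, database, collection, and file name are placeholders.
import json

from pymongo import MongoClient

with open("documents.json") as f:
    docs = json.load(f)  # a JSON array: [ {...}, {...}, ... ]

client = MongoClient("mongodb://user:password@ds012345.mlab.com:12345/mydb")
client["mydb"]["mycollection"].insert_many(docs)
print(f"Inserted {len(docs)} documents")
```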