JSON Parsing Error for Schema.org Structured Data - json

I'm an SEO just getting into manual structured data building, so apologies if I come off as a newb here.
Working on this JSON code and I keep getting parsing errors.
The error seems to be at line 125, but I think the object's brackets are closed and the commas are in the right place.
Lines 124-133:
"#type": "HowToStep",
"position": "2",
"itemListElement": [
{
"#type": "HowToDirection",
"position": "1",
"text": "Research the business you want an appointment with"
},
I've tried ensuring all the brackets are closed and that commas follow every key:value pair except the last one in each object or array. Validation tools like validator.schema.org/ , json-ld.org/playground/ , and search.google.com/test/rich-results keep telling me there's a missing "," , "}" or "]" , but the key:value pairs, objects, and arrays all look right to me.
I can provide the whole code if needed for more context.
Thanks for the help everyone.

Related

How to use regular expression in JSON as key

Is it even possible to do so? I didn't find a single solution on the internet. I'm working on an Angular 6 project. I have to translate a few strings, and for that I'm maintaining an en-US.json file in the i18n folder. The JSON looks like this:
"words": {
"Car": "Wagen",
"Ship": "Schiff",
"Flight": "Flug",
"*": "", // <------------ HELP NEEDED
}
I don't want to translate any of the other strings. Can I use a regular expression to ignore them? Leaving them unhandled gives me strings like:
words.train
words.helicopter
Please correct me where I am wrong.

Insomnia: Invalid JSON: Unexpected token ) in JSON at position 0

This is the response of the Login request that I need to extract data from.
As you can see, the first line is not JSON, and Insomnia complains about it with "Invalid JSON: Unexpected token ) in JSON at position 0".
Is there any way to ignore/remove the first line and then extract the data? Maybe a custom query?
)]}',
{
  "userid": "USER1",
  "email": "user1@email.com",
  "name": "John Jones"
}
If this is a one-off, just delete the first line with a text editor.
If it's happening systematically, then (a) you really ought to get it fixed at source rather than repairing it after the event, and (b) you need to know what the general pattern of bad data is, rather than working from one example.
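For what it's worth, that )]}', line looks like the anti-JSON-hijacking prefix some frameworks prepend to JSON responses. If you do end up stripping it client-side rather than fixing it at the source, the idea is simply to drop everything up to the first newline before parsing. A minimal sketch in Python (the response body is the example from the question; the field values are illustrative):
import json

raw = ")]}',\n{\"userid\": \"USER1\", \"email\": \"user1@email.com\", \"name\": \"John Jones\"}"

# Drop the protective first line, then parse the remainder as ordinary JSON.
cleaned = raw.split("\n", 1)[1]
data = json.loads(cleaned)
print(data["email"])  # -> user1@email.com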

Json - optional subdocument

I'm getting JSON from an application with a couple of nested subdocuments. Some of those subdocuments are optional and not present all the time. I'm wondering if there is a best practice for how to handle this.
For example (the document is just an example; the real one looks different but I can't post it; the example is copied from: How to represent sub-documents in JSON array as Java Collection using Jackson?): the Address subdocument is not present in every document I receive.
{
  "attributes": {
    "type": "Lead",
    "url": "/services/data/v30.0/sobjects/Lead/00Qi000000Jr44XEAR"
  },
  "Id": "00Qi000000Jr44XEAR",
  "Name": "Kristen Akin",
  "Address": {
    "city": null,
    "country": "USA",
    "state": "CA",
    "stateCode": null,
    "street": null
  },
  "Phone": "(434) 369-3100"
}
Currently I'm receiving the data in the worst way I can imagine, with a different type, like this:
{
  "attributes": {
    "type": "Lead",
    "url": "/services/data/v30.0/sobjects/Lead/00Qi000000Jr44XEAR"
  },
  "Id": "00Qi000000Jr44XEAR",
  "Name": "Kristen Akin",
  "Address": "",
  "Phone": "(434) 369-3100"
}
I want to suggest better ways and I'm wondering which is the best one:
1. Leaving the Address subdocument out completely
2. Receiving "Address": null
3. Receiving "Address": {}
4. Receiving "Address": {"city": null, "country": null, ...}
5. Anything else
Personally I would go with no. 3, because I still get a (sub)document and can treat it the usual way. Does anything speak against it, or are there best practices for this situation?
Thanks in advance.
Best regards.
Go with 3.
1. Leaving the Address subdocument out completely:
This works with many deserialization tools, but it makes it hard to recognize the structure and to spot during debugging whether something is missing.
2. Receiving "Address": null:
This also works with many deserialization tools, but it is not good practice to deliver null for complex attributes like arrays or objects. You cannot easily tell that this is a complex object.
3. Receiving "Address": {}:
It is good practice to deliver empty arrays and empty objects when they are empty. You can tell that there could be a complex object here, just with no data in it. Go with this solution.
4. Receiving "Address": {"city": null, "country": null, ...}:
Don't do this. It gives more detail about the complex object, but you cannot easily tell whether the address was left out on purpose, whether the API partner sends incomplete address data by accident, or whether incomplete data is valid on their side.
I always differentiate between values which are:
set but empty: we usually interpret these as valid values that are intended to be empty, like an empty address book that simply contains no entries.
undefined: usually this is an optional value. The application has to fetch the data from somewhere else if it needs it.
null: setting a value to null intentionally invalidates it. We often use this to reset data. For the address book it means: there is no address book at all, not even an empty one.
I would prefer these options:
1.: If it is left out, it is undefined, and it is up to the application to handle undefined values. Especially for optional values, you should be prepared to handle undefined.
3.: If it is empty, you still have a valid address book, just an empty one, which makes the handling in code easier.
What I would avoid:
4.: You get a valid address object with invalid data, so you have to deep-check whether the address is usable, which increases the validation effort; I would not use this option.
5.: Changing the data type to "" is also bad, because typed languages will have trouble parsing it: they expect an object but receive a string.
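To make the distinction concrete, here is a minimal sketch in Python (using the Lead example from the question; the checks are just one way to express the three cases):
import json

lead = json.loads("""
{
  "Id": "00Qi000000Jr44XEAR",
  "Name": "Kristen Akin",
  "Address": {}
}
""")

if "Address" not in lead:
    print("Address is undefined - fetch it elsewhere if needed")       # option 1
elif lead["Address"] is None:
    print("Address was intentionally set to null")                     # option 2
elif lead["Address"] == {}:
    print("Address is present but empty - treat it as a normal, empty subdocument")  # option 3
else:
    print("city:", lead["Address"].get("city"))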

Is a plain string valid JSON? [duplicate]

This question already has answers here:
Is this simple string considered valid JSON?
(5 answers)
Closed 5 years ago.
Is a plain string valid JSON or does there have to be an object?
For example:
"morpheus"
versus:
{
  "name": "morpheus"
}
It is valid in JavaScript.
You might get confused at first trying to test this:
JSON.parse("bob");
This would fail with the error: "Unexpected token b". However, that's the equivalent of passing just plain bob as the text in the response, not "bob". If you add the quotes:
JSON.parse('"bob"')
You get the simple string "bob" back as you should.
Important
This answer once said No, the first character of the JSON must be a { or a [.
At the time I wrote that, I was testing it with Python. In Python (2.7.x at least), json.loads("a") gives an error, which means that a plain string is not valid JSON there.
It has been rightfully pointed out by others that it cannot be said that a plain string is not valid JSON. See this question for a more appropriate answer.
At this time all I can say is that it depends on your environment. In JavaScript it may be valid, in Python it may not be, and so on.
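For comparison, a quick check in Python 3 (a minimal sketch; as with the JavaScript example above, the inner quotes are what matter):
import json

print(json.loads('"morpheus"'))   # -> morpheus: a quoted string is accepted as a top-level JSON value

try:
    json.loads('morpheus')        # a bare, unquoted word, like JSON.parse("bob") above
except json.JSONDecodeError as err:
    print("not valid JSON:", err)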
JSON stands for JavaScript Object Notation
Here is a quote from the official site
JSON is built on two structures:
A collection of name/value pairs. In various languages, this is realized as an object, record, struct, dictionary, hash table, keyed list, or associative array.
An ordered list of values. In most languages, this is realized as an array, vector, list, or sequence.
These are universal data structures. Virtually all modern programming languages support them in one form or another. It makes sense that a data format that is interchangeable with programming languages also be based on these structures.
In JSON, they take on these forms:
An object is an unordered set of name/value pairs. An object begins with { (left brace) and ends with } (right brace). Each name is followed by : (colon) and the name/value pairs are separated by , (comma).
Take note of the last part of that quote, describing how an object is written.
Because of that, and JSON being JS Object Notation, it is implied that a JSON representation of a name:value pair must always be in the form of
{
  "name": "value"
}
Note that you can also make the 'root object' a list:
[
  {
    "name1": "value1"
  },
  {
    "name2": "value2"
  }
]
This basically means that the JSON contains more than one object.
As Sunny R Gupta pointed out, this is also valid JSON:
[
  "this",
  "is",
  "valid"
]
Note that this works because the strings are not in the form "name": "value" but are just strings. Taking that into consideration, valid options for your example would be:
{
  "name": "Morpheus"
}
or
[
  "morpheus"
]
The first character of the JSON must be a { or a [
UPDATE 2018:
It has been 4 years since I originally answered this question. Back then, quoted plain strings were not being accepted as valid JSON. As of today, they are accepted as valid JSON.
The following is kept for people to see what the error used to be earlier:
Parsing a simple string gives:
Parse error on line 1:
"morpheus"
^
Expecting '{', '['
indicating that it needs to be an object or an array.
TIP: To validate JSON strings and see what works and what does not, try using http://jsonlint.com

JSON interface with UNIX

I am very new to JSON interaction and I have a few doubts about it. Below are the basic ones:
1) How can we call/invoke/open a JSON file through Unix? Suppose I have a metadata file in JSON; how should I fetch and update values in that JSON file?
2) I need an example of how to interact with it.
3) How compatible is the Unix shell with JSON? Is there any other technology/language/tool that is better suited than shell script?
Thanks,
Nikhil
JSON is just text following a specific format.
Use a text editor and follow the rules. Some editors with "JSON modes" will help by highlighting (invalid) syntax, indenting, and matching braces.
A "Unix Shell" has nothing directly to do with JSON - how does a shell relate to text or XML files?
There are some utilities for dealing with JSON which might be of use, such as jq, but it really depends on what needs to be done with the JSON (which is ultimately just text).
JSON is a format for storing strings, bools, numbers, lists, and dicts, and combinations thereof (dicts of numbers pointing to lists containing strings, etc.). The data you want to store probably has some structure that fits into these types. Consider that and think of a valid representation using the types given above.
For example, if your text configuration looks something like this:
Section
  Name=Marsoon
  Size=34
  Contents
    foo
    bar
    bloh
  EndContents
EndSection
Section
  Name=Billition
  Size=103
  Contents
    one
    two
    three
  EndContents
EndSection
… then this looks like a list of dicts, each containing some strings and numbers and one list of strings. A valid representation of that in JSON would be:
[
  {
    "Name": "Marsoon",
    "Size": 34,
    "Contents": [
      "foo", "bar", "bloh"
    ]
  },
  {
    "Name": "Billition",
    "Size": 103,
    "Contents": [
      "one", "two", "three"
    ]
  }
]
But if you know that each such dictionary has a different Name and always the same fields, you don't have to store the field names and can use the Name as the key of a dictionary; so you can also represent it as a dict of strings pointing to lists containing numbers and lists of strings:
{
  "Marsoon": [
    34, [ "foo", "bar", "bloh" ]
  ],
  "Billition": [
    103, [ "one", "two", "three" ]
  ]
}
Both are valid representations of your original text configuration. How you choose depends mainly on whether you want to stay open to later changes in the data structure (then the first solution is better) or whether you want to avoid bureaucratic overhead.
Such JSON can be stored as a simple text file. Use any text editor you like for this. Notice that all whitespace is optional. The last example could also be written on one line:
{"Marsoon":[34,["foo","bar","bloh"]],"Billition":[103,["one","two","three"]]}
So computer-generated JSON can sometimes be hard to read and may need an editor that can at least handle very long lines.
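If readability is the issue, you can also re-indent such a one-liner before editing it. A small sketch using Python's standard json module on the one-line example above:
import json

one_liner = '{"Marsoon":[34,["foo","bar","bloh"]],"Billition":[103,["one","two","three"]]}'
print(json.dumps(json.loads(one_liner), indent=2))  # pretty-prints the same data with 2-space indentation
The same thing is available from the command line as python -m json.tool myfile.json.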
Handling such a JSON file in a shell script will not be easy, simply because the shell has no notion of such complex types. The most complex thing it can handle properly is a dict of strings pointing to strings (Bash associative arrays). So I propose looking at a more suitable language, e.g. Python. In Python you can handle all these structures quite efficiently and with very readable code:
import json

with open('myfile.json') as jsonFile:
    data = json.load(jsonFile)

print(data[0]['Contents'][2])   # prints "bloh" for the first (list of dicts) representation
# or:
print(data['Marsoon'][1][2])    # prints "bloh" for the second (dict keyed by Name) representation