Print JSON with "pretty" (indented) format

If I create a JSON object and print it on the console:
LJSONObject := TJSONObject.Create;
LJSONObject.AddPair(TJSONPair.Create(TJSONString.Create('Hello'), TJSONString.Create('World')));
LJSONObject.AddPair(TJSONPair.Create(TJSONString.Create('Ciao'), TJSONString.Create('Mondo')));
Writeln(LJSONObject.ToString);
the result is:
{"Hello":"World", "Ciao":"Mondo"}
How can I print the result with nicer indentation, like this?
{
  "Hello": "World",
  "Ciao": "Mondo"
}

TJSONObject does not support pretty printing.
Other JSON libraries do. For instance, SuperObject, as discussed here: How do I pretty-print JSON in Delphi?
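With SuperObject, the indent flag on AsJSon produces the pretty output (a minimal sketch, assuming the library's SO factory function):
uses
  SuperObject;
...
Writeln(SO('{"Hello":"World","Ciao":"Mondo"}').AsJSon(True));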

As Sir Rufo pointed out, there is an inbuilt option as of XE5.
uses REST.JSON, System.JSON;
...
function PrettyJSON(const JsonString: string): string;
var
  jdoc: TJSONObject;
begin
  jdoc := TJSONObject.ParseJSONValue(JsonString) as TJSONObject;
  try
    Result := TJson.Format(jdoc);
  finally
    jdoc.Free; // ParseJSONValue returns a new instance that the caller owns
  end;
end;
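With that helper in place, the object from the question can be printed indented (a usage sketch reusing the LJSONObject variable from above):
Writeln(PrettyJSON(LJSONObject.ToString));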

JSON with SuperObject: is element an array or an object?

I get JSON from an API and it has a quirk: usually it returns the "tags" element as an object {"x":"y"}, but if there are no tags, it returns an empty array [] instead.
I parse JSON with SuperObject, and use this code:
var
  JsonObject: ISuperObject;
  item: TSuperAvlEntry;
  temp: TStringList;
begin
  {...}
  for item in JsonObject.O['tags'].AsObject do
  begin
    temp.Add(item.Name);
  end;
  {...}
It works wonderfully for objects, but it crashes with an Access Violation error if "tags" is an array.
Likewise, if I try something like:
if JsonObject['tags'].AsArray.Length = 0 then
it works fine for an empty array, but crashes if it is an object.
I don't know in advance which elements may be in "tags", and thus don't see how I can use Exists() in this case.
Any ideas?
Well, looks like I found the answer myself, so I will share it.
ISuperObject has a property "DataType" which you can check, like this:
if JsonObject['tags'].DataType = stObject then
begin
  for item in JsonObject.O['tags'].AsObject do
  begin
    temp.Add(item.Name);
  end;
end;
stObject and stArray are the most useful to check here, but there are also: stBoolean, stDouble, stCurrency, stInt and stMethod.
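A minimal sketch that handles both shapes, reusing the JsonObject, item and temp variables from above (an illustration: the empty array simply contributes no tags):
case JsonObject['tags'].DataType of
  stObject:
    for item in JsonObject.O['tags'].AsObject do
      temp.Add(item.Name);
  stArray:
    ; // the API returned an empty array: no tags to add
end;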

TRESTRequest: How to add an array as the body of a PUT request

I want to send a PUT request where the body contains an array of JSON objects, like this:
PUT http://hostname/api/items
[{"ID":1},{"ID":2},...]
Using code like the following, I can easily send a POST request with a single TJSONObject in the body:
req := TRESTRequest.Create(nil);
req.Client := FRESTClient;
req.Method := TRESTRequestMethod.rmPOST;
req.Resource := 'api/items';
req.AddBody(someJSONObject);
req.Execute;
Fiddler shows the request as having the correct content:
{"ID",1}
However, if I use a PUT request and add a TJSONArray as the body instead...
ja := TJSONArray.Create;
for jo in someJSONObjects do
  ja.Add(jo);
req.Method := TRESTRequestMethod.rmPUT;
req.AddBody(ja);
Fiddler shows the request as having a huge pile of bizarre JSON content:
{"elements":{"items":[{"members":{"items":[{"jsonString":{"strBuffer":{"data":["I","D","","","","","","","","","","","","","",""],"length":2,"maxCapacity":2147483647},"owned":true},"jsonValue":{"strBuffer":{"data":["1","","","","","","","","","","","","","","",""],"length":1,"maxCapacity":2147483647},"owned":true},"owned":true}],...
It looks like some kind of low-level serialization of the raw in-memory object, instead of the expected JSON array contents.
Any idea what I'm doing wrong? The documentation on the AddBody method is not very helpful.
Answering my own question...
The overloads of the AddBody method include:
procedure AddBody(AObject: TJSONObject);
procedure AddBody<T>(AObject: T);
I had assumed TJSONArray was derived from TJSONObject and would therefore use the first overload, but in fact both classes derive from TJSONValue. Therefore, the TJSONObject overload was not used in my case, but rather the generic overload, which apparently succeeded at some kind of lower-level serialization.
Since there is no direct overload for TJSONArray, and the API I'm using doesn't expect a JSON array wrapped in an object, I did this instead:
req.AddBody(ja.ToJSON, ctAPPLICATION_JSON);
This serializes the array to a string, and then specifies the content type as application/json.
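Putting it together, a minimal sketch of the whole PUT (assuming the FRESTClient, req, ja, jo and someJSONObjects variables from the snippets above; ctAPPLICATION_JSON lives in REST.Types):
ja := TJSONArray.Create;
try
  for jo in someJSONObjects do
    ja.Add(jo);
  req.Method := TRESTRequestMethod.rmPUT;
  req.Resource := 'api/items';
  // Serializing to a string bypasses the generic AddBody<T> overload
  req.AddBody(ja.ToJSON, ctAPPLICATION_JSON);
  req.Execute;
finally
  ja.Free; // also frees the owned element objects
end;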

Does SuperObject have UTF-8 support

I have been using SuperObject for all my JSON parsing needs, and today I ran into a problem that I cannot seem to fix. I downloaded a JSON file that had an entry in it that looked like this: "place" : "café". When I tried to parse the file and show it in a message box, the word café turned out like this: café, which tells me that there is some kind of conversion failure going on when the file is parsed by SuperObject. So before I invest any more time in this library, I would like to know whether it supports UTF-8 and, if so, how I would go about enabling it.
BTW, the pseudo code I am using to parse the file looks something like this:
uses
  SuperObject;
...
const
  jsonstr = '{ "Place" : "café" }';
...
var
  SupOB: ISuperObject;
begin
  SupOB := SO(jsonstr);
  ShowMessage(SupOB['Place'].AsString);
end;
Is the conversion failing because I am casting the object as a string? I also tried using AsJson to see if that would have any effect, but it did not, so I am not sure what is needed to make objects like these display as intended and would appreciate some help. Finally, I have checked and verified that the original file being parsed is indeed encoded as UTF-8.
You say you are parsing a file, but your example is parsing a string. That makes a big difference, because if you are reading the file data into a string first, you are likely not reading it correctly. Remember that Delphi strings use UTF-16 in Delphi 2009 and later, but ANSI in earlier versions. Either way, not UTF-8. So if your input file is UTF-8 encoded, you must decode its data to the proper string encoding before you can parse it. café is the UTF-8 encoded form of café being mis-interpreted as ANSI.
Reading and writing UTF-8 encoded JSON files. Tested on Delphi 2007.
function ReadSO(const aFileName: string): ISuperObject;
var
  input: TFileStream;
  output: TStringStream;
begin
  input := TFileStream.Create(aFileName, fmOpenRead, fmShareDenyWrite);
  try
    output := TStringStream.Create('');
    try
      // Copy the raw UTF-8 bytes, then decode them to UTF-16 before parsing
      output.CopyFrom(input, input.Size);
      Result := TSuperObject.ParseString(PWideChar(UTF8ToUTF16(output.DataString)), True, True);
    finally
      output.Free;
    end;
  finally
    input.Free;
  end;
end;

procedure WriteSO(const aFileName: string; o: ISuperObject);
var
  output: TFileStream;
  input: TStringStream;
begin
  // Encode the rendered JSON back to UTF-8 before writing it to disk
  input := TStringStream.Create(UTF16ToUTF8(o.AsJSon(True)));
  try
    output := TFileStream.Create(aFileName, fmOpenWrite or fmCreate, fmShareDenyWrite);
    try
      output.CopyFrom(input, input.Size);
    finally
      output.Free;
    end;
  finally
    input.Free;
  end;
end;
The UTF8ToUTF16 and UTF16ToUTF8 functions come from the JclConversions unit of the JCL: http://sourceforge.net/projects/jcl/.
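On Unicode versions of Delphi (XE2 and later, where System.IOUtils is available), a simpler sketch works without the JCL, since TFile.ReadAllText performs the UTF-8 decoding itself (ReadSOUnicode is a hypothetical helper name):
uses
  System.SysUtils, System.IOUtils, SuperObject;

function ReadSOUnicode(const aFileName: string): ISuperObject;
begin
  // ReadAllText decodes the UTF-8 bytes into a UTF-16 string,
  // which the SO() factory can parse directly
  Result := SO(TFile.ReadAllText(aFileName, TEncoding.UTF8));
end;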

Does the ulkJSON library have limitations when dealing with base64 in Delphi 7?

I'm working on a project that uses Delphi 7 to consume RESTful services. We are creating and decoding JSON with the ulkJSON library. Up to this point I've been able to successfully build and send JSON containing base64 strings that exceed 5,160 KB. I can verify that the base64 is being received by the services and verify its integrity once it's there. In addition to sending, I can also receive and successfully decode JSON with a smaller (~256 KB or less) base64 payload.
However, I am experiencing issues on the return trip when larger (~1,024 KB+) base64 strings are involved. Specifically, the problem occurs with the following JSON format and function combination:
JSON:
{
  "message" : "/9j/4AAQSkZJRgABAQEAYABgAAD...."
}
Function:
function checkResults(const JSONFormattedString: string): string;
var
  jsonObject: TlkJSONobject;
begin
  // Validate that the JSON-formatted string is not empty.
  // If it is empty, inform the user/programmer and exit from this routine.
  if JSONFormattedString = '' then
  begin
    Result := 'Error: JSON returned is Null';
    Exit;
  end;
  // Now that we know the string is not empty, assume it is a
  // JSON-formatted string and attempt to parse it.
  //
  // If the string is not a valid JSON object (such as an http status code),
  // report that an unexpected value has been passed and exit from this routine.
  try
    jsonObject := TlkJSON.ParseText(JSONFormattedString) as TlkJSONobject;
  except
    Result := 'Error: No JSON was received from web services';
    Exit;
  end;
  // Now that the object has been parsed, let's check the contents.
  try
    try
      Result := jsonObject.Field['message'].Value;
    except
      on E: Exception do
        Result := 'Error: No Message received from Web Services ' + E.Message;
    end;
  finally
    jsonObject.Free;
  end;
end;
As mentioned above, with this function I am able to get small (256 KB and less) base64 strings out of the 'message' field of a JSON object. But for some reason, if the received JSON is larger than, say, 1,024 KB, the following line seems to just stop in its tracks:
jsonObject := TlkJSON.ParseText(JSONFormattedString) as TlkJSONobject;
No errors, no results. Following the debugger, I can go into the library and see that the JSON string being passed is not considered to be JSON, despite being in the format listed above. The only difference I can find between calls that work as expected and calls that do not appears to be the size of the base64 being transmitted.
Am I missing something completely obvious and should be shot for my code implementation (very possible)? Have I missed some documentation regarding the limitations of the ulkJSON library? Any input would be extremely helpful. Thanks in advance stack!
So after investigating this for hours over the course of some time, I discovered that the library was indeed working properly and there was no issue.
The issue came down to the performance of my machine: it was taking on average 215,802 milliseconds (about 3.6 minutes) to parse a moderately sized image (1.2 MB) in base64 format. This scaled with the size of the base64 string (faster for smaller, longer for larger).
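For anyone hitting the same wall, a quick way to confirm that raw parse time is the culprit is to time the call directly (a Delphi 7 sketch using GetTickCount; the uLkJSON unit name is assumed from the library's distribution):
uses
  Windows, SysUtils, Dialogs, uLkJSON;

procedure TimeParse(const JSONFormattedString: string);
var
  start: Cardinal;
  obj: TlkJSONobject;
begin
  start := GetTickCount;
  obj := TlkJSON.ParseText(JSONFormattedString) as TlkJSONobject;
  try
    // Elapsed wall-clock time in milliseconds
    ShowMessage(Format('Parse took %d ms', [GetTickCount - start]));
  finally
    obj.Free;
  end;
end;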

Crossplatform JSON Parsing

Good evening all.
I'm currently developing a cross-platform version of my product WinFlare. The issue I'm facing is that SuperObject still isn't cross-platform compatible with FireMonkey. I used it in the original version of the product, but now that I want to create a cross-platform version rather than one limited to just Windows, it has become a hassle.
DBXJSON is the only cross-platform solution I've been able to find after extensive hours of research, but it is proving frustrating to deal with. Almost all of the examples I've found either don't apply to my situation or are too complicated to glean anything useful from. There's lots of discussion, but I'm struggling to get to grips with what was such a simple task with SuperObject. I've spent the best part of this evening trying to find something that works to build from, but everything I've tried has led me back to square one.
Ideally, I'd like to fix up SuperObject, but I lack the knowledge to go so in depth as to make it cross-platform compatible with OS X (and ready for the mobile studio). I'd welcome any suggestions on that, but as I imagine no one's got the time to go through such a huge task, it looks like DBXJSON is my only option.
The JSON layout I'm dealing with is still the same;
{
  "response": {
    "ips": [
      {
        "ip": "xxx.xxx.xxx.xxx",
        "classification": "threat",
        "hits": xx,
        "latitude": xx,
        "longitude": xx,
        "zone_name": "domain-example1"
      },
      {
        "ip": "yyy.yyy.yyy.yyy",
        "classification": "robot",
        "hits": yy,
        "latitude": xx,
        "longitude": xx,
        "zone_name": "domain-example2"
      }
    ]
  },
  "result": "success",
  "msg": null
}
There can be hundreds of results in the ips array. Let's say I want to parse through all of the items in the array and extract every latitude value. Let's also assume for a second, I'm intending to output them to an array. Here's the sort of code template I'd like to use;
procedure ParseJsonArray_Latitude(SInput: string);
var
  i: Integer;
  JsonArray: TJsonArray;
begin
  // SInput is the retrieved JSON in string format
  { Extract Objects from array }
  for i := 0 to JsonArray.Size - 1 do
  begin
    Array_Latitude[i] := JsonArray.Item[i].ToString;
  end;
end;
Essentially, where it says { Extract Objects from array }, I'd like the most basic solution using DBXJSON that would solve my problem. Obviously, the calls I've shown related to JsonArray in the template above might not be correct - they're merely there to serve as an aid.
First, parse the string to get an object.
var
  obj: TJsonObject;
...
obj := TJsonObject.ParseJsonValue(SInput) as TJsonObject;
That gives you an object with three attributes, response, result, and msg. Although ParseJsonValue is a method of TJsonObject, and your particular string input happens to represent an object value, it can return instances of any TJsonValue descendant depending on what JSON text it's given. Knowing that's where to start is probably the hardest part of working with DbxJson.
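A defensive sketch of that first step (an illustration, not part of the original answer): ParseJsonValue returns nil when the text fails to parse, so checking the type before casting avoids an invalid-cast exception.
var
  value: TJsonValue;
  obj: TJsonObject;
...
value := TJsonObject.ParseJsonValue(SInput);
if value is TJsonObject then
  obj := TJsonObject(value)
else
begin
  value.Free; // Free is safe to call even on a nil reference
  raise Exception.Create('Expected a JSON object at the top level');
end;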
Next, get the response attribute value.
response := obj.Get('response').JsonValue as TJsonObject;
That result should be another object, this time with one attribute, ips. Get that attribute, which should have an array for a value.
ips := response.Get('ips').JsonValue as TJsonArray;
Finally, you can get the values from the array. It looks like you're expecting the values to be numbers, so you can cast them that way.
for i := 0 to Pred(ips.Size) do
  Array_Latitude[i] := (ips.Get(i) as TJsonObject).Get('latitude').JsonValue as TJsonNumber;
Remember to free obj, but not the other variables mentioned here, when you're finished.
For completion, since the question stated that there was no alternative to DBXJSON for cross-platform, I would like to point out two Open Source alternatives, which appeared since the initial question.
XSuperObject has an API very close to SuperObject, but is cross-platform;
Our SynCrossPlatformJSON.pas unit, which is lighter and much faster than both DBXJSON and XSuperObject.
SynCrossPlatformJSON is able to create schema-less objects or arrays, serialize and unserialize them as JSON, via a custom variant type, including late-binding to access the properties.
For your problem, you could write:
var
  doc: variant;
  ips: PJSONVariantData; // direct access to the array
  i: integer;
  Arr_Lat: array of variant;
...
doc := JSONVariant(SInput); // parse the JSON input and fill the doc custom variant type
if doc.response.result = 'success' then // easy late-binding access
begin
  ips := JSONVariantData(doc.response.ips); // late-binding access into the array
  SetLength(Arr_Lat, ips.Count);
  for i := 0 to ips.Count - 1 do
  begin
    Arr_Lat[i] := ips.Values[i].latitude;
    Memo1.Lines.Add(ips.Values[i].latitude);
  end;
end;
... // (nothing to free, since we are using variants for storage)
Late-binding and variant storage allow pretty readable code.
Thanks to assistance from Rob Kennedy, I managed to build a solution that solved the problem;
var
  obj, response, arrayobj: TJSONObject;
  ips: TJSONArray;
  JResult: TJsonValue;
  i: Integer;
  Arr_Lat: array of string;
begin
  try
    Memo1.Lines.Clear;
    obj := TJsonObject.ParseJSONValue(SInput) as TJSONObject;
    response := obj.Get('response').JsonValue as TJSONObject;
    ips := response.Get('ips').JsonValue as TJSONArray;
    SetLength(Arr_Lat, ips.Size); // one slot per array element (Size-1 would drop the last one)
    for i := 0 to ips.Size - 1 do
    begin
      arrayobj := ips.Get(i) as TJSONObject;
      JResult := arrayobj.Get('latitude').JsonValue;
      Arr_Lat[i] := JResult.Value;
      Memo1.Lines.Add(JResult.Value);
    end;
  finally
    obj.Free;
  end;
end;
This adds the results to the array (Arr_Lat) and outputs them to the memo (Memo1).