Base64 is breaking lines when encoding in Delphi [duplicate]

This question already has answers here:
Convert BitMap to string without line breaks?
(2 answers)
Closed 3 years ago.
I am encoding an image to Base64 using the following code snippet in Delphi.
procedure TWM.WMactArquivosAction(Sender: TObject; Request: TWebRequest;
  Response: TWebResponse; var Handled: Boolean);
var
  ImagePath: string;
  JsonObject: TJSONObject;
  inStream, outStream: TStream;
  StrList: TStringList;
begin
  inStream := TFileStream.Create(ImagePath, fmOpenRead);
  try
    outStream := TFileStream.Create('final_file', fmCreate);
    JsonObject := TJSONObject.Create;
    try
      TNetEncoding.Base64.Encode(inStream, outStream);
      outStream.Position := 0;
      StrList := TStringList.Create;
      StrList.LoadFromStream(outStream);
      JsonObject.AddPair('file', StrList.Text);
    finally
      Response.Content := JsonObject.ToString;
      outStream.Free;
      JsonObject.DisposeOf;
    end;
  finally
    inStream.Free;
  end;
end;
It works fine, the file is converted to Base64 and added to the JsonObject.
The problem is that when retrieving this JsonObject from the webserver I get badly formatted JSON, because there are line breaks in the Base64 string.
In the screenshot of the response, the string is highlighted in red; after the first line break the JSON is broken and the remainder shows in blue, meaning there is an error in the JSON response.
The problem
So the problem is that the Base64 encoder adds line breaks to the string, and raw line breaks are not allowed inside a JSON string value.
My Guess
I have a workaround, which does work, but I am not sure it is the best solution.
I looped through all the strings in the TStringList and appended them to a TStringBuilder; finally, I added the TStringBuilder's contents to the JSON. Here is my code.
...
var
  ...
  StrBuilder: TStringBuilder;
begin
  ...
  try
    ...
    StrList.LoadFromStream(outStream);
    // New
    StrBuilder := TStringBuilder.Create;
    for I := 0 to StrList.Count - 1 do
      StrBuilder.Append(StrList.Strings[I]);
    JsonObject.AddPair('file', StrBuilder.ToString);
  finally
    Response.Content := JsonObject.ToString;
    ...
  end;
  ...
end;
As you can see the JSON is fine now.
The question
Looping through all the items seems like a poor solution to me; will it work reliably? (The response arrives in 344 ms on localhost.)
Is there a better solution?

Instead of using the convenience instance TNetEncoding.Base64, create your own TBase64Encoding instance and pass 0 for the CharsPerLine parameter of its constructor.
var
  encoding: TBase64Encoding;
...
encoding := TBase64Encoding.Create(0);
try
  encoding.Encode(inStream, outStream);
finally
  encoding.Free;
end;
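Putting it together, here is a minimal sketch of the handler built around a dedicated TBase64Encoding. It is an illustration, not the original code: FileToBase64Json is a made-up name, and it assumes System.IOUtils, System.NetEncoding and System.JSON are in the uses clause. Encoding straight to a string means the temporary file and the TStringList are no longer needed.
uses
  System.SysUtils, System.Classes, System.IOUtils, System.NetEncoding, System.JSON;

function FileToBase64Json(const ImagePath: string): string;
var
  Encoding: TBase64Encoding;
  JsonObject: TJSONObject;
begin
  // CharsPerLine = 0 disables the CRLF breaks that the default
  // TNetEncoding.Base64 instance inserts every 76 characters.
  Encoding := TBase64Encoding.Create(0);
  try
    JsonObject := TJSONObject.Create;
    try
      JsonObject.AddPair('file',
        Encoding.EncodeBytesToString(TFile.ReadAllBytes(ImagePath)));
      Result := JsonObject.ToJSON;
    finally
      JsonObject.Free;
    end;
  finally
    Encoding.Free;
  end;
end;
The web action handler could then simply assign Response.Content := FileToBase64Json(ImagePath);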

Related

How to save JSONObject to json file in UTF8 encoding Rad Studio/Delphi

I use RAD Studio 11. I read information from a JSON file (the file is UTF-8 encoded) and convert it to a TJSONObject.
Then I make changes to this object and want to save it back to a JSON file. The information is successfully written to the file, but the file ends up with Windows-1251 encoding. What needs to be done to make the file encoding UTF-8? I need this because the JSON file includes Russian characters (with Windows-1251 encoding they look like '?').
I read from a file like this:
var
  inputfile: TextFile;
  str: string;
...
if OpenDialog1.Execute then begin
  AssignFile(inputfile, OpenDialog1.FileName);
  Reset(inputfile);
  while not Eof(inputfile) do
  begin
    ReadLn(inputfile, str);
    str1 := str1 + UTF8ToANSI(str);
  end;
  CloseFile(inputfile);
end;
I convert to Jsonobject like this:
LJsonObj:=TJSONObject.ParseJSONValue(str1) as TJSONobject;
Trying to save JsonObject like this:
var
  listStr: TStringList;
  Size: Integer;
  I: Integer;
...
Size := Form3.LJsonObj.Count;
listStr := TStringList.Create;
try
  listStr.Add('{');
  if Size > 0 then
    listStr.Add(LJsonObj.Get(0).ToString);
  ShowMessage(LJsonObj.Get(0).ToString);
  for I := 1 to Size - 1 do
  begin
    listStr.Add(',');
    listStr.Add(AnsiToUtf8(LJsonObj.Get(I).ToString));
  end;
  listStr.Add('}');
  // Form1.filepath is the path of the file; Form1.filename is the file name without extension
  listStr.SaveToFile(Form1.filepath + '\' + form1.filename + '.json');
finally
  listStr.Free;
end;
Why are you reading the file using old-style Pascal file I/O? And why are you converting between UTF-8 and ANSI? You are using a Unicode version of Delphi, you should not be dealing with ANSI at all.
In any case:
When reading the file, consider using TStringList.LoadFromFile() or TFile.ReadAllText() instead. Both allow you to specify UTF-8 as the source encoding.
When writing the file, consider using TStringList.SaveToFile() or TFile.WriteAllText() instead. Both allow you to specify UTF-8 as the target encoding.
For example:
var
  inputfile: TStringList;
  str1: string;
  ...
begin
  ...
  inputfile := TStringList.Create;
  try
    inputfile.LoadFromFile(OpenDialog1.FileName, TEncoding.UTF8);
    str1 := inputfile.Text;
  finally
    inputfile.Free;
  end;
  ...
end;
...
var
  listStr: TStringList;
  ...
begin
  ...
  listStr.SaveToFile(Form1.filepath + '\' + form1.filename + '.json', TEncoding.UTF8);
  ...
end;
var
  str1: string;
  ...
begin
  ...
  str1 := TFile.ReadAllText(OpenDialog1.FileName, TEncoding.UTF8);
  ...
end;
...
var
  listStr: TStringList;
  ...
begin
  ...
  TFile.WriteAllText(Form1.filepath + '\' + form1.filename + '.json', listStr.Text, TEncoding.UTF8);
  ...
end;
Note that you don't really need to use a TStringList to build up JSON syntax manually. TJSONObject has ToString() and ToJSON() methods to handle that for you. But, if you really want to build up your own JSON syntax manually, consider using TJSONObjectBuilder or TJsonTextWriter for that purpose instead.
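For illustration, here is a rough sketch using TJsonTextWriter (WriteSampleJson is an illustrative name; the sketch assumes the System.JSON.Types and System.JSON.Writers units of recent Delphi versions, so check the documentation for the exact constructor overloads before relying on it):
uses
  System.Classes, System.SysUtils, System.JSON.Types, System.JSON.Writers;

procedure WriteSampleJson(const FileName: string);
var
  StreamWriter: TStreamWriter;
  Writer: TJsonTextWriter;
begin
  // TStreamWriter with TEncoding.UTF8 takes care of the file encoding.
  StreamWriter := TStreamWriter.Create(FileName, False, TEncoding.UTF8);
  try
    Writer := TJsonTextWriter.Create(StreamWriter);
    try
      Writer.Formatting := TJsonFormatting.Indented;
      Writer.WriteStartObject;
      Writer.WritePropertyName('name');
      Writer.WriteValue('значение'); // Russian text is written out as UTF-8
      Writer.WriteEndObject;
    finally
      Writer.Free;
    end;
  finally
    StreamWriter.Free;
  end;
end;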
No need to loop over the JSONObject. Just use:
TFile.WriteAllBytes(Form1.filepath + '\' + form1.filename + '.json', TEncoding.UTF8.GetBytes(LJsonObj.ToJSON));

Decoding and comparing JSON with accented char

I have an IntraWeb app. In the HTML template, I have Javascript creating a JSON document.
This JSON is sent to the IntraWeb backend and I receive the JSON as:
{"order":"Razão Social"}
I parse the JSON and put "Razão Social" in a var _order.
My problem is that when I try to compare that value with a string, the comparison fails; I am having some problem with the encoding. The line
if uppercase(_order) = 'RAZÃO SOCIAL' then
is always false.
I put a breakpoint and I can see the accented char is not OK.
s := aParams.Values['xorder'];
if s <> '' then begin
  jso := TJSONObject.ParseJSONValue(TEncoding.UTF8.GetBytes(s), 0) as TJSONObject;
  try
    jso.TryGetValue<string>('order', _order);
  finally
    jso.Free;
  end;
end;
if UpperCase(_order) = 'RAZÃO SOCIAL' then
  _order := 'Order by A.razao_social ';
UpperCase supports ASCII characters only. Instead, compare the strings case-insensitively using AnsiCompareText or AnsiSameText, which are Unicode-aware.
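For example, the comparison above could be written as follows (a small sketch; AnsiSameText lives in System.SysUtils):
// Case-insensitive, Unicode-aware comparison: no UpperCase needed.
if AnsiSameText(_order, 'Razão Social') then
  _order := 'Order by A.razao_social ';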

Does SuperObject have UTF-8 support

I have been using SuperObject for all my JSON parsing needs, and today I ran into a problem that I cannot seem to fix. I downloaded a JSON file that had an entry in it that looked like this: "place" : "café". When I tried to parse the file and show the value in a message box, the word café came out as cafÃ©, which tells me that some kind of conversion failure is going on when the file is parsed using SuperObject. So before I invest any more time in this library, I would like to know whether it supports UTF-8 and, if so, how I would go about enabling it.
By the way, the pseudo-code I am using to parse the file looks something like this:
uses
  SuperObject;
...
const
  jsonstr = '{ "Place" : "café" }';
...
var
  SupOB: ISuperObject;
begin
  SupOB := SO(jsonstr);
  ShowMessage(SupOB['Place'].AsString);
end;
Is the conversion failing because I am casting the object as a string? I also tried using AsJson to see if that would have any effect, but it did not, so I am not sure what is needed to make values like these display as intended and would appreciate some help. Finally, I have checked and verified that the original file being parsed is indeed encoded as UTF-8.
You say you are parsing a file, but your example is parsing a string. That makes a big difference, because if you are reading file data into a string first, you are likely not reading the file data correctly. Remember that Delphi strings use UTF-16 in Delphi 2009 and later, but ANSI in earlier versions; either way, not UTF-8. So if your input file is UTF-8 encoded, you must decode its data to the proper string encoding before you can parse it. cafÃ© is the UTF-8 encoded form of café being mis-interpreted as ANSI.
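For instance, on a Unicode Delphi (2009 or later) the decoding step could look like this (a sketch; 'place.json' is a placeholder path and TFile.ReadAllText comes from System.IOUtils):
uses
  System.SysUtils, System.IOUtils, SuperObject;

var
  jsonstr: string;
  SupOB: ISuperObject;
begin
  // Decode the UTF-8 bytes into a native UTF-16 string before parsing.
  jsonstr := TFile.ReadAllText('place.json', TEncoding.UTF8);
  SupOB := SO(jsonstr);
  ShowMessage(SupOB['Place'].AsString);
end;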
Reading and writing UTF-8 encoded JSON files; tested on Delphi 2007.
function ReadSO(const aFileName: string): ISuperObject;
var
  input: TFileStream;
  output: TStringStream;
begin
  input := TFileStream.Create(aFileName, fmOpenRead, fmShareDenyWrite);
  try
    output := TStringStream.Create('');
    try
      output.CopyFrom(input, input.Size);
      Result := TSuperObject.ParseString(PWideChar(UTF8ToUTF16(output.DataString)), True, True);
    finally
      output.Free;
    end;
  finally
    input.Free;
  end;
end;

procedure WriteSO(const aFileName: string; o: ISuperObject);
var
  output: TFileStream;
  input: TStringStream;
begin
  input := TStringStream.Create(UTF16ToUTF8(o.AsJSon(True)));
  try
    output := TFileStream.Create(aFileName, fmOpenWrite or fmCreate, fmShareDenyWrite);
    try
      output.CopyFrom(input, input.Size);
    finally
      output.Free;
    end;
  finally
    input.Free;
  end;
end;
The functions UTF8ToUTF16 and UTF16ToUTF8 come from the unit JclConversions (http://sourceforge.net/projects/jcl/).

Does the ulkJSON library have limitations when dealing with base64 in Delphi 7?

I'm working on a project that uses Delphi 7 to consume RESTful services. We are creating and decoding JSON with the ulkJSON library. Up to this point I've been able to successfully build and send JSON containing a Base64 string that exceeds 5,160 KB. I can verify that the Base64 is received by the services and verify its integrity once it is there. In addition to sending, I can also receive and successfully decode JSON with a smaller (~256 KB or less) Base64 string.
However, I am experiencing some issues on the return trip when a larger (~1,024 KB+) Base64 string is involved. Specifically, when attempting to use the following JSON format and function combination:
JSON:
{
  "message" : "/9j/4AAQSkZJRgABAQEAYABgAAD...."
}
Function:
function checkResults(JSONFormattedString: String): String;
var
  jsonObject: TlkJSONObject;
  iteration: Integer;
  i: Integer;
  x: Integer;
begin
  jsonObject := TlkJSONobject.Create;
  // Validate that the JSONFormatted string is not empty.
  // If it is empty, inform the user/programmer, and exit from this routine.
  if JSONFormattedString = '' then
  begin
    result := 'Error: JSON returned is Null';
    jsonObject.Free;
    exit;
  end;
  // Now that we can validate that this string is not empty, we are going to
  // assume that the string is a JSONFormatted string and attempt to parse it.
  //
  // If the string is not a valid JSON object (such as an http status code)
  // throw an exception informing the user/programmer that an unexpected value
  // has been passed. And exit from this routine.
  try
    jsonObject := TlkJSON.ParseText(JSONFormattedString) as TlkJSONobject;
  except
    on e: Exception do
    begin
      result := 'Error: No JSON was received from web services';
      jsonObject.Free;
      exit;
    end;
  end;
  // Now that the object has been parsed, lets check the contents.
  try
    result := jsonObject.Field['message'].value;
    jsonObject.Free;
    exit;
  except
    on e: Exception do
    begin
      result := 'Error: No Message received from Web Services ' + e.message;
      jsonObject.Free;
      exit;
    end;
  end;
end;
As mentioned above, when using this function I am able to get small (256 KB and less) Base64 strings out of the 'message' field of a JSON object. But for some reason, if the received JSON is larger than, say, 1,024 KB, the following line seems to just stop in its tracks:
jsonObject := TlkJSON.ParseText(JSONFormattedString) as TlkJSONobject;
No errors, no results. Following the debugger, I can go into the library and see that the JSON string being passed is not considered to be JSON, despite being in the format listed above. The only difference I can find between calls that work as expected and calls that do not is the size of the Base64 being transmitted.
Am I missing something completely obvious, and should I be shot for my implementation (very possible)? Have I missed some note about the limitations of the ulkJSON library? Any input would be extremely helpful. Thanks in advance, Stack!
So, after investigating this for hours over the course of some time, I discovered that the library was indeed working properly and there was no issue.
The issue came down to the performance of my machine: it was taking on average 215,802 milliseconds (about 3.6 minutes) to process a moderately sized image (1.2 MB) in Base64 format. The time scaled with the size of the Base64 string (shorter for smaller strings, longer for larger ones).

How to encode output json file (SuperObject)?

I'm using the SuperObject library for working with JSON.
This code creates JSON:
procedure TfmMain.btnIngredientsSaveClick(Sender: TObject);
var
  obj: ISuperObject;
  i: integer;
begin
  try
    obj := SO();
    for i := 0 to sgIngredients.RowCount - 2 do
    begin
      obj.O[sgIngredients.Cells[0, i+1]] := SA([]);
      obj.A[sgIngredients.Cells[0, i+1]].S[0] := sgIngredients.Cells[1, i+1];
      obj.A[sgIngredients.Cells[0, i+1]].S[1] := sgIngredients.Cells[2, i+1];
    end;
    obj.SaveTo(ExtractFileDir(Application.ExeName) + ingrJSONFile);
  finally
    obj := nil;
  end;
end;
sgIngredients is a TStringGrid and contains Cyrillic text, so the output file is:
{
  "4":["Hello","count"],
  "3":["\u0411\u0443\u043b\u044c\u043e\u043d \u043e\u0432\u043e\u0449\u043d\u043e\u0439","\u0441\u0442."],
  "2":["\u0411\u0443\u043b\u044c\u043e\u043d \u043a\u0443\u0440\u0438\u043d\u044b\u0439","\u0441\u0442."],
  "1":["\u0411\u0435\u043a\u043e\u043d","\u0433\u0440."]
}
How do I correctly save my data to a JSON file?
EDIT
(Screenshot of the string grid omitted.)
Reading the sources, you can call the function TSuperObject.SaveTo(stream: TStream; indent, escape: boolean): integer; and pass escape := False.
I can say it again: when using libraries that come with their source code, just "Use the Source, Luke".
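Applied to the saving code in the question, that could look like this (a sketch based on the SaveTo overloads visible in superobject.pas; the exact parameter defaults may differ between versions):
// indent = True, escape = False: Cyrillic characters are written as-is
// instead of \uXXXX escape sequences.
obj.SaveTo(ExtractFileDir(Application.ExeName) + ingrJSONFile, True, False);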
Also, you may save the JSON to a string, then replace the escaped characters with actual WideChar values (as was done in http://UniRed.sf.net or at http://www.sql.ru/forum/936760/perevesti-kodirovannye-simvoly-funkciya-v-delphi-analog-iz-js), and then save the resulting string to a file while enforcing the UTF-8 charset.
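A rough sketch of that replacement step (a hypothetical helper, not from the answer; it handles only plain \uXXXX escapes for BMP characters and does not validate the hex digits):
// Replace "\uXXXX" escape sequences with the characters they encode,
// so the resulting string can then be saved with UTF-8 enforced.
function UnescapeJsonUnicode(const S: string): string;
var
  I: Integer;
begin
  Result := '';
  I := 1;
  while I <= Length(S) do
  begin
    if (S[I] = '\') and (I + 5 <= Length(S)) and (S[I + 1] = 'u') then
    begin
      Result := Result + Char(StrToInt('$' + Copy(S, I + 2, 4)));
      Inc(I, 6);
    end
    else
    begin
      Result := Result + S[I];
      Inc(I);
    end;
  end;
end;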