I need to store QML source code in a JSON file, in such a way that the formatting (newlines, spacing, and so on) is preserved. I thought about programmatically inserting special Unicode characters that I would never use in my source code as markers into the JSON (when saving it) to represent newlines and spaces. When reading the source code back from JSON, I would replace these markers with either a newline or a space. However, this doesn't feel like a very robust solution.
Is there a better way to do this?
You can use QByteArray::fromBase64() to convert a saved Base64 string from the JSON back into QML source:
void SourceCodeSerialiser::read(const QJsonObject &json)
{
    mQml = QByteArray::fromBase64(json["qml"].toString().toUtf8());
}
And QByteArray::toBase64() to convert the QML source into a Base64 string that can be stored in JSON:
void SourceCodeSerialiser::write(QJsonObject &json) const
{
    json["qml"] = QString(mQml.toUtf8().toBase64());
}
(mQml is a QString)
This turns the following QML:
import QtQuick 2.0

Item {
    id: item
}
into this Base64 string:
aW1wb3J0IFF0UXVpY2sgMi4wCgpJdGVtIHsKICAgIGlkOiBpdGVtCn0=
As mentioned by @dtech, it's also possible to compress the byte array using qCompress() and qUncompress() to save some space:
void SourceCodeSerialiser::read(const QJsonObject &json)
{
    mQml = qUncompress(QByteArray::fromBase64(json["qml"].toString().toUtf8()));
}

void SourceCodeSerialiser::write(QJsonObject &json) const
{
    json["qml"] = QString(qCompress(mQml.toUtf8(), 9).toBase64());
}
This results in the following Base64 string:
AAAAKXjay8wtyC8qUQgsCSzNTM5WMNIz4OLyLEnNVajmUgCCzBQrhUwgl6sWABKDDFM=
This is larger than the uncompressed version because the QML snippet is so small; larger QML files will see a benefit from compression.
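The round trip can also be illustrated outside of Qt. Below is a language-neutral sketch in Python of the same idea: Base64 preserves every newline and space byte-for-byte, and zlib (which qCompress() uses internally, with an extra 4-byte length header prepended) can shrink larger sources. The snippet is the one from this answer; nothing here is part of a Qt API.

```python
import base64
import zlib

qml = "import QtQuick 2.0\n\nItem {\n    id: item\n}"

# Base64 round trip: the formatting survives exactly.
encoded = base64.b64encode(qml.encode("utf-8")).decode("ascii")
decoded = base64.b64decode(encoded).decode("utf-8")
assert decoded == qml

# This matches the Base64 string shown above.
assert encoded == "aW1wb3J0IFF0UXVpY2sgMi4wCgpJdGVtIHsKICAgIGlkOiBpdGVtCn0="

# Optional compression before encoding (the idea behind qCompress/qUncompress).
compressed = base64.b64encode(zlib.compress(qml.encode("utf-8"), 9)).decode("ascii")
restored = zlib.decompress(base64.b64decode(compressed)).decode("utf-8")
assert restored == qml
```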
Related
I use List & Label to generate various reports from database content. The output is handled by a middleware service which receives the data as a JSON payload.
So far I have not managed to display images/pictures in reports, e.g. an item picture on a customer quotation.
Does anyone know which graphic format the report designer's picture object expects, and/or which function has to be used?
Drawing({String}) expects a path to a physical image file, so I can't use it, because the image data is only available as raw image text.
Any help is highly appreciated.
It will work if you handle these points:
the field within the JSON needs to be Base64-encoded - you can get it like this:
using (var memory = new MemoryStream())
using (var picture = new Bitmap(<YOURPIC>))
{
    picture.Save(memory, System.Drawing.Imaging.ImageFormat.Jpeg);
    // Use ToArray() rather than GetBuffer(): GetBuffer() returns the
    // underlying buffer including unused capacity, which would corrupt
    // the Base64 string.
    var base64 = Convert.ToBase64String(memory.ToArray());
}
use the AutoDefineField event of the List & Label object and override FieldType to LlFieldType.Drawing for the matching picture field with its Base64 content - e.g.:
private void LL_AutoDefineField(object sender, AutoDefineElementEventArgs e)
{
    if (e.Name == "Contacts.myPic")
    {
        e.FieldType = LlFieldType.Drawing;
    }
}
We have an ASP.NET MVC application in which we need to send back a JSON response. The content for this JSON response comes from a PipeReader.
The approach we have taken is to read all the content from the PipeReader using ReadAsync, convert the byte array to a Base64 string, and write that out.
Here is the code sample:
List<byte> bytes = new List<byte>();
try
{
    while (true)
    {
        ReadResult result = await reader.ReadAsync();
        ReadOnlySequence<byte> buffer = result.Buffer;
        bytes.AddRange(buffer.ToArray());
        reader.AdvanceTo(buffer.End);
        if (result.IsCompleted)
        {
            break;
        }
    }
}
finally
{
    await reader.CompleteAsync();
}

byte[] byteArray = bytes.ToArray();
var base64str = Convert.ToBase64String(byteArray);
We have written a JsonConverter which does the conversion to JSON. The JsonConverter holds a reference to a Utf8JsonWriter instance, and we write using the WriteString method on the Utf8JsonWriter.
The above approach requires us to read the entire content from the PipeReader into memory and then write it to the Utf8JsonWriter.
Instead, we want to read a sequence of bytes from the PipeReader, convert it, and write it out immediately. We do not want to materialise the entire content in memory before writing.
Is that even feasible? I don't know whether the conversion can be done in chunks instead of all in one go.
The main reason is that the content coming from the PipeReader can be large, so we want some kind of streaming instead of converting to a string in memory and then writing to the JSON output.
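Chunked Base64 encoding is feasible as long as every chunk you encode (except the last) covers a multiple of 3 input bytes, so that no `=` padding is emitted mid-stream; the concatenated pieces are then identical to encoding the whole payload at once. A hedged sketch of just that buffering idea in Python (the function name and chunking are illustrative, not any ASP.NET API):

```python
import base64

def encode_in_chunks(blocks):
    """Yield Base64 pieces whose concatenation equals b64encode(whole input).

    A 0-2 byte tail is carried over between blocks so that every piece
    except the last encodes a multiple of 3 bytes (no mid-stream padding).
    """
    pending = b""
    for block in blocks:
        pending += block
        usable = len(pending) - (len(pending) % 3)  # keep the 0-2 byte tail
        if usable:
            yield base64.b64encode(pending[:usable]).decode("ascii")
            pending = pending[usable:]
    if pending:
        yield base64.b64encode(pending).decode("ascii")

# Stand-in for byte blocks arriving from a pipe, in awkward sizes.
data = bytes(range(256)) * 40
blocks = [data[i:i + 1000] for i in range(0, len(data), 1000)]
streamed = "".join(encode_in_chunks(blocks))
assert streamed == base64.b64encode(data).decode("ascii")
```

The same invariant is what makes streaming Base64 writers possible in any language: buffer until you have a multiple of 3 bytes, flush, repeat.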
How are you doing?
I'm trying to get data from a JSON to show on a screen that should look like the image below. I'm able to get most of the data, except for one field, coded as a String, which consists of the image and a description, like this one:
"lessonText": "{\"ops\":[{\"insert\":{\"image\":\"data:image/jpeg;base64,(IMAGE CODE HERE)=\"}},{\"attributes\":{\"color\":\"#444444\"},\"insert\":\"(LESSON TEXT HERE)\"},{\"insert\":\"\\n\"}]}",
How do I extract data from here? I have tried to convert this to a Map but it is not working.
Thanks for the help!
Something along these lines should give you the image:
// json string containing the base64 image string
String jsonString = "{\"ops\":[{\"insert\":{\"image\":\"data:image/png;base64,(IMAGE CODE HERE)=\"}},{\"attributes\":{\"color\":\"#444444\"},\"insert\":\"(LESSON TEXT HERE)\"},{\"insert\":\"\\n\"}]}";
// convert the string to a map structure
Map<String, dynamic> json = jsonDecode(jsonString);
// extract the image string
String imageString = json['ops'][0]['insert']['image'];
// extract the base64 string
var prefix = "data:image/png;base64,";
var imageBase64String = imageString.substring(prefix.length);
// decode the base 64 string
var bytes = base64Decode(imageBase64String);
// build the image widget from bytes
var imageWidget = Image.memory(bytes);
As I mentioned in the comments, use a combination of decoding the Base64 string to bytes and then loading the image from memory. See the relevant documentation for base64Decode and Image.memory. If you would like a full code sample, just let me know and I would be happy to throw one together.
Note: you should run the base64Decode method asynchronously, as it may take some time to decode an entire image (especially on lower-end hardware).
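For completeness, the extraction itself can be sketched in a language-neutral way: the lessonText field is a Quill "delta", i.e. a JSON object with an ops array whose entries carry either an image insert (a map) or a text insert (a string). A minimal Python sketch of walking the ops - the key names come from the snippet in the question, the rest is illustrative:

```python
import json

# Same shape as the lessonText field from the question, with a dummy payload.
lesson_text = ('{"ops":[{"insert":{"image":"data:image/png;base64,AAAA"}},'
               '{"attributes":{"color":"#444444"},"insert":"(LESSON TEXT HERE)"},'
               '{"insert":"\\n"}]}')

delta = json.loads(lesson_text)
image_data = None
text_parts = []
for op in delta["ops"]:
    insert = op["insert"]
    if isinstance(insert, dict) and "image" in insert:
        # Strip the data-URI prefix to get the raw Base64 payload.
        image_data = insert["image"].split("base64,", 1)[1]
    elif isinstance(insert, str):
        text_parts.append(insert)

assert image_data == "AAAA"
assert "".join(text_parts).strip() == "(LESSON TEXT HERE)"
```

The same two-branch walk maps directly onto the Dart code above: decode the JSON, then treat map inserts as images and string inserts as text.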
I have a QByteArray, which I want to save in a JSON file using Qt and also be able to read from it again. Since JSON natively can't store raw data, I think the best way would probably be a string? The goal is to save a QPixmap this way:
{
    "format" : "jpg",
    "data" : "...jibberish..."
}
How do I achieve this and how do I read from this JSON Object again (I am using Qt5)? What I have right now looks this way:
QPixmap p;
...
QByteArray ba;
QBuffer buffer(&ba);
buffer.open(QIODevice::WriteOnly);
p.save(&buffer, "jpg");

QJsonObject json;
json["data"] = QString(buffer.data());
QJsonDocument doc(json);
file.write(doc.toJson());
But the resulting 'jibberish' is way too short to contain the whole image.
A QString cannot be constructed from an arbitrary QByteArray. You need to encode the byte array such that it is convertible to a string first. It is somewhat misleading that a QString is constructible from a QByteArray from the C++ semantics point of view. Whether it is really constructible depends on what's in the QByteArray.
QByteArray::toBase64 and fromBase64 are one way of doing it.
Since you would want to save the pixmap without losing its contents, you should not save it in a lossy format like JPG. Use PNG instead. Only use JPG if you're not repeatedly loading and storing the same pixmap while doing the full json->pixmap->json circuit.
There's another gotcha: for a pixmap to store or load itself, it needs to internally convert to/from QImage. This involves potentially color format conversions. Such conversions may lose data. You have to be careful to ensure that any roundtrips are made with the same format.
Ideally, you should be using QImage instead of a QPixmap. In modern Qt, a QPixmap is just a thin wrapper around a QImage anyway.
// https://github.com/KubaO/stackoverflown/tree/master/questions/pixmap-to-json-32376119
#include <QtGui>

QJsonValue jsonValFromPixmap(const QPixmap &p) {
    QBuffer buffer;
    buffer.open(QIODevice::WriteOnly);
    p.save(&buffer, "PNG");
    auto const encoded = buffer.data().toBase64();
    return {QLatin1String(encoded)};
}

QPixmap pixmapFrom(const QJsonValue &val) {
    auto const encoded = val.toString().toLatin1();
    QPixmap p;
    p.loadFromData(QByteArray::fromBase64(encoded), "PNG");
    return p;
}

int main(int argc, char **argv) {
    QGuiApplication app{argc, argv};
    QImage img{32, 32, QImage::Format_RGB32};
    img.fill(Qt::red);
    auto pix = QPixmap::fromImage(img);
    auto val = jsonValFromPixmap(pix);
    auto pix2 = pixmapFrom(val);
    auto img2 = pix2.toImage();
    Q_ASSERT(img == img2);
}
I want to insert data from an Excel file into a local database on a UNIX server with Java, without any manipulation of the data.
1. Someone told me that I have to convert the Excel file into .csv to conform with UNIX. I created a CSV file for each sheet (I have 12) with a macro. The problem is that this changed the date format from DD-MM-YYYY to MM-DD-YYYY. How do I avoid this?
2. I used the LOAD DATA command to insert data from the CSV files into my database. There is a date column that is only optionally filled in the Excel file, so in the CSV it becomes ",,", and LOAD DATA fails (an argument is needed). How can I fix this?
Thanks for your help.
It should be quite easy to read the values from Excel with Apache POI. That way you save yourself the extra step of converting to another format, and you avoid possible problems when your data contains commas and you convert to CSV.
Save the Excel file in CSV (comma-separated values) format. That makes it easy to read and parse with fairly simple use of StringTokenizer.
Use MySQL (or SQLite depending on your needs) and JDBC to load data into the database.
Here is a CSVEnumeration class I developed:
package com.aepryus.util;

import java.util.*;

public class CSVEnumeration implements Enumeration {
    private List<String> tokens = new Vector<String>();
    private int index = 0;

    public CSVEnumeration(String line) {
        for (int i = 0; i < line.length(); i++) {
            StringBuffer sb = new StringBuffer();
            if (line.charAt(i) != '"') {
                // Unquoted field: read up to the next comma.
                while (i < line.length() && line.charAt(i) != ',') {
                    sb.append(line.charAt(i));
                    i++;
                }
                tokens.add(sb.toString());
            } else {
                // Quoted field: read up to the closing quote.
                i++;
                while (i < line.length() && line.charAt(i) != '"') {
                    sb.append(line.charAt(i));
                    i++;
                }
                i++;
                tokens.add(sb.toString());
            }
        }
    }

    // Enumeration =================================================================
    public boolean hasMoreElements() {
        return index < tokens.size();
    }

    public Object nextElement() {
        return tokens.get(index++);
    }
}
If you break the lines of the CSV file up using split and then feed them one by one into the CSVEnumeration class, you can then step through the fields. Or here is some code I have lying around that uses StringTokenizer to parse the lines. csv is a string that contains the entire contents of the file.
StringTokenizer lines = new StringTokenizer(csv, "\n\r");
lines.nextToken(); // skip the header line
while (lines.hasMoreElements()) {
    String line = lines.nextToken();
    Enumeration e = new CSVEnumeration(line);
    for (int i = 0; e.hasMoreElements(); i++) {
        String token = (String) e.nextElement();
        switch (i) {
            case 0: /* do stuff */ break;
        }
    }
}
I suggest MySQL for its performance and because it is open source.
There are two situations:
If you just want to store the Excel cell values in the database, you can convert the Excel file to CSV format and simply use the LOAD DATA command in MySQL.
If you have to do some manipulation before the values go into the tables, I suggest Apache POI. I've used it and it works fine; whatever format of Excel you have, you just have to use the correct implementation.
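On the asker's first point (the macro flipping DD-MM-YYYY to MM-DD-YYYY): if you parse the cell values yourself before loading them, you control the output format and the problem disappears, and empty optional dates can be mapped to NULL instead of breaking LOAD DATA. A small illustrative sketch in Python - the formats come from the question, the helper name is made up:

```python
from datetime import datetime

def to_mysql_date(cell, source_format="%d-%m-%Y"):
    """Parse a DD-MM-YYYY cell into the YYYY-MM-DD form MySQL expects.

    Empty/optional cells become None (i.e. NULL) instead of producing an
    empty argument that LOAD DATA rejects.
    """
    if not cell:
        return None
    return datetime.strptime(cell, source_format).strftime("%Y-%m-%d")

assert to_mysql_date("31-12-2023") == "2023-12-31"
assert to_mysql_date("") is None
```

The same parse-then-reformat step can be done in Java (e.g. with DateTimeFormatter) while reading cells with POI, before anything is written to CSV or the database.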
We are using SQLite in our Java application. It's serverless, really simple to use and very efficient.