I'm having some trouble understanding the ingestion of JSON entries (from Event Hubs) into Kusto / ADX. I can't seem to get the GetPathElement transform to work. I'd expected that something like
[{"test":"name","path":"$.content.something","transform":"GetPathElement(0)"}]
would work (according to the documentation). Unfortunately, I get the (imo) undocumented error:
Value 'GetPathElement(0)' used in a switch/case is invalid
Can someone give me a hint/example on how GetPathElement should work?
This transformation option indeed doesn't work.
I am not sure this transformation is meant to do what you are expecting from it.
If it had worked correctly, it would put the constant word 'something' into each ingested row.
Is this what you meant to get? If yes, you can use the 'ConstValue' property of the mapping.
If you mean anything different, please explain in more detail.
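For illustration, a mapping along these lines should produce that constant (a sketch using the 'Properties'/'ConstValue' syntax from the mapping docs; the column name is an assumption, and the exact casing should be checked against the documentation):
[{"Column": "name", "Properties": {"ConstValue": "something"}}]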
Can you share your ingest statement?
Also, you can use one-click ingestion: don't complete the setup, just point it at the JSON file and it will give you the ingest commands, table schema commands, etc., which you can then run in Kusto Explorer or through the Azure Portal.
https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-one-click
I want to convert incoming JSON data from Kafka into a dataframe.
I am using structured streaming with Scala 2.12
Most people add a hard-coded schema, but if the JSON can have additional fields, that requires changing the code base every time, which is tedious.
One approach is to write it to a file and infer the schema from it, but I'd rather avoid doing that.
Is there any other way to approach this problem?
Edit: Found a way to turn a JSON string into a dataframe, but I can't extract it from the stream source. Is it possible to extract it?
One way is to store the schema itself in the message headers (not in the key or value).
Though this increases the message size, it makes it easy to parse the JSON value without the need for any external resource like a file or a schema registry.
New messages can have new schemas while old messages can still be processed, because each schema travels within its own message.
Alternatively, you can version the schemas and include an id for every schema in the message headers (or a magic byte in the key or value) and infer the schema from there.
This approach is followed by the Confluent Schema Registry. It lets you go through different versions of the same schema and see how your schema has evolved over time.
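A minimal sketch of reading such headers with structured streaming (Spark 3.0+, which exposes headers via the includeHeaders option; the broker address, topic name, and the 'schema' header key are assumptions):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("schema-in-headers").getOrCreate()

// Read the Kafka value together with its headers (array<struct<key,value>>).
val raw = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092") // assumed broker
  .option("subscribe", "events")                       // assumed topic
  .option("includeHeaders", "true")
  .load()
  .selectExpr(
    "CAST(value AS STRING) AS json",
    // pick the schema header out of the header array ('schema' key is assumed)
    "filter(headers, h -> h.key = 'schema')[0].value AS schemaBytes")

Because from_json takes one schema per query, you would decode schemaBytes per micro-batch (for example in foreachBatch) and apply it there.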
Read the data as a string and then convert it to Map[String, String]; this way you can process any JSON without even knowing its schema.
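For instance, a sketch of that idea with from_json (flat, string-valued JSON is assumed; df stands for the streaming dataframe read from Kafka):

import org.apache.spark.sql.functions.{col, from_json}
import org.apache.spark.sql.types.{MapType, StringType}

// Every value becomes a string key/value pair, so no schema is needed up front.
val kv = df.select(
  from_json(col("value").cast("string"), MapType(StringType, StringType)).as("fields"))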
Based on JavaTechnical's answer, the best approach would be to use a schema registry and
Avro data instead of JSON; there is no way around hardcoding a schema (for now).
Include your schema name and id as a header and use them to read the schema from the schema registry.
Use the from_avro function to turn that data into a df!
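A sketch of that last step (Spark's spark-avro module; fetchSchemaFromRegistry is a hypothetical helper that looks the schema up by the id taken from the message headers):

import org.apache.spark.sql.avro.functions.from_avro
import org.apache.spark.sql.functions.col

// from_avro takes the Avro schema as a JSON string.
val avroSchemaJson: String = fetchSchemaFromRegistry(schemaId) // hypothetical helper
val decoded = df.select(from_avro(col("value"), avroSchemaJson).as("data")).select("data.*")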
I'm working on a Flutter mobile application that should be attached (connected) to the web platform I developed with Laravel. I want to generate a JSON file from Postgres dynamically; I mean, when I update anything in the database it should be updated in the mobile app too, and I need to display the data in the mobile application.
I followed this tutorial and I understood that I must convert the database or the tables into a JSON file. How am I going to do that, please? It's the first time I'm working with Flutter and JSON.
https://www.youtube.com/watch?v=m7b7_Nq7XSs&list=PLK7ZDJTUghFAmRR4mueiai7zq1RJfMQ62&index=11&t=1s
If you're just getting started, please take your time and familiarize yourself with the basics and with how Flutter treats data coming from a database.
Something else you should read and understand is JSON and serialization.
Based on that, it is not advisable to retrieve JSON right from the database. Instead, JSON serialization happens in one way or another inside Flutter, using one of the recommended approaches.
Specifically for working with PostgreSQL, there seems to be a decent tutorial.
Please keep in mind that what you actually asked for here ("... database to JSON file") indicates you really want a file as output, which is completely contrary to the API that you're going to provide to Flutter.
Of course it is possible to query PostgreSQL and get the result already in JSON format, but that also means you won't be able to work with the data model inside Flutter.
However, once you know what you are doing, here is a way to get the result of any PostgreSQL query directly as JSON:
SELECT json_agg(t) FROM (
  SELECT ...whatever you can think of...
) AS t;
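For example, with a hypothetical products table:

-- returns a single row holding one JSON array of row objects
SELECT json_agg(t) FROM (
  SELECT id, name, price FROM products ORDER BY id
) AS t;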
If you are using a Laravel version greater than 5, you can use API resources to create the API and connect to PostgreSQL (https://laravel.com/docs/5.7/database). It is very simple to create an API using Laravel API resources. Then in Flutter the only thing you have to do is request the endpoints you created with Laravel.
I'm trying to build a logic app that inserts data into a SQL database. The data comes from a Stream Analytics job, which outputs it to a Service Bus topic; it is consumed in Logic Apps with the Service Bus trigger.
To populate the properties of the inserted row (let's say it only has one column, 'Name'), I've found that this should work using the following syntax:
"body": {
"Name": "#{json(decodeBase64(triggerBody()['ContentData'])).Name}"
},
Provided the message body contains a 'Name' property.
However, I get the following error message when running this:
{"code":"InvalidTemplate","message":"Unable to process template language expressions in action 'Insert_row' inputs at line '1' and column '2017': 'The template language function 'json' parameter is not valid. The provided value '#\u0006string\b3http://schemas.microsoft.com/2003/10/Serialization/��{\"time\":\"2016-05-25T10:29:17.4953250Z\",\"Name\":\"Y-Axis\",\"Value\":81.0,\"Date\":\"2016-05-25T10:29:17.4953250\",\"EventProcessedUtcTime\":\"2016-05-25T10:29:17.5525449Z\",\"PartitionId\":2,\"EventEnqueuedUtcTime\":\"2016-05-25T10:29:17.2220000Z\"}\u0001' cannot be parsed: 'Unexpected character encountered while parsing value: #. Path '', line 0, position 0.'. Please see https://aka.ms/logicexpressions#json for usage details.'."}
So it seems that the content is enclosed in another envelope that is preventing JSON parsing from working.
1) Is there any simple way to get around this?
2) Isn't such an integration, all within the Microsoft stack, just supposed to work without this messing around?
Thanks,
Stefan
With the limited string functions available in the workflow definition language, I had to use a long-winded way of removing the additional strings:
#{json(substring(replace(decodeBase64(triggerBody()['ContentData']),'#string3http://schemas.microsoft.com/2003/10/Serialization/��', ''),0,sub(length(replace(decodeBase64(triggerBody()['ContentData']),'#string3http://schemas.microsoft.com/2003/10/Serialization/��', '')),1))).fieldname}
Is there a better way to do this?
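One slightly less brittle variant (an untested sketch) trims to the outermost braces with indexOf/lastIndexOf instead of hard-coding the envelope string:

#{json(substring(decodeBase64(triggerBody()['ContentData']), indexOf(decodeBase64(triggerBody()['ContentData']), '{'), add(sub(lastIndexOf(decodeBase64(triggerBody()['ContentData']), '}'), indexOf(decodeBase64(triggerBody()['ContentData']), '{')), 1))).fieldname}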
Thanks for reporting this; you're right, it should simply work. There is a known issue where the ASA Service Bus output JSON is wrapped in an XML header. It will be addressed in the near future, but I can't specify a particular date. Could you please work around it (maybe using substring/replace) until then?
cheers,
Chetan
Sorry for the delay in getting back; this issue is now fixed. It's exposed under a new compatibility level (1.1) to avoid breaking existing solutions. Please use this URL to configure your job with compat level 1.1 and give it a try.
cheers!
My program stack is ReactiveMongo 0.11.0, Scala 2.11.6, Play 2.4.2.
I'm adding PATCH support to my controllers. I want it to be type-safe, so that PATCH cannot mess up the data in Mongo.
My current dirty solution is:
1. Read the object from Mongo first,
2. Perform JsObject.deepMerge with the provided patch,
3. Check that the value can still be deserialized to the target type,
4. Serialize the merged object back to a JsObject and check that the patch contains only fields present in the merged JSON (so that no trash is added to the stored object),
5. Call the actual $set on Mongo.
A sketch of steps 2-4 follows below.
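A minimal sketch of the merge-and-revalidate steps, assuming a hypothetical User model with a macro-generated format (the shallow key check stands in for the full field comparison):

import play.api.libs.json._

case class User(name: String, age: Int)          // hypothetical model
implicit val userFormat = Json.format[User]

def validatePatch(stored: JsObject, patch: JsObject): Either[String, JsObject] = {
  val merged = stored.deepMerge(patch)           // step 2
  merged.validate[User] match {                  // step 3
    case JsSuccess(user, _) =>
      // step 4: re-serialize; the macro Writes keeps only known fields
      val clean = Json.toJson(user).as[JsObject]
      if (patch.keys.subsetOf(clean.keys)) Right(clean)
      else Left("patch contains fields not present in the model")
    case JsError(errors) => Left(errors.toString)
  }
}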
This is obviously not perfect, but it works fine. I would write macros to generate an appropriate format generalization, but it might take too much time, which I currently lack.
Is there a way to use the Play Framework JSON macro-generated format for partial entity validation like this?
Or any other solution that can be easily integrated into the Play Framework, for that matter.
With the help of @julien-richard-foy I made a small library to do exactly what I wanted:
https://github.com/clemble/scala-validator
I need to add some documentation, and then I'll publish it to a repository.
In this article (https://developer.gooddata.com/article/data-modeling-api) there is a logical data model and its corresponding JSON. However, I can't seem to find out how to extract that JSON from a logical data model via the REST API. Is there a way to do this other than using the single-load interface (which would be very inefficient)?
For the record, my end goal is to make a tool that extracts that JSON (which would be in dev), then posts it to the ldm manager2, and then applies the suggested changes through the returned MAQL to production. Any help is greatly appreciated.
Currently this works only for getting or updating the entire project. Anyway, you can GET the whole model definition with a simple API call. See the documentation:
http://docs.gooddatadrafts.apiary.io/
There is a GET request which is asynchronous. You can build some logic on top of that on your end. You can get all the models and store per-dataset information, but in the end you need to POST the "final version", and all updates will be applied.
Let me know if I can help you with anything!
Regards,
JT