Importing a JSON data file into Tableau

I'm a Parse.com novice. I had an iPhone app built that sent data to and stored it with Parse (Core). The variables it stored were image, latitude, longitude, time, and type. It stored the information like a spreadsheet: each image was associated with a latitude, longitude, time, and type. I want to use Tableau to create a visual representation of this data, but I can't figure out how to import it. When I export the Parse data as a JSON file, it seems to be in string format, and Tableau does not recognize it as a table. Is there a way to export the Parse data so Tableau can easily recognize it as a data table?

In my experience this is best handled by writing some code against the Tableau Extract API to convert your data into Tableau extracts (TDEs). If you want to schedule such a refresh, you can use the "addtoFile" tabcmd option to add these TDEs to the existing dashboards.
Tableau does not currently support JSON out of the box. Here is the latest discussion in the Tableau Ideas forum.
I would also recommend looking at the new Web Data Connector. It will be very handy for a lot of use cases.
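Until native JSON support lands, one lightweight workaround is to flatten the exported Parse JSON into a CSV, which Tableau reads natively. A minimal sketch, assuming the export is a JSON array of objects (or a {"results": [...]} wrapper) with the image, latitude, longitude, time, and type keys from the question; adjust the field list to your actual export:

```python
import csv
import json

FIELDS = ["image", "latitude", "longitude", "time", "type"]

def parse_json_to_csv(json_path, csv_path, fields=FIELDS):
    """Flatten a Parse JSON export (array of objects) into a CSV file."""
    with open(json_path) as f:
        data = json.load(f)
    # Some Parse exports wrap the rows in a top-level "results" key.
    rows = data["results"] if isinstance(data, dict) else data
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        writer.writerows(rows)
```

The resulting CSV can be opened directly in Tableau as a text data source.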

Tableau does not read JSON data natively. By using Apache Drill, you can point Tableau at JSON data and start analyzing it.
Here is a link to the complete explanation.


Azure Data Factory v2 Data Transformation

I am new to Azure Data Factory. I have a requirement to move data from an on-premises Oracle database and an on-premises SQL Server to Blob storage. The data needs to be transformed into JSON format, with each row as one JSON file, and then moved to an Event Hub. How can I achieve this? Any suggestions?
You could use a Lookup activity plus a ForEach activity, with a Copy activity inside the ForEach. Please see this post: How to copy CosmosDb docs to Blob storage (each doc in single json file) with Azure Data Factory
The Copy Data tool that is part of Azure Data Factory is one option for copying on-premises data to Azure.
The tool comes with a configuration wizard that walks you through the required steps, such as configuring the source, the sink, and the integration pipeline.
In the source, you need to write a custom query to fetch the data from the required tables in JSON format.
For SQL Server, you would use the OPENJSON and FOR JSON AUTO options to convert the rows to JSON; these are supported in SQL Server 2016 and later. For older versions you need to explore the available options. Worst case, you can write a simple console app in C#/Java to fetch the rows, convert them to JSON files, and then upload the files to Azure Blob storage. If this is a one-time activity, this option should work and you may not need a data factory at all.
For Oracle, you can use the JSON_OBJECT function.
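If you go the console-app route, the per-row conversion is simple. A minimal Python sketch, assuming the rows have already been fetched from Oracle/SQL Server as tuples (the column names and output folder are illustrative assumptions):

```python
import json
import os

def rows_to_json_files(rows, columns, out_dir):
    """Write each database row as its own JSON file, one file per row."""
    os.makedirs(out_dir, exist_ok=True)
    paths = []
    for i, row in enumerate(rows):
        doc = dict(zip(columns, row))          # pair column names with values
        path = os.path.join(out_dir, f"row_{i:06d}.json")
        with open(path, "w") as f:
            json.dump(doc, f)
        paths.append(path)
    return paths
```

Each file can then be uploaded to Blob storage with the Azure Storage SDK; on SQL Server 2016+ you could instead let the database emit the JSON directly with SELECT ... FOR JSON AUTO.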

How is properties.db used in the Forge Viewer?

The SQLite database file properties.db is usually the biggest file in the output from https://extract.autodesk.io/.
What is it used for in the Forge Viewer, and if it's not used, why is it included in the ZIP file?
The sample copies both because its purpose is to demonstrate how to extract the 'bubble' from the Autodesk server. The design file's properties are extracted in two formats: JSON (json.gz) and SQLite (sdb/db).
The Autodesk Viewer only uses the JSON format, but other systems may prefer SQLite. The JSON approach is easier when your code executes in client browsers.
It is fairly easy to modify the sample to exclude the SQLite database if you are not interested in that file. I can point you to the code you need to modify if that's something you want to do.
That file contains the component properties as a SQLite database; the same properties are also contained in objects_xxx.json.gz. The viewer only uses the JSON format.
This article shows how you can easily run the extraction code on your side; it doesn't extract the .db file:
Forge SVF Extractor in Node.js
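Since properties.db is a plain SQLite file, you can also inspect it yourself before deciding whether to keep it. A minimal sketch using Python's built-in sqlite3 module (the file path is an assumption, and the exact table names depend on the SVF version):

```python
import sqlite3

def list_tables(db_path):
    """Return the table names in a SQLite database such as properties.db."""
    con = sqlite3.connect(db_path)
    try:
        cur = con.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        )
        return [name for (name,) in cur.fetchall()]
    finally:
        con.close()
```

Running this against the extracted file shows the property tables that the JSON (objects_xxx.json.gz) duplicates.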

Using blob as a JSON data feed

As Azure Blob storage is a very cheap data storage solution, would it make sense to store JSON data in blobs so clients can access it like a data API? That way we wouldn't need to spin up Web Apps/APIs to serve JSON data.
That could work, depending on the scenario. The blobs can be updated on the fly when you push a new version of the JSON files.
I demonstrated this a while ago with a simple app that uploads and updates a file. Clients could target the URL, and to them it seemed like they were accessing a JSON data feed that kept being updated.
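On the client side, consuming such a blob-hosted feed is just an HTTP GET plus JSON parsing. A minimal sketch with Python's standard library; the URL is a placeholder for your own blob and assumes anonymous read access (or a SAS token appended to the URL):

```python
import json
from urllib.request import urlopen

# Hypothetical public blob URL serving the JSON feed.
FEED_URL = "https://myaccount.blob.core.windows.net/feeds/data.json"

def parse_feed(raw):
    """Decode the raw feed bytes into Python objects."""
    return json.loads(raw.decode("utf-8"))

def fetch_feed(url=FEED_URL):
    """Fetch the JSON blob over HTTPS and parse it."""
    with urlopen(url) as resp:
        return parse_feed(resp.read())
```

Browser clients would do the equivalent with a plain fetch of the blob URL.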

Get Json From Azure Storage Blob

I want to get a JSON file stored in an Azure Storage blob through the browser.
I used Stream Analytics, and it outputs a JSON file into the blob container. Now I need to get the information inside the JSON file in order to show the IoT device status in real time.
I tried to use JSONP,
but I don't know how to add the callback method to the JSON file without downloading it. Is there any way to add the callback method?
Or is there another way to get the information inside the container?
For this particular scenario, I'd recommend Power BI. Stream Analytics now has a direct output to Power BI, and you can pretty much customize the dashboard for your real-time IoT needs.
You can refer to this article for a step-by-step Stream Analytics + Power BI walkthrough.
Coming back to your question: you need to download the blob to access its content. Stream Analytics output to blob is usually for archiving or later predictive-analytics scenarios.
If you'd still prefer not to use Power BI, I'd either route the Stream Analytics output to an Event Hub and read the data from there in real time, or save the data into a NoSQL database like DocumentDB on Azure and read it from there. I can recommend Highcharts if you want to use custom gauges etc. to visualize the data.
Hope this helps.
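On the JSONP point specifically: the callback cannot be added by the browser after the fact; the file has to be stored pre-wrapped. If you control the upload step, the wrapping is trivial. A minimal sketch (the callback name is an assumption; your page would define a matching global function):

```python
import json

def wrap_jsonp(data, callback="handleFeed"):
    """Wrap a JSON-serializable object in a JSONP callback invocation."""
    return f"{callback}({json.dumps(data)});"
```

Alternatively, enabling CORS on the storage account lets the browser fetch the plain JSON blob directly, which makes JSONP unnecessary.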

BIRT report REST API data source

We are using the BIRT report viewer to create reporting pages.
We use a JDBC data source to connect to an Oracle database.
But is it possible to use a REST API (JSON format) as a data source for the reports?
Does anyone have experience with this?
BIRT has no built-in JSON data source. There are some community JSON data source plugins, but all of the ones I have seen are very low-level and not comfortable to use, so I won't recommend any of them here.
You could create a "scripted data source" where you connect to your URL and parse the result yourself, but this is also not very comfortable. Someone tried it here, so you have a starting point.
If you are in charge of the infrastructure providing the JSON output, it would be easier to add an XML export and use BIRT's built-in XML data source.
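If you do add such an export, the JSON-to-XML conversion itself is small. A minimal Python sketch turning a flat JSON array into an XML document that BIRT's XML data source can map (the rows/row element names are assumptions; use whatever matches your data source mapping):

```python
import json
import xml.etree.ElementTree as ET

def json_to_xml(json_text, root_tag="rows", row_tag="row"):
    """Convert a JSON array of flat objects into an XML document string."""
    root = ET.Element(root_tag)
    for obj in json.loads(json_text):
        row = ET.SubElement(root, row_tag)
        for key, value in obj.items():
            ET.SubElement(row, key).text = str(value)
    return ET.tostring(root, encoding="unicode")
```

The XPath mapping in the BIRT XML data set would then be /rows/row with one column per child element.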