Can I load data from my Mac directly into Snowflake? - json

Can I load JSON data from my Mac directly into Snowflake?
Or is there a way in the Snowflake Console (either classic or new) to load JSON data directly into a database I have created, in order to conduct further analysis?
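For what it's worth, a minimal sketch of one way to do this from macOS with the SnowSQL CLI, assuming a database, schema, and warehouse already exist; the table, stage, and file path below are placeholders:
-- Run from SnowSQL on the Mac; all names and paths here are placeholders.
CREATE OR REPLACE TABLE raw_json (v VARIANT);
CREATE OR REPLACE STAGE json_stage FILE_FORMAT = (TYPE = 'JSON');
-- PUT uploads the local file into the internal stage (SnowSQL client only, not the web console)
PUT file:///Users/me/data/events.json @json_stage;
-- COPY INTO loads the staged file into the VARIANT column for further analysis
COPY INTO raw_json FROM @json_stage;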

Related

How do I ingest data from a CSV file stored on OneDrive into a Kusto temp table?

I have an Excel file people use to edit data outside Azure Data Explorer (Kusto). What is the Kusto code I would use to ingest this data as needed into a Kusto table for querying?
So far it seems I need to use:
.create table (Name:type, Name:type)
to create a table.
If my CSV file is stored in OneDrive, what is the syntax to fill the table? Assume the file name is Sample.csv.
A OneDrive location is not supported directly by Azure Data Explorer. However, there are other options:
Using ingestion commands - you will need to place the files in Azure Storage first (see the sketch after this list).
One Click Ingestion - a feature of the Web Explorer tool that can also create the table for you. You can either download the files to your local computer or place them in Azure Storage.
Using the Import data from local file feature of Kusto Explorer (the Windows client); this only works with local files.
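A rough sketch of the ingestion-command route, assuming Sample.csv has already been copied to an Azure Storage blob; the storage account, container, credentials, and column schema below are placeholders:
.create table Sample (Name:string, Value:int)
// ignoreFirstRecord=true skips the CSV header row
.ingest into table Sample ('https://mystorageaccount.blob.core.windows.net/mycontainer/Sample.csv;<storage-account-key-or-SAS>') with (format='csv', ignoreFirstRecord=true)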

Azure Data Factory v2 Data Transformation

I am new to Azure Data Factory. I have a requirement to move data from an on-premises Oracle database and an on-premises SQL Server to Blob storage. The data needs to be transformed into JSON format, with each row becoming one JSON file, which will then be moved to an Event Hub. How can I achieve this? Any suggestions?
You could use a Lookup activity + a ForEach activity, with a Copy activity inside the ForEach. Please reference this post: How to copy CosmosDb docs to Blob storage (each doc in single json file) with Azure Data Factory
The Copy Data tool that is part of Azure Data Factory is an option to copy on-premises data to Azure.
The tool comes with a configuration wizard that walks you through the required steps, such as configuring the source, the sink, the integration pipeline, etc.
In the source you need to write a custom query that fetches the data you require from the tables in JSON format.
In the case of SQL Server, you would use FOR JSON AUTO to convert the rows to JSON (OPENJSON is the reverse, turning JSON text back into rows); this is supported from SQL Server 2016 onward. For older versions you need to explore the options available. Worst case, you can write a simple console app in C#/Java that fetches the rows, converts them to a JSON file, and uploads the file to Azure Blob storage. If this is a one-time activity, that option should work and you may not need a Data Factory at all.
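For example, something along these lines produces one JSON document per row on SQL Server 2016+ (the table and column names are made up):
SELECT (SELECT o.OrderId, o.CustomerName, o.Total
        FOR JSON PATH, WITHOUT_ARRAY_WRAPPER) AS JsonDoc
FROM dbo.Orders AS o;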
In the case of Oracle, you can use the JSON_OBJECT function.
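A comparable per-row query on Oracle (12.2 or later, again with hypothetical table and column names):
SELECT JSON_OBJECT('orderId' VALUE o.order_id,
                   'customer' VALUE o.customer_name,
                   'total' VALUE o.total) AS json_doc
FROM orders o;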

Using blob as a JSON data feed

As Azure Blob storage is a very cheap data storage solution, would it make sense to store JSON data in blobs so clients can access it like a data API? This way we don't need to spin up Web Apps/APIs to serve JSON data.
That could work, depending on the scenario. The blobs will be updated on the fly when you push a new version of the JSON files.
I demonstrated this a while ago with a simple app that uploads and updates a file. Clients could target the URL, and to them it seemed like they were accessing a JSON data feed that kept being updated.
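If it helps, a sketch of that setup with the Azure CLI; the storage account and container names are placeholders, and --public-access blob is what lets anonymous clients read the blob straight from its URL:
az storage container create --account-name mystorageacct --name feed --public-access blob
az storage blob upload --account-name mystorageacct --container-name feed --name data.json --file ./data.json --content-type application/json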

Big Query table to be extracted as JSON in Local machine

I have an idea of how to extract table data to Cloud Storage using the bq extract command, but I would rather like to know whether there are any options to extract a BigQuery table as newline-delimited JSON to my local machine.
I can extract table data to GCS via the CLI and also download JSON data from the web UI, but I am looking for a solution using the bq CLI to download table data as JSON onto my local machine. Is that even possible?
You need to use Google Cloud Storage for your export job. Exporting data from BigQuery is explained here; check also the variants for different path syntaxes.
Then you can download the files from GCS to your local storage.
The gsutil tool can help you further, downloading the file from GCS to your local machine.
You first need to export to GCS, then transfer the file to your local machine.
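Put together, the two steps look roughly like this (the dataset, table, and bucket names are placeholders):
bq extract --destination_format=NEWLINE_DELIMITED_JSON mydataset.mytable gs://my-bucket/mytable-*.json
gsutil cp "gs://my-bucket/mytable-*.json" ./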
If you use the bq CLI tool, you can set the output format to JSON and redirect it to a file. This way you can achieve some export locally, but it has certain limits.
This exports the first 1000 lines as JSON:
bq --format=prettyjson query --n=1000 "SELECT * from publicdata:samples.shakespeare" > export.json
It's possible to extract data without using GCS, directly to your local machine, using the bq CLI.
Please see my other answer for details: BigQuery Table Data Export

Parsing JSON data and storing locally on iphone

Has anyone done this? I want to parse data from JSON (the Google location API) and store it in an SQLite database on the iPhone.
The problem is that if the parsed data is huge, how do I synchronize parsing and saving the data into SQLite locally on the iPhone?
The user interface includes a table of the saved data, which should keep working without any interruption.
The solution should use the Cocoa framework and Objective-C.
You should read through some tutorials:
How do I parse JSON with Objective-C?
http://cookbooks.adobe.com/post_Store_data_in_the_HTML5_SQLite_database-19115.html
http://html5doctor.com/introducing-web-sql-databases/
Parse JSON in JavaScript?
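For the synchronization part specifically, a minimal Objective-C sketch of the usual pattern: parse the JSON on a background queue with NSJSONSerialization, insert the rows into SQLite there, and only come back to the main queue to refresh the table view. The URL, column name, and the assumption that this method lives in a table view controller are all placeholders.
#import <sqlite3.h>

- (void)loadPlaces {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Fetch and parse off the main thread so the UI stays responsive.
        NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"https://example.com/places.json"]];
        NSError *error = nil;
        NSArray *places = [NSJSONSerialization JSONObjectWithData:data options:0 error:&error];
        if (error != nil || places == nil) { return; }

        // Open (or create) the local database in the Documents directory.
        NSString *path = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0]
                          stringByAppendingPathComponent:@"places.sqlite"];
        sqlite3 *db = NULL;
        if (sqlite3_open([path UTF8String], &db) == SQLITE_OK) {
            sqlite3_exec(db, "CREATE TABLE IF NOT EXISTS place (name TEXT);", NULL, NULL, NULL);
            sqlite3_stmt *stmt = NULL;
            sqlite3_prepare_v2(db, "INSERT INTO place (name) VALUES (?);", -1, &stmt, NULL);
            for (NSDictionary *place in places) {
                sqlite3_bind_text(stmt, 1, [place[@"name"] UTF8String], -1, SQLITE_TRANSIENT);
                sqlite3_step(stmt);
                sqlite3_reset(stmt);
            }
            sqlite3_finalize(stmt);
            sqlite3_close(db);
        }

        // Back on the main queue before touching the UI.
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.tableView reloadData];
        });
    });
}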