I have deployed an ASP.NET MVC app where I am trying to display CSV files, stored in Azure Blob storage, as tables.
I am having trouble reading the files in a blob container, and I couldn't find any solution in the Microsoft documentation.
My blob containers are public, so I could probably access them through their URLs, but I don't know how to read the CSV files. Any ideas?
To read a CSV file stored in Azure Blob storage, you could refer to the following sample code.
// Connect to the storage account and get a reference to the blob
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("connection string");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlockBlob blockBlobReference = container.GetBlockBlobReference("testdata.csv");

// Stream the blob contents line by line
using (var reader = new StreamReader(blockBlobReference.OpenRead()))
{
    while (!reader.EndOfStream)
    {
        // read one row of data from the csv file
        string row = reader.ReadLine();
    }
}
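Since the goal is to display the files as tables, each row still needs to be split into cells. A minimal sketch building on the code above; the plain comma split is an assumption and won't handle quoted fields (a CSV parser such as CsvHelper would):

// Sketch: collect each csv row as an array of cell values for the view model
var rows = new List<string[]>();
using (var reader = new StreamReader(blockBlobReference.OpenRead()))
{
    while (!reader.EndOfStream)
    {
        // naive split; does not handle quoted fields or embedded commas
        rows.Add(reader.ReadLine().Split(','));
    }
}
// 'rows' can now back an HTML table in the MVC view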
My aim is to visualize real-time data as it comes into the blob storage.
It seems that you'd like to display the CSV data as tables on clients' web pages in real time. ASP.NET SignalR could help you develop real-time web functionality easily: in your WebJob function you could detect new CSV files under a specified blob container and call a hub method that reads the data from the CSV file and pushes it to connected clients, and then you could update the UI based on the pushed CSV data on the SignalR client side.
Call the hub method inside your WebJob function:
// Connect to the SignalR endpoint and get a proxy for the hub
var hub = new HubConnection("http://xxx/signalr");
var proxy = hub.CreateHubProxy("HubName");
hub.Start().Wait();
// invoke the hub method, passing the name of the new csv file
proxy.Invoke("PushData", "filename").Wait();
Hub method that pushes data to connected clients:
public void PushData(string filename)
{
    // read data from the csv file (blob) into 'data'
    // then call the JavaScript-side function to populate (or update) the tables
    Clients.All.UpdateTables(data);
}
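On the SignalR client side, the matching handler could look like the sketch below; the hub name, the UpdateTables method, the table id and the shape of data (an array of rows) are all assumptions carried over from the snippets above:

// Sketch of the JavaScript client (generated proxy; hub/method names assumed)
var hub = $.connection.hubName;
// called by the server's Clients.All.UpdateTables(data)
hub.client.updateTables = function (data) {
    // rebuild the table body from the pushed csv rows
    var html = data.map(function (row) {
        return '<tr><td>' + row.join('</td><td>') + '</td></tr>';
    }).join('');
    $('#csvTable tbody').html(html);
};
$.connection.hub.start();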
I want to update an existing spreadsheet in a Drive folder but am having trouble implementing the HTTP request. I followed the documentation and was able to update a spreadsheet, but the request body, which I tried to send as JSON, is always converted to CSV. As a result, the JSON parts are distributed into individual cells wherever commas occur.
For instance, cell1 = "{key1" and cell2 = "value1", and so on. This prevents me from specifying the style of the sheet and the values within the cells.
I found the possibility of sending a multipart request, which, however, leads to the same result: now the first boundary string and the initial information up to the first comma end up in the first cell, and the rest is divided according to the existing commas.
What I want to do is send an HTTP request whose body consists of a JSON file with the spreadsheet information, as described in Google's Sheets API, but I cannot find my mistake. Even with the MIME type set to "application/vnd.google-apps.spreadsheet", the JSON is always converted to CSV.
Mimetype "application/vnd.google-apps.spreadsheet"
If the file in question is an actual Google Sheets file type, i.e. its MIME type is "application/vnd.google-apps.spreadsheet", then you should go through the Google Sheets API to update it. Otherwise, updating it through Google Drive means loading the file itself into a file stream and uploading it that way. You can't pick and choose which parts are uploaded with Drive; it's all or nothing. Drive doesn't have the power to format things like cells; it just uploads the raw file data.
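For the Sheets API route, a rough C# sketch is below; sheetsService, spreadsheetId and the range are placeholders, and this assumes the Google.Apis.Sheets.v4 client library:

// Sketch: write cell values through the Sheets API instead of Drive
// (sheetsService, spreadsheetId and the range are hypothetical placeholders)
var valueRange = new Google.Apis.Sheets.v4.Data.ValueRange
{
    Values = new List<IList<object>> { new List<object> { "key1", "value1" } }
};
var update = sheetsService.Spreadsheets.Values.Update(valueRange, spreadsheetId, "Sheet1!A1:B1");
update.ValueInputOption = SpreadsheetsResource.ValuesResource.UpdateRequest.ValueInputOptionEnum.RAW;
var response = await update.ExecuteAsync();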
Mimetype "text/plain"
If the file is in fact a CSV file, so the MIME type is "text/plain", then you can update the text directly by turning it into a stream.
You haven't said which language you are using, so here is my sample for C#. The code is adapted from How to upload to Google Drive API from memory with C#.
var uploadString = "Test";
var fileName = "uploadFileString.txt";

// Upload file metadata
var fileMetadata = new Google.Apis.Drive.v3.Data.File()
{
    Name = fileName,
    Parents = new List<string>() { "1R_QjyKyvET838G6loFSRu27C-3ASMJJa" } // folder to upload the file to
};

// Turn the string into a stream
var fsSource = new MemoryStream(Encoding.UTF8.GetBytes(uploadString ?? ""));

string uploadedFileId;
// Create a new file with metadata and stream.
var request = service.Files.Create(fileMetadata, fsSource, "text/plain");
request.Fields = "*";
var results = await request.UploadAsync(CancellationToken.None);

if (results.Status == UploadStatus.Failed)
{
    Console.WriteLine($"Error uploading file: {results.Exception.Message}");
}

// the file id of the new file we created
uploadedFileId = request.ResponseBody?.Id;
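Since the question is about updating an existing file rather than creating a new one, the same pattern works with Files.Update. A short sketch, assuming you already know the fileId (note that Parents must not be set in the metadata on an update):

// Sketch: overwrite the content of an existing Drive file (fileId assumed known)
var updateMetadata = new Google.Apis.Drive.v3.Data.File();
var content = new MemoryStream(Encoding.UTF8.GetBytes("col1,col2\nval1,val2"));
var updateRequest = service.Files.Update(updateMetadata, fileId, content, "text/plain");
var updateResult = await updateRequest.UploadAsync(CancellationToken.None);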
I am developing a prototype on Google Cloud Platform, for which I am using Cloud Storage, App Engine and BigQuery.
One of the tasks is to load a file daily from Google Cloud Storage into BigQuery, for which I am using a cron task on App Engine.
The problem is that BigQuery expects the data to be in NDJSON format (newline-delimited JSON), whereas my source file is in normal JSON format.
Currently I download the file to my laptop, convert it to NDJSON and then upload it to BigQuery, but how do I do this programmatically on Google Cloud Platform? I am hoping there is something available that I can use, as I do not want to write it from scratch.
This might be useful to others. Here is how I did it, but let me know if there's a better or easier way.
You need to download the Cloud Storage Java API and its dependencies (the HTTP client API and OAuth API):
https://developers.google.com/api-client-library/java/apis/
You also need a JSON parser like Jackson.
Steps:
1> Read the JSON file as an InputStream using the Java Cloud Storage API
Storage.Objects.Get getObject = client.objects().get("shiladityabucket", "abc.json");
InputStream input = getObject.executeMediaAsInputStream();
2> Convert it into an array of Java objects (the JSON file in my case has multiple records). If it's a single record, there is no need for the array.
ObjectMapper mapper = new ObjectMapper();
BillingInfo[] infoArr = mapper.readValue(input, BillingInfo[].class);
3> Create a StorageObject to upload to Cloud Storage
StorageObject objectMetadata = new StorageObject()
    // Set the destination object name
    .setName("abc.json")
    // Set the access control list to publicly read-only
    .setAcl(Arrays.asList(
        new ObjectAccessControl().setEntity("allUsers").setRole("READER")));
4> Iterate over the objects in the array and convert them to JSON strings, appending a newline after each for NDJSON.
String jSonString = "";
for (BillingInfo info : infoArr) {
    jSonString += mapper.writeValueAsString(info);
    jSonString += "\n";
}
5> Create an InputStream to insert using the Cloud Storage Java API
InputStream is = new ByteArrayInputStream(jSonString.getBytes());
InputStreamContent contentStream = new InputStreamContent(null, is);
6> Upload the file
Storage.Objects.Insert insertRequest = client.objects().insert(
    "shiladitya001", objectMetadata, contentStream);
insertRequest.execute();
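To finish the original task, a BigQuery load job can then point at the NDJSON file in Cloud Storage. A minimal sketch, assuming the google-cloud-bigquery client library and hypothetical dataset and table names:

// Sketch: load the NDJSON file from GCS into BigQuery (dataset/table names assumed)
BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
TableId tableId = TableId.of("my_dataset", "billing_info");
LoadJobConfiguration loadConfig = LoadJobConfiguration
    .newBuilder(tableId, "gs://shiladitya001/abc.json", FormatOptions.json())
    .build();
Job job = bigquery.create(JobInfo.of(loadConfig));
job.waitFor(); // blocks until the load job completes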
I want to create an HTML page which synchronizes JSON data with PouchDB.
The JSON data is a response from a web service. I have created a sample HTML file which can create a PouchDB database, and a REST web service which returns certain data. Can anyone help me synchronize these two?
PouchDB has a built-in method for synchronizing with CouchDB using one- or two-way replication.
I understand that you want to sync with a data source which doesn't have a CouchDB-compatible API. In that case you'll have to write code to perform the synchronization with your specific JSON API.
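A minimal sketch of such a one-way pull is below, assuming the service returns an array of records that each carry a unique id (the URL and field names are placeholders):

// Sketch: pull JSON from a custom REST service into PouchDB (URL/fields assumed)
var db = new PouchDB('my-awesome-db');
fetch('http://example.com/api/records')
  .then(function (res) { return res.json(); })
  .then(function (records) {
    // map each record to a PouchDB document with a stable _id
    var docs = records.map(function (r) {
      return { _id: String(r.id), data: r };
    });
    // documents whose _id already exists report conflicts instead of duplicating
    return db.bulkDocs(docs);
  })
  .then(function (result) { console.log('sync done', result); })
  .catch(function (err) { console.error(err); });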
There's now a library which makes it possible to import a JSON string (dump) as a database into PouchDB.
It's called PouchDB-Load and is written by PouchDB author Nolan Lawson.
So in your case the code can be as simple as:
var db = new PouchDB('my-awesome-db');
db.load('http://example.com/my-dump-file.json').then(function () {
  // done loading!
}).catch(function (err) {
  // HTTP error or something like that
});
I am creating a mobile app using Titanium, with the Titanium database, which is SQLite. The PDF I want to generate needs boxes to structure the data, as well as images that I am taking with the app.
I am assuming what I need to do is convert the data to JSON in Titanium, upload it to a web server, insert it into a MySQL database via PHP, and then use some script out there that reads the web database, creates the PDF and sends it back to the phone.
Is that right?
And if so... I need help with that whole process, haha. Any good tutorials on the database-to-web-database upload process?
Check the docs; HTTPClient is what you need to use, it's the standard way.
The first step would be to create a web service on your server that parses your JSON format. The bulk of the work you would have to do has nothing to do with Titanium, but here is the code for sending a JSON object to a web service with a POST from a Titanium app.
var xhr_getstep = Titanium.Network.createHTTPClient();
xhr_getstep.onload = function(e) {
    // Do something with the response from the server
    var responseBlob = this.responseText;
};
xhr_getstep.onerror = function() {
    Ti.API.info('[ERROR] WebService failed.');
};
xhr_getstep.open("POST", 'http://yourwebsite.com/yourwebserviceentry.php');
xhr_getstep.setRequestHeader("Content-Type", "application/json");

// Create your object with info on how to create the PDF
var objSend = { title: 'Amazing Title' };
xhr_getstep.send(JSON.stringify(objSend)); // Send it all off
Can I save data to either CSV or XML files offline, on the client side, via HTML5?
Offline storage is internal storage; it is not meant for exporting files in a specific format to a specific folder on disk.
The Web Storage API stores data as [key, value] pairs where both key and value are strings.
So data in any format needs to adhere to this mechanism for local storage. For example, if you have a JSON object like:
{
    name: 'John',
    gender: 'male'
}
You can store it (through JavaScript) by passing it as a string:
localStorage.setItem("myObj","{name:'John',gender:'male'}");
For JSON objects, use JSON.stringify() to convert them to strings and use JSON.parse() to read them back.
You can use localStorage, but that only allows you to store something in the browser's internal storage (you cannot decide where and how the data is written).
There's also a File API, but it is at a very early stage and, for now, it doesn't allow storing files arbitrarily on the client:
HTML 5 File API
Let's say you have created an array or object like this:
var arrayOrObject = [{obj1: {name: 'John', age: 16}}, {obj2: {name: 'Jane', age: 17}}];
You can save this data to the local device by using localStorage:
if (typeof(localStorage) == 'undefined') {
    alert('Your browser does not support HTML5 localStorage. Try upgrading.');
}
else {
    try {
        // saves to storage as "key", "value"
        localStorage.setItem("storedArrayOrObject", JSON.stringify(arrayOrObject));
    } catch (e) {
        if (e.name === 'QuotaExceededError') {
            // the data wasn't saved because the storage quota was exceeded
            alert('Quota exceeded!');
        }
    }
}
To get the data back in array or object form:
var getStoredArrayOrObject = JSON.parse(localStorage.getItem('storedArrayOrObject'));
To remove the localStorage Data:
localStorage.removeItem('storedArrayOrObject');
Not recommended, but also available:
localStorage.clear();
You could save and export as CSV like this: http://joshualay.net/examples/StamPad/StamPad.html
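In case that link goes stale, the usual client-side pattern is to build the CSV text, wrap it in a Blob and trigger a download. A minimal sketch (the data and file name are placeholders):

// Sketch: export rows (an array of arrays) as a CSV download on the client
function downloadCsv(rows, filename) {
    var csv = rows.map(function (row) { return row.join(','); }).join('\n');
    var blob = new Blob([csv], { type: 'text/csv' });
    var a = document.createElement('a');
    a.href = URL.createObjectURL(blob);
    a.download = filename || 'export.csv';
    a.click();
    URL.revokeObjectURL(a.href);
}
downloadCsv([['name', 'age'], ['John', '16']], 'people.csv');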