Generate PDF document with data from Titanium DB - JSON

I am creating a mobile app using Titanium, with the Titanium DB (SQLite) for storage. I need to generate a PDF that uses boxes to structure the data and that also includes images I am taking with the app.
I am assuming what I need to do is convert the data into JSON in Titanium, upload it to a web server, insert it into a MySQL database, and then use some existing script to read the web database, create the PDF, and send it back to the phone.
Is that right?
And if so... I need help with that whole process, haha. Are there any good tutorials on the local-DB-to-web-DB upload process?

Check the docs: HTTPClient is what you need to use; it's the standard approach.
The first step would be to create a web service on your server that parses your JSON. The bulk of the work has nothing to do with Titanium, but here is the code for sending a JSON object to a web service with a POST from a Titanium app.
var xhr_getstep = Titanium.Network.createHTTPClient();
xhr_getstep.onload = function(e) {
    // Do something with the response from the server
    var responseText = this.responseText;
};
xhr_getstep.onerror = function() {
    Ti.API.info('[ERROR] WebService failed.');
};
xhr_getstep.open('POST', 'http://yourwebsite.com/yourwebserviceentry.php');
xhr_getstep.setRequestHeader('Content-Type', 'application/json');
// Create your object with info on how to create the PDF
var objSend = { title: 'Amazing Title' };
xhr_getstep.send(JSON.stringify(objSend)); // Send it all off
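On the server side (the part this answer leaves open), the endpoint just needs to parse the posted JSON and build the PDF. Here is a minimal sketch, assuming Node.js with Express and the pdfkit library instead of the PHP endpoint named above, and with hypothetical field names (title, imageBase64):
var express = require('express');
var PDFDocument = require('pdfkit');

var app = express();
app.use(express.json()); // parse application/json bodies

app.post('/yourwebserviceentry', function (req, res) {
    var doc = new PDFDocument();
    res.setHeader('Content-Type', 'application/pdf');
    doc.pipe(res); // stream the generated PDF straight back to the phone

    // Draw a box to structure the data, with the title inside it
    doc.rect(40, 40, 520, 80).stroke();
    doc.text(req.body.title || 'Untitled', 50, 60);

    // Hypothetical field: an image captured on the device, sent as base64
    if (req.body.imageBase64) {
        doc.image(Buffer.from(req.body.imageBase64, 'base64'), 50, 140, { width: 200 });
    }

    doc.end();
});

app.listen(3000);
With this shape there is no need for the MySQL round trip unless you also want the data stored server-side.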

Related

Why is JsonContent not populating before SendAsync?

I'm using a .NET Core console application to send JSON data to an Azure Logic App. If the JSON data is a short list (e.g. 4 items), everything works. For more data, I have to evaluate the Content of the HttpRequestMessage before sending; otherwise the Azure Logic App reports empty content. Is there any alternative to forcing evaluation? It feels counterintuitive.
httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
var postRequest = new HttpRequestMessage(HttpMethod.Post, uri)
{
    Content = JsonContent.Create(data) // data is a list object
};
// Need this line to evaluate the content before posting; otherwise the
// content is empty according to the Azure Logic App
await postRequest.Content.ReadAsStringAsync();
using var postResponse = await httpClient.SendAsync(postRequest);
postResponse.EnsureSuccessStatusCode();
This code structure features in a lot of tutorials without the call to postRequest.Content.ReadAsStringAsync(), and seemingly without anyone experiencing similar problems.
An example of the JSON that fails would be:
[{"runDate":"2021-02-18T16:30:11.23","name":"AuditLog","rows":3859209,"reservedKb":22061528,"dataKb":8781560,"reservedIndexSize":66328,"reservedUnused":13213640,"dbName":"DBOne"},{"runDate":"2021-02-18T16:30:11.23","name":"AnswerData","rows":70167,"reservedKb":12586736,"dataKb":12494912,"reservedIndexSize":1472,"reservedUnused":90352,"dbName":"DBOne"},{"runDate":"2021-02-18T16:30:11.23","name":"Data_import","rows":3623146,"reservedKb":722632,"dataKb":379704,"reservedIndexSize":1424,"reservedUnused":341504,"dbName":"DBOne"},{"runDate":"2021-02-18T16:30:11.23","name":"Answers","rows":2036892,"reservedKb":528872,"dataKb":270096,"reservedIndexSize":257360,"reservedUnused":1416,"dbName":"DBOne"},{"runDate":"2021-02-18T16:30:11.23","name":"Data","rows":3623146,"reservedKb":381256,"dataKb":379704,"reservedIndexSize":1424,"reservedUnused":128,"dbName":"DBOne"},{"runDate":"2021-02-18T16:30:11.23","name":"Data_backup","rows":3623146,"reservedKb":380048,"dataKb":379704,"reservedIndexSize":16,"reservedUnused":328,"dbName":"DBOne"},{"runDate":"2021-02-18T16:30:11.23","name":"datadec","rows":3623146,"reservedKb":252168,"dataKb":232952,"reservedIndexSize":8,"reservedUnused":19208,"dbName":"DBOne"},{"runDate":"2021-02-18T16:30:11.23","name":"SalesDetails","rows":84168,"reservedKb":170496,"dataKb":138104,"reservedIndexSize":20136,"reservedUnused":12256,"dbName":"DBOne"},{"runDate":"2021-02-18T16:30:11.23","name":"OurUsers","rows":66145,"reservedKb":72240,"dataKb":37880,"reservedIndexSize":32776,"reservedUnused":1584,"dbName":"DBOne"},{"runDate":"2021-02-18T16:30:11.23","name":"Orders","rows":362720,"reservedKb":70016,"dataKb":53608,"reservedIndexSize":11976,"reservedUnused":4432,"dbName":"DBOne"},{"runDate":"2021-02-18T16:30:11.597","name":"CustDetails","rows":8949,"reservedKb":16392,"dataKb":15704,"reservedIndexSize":80,"reservedUnused":608,"dbName":"DBTwo"},{"runDate":"2021-02-18T16:30:11.597","name":"OurUsers","rows":5580,"reservedKb":6096,"dataKb":4456,"reservedIndexSize":1600,"reservedUnused":40,"dbName":"DBTwo"},{"runDate":"2021-02-18T16:30:11.597","name":"Answers","rows":56835,"reservedKb":5184,"dataKb":5104,"reservedIndexSize":32,"reservedUnused":48,"dbName":"DBTwo"},{"runDate":"2021-02-18T16:30:11.597","name":"UserRoles","rows":5577,"reservedKb":4632,"dataKb":1496,"reservedIndexSize":3056,"reservedUnused":80,"dbName":"DBTwo"},{"runDate":"2021-02-18T16:30:11.597","name":"CustDetails","rows":5866,"reservedKb":1928,"dataKb":1864,"reservedIndexSize":16,"reservedUnused":48,"dbName":"DBTwo"},{"runDate":"2021-02-18T16:30:11.597","name":"OrderHistories","rows":6102,"reservedKb":1224,"dataKb":1168,"reservedIndexSize":16,"reservedUnused":40,"dbName":"DBTwo"},{"runDate":"2021-02-18T16:30:11.597","name":"Orders","rows":6196,"reservedKb":1224,"dataKb":1176,"reservedIndexSize":16,"reservedUnused":32,"dbName":"DBTwo"},{"runDate":"2021-02-18T16:30:11.597","name":"Transfers","rows":7522,"reservedKb":840,"dataKb":816,"reservedIndexSize":16,"reservedUnused":8,"dbName":"DBTwo"},{"runDate":"2021-02-18T16:30:11.597","name":"SalesDetails","rows":8762,"reservedKb":648,"dataKb":576,"reservedIndexSize":16,"reservedUnused":56,"dbName":"DBTwo"},{"runDate":"2021-02-18T16:30:11.597","name":"TransferAdvice","rows":13706,"reservedKb":392,"dataKb":328,"reservedIndexSize":16,"reservedUnused":48,"dbName":"DBTwo"}]

Visualize files from Azure blob container

I have deployed an ASP.NET MVC app in which I am trying to display CSV files, stored in Azure Blob storage, as tables.
I am having problems reading the files in a blob container, and I couldn't find any solution in the Microsoft documentation.
My blob containers are public, so maybe I could access the files through their URLs, but I don't know how to read the CSV files. Any ideas?
To read a CSV file stored in Azure Blob storage, you could refer to the following sample code.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("connection string");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlockBlob blockBlobReference = container.GetBlockBlobReference("testdata.csv");

using (var reader = new StreamReader(blockBlobReference.OpenRead()))
{
    string row = "";
    while (!reader.EndOfStream)
    {
        // read one line of data from the csv file
        row = reader.ReadLine();
    }
}
My aim is to visualize real-time data that comes into the blob storage.
It seems that you'd like to display the CSV data as tables on clients' web pages in real time. ASP.NET SignalR makes it easy to add real-time web functionality: in your WebJob function you could detect new CSV files under a specified blob container and call a hub method that reads the data from the CSV file and pushes it to connected clients; on the SignalR client side you then update the UI with the pushed CSV data.
Call the hub method inside your WebJob function:
var hub = new HubConnection("http://xxx/signalr/hubs");
var proxy = hub.CreateHubProxy("HubName");
hub.Start().Wait();
//invoke hub method
proxy.Invoke("PushData", "filename");
The hub method to push data to connected clients:
public void PushData(string filename)
{
    // read data from the csv file (blob) into "data", then call the
    // JavaScript side function to populate (or update) tables with it
    Clients.All.UpdateTables(data);
}
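On the client side, a minimal sketch of the matching JavaScript, assuming the generated /signalr/hubs proxy script is loaded, the hub is named HubName as above, and data arrives as an array of row arrays (the #csvTable element is hypothetical):
// Generated proxies expose the hub under its camelCased name
var hubProxy = $.connection.hubName;

// Client-side handler invoked by Clients.All.UpdateTables(data) on the server
hubProxy.client.UpdateTables = function (data) {
    var html = data.map(function (row) {
        return '<tr><td>' + row.join('</td><td>') + '</td></tr>';
    }).join('');
    $('#csvTable tbody').html(html);
};

// Start the connection after the handler is registered
$.connection.hub.start();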

Display MQTT webchannel data in a html page

I have to display data from an MQTT channel on an HTML page. I have prepared the following Node.js code for this:
var mqtt = require('mqtt');
// Authenticate with your channel token
var client = mqtt.createClient(1883, 'mqtt.beebotte.com',
    { username: 'token:TOKEN_KEY', password: '' });

client.on('message', function (topic, message) {
    console.log('topic: ' + topic + ' payload: ' + message);
});

client.subscribe('AppTeamDemo/GpsDataUpload');
client.subscribe('AppTeamDemo/EventDataUpload');
client.subscribe('AppTeamDemo/CommToZiggi');
client.subscribe('AppTeamDemo/CommFromZiggi');
I am able to get the data, which I need to parse as JSON and display on the HTML page.
I am a little new to all of this, so I am confused about how to get a Node.js variable into the HTML page, since Node is server-side JavaScript.
Also, how would I connect to the MQTT channel from the browser? Every time I try, it says mqtt is not defined.
I would prefer not to use Node and instead subscribe to this channel from the HTML page using MQTT; after parsing the JSON data, I want to display the latitude and longitude on a Google map. How would I connect and get the JSON response on the front-end side?
Please guide me. Thanks in advance.
Rather than writing your own MQTT-to-web bridge, have you considered using an MQTT broker that supports WebSockets?
If you use the Paho library you can subscribe to the topics you are interested in directly from JavaScript in the page.
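A minimal sketch, assuming the broker exposes a TLS WebSocket endpoint on port 443 (check Beebotte's documentation for the exact host and port) and that the payload carries hypothetical latitude/longitude fields:
// Paho JavaScript client over WebSockets; the clientId is arbitrary but must be unique
var client = new Paho.MQTT.Client('mqtt.beebotte.com', 443, 'web-' + Date.now());

client.onMessageArrived = function (message) {
    var payload = JSON.parse(message.payloadString);
    // Hypothetical fields: move a google.maps.Marker created elsewhere
    marker.setPosition(new google.maps.LatLng(payload.latitude, payload.longitude));
};

client.connect({
    userName: 'token:TOKEN_KEY',
    password: '',
    useSSL: true,
    onSuccess: function () {
        client.subscribe('AppTeamDemo/GpsDataUpload');
    }
});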

synchronizing pouchDB with json data

I want to create an HTML page which synchronizes JSON data with PouchDB.
The JSON data is the response from a web service. I have created a sample HTML file which can create a PouchDB database, and a REST web service which returns certain data as a response. Can anyone help me synchronize these two?
PouchDB has a built-in method for synchronizing with CouchDB using one- or two-way replication.
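For reference, when the remote end does speak the CouchDB replication protocol, continuous two-way sync is a one-liner (the URL below is a placeholder):
var db = new PouchDB('my-local-db');
// live: keep syncing as changes arrive; retry: reconnect after network errors
db.sync('http://example.com:5984/my-remote-db', { live: true, retry: true })
    .on('error', function (err) {
        console.log('sync error', err);
    });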
I understand that you want to sync with a data source which doesn't have a CouchDB-compatible API. In that case you'll have to write the code that performs the synchronization against your specific JSON API yourself.
There is now a library which makes it possible to import a JSON string (dump) as a database into PouchDB.
It's called PouchDB-Load and is written by PouchDB author Nolan Lawson.
So in your case the code can be as simple as:
var db = new PouchDB('my-awesome-db');
db.load('http://example.com/my-dump-file.json').then(function () {
    // done loading!
}).catch(function (err) {
    // HTTP error or something like that
});
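The dump file itself can be produced with the companion pouchdb-dump-cli tool from the same author, so your server can pre-generate it from an existing database.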

Data array from Couchdb documents into D3

I am having a problem integrating CouchDB and D3. D3 is a JavaScript library that performs document-driven data visualization. CouchDB is a document database. They were made for each other.
D3 binds an array of data to the DOM elements of a web page. In most of the examples I have seen on the web or in books, people work on a static data set; generally, examples show an array written into the JavaScript or a CSV text file loaded into the page.
I would like to take data directly from database documents and load it into D3, but I'm uncertain how to do it. I have seen one example on the web where a person loaded all of their data as an array into one CouchDB document and then brought the data into index.html with a couchdb.jquery call:
// This function replaces the d3.csv function.
$.couch.db("d3apps3").openDoc("sp500", {
    success: function (doc) {
        var data = doc.data;
        data.forEach(function (d) {
            d.date = formatDate.parse(d.date);
            d.price = +d.price;
        });
    }
});
I tried something similar with db.allDocs:
<script type="text/javascript">
    var $db = $.couch.db("dataset2");
    $db.allDocs({
        success: function (data) {
            console.log(data);
        }
    });
</script>
I could get the data to render in console.log, but could not get it into D3 and index.html. I also realized that the result of db.allDocs is limited to the _id and _rev of each document.
I also tried to GET the data from a CouchDB view with a d3.json call. That wouldn't work because d3.json is looking for an existing .json file.
It's funny: I can call the view with cURL using a GET command and see the data stream, but I can't seem to bind it with D3.
~$ curl -X GET http://anywhere.com:5984/dataset2/_design/list_view/_view/arnold
{"total_rows":25,"offset":0,"rows":[
{"id":"dataset.csv1","key":"0","value":null},
{"id":"dataset.csv2","key":"1","value":null},
{"id":"dataset.csv11","key":"10","value":null},
{"id":"dataset.csv12","key":"11","value":null},
Any ideas would be appreciated.
Part four of https://gist.github.com/anonymous/9275891 has an example that I think you'd appreciate. You don't need to rely on the jquery.couchdb library at all; D3 knows enough about HTTP and JSON to work right out of the box. The relevant piece of code is:
d3.json("_view/pricetimeseries", function (viewdata) {
    // We just want the rows from the view in the visualisation
    var data = viewdata["rows"];
    data.forEach(function (d) {
        // the key holds the date, in seconds
        d.date = new Date(d.key);
        d.price = +d.value;
    });
    // rest of the visualisation code
});
HTH
If the page in which your D3 code is embedded is not served from the same domain (and port) as CouchDB, you will have to enable Cross-Origin Resource Sharing (CORS).
Assume your page is at http://example.com/data.html and contains JavaScript D3 code that accesses data from http://db.example.com/ or http://example.com:5984/. In that case your browser (which executes the JavaScript) will by default deny such cross-origin requests unless the requested domain explicitly allows them.
There are basically two solutions to this:
- Serve both the data and the page from the same domain, either by putting a reverse proxy in between that maps resources to upstream servers (e.g. /couch to your CouchDB server and everything else to your web server), or by serving your static files directly from CouchDB.
- Allow Cross-Origin Resource Sharing, which has been available in CouchDB since version 1.3. You can find a list of the relevant settings in the CouchDB docs on CORS.
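For the CORS route, the relevant settings live in CouchDB's local.ini; a sketch (the origin value is a placeholder, and you should restrict it to the origin your page is actually served from):
[httpd]
enable_cors = true

[cors]
origins = http://example.com
credentials = true
methods = GET, PUT, POST, HEAD, DELETE
headers = accept, authorization, content-type, origin, referer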