I have a Meteor React app and I need a way to save some info locally that will also be persistent. For example, if a user saves a JSON object, I want to use that same JSON even if the app is closed and reopened later. I tried groundDb but it requires a server side as well. I need this feature so each user can save info such as game level. It would be great if I could use it on the web as well and not just in the native versions. Thanks!
To join the pedants, here's the proper explanation:
You should first convert your object literal into JSON using the global JSON object and its stringify method:
let data = {a: 'some', b: 'data', c: null};
let json = JSON.stringify(data);
localStorage.setItem('data', json);
When you want to retrieve the data and use it in your application, you'll need to parse the JSON back into an object literal:
let json = localStorage.getItem('data');
let data = JSON.parse(json);
localForage
One option, which we use, is the npm package localforage, which provides an asynchronous storage API (backed by IndexedDB or WebSQL where available, with a localStorage fallback) and allows you to save data of any type. You can configure your store and create multiple instances of local storage.
You can store any type in localForage; you aren't limited to strings like in localStorage.
Setting up localforage is similar to using localStorage except you have to use asynchronous calls:
Using Promises (note that you can also use callbacks if you are not in an ES6 environment)
Store your data
localforage.setItem('state', data)
  .catch(console.error.bind(console))
Retrieve your data
localforage.getItem('state')
  .then(data => { /* ... */ })
  .catch(console.error.bind(console))
Name your store
localforage.config({
name: 'myStore'
})
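As a small illustration of the "any type" point above, here is a sketch that stores a typed array directly (the key name is made up for the example):
// Sketch: localForage persists non-string values (typed arrays, Blobs,
// plain objects) directly, unlike localStorage. The key name is illustrative.
const levels = new Uint8Array([1, 2, 3]);

localforage.setItem('levels', levels)
  .then(() => localforage.getItem('levels'))
  .then(stored => console.log(stored instanceof Uint8Array)) // should log true
  .catch(console.error.bind(console));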
You can read more in the localForage documentation.
Yup, it is quite easy: take the object you want, stringify it, and then store it in local storage. A JavaScript object is not itself a string, so it cannot be stored in localStorage as an object; it needs to be converted to a JSON string first.
To save: localStorage.setItem(key, value);
To retrieve: localStorage.getItem(key);
Another good practice is to retrieve the data on login, update it in memory (e.g. in an array), and then store it back into localStorage; that way you won't have any junk data and it is always up to date.
To clear: localStorage.clear(); Note: this will clear the entire localStorage.
Also note that there is a size limit on localStorage; most browsers allow 5-10 MB.
If you have large amounts of data, you can also use IndexedDB. It is fairly new, but better suited to storing large amounts of data in the browser, though implementation quality still varies between browsers.
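If you do reach for IndexedDB, a minimal sketch of the raw API looks something like this (the database, store, and key names are made up for illustration; in practice a wrapper such as localForage, covered above, hides most of this boilerplate):
// Minimal raw IndexedDB sketch: open a database, create an object store,
// write a structured value and read it back. Names are illustrative.
const request = indexedDB.open('gameDb', 1);

request.onupgradeneeded = (event) => {
  // Runs only when the database is first created or the version changes.
  event.target.result.createObjectStore('saves');
};

request.onsuccess = (event) => {
  const db = event.target.result;

  // Write: IndexedDB stores structured data, so no JSON.stringify is needed.
  db.transaction('saves', 'readwrite')
    .objectStore('saves')
    .put({ level: 12, score: 3400 }, 'progress');

  // Read it back (transactions default to read-only).
  const read = db.transaction('saves').objectStore('saves').get('progress');
  read.onsuccess = () => console.log(read.result); // { level: 12, score: 3400 }
};

request.onerror = (event) => console.error(event.target.error);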
Use standard localStorage: https://developer.mozilla.org/en/docs/Web/API/Window/localStorage
localStorage.setItem('xyz', JSON.stringify(object));
const object = JSON.parse(localStorage.getItem('xyz'));
It should be easy. Stringify the object to make it a string, then save it. Follow this:
var dataToStore = JSON.stringify(data);
localStorage.setItem('someData', dataToStore);
You serialize the data to JSON when saving to localStorage:
var serializedData = JSON.stringify(data);
localStorage.setItem('dataKey', serializedData);
And deserialize it when retrieving from localStorage:
var serializedData = localStorage.getItem('dataKey');
var data = JSON.parse(serializedData);
Related
We need to implement a cron service in node js that follows this flow:
query lots of data from Postgres (about 500 MB)
transform the JSON data into another JSON structure
convert the JSON to CSV
gzip
upload to S3 with the "upload" method
Obviously, we need to implement this procedure using streams, without generating memory overhead.
We ran into a lot of problems:
We are using Sequelize, an SQL ORM. With it we can't stream the queries, so we are converting the JSON returned by the query into a readable stream.
We can't find an elegant way to implement a transform stream that transforms the JSON returned by the query (for example, input [{a:1,b:2}, ...] --> output [{a1:1,b1:2}, ...]).
While logging and trying to write to the filesystem instead of S3 (using fs.createWriteStream), it seems that the file is created as soon as the pipeline starts, but its size stays around 10 bytes and only becomes consistent when the streaming process is finished. Furthermore, a lot of RAM is used and the streaming process seems to be useless in terms of memory usage.
How would you write this flow in Node.js?
I've used the following libraries during my experiments:
json2csv-stream
JSONStream
oboe
zlib
fs
aws-sdk
Since the Sequelize results are being read into memory anyway, I don't see the point of setting up a stream to transform the JSON (as opposed to directly manipulating the data that's in memory already), but say you were to port the Sequelize queries to mysql, which does provide streaming; then you could use something like this:
const es = require('event-stream');
const csv = require('fast-csv');
const gzip = require('zlib').createGzip();
const AWS = require('aws-sdk');
const s3Stream = require('s3-upload-stream')(new AWS.S3());
// Assume `connection` is a MySQL connection.
let sqlStream = connection.query(...).stream();
// Create the mapping/transforming stream.
let mapStream = es.map(function(data, cb) {
  // ...modify `data` here...
cb(null, data);
});
// Create the CSV outputting stream.
let csvStream = csv.createWriteStream();
// Create the S3 upload stream.
let upload = s3Stream.upload(...);
// Let the processing begin.
sqlStream.pipe(mapStream).pipe(csvStream).pipe(gzip).pipe(upload);
If the "input stream" were emitting JSON, you can replace sqlStream with something like this:
const JSONStream = require('JSONStream');
someJSONOutputtingStream.pipe(JSONStream.parse('*'))
(the rest of the pipeline would remain the same)
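As a follow-up to the point above about the Sequelize results already being in memory: if you stay on Sequelize, a minimal sketch of the same pipeline built only from Node's built-in stream module could look like this. The field names and output file are made up; swap the fs.createWriteStream target for the S3 upload stream from the snippet above, and note the naive CSV conversion does no quoting or escaping.
// Sketch only: rows are assumed to already be in memory, as with a plain
// Sequelize query result. Requires a Node version with Readable.from.
const { Readable, Transform, pipeline } = require('stream');
const zlib = require('zlib');
const fs = require('fs');

const rows = [{ a: 1, b: 2 }, { a: 3, b: 4 }]; // pretend Sequelize result

// Rename keys, as in the question's example ({a, b} -> {a1, b1}).
const rename = new Transform({
  objectMode: true,
  transform(row, _enc, cb) {
    cb(null, { a1: row.a, b1: row.b });
  }
});

// Very naive object -> CSV line conversion (no header, quoting or escaping).
const toCsv = new Transform({
  writableObjectMode: true,
  transform(row, _enc, cb) {
    cb(null, Object.values(row).join(',') + '\n');
  }
});

pipeline(
  Readable.from(rows),
  rename,
  toCsv,
  zlib.createGzip(),
  fs.createWriteStream('out.csv.gz'),
  (err) => (err ? console.error(err) : console.log('done'))
);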
I have this JSON array stored in a local variable:
let bigJsonArray = JSON(response)
My question is whether there is any possibility to store this "bigJsonArray" in a global variable/session/cookie/config so I can access it in every view of my app.
Does anybody know how to handle this and could help me?
Greetings and thanks!
What you can do is define bigJsonArray as a global variable simply by declaring it outside of any class; the Swift compiler will treat it as a global variable and you can access it from anywhere in your code.
for example:
import UIKit
var bigJsonArray = JSON(response)
class a {
var x = 0
}
Of course, that will not save the data if you kill the app, but from what I understand of your question you just need to be able to access it from the whole app without resending a request to the server.
If you want to save the JSON data permanently, you just store the data that you received as a file, and the next time you need it, you read it from the file and parse it (there's actually a method for that) instead of downloading and parsing the data. Much easier than trying to store the parsed data.
If this is data that can be downloaded again, read the appropriate documentation to make sure the file isn't backed up, and is stored in a cache directory where the OS can remove it if space is tight.
How can I use HTML5 local storage to save a little exe file and then download it by clicking the button?
localStorage is not, as you might think, a database or even the file system; it just stores tiny bits of data as key: value pairs of strings.
If you have worked with JSON before, this will be very easy to grasp.
Below is an example of setting and retrieving values from Local-storage:
localStorage.setItem('KEY', JSON.stringify('VALUE'));
// KEY is kind of like the variable name and VALUE is the actual data
JSON.parse(localStorage.getItem('KEY'));
// You use the KEY to access the value
// Using the JSON methods stringify and parse just to be on the safe side
HTML5 localStorage is not for files.
Have a look at Mozilla's documentation here: https://developer.mozilla.org/en-US/docs/Web/API/Storage/LocalStorage
Instead it's for key/value pairs.
// Save data to the current local store
localStorage.setItem("username", "John");
// Access some stored data
alert( "username = " + localStorage.getItem("username"));
To start a download, you may want to look at a question like Download File Using Javascript/jQuery
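For reference, the usual pattern for starting a download from JavaScript is to build a Blob and click a temporary anchor element; a minimal sketch (the file name and contents here are made up, and nothing comes from localStorage):
// Sketch: trigger a client-side download of in-memory data via a Blob.
const blob = new Blob(['hello world'], { type: 'application/octet-stream' });
const url = URL.createObjectURL(blob);

const link = document.createElement('a');
link.href = url;
link.download = 'example.bin'; // suggested file name
document.body.appendChild(link);
link.click();

document.body.removeChild(link);
URL.revokeObjectURL(url);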
I have an ASP.NET MVC3 application that uses embedded RavenDB to store data.
The view needs json data that is now created by the controller in this way:
public ContentResult Data()
{
var res = JsonConvert.SerializeObject(DocumentSession.Query<DataObject>());
return new ContentResult { Content = res, ContentType = "application/json" };
}
Everything works fine, but to me it seems inefficient because data that is stored in the DB in JSON format is deserialized into POCOs and then serialized again.
Is there a more direct way to get the JSON data directly from the embedded DB?
It's not inefficient at all. Keep in mind that internally Raven actually uses BSON, so you would have to translate it anyway. There are also metadata fields. If you were to return the stored document directly through your controller, you would have no opportunity to shape the response and strip off the unwanted fields.
If you must continue with this line of thinking, you have two options:
You could use DocumentStore.DatabaseCommands.Get() and related operations to return RavenJObjects that you could then translate to JSON.
You could talk directly to the Raven database over HTTP without using the Raven client.
Neither of these is straightforward, and you would be throwing away a lot of the goodness of the Raven Client API. IMHO, any performance gain you might achieve would be unnoticeable. I would stick with your current approach.
Also - If you are just trying to avoid having to serialize here, consider returning a JsonResult instead of a ContentResult. If you want to use Json.Net instead (per your other recent post), Here is a cleaner way to do it: http://james.newtonking.com/archive/2008/10/16/asp-net-mvc-and-json-net.aspx
Can I save data to either CSV or XML files offline, on the client side, via HTML5?
Offline storage is internal storage. It is not meant to export files in a specific format to a specific folder on disk.
The web storage API stores data as [key,value] pair where both key,value are Strings.
So data in any format needs to adhere to this mechanism for local storage. For example, if you have a JSON object like:
{
name:'John',
gender:'male'
}
You can store it (through JavaScript) after converting it to a string, like:
localStorage.setItem("myObj", JSON.stringify({ name: 'John', gender: 'male' }));
For JSON objects, use JSON.stringify() to convert them to strings and use JSON.parse() to read them back.
You can use localStorage, but that only allows you to store something in the browser's internal storage (you cannot decide where and how the data is written).
There's also a File API, but it is at a very early stage and, for now, it doesn't allow you to store files arbitrarily on the client:
HTML 5 File API
Let's say you have created an array or object like this:
var arrayOrObject = [{obj1: {name: 'John', age: 16}}, {obj2: {name: 'Jane', age: 17}}];
You can save this data on the local device by using localStorage:
if (typeof(localStorage) == 'undefined') {
  alert('Your browser does not support HTML5 localStorage. Try upgrading.');
} else {
  try {
    localStorage.setItem("storedArrayOrObject", JSON.stringify(arrayOrObject));
    // saves to storage as "key", "value"
  } catch (e) {
    if (e.name == 'QuotaExceededError') {
      alert('Quota exceeded!'); // the data wasn't saved because the storage quota was exceeded
    }
  }
}
To get the data in Array or Object Structure:
var getStoredArrayOrObject = JSON.parse(localStorage.getItem('storedArrayOrObject'));
To remove the localStorage Data:
localStorage.removeItem('storedArrayOrObject');
Not recommended, but also available:
localStorage.clear();
You could save and export as CSV like this... http://joshualay.net/examples/StamPad/StamPad.html
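The rough idea behind that kind of export, as a sketch (the rows and file name below are made up, and no quoting or escaping of values is done):
// Sketch: turn an array of objects into CSV and offer it as a download.
var rows = [
  { name: 'John', gender: 'male' },
  { name: 'Jane', gender: 'female' }
];

var header = Object.keys(rows[0]).join(',');
var lines = rows.map(function (row) { return Object.values(row).join(','); });
var csv = [header].concat(lines).join('\n');

var link = document.createElement('a');
link.href = 'data:text/csv;charset=utf-8,' + encodeURIComponent(csv);
link.download = 'export.csv';
document.body.appendChild(link);
link.click();
document.body.removeChild(link);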