JSON object getting damaged when using chrome.storage.local.set

When retrieving a complex JSON object from chrome.storage.local, the object comes back broken.
mock.json
{
  "ThingOne" : [
    "a",
    "b"
  ],
  "ThineTwo" : [
    "a",
    "b"
  ],
  "People" : {
    "FamilyOne" : {
      "AgeOne" : "3",
      "AgeTwo" : "8"
    }
  },
  "Hats" : ["blue", "red", "green"]
}
and I am fetching this file (correctly) using
fetch('./mock.json').then(response => {
  console.log(response);
  return response.json();
}).then(data => {
  // data == the whole json file
  var data2 = JSON.stringify(data);
  chrome.storage.local.set({'StoredJson': data2});
  // here this is the result of this code
  // console.log(data2.ThingOne[0]);
  // outputs => "a"
}).catch(err => {
  console.log("Error Reading data " + err);
});
waitfunction();
chrome.storage.local.get('StoredJson', function(result) {
  console.log("from get ------"); // outputs below
  console.log(result); // {Data: ""{\"ThingOneOne\":[\"a\",\"b\"],\...
  console.log(typeof result); // object
  console.log(typeof result.ThingOne); // undefined
  // https://imgur.com/OF7pVQQ
});
Why does it work when I fetch the object but not when I retrieve it from storage? I have tried storing it after JSON.stringifying it, and I have tried to use it after JSON.parsing it, which returns
VM6:1 Uncaught SyntaxError: Unexpected token o in JSON at position 1
at JSON.parse ()
indicating that it is already a JS object.
I have tried using dot notation and bracket notation; neither works. When I store it in the Chrome console as var data = {//json here} it works, but not live. StackOverflow: Save json to chrome storage / local storage hasn't helped me. Picture of the console: https://imgur.com/OF7pVQQ

There are multiple problems in the code.
There's no need for JSON.stringify; just store the data directly.
Both fetch and chrome.storage are asynchronous, so your chrome.storage.local.get runs before the data is set and doesn't see the correct data.
waitfunction(); won't wait for anything; it doesn't influence the asynchronous code before or after it.
chrome.storage.local.get('StoredJson', callback) reads the data into an object property named StoredJson, i.e. you can read the value as result.StoredJson.
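For example, if the object is stored directly (no JSON.stringify), it can be read back under that key; a minimal sketch:
chrome.storage.local.set({StoredJson: data}, () => {
  chrome.storage.local.get('StoredJson', result => {
    // The value lives under the key you asked for, not on result itself
    console.log(result.StoredJson.ThingOne[0]); // "a"
  });
});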
Overall, a proper modern solution is to switch to async/await:
(async () => {
  try {
    const data = await (await fetch('./mock.json')).json();
    console.log('Fetched', data);
    await writeStorage({StoredJson: data});
    const {StoredJson} = await readStorage('StoredJson');
    console.log('Stored', StoredJson);
  } catch (err) {
    console.log(err);
  }
})();

function readStorage(key) {
  return new Promise(resolve => {
    chrome.storage.local.get(key, resolve);
  });
}

function writeStorage(data) {
  return new Promise(resolve => {
    chrome.storage.local.set(data, resolve);
  });
}
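As an aside, in newer Chrome versions (the Manifest V3 era) chrome.storage.local.get/set also return Promises when called without a callback, so the wrapper helpers may not be needed; a minimal sketch assuming such promise support, replacing the body of the async IIFE above:
// Assumes a Chrome version where chrome.storage methods return Promises
const data = await (await fetch('./mock.json')).json();
await chrome.storage.local.set({StoredJson: data});
const {StoredJson} = await chrome.storage.local.get('StoredJson');
console.log('Stored', StoredJson);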

Related

Data fetch from JSON file

I am trying to fetch data from an external JSON and I was able to console.log it, so the fetch works, but I am having trouble printing the values.
JSON:
{
  "data": {
    "shoes": [
      {
        "types": [
          {
            "color": "pink"
          }
        ]
      }
    ]
  }
}
I need to get access to the color (pink).
This is my fetch:
const shoesInformations = "json.url"
const [shoesData, setShoesData] = useState([]);

useEffect(() => {
  getShoesInfo();
}, []);

const getShoesInfo = async () => {
  try {
    const response = await fetch(shoesInformations);
    const jsonData = await response.json();
    const { data } = jsonData;
    setShoesData(jsonData);
    console.log(data);
  } catch (err) {
    console.log(err);
  }
};
And my attempt to print it:
<p>{shoesData.types.color}</p>
I do not need to map through the data, just print the values one by one: {shoesData.types.color[1]}
The main problem is that you assign the whole fetch response to the shoesData state variable instead of only the shoes array within it. Try this:
const [shoesData, setShoesData] = useState([]);

useEffect(() => {
  getShoesInfo();
}, [getShoesInfo]);

const getShoesInfo = async () => {
  try {
    const response = await fetch(shoesInformations);
    const jsonData = await response.json();
    setShoesData(jsonData.data.shoes);
  } catch (err) {
    console.log(err);
  }
};
and then when you want to present it, first add a check for an empty array or use the optional chaining operator (?.). Either do (if you really want it hardcoded):
<p>{shoesData[0]?.types[0].color}</p>
<p>{shoesData[1]?.types[0].color}</p>
or more flexibly something like:
const getShoesRepresentation = () => {
  if (shoesData.length === 0) {
    // Data not loaded yet, render nothing
    return null;
  }
  else {
    return (
      <p>{shoesData[0].types[0].color}</p>
    );
  }
};
and then use {getShoesRepresentation()} in your rendering. This will handle the empty array case and you can extend it to handle iteration over all the shoe objects that you need. I strongly suggest you use an iteration approach instead of hard-coding the data like that. You can safely use it by supplying shoeIndexes, which contains only the indexes you want to present, and then iterating over them and creating a respective <p> for each, as sketched below.
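A minimal sketch of that iteration (shoeIndexes is a made-up name holding the indexes you want to show):
const shoeIndexes = [0, 1]; // hypothetical: indexes of the shoes to present

const getShoesRepresentation = () =>
  shoeIndexes
    .filter(i => shoesData[i] != null)   // skip entries that haven't loaded yet
    .map(i => (
      <p key={i}>{shoesData[i].types[0].color}</p>
    ));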
shoesData.types.color[1] won't work; you only have one element, so your index must be 0:
shoesData.types.color[0]

Creating a JSON - Angular 5

I have a JSON that is read into the UI and modified as part of a review process. Finally I want to save the new data to a new folder/file.
Code sample below - it always ends with the POST call in error:
SaveFinalData() {
  this.postJson<Reaction>(
    './assets/ICP/Reviewed/Json/out.json',
    this.reactionDatabase.reactiondataChange.value);
}

postJson<Reaction>(url: string, Reaction: any) {
  //let body = JSON.stringify({ Reaction });
  this.http.post(url, body)
    .subscribe(
      (val) => {
        console.log("POST call successful value returned in body", val);
      },
      response => {
        console.log("POST call in error", response);
      },
      () => {
        console.log("The POST observable is now completed.");
      });
}
I have tried the two alternatives below:
(1)
var theJSONdata = JSON.stringify({ Reaction });
window.localStorage.setItem('./assets/ICP/Reviewed/Json/out.json', theJSONdata)
Result -- NO LUCK!
(2)
var jsonfile = require('jsonfile')
var file = './assets/ICP/Reviewed/Json/out.json'
var obj = {name: 'JP'} // SAMPLE DATA
jsonfile.writeFile(file, obj, function (err) {
  console.error(err)
})
Result -- NO LUCK! --> GIVES ERROR fs.writeFile is not a function
Please kindly help/guide me to reach the final result. Thanks

Parsing json response gives undefined

I am using angularfire2 to fetch data from the Realtime Database, and the response comes back like below:
{"type":"value",
"payload":{
"-LJXFAd_q3Cin64EBc7H":
{"_date":"9-8-2018",
"_deliveryType":"Pick up",
"_estDeliveryTime":"2018-08-10T11:43:57.164Z",
"_location":""}
}}
The element inside payload is a key created using push, so I won't know it in advance. How do I get the data under "-LJXFAd_q3Cin64EBc7H"?
There are many such entries inside payload and I need to fetch them all.
The code used to get the above is:
getOrderHistory(uid: string) {
  console.log('start of getOrderHistory with uid:' + uid)
  return new Promise((resolve, reject) => {
    this.db.object("/users/" + uid + "/orders").snapshotChanges().subscribe(
      res => {
        //console.log('response:' + res)
        resolve(res)
      },
      err => {
        console.log(err)
        reject(err)
      }
    )
  })
}
Try this, it maps each snapshot to an object that holds its key and values.
this.db.object("/users/" + uid + "/orders").snapshotChanges()
  .map(snapshot => {
    const key = snapshot.key;
    const data = snapshot.payload.val();
    return { key, ...data };
  })
  .subscribe(
    res => {
      resolve(res);
    },
    err => {
      console.log(err);
      reject(err);
    });
Normally, the Realtime Database sends a snapshot where you can use snapshot.key to get the key and snapshot.val() to get at the payload.
JSON.parse it then iterate it or get at it with dot notation or bracket notation. To get a property:
obj["payload"]["-LJXFAd_q3Cin64EBc7H"]
To iterate it:
function eachRecursive(obj)
{
for (var k in obj)
{
if (typeof obj[k] == "object" && obj[k] !== null)
eachRecursive(obj[k]);
else
// do something...
}
}
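For the response shape in the question, a minimal usage sketch (assuming the parsed response is held in a variable named res) can iterate the keys of payload directly:
// res is the parsed response object shown in the question (assumed variable name)
const orders = res.payload;
Object.keys(orders).forEach(key => {
  const order = orders[key];  // e.g. orders["-LJXFAd_q3Cin64EBc7H"]
  console.log(key, order._date, order._deliveryType);
});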

TextDecoder failing in ES6 Promise recursion

I'm attempting to query an API which responds with a ReadableStream of XML.
The code below uses a recursive Promise. Recursive because it sometimes doesn't decode the stream in a single iteration, and this is what's causing my headache.
While I'm successfully fetching the data, for some reason the decoding stage sometimes doesn't complete, which leads me to believe it happens when the stream is too large for a single read.
componentDidMount() {
  fetch("http://thecatapi.com/api/images/get?format=xml&size=med&results_per_page=9")
    .then((response) => {
      console.log('fetch complete');
      this.untangleCats(response);
    })
    .catch(error => {
      this.state.somethingWrong = true;
      console.error(error);
    });
}

untangleCats({body}) {
  let reader = body.getReader(),
      string = "",
      read;
  reader.read().then(read = (result) => {
    if (result.done) {
      console.log('untangling complete'); // Sometimes not reaching here
      this.herdingCats(string);
      return;
    }
    string += new TextDecoder("utf-8").decode(result.value);
  }).then(reader.read().then(read));
}
I think the next iteration was sometimes being called before the current iteration had completed, leading to incorrect concatenation of the decoded XML.
I converted the function to async and made it a regular recursive method of the component rather than a recursive promise.
constructor({mode}) {
  super();
  this.state = {
    mode,
    string: "",
    cats: [],
    somethingWrong: false
  };
}

componentDidMount() {
  fetch("http://thecatapi.com/api/images/get?format=xml&size=med&results_per_page=9")
    .then( response => this.untangleCats( response.body.getReader() ) )
    .catch(error => {
      this.setState({somethingWrong: true});
      console.error(error);
    });
}

async untangleCats(reader) {
  const {value, done} = await reader.read();
  if (done) {
    this.herdingCats();
    return;
  }
  // Append the decoded chunk without mutating the previous state directly
  this.setState({
    string: this.state.string + new TextDecoder("utf-8").decode(value)
  });
  return this.untangleCats(reader);
}
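As a side note, when decoding a byte stream chunk by chunk it is usually safer to reuse a single TextDecoder and pass {stream: true}, so multi-byte characters split across chunks are handled correctly; a minimal sketch of that variant (same assumed method names as above):
async untangleCats(reader, decoder = new TextDecoder("utf-8")) {
  const {value, done} = await reader.read();
  if (done) {
    this.herdingCats();
    return;
  }
  // stream: true keeps partial multi-byte sequences buffered until the next chunk
  this.setState({
    string: this.state.string + decoder.decode(value, {stream: true})
  });
  return this.untangleCats(reader, decoder);
}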

Streaming responses from MongoDB to client via Hapi

What's the best approach to streaming MongoDB query responses to the client via Hapi? I've seen some examples with http or request, but not hapi.
The problem is that I'm getting concatenated and stringified JSON objects on the client side, but I can't call JSON.parse on the result because together it's not valid JSON.
Some solutions I've seen suggest concatenating on the server side before sending to the client, but that seems to defeat the value of streams.
For example:
const Hapi = require('hapi'),
  MongoClient = require('mongodb').MongoClient,
  Readable = require('stream').Readable;

// Connection url
const url = 'mongodb://localhost:27017/test';

// Create a server with a host and port
const server = new Hapi.Server();
server.connection({
  host: 'localhost',
  port: 8000
});

// Add the route
server.route({
  method: 'GET',
  path: '/stream',
  handler: function (request, reply) {
    let docs = [{ a: 1, b: 1 }, { a: 2, b: 2 }, { a: 3, b: 3 }, { a: 4, b: 4 }];
    // Connect using MongoClient
    MongoClient.connect(url, (err, db) => {
      // Create a collection we want to drop later
      const col = db.collection('stream_example');
      // Insert documents into collection
      col.insertMany(docs, { w: 1 }, function (err) {
        if (err) return console.log(err);
        // Perform a find to get a cursor
        const stream = col.find()
          .stream({
            transform: function (doc) {
              return JSON.stringify(doc);
            }
          });
        reply(new Readable().wrap(stream));
      });
    });
  }
});

// Start the server
server.start(err => {
  if (err) {
    throw err;
  }
  console.log('Server running at:', server.info.uri);
});
Returns a response.result of:
"{"_id":"57b0b99d681bb97a9321f03e","a":1,"b":1}{"_id":"57b0b99d681bb97a9321f03f","a":2,"b":2}{"_id":"57b0b99d681bb97a9321f040","a":3,"b":3}{"_id":"57b0b99d681bb97a9321f041","a":4,"b":4}"
Which is not valid JSON and cannot be parsed.
I've tried piping this stream into the event-stream module's .join('\n') stream to add newlines, while also pushing stringified "[" and "]" before and after to build a stringified JSON array, but have not yet been successful. This feels too hacky anyway.
Is there a better way?
Valid JSON has to be sent by using a stream transform.
Basically you have to:
start the stream with '['
then append each stringified JSON object
add ',' between the objects
end the stream with ']'
so the final result received in the stream is valid JSON, like
[
  {"key": "value"},
  {"key": "value"}
]
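A minimal sketch of such a transform using Node's stream.Transform (the toJsonArray name is made up for illustration):
const { Transform } = require('stream');

// Wraps a stream of stringified documents into one valid JSON array
function toJsonArray() {
  let first = true;
  return new Transform({
    transform(chunk, encoding, callback) {
      const prefix = first ? '[' : ',';   // '[' before the first doc, ',' between docs
      first = false;
      callback(null, prefix + chunk.toString());
    },
    flush(callback) {
      callback(null, first ? '[]' : ']'); // close the array (or emit an empty one)
    }
  });
}

// e.g. in the handler: reply(new Readable().wrap(stream).pipe(toJsonArray()));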
Some of the solutions:
http://andyfiedler.com/2017/01/mongodb-stream-to-hapi
https://github.com/nlindley/hapi-mongo-stream
https://github.com/dominictarr/JSONStream
Here is an example of how I have been using Mongo with Hapi.
From BoardRepo.js:
module.exports = {
  GetAllBoards: function (facility) {
    return new Promise(function (resolve, reject) {
      var db = mongo.ETestDatabase;
      db.collection('boards').find({ "Location.Facility": facility }).toArray().then(r => {
        resolve(r);
      }).catch(err => {
        logger.error('Error getting all boards by facility - ' + err.message);
        reject(err.message);
      });
    });
  }
};
In the Hapi handler (BoardHandler.js):
module.exports = {
  GetAllBoards: {
    method: 'GET',
    path: '/v1/boards/facility/all/{facility}',
    config: {
      auth: 'jwt',
      plugins: { 'hapiAuthorization': { roles: ['ADMINISTRATOR', 'MANAGER', 'TECHNICIAN', 'OPERATOR'] } },
      description: 'Gets all boards per facility',
      tags: ['api'],
      handler: (request, reply) => {
        logger.info('[' + request.auth.credentials.username + '] GetAllBoards requested');
        var facility = request.params.facility;
        repo.GetAllBoards(facility)
          .then(boards => {
            if (boards !== null) {
              reply(boards);
            } else {
              reply().code(404);
            }
          })
          .catch(err => {
            geh.HandleError(request.auth.credentials.username, err, reply);
          });
      }
    }
  }
};
};