Save FileReader result to a variable for later use - JSON

I can't find a simple answer, but my code is simple.
I tried something like the code below, but whenever I try to console.log my testResult, I always receive null. How do I save the data from the file correctly?
public getFile(
  sourceFile: File
): string {
  let testResult;
  const file = sourceFile[0];
  const fileReader = new FileReader();
  fileReader.readAsText(file, "UTF-8");
  fileReader.onloadend = (e) => {
    testResult = fileReader.result.toString();
  };
  console.log(testResult);
  return testResult;
}
This problem is related to my other topics; the main issue is that I can't handle loading a JSON file, translating it, and sending it back to the user. If I can save this file outside onloadend, then I hope I can handle the rest (other attempts failed; this one is blocking me at the very beginning).

Your issue is a classic one related to asynchronous operations. The function you assign to onloadend is called only when the loadend event fires, but the rest of the code does not wait for that to happen and continues executing. So console.log runs immediately, and return returns testResult while it is still empty.
First, to see what I mean, move the console.log(testResult) line inside your onloadend handler:
fileReader.onloadend = (e) => {
  testResult = fileReader.result.toString();
  console.log(testResult);
};
At this point testResult is not empty, and you may continue handling it inside this function. However, if you want your getFile method to be really reusable, returning testResult so it can be processed somewhere else, you need to wrap its body in a Promise, like this:
public getFile(
  sourceFile: FileList // a FileList (e.g. from an <input type="file">), since the first entry is read below
): Promise<string> {
  return new Promise((resolve) => {
    const file = sourceFile[0];
    const fileReader = new FileReader();
    fileReader.onloadend = (e) => {
      const testResult = fileReader.result.toString();
      resolve(testResult);
    };
    fileReader.readAsText(file, "UTF-8");
  });
}
Now, wherever you need the file contents, you can use the yourInstance.getFile method as follows:
yourInstance.getFile(sourceFile).then(testResult => {
  // do whatever you need here
  console.log(testResult);
});
Or in the async/await way:
async function processResult() {
  const testResult = await yourInstance.getFile(sourceFile);
  // do whatever you need
  console.log(testResult);
}
If you are not yet familiar with promises and/or async/await, please read up on them first.
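Since your end goal is to load a JSON file, here is a minimal sketch of how the promise-based getFile could feed into JSON parsing (assuming the selected file contains valid JSON and sourceFile comes from a file input as above):
yourInstance.getFile(sourceFile)
  .then(text => {
    const data = JSON.parse(text); // throws if the file is not valid JSON
    // translate / transform "data" here, then hand it back to the user
    console.log(data);
  })
  .catch(err => console.error("Could not read or parse the file", err));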

ReactJS DraftJS Initialize from Serialized Data

So I am using the DraftJS package with React along with the mentions plugin. When a post is created, I store the raw JS object in my PostgreSQL JSONField:
convertToRaw(postEditorState.getCurrentContent())
When I edit the post, I set the editor state as follows:
let newEditorState = EditorState.createWithContent(convertFromRaw(post.richtext_content));
setEditorState(newEditorState);
The text gets set correctly, but none of the mentions are highlighted AND I can't add new mentions. Does anyone know how to fix this?
I am using the mention plugin: https://www.draft-js-plugins.com/plugin/mention
To save the data:
function saveContent() {
  const content = editorState.getCurrentContent();
  const rawObject = convertToRaw(content);
  const draftRaw = JSON.stringify(rawObject); // <- save this to the database
}
and retrieval:
setEditorState(() => EditorState.push(
  editorState,
  convertFromRaw(JSON.parse(draftRaw)),
  "remove-range"
));
This should preserve your data as saved.
The example provided (which works OK) is for inserting a new block with a mention, saving the entityMap as well.
mentionData is just a simple object: {id: .., name: .., link: .., avatar: ..}
One more thing: initialize the state only once; in other words, do not recreate it.
const [editorState, setEditorState] = useState(() => EditorState.createEmpty());
and then populate it with something like:
useEffect(() => {
  try {
    if (theDraftRaw) {
      let mtyState = EditorState.push(
        editorState,
        convertFromRaw(JSON.parse(theDraftRaw)),
        "remove-range"
      );
      setEditorState(mtyState);
    } else editorClear();
  } catch (e) {
    console.log(e);
    // or some fallback to another field, e.g. plain text
  }
}, [theDraftRaw]);

const editorClear = () => {
  if (!editorState.getCurrentContent().hasText()) return;
  let _editorState = EditorState.push(
    editorState,
    ContentState.createFromText("")
  );
  setEditorState(_editorState);
};
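On the mentions not being highlighted: restoring the raw content restores the mention entities, but they are only rendered as mentions if the editor is mounted with the mention plugin's decorators. A minimal sketch of that wiring, assuming the packages from the plugin page linked in the question (@draft-js-plugins/editor and @draft-js-plugins/mention; older releases use draft-js-plugins-editor and draft-js-mention-plugin):
import Editor from "@draft-js-plugins/editor";
import createMentionPlugin from "@draft-js-plugins/mention";

// create the plugin once (e.g. at module level) so its decorator stays stable
const mentionPlugin = createMentionPlugin();

// inside the component's render:
<Editor
  editorState={editorState}
  onChange={setEditorState}
  plugins={[mentionPlugin]} // without this, restored mention entities render as plain text
/>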

Calling stored procedures within fast-csv asynchronously

I am writing a backend API in Node.js and need the functionality for users to upload files with data and then call stored procedures to insert the data into MySQL. I'm thinking of using fast-csv as the parser, but I am struggling with how to set up the call to the stored procedure inside the CSV stream. The idea is something like this:
var fs = require("fs");
var csv = require("fast-csv");
var stream1 = fs.createReadStream("files/testCsvFile.csv");

csv
  .fromStream(stream1, { headers: true })
  .on("data", function(data) {
    // CALL TO SP with params from "data" //
    numlines++;
  })
  .on("end", function() {
    console.log("done");
  });
In other parts of application I have set up routes as follows:
auth.post("/verified", async (req, res) => {
  var user = req.session.passwordless;
  if (user) {
    const rawCredentials = await admin.raw(getUserRoleCredentials(user));
    const { user_end, role } = await normalizeCredentials(rawCredentials);
    const user_data = { user_end, role };
    res.send(user_data);
  } else {
    res.sendStatus(401);
  }
});
...that is, routes are written in the async/await style, with the queries (all of them stored procedure calls) defined as Promises. I would like to follow the same pattern in the upload / parse CSV / call an SP for every line function.
This is doing the job for me - can you please describe how to achieve the same with your framework? I believe it should be possible, I just need to configure it correctly:
// use fast-csv to stream data from a file
csv
  .fromPath(form.FileName, { headers: true })
  .on("data", async data => {
    const query = await queryBuilder({
      schema,
      routine,
      parameters,
      request
    }); // here we prepare the query for calling the SP with parameters from data
    winston.info(query + JSON.stringify(data));
    const rawResponse = await session.raw(query); // here the query gets executed
    fileRows.push(data); // push each row - for testing only
  })
  .on("end", function() {
    console.log(fileRows);
    fs.unlinkSync(form.FileName); // remove temp file
    // process "fileRows" and respond
    res.end(JSON.stringify(fileRows)); // - for testing
  });
As mentioned in the comment, I made my scramjet framework to handle such use cases with ease... Please correct me if I understood it wrong, but I take it you want to run the two await lines for every CSV row in the test.
If so, your code would look like this (updated to match your comment/answer):
var fs = require("fs");
var csv = require("fast-csv");
var stream1 = fs.createReadStream("files/testCsvFile.csv");
var { DataStream } = require("scramjet");

DataStream
  // the following line will convert any stream to scramjet.DataStream
  .from(csv.fromStream(stream1, { headers: true }))
  // the next line controls how many simultaneous operations are made
  // I assumed 16, but if you're fine with 40 or you want 1 - go for it.
  .setOptions({ maxParallel: 16 })
  // the next line will call your async function and wait until it's completed
  // and control the back-pressure of the stream
  .do(async (data) => {
    const query = await queryBuilder({
      schema,
      routine,
      parameters,
      request
    }); // here we prepare the query for calling the SP with parameters from data
    winston.info(query + JSON.stringify(data));
    const rawResponse = await session.raw(query); // here the query gets executed
    return data; // push each row - for testing only
  })
  // the next line will run the stream until the end and return a promise
  .toArray()
  .then(fileRows => {
    console.log(fileRows);
    fs.unlinkSync(form.FileName); // remove temp file
    // process "fileRows" and respond
    res.end(JSON.stringify(fileRows)); // - for testing
  })
  .catch(e => {
    res.writeHead(500); // some error handling
    res.end(e.message);
  });
// you may want to put an await statement before this, or call then to check
// for errors, which I assume is your use case.
To answer your comment question: if you were to use an async function in the on("data") handler, you would need to build an array of promises and await Promise.all of that array on stream end - the promises have to be collected synchronously as the rows come in, so an async function in the event handler alone won't do it.
In scramjet this happens under the hood, so you can simply pass the async function.
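For completeness, here is a minimal sketch of that Promise.all approach with plain fast-csv (untested, reusing the queryBuilder and session helpers from the snippets above; note it loses the parallelism control scramjet gives you):
const pending = []; // one promise per CSV row, collected synchronously
csv
  .fromStream(stream1, { headers: true })
  .on("data", data => {
    // do not await here - push the row's promise and keep streaming
    pending.push((async () => {
      const query = await queryBuilder({ schema, routine, parameters, request });
      await session.raw(query); // the SP call for this row
      return data;
    })());
  })
  .on("end", async () => {
    const fileRows = await Promise.all(pending); // wait for every row's SP call
    res.end(JSON.stringify(fileRows));
  });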

Is it possible to ignore &#65279; in innerHTML?

I have a line of code that looks like this:
await page.$$eval("a", as => as.find(a => a.innerText.includes("shop")).click());
So it will click "shop" and all is okay, but sometimes "shop" is written like this: "S&#65279;h&#65279;op", and then Puppeteer isn't able to find it. Is it possible to ignore &#65279; so that Puppeteer only sees "shop"?
You can decode the innerText using DOMParser. Example copied from this answer.
window.getDecodedHTML = function getDecodedHTML(encodedStr) {
  const parser = new DOMParser();
  const dom = parser.parseFromString(
    `<!doctype html><body>${encodedStr}`,
    "text/html"
  );
  return dom.body.textContent;
};
Save the above snippet to some file like script.js and inject it for easier usage.
await page.evaluate(fs.readFileSync('script.js', 'utf8'));
Now you can use it to decode the innerText.
await page.$$eval("a", as => as.find(a => getDecodedHTML(a.innerText).includes("shop")).click());
The solution might not be optimal, but it should work.
Here is another snippet for you which doesn't require DOMParser:
window.getDecodedHTML = function(str) {
  return str.replace(/&#(\d+);/g, function(match, dec) {
    return String.fromCharCode(dec);
  });
};
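Either helper is used the same way inside the page context; a quick sketch of injecting the regex-based version inline instead of from a file:
await page.evaluate(() => {
  window.getDecodedHTML = str =>
    str.replace(/&#(\d+);/g, (match, dec) => String.fromCharCode(dec));
});

await page.$$eval("a", as =>
  as.find(a => getDecodedHTML(a.innerText).includes("shop")).click()
);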

html fetch multiple files

I would like to fetch multiple files at once using the new Fetch API (https://fetch.spec.whatwg.org/). Is it possible natively? If so, how should I do it, leveraging promises?
var list = [];
var urls = ['1.html', '2.html', '3.html'];
var results = [];

urls.forEach(function(url, i) { // (1)
  list.push( // (2)
    fetch(url).then(function(res) {
      return res.blob().then(function(blob) {
        results[i] = blob; // (3)
      });
    })
  );
});

Promise
  .all(list) // (4)
  .then(function() {
    alert('all requests finished!'); // (5)
  });
This is untested code! Additionally, it relies on Array.prototype.forEach and the new Promise object of ES6. The idea works like this:
Loop through all URLs.
For each URL, fetch it with the fetch API, store the returned promise in list.
Additionally, when the request is finished, store the result in results.
Create a new promise, that resolves, when all promises in list are resolved (i.e., all requests finished).
Enjoy the fully populated results!
While implementing Boldewyn's solution in Kotlin, I pared it down to this:
fun fetchAll(vararg resources: String): Promise<Array<out Response>> {
    return Promise.all(resources.map { fetch(it) }.toTypedArray())
}
Which roughly translates to this in JavaScript:
function fetchAll(...resources) {
  var destination = [];
  resources.forEach(it => {
    destination.push(fetch(it));
  });
  return Promise.all(destination);
}
Earlier, I tried to use map instead of forEach + pushing to a new array, but for some reason that simply didn't work.
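For reference, a map-based version of the same idea would be just this (untested sketch, mirroring the Kotlin code above):
function fetchAll(...resources) {
  // map each URL straight to its fetch promise and await them all
  return Promise.all(resources.map(it => fetch(it)));
}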

LocomotiveJS access response JSON in controller's after filter

I'm looking for a way to access the JSON being sent back to the requestor in the "after" filter for a controller.
var locomotive = require('locomotive');
var myController = new locomotive.Controller();

myController.after('myAction', function(next) {
  var response = {}; // I want to access the JSON being sent back in myAction: {'hello':'world'}
  console.log(response); // this should log "{'hello':'world'}"
  next();
});

myController.myAction = function myAction() {
  this.res.json({'hello':'world'});
};

module.exports = myController;
If anyone has any way of doing this, it would be much appreciated.
In your main action, assign your json to an object on this (res is reserved):
myController.myAction = function myAction() {
  this.model = {'hello':'world'};
  this.res.json(this.model);
};
Then you can access it in your after filter:
myController.after('myAction', function(next) {
  var model = this.model;
  console.log(model);
  next();
});
I found a "hack" solution... It's not the cleanest, and requires changing the code within the express response.js file in "node_modules"...
If anyone has a better option where you can access the json being sent in response to the request within the controller action (or controller filter) itself, I'd greatly appreciate it.
Thanks.
In the ~/node_modules/locomotive/node_modules/express/lib/response.js file, I altered the res.json function (line 174 for me) to include the following line after the declaration of the body variable (which is passed to the send function):
this.responseJSON = body;
This allows you to access this.responseJSON within a controller's after filter, as follows:
myController.after('myAction', function(next) {
  var response = this.res.responseJSON; // ACCESS RESPONSE JSON HERE
  console.log(response); // Now logs "{'hello':'world'}"
  next();
});
Like I said, not the most elegant, but gets the job done in a pinch. Any more elegant solutions welcome...
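For what it's worth, a less invasive variant of the same hack is to wrap res.json from a before filter instead of editing Express inside node_modules; a minimal sketch, assuming this.res in a Locomotive filter is the standard Express response object:
myController.before('myAction', function(next) {
  var res = this.res;
  var originalJson = res.json.bind(res);
  res.json = function(body) {
    res.responseJSON = body; // stash the payload for the after filter
    return originalJson(body);
  };
  next();
});
// the after filter above can then read this.res.responseJSON unchanged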