Get random wiki page from cloud functions - json

I tried to get a random Wikipedia page over their API via Google Cloud Functions. The Wikipedia API works fine. This is my request:
https://de.wikipedia.org/w/api.php?action=query&format=json&generator=random
For testing you can change the format to jsonfm and view the result in the browser.
But it seems that my function gets terminated even before the request has completed successfully. When I try to parse the data (or even just log it), I get a
SyntaxError: Unexpected end of JSON input
The log looks, for example, like this (no, I didn't truncate it myself):
DATA: ue||"},"query":{"pages":{"2855038":{"pageid":2855038,"ns":0,"title":"Thomas Fischer
Of course, that is not valid JSON and can't be parsed. Anyway, this is my function:
exports.randomWikiPage = function getRandomWikiPage(req, res) {
  const httpsOptions = {
    host: "de.wikipedia.org",
    path: "/w/api.php?action=query&format=json&generator=random"
  };
  const https = require('https');
  https.request(httpsOptions, function (httpsRes) {
    console.log('STATUS: ' + httpsRes.statusCode)
    console.log('HEADERS: ' + JSON.stringify(httpsRes.headers))
    httpsRes.setEncoding('utf8')
    httpsRes.on('data', function (data) {
      console.log("DATA: " + data)
      const wikiResponse = JSON.parse(data);
      const title = wikiResponse.query.title
      res.status(200).json({"title": title})
    });
  }).end();
};
I've already tried to return something there, like that video explained. But as far as I can see in the Node docs, https.request doesn't return a Promise, so returning it is wrong. I've also tried to extract the on('data', callback) into its own function so that I can return the callback, but I didn't have any success with that either.
What does my function have to look like so that it returns my expected:
{"title": "A random Wikipedia Page title"}
?

I believe your JSON comes through as a stream in chunks, and you're attempting to parse the first data chunk that comes back. Try something like:
https.request(httpsOptions, function (httpsRes) {
  console.log('STATUS: ' + httpsRes.statusCode)
  console.log('HEADERS: ' + JSON.stringify(httpsRes.headers))
  httpsRes.setEncoding('utf8')
  let wikiResponseData = '';
  httpsRes.on('data', function (data) {
    wikiResponseData += data;
  });
  httpsRes.on('end', function () {
    const wikiResponse = JSON.parse(wikiResponseData)
    // With generator=random, the title sits under query.pages, keyed by pageid
    const title = Object.values(wikiResponse.query.pages)[0].title
    res.status(200).json({"title": title})
  })
}).end();
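To see why parsing a single chunk fails, here's a minimal standalone sketch; the payload and the split point are made up for illustration:

```javascript
// A JSON body that arrives in two 'data' events, split mid-object.
const pieces = [
  '{"query":{"pages":{"1":{"pageid"',
  ':1,"title":"Example"}}}}'
];

// Parsing one chunk alone throws the "Unexpected end of JSON input" error.
let threw = false;
try { JSON.parse(pieces[0]); } catch (e) { threw = true; }
console.log(threw); // true

// Accumulating all chunks first, then parsing once, works.
const wikiResponse = JSON.parse(pieces.join(''));
console.log(wikiResponse.query.pages['1'].title); // Example
```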


aws s3 bucket image upload in angular6 having problem in return

I am using this approach to upload images to aws s3 bucket:
https://grokonez.com/aws/angular-4-amazon-s3-example-how-to-upload-file-to-s3-bucket
This works fine as an individual task, but I rely on the result, which comes a bit later, presumably due to async behavior. I would like the next task to be executed only after the confirmation.
upload() {
  let file: any;
  // let urltype = '';
  let filename = '';
  // let b: boolean;
  for (let i = 0; i < this.urls.length; i++) {
    file = this.selectedFiles[i];
    // urltype = this.urltype[i];
    filename = file.name;
    const k = uuid() + '.' + filename.substr((filename.lastIndexOf('.') + 1));
    this.uploadservice.uploadfile(file, k);
    console.log('done');
    // console.log('file: ' + file + ' : ' + filename);
    // let x = this.userservice.PostImage('test', file);
    // console.log('value of ' + x);
  }
  // return b;
}
fileupload service:
bucket.upload(params, function (err, data) {
  if (err) {
    console.log('There was an error uploading your file: ', err);
    return false;
  }
  console.log('Successfully uploaded file.', data);
  return true;
}).promise();
}
Here, 'done' is getting logged before the file upload is done.
I think you should check out a tutorial on asynchronous programming, play around with a couple of examples using simple timeouts to get the hang of it, and then proceed to more complex things like S3 and AWS.
Here is how I suggest you start your journey:
1) Learn the basic concepts of asynchronous programming using pure JS
https://eloquentjavascript.net/11_async.html
2) Play around with your own examples using callbacks and timeouts
3) Replace the callbacks with Promises
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Using_promises
4) Do it the "angular" way with rxjs Observables (similar to JS Observable)
http://reactivex.io/rxjs/class/es6/Observable.js~Observable.html
PS: To be more concrete:
Your code fails because the following line is executed asynchronously. The code calls your uploadfile function and immediately continues executing without waiting:
this.uploadservice.uploadfile(file, k);
Once you follow all the points I described above you will be able to do something like this (using a Promise):
this.uploadservice.uploadfile(file, k)
  .then(result => {
    console.log('Upload finished');
  })
  .catch(error => {
    console.log('Something went wrong');
  });
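The sequencing itself can be sketched in plain JavaScript with a stand-in for the upload call. The uploadfile function and file names below are made up; the fake upload just resolves on a later tick, the way bucket.upload(...).promise() would:

```javascript
// Stand-in for uploadservice.uploadfile: resolves asynchronously,
// like a real network upload would.
function uploadfile(file) {
  return new Promise(resolve => setImmediate(() => resolve('uploaded:' + file)));
}

// Await each upload before moving on, so 'done' really comes last.
async function uploadAll(files) {
  const results = [];
  for (const file of files) {
    results.push(await uploadfile(file));
  }
  results.push('done');
  return results;
}

uploadAll(['a.png', 'b.png']).then(r => console.log(r.join(', ')));
// → uploaded:a.png, uploaded:b.png, done
```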

CouchDb 2.1.1 Admin API Compaction PUT Request

I am working in NodeJS with CouchDB 2.1.1.
I'm using the http.request() method to set various config settings using the CouchDB API.
Here's their API reference (yes, I've read it):
Configuration API
Here's an example of a working request to set the logging level:
const http = require('http');
var configOptions = {
  host: 'localhost',
  path: '/_node/couchdb#localhost/_config/',
  port: 5984,
  header: {
    'Content-Type': 'application/json'
  }
};
function setLogLevel() {
  configOptions.path = configOptions.path + 'log/level';
  configOptions.method = 'PUT';
  var responseString = '';
  var req = http.request(configOptions, function (res) {
    res.on("data", function (data) {
      responseString += data;
    });
    res.on("end", function () {
      console.log("oldLogLevel: " + responseString);
    });
  });
  var data = '\"critical\"';
  req.write(data);
  req.end();
}
setLogLevel();
I had to escape all the quotes and such, which was expected.
Now I'm trying to get CouchDb to accept a setting for compaction.
The problem is that I'm attempting to replicate this same request for a different setting, but that setting doesn't have a simple structure, even though it appears to be "just a String" as well.
The CouchDB API is yelling at me about invalid JSON formats and I've tried a boatload of escape sequences and attempts to parse the JSON in various ways to get it to behave the way I think it should.
I can use Chrome's Advanced Rest Client to send this payload, and it is successful:
Request Method: PUT
Request URL: http://localhost:5984/_node/couchdb#localhost/_config/compactions/_default
Request Body: "[{db_fragmentation, \"70%\"}, {view_fragmentation, \"60%\"}, {from, \"23:00\"}, {to, \"04:00\"}]"
This returns a "200 OK"
When I execute the following function in my node app, I get a response of:
{"error":"bad_request","reason":"invalid UTF-8 JSON"}
function setCompaction() {
  configOptions.path = configOptions.path + 'compactions/_default';
  configOptions.method = 'PUT';
  var responseString = '';
  var req = http.request(configOptions, function (res) {
    res.on("data", function (data) {
      responseString += data;
    });
    res.on("end", function () {
      console.log("oldCompaction: " + responseString);
    });
  });
  var data = "\"[{db_fragmentation, \"70%\"}, {view_fragmentation, \"60%\"}, {from, \"23:00\"}, {to, \"04:00\"}]\"";
  req.write(data);
  req.end();
}
Can someone point at what I'm missing here?
Thanks in advance.
You need to use the built-in JSON.stringify to prepare the data for transport:
var data = '[{db_fragmentation, "70%"}, {view_fragmentation, "60%"}, {from, "23:00"}, {to, "04:00"}]';

// What JSON.stringify produces for the request's payload:
JSON.stringify(data);
> '"[{db_fragmentation, \\"70%\\"}, {view_fragmentation, \\"60%\\"}, {from, \\"23:00\\"}, {to, \\"04:00\\"}]"'

// Write the correctly escaped data as the payload.
req.write(JSON.stringify(data));
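A quick standalone way to convince yourself the escaping is right is to round-trip the body through JSON.parse. This sketch only checks the encoding; it doesn't talk to CouchDB:

```javascript
// The payload CouchDB expects is a JSON string whose content itself
// contains double quotes; JSON.stringify handles the escaping.
const setting = '[{db_fragmentation, "70%"}, {view_fragmentation, "60%"}, {from, "23:00"}, {to, "04:00"}]';
const body = JSON.stringify(setting);

// The body is wrapped in quotes and the inner quotes are escaped.
console.log(body.startsWith('"[{db_fragmentation, \\"70%\\"')); // true

// Round-trip: decoding the body yields the original setting string.
console.log(JSON.parse(body) === setting); // true
```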

NodeJS http.request end processing before data processes

Can someone explain why the http.request end handler is running before any data is actually retrieved? And how would I debug this any further? Should I be checking an HTTP status?
This is going to work with the Google Home app, but I took that code out and I'm getting the same error running locally. The http.request code is what a teacher provided in a class.
You can paste people/?search=Luke%20Skywalker
into http://swapi.co (SW = Star Wars API) to see the expected result.
'use strict';
/*eslint no-undef: "error"*/
/*eslint-env node*/
/*eslint-disable no-console */
let http = require('http');
let starWarsAPI = `www.swapi.co`;
//function to get details of the Star Wars Characters
//exports.handler = function(event, context, callback) {
//console.log("event=" + JSON.stringify(event));
//console.log("context=" + JSON.stringify(context));
//let characterName = event.result.parameters.StarWarsCharacter;
let characterName = "Luke Skywalker";
console.log("**** characterName=" + characterName);
let options = searchPeopleRequestOptions(characterName);
console.log("options=" + JSON.stringify(options));
makeRequest(options, function (data, error) {
  console.log(" Processing data.results");
  let person = data.results[0];
  if (person) {
    let height = person.height;
    let mass = person.mass;
    let response = person.name + " is " + height + " centimeters tall, weighs " + mass + " kilograms";
    console.log("**** response=" + response);
    //callback(null, {"speech": response});
  }
  else {
    console.log("No person found");
    //callback(null, {"speech": "I'm not sure that character exists!"});
  }
});
//};
console.log("The end");
//create a function to read first and last names from the API.
function searchPeopleRequestOptions(argCharacterName) {
  var pathValue = `/api/people/?search=` +
    encodeURIComponent(argCharacterName);
  return {
    host: starWarsAPI,
    path: pathValue
  };
}
function makeRequest(options, callback) {
  var responseString = "";
  var request = http.request(options,
    function (response) {
      response.on('data', function (data) {
        responseString += data;
        console.log("responseString=" + responseString);
      });
      response.on('end', function () {
        console.log("end: responseString=" + responseString);
        // dies on next line because responseString is empty
        var responseJSON = JSON.parse(responseString);
        callback(responseJSON, null);
      });
      response.on('error', function (error) {
        console.log('\n Error received: ' + error);
      });
    });
  request.end();
}
This is what I see when I run it:
E:\GitHub\NealWalters\GoogleHomeTest
λ node indexTest.js
**** characterName=Luke Skywalker
options={"host":"www.swapi.co","path":"/api/people/?search=Luke%20Skywalker"}
The end
end: responseString=
undefined:1
I'm not sure what's writing out the "undefined: 1" line.
If you look at the server's response status code, it will be 301: Moved Permanently.
And the value of the Location field of the response is:
https://swapi.co/api/people/?search=Luke%20Skywalker
instead of
http://swapi.co/api/people/?search=Luke%20Skywalker
As we can see, the protocol changed from http to https.
The problem is that the http client supplied with Node.js does not support redirection for permanently moved URLs.
So you can use the https module instead of http (just change it to require('https')).
Or use packages that support redirection, for example axios or request.
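To make the diagnosis concrete, here's a small standalone sketch. The redirectTarget helper and the fake response object are made up for illustration, mirroring what swapi.co returns for a plain http request:

```javascript
// Node's http client hands you the 301 as-is; following it is your job.
function redirectTarget(res) {
  if (res.statusCode >= 300 && res.statusCode < 400 && res.headers.location) {
    return res.headers.location; // the URL the request must be re-issued to
  }
  return null; // not a redirect
}

// Simulated response, mirroring swapi.co's answer to an http:// request.
const fakeRes = {
  statusCode: 301,
  headers: { location: 'https://swapi.co/api/people/?search=Luke%20Skywalker' }
};

console.log(redirectTarget(fakeRes)); // https://swapi.co/api/people/?search=Luke%20Skywalker
console.log(redirectTarget({ statusCode: 200, headers: {} })); // null
```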

Parsing nested JSON with NODE

Second question for the day :)
Still working on my first ever app, and I've hit a bit of a snag using an API that returns currency exchange values. I need to extract the current AUD value from this JSON :
{"base":"USD","date":"2016-05-30","rates":{"AUD":1.3919,"BGN":1.7558,"BRL":3.6043,"CAD":1.3039,"CHF":0.99273,"CNY":6.5817,"CZK":24.258,"DKK":6.6765,"GBP":0.68341,"HKD":7.7688,"HRK":6.7195,"HUF":281.72,"IDR":13645.0,"ILS":3.8466,"INR":67.139,"JPY":111.19,"KRW":1190.9,"MXN":18.473,"MYR":4.1175,"NOK":8.3513,"NZD":1.4924,"PHP":46.73,"PLN":3.9447,"RON":4.0428,"RUB":65.89,"SEK":8.3338,"SGD":1.3811,"THB":35.73,"TRY":2.9565,"ZAR":15.771,"EUR":0.89775}}
Here is the code I am using:
var http = require('http');

var options = {
  host: 'api.fixer.io',
  port: 80,
  path: '/latest?base=USD',
  method: 'GET'
};

http.request(options, function (res) {
  console.log('STATUS: ' + res.statusCode);
  res.setEncoding('utf8');
  res.on('data', function (chunk) {
    const json = JSON.parse(chunk);
    rate = json.AUD;
    console.log(rate);
  });
}).end();
Unfortunately this doesn't work, and I assume that is because the JSON is nested? How do I go about querying this nested structure correctly?
I also know I need to tighten up my handling of chunks, but it's baby steps for me right now :)
Thank you!
It's not json.AUD; it is
json.rates.AUD
You should wait for the whole data first, or use one of the streaming parsers instead (for example: https://github.com/dominictarr/JSONStream).
That is because "chunk" is not all the data at once; it may be just part of it, which means it's not valid JSON by itself.
http.request(options, function (res) {
  console.log('STATUS: ' + res.statusCode);
  var data = '';
  res.setEncoding('utf8');
  res.on('data', function (chunk) {
    data += chunk;
  });
  res.on('end', function () {
    const json = JSON.parse(data);
    // As #huaoguo mentioned, it should be `json.rates.AUD`, not `json.AUD`
    rate = json.rates.AUD;
    console.log(rate);
  });
}).end();
Also, as #huaoguo mentioned, it should be json.rates.AUD instead of json.AUD.
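To see why the path matters, here's the access pattern on a trimmed copy of the payload from the question (only two rates kept for brevity):

```javascript
const payload = '{"base":"USD","date":"2016-05-30","rates":{"AUD":1.3919,"EUR":0.89775}}';
const json = JSON.parse(payload);

// AUD is not a top-level key, so this is undefined.
console.log(json.AUD); // undefined

// The rates live one level down, under the "rates" object.
console.log(json.rates.AUD); // 1.3919
```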

nodejs - parsing chunked twitter json

The nodejs server 'gets' this JSON stream from Twitter and sends it to the client:
stream.twitter.com/1/statuses/filter.json?track=gadget
The data returned to the client is 'chunked' JSON, and both JSON.parse(chunk) and eval('(' + chunk + ')') on the client side result in parsing errors.
Concatenating the chunked pieces and waiting for the 'end' event isn't a solution either.
I noticed previous samples used something like this on the client side that apparently worked before:
socket.onmessage = function(chunk) {
  data = eval("(" + chunk.data + ")");
  alert(data.user.screen_name);
};
I'm using this on the client side and it results in a parsing error:
var socket = new io.Socket();
socket.on('message', function(chunk) {
  var data = eval('(' + chunk + ')'); // parsing error
  alert(data.screen_name);
});
I know that its successfully returning a JSON chunk with:
var socket = new io.Socket();
socket.on('message', function(chunk) {
  alert(chunk); // shows a JSON chunk
});
Server:
response.on('data', function (chunk) {
  client.each(function(e) {
    e.send(chunk);
  });
});
Did something change, or what else am I doing wrong?
UPDATE: The 'end' event does not fire because it's streaming?
http.get({
  headers: { 'content-type': 'application/json' },
  host: 'stream.twitter.com',
  path: '/1/statuses/filter.json?track...
}, function(res) {
  res.setEncoding('utf8');
  res.on('data', function (chunk) {
    client.each(function(e) {
      e.send(chunk);
    });
  });
  // does not fire
  res.on('end', function () {
  });
...
I'm looking into the difference between HTTP 1.0 and HTTP 1.1 as far as sending chunked data.
Look at the section titled Parsing Responses in Twitter's documentation.
Parsing JSON responses from the Streaming API is simple: every object is returned on its own line and ends with a carriage return. Newline characters (\n) may occur in object elements (the text element of a status object, for example), but carriage returns (\r) should not.
On the server side, keep accumulating chunks until you see the carriage return "\r". Once the carriage return is found, extract the string up to the carriage return, and that gives us one tweet.
var message = ""; // variable that collects chunks
var tweetSeparator = "\r";

res.on('data', function (chunk) {
  message += chunk;
  var tweetSeparatorIndex = message.indexOf(tweetSeparator);
  var didFindTweet = tweetSeparatorIndex != -1;
  if (didFindTweet) {
    var tweet = message.slice(0, tweetSeparatorIndex);
    clients.forEach(function (client) {
      client.send(tweet);
    });
    message = message.slice(tweetSeparatorIndex + 1);
  }
});
The client becomes simple. Simply parse the socket message as JSON in its entirety.
socket.on('message', function(data) {
  var tweet = JSON.parse(data);
});
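The accumulate-and-split logic can also be written as a pure function, which makes it easy to test without a live stream. The splitRecords helper and the sample chunks below are made up for illustration; unlike the single-if version, it also handles several separators arriving in one chunk:

```javascript
// Feed in the leftover buffer plus a new chunk; get back every complete
// \r-delimited record and the new leftover (a partial record, if any).
function splitRecords(buffer, chunk, sep = '\r') {
  const parts = (buffer + chunk).split(sep);
  return { records: parts.slice(0, -1), buffer: parts[parts.length - 1] };
}

let records = [];
let buffer = '';
for (const chunk of ['{"a":1}\r{"b"', ':2}\r{"c']) {
  const out = splitRecords(buffer, chunk);
  records = records.concat(out.records);
  buffer = out.buffer;
}

console.log(records); // [ '{"a":1}', '{"b":2}' ]
console.log(buffer);  // '{"c'
```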
@Anurag I can't add comments; however, instead of
if (chunk.substr("-1") == "\r")
it should be:
if (chunk.charCodeAt(chunk.length - 2) == 13)
The carriage return isn't the last character.
I would recommend piping the response into a JSON parser. You can use this: https://github.com/dominictarr/JSONStream