I was building an API, but I ran into an error.
The error is thrown at:
const err = new MongooseError(message);
full error:
MongooseError: Operation `users.insertOne()` buffering timed out after 10000ms
at Timeout.<anonymous> (C:\path%\path%\path%\path%\path\node_modules\mongoose\lib\drivers\node-mongodb-native\collection.js:158:23)
Please help me, I can't solve this error.
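The usual cause of this error is that a model operation runs before Mongoose has connected (or the connection never succeeds): by default Mongoose buffers operations and gives up after 10000 ms. A minimal sketch of the usual fix, awaiting the connection before any query — the URI, database name, and model here are assumptions, not from the question:

```javascript
const mongoose = require('mongoose'); // assumes mongoose is installed

async function main() {
  // Await the connection BEFORE any model operation; otherwise
  // inserts are buffered and time out after 10000 ms.
  await mongoose.connect('mongodb://127.0.0.1:27017/mydb'); // hypothetical URI

  const User = mongoose.model('User', new mongoose.Schema({ name: String }));
  await User.create({ name: 'test' }); // safe: the connection is established
}

main().catch(err => console.error(err));
```

If the timeout persists even with an awaited connect, check that the MongoDB server is actually reachable at the URI (wrong host, firewall, or an Atlas IP allowlist are common culprits).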
I have an array of work to do inside a Google Cloud Function. If I run the script locally it produces output to console but not in the cloud function.
// this is logged
console.error('An error');

for (var i = 0; i < chunks.length; i++) {
  const work = chunks[i].map(c => createOrUpdateTable(c, timestamp));
  await Promise.all(work)
    // this is not logged
    .catch(e => console.error(e.message));
}
I've tried putting the catch inside the function and a whole lot of other things, but the behaviour is the same. How can I get the error to appear in the log?
According to the official documentation, you can emit an error from a Cloud Function to
Stackdriver Error Reporting:
// These WILL be reported to Stackdriver Error Reporting
console.error(new Error('I failed you'));
console.error('I failed you', new Error('I failed you too'));
throw new Error('I failed you'); // Will cause a cold start if not caught
// These will NOT be reported to Stackdriver Error Reporting
console.info(new Error('I failed you')); // Logging an Error object at the info level
console.error('I failed you'); // Logging something other than an Error object
throw 1; // Throwing something other than an Error object
callback('I failed you');
res.status(500).send('I failed you');
Reporting Errors
You should use a try/catch block as follows:
for (var i = 0; i < chunks.length; i++) {
  const work = chunks[i].map(c => createOrUpdateTable(c, timestamp));
  try {
    await Promise.all(work);
  } catch (e) {
    console.error(e.message);
    return { error: e };
  }
}
I have reproduced the issue, and the error is logged, but differently in each case.
First function:
exports.helloError = (data, context, callback) => {
  // [START functions_helloworld_error]
  // These WILL be reported to Stackdriver Error Reporting
  console.error(new Error('I failed you'));
  console.error('I failed you', new Error('I failed you too'));
  throw new Error('I failed you'); // Will cause a cold start if not caught
  // [END functions_helloworld_error]
};
In Stackdriver Logging it appears as an ERROR:
severity: "ERROR"
textPayload: "Error: I failed you
at exports.helloError (/srv/index.js:4:17)
at /worker/worker.js:783:7
at /worker/worker.js:766:11
And with the second function:
exports.helloError = (data, context, callback) => {
  try {
    throw new Error('I failed you'); // Will cause a cold start if not caught
  } catch (e) {
    console.log(e.message);
  }
};
It is reported as INFO:
severity: "INFO"
textPayload: "I failed you"
I suspect that because the error is handled in the second one, the function is considered to be working as expected, so it is reported not as an error but as ordinary information.
When trying to log into textnow through an API using the correct username and password, the following error occurs:
UnhandledPromiseRejectionWarning: Error: 401 Unauthorized
at _response.transport.request.then (E:\nodejs\node_modules\snekfetch\src\index.js:193:21)
at process._tickCallback (internal/process/next_tick.js:68:7)
(node:19732) UnhandledPromiseRejectionWarning: Unhandled promise rejection.
This error originated either by throwing inside of an async function without
a catch block, or by rejecting a promise which was not handled with .catch().
(rejection id: 2)
Here's a look at the code from the API that I'm using:
module.exports.textnowLogin = (email, password) => {
  return new Promise((resolve, reject) => {
    let json = { "password": password, "username": email };
    let queryEndpoint = "sessions?client_type=TN_ANDROID";
    let signature = md5(`${tnSignatureKey}POST${queryEndpoint}${JSON.stringify(json)}`);
    snekfetch.post(`https://api.textnow.me/api2.0/${queryEndpoint}&signature=${signature}`)
      .set("Content-Type", "application/json")
      .send(json)
      .then((result) => {
        return resolve(result.body);
      }).catch(reject);
  });
};
Here's a look at how I use this method in my js file:
const textNow = require('textnow-api');
textNow.login(username, password).then(client => {
  console.log(`Logged in as ${client.username}`);
});
This definitely has to be a server side issue, no? Something must be going wrong on Textnow's end. What can I do to circumvent this?
EDIT:
const snekfetch = require("snekfetch"),
  md5 = require("md5"),
  tnSignatureKey = "f8ab2ceca9163724b6d126aea9620339";
Where did this key originate from? Perhaps if a new one was generated then the authorization error would be solved?
As a side note, another potential issue could be the client_type being set to ANDROID, and I am trying to use an iOS account to login. However, whenever I try using an Android account to log in instead, I get a 400 Bad Request, like Textnow does not recognize the account's credentials.
There may be other errors in the code, but the first mistake is the creation of an extra promise. Here's a version of the code that solves that and should either work or be simpler to debug...
module.exports.textnowLogin = (email, password) => {
  let json = { "password": password, "username": email };
  let queryEndpoint = "sessions?client_type=TN_ANDROID";
  let signature = md5(`${tnSignatureKey}POST${queryEndpoint}${JSON.stringify(json)}`);
  let url = `https://api.textnow.me/api2.0/${queryEndpoint}&signature=${signature}`;
  return snekfetch.post(url)
    .set("Content-Type", "application/json")
    .send(json)
    .then(result => result.body);
};
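A note on the UnhandledPromiseRejectionWarning from the question: whichever version you use, the caller must attach a .catch, otherwise a rejected 401 has nowhere to go. A minimal self-contained sketch of the pattern — the `login` function here is a hypothetical stand-in, not the real textnow-api:

```javascript
// Hypothetical stand-in for textNow.login, so the sketch runs anywhere:
// resolves on success, rejects with a 401-style Error on failure.
function login(ok) {
  return ok
    ? Promise.resolve({ username: 'demo' })
    : Promise.reject(new Error('401 Unauthorized'));
}

// Attaching .catch turns an unhandled rejection into a handled one.
login(false)
  .then(client => console.log(`Logged in as ${client.username}`))
  .catch(err => console.error(`Login failed: ${err.message}`));
```

With a .catch in place, a bad-credentials response is logged instead of crashing the process with an UnhandledPromiseRejectionWarning.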
I'm attempting to parse a fairly large JSON file (~500Mb) in NodeJS. My implementation is based on the Async approach given in this answer:
var fileStream = require('fs');
var jsonObj;
fileStream.readFile('./data/exporttest2.json', fileCallback);

function fileCallback(err, data) {
  return err ? (console.log(err), !1) : (jsonObj = JSON.parse(data));
  // Process JSON data here
}
That's all well and good, but I'm getting hit with the following error message:
buffer.js:495
throw new Error('"toString()" failed');
^
Error: "toString()" failed
at Buffer.toString (buffer.js:495:11)
at Object.parse (native)
at fileCallback (C:\Users\1700675\Research\Experiments\NodeJS\rf_EU.js:49:18)
at FSReqWrap.readFileAfterClose [as oncomplete] (fs.js:445:3)
I understand from this answer that this is caused by the maximum buffer length in the V8 engine, which is set at 256 MB.
My question then is this, is there a way I can asynchronously read my JSON file in chunks that do not exceed the buffer length of 256Mb, without manually disseminating my JSON data into several files?
is there a way I can asynchronously read my JSON file in chunks that do not exceed the buffer length of 256Mb, without manually disseminating my JSON data into several files?
This is a common problem, and there are several modules that can help you with it:
https://www.npmjs.com/package/JSONStream
https://www.npmjs.com/package/stream-json
https://www.npmjs.com/package/json-stream
https://www.npmjs.com/package/json-parse-stream
https://www.npmjs.com/package/json-streams
https://www.npmjs.com/package/jsonparse
Example with JSONStream:
const JSONStream = require('JSONStream');
const fs = require('fs');

fs.createReadStream('./data/exporttest2.json')
  .pipe(JSONStream.parse('...'))...
See the docs for details of all of the arguments.
Try using streams:
let fs = require("fs");
let s = fs.createReadStream('./a.json');
let data = [];
s.on('data', function (chunk) {
  data.push(chunk);
}).on('end', function () {
  let json = Buffer.concat(data).toString();
  console.log(JSON.parse(json));
});
I have the following code:
using (var client = new HttpClient(new NativeMessageHandler()))
{
    client.Timeout = TimeSpan.FromSeconds(30.0);
    var response = await client.PostAsync(requestString, content);
    json = await response.Content.ReadAsStringAsync();
    var tk = JObject.Parse(json);
    if ((string)tk.GetValue("Ident") == "Message" && (int)tk.GetValue("Status") == 401)
        throw new AuthenticationException();
    return tk;
}
This is the base for calling a webservice for the app I'm working on. The thing is: Every time I step over this one line:
var response = await client.PostAsync(requestString, content);
I get an error message from Visual Studio saying "The network connection to {IP:PORT} has been lost. Debugging will be aborted."
So I get disconnected (this happens with both the emulator and a WP device). This is really annoying, since I would like to debug the result, but there's no chance. As I have just noticed, the connection breaks even if I continue; once execution hits that line, the connection is lost.
This is a very weird behavior. Any clues why this happens or what could trigger connection loss?
FYI I'm using Newtonsoft.Json for that.
I'm trying to send an object of type UserEntry to the client.
the route I used was: http://localhost:3027/api/userapi/getinfo?username=myUsername
What is the cause of this error or what is wrong with my code?
[HttpGet, Route("api/userapi/getinfo")]
public async Task<string> getUserInfo([FromUri]string username)
{
    UserEntry u = await UserEntry.getUserInfo(username);
    return new JavaScriptSerializer().Serialize(u);
}
Here is what inner exception shows:
InnerException: {
Message: "An error has occurred.",
ExceptionMessage: "Invalid operation. The connection is closed.",
ExceptionType: "System.InvalidOperationException",
StackTrace: " at System.Data.SqlClient.SqlConnection.GetOpenConnection() at System.Data.SqlClient.SqlConnection.get_ServerVersion()"
}
I checked and made sure that there was no error in connecting to the database, but it still shows the error.
I temporarily solved it by making it synchronous:
[HttpGet, Route("api/userapi/getinfo")]
public SimpleUser getUserInfo([FromUri]string username)
{
    var ur = new UserRepository();
    return ur.getUser(username).First();
}

public IEnumerable<SimpleUser> getUser(string username)
{
    UserEntryDBContext context = new UserEntryDBContext();
    UserEntry u = context.Users.Where(x => x.username == username).FirstOrDefault();
    List<SimpleUser> s = new List<SimpleUser>();
    s.Add(new SimpleUser(u));
    return s;
}
but I still have no idea what causes the error, nor how to make it asynchronous.