I got working code on a local Node.js server, then I tried to run the same code as a Firebase Function and it failed.
Here is my code, split into 2 files:
// index.js
const functions = require("firebase-functions");
const polly = require("./polly");
exports.pollySynth = functions.https.onCall((data, context) => {
  return polly.speak();
});
// polly.js
const AWS = require("aws-sdk");
// Create a Polly client
const Polly = new AWS.Polly({
  signatureVersion: "v4",
  region: "my-region",
  accessKeyId: "keyId",
  secretAccessKey: "access key",
});
// Params
const params = {
  Text: "Hello world. This is the word of speech!",
  OutputFormat: "mp3", // "json" is only valid for speech marks; mp3 yields an audio Buffer
  VoiceId: "Joanna", // required by synthesizeSpeech
};
// prettier-ignore
const synthSpeech = function() {
  Polly.synthesizeSpeech(params, function(err, res) {
    if (err) {
      console.log("err", err);
      return err;
    } else if (res && res.AudioStream instanceof Buffer) {
      return res.AudioStream;
    }
  });
};
module.exports = {speak: synthSpeech};
When trying to deploy this function, I'm getting this error:
Functions deploy had errors with the following functions: pollySynth(us-central1)
I'm not a pro coder, so maybe it's just some dumb error in my code. Please help me :)
It took around half a day to solve this issue. The root of the problem was that, for some reason, Firebase didn't provide detailed error logs. It just said that there was some error in the code.
Only running this command in the command line helped me understand what was wrong:
firebase functions:log --only pollySynth
I found that command only on Stack Overflow; the official documentation didn't mention it.
The log looked like this:
Did you list all required modules in the package.json dependencies?
2021-08-20T12:04:51.823Z ? speechSynth:
Detailed stack trace: Error: Cannot find module 'aws-sdk'
Now I understood that the problem was that the npm module was not installed. It worked locally with the emulator, but the deploy was throwing an error 🤯
I tried to install the AWS SDK with commands like these again and again, and tried adding the module to package.json manually as well, but nothing worked...
npm install aws-sdk
npm install --save
Then I looked into the "node_modules" folder inside the "functions" folder and didn't find that package!
Then I looked at the folder structure created by default:
- root
  - functions
    - node_modules
  - public
It turned out the terminal was running npm install in the root folder, while Firebase expects dependencies in the "node_modules" folder inside the "functions" folder.
So I opened the terminal, moved into the "functions" folder, and installed the AWS SDK there:
cd functions
npm install aws-sdk
And successfully deployed the function!
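After installing inside functions/, the dependency should show up in functions/package.json — roughly like this (version numbers are illustrative):

```json
{
  "dependencies": {
    "firebase-functions": "^3.14.0",
    "aws-sdk": "^2.970.0"
  }
}
```

Firebase deploys whatever is listed in functions/package.json, so a module installed only at the project root never makes it into the deployed bundle.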
The next problem was that the function returned null. I needed to use a promise to wait for the AWS server's response.
This code worked for me:
const createSpeech = function() {
  return Polly.synthesizeSpeech(params)
    .promise()
    .then((audio) => {
      if (audio.AudioStream instanceof Buffer) {
        return audio.AudioStream;
      } else {
        throw new Error("AudioStream is not a Buffer.");
      }
    });
};
Now everything works correctly, and I can move to the next steps.
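One wrinkle when returning the audio from the callable function: onCall return values are serialized as JSON, so a raw Buffer can't be returned directly. A minimal, standalone sketch of the base64 round-trip (the fake bytes stand in for Polly's AudioStream):

```javascript
// Encode the Buffer to base64 on the server, decode it on the client.
const audioStream = Buffer.from("fake-mp3-bytes"); // stand-in for res.AudioStream
const payload = { audio: audioStream.toString("base64") }; // JSON-safe return value
const decoded = Buffer.from(payload.audio, "base64"); // client-side decode
console.log(decoded.equals(audioStream)); // → true
```

On the client, the decoded bytes can then be fed to whatever audio playback API you use.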
I increase the time the same way as described in the OpenZeppelin samples, in the test below:
it("should revert claim drawing with 'Android: bad state'", async () => {
  const [owner, signer1] = await ethers.getSigners();
  let duration = time.duration.seconds(3);
  await time.increase(duration);
  await truffleAssert.reverts(
    android.claimPainting(1),
    'Android: bad state'
  );
});
And it fails with Error: Invalid JSON RPC response: "". How can I fix it?
time is imported as const { time } = require("@openzeppelin/test-helpers"); with "@openzeppelin/test-helpers": "0.5.15" in package.json.
I also use ethers from Hardhat; I don't know if this could cause the problem.
You will need to run a local Hardhat node and try again:
npx hardhat node
Hope this works on your side.
Install web3 and the hardhat-web3 plugin:
npm install --save-dev @nomiclabs/hardhat-web3 web3
Then add the following line to your Hardhat config:
import "@nomiclabs/hardhat-web3";
In my code I am reading a JSON file from my Lambda function:
let featured_json_data = JSON.parse(fs.readFileSync('data/jsons/featured.json'))
This works locally because featured.json is in the directory I am reading from. However, when I deploy with Serverless, the zip it generates doesn't include those files, and I get:
ENOENT: no such file or directory, open...
I tried packaging by adding
package:
  include:
    - data/jsons/featured.json
but it just doesn't work. The only way I got this to work was by manually adding the JSON file and then changing my compiled handler.js code to read the JSON file from the root directory.
In this screenshot I have to add the JSONs, then manually upload the package again, and in the compiled handler.js code change the directory to exclude data/jsons.
I want to actually handle this in my serverless.yml.
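For what it's worth, newer versions of the Serverless Framework replaced package.include/exclude with package.patterns — a sketch, assuming the file lives at data/jsons/ relative to serverless.yml:

```yaml
package:
  patterns:
    - 'data/jsons/featured.json'
```

Note that bundlers like serverless-webpack apply their own packaging step, so with those plugins the file has to be copied by the bundler instead.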
You can load JSON files using require().
const featured_json_data = require('./featured.json')
Or better yet, convert your JSON into JS!
For working with non-JSON files, I found that process.cwd() works for me in most cases. For example:
const fs = require('fs');
const path = require('path');

export default async (event, context, callback) => {
  try {
    console.log('cwd path', process.cwd());
    const html = fs.readFileSync(
      path.resolve(process.cwd(), './html/index.html'),
      'utf-8'
    );
    const response = {
      statusCode: 200,
      headers: {
        'Content-Type': 'text/html'
      },
      body: html
    };
    callback(null, response);
  } catch (err) {
    console.log(err);
  }
};
I recommend looking at copy-webpack-plugin: https://github.com/webpack-contrib/copy-webpack-plugin
You can use it to package other files to include with your Lambda deployment.
In my project, I had a bunch of files in a /templates directory. To package up these templates, my webpack.config.js looks like:
const CopyWebpackPlugin = require('copy-webpack-plugin');
module.exports = {
  plugins: [
    new CopyWebpackPlugin([
      './templates/*'
    ])
  ]
};
fs.readFileSync cannot find file when deploying with Lambda
Check the current directory, and check the target directory's contents, in the deploy environment. Add code to your program/script to log both.
I'm writing a method that uses async/await and promises to write some JSON to a file and then render a Pug template. But for some reason the code that writes the JSON conflicts with the res.render() method, resulting in the browser not being able to connect to the server.
The weird thing is that I don't get any errors in the console, and the JSON file is generated as expected — the page just won't render.
I'm using the fs-extra module to write to disk.
const fse = require('fs-extra');

exports.testJSON = async (req, res) => {
  await fse.writeJson('./data/foo.json', {Key: '123'})
    .then(function(){
      console.log('JSON updated.')
    })
    .catch(function(err){
      console.error(err);
    });
  res.render('frontpage', {
    title: 'JSON Updated...',
  });
}
I'm starting to think that there is something fundamental I'm not getting that conflicts with promises, writing to disk, and/or Express's res.render() method. It's worth noting that res.send() works fine.
I've also tried a different NPM module to write the file (write-json-file). It gave me the exact same issue.
UPDATE:
So I'm an idiot. The problem has nothing to do with Express or the JSON file. It has to do with the fact that I'm running nodemon to automatically restart the server when files change. So as soon as the JSON file was saved, the server would restart, stopping the process of rendering the page. Apologies to the awesome people trying to help me anyway. You still helped me get to the problem, so I really appreciate it!
Here's the actual problem:
The OP is running nodemon to restart the server whenever it sees file changes, and this is what stops the code from running: as soon as the JSON file is generated, the server restarts.
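One way out is to tell nodemon to ignore the generated file, so saving it doesn't trigger a restart — a sketch of a nodemon.json, with the glob matching the ./data/foo.json path from the question:

```json
{
  "ignore": ["data/*.json"]
}
```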
Efforts to troubleshoot:
It's going to take some troubleshooting to figure this out, and since I need to show you code, I will put it in an answer even though I don't yet know what is causing the problem. I'd suggest you fully instrument things with this code:
const fse = require('fs-extra');

exports.testJSON = async (req, res) => {
  try {
    console.log(`1:cwd - ${process.cwd()}`);
    await fse.writeJson('./data/foo.json', {Key: '123'})
      .then(function(){
        console.log('JSON updated.')
      }).catch(function(err){
        console.error(err);
      });
    console.log(`2:cwd - ${process.cwd()}`);
    console.log("about to call res.render()");
    res.render('frontpage', {title: 'JSON Updated...'}, (err, html) => {
      if (err) {
        console.log(`res.render() error: ${err}`);
        res.status(500).send("render error");
      } else {
        console.log("res.render() success 1");
        console.log(`render length: ${html.length}`);
        console.log(`render string (first part): ${html.slice(0, 20)}`);
        res.send(html);
        console.log("res.render() success 2");
      }
    });
    console.log("after calling res.render()");
  } catch(e) {
    console.log(`exception caught: ${e}`);
    res.status(500).send("unknown exception");
  }
}
I have a karma.conf.js file that exports a function that takes a config object and applies a bunch of configurations to that object.
module.exports = function(config) {
  config.set({
    basePath: '',
    frameworks: ['jasmine'],
    ...
If I start karma from the command line like this: karma start, it runs correctly. Clearly the karma start function is inserting the required config object when it calls the function exported by karma.conf.js.
I am trying to start it with a gulp task that looks like this:
gulp.task('test', function (done) {
  var karma = require('karma').server;
  var karmaConf = require('./karma.conf.js')();
  karma.start(karmaConf, done);
});
This gives me an error because the config parameter is missing.
Two questions:
How can I get the karma config object to include as a parameter, and
Is there a better way to do this?
Try this:
gulp.task('test', function(done) {
  var Server = require('karma').Server;
  new Server({
    configFile: __dirname + '/karma.conf.js',
    singleRun: true
  }, done).start();
});
I am aware this question is a little old, but I found it when I was about to post my own q/a to something similar. The main difference being that I'm not working with gulp, but am just using Karma's public API directly. I'm still stuck using Karma v0.12, but it doesn't look like the spec has changed in this regard. It still requires an ordinary object, and my config file exports a function, just like in the OP's situation.
The main problem with the sample in the question is it tries calling the config function without providing any arguments. That is probably what is throwing the error. In particular, the config function expects a single input config, and calls config.set(actualConfigObject). What I did was write a function of my own that provides a minimally suitable object.
All that is needed is to ensure that what is passed in to the config function has a set function that in some way captures its first argument for later use. I actually ended up wrapping all that in a function that returns the argument for convenience:
function extractConfig(configFunction) {
  var last;
  var shell = {
    set: function (input) { last = input; }
  };
  configFunction(shell);
  return last;
}
Then I can call this with my required config file:
var config = extractConfig(require('./local-karma.conf.js'));
That got my tests running. I have noticed that something is slightly off: I override the logging level in my config, but the API seems to use config.LOG_DEBUG regardless. That's the only problem I've had so far, though this unanswered question seems to be doing something similar with less successful results.
I'm trying to use the json-2-csv package from npm in order to convert a JSON file I have saved to my desktop.
However, I can't understand how to do it because I am not originally a programmer.
I have installed the package with npm:
npm install json-2-csv
After this step I can't understand how to import my JSON file.
If there is anyone who could help me with the following steps or screenshots it could be very very useful for me.
Thank you in advance.
Put your JSON file in the root directory of your Node.js application and rename it to data.json.
Then create an app.js file in the same directory with this code:
var converter = require('json-2-csv');
var fs = require('fs');
var jsonData = require('./data.json');

var json2csvCallback = function (err, csv) {
  if (err) throw err;
  fs.writeFile("./data.csv", csv, function(err) {
    if (err) throw err;
    console.log("data.csv file has been saved.");
  });
};

converter.json2csv(jsonData, json2csvCallback);
It requires the data.json file from the root app directory (it will work only with valid JSON; otherwise you will find an error description in the console). If all goes well, it will convert the JSON to CSV using the json-2-csv module and save a data.csv file with the result, again in the root app directory.
Just run it with: node app.js