I am looking up how to export RDS to S3 using Lambda. So far I have found Python and Java APIs, but I can't find a way of programmatically running something like mysqldump without the actual executable.
Is there a way to do it?
I am thinking of using Node.js to call SHOW CREATE TABLE for each entry in SHOW TABLES, and then somehow building extended INSERT statements.
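A rough sketch of that idea with the mysql npm package (an illustration only, not a full dump tool; the connection details are placeholders):
var mysql = require('mysql'); // npm package: mysql
var connection = mysql.createConnection({ host: 'xx', user: 'xx', password: 'xx', database: 'xx' });

connection.query('SHOW TABLES', function (err, tables) {
  if (err) throw err;
  tables.forEach(function (row) {
    var table = row[Object.keys(row)[0]]; // column name is Tables_in_<db>
    connection.query('SHOW CREATE TABLE ??', [table], function (err, res) {
      if (err) throw err;
      console.log(res[0]['Create Table'] + ';');
      // the rows themselves would still need to be read and turned into extended INSERTs
    });
  });
});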
Yes, it is possible, but you need to include the mysqldump executable in your Lambda package.
Here is a sample Node.js script that creates the backup with mysqldump and uploads it to S3.
var S3 = require('./S3Uploader'); // custom S3 wrapper with stream upload functionality
var fs = require('fs');
var path = require('path');
var util = require('util');
const writeFile = util.promisify(fs.writeFile);
const execFile = util.promisify(require('child_process').execFile);
const exec = util.promisify(require('child_process').exec);
async function backupToS3() {
  var backupName = 'mysqlbackup-' + new Date().toISOString() + '.gz';
  var content = `cd /tmp
BACKUPNAME="[BACKUP_NAME]"
[EXE_PATH]/mysqldump --host [DB_ENDPOINT] --port [DB_PORT] -u [DB_USER] --password="[DB_PASS]" [DB_NAME] | gzip -c > $BACKUPNAME
`;
  content = content.replace('[BACKUP_NAME]', backupName);
  content = content.replace('[DB_ENDPOINT]', 'xx'); // get from Lambda environment variables
  content = content.replace('[DB_PORT]', 'xx'); // get from Lambda environment variables
  content = content.replace('[DB_USER]', 'xx'); // get from Lambda environment variables
  content = content.replace('[DB_PASS]', 'xx'); // get from Lambda environment variables
  content = content.replace('[DB_NAME]', 'xx'); // get from Lambda environment variables
  content = content.replace('[EXE_PATH]', __dirname + '/tools'); // path where the mysqldump executable is located within the Lambda package
  // generate the backup script
  await writeFile('/tmp/db_backup.sh', content);
  fs.chmodSync('/tmp/db_backup.sh', '755');
  // run the script
  var res1 = await execFile('/tmp/db_backup.sh');
  // stream upload to S3
  var res2 = await S3.uploadFile('/tmp/' + backupName, 'backups');
  // clean up the local backup (Lambda normally recycles /tmp on its own)
  var res3 = await exec('rm /tmp/' + backupName);
  return 'Backup complete';
}
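For completeness, the Lambda entry point could call it like this (a minimal sketch; the handler name is whatever you configure):
exports.handler = async function (event) {
  return backupToS3();
};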
A sample S3Uploader is posted here: Loading File directly from Node js req body to S3
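If you don't have a wrapper like that at hand, a minimal sketch with the aws-sdk could look like this (the BUCKET_NAME environment variable is an assumption, not part of the original):
// S3Uploader.js - minimal sketch using aws-sdk v2
var AWS = require('aws-sdk');
var fs = require('fs');
var path = require('path');
var s3 = new AWS.S3();

exports.uploadFile = function (localPath, keyPrefix) {
  return s3.upload({
    Bucket: process.env.BUCKET_NAME, // assumed environment variable
    Key: keyPrefix + '/' + path.basename(localPath),
    Body: fs.createReadStream(localPath) // streams the file instead of buffering it
  }).promise();
};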
In my Google Cloud Functions script, I want to delete a Google Cloud Storage file by using the following code:
const gcs = require('@google-cloud/storage')()
exports.deletePost = functions.https.onRequest((request, response) => {
  if (!context.auth) {
    throw new functions.https.HttpsError('failed-precondition', 'The function must be called while authenticated.');
  }
  const the_post = request.query.the_post;
  const filePath = context.auth.uid + '/posts/' + the_post;
  const bucket = gcs.bucket('android-com')
  const file = bucket.file(filePath)
  const pr = file.delete()
});
The problem is that I also need to delete a Google Firebase Firestore database entry after having deleted the Storage file. So I would like to know if I could do it within, for example, a promise returned by delete()?
PS: I can't find the documentation for this.
The call file.delete() is asynchronous and returns a Promise, as shown in the Google Cloud Storage: Deleting objects documentation.
To delete an object from one of your Cloud Storage buckets:
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');
// Creates a client
const storage = new Storage();
/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'File to delete, e.g. file.txt';
// Deletes the file from the bucket
await storage
  .bucket(bucketName)
  .file(filename)
  .delete();
console.log(`gs://${bucketName}/${filename} deleted.`);
It's not overly clear, but because the await syntax is used, the expression to its right must produce a Promise.
Note: Most of the useful Google Cloud Storage documentation can be found here.
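Back to the original question: since delete() returns a Promise, the Firestore delete can simply be chained after it. A sketch assuming the firebase-admin SDK and a hypothetical posts collection:
const admin = require('firebase-admin');
admin.initializeApp();

// Delete the Storage file, then the matching Firestore document.
// The collection name and document id are assumptions for illustration.
function deletePostData(bucketName, filePath, postId) {
  return admin.storage().bucket(bucketName).file(filePath).delete()
    .then(() => admin.firestore().collection('posts').doc(postId).delete());
}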
I am trying to deploy a Google Cloud Function. I started by just adding the initial requirements to my index.js file:
// Import the Google Cloud client libraries
const nl = require('@google-cloud/language')();
const speech = require('@google-cloud/speech')();
const storage = require('@google-cloud/storage')();
But I get the following message when deploying:
Detailed stack trace: TypeError: require(...) is not a function
This only happens with the @google-cloud/speech and @google-cloud/language modules; the @google-cloud/storage module loads fine as a function (I tested by commenting out the first two).
Any advice will be greatly appreciated.
Borrigan
With reference to this GitHub comment, there were some changes in the google-cloud v2 packages,
so you now import packages like this:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage({
  // config...
});
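The speech and language packages changed the same way in v2, so presumably they are set up like this:
const speech = require('@google-cloud/speech');
const language = require('@google-cloud/language');

const speechClient = new speech.SpeechClient();
const languageClient = new language.LanguageServiceClient();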
Google Cloud Functions are Node.js modules, so the syntax is the same as Node.js syntax.
Regarding your problem:
you have to write
const storage = require('@google-cloud/storage');
(without () at the end of each statement)
So the correct declaration will be:
// Import the Google Cloud client libraries
const nl = require('@google-cloud/language');
const speech = require('@google-cloud/speech');
const storage = require('@google-cloud/storage');
I hope this helps.
It tells you that whatever you required is not a function and therefore can't be invoked with ().
If you look here: https://www.npmjs.com/package/@google-cloud/language#using-the-client-library
you will see that a service object with multiple class-returning constructors is returned, so you should set it up like this:
const nl = require('@google-cloud/language');
const language = new nl.LanguageServiceClient();
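Continuing that snippet, a call would look roughly like this (the request shape follows the library's sentiment-analysis example):
// continuing from the snippet above
const document = { content: 'Hello, world!', type: 'PLAIN_TEXT' };
language.analyzeSentiment({ document: document })
  .then(results => console.log(results[0].documentSentiment))
  .catch(console.error);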
Follow the new syntax below:
const {Storage} = require('@google-cloud/storage');
const googleCloudStorage = new Storage({
  projectId: 'your-project-id',
  keyFilename: 'path-to-json-config-file'
});
I am starting with MongoDB. I want to insert two variables into Mongo. So, in the same file:
1.- I define the two variables.
2.- I create a function that returns a JSON-format object built from the two variables.
3.- I create an app in Express that fills in the values of the two variables and sends them out as well.
4.- I connect to Mongo and insert the JSON object, creating a collection and passing as the first argument a call to the function that returns the JSON object from the two variables.
RESULT CHECKING IN THE CONSOLE:
1.- The connection is correct.
2.- There is a JSON object inserted, but it is empty.
I think I have a scope problem. What would be the right sequence?
// Express files
var express = require('express');
var app = express();
// Mongo files
var mongodb = require("mongodb");
var MongoClient = mongodb.MongoClient;
var MONGODB_URI = "mongodb://user:psswd@00000.mlab.com:00000/";
// Variables
var one;
var two;
// JSON object to insert in mongo
var doc = function(one, two) {
  return {
    "one": one,
    "two": two
  };
};
// App in Express
app.get("/new/:which", function(req, res) {
  one = req.params.which;
  var randomNum = Math.round(Math.random() * 10000);
  two = req.headers["x-forwarded-host"] + "/" + randomNum.toString();
  res.end(JSON.stringify(doc(one, two)));
});
// Mongo connection and insertion of JSON object
MongoClient.connect(MONGODB_URI, function(err, db) {
  if (err) {
    console.log('Unable to connect to the mongoDB server. Error:', err);
  } else {
    console.log('Connection established to', MONGODB_URI);
  }
  var collection = db.collection("url");
  collection.insert(doc(one, two), function(err) {
    if (err) throw err;
    console.log(JSON.stringify(doc(one, two)));
    db.close();
  });
});
// Express files
var express = require('express');
var app = express();
// mongoose files
var mongoose = require('mongoose');
mongoose.Promise = require('bluebird');
mongoose.connect('mongodb://user:psswd@00000.mlab.com:00000/', {
  useMongoClient: true
});
// Define the document schema
var Schema = new mongoose.Schema({
  one: {
    type: String, // or maybe Number
    required: true
  },
  two: {
    type: String,
    required: true
  }
});
var Model = mongoose.model('model', Schema);
app.get("/new/:which", function(req, res) {
  var one = req.params.which;
  var randomNum = Math.round(Math.random() * 10000);
  var two = req.headers["x-forwarded-host"] + "/" + randomNum.toString();
  var new_doc = new Model({
    one: one,
    two: two
  });
  new_doc.save(err => {
    err ? res.send(err) : res.send('added!');
  });
});
I recommend using the mongoose npm package for working with Mongo.
You can split the code into modules for more convenience.
I found the answer to the question of the scopes in a free course for Node developers showing how to put it all together: Mongo University.
The sequence is the following:
//Dependencies
var express = require('express');
var app = express();
var mongodb = require("mongodb");
var MongoClient = mongodb.MongoClient;
var MONGODB_URI = "mongodb://<user>:<psswd>@000000.mlab.com:41358";
// Connection to mongo
MongoClient.connect(MONGODB_URI, function(err, db) {
  app.use("/new/:which", function(req, res) {
    var one = req.params.which;
    var randomNum = Math.round(Math.random() * 10000);
    var two = req.headers["x-forwarded-host"] + "/" + randomNum;
    // Insertion of documents
    db.collection("url").insertOne({"one": one, "two": two});
    res.send({"one": one, "two": two});
  });
  var listener = app.listen(8000);
});
So the app needs to be within the scope of the Mongo connection, and the collection methods within the scope of the app.
The port listener sits at the same level as the app.
The documents are inserted in the database and the result is sent out to the client as well.
I'm trying to automatically generate manifest files with gulp, but I can't find out how to get the filename, modify it, and send the result forward through the pipe.
var fs = require('fs');
var content = 'test';
gulp.src('./wwwroot/**/*.file')
.pipe(fs.writeFileSync(??? , content))
.pipe(gulp.dest('./wwwroot/')); // should be same as original file
Where the ??? is on the 4th line, I'd like to have filename.file.manifest.
The code above is more of an idea than working code, since gulp.dest and fs would both be writing files.
Not with pipes, but nonetheless a nice solution using node glob
var fs = require('fs');
var glob = require('glob'); // npm package: glob

var content = 'test';
glob("./wwwroot/**/*.file", null, function(er, files) {
  if (er)
    return;
  files.forEach(function(element) {
    var manifestFile = element + '.manifest';
    fs.writeFileSync(manifestFile, content);
    console.log("Generated: " + manifestFile);
  });
});
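If you'd rather stay inside the gulp pipeline, a transform stream can emit the manifest next to each file. A sketch using through2 and vinyl's clone() (not part of the original answer):
var gulp = require('gulp');
var through = require('through2'); // npm package: through2

gulp.task('manifests', function () {
  return gulp.src('./wwwroot/**/*.file')
    .pipe(through.obj(function (file, enc, cb) {
      var manifest = file.clone();
      manifest.path = file.path + '.manifest'; // filename.file.manifest
      manifest.contents = Buffer.from('test');
      this.push(file);     // pass the original through untouched
      this.push(manifest); // emit the manifest alongside it
      cb();
    }))
    .pipe(gulp.dest('./wwwroot/'));
});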
I have the following code, which accepts data from the URL and prints the JSON-formatted data. I want to publish the same data to MQTT using Node.js. Is there any sample code for this?
var request = require('request');
var JSONStream = require('JSONStream');
var es = require('event-stream');

request({url: 'http://isaacs.couchone.com/registry/_all_docs'})
  .pipe(JSONStream.parse('rows.*'))
  .pipe(es.mapSync(function (data) {
    console.log(data);
    console.error(data);
    return data;
  }))
You could use the MQTT node library MQTT.js.
Your current code then becomes something like this:
var request = require('request');
var JSONStream = require('JSONStream');
var es = require('event-stream');
var mqtt = require('mqtt');

request({url: 'http://isaacs.couchone.com/registry/_all_docs'})
  .pipe(JSONStream.parse('rows.*'))
  .pipe(es.mapSync(function (data) {
    console.log(data);
    console.error(data);
    // MQTT publish starts here
    var client = mqtt.createClient(1883, 'localhost');
    client.publish('demoTopic', JSON.stringify(data));
    client.end();
    return data;
  }))
The above code assumes the broker is running on the local machine on port 1883.
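Opening and closing a client per message works, but you could also share one connection and close it when the stream ends (a variant sketch, same assumed local broker):
var client = mqtt.createClient(1883, 'localhost'); // one shared connection

request({url: 'http://isaacs.couchone.com/registry/_all_docs'})
  .pipe(JSONStream.parse('rows.*'))
  .pipe(es.mapSync(function (data) {
    client.publish('demoTopic', JSON.stringify(data));
    return data;
  }))
  .on('end', function () {
    client.end(); // close once everything has been published
  });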
Just use a Node.js library for MQTT such as MQTT.js: https://github.com/adamvr/MQTT.js
Also, you can run your own multi-protocol broker in Node.js by installing mosca: https://github.com/mcollina/mosca
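A minimal mosca broker is only a few lines (a sketch with default settings):
var mosca = require('mosca'); // npm package: mosca

var server = new mosca.Server({ port: 1883 });
server.on('ready', function () {
  console.log('MQTT broker is up on port 1883');
});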