Upload image URL to SFTP server using Cloud Function - google-cloud-functions

I am working on a task that uploads an image to an SFTP server from a Firebase Function. The image source is not on my local computer but at an HTTP URL such as https://image.com/abc.jpg. I am using the ssh2-sftp-client npm package. Currently I am using my Mac as both client and server, and it works fine when I read a local file (/Users/shared/abc.jpeg) and upload it to the local server (/Users/shared/sftp-server/abc.jpeg). But when I tried to point it at https://image.com/abc.jpg and upload that to the local server, I got the error "ENOENT: no such file or directory ...". Below is my code:
const functions = require('firebase-functions');
const Client = require('ssh2-sftp-client');

exports.sftpTest = functions.https.onRequest((request, response) => {
  const sftp = new Client();
  const config = {
    host: '192.***.***.***',
    port: '22',
    username: '****',
    password: '****'
  };
  const localFile = 'https://images.unsplash.com/photo-1487260211189-670c54da558d?ixlib=rb-1.2.1&ixid=eyJhcHBfaWQiOjEyMDd9&auto=format&fit=crop&w=934&q=80';
  const remoteFile = '/Users/Shared/unsplash.JPG';
  sftp.connect(config)
    .then(() => sftp.fastPut(localFile, remoteFile))
    .catch(err => {
      console.error(err.message);
    });
});
This is my first time working with an SFTP server, and any advice will be much appreciated.

The method you are using does not support this usage. fastPut uploads a local file to a remote server, and fastGet downloads a file from a remote server; in both cases the source and destination are filesystem paths. Neither method accepts an HTTP URL as a source: the library treats the string as a local path, which is why you get the ENOENT error.

Related

The code below posts form data to a PHP endpoint, but it fails silently and no errors are displayed. I'd appreciate any help or suggestions.

Future<List> senddata() async {
  final response = await http.post(
    Uri.parse("http://localhost/app/insertdata.php"),
    body: {
      "name": user.text,
      "email": pass.text,
      "mobile": mobile.text,
    },
  );
  var datauser = json.decode(response.body);
  return datauser;
}
The file connects to the database, but nothing happens and no errors are displayed.
Make sure your device is connected to the internet.
You need to establish the connection to MySQL first before requesting anything from your localhost.
Also note that on a physical device or emulator, localhost refers to the device itself, not your development machine; use your machine's LAN IP address (or 10.0.2.2 on the Android emulator) so the app can reach the local server.

Is it possible to use Express.js to build a REST API in the frontend?

The plan is to build the web app with React and the backend with Express, specifically a REST API connecting to a MySQL database. The problem is authentication: my supervisor doesn't want me to store passwords anywhere. Instead, he suggested building authentication using the username and password that we use to connect to the MySQL database. For example, when I make a connection with mysql.createConnection, there's a section where I fill out the IP address, username, and password. The problem is, if I do this, then when the user logs out, the connection between the backend and the database will disconnect.
Is it possible to use mysql.createConnection in the frontend, so that whenever a user logs in it connects to the database directly from the frontend, and once the connection is created, the backend REST API takes over? If this works, I assume the REST API has to be hosted at the same URL as the frontend, since the MySQL connection is made in the frontend. But doesn't that defeat the purpose of a backend, meaning anyone could log in to the frontend and change whatever they want?
So the flow would be:
Within the frontend, the user logs in with their MySQL Workbench username and password; those credentials fill out the mysql.createConnection call (written in the frontend), which then tries to connect to the database.
The user logs in successfully.
The user fills out a form about a product and clicks submit; this data is sent to our REST API, and the MySQL database adds the data.
Here is an example of how you could structure the database connection and query helper in an Express.js app:
db.js
const mysql = require("mysql2")
const config = require("../../config/config.json").DB

module.exports = mysql.createPool({
  host: config.host,
  user: config.username,
  password: config.password,
  database: config.database,
  waitForConnections: true,
  connectionLimit: 100,
  queueLimit: 0,
  multipleStatements: true
})
db.fun.js
const db = require("./db").promise()

let query = async (sql, data) => {
  try {
    let d = await db.query(sql, data);
    return d[0];
  } catch (err) {
    console.log(`EDB: ./app/database/db.fun.js 8rows \n${err}`);
    return { err: 1, errdata: err };
  }
}

module.exports = {
  query: query
}
import query
(async function () {
  const { query } = require("../database/db.fun");
  let user = await query("SELECT * FROM users", []);
  console.log(user);
})();
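On the supervisor's idea of authenticating with MySQL credentials: the createConnection call can stay on the backend rather than moving to the frontend. The frontend only sends the username and password to a login route; the route attempts a connection with them to verify the login, then closes it immediately, so no password is ever stored. A hedged sketch (assumes mysql2 and express; the route path, host, and response shape are placeholders, and a real app would issue a session or token on success):

```javascript
const express = require('express');
const mysql = require('mysql2/promise');

const app = express();
app.use(express.json());

// The user's MySQL credentials are used only to *verify* the login;
// the connection is closed right away and nothing is persisted.
app.post('/login', async (req, res) => {
  const { username, password } = req.body;
  try {
    const conn = await mysql.createConnection({
      host: 'localhost', // placeholder: your database host
      user: username,
      password: password,
    });
    await conn.end();
    // Issue a session or token here instead of keeping the connection open.
    res.json({ ok: true });
  } catch (err) {
    res.status(401).json({ ok: false });
  }
});

app.listen(3000);
```

This keeps the database unreachable from the browser while still using MySQL itself as the source of truth for credentials; subsequent API requests run against the backend's own pool, not the user's connection.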

Connect to remote database when deployed on Vercel

In my NextJS Vercel app, I am unable to successfully connect to my remote MySQL database which is located on GoDaddy, after following Vercel's official tutorial.
I expect the API pages to return JSON data resulting from the database query. Instead I am getting an error.
I tried changing the username, but for some reason, the 4 environment variables that I have - MYSQL_USER, MYSQL_DATABASE, MYSQL_HOST, and MYSQL_PASSWORD - never update on the live site! I changed in Production, Preview, and even Development, and they stay the same in the above link’s object.
Everything works fine on my localhost because my home IP address is whitelisted in cPanel. But Vercel has dynamic IPs so I can't do this on the live site. It also works fine if I host on GoDaddy, but I need to host on Vercel.
Here’s my source code for the db.js file which connects to the database
lib/db.js
const mysql = require('serverless-mysql');
const db = mysql({
  config: {
    host: process.env.MYSQL_HOST,
    database: process.env.MYSQL_DATABASE,
    user: process.env.MYSQL_USER,
    password: process.env.MYSQL_PASSWORD,
  }
})

exports.query = async query => {
  try {
    const results = await db.query(query);
    await db.end();
    return results;
  } catch (error) {
    return {
      error
    };
  }
}
pages/api/columns/index.js
const db = require('../../../lib/db')
const escape = require('sql-template-strings')

/**
 * Queries the database to return the newspaper's columns
 * @param {IncomingMessage} _req The request object (unused)
 * @param {ServerResponse} res The response object
 */
module.exports = async (_req, res) => {
  const columns = await db.query(escape`SELECT * FROM columns ORDER BY id`);
  res.status(200).json({ columns });
}
Locally, the expected query result appears.
Connecting to a remote database behind an IP whitelist only works cleanly with cloud hosting (e.g. Microsoft Azure, AWS), where you control static egress IPs. With a shared hosting service this won't work.
In Remote MySQL (cPanel), whitelist %.%.%.%. Because Vercel's IPs are dynamic, this allows a consistent connection between Vercel and the database. It is also a security risk, since the database is then protected only by its password.

How can I fix this CORS with Socket.io?

I'm having trouble migrating my locally hosted chat application using Socket.io to my live cloud server. I am aware there are solutions out there, however, I cannot find anything that solves my problem.
I am receiving "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at.... (Reason: CORS request did not succeed)"
const io = require('socket.io')(3000)
const users = {}

io.on('connection', socket => {
  socket.on('new-user', name => {
    users[socket.id] = name
    socket.broadcast.emit('user-connected', name)
  })
  socket.on('send-chat-message', message => {
    socket.broadcast.emit('chat-message', { message: message, name: users[socket.id] })
  })
  socket.on('disconnect', () => {
    socket.broadcast.emit('user-disconnected', users[socket.id])
    delete users[socket.id]
  })
})
Above is the server.js file.
const socket = io('<url>:3000')
const messageForm = document.getElementById('send-container')
const messageInput = document.getElementById('message-input')
const messageContainer = document.getElementById('message-container')
This is the script my app is using; the URL is replaced with my real one and is just masked here.
Things I have tried:
Setting headers using "Header set Access-Control-Allow-Origin" in my Apache & Nginx config
Changing the url on the script
Changing the ports
So far I've had no luck. Please help!
Try allowing all origins. In Socket.io 2.x this is:
io.origins('*');
Note that io.origins() was removed in Socket.io 3.x; newer versions configure CORS through the cors option passed when creating the server.
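For Socket.io 3.x/4.x, where io.origins() no longer exists, the equivalent is the cors option on the server constructor. A sketch of that configuration; the origin value is a placeholder for your client's real URL:

```javascript
const { Server } = require('socket.io');

// Attach the server to port 3000 and allow cross-origin requests
// from the client's domain (use '*' only while debugging).
const io = new Server(3000, {
  cors: {
    origin: 'https://your-client-domain.example', // placeholder
    methods: ['GET', 'POST'],
  },
});
```

Also make sure the client connects to the same scheme (http vs https) as the server; a mixed-content block produces the same "CORS request did not succeed" message.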

How to serve index.html file with Apollo Server?

I have this code in index.js:
const PORT = process.env.PORT || 5000;
server.listen(PORT).then(({ url }) => {
  console.log(`Server running at url: ${url}`);
});
In local development, when I went to localhost:5000 on my browser, I could test with the GraphQL playground.
Now, I just finished deploying with Heroku. When I go to my URL, I see:
GET query missing. I assume this happens because apollo is trying to open the GraphQL playground, but it is blocked in production mode.
How can I tell apollo to serve client/index.html instead?
Note: I tried putting index.html in the root directory as well, but nothing changed.
I saw in a tutorial video that the answer to this question in express is:
app.use(express.static('client'));
app.get('*', (req, res) => {
  res.sendFile(path.resolve(__dirname, 'client', 'index.html'));
});
I don't know how to do this in Apollo.
The standalone Apollo Server cannot be used for serving static files or exposing other endpoints. If you need this functionality, you need to use an HTTP framework like Express, Hapi or Koa and then use the appropriate Apollo Server integration.
Example using Express:
const server = new ApolloServer({ ... });
const app = express();
server.applyMiddleware({ app });

app.listen({ port: 4000 }, () =>
  console.log(`🚀 Server ready at http://localhost:4000${server.graphqlPath}`)
);
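Building on that, serving client/index.html alongside the GraphQL endpoint could look like the sketch below. It assumes apollo-server-express 2.x (the applyMiddleware API shown above); the typeDefs and resolvers are placeholders, as is the client directory name:

```javascript
const path = require('path');
const express = require('express');
const { ApolloServer, gql } = require('apollo-server-express');

// Placeholder schema; replace with your real typeDefs/resolvers.
const server = new ApolloServer({
  typeDefs: gql`type Query { hello: String }`,
  resolvers: { Query: { hello: () => 'world' } },
});

const app = express();
server.applyMiddleware({ app }); // GraphQL is served at /graphql

// Static assets first, then fall back to index.html for any other
// route. Requests to /graphql never reach the catch-all because
// Apollo's middleware is mounted before it.
app.use(express.static('client'));
app.get('*', (req, res) => {
  res.sendFile(path.resolve(__dirname, 'client', 'index.html'));
});

app.listen({ port: process.env.PORT || 5000 });
```

On Heroku the client then loads from the root URL, while the playground (when enabled) remains available at /graphql.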