Using the AWS JavaScript SDK with NativeScript - aws-sdk

I'm planning to build an app using NativeScript. However, I want to integrate AWS services in the app architecture.
Can someone tell me whether I can use the AWS JS SDK with NativeScript or not?
If yes, how?
Thank you.

Yes, you can use the AWS SDK with NativeScript. I am using it to upload files to S3. You need to install it with:
npm i aws-sdk
File upload to AWS S3 example.
In your component file, import it
import * as AWS from 'aws-sdk';

const AWSService = AWS;
const region = 'Your_Region_name';
this.bucketName = 'bucketName';
const IdentityPoolId = 'IdentityPoolId';

// Configure the AWS service and initial authorization
AWSService.config.update({
  region: region,
  credentials: new AWSService.CognitoIdentityCredentials({
    IdentityPoolId: IdentityPoolId
  })
});

// Add the S3 service; make sure the API version and bucket are correct
this.s3 = new AWSService.S3({
  apiVersion: '2006-03-01',
  params: { Bucket: this.bucketName }
});

this.s3.upload({ Key: 'test/' + file.name, Bucket: this.bucketName, Body: file, ACL: 'public-read' }, function (err, data) {
  if (err) {
    console.log(err, 'there was an error uploading your file');
  } else {
    console.log('upload done');
  }
});
P.S. You need to create an Identity Pool in Cognito if you don't have one.

Related

What is the AWS SDK library for signin in Cognito (from backend without using Amplify)?

I have implemented user signup using @aws-sdk/client-cognito-identity-provider, but I am not able to find the module or API in the AWS SDK to implement sign-in to Cognito.
@aws-sdk/client-cognito-identity-provider is the right library for what you want to do.
import * as AWS from "@aws-sdk/client-cognito-identity-provider";

const client = new AWS.CognitoIdentityProvider({ region: "REGION" });
client.initiateAuth({
  ClientId: '...',
  AuthFlow: 'USER_PASSWORD_AUTH',
  AuthParameters: {
    USERNAME: 'my_test_user',
    PASSWORD: '...'
  }
}, function (err, data) {
  // data.AuthenticationResult contains the tokens on success
});

Parse JSON file uploaded in S3 using Lambda

I'm trying to parse a JSON file that gets uploaded to S3. I invoke the Lambda function using an S3 PUT/POST trigger.
I'm using the following code; however, I'm not able to parse the JSON file. Can someone please help me?
var aws = require('aws-sdk');
var s3 = new aws.S3();

exports.handler = async (event, context, callback) => {
  var srcBucket = event.Records[0].s3.bucket.name;
  var srcKey = event.Records[0].s3.object.key;
  console.log("Params: srcBucket: " + srcBucket + " srcKey: " + srcKey + "\n");

  var getParams = {
    Bucket: srcBucket,
    Key: srcKey,
  };

  s3.getObject(getParams, function (err, data) {
    if (err) console.log(err, err.stack);
    else {
      console.log(JSON.stringify(data.Body.toString()));
    }
  });
};
Your code looks correct. However, I'd suggest taking the example from the AWS docs as your starting point. Since your Lambda handler is an async handler, you have to await the promise returned by s3.getObject(); otherwise your function will complete before the callback executes (see the example code from the link).
Since you mention that your Lambda function cannot parse the file, I assume the function does get invoked by the S3 trigger (i.e. you can see the console.log('Params: ...') line in CloudWatch Logs). If that's not the case, first check that the S3 trigger is configured correctly and that S3 has permission to invoke the Lambda function. If you created the function via the AWS Console, this permission would have been set automatically.
The next step I'd suggest is to check the Lambda function's IAM role. Check whether the role has the s3:GetObject permission for your bucket and all objects under it, or for the specific prefix for which you have configured the S3 notification (e.g. <bucket>/* or <bucket>/prefix/*).
If the Lambda IAM permissions are correct, you'll have to check the S3 bucket policies. I suspect you haven't set up the bucket policies according to what you describe.

Upload image URL to SFTP server using Cloud Function

I am working on a task that uploads an image to an SFTP server from a Firebase Function. The image source is not my local computer but an HTTP URL such as https://image.com/abc.jpg. I am using the ssh2-sftp-client npm package. Currently I am using my Mac as both client and server, and it works fine when I access a local file (/Users/shared/abc.jpeg) and upload it to the local server (/Users/shared/sftp-server/abc.jpeg). But when I try to access https://image.com/abc.jpg and upload it to the local server, I get the error "ENOENT: no such file or directory/ ...". Below is my code:
const functions = require('firebase-functions');
let Client = require('ssh2-sftp-client');

exports.sftpTest = functions.https.onRequest((request, response) => {
  let sftp = new Client();
  const config = {
    host: '192.***.***.***',
    port: '22',
    username: '****',
    password: '****'
  };

  let localFile = 'https://images.unsplash.com/photo-1487260211189-670c54da558d?ixlib=rb-1.2.1&ixid=eyJhcHBfaWQiOjEyMDd9&auto=format&fit=crop&w=934&q=80';
  let remoteFile = '/Users/Shared/unsplash.JPG';

  sftp.connect(config)
    .then(() => {
      sftp.fastPut(localFile, remoteFile);
    })
    .catch(err => {
      console.error(err.message);
    });
});
This is my first time accessing an SFTP server, and any advice will be much appreciated.
The method you are using from this library does not support the way you are trying to use it. fastPut uploads a local file to a remote server, and fastGet downloads a file from a remote server; however, please note that nothing in the docs indicates that you can pass an HTTP URL to either method.

Trigger a cloud build pipeline using Cloud Function

I'm trying to create a cloud function listening to cloudbuilds topic and making an API call to trigger the build. I think I'm missing something in my index.js file (I'm new to Node.js). Can you provide a sample example of a Cloud Function making an API call to the Cloud Build API?
Here is my function:
const request = require('request');

const accessToken = '$(gcloud config config-helper --format="value(credential.access_token)")';

request({
  url: 'https://cloudbuild.googleapis.com/v1/projects/[PROJECT_ID]/builds',
  auth: {
    'bearer': accessToken
  },
  method: 'POST',
  json: { "steps": [{ "name": "gcr.io/cloud-builders/gsutil", "args": ['cp', 'gs://adolfo-test-cloudbuilds/cloudbuild.yaml', 'gs://adolfo-test_cloudbuild/cloudbuild.yaml'] }] }
}, module.exports.build = (err, res) => {
  console.log(res.body);
});
I was executing the command gcloud config config-helper --format='value(credential.access_token)', copying the token, and putting it in the accessToken variable, but this didn't work for me.
Here is the error: { error: { code: 403, message: 'The caller does not have permission', status: 'PERMISSION_DENIED' } }
I had the exact same problem, and I solved it by writing a small package; you can use it or read the source code.
https://github.com/MatteoGioioso/google-cloud-build-trigger
With this package you can run a pre-configured trigger from cloud build.
You can also extend to call other cloud build API endpoints.
As I understand it, the Cloud Build API requires either OAuth2 or a service account. Make sure you gave the right permissions to Cloud Build in the GCP console under IAM. After that you should be able to download the service-account.json file.

How to use aws athena using nodejs?

Athena is an analytics service for retrieving data from S3 using SQL queries.
I have queried data in S3 using the AWS console.
I need access to AWS Athena from Node.js code.
I am using Athena in my Node.js project in the following way:
1. Download the JDBC driver from AWS.
2. Install the jdbc package: npm install jdbc
3. Create a connector.js file and paste the following:
var JDBC = require('jdbc');
var jinst = require('jdbc/lib/jinst');

if (!jinst.isJvmCreated()) {
  jinst.addOption("-Xrs");
  jinst.setupClasspath(['./AthenaJDBC41-*.jar']);
}

var config = {
  // Required
  url: 'jdbc:awsathena://athena.*.amazonaws.com:443',
  // Optional
  drivername: 'com.amazonaws.athena.jdbc.AthenaDriver',
  minpoolsize: 10,
  maxpoolsize: 100,
  properties: {
    s3_staging_dir: 's3://aws-athena-query-results-*/',
    log_path: '/logs/athenajdbc.log',
    user: 'access_key',
    password: 'secret_key'
  }
};

var athena = new JDBC(config);
athena.initialize(function (err) {
  if (err) {
    console.log(err);
  }
});
Just use the Athena service in the JS SDK.
Athena JS Documentation
AWS JS SDK
You could use the athena-express module, which is documented by AWS.
You need the aws-sdk and athena-express dependencies.
There's a full working tutorial in this video I made:
https://www.youtube.com/watch?v=aBf5Qo9GZ1Yac