S3 error undefined (reading 'byteLength') for AWS Amplify - json

I keep getting a
TypeError: Cannot read properties of undefined (reading 'byteLength')
error when using AWS with Amplify.
It looks like the S3 bucket permissions were not created properly when I added the S3 storage through the Amplify CLI. When I add the sample S3 bucket policy from https://docs.amplify.aws/lib/storage/getting-started/q/platform/js/ it still does not work.
The full error is:
AWSS3Provider - get signed url error TypeError: Cannot read properties of undefined (reading 'byteLength')
My code to call it:
const res = await Storage.get("test.json");
The same error occurs with any Storage.list or Storage.put call.
This leads me to believe it has trouble getting the key from storage, which is puzzling since I set everything up through the CLI.
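For reference, Amplify is configured before the call roughly like this (a simplified sketch of my setup; aws-exports is the file generated by the Amplify CLI):

import { Amplify, Storage } from "aws-amplify";
import awsconfig from "./aws-exports"; // generated by `amplify push`

// Configure Auth + Storage before any Storage call. If the Auth section (and
// therefore credentials) is missing from this config, the request signer
// reportedly ends up hashing an undefined secret key, which produces the
// byteLength TypeError shown above.
Amplify.configure(awsconfig);

const res = await Storage.get("test.json");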

When I add the sample S3 bucket policy, the question is: what do I put as the principal for an authenticated user? Does anyone have a sample?
The policies mentioned are meant to be attached to the IAM Cognito roles Auth_Role and Unauth_Role (or however the roles for the Cognito users are named), not to the S3 bucket. The principal is then the IAM role itself.
In theory you could also attach the defined policies to the S3 bucket and set the principals to the Cognito roles for the authenticated and unauthenticated users.
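As a concrete sample (a sketch only; the bucket name and prefix are placeholders, not taken from the question), the statement attached to the authenticated role would look something like the JSON-compatible object below. Because it is an identity policy on the role, no Principal element is needed:

// Sketch of an inline policy for the Cognito authRole (not a bucket policy).
// Amplify's Storage category normally scopes object access to the public/,
// protected/ and private/ prefixes; adjust the Resource ARNs to match.
const authRoleS3Policy = {
  Version: "2012-10-17",
  Statement: [
    {
      Effect: "Allow",
      Action: ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      Resource: ["arn:aws:s3:::my-amplify-bucket/public/*"],
    },
    {
      Effect: "Allow",
      Action: ["s3:ListBucket"],
      Resource: ["arn:aws:s3:::my-amplify-bucket"],
    },
  ],
};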

Related

ChainableTemporaryCredentials getPromise and Missing credentials in config, if using AWS_CONFIG_FILE

I have a Node application deployed in GCP. The application includes code to access resources in the AWS cloud. For this purpose it uses the AWS SDK with ChainableTemporaryCredentials. The relevant code lines are:
const credentials = new ChainableTemporaryCredentials({
  params: {
    RoleArn: `arn:aws:iam::${this.accountId}:role/${this.targetRoleName}`,
    RoleSessionName: this.targetRoleName,
  },
  masterCredentials: new WebIdentityCredentials({
    RoleArn: `arn:aws:iam::${this.proxyAccountId}:role/${this.proxyRoleName}`,
    RoleSessionName: this.proxyRoleName,
    WebIdentityToken: token,
  }),
})
await credentials.getPromise()
The WebIdentityToken was received from Google and looks good. On the AWS side I created a proxy role (the RoleArn used in masterCredentials above).
However, at runtime I get the error:
Missing credentials in config, if using AWS_CONFIG_FILE, set AWS_SDK_LOAD_CONFIG=1
I do not understand this error. Because my application runs in GCP and I use temporary credentials, I do not see why I should need AWS credentials in the form of a credentials file or environment variables like AWS_ACCESS_KEY_ID or AWS_SECRET_ACCESS_KEY. I thought the whole idea of using ChainableTemporaryCredentials was NOT to have direct AWS credentials. Right?
You can see the public code at:
https://github.com/cloud-carbon-footprint/cloud-carbon-footprint/blob/trunk/packages/aws/src/application/GCPCredentials.ts
and documentation regarding env-variables at:
https://www.cloudcarbonfootprint.org/docs/configurations-glossary/
Any help that leads to an understanding of this error message is welcome.
Thomas
Solved it. The message "Missing credentials in config, if using AWS_CONFIG_FILE, set AWS_SDK_LOAD_CONFIG=1" was totally misleading. In reality it was a problem with the field names in the GCP JWT token and the policy in AWS. See https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_iam-condition-keys.html#ck_aud
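To make that a bit more concrete: the role being assumed via AssumeRoleWithWebIdentity needs a trust policy whose condition matches the aud claim inside the Google-issued token. A rough sketch of what that condition looks like, written as a JSON-compatible object (the audience value is a placeholder, not the real one):

// Hypothetical trust policy for the proxy role. The value compared against
// accounts.google.com:aud must match the "aud" field inside the GCP JWT;
// a mismatch surfaces only as the misleading "Missing credentials in config"
// error from the SDK.
const proxyRoleTrustPolicy = {
  Version: "2012-10-17",
  Statement: [
    {
      Effect: "Allow",
      Principal: { Federated: "accounts.google.com" },
      Action: "sts:AssumeRoleWithWebIdentity",
      Condition: {
        StringEquals: {
          "accounts.google.com:aud": "my-client-id.apps.googleusercontent.com",
        },
      },
    },
  ],
};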

Consul KV Store returns 403 on the parent folder of my key

I have a key in my KV store, let's say /global/test/my-key, and I use a token that has the following policy:
key "/global/test/my-key" {
policy = "read"
}
Why is it that, using the UI, I can access the URL http://localhost:8500/v1/kv/global/test/my-key/edit but get a 403 on the following URLs: http://localhost:8500/v1/kv/global/test and http://localhost:8500/v1/kv/global?
Is there a way for me to access my key from the UI starting at the URL http://localhost:8500/v1/kv ?
NOTE: I have tried the "list" policy, but it gives read access to the other keys, which is not what I want.
EDIT: I just realized I had forgotten to mention another condition that I am trying to meet. I have another key called, for instance, /global/secret/my-other-key, and I don't want that key to be viewable from the UI, nor the folder /global/secret/.
If you wish to have access to all of the mentioned paths, you should use this policy instead:
key_prefix "global" {
policy = "read"
}
This policy will give you access to global and any "sub-paths" of it.
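Regarding the condition added in the edit (keeping /global/secret and its keys out of view), one option worth trying is to combine a read rule with a more specific deny rule, since in Consul ACLs the longest matching prefix should take precedence. An untested sketch:

key_prefix "global/" {
  policy = "read"
}
key_prefix "global/secret/" {
  policy = "deny"
}

Note that the caveat below about recursive reads still applies when browsing from a parent path the token cannot fully read.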
Consul does not currently support performing recursive reads on paths where your token only has access to a subset of the keys under that parent path.
There's an open GitHub issue requesting that this functionality be added: https://github.com/hashicorp/consul/issues/4513. I recommend upvoting that issue to indicate your interest, and subscribing to it for updates so that you can track its progress.
If your particular use case is not accurately reflected in the initial description, feel free to leave a comment with additional information.

Using Google Cloud Dataflow how do I run with proper credentials on a GCE Compute instance?

I'm new to Google Cloud Dataflow, as is probably obvious from my questions below.
I've got a Dataflow application written and can get it to run without issue using my personal credentials, both locally and on a GCE instance. However, I can't seem to work out the proper steps to get it to run using the Compute Engine instance's service credentials, or service credentials I've created using the API & Auth section of the console. I always get a 401 not authorized error when I run it.
Here's what I've tried...
1) Created a virtual machine granting access rights to storage, datastore, SQL and compute engine. My understanding is that this supposedly created an instance-specific service account that serves as the server's default credentials. These should be used the same way a user's authentication is used when an app is run on this instance. Here's where I get a 401. My question is: where can I see this service account that was supposedly created? Or do I just trust that it exists somewhere?
2) Created service credentials in the API & Auth portion of the developer's console, then used gcloud auth activate-service-account and activated that account by pointing the command at the credentials JSON file I downloaded. Kind of like the OAuth round trip when you use gcloud auth login. Here I also get the 401.
3) The last thing was to use the service credentials from step 2, separate from the GCE instance, and create an object that implements the CredentialFactory interface and pass it to the PipelineOptions. However, the app now crashes with an error saying that it is looking for a method, fromOptions, that isn't in the CredentialFactory interface. How the options were configured, what the credential factory looks like, and the resulting stack trace follow.
I would be happy to use any of the above three methods to make use of service credentials, if I could get any of them to work. Any insight you can provide on what I'm doing wrong, steps I'm leaving out, or other unexplored options would be greatly appreciated. The documentation is a little disjointed. If there is a clear step-by-step guide, a link to it would be sufficient. What I've found so far on my own has been of little assistance.
If I can provide any additional information please let me know.
Here's some code that may be helpful and the stack trace I get when the code runs using the credential factory.
Options setup code looks like this:
GcrDataflowPipelineOptions options = PipelineOptionsFactory.fromArgs(args)
        .withValidation()
        .as(GcrDataflowPipelineOptions.class);
options.setKind("Counties");
options.setCredentialFactoryClass(GoogleCredentialProvider.class);
GoogleCredentialProvider.java
Notice that the JSON file I downloaded as part of creating the service account (renamed) is what's loaded as a resource from my app's classpath.
public class GoogleCredentialProvider implements CredentialFactory {
    @Override
    public Credential getCredential() throws IOException, GeneralSecurityException {
        final String env = System.getProperty("gcr_dataflow_env", "local");
        Properties props = new Properties();
        ClassLoader loader = this.getClass().getClassLoader();
        props.load(loader.getResourceAsStream(env + "-gcr-dataflow.properties"));
        final String credFileName = props.getProperty("gcloud.dataflow.service.account.file");
        InputStream credStream = loader.getResourceAsStream(credFileName);
        GoogleCredential credential = GoogleCredential.fromStream(credStream);
        return credential;
    }
}
Stacktrace:
java.lang.RuntimeException: java.lang.RuntimeException: Unable to find factory method com.scotcro.gcr.dataflow.components.pipelines.GoogleCredentialProvider#fromOptions
at com.google.cloud.dataflow.sdk.runners.dataflow.BasicSerializableSourceFormat.evaluateReadHelper(BasicSerializableSourceFormat.java:268)
at com.google.cloud.dataflow.sdk.io.Read$Bound$1.evaluate(Read.java:123)
at com.google.cloud.dataflow.sdk.io.Read$Bound$1.evaluate(Read.java:120)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.visitTransform(DirectPipelineRunner.java:684)
at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:200)
at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:196)
at com.google.cloud.dataflow.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:99)
at com.google.cloud.dataflow.sdk.Pipeline.traverseTopologically(Pipeline.java:208)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.run(DirectPipelineRunner.java:640)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.run(DirectPipelineRunner.java:354)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.run(DirectPipelineRunner.java:76)
at com.google.cloud.dataflow.sdk.Pipeline.run(Pipeline.java:149)
at com.scotcro.gcr.dataflow.app.GcrDataflowApp.run(GcrDataflowApp.java:65)
at com.scotcro.gcr.dataflow.app.GcrDataflowApp.main(GcrDataflowApp.java:49)
Caused by: java.lang.RuntimeException: Unable to find factory method com.scotcro.gcr.dataflow.components.pipelines.GoogleCredentialProvider#fromOptions
at com.google.cloud.dataflow.sdk.util.InstanceBuilder.buildFromMethod(InstanceBuilder.java:224)
at com.google.cloud.dataflow.sdk.util.InstanceBuilder.build(InstanceBuilder.java:161)
at com.google.cloud.dataflow.sdk.options.GcpOptions$GcpUserCredentialsFactory.create(GcpOptions.java:180)
at com.google.cloud.dataflow.sdk.options.GcpOptions$GcpUserCredentialsFactory.create(GcpOptions.java:175)
at com.google.cloud.dataflow.sdk.options.ProxyInvocationHandler.getDefault(ProxyInvocationHandler.java:288)
at com.google.cloud.dataflow.sdk.options.ProxyInvocationHandler.invoke(ProxyInvocationHandler.java:127)
at com.sun.proxy.$Proxy42.getGcpCredential(Unknown Source)
at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.getDatastore(DatastoreIO.java:335)
at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.createReader(DatastoreIO.java:320)
at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.createReader(DatastoreIO.java:186)
at com.google.cloud.dataflow.sdk.runners.dataflow.BasicSerializableSourceFormat.evaluateReadHelper(BasicSerializableSourceFormat.java:259)
... 13 more
You likely do not have the proper credentials. When you execute a Dataflow job from GCE, the service account attached to the instance will be used for validation by Dataflow.
Did you do the following when creating your machines?
1) Create a service account for the instance on GCE? https://cloud.google.com/compute/docs/authentication#using
2) Set the required scopes for using Dataflow, such as storage, compute, and BigQuery? https://www.googleapis.com/auth/cloud-platform
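Separately, regarding the fromOptions error from approach 3: judging from the stack trace, the SDK instantiates the credential factory through a static fromOptions(PipelineOptions) method looked up by reflection rather than through a constructor, so the class presumably needs something along these lines (an untested sketch; the getCredential body is the same as in the question):

// Same imports as the question's class, plus
// com.google.cloud.dataflow.sdk.options.PipelineOptions.
public class GoogleCredentialProvider implements CredentialFactory {

    // Static factory method the SDK's InstanceBuilder appears to look up by
    // reflection; without it the "Unable to find factory method ...
    // GoogleCredentialProvider#fromOptions" error shown above is thrown.
    public static GoogleCredentialProvider fromOptions(PipelineOptions options) {
        return new GoogleCredentialProvider();
    }

    @Override
    public Credential getCredential() throws IOException, GeneralSecurityException {
        final String env = System.getProperty("gcr_dataflow_env", "local");
        Properties props = new Properties();
        ClassLoader loader = this.getClass().getClassLoader();
        props.load(loader.getResourceAsStream(env + "-gcr-dataflow.properties"));
        final String credFileName = props.getProperty("gcloud.dataflow.service.account.file");
        InputStream credStream = loader.getResourceAsStream(credFileName);
        return GoogleCredential.fromStream(credStream);
    }
}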

Inserting images in a GCE account

I am trying to register/insert an image in a GCE account. The raw image source is shared publicly. However, I see this error when making the insert call:
{u'status': u'DONE',
 u'kind': u'compute#operation',
 u'name': u'operation-1413287109771-505608c24bef9-5c02ac49-1dbd219b',
 u'startTime': u'2014-10-14T04:45:10.142-07:00',
 u'httpErrorMessage': u'FORBIDDEN',
 u'insertTime': u'2014-10-14T04:45:09.871-07:00',
 u'targetLink': u'https://www.googleapis.com/compute/v1/projects/qubole-gce-test/global/images/image-v1-36',
 u'operationType': u'insert',
 u'error': {u'errors': [{u'message': u"Required 'read' permission for 'rawDisk.source'", u'code': u'PERMISSIONS_ERROR'}]},
 u'progress': 100,
 u'endTime': u'2014-10-14T04:45:11.625-07:00',
 u'httpErrorStatusCode': 403,
 u'id': u'15732625722022858454',
 u'selfLink': u'https://www.googleapis.com/compute/v1/projects/qubole-gce-test/global/operations/operation-1413287109771-505608c24bef9-5c02ac49-1dbd219b',
 u'user': u'964307357192-smkpef2g0v8q3oopq44tvh1d3h1lplgk@developer.gserviceaccount.com'}
I googled, and from the posts I found it says that you have to share the image publicly, which I have already done.
I am using this API: https://cloud.google.com/compute/docs/reference/latest/images/insert
The rawDisk.source that I am using here is a GCS URL which I have made public, yet I am getting the error pasted above.
As discussed on the gce-discussion mailing list, this is a known regression in GCE that the engineering team is working on. As a workaround, you can get this API working by adding the GCS read-write scope (https://www.googleapis.com/auth/devstorage.read_write) to the scopes you request when performing OAuth2 authentication.

KeePass, GoogleSync: "Error occurred while sending a direct message or getting the response"

When using KeePass with the GoogleSync plugin (to sync the KeePass db with Google Drive), I somehow messed up my config and received this message on every sync attempt:
Error occurred while sending a direct message or getting the response
I think what I actually did was try to switch my Google API credentials (from one API key to another).
The user config saves a Google auth key. In my case I changed the API credentials I wanted to use, but the config could become corrupted in other ways, and there doesn't seem to be a mechanism in KeePass/GoogleSync to rectify it.
Exit KeePass.
Delete (or back up under a different name) the file %LOCALAPPDATA%\Dominik_Reichl\KeePass[...]\[version]\user.config
Restart KeePass and it will regenerate the file with the correct config.
Found my reference here:
http://sourceforge.net/p/kp-googlesync/discussion/general/thread/5dc763ba/