Download source code from Google Cloud App Engine - json

I have been trying to download/extract the source code for an application on Google App Engine. I found this more recent post, How to download App Engine Source Code from Google Cloud Console?, but I do not see any files or folders in the staging bucket. My login is an owner of the project. I have also tried the appcfg.py commands but receive the authentication error below. Is this because appcfg.py is deprecated? Or are there some other permissions my login requires?
<ApiError 401, Message: "Rejected by creds_policy: Permission 'auth.creds.useNormalUserEUC' not granted to app-engine-appserver@prod.google.com, because it satisfies none of the 3 rules granting that permission. (Policy {type=creds, id=/apps/framework/auth/v0/mapping0/80102d58-16e4-455f-bf24-13feb564b751}); RpcSecurityPolicy http://rpcsp/p/9aw5LWxoo9BLqcVKVlGreN0-Q5-wt_0mk2RWLjm140U
com.google.security.context.validation.CredsPermissionException: Rejected by creds_policy: Permission 'auth.creds.useNormalUserEUC' not granted to app-engine-appserver@prod.google.com, because it satisfies none of the 3 rules granting that permission. (Policy {type=creds, id=/apps/framework/auth/v0/mapping0/80102d58-16e4-455f-bf24-13feb564b751}); RpcSecurityPolicy http://rpcsp/p/9aw5LWxoo9BLqcVKVlGreN0-Q5-wt_0mk2RWLjm140U ">

Google Cloud Functions using service account file not found

I am trying to list the buckets located in another GCP project, essentially using the code from here: https://cloud.google.com/docs/authentication/production#auth-cloud-explicit-python
To do this I need to authenticate with a JSON key file.
Unfortunately I can't resolve the error with linking my JSON file to the function. This is the code I use:
def explicit(argument):
    from google.cloud import storage

    # Explicitly use service account credentials by specifying the private key
    # file.
    storage_client = storage.Client.from_service_account_json(
        'gs://PROJECT/PATH/service_account.json')

    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)
The error I am getting:
Traceback (most recent call last):
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py", line 2073, in wsgi_app
    response = self.full_dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py", line 1518, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py", line 1516, in full_dispatch_request
    rv = self.dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py", line 1502, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/functions_framework/__init__.py", line 99, in view_func
    return function(request._get_current_object())
  File "/workspace/main.py", line 6, in explicit
    storage_client = storage.Client.from_service_account_json(
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/google/cloud/client.py", line 106, in from_service_account_json
    with io.open(json_credentials_path, "r", encoding="utf-8") as json_fi:
FileNotFoundError: [Errno 2] No such file or directory: 'gs://PROJECT/PATH/service_account.json'
How can I properly reference the file so the function executes?
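The last line of the traceback is the key: from_service_account_json hands the gs:// URI straight to Python's open(), which treats it as a relative local filename, not an object in Cloud Storage. A minimal reproduction, independent of any Google libraries:

```python
import io

# "gs://..." is not a local filesystem path; io.open() treats it as a
# relative filename and fails before any network call is made.
try:
    io.open("gs://PROJECT/PATH/service_account.json", "r", encoding="utf-8")
except FileNotFoundError as exc:
    print(type(exc).__name__)  # FileNotFoundError
```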
Google's client libraries all support Application Default Credentials (ADC), which, very helpfully, includes finding credentials automatically. You should use this.
When your code runs on the Cloud Functions service, it runs as a Service Account identity. You're probably running the Cloud Function using the default Cloud Functions Service Account. You can specify a user-defined Service Account when you deploy or update the Cloud Function.
NOTE It's better to run your Cloud Functions using user-defined, non-default Service Accounts.
When the Cloud Function tries to create a Cloud Storage client, it does so using its identity (the Service Account).
A solution is to:
Utilize ADCs and have the Cloud Storage client be created using these i.e. storage_client = storage.Client()
Adjust the Cloud Function's Service Account to have the correct IAM roles|permissions for Cloud Storage.
NOTE You should probably adjust the Bucket's IAM Policy. Because you're using a Cloud Storage Bucket in a different project, if you want to adjust the Project's IAM Policy instead, ensure you use the Bucket's (!) Project's IAM Policy and not the Cloud Function's Project's IAM Policy.
See IAM roles for Cloud Storage: predefined roles and perhaps roles/storage.objectViewer as this includes storage.objects.list permission that you need.
You're trying to list buckets located in another GCP project while authorizing your client with a Service Account (SA) key file that you also pull from Google Cloud Storage (GCS).
I'd recommend a different security pattern to resolve this. Essentially, use a single SA that has permissions both to invoke your Cloud Function and to list the contents of your GCS bucket. This circumvents the need to pull a key file from GCS while still maintaining security, since a bad actor would require access to your GCP account. The following steps show you how to do so.
Create a Service Account (SA) with the role Cloud Functions Invoker in your first Cloud Functions project
Grant your user account or user group the Service Account User role on this new SA
Change your Cloud Function to use this newly created SA (see the documentation on changing a Cloud Function's runtime SA)
Grant the Cloud Function runtime SA a GCS role in the second project or GCS bucket
In this pattern, you "actAs" the Cloud Function runtime SA, allowing you to invoke the Cloud Function. Since the runtime SA has adequate permissions on the GCS bucket in your other project, there's no need for an SA key file at all.
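The steps above might be sketched with gcloud/gsutil roughly as follows. This is only a sketch: PROJECT_A, FUNCTION, BUCKET, the SA name fn-runtime, and you@example.com are placeholders, not values from the question.

```shell
# 1. Create the SA in the function's project
gcloud iam service-accounts create fn-runtime --project PROJECT_A

# 2. Grant your user the Service Account User role on the new SA
gcloud iam service-accounts add-iam-policy-binding \
    fn-runtime@PROJECT_A.iam.gserviceaccount.com \
    --member user:you@example.com --role roles/iam.serviceAccountUser

# 3. Deploy (or redeploy) the function with the new runtime SA
gcloud functions deploy FUNCTION --project PROJECT_A \
    --service-account fn-runtime@PROJECT_A.iam.gserviceaccount.com

# 4. Grant the runtime SA read access on the bucket in the other project
gsutil iam ch \
    serviceAccount:fn-runtime@PROJECT_A.iam.gserviceaccount.com:roles/storage.objectViewer \
    gs://BUCKET
```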

GCE API Required 'compute.zones.list' permission for 'projects/someproject'

I have created a project in GCP, then created a service account with the Compute Admin role, then enabled the Compute Engine API for the project.
But I can't work with instances:
$ gcloud compute instances list
ERROR: (gcloud.compute.instances.list) Some requests did not succeed:
 - Required 'compute.zones.list' permission for 'projects/someproject'
What am I doing wrong?
Edited
From the service account:
ERROR: (gcloud.projects.get-iam-policy) User
[cloud66@project_id.iam.gserviceaccount.com] does not have permission
to access project [project_id:getIamPolicy] (or it may not exist): The
caller does not have permission
When I switch to my main Google account:
$ gcloud projects get-iam-policy project_id
bindings:
- members:
  - serviceAccount:cloud66@project_id.iam.gserviceaccount.com
  role: roles/compute.admin
- members:
  - serviceAccount:service-855312803173@compute-system.iam.gserviceaccount.com
  role: roles/compute.serviceAgent
- members:
  - serviceAccount:855312803173-compute@developer.gserviceaccount.com
  - serviceAccount:855312803173@cloudservices.gserviceaccount.com
  role: roles/editor
- members:
  - user:my_google_user
  role: roles/owner
From Logs view:
2020-01-28 16:46:30.932 EET Compute Engine list zones
cloud66@project_id.iam.gserviceaccount.com PERMISSION_DENIED
code: 7
message: "PERMISSION_DENIED"
Have a look at the documentation first. Usually such an error occurs when your service account doesn't have enough permissions. In that case, check the available roles, search for the required permission (such as compute.zones.list), and add it to your service account as described here. For example, it could be the Compute Instance Admin (v1) role.
EDIT It looks like the Compute Admin role should work for you. To be sure, check the granted roles in your project with the command:
gcloud projects get-iam-policy YOUR_PROJECT_ID
If you want to use your service account with the Cloud API from some application, have a look at these instructions.
EDIT2 Try to check your service account and key from 3rd place (like some linux desktop or server) as it described in the documentation.
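That check from another machine might look like this (the key path and PROJECT_ID are placeholders for your own values):

```shell
# Authenticate gcloud as the service account using its downloaded JSON key
gcloud auth activate-service-account \
    cloud66@PROJECT_ID.iam.gserviceaccount.com --key-file=/path/to/key.json

# Then retry the failing call under that identity
gcloud compute instances list --project PROJECT_ID
```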
I originally created the service account "cloud66" during the Google test period. Most likely, this affected the access rights. I then switched billing from the test period to a paid account. I deleted and recreated the cloud66 service account in the "APIs & Services -> Credentials" section, but an access policy for "cloud66" with the Compute Admin role remained in the "IAM" section. When I deleted that access policy from the "IAM" section and recreated the service account, the problem was resolved.

Hyperledger Composer CLI Ping to a Business Network returns AccessException

I'm trying to learn Hyperledger Composer, but it seems to be a relatively new technology: there are few tutorials and few solutions to a lot of questions, and the tutorial does not mention the possible error cases you can hit when following the commands, which means there is also no solution for those errors.
I have joined the Composer channel in their community chat (it looks like it runs on Discord or something) and asked the same question without a response; I have had a better experience here on SO.
This is the problem: I have deployed my business network, installed it, started it, created my network admin card and imported it. Then, to test that everything is OK, I run the command composer network ping --card NAME-OF-MY-ADMIN-CARD
And this error comes:
juan@JuanDeDios:~/proyectos/inovacion/a3-poliza-microservice$ composer network ping --card admin@a3-policy-microservice
Error: transaction returned with failure: AccessException: Participant 'org.hyperledger.composer.system.NetworkAdmin#admin' does not have 'READ' access to resource 'org.hyperledger.composer.system.Network#a3-policy-microservice#0.0.1'
Command failed
I think it has something to do with the permissions.acl file, so I gave everyone permission to everything so there would be no restrictions on anyone, and tried again, but it failed.
So I thought I had to uninstall my business network and create it again. I also deleted my .bna and my network card files so everything would be created again, but the same error resulted.
My other attempt was to update the business network, but that didn't work either: the same error happened, and I'm sure I didn't miss any step from the tutorial. I also followed the playground tutorial. What I have not done is create another app with Yeoman, but I will if I don't find a solution to this problem that doesn't require me to create another app.
These were my steps:
1-. Created my app with Yeoman
yo hyperledger-composer:businessnetwork
2-. Selected Apache-2.0 for my license
3-. Created a3-policy-microservice as the name of the business network
4-. Created org.microservice.policy (yeah, I switched names, but I'm totally aware)
5-. Generated my app with a template selecting the NO option
6-. Created my assets, participants and transactions
7-. Changed my permission rules to mine
8-. I generated the .bna file
composer archive create -t dir -n .
9-. Then installed my bna file
composer network install --card PeerAdmin@hlfv1 --archiveFile a3-policy-microservice@0.0.1.bna
10-. Then started my network and created my networkadmin card
composer network start --networkName a3-policy-network --networkVersion 0.0.1 --networkAdmin admin --networkAdminEnrollSecret adminpw --card PeerAdmin@hlfv1 --file networkadmin.card
11-. Imported my card
composer card import --file networkadmin.card
12-. Tried to ping my network
composer network ping --card admin@a3-poliza-microservice
And the error happens
Later I tried to create everything again, shutting my Fabric down, starting it again, and creating the network from the first step.
My other attempt was to change the permissions and upgrade my bna network, but that failed too. I'm running out of options.
I hope this description is not too long to ignore. Thanks in advance
Thanks for the question!
The first possibility is that your network name is a3-policy-network but you're pinging a network called a3-poliza-microservice; that mismatch will still fail once you do get the correct ACLs in place (currently, the ACL error is the one you're trying to resolve).
The procedure for upgrade would normally be the procedure below:
After your step 12 (where you can't ping the business network due to restrictive ACL conditions, assuming you are using the right network name) you would have:
Make the changes to your permissions.acl to include the System ACLs this time, e.g.:
/**
 * Sample access control list.
 */
rule SystemACL {
    description: "System ACL to permit all access"
    participant: "org.hyperledger.composer.system.Participant"
    operation: ALL
    resource: "org.hyperledger.composer.system.**"
    action: ALLOW
}

rule NetworkAdminUser {
    description: "Grant business network administrators full access to user resources"
    participant: "org.hyperledger.composer.system.NetworkAdmin"
    operation: ALL
    resource: "**"
    action: ALLOW
}

rule NetworkAdminSystem {
    description: "Grant business network administrators full access to system resources"
    participant: "org.hyperledger.composer.system.NetworkAdmin"
    operation: ALL
    resource: "org.hyperledger.composer.system.**"
    action: ALLOW
}
Update the "version" field in the existing package.json in your Business Network project directory (i.e. increment it to the next version, e.g. from 0.0.1 to 0.0.2).
From the same directory, run the following command:
composer archive create --sourceType dir --sourceName . -a a3-policy-network@0.0.2.bna
Now install the new business network code firstly:
composer network install --card PeerAdmin@hlfv1 --archiveFile a3-policy-network@0.0.2.bna
Then perform the requisite upgrade step (single '-' for short form of the parameter):
composer network upgrade -c PeerAdmin@hlfv1 -n a3-policy-network -V 0.0.2
After a few seconds, ping the network again to see ACL changes are now in effect:
composer network ping -c admin@a3-policy-network

Open EdX (Bitnami install) fails Gmail authentication for email registration.

I am unable to get Open EdX to authenticate to my Gmail account to send registration emails. Here's what I'm working with:
New installation of Open EdX via Bitnami.
Edited lms.env.json and cms.env.json based on this guide from the Bitnami wiki, including adding EMAIL_HOST_USER and EMAIL_HOST_PASSWORD fields.
Recompiled and restarted server.
Registered a new user and got a successful response via the LMS (account created and let me in).
However, no email confirmation came through.
Google account that I'm using allows access for less secure apps.
Log shows the following:
File "/opt/bitnami/python/lib/python2.7/smtplib.py", line 731, in sendmail
raise SMTPSenderRefused(code, resp, from_addr)
SMTPSenderRefused: (530, '5.5.1 Authentication Required. Learn more at\n5.5.1 support.google.com/mail/answer/14257 x123sm6973392pfb.54 - gsmtp', u'nyedid@sandtontechnologies.com')
2016-05-10 19:22:38,850 INFO 13202 [audit] models.py:1802 - Login success - user.id: 5
2016-05-10 19:22:38,919 INFO 13202 [audit] views.py:1822 - Login success on new account creation - Test2
I can log in to the account with no problem.
The link provided in the error message (https://support.google.com/mail/answer/14257) notes that you may receive this error if you have 2-factor authentication enabled on your account. In that case, you should generate an App Password (https://support.google.com/accounts/answer/185834?hl=en#ASPs) specifically for your Open edX instance, and use that in place of your normal Gmail password.
Does that help?
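For reference, the Gmail-related fields in lms.env.json / cms.env.json would typically look something like the fragment below. This is a sketch: the values are placeholders, and with 2-factor authentication enabled EMAIL_HOST_PASSWORD must be the generated App Password, not your account password.

```json
{
  "EMAIL_BACKEND": "django.core.mail.backends.smtp.EmailBackend",
  "EMAIL_HOST": "smtp.gmail.com",
  "EMAIL_PORT": 587,
  "EMAIL_USE_TLS": true,
  "EMAIL_HOST_USER": "you@gmail.com",
  "EMAIL_HOST_PASSWORD": "your-16-char-app-password"
}
```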

Authorization error: migrating db for Google cloud SQL

I've been trying to set up a db using Google Cloud SQL, but I got an error when migrating the db.
python manager.py db init works well: a folder named migrations was created.
However, python manager.py db migrate produces an error:
File "/usr/local/google_appengine/google/storage/speckle/python/api/rdbms.py", line 946, in MakeRequest
raise _ToDbApiException(response.sql_exception)
sqlalchemy.exc.InternalError: (InternalError) (0, u'End user Google Account not authorized.') None None
It looks like some kind of authorization error. How should I solve it?
If you had previously been authorized with a different ID, the old authentication information was saved in a file, .googlesql_oauth2.dat.
In that case, you have to remove this file before the authentication process is performed.
Mac:
~/.googlesql_oauth2.dat
Windows:
%USERPROFILE%\.googlesql_oauth2.dat
Ref: http://jhlim.kaist.ac.kr/?p=182
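The cleanup itself is a one-liner; on macOS/Linux, for example:

```shell
# Remove the cached OAuth credentials; the next db command
# will then prompt you to authorize again.
rm -f ~/.googlesql_oauth2.dat
```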