I am using a Google Cloud Function and can't get the default service account to work properly. The service account works fine if I pass in a JSON credentials file. However, if I set the same service account as the default account for the function, I get an error. The code also works fine if I run it locally and use the default service account, which is set up to the same account. The error I get when the function runs in the cloud is:
The user does not belong to a G Suite customer.
This generally means either the service account is not authorized for domain-wide delegation or that no G Suite user was specified for impersonation. Usage metrics from the Cloud Console indicate the correct service account is used to run the function. This account also appears in the run configuration section of the function as expected. I am using both G Suite APIs (Vault) and Google Cloud APIs (Storage). The service account is set up for domain-wide delegation and I am setting a G Suite user to impersonate.
In summary, the same service account and user:
Works in the cloud or desktop if I pass in a JSON credentials file,
Does not work in the cloud with the same service account being set as the default,
Works on my laptop with the same service account set as the Default Service Account.
The run tab on the function shows the correct service account, and the usage statistics also indicate the correct service account is being used. I want to use the default service account so I don't have to manage storing and retrieving credentials.
The relevant code is as follows:
public HttpRequestInitializer buildInitializer() throws Exception {
    if (VaultConfig.isApplicationRunMode())
        return new HttpCredentialsAdapter(buildDefaultCredentials());
    else if (VaultConfig.isServiceRunMode())
        return new HttpCredentialsAdapter(buildServiceCredentials());
    else
        throw new Exception("Request initializer only supports service account or application default");
}
private GoogleCredentials buildServiceCredentials() throws Exception {
    String credentialsPath = VaultConfig.getServiceCredentials();
    logger.info("Credentials path " + credentialsPath);
    InputStream is = GetRequestInitializer.class.getResourceAsStream(credentialsPath);
    if (is == null) {
        throw new FileNotFoundException("Resource not found: " + credentialsPath);
    }
    GoogleCredentials credentials = GoogleCredentials.fromStream(is)
            .createScoped(VaultConfig.getScopes())
            .createDelegated(VaultConfig.getUserToImpersonate());
    logger.debug("Credentials created. Scopes " + VaultConfig.getScopes());
    logger.debug(credentials.toString());
    return credentials;
}
private GoogleCredentials buildDefaultCredentials() throws Exception {
    logger.info("Using application default credentials");
    GoogleCredentials credentials = GoogleCredentials.getApplicationDefault()
            .createScoped(VaultConfig.getScopes())
            .createDelegated(VaultConfig.getUserToImpersonate());
    logger.debug("Credentials created. Scopes " + VaultConfig.getScopes());
    logger.debug(credentials.toString());
    return credentials;
}
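For context, the initializer is later wired into the Vault client. A minimal sketch of that wiring, assuming the standard generated google-api-services-vault client builder (the transport and application name below are illustrative, not my exact code):

import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.json.gson.GsonFactory;
import com.google.api.services.vault.v1.Vault;

Vault vault = new Vault.Builder(
        GoogleNetHttpTransport.newTrustedTransport(),
        GsonFactory.getDefaultInstance(),
        buildInitializer())                  // the HttpRequestInitializer built above
        .setApplicationName("vault-client")  // illustrative application name
        .build();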
The error occurs later in the code, where I actually call the Vault service and the server returns a 400 response with the indicated error. Please note the debug statement logger.debug(credentials.toString()). In all situations where the function works, credentials.toString() returns the expected result as follows:
ServiceAccountCredentials{clientId=1111111111222233, clientEmail=myfunction@appspot.gserviceaccount.com, privateKeyId=aaaabb12bgbaaaaaaaaaa, transportFactoryClassName=com.google.auth.oauth2.OAuth2Utils$DefaultHttpTransportFactory, tokenServerUri=https://oauth2.googleapis.com/token, scopes=[https://www.googleapis.com/auth/devstorage.read_write, https://www.googleapis.com/auth/devstorage.full_control, https://www.googleapis.com/auth/ediscovery], defaultScopes=[], serviceAccountUser=my.user.account@mydomain.com, quotaProjectId=null, lifetime=3600, useJwtAccessWithScope=false}
However, when using the default service account in the cloud, credentials.toString() returns
ComputeEngineCredentials{transportFactoryClassName=com.google.auth.oauth2.OAuth2Utils$DefaultHttpTransportFactory}
It looks to me like my problem is that GoogleCredentials.getApplicationDefault() does not return the expected result when run on the cloud. I set up the service account following https://cloud.google.com/docs/authentication/production.
Is there something else I need to do? Any tips on additional diagnostic statements that might help verify what the problem is? The run tab on the function shows the correct account and I am using what should be the default account for the project.
I have created a simple Google Cloud Function with the Spring Cloud Function library to be triggered on the arrival of a Pub/Sub message. I followed the sample function-sample-gcp-background. Whenever a message is published to Pub/Sub, it gets printed by the Cloud Function as expected.
But I wonder how I can get the metadata of the Pub/Sub message in the Cloud Function. The Google Cloud Functions documentation says that
This metadata is accessible via the context object that is passed to
your function when it is invoked.
How can I access this metadata (or the context object) in a Spring Cloud Function application?
UPDATE: Version spring-cloud-function-adapter-gcp:3.1.2
UPDATE 2: I raised an issue on GitHub and got it resolved. Thanks to the Spring Cloud Function team.
When you use a background function, the Pub/Sub message and context are extracted and provided in the PubSubMessage. If you have a look at the PubSubMessage object here, you will see the published time and the message ID embedded in it. You only have to use them!
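A rough sketch of that approach, assuming the sample's PubSubMessage POJO exposes getMessageId() and getPublishTime() getters (the getter names are an assumption based on the sample, not confirmed here):

@Bean
public Consumer<PubSubMessage> pubSubFunction() {
    return message -> {
        // messageId and publishTime are carried on the Pub/Sub message itself
        // (getter names assumed from the sample POJO).
        System.out.println("Message ID: " + message.getMessageId());
        System.out.println("Published at: " + message.getPublishTime());
    };
}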
Got the issue resolved as per the advice from the spring-cloud-function team. The Consumer function needs to accept a parameter of type Message<PubSubMessage> instead of PubSubMessage to get the Context object.
@Bean
public Consumer<Message<PubSubMessage>> pubSubFunction() {
    return message -> {
        // The PubSubMessage data field arrives as a base-64 encoded string and must be decoded.
        // See: https://cloud.google.com/functions/docs/calling/pubsub#event_structure
        PubSubMessage payload = message.getPayload();
        String decodedMessage = new String(
                Base64.getDecoder().decode(message.getPayload().getData()), StandardCharsets.UTF_8);
        System.out.println("Hello!!! Received Pub/Sub message with data: " + decodedMessage);

        // Print out timestamp and event id
        Context ctx = message.getHeaders().get("gcf_context", Context.class);
        System.out.println(ctx.eventId());
        System.out.println(ctx.timestamp());
    };
}
Ref: GitHub issue #695
I'm trying to set a user's OU from an Apps Script function inside App Maker.
(user is a variable with an email address)
function getUser(user) {
  var x = AdminDirectory.Users.update(
    {
      orgUnitPath: "/",
      userKey: user,
    });
  console.log("function ran");
}
This code errors with:
Exception: Invalid number of arguments provided. Expected 2-3 only
    at getUser (ServerScripts:107)
    at getUser (ClientHandoff:21:21)
    at TestMoveOU.Panel1.Button1.onClick:1:1
What am I doing wrong here? Looking at the docs, you only need to provide the properties you're changing.
The Apps Script documentation says the following:
For detailed information on this service, see the reference documentation for the Admin SDK Directory API. Like all advanced services in Apps Script, the Admin SDK Directory service uses the same objects, methods, and parameters as the public API.
Therefore, we need to consult the documentation to get clarification on how to achieve this.
The method requires at least two parameters: the first parameter is a user resource object and the second parameter is the user key (the user's email address): AdminDirectory.Users.update(resource, userKey). So you need to do this:
function getUser(user) {
  var userResource = {
    orgUnitPath: "/"
  };
  var updated = AdminDirectory.Users.update(userResource, user);
  console.log(updated.primaryEmail);
}
So why do you need to specify the user email in the method when it is already being specified in the userResource object? Well, the email address in the userResource object would be the new value, in case you want to change the email address.
P.S. You might want to rename the function to something that is a better match; updateUser() perhaps? I hope this helps!
I've been trying for a couple of days now to crack this but have not had any success.
I have a web application that I want to use with the Google Drive API.
I want the web application to check if there is an access token it can use and if not redirect to Google so the user can log in and grant access.
Seemingly a simple task, but it's driving me mad! I've checked the Google documentation, but it all seems to be geared around console applications.
Google provides an interface, UserService, which stores details of the users using the application. If the user is not logged in, redirect the user to the login page using:
response.sendRedirect(userService.createLoginURL(request.getRequestURI()));
Later, or if the user is already logged in, redirect them to the "Request for Permission" page using:
List<String> scopes = Arrays.asList(PlusScopes.PLUS_LOGIN, PlusScopes.PLUS_ME, PlusScopes.USERINFO_EMAIL, PlusScopes.USERINFO_PROFILE /* ... */); // Add/remove scopes as per your requirement
List<String> responseTypes = Arrays.asList("code");
GoogleAuthorizationCodeRequestUrl gAuthCode = new GoogleAuthorizationCodeRequestUrl(CLIENT_ID, REDIRECT_URL, scopes); // CLIENT_ID = your Google project client id
gAuthCode.setAccessType("offline");
gAuthCode.setClientId(CLIENT_ID);
gAuthCode.setResponseTypes(responseTypes);
gAuthCode.setApprovalPrompt("force");
String authUrl = gAuthCode.toURL().toString();
response.sendRedirect(authUrl);
Make sure you add all required scopes for the API methods you will be using. After the user has accepted, you will have to create a servlet mapped to "/oauth2callback" to get the authorization code.
request.getParameter("code")
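A minimal sketch of what that callback servlet could look like (the class name and annotation-based mapping are illustrative, not from the original code):

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/oauth2callback")
public class OAuth2CallbackServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // Google redirects back here with the authorization code as a query parameter.
        String authorizationCode = request.getParameter("code");
        // Exchange the code for tokens (see below), store them,
        // then send the user on to your landing page.
        response.sendRedirect("/");
    }
}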
In the same servlet, use the code obtained to get the refresh and access tokens by making a REST call.
URL url = new URL("https://www.googleapis.com/oauth2/v3/token");
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setRequestMethod("POST");
connection.setDoInput(true);
connection.setDoOutput(true);
DataOutputStream dw = new DataOutputStream(connection.getOutputStream());
dw.writeBytes("code=" + authorizationCode + "&client_id=" + CLIENT_ID + "&client_secret=" + CLIENT_SECRET
        + "&redirect_uri=" + REDIRECT_URL + "&grant_type=authorization_code");
dw.flush();
dw.close();
InputStream inputStream = connection.getInputStream();
Parse the input stream to get your refresh token and access token and redirect the user to your landing page.
Now you have an access token to query the APIs whose scopes were granted in the authorization flow. You also have a refresh token, which can be used to obtain a new access token once the previously issued one has expired.
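For example, one way to parse that token response (a sketch using Gson; the choice of JSON library is an assumption, the original answer does not specify one):

import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;

// Continuing from the inputStream obtained above:
JsonObject tokenResponse = JsonParser
        .parseReader(new InputStreamReader(inputStream, StandardCharsets.UTF_8))
        .getAsJsonObject();
String accessToken = tokenResponse.get("access_token").getAsString();
// A refresh_token is normally only returned on the first consent when access_type=offline.
String refreshToken = tokenResponse.has("refresh_token")
        ? tokenResponse.get("refresh_token").getAsString()
        : null;
long expiresInSeconds = tokenResponse.get("expires_in").getAsLong();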
You should be able to implement the OAuth handshake using HTTP requests and a redirect URL to your web application. You can play around with the requests here to see what the headers and responses look like: https://developers.google.com/oauthplayground/
You can store the authorization code and tokens any way you like. You would have your web application refer to these tokens to see if they are expired. For example:
def getTokenFromFile(self):
    creds = self.readCredsFromDisk()
    # check if token is expired
    expiration_time = datetime.datetime.strptime(creds['token_expiry'], '"%Y-%m-%dT%H:%M:%S.%f"')
    if expiration_time < datetime.datetime.now():
        self.refreshToken()
        # reload creds
        creds = self.readCredsFromDisk()
    return creds['access_token']
I'm writing just a Python script that does the handshake and saves the token to a plain-text file. Any time the script calls a function against the Google API it will use this function.
The refresh function:
def refreshToken(self):
    with open('client_secret.json') as s:
        secret = json.load(s)
    secret = secret['installed']
    creds = self.readCredsFromDisk()
    refresh_url = secret['token_uri']
    post_data = {'client_id': secret['client_id'],
                 'client_secret': secret['client_secret'],
                 'refresh_token': creds['refresh_token'],
                 'grant_type': 'refresh_token'}
    headers = {'Content-type': 'application/x-www-form-urlencoded'}
    (resp, content) = self.http.request(refresh_url,
                                        method='POST',
                                        body=urlencode(post_data),
                                        headers=headers)
    content = json.loads(content)
    creds['access_token'] = content['access_token']
    date = datetime.datetime.now() + datetime.timedelta(seconds=content['expires_in'])
    creds['token_expiry'] = json.dumps(date.isoformat())
    self.writeCredsToDisk(json.dumps(creds))
You would write a similar function to trade the original authorization code for an access token, following the logic the OAuth Playground shows you.
I'm new to Google Cloud Dataflow, as is probably obvious from my questions below.
I've got a dataflow application written and can get it to run without issue using my personal credentials both locally and on a GCE instance. However, I can't seem to crack the proper steps to get it to run using the compute engine instance's service credentials or service credentials I've created using the API & AUTH section of the console. I always get a 401 not authorized error when I run.
Here's what I've tried...
1) Created a virtual machine granting access rights to Storage, Datastore, SQL and Compute Engine. My understanding is that this supposedly created an instance-specific service account that serves as the server's default credentials. These should be used the same way a user's authentication is used when an app runs on this instance. Here's where I get a 401. My question is: where can I see this service account that was supposedly created? Or do I just trust that it exists somewhere?
2) Created service credentials in the API & Auth portion of the developer's console. Then used gcloud auth activate-service-account and activated that account by pointing the command at the credentials JSON file I downloaded. Kind of like the OAuth round trip when you use gcloud auth login. Here I also get the 401.
3) The last thing I tried was using the service credentials from step 2 separately from the GCE instance, creating an object that implements the CredentialFactory interface, and passing it off to the PipelineOptions. However, when it runs, the app now crashes with an error saying that it is looking for a method, fromOptions, that isn't in the CredentialFactory interface. How the options were configured, what the credential factory looked like, and the stack trace from this follow.
I would be happy to utilize any of the above three methods to make use of service credentials, if I could get any of them to work. Any insight you can provide on what I'm doing wrong, steps I'm leaving out, or other unexplored options would be greatly appreciated. The documentation is a little disjointed. If there is a clear step-by-step guide, a link to it would be sufficient. What I've found so far on my own has been of little assistance.
If I can provide any additional information please let me know.
Here's some code that may be helpful and the stack trace I get when the code runs using the credential factory.
Options setup code looks like this:
GcrDataflowPipelineOptions options = PipelineOptionsFactory.fromArgs(args)
        .withValidation()
        .as(GcrDataflowPipelineOptions.class);
options.setKind("Counties");
options.setCredentialFactoryClass(GoogleCredentialProvider.class);
GoogleCredentialProvider.java
Notice that the JSON file I downloaded as part of creating the service account (renamed) is what's loaded as a resource from my app's classpath.
public class GoogleCredentialProvider implements CredentialFactory {

    @Override
    public Credential getCredential() throws IOException, GeneralSecurityException {
        final String env = System.getProperty("gcr_dataflow_env", "local");
        Properties props = new Properties();
        ClassLoader loader = this.getClass().getClassLoader();
        props.load(loader.getResourceAsStream(env + "-gcr-dataflow.properties"));
        final String credFileName = props.getProperty("gcloud.dataflow.service.account.file");
        InputStream credStream = loader.getResourceAsStream(credFileName);
        GoogleCredential credential = GoogleCredential.fromStream(credStream);
        return credential;
    }
}
Stacktrace:
java.lang.RuntimeException: java.lang.RuntimeException: Unable to find factory method com.scotcro.gcr.dataflow.components.pipelines.GoogleCredentialProvider#fromOptions
at com.google.cloud.dataflow.sdk.runners.dataflow.BasicSerializableSourceFormat.evaluateReadHelper(BasicSerializableSourceFormat.java:268)
at com.google.cloud.dataflow.sdk.io.Read$Bound$1.evaluate(Read.java:123)
at com.google.cloud.dataflow.sdk.io.Read$Bound$1.evaluate(Read.java:120)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.visitTransform(DirectPipelineRunner.java:684)
at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:200)
at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:196)
at com.google.cloud.dataflow.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:99)
at com.google.cloud.dataflow.sdk.Pipeline.traverseTopologically(Pipeline.java:208)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.run(DirectPipelineRunner.java:640)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.run(DirectPipelineRunner.java:354)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.run(DirectPipelineRunner.java:76)
at com.google.cloud.dataflow.sdk.Pipeline.run(Pipeline.java:149)
at com.scotcro.gcr.dataflow.app.GcrDataflowApp.run(GcrDataflowApp.java:65)
at com.scotcro.gcr.dataflow.app.GcrDataflowApp.main(GcrDataflowApp.java:49)
Caused by: java.lang.RuntimeException: Unable to find factory method com.scotcro.gcr.dataflow.components.pipelines.GoogleCredentialProvider#fromOptions
at com.google.cloud.dataflow.sdk.util.InstanceBuilder.buildFromMethod(InstanceBuilder.java:224)
at com.google.cloud.dataflow.sdk.util.InstanceBuilder.build(InstanceBuilder.java:161)
at com.google.cloud.dataflow.sdk.options.GcpOptions$GcpUserCredentialsFactory.create(GcpOptions.java:180)
at com.google.cloud.dataflow.sdk.options.GcpOptions$GcpUserCredentialsFactory.create(GcpOptions.java:175)
at com.google.cloud.dataflow.sdk.options.ProxyInvocationHandler.getDefault(ProxyInvocationHandler.java:288)
at com.google.cloud.dataflow.sdk.options.ProxyInvocationHandler.invoke(ProxyInvocationHandler.java:127)
at com.sun.proxy.$Proxy42.getGcpCredential(Unknown Source)
at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.getDatastore(DatastoreIO.java:335)
at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.createReader(DatastoreIO.java:320)
at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.createReader(DatastoreIO.java:186)
at com.google.cloud.dataflow.sdk.runners.dataflow.BasicSerializableSourceFormat.evaluateReadHelper(BasicSerializableSourceFormat.java:259)
... 13 more
You likely do not have the proper credentials. When you execute a Dataflow job from GCE, the service account attached to the instance will be used for validation by Dataflow.
Did you do this when creating your machines?
Create a service account for the instance on GCE? https://cloud.google.com/compute/docs/authentication#using
Set the required scopes for using Dataflow, such as storage, compute, and BigQuery? https://www.googleapis.com/auth/cloud-platform
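Regarding your third approach (the custom CredentialFactory): the "Unable to find factory method ... GoogleCredentialProvider#fromOptions" error suggests the Dataflow SDK instantiates the factory through a static fromOptions(PipelineOptions) method rather than a constructor. A minimal sketch of what adding that method could look like (this is an assumption inferred from the stack trace, not taken from the SDK documentation; import paths may differ by SDK version):

import java.io.IOException;
import java.io.InputStream;
import java.security.GeneralSecurityException;
import com.google.api.client.auth.oauth2.Credential;
import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.cloud.dataflow.sdk.options.PipelineOptions;
import com.google.cloud.dataflow.sdk.util.CredentialFactory;

public class GoogleCredentialProvider implements CredentialFactory {

    // Hypothetical static factory method, inferred from the
    // "Unable to find factory method ... #fromOptions" error.
    public static GoogleCredentialProvider fromOptions(PipelineOptions options) {
        return new GoogleCredentialProvider();
    }

    @Override
    public Credential getCredential() throws IOException, GeneralSecurityException {
        // Illustrative resource name; the original loads the file name from a properties file.
        InputStream credStream = getClass().getClassLoader()
                .getResourceAsStream("local-gcr-dataflow.json");
        return GoogleCredential.fromStream(credStream);
    }
}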
Can we rely on the fact that while a Google Apps Script is executed by a time trigger and makes two subsequent requests using UrlFetchApp, both are made by the same server with the same IP?
I need to ensure this because in one request I query for an access token for a remote service and in another I use this access token. The remote service that I'm querying checks whether the access token was requested by a client with the same IP as the requests that use this access token.
EDIT
I examined the behavior by time-triggering some dumb scripts with just a few consecutive UrlFetchApp requests in them and checking the server logs. I had two clear observations:
The IP may vary between consecutive calls within one trigger.
There is a clear rotation of the IPs; sometimes there is a group of 7 consecutive calls with the same IP, sometimes 6. But in general there are always groups.
Because I wanted to use only Google infrastructure for my script and occasional failure was not a problem, I came up with an ugly but working solution:
function batchRequest(userLogin, userPassword, webapiKey, resource, attributes, values) {
  // requestToken uses UrlFetchApp.fetch
  var token = requestToken(userLogin, userPassword, webapiKey);
  // request uses UrlFetchApp.fetch with options.muteHttpExceptions set to true
  // so that we can read the response code
  var result = request(resource, token, attributes, values);
  var i = 0;
  while (result.getResponseCode() == 500 && i < 10) {
    token = requestToken(userLogin, userPassword, webapiKey);
    result = request(resource, token, attributes, values);
    i++;
  }
  return result;
}
So I simply retry up to 10 times and hope that the two requests, one for the token and another for the business logic, land in the same 'IP group'.
I put a more detailed description here: https://medium.com/p/dd0746642d7
Within the same trigger call, yes. From another trigger, no. This is based on experience, but I haven't seen it documented.