Using Google Cloud Dataflow, how do I run with proper credentials on a GCE Compute Engine instance?

I'm new to Google Cloud Dataflow, as is probably obvious from my questions below.
I've got a Dataflow application written and can get it to run without issue using my personal credentials, both locally and on a GCE instance. However, I can't seem to crack the proper steps to get it to run using the Compute Engine instance's service account credentials, or service credentials I've created using the API & Auth section of the console. I always get a 401 Unauthorized error when I run.
Here's what I've tried...
1) Created a virtual machine granting access rights to storage, datastore, SQL and compute engine. My understanding is that this supposedly created a GCE-specific service account that acts as the server's default credentials, to be used the same way a user's authentication is used when an app runs on this instance. Here's where I get a 401. My question is: where can I see this service account that was supposedly created? Or do I just trust that it exists somewhere?
2) Created service credentials in the API & Auth portion of the developer's console, then activated that account with gcloud auth activate-service-account, pointing the command at the credentials JSON file I downloaded. Kind of like the OAuth round trip you get with gcloud auth login. Here I also get the 401.
3) Finally, I used the service credentials from step 2 separate from the GCE instance, creating an object that implements the CredentialFactory interface and passing it to the PipelineOptions. However, when it runs the app now crashes with an error saying it is looking for a method, fromOptions, that isn't in the CredentialFactory interface. How the options were configured, what the credential factory looked like, and the resulting stack trace all follow.
I would be happy to use any of the above three methods to make use of service credentials, if I could get any of them to work. Any insight into what I'm doing wrong, steps I'm leaving out, or other unexplored options would be greatly appreciated. The documentation is a little disjointed; if there is a clear step-by-step guide, a link to it would be sufficient. What I've found so far on my own has been of little assistance.
If I can provide any additional information please let me know.
Here's some code that may be helpful and the stack trace I get when the code runs using the credential factory.
Options setup code looks like this:
GcrDataflowPipelineOptions options = PipelineOptionsFactory.fromArgs(args)
        .withValidation()
        .as(GcrDataflowPipelineOptions.class);
options.setKind("Counties");
options.setCredentialFactoryClass(GoogleCredentialProvider.class);
GoogleCredentialProvider.java
Notice that the JSON file I downloaded as part of creating the service account (renamed) is what's loaded as a resource from my app's classpath.
public class GoogleCredentialProvider implements CredentialFactory {
    @Override
    public Credential getCredential() throws IOException, GeneralSecurityException {
        final String env = System.getProperty("gcr_dataflow_env", "local");
        Properties props = new Properties();
        ClassLoader loader = this.getClass().getClassLoader();
        props.load(loader.getResourceAsStream(env + "-gcr-dataflow.properties"));
        final String credFileName = props.getProperty("gcloud.dataflow.service.account.file");
        InputStream credStream = loader.getResourceAsStream(credFileName);
        GoogleCredential credential = GoogleCredential.fromStream(credStream);
        return credential;
    }
}
Stacktrace:
java.lang.RuntimeException: java.lang.RuntimeException: Unable to find factory method com.scotcro.gcr.dataflow.components.pipelines.GoogleCredentialProvider#fromOptions
at com.google.cloud.dataflow.sdk.runners.dataflow.BasicSerializableSourceFormat.evaluateReadHelper(BasicSerializableSourceFormat.java:268)
at com.google.cloud.dataflow.sdk.io.Read$Bound$1.evaluate(Read.java:123)
at com.google.cloud.dataflow.sdk.io.Read$Bound$1.evaluate(Read.java:120)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.visitTransform(DirectPipelineRunner.java:684)
at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:200)
at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:196)
at com.google.cloud.dataflow.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:99)
at com.google.cloud.dataflow.sdk.Pipeline.traverseTopologically(Pipeline.java:208)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.run(DirectPipelineRunner.java:640)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.run(DirectPipelineRunner.java:354)
at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.run(DirectPipelineRunner.java:76)
at com.google.cloud.dataflow.sdk.Pipeline.run(Pipeline.java:149)
at com.scotcro.gcr.dataflow.app.GcrDataflowApp.run(GcrDataflowApp.java:65)
at com.scotcro.gcr.dataflow.app.GcrDataflowApp.main(GcrDataflowApp.java:49)
Caused by: java.lang.RuntimeException: Unable to find factory method com.scotcro.gcr.dataflow.components.pipelines.GoogleCredentialProvider#fromOptions
at com.google.cloud.dataflow.sdk.util.InstanceBuilder.buildFromMethod(InstanceBuilder.java:224)
at com.google.cloud.dataflow.sdk.util.InstanceBuilder.build(InstanceBuilder.java:161)
at com.google.cloud.dataflow.sdk.options.GcpOptions$GcpUserCredentialsFactory.create(GcpOptions.java:180)
at com.google.cloud.dataflow.sdk.options.GcpOptions$GcpUserCredentialsFactory.create(GcpOptions.java:175)
at com.google.cloud.dataflow.sdk.options.ProxyInvocationHandler.getDefault(ProxyInvocationHandler.java:288)
at com.google.cloud.dataflow.sdk.options.ProxyInvocationHandler.invoke(ProxyInvocationHandler.java:127)
at com.sun.proxy.$Proxy42.getGcpCredential(Unknown Source)
at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.getDatastore(DatastoreIO.java:335)
at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.createReader(DatastoreIO.java:320)
at com.google.cloud.dataflow.sdk.io.DatastoreIO$Source.createReader(DatastoreIO.java:186)
at com.google.cloud.dataflow.sdk.runners.dataflow.BasicSerializableSourceFormat.evaluateReadHelper(BasicSerializableSourceFormat.java:259)
... 13 more

You likely do not have the proper credentials. When you execute a Dataflow job from GCE, the service account attached to the instance is what Dataflow uses for authorization.
Did you do the following when creating your machine?
- Create a service account for the instance on GCE? https://cloud.google.com/compute/docs/authentication#using
- Set the required scopes for using Dataflow, such as storage, compute, and BigQuery? https://www.googleapis.com/auth/cloud-platform
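As for the crash in attempt 3: the InstanceBuilder frames in the stack trace show the SDK building the factory reflectively through a static fromOptions(PipelineOptions) method on the class named in setCredentialFactoryClass, not through a no-arg constructor. A minimal sketch of a factory with that shape, assuming the SDK version from the stack trace (the key-file name is a placeholder, and the cloud-platform scope is the one linked above):

import java.io.IOException;
import java.io.InputStream;
import java.security.GeneralSecurityException;
import java.util.Collections;

import com.google.api.client.auth.oauth2.Credential;
import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.cloud.dataflow.sdk.options.PipelineOptions;
import com.google.cloud.dataflow.sdk.util.CredentialFactory;

public class GoogleCredentialProvider implements CredentialFactory {

    // The "Unable to find factory method ...#fromOptions" error reports the
    // absence of exactly this static method.
    public static GoogleCredentialProvider fromOptions(PipelineOptions options) {
        return new GoogleCredentialProvider();
    }

    @Override
    public Credential getCredential() throws IOException, GeneralSecurityException {
        // "service-account-key.json" is a placeholder for the downloaded key file.
        InputStream credStream =
                getClass().getClassLoader().getResourceAsStream("service-account-key.json");
        GoogleCredential credential = GoogleCredential.fromStream(credStream);
        // Keys loaded from a stream carry no scopes, so scope them explicitly.
        return credential.createScoped(
                Collections.singletonList("https://www.googleapis.com/auth/cloud-platform"));
    }
}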

Related

ChainableTemporaryCredentials getPromise and Missing credentials in config, if using AWS_CONFIG_FILE

I have a Node application deployed in GCP.
The application includes code to access resources in the AWS cloud.
For this purpose it uses the AWS SDK with ChainableTemporaryCredentials.
The relevant code lines are:
const credentials = new ChainableTemporaryCredentials({
  params: {
    RoleArn: `arn:aws:iam::${this.accountId}:role/${this.targetRoleName}`,
    RoleSessionName: this.targetRoleName,
  },
  masterCredentials: new WebIdentityCredentials({
    RoleArn: `arn:aws:iam::${this.proxyAccountId}:role/${this.proxyRoleName}`,
    RoleSessionName: this.proxyRoleName,
    WebIdentityToken: token,
  }),
})
await credentials.getPromise()
The WebIdentityToken was received from Google and looks good.
On the AWS side I created a proxy role (the masterCredentials RoleArn above).
However, at runtime I get the error:
Missing credentials in config, if using AWS_CONFIG_FILE, set AWS_SDK_LOAD_CONFIG=1
I do not understand this error. Because my application runs in GCP and I use temporary credentials, I do not understand why I should need AWS credentials in the form of a credentials file or environment variables like AWS_ACCESS_KEY_ID or AWS_SECRET_ACCESS_KEY. I thought the whole idea of using ChainableTemporaryCredentials was NOT to have direct AWS credentials. Right?
You can see the public code at:
https://github.com/cloud-carbon-footprint/cloud-carbon-footprint/blob/trunk/packages/aws/src/application/GCPCredentials.ts
and documentation regarding env-variables at:
https://www.cloudcarbonfootprint.org/docs/configurations-glossary/
Any help which leads to understanding of this error message is welcome.
Thomas
Solved it. The message "Missing credentials in config, if using AWS_CONFIG_FILE, set AWS_SDK_LOAD_CONFIG=1" was totally misleading. In reality it was a problem with the field names in the GCP JWT token and the policy in AWS. See https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_iam-condition-keys.html#ck_aud

How to add a new app setting to Azure Web App using pulumi without removing the existing settings?

I'm using Pulumi Azure Native for infrastructure as code. I need to create an Azure Web App (based on an App Service Plan) and add some app settings (and connection strings) throughout the code, e.g., an Application Insights instrumentation key, a Blob Storage account name, etc.
I figured out that there is a resource, WebAppApplicationSettings, that can update web app settings:
from pulumi_azure_native import web

web_app = web.WebApp(
    'my-web-app-test123',
    ...
)

web.WebAppApplicationSettings(
    'myappsetting',
    name=web_app.name,
    resource_group='my-resource-group',
    properties={'mySetting': 123456},
    opts=ResourceOptions(depends_on=[web_app])
)
It turns out that WebAppApplicationSettings replaces the entire app settings with the value given in the properties parameter, which is not what I need. I need to append a new setting to the existing settings.
So, I tried this:
1. Fetch the existing settings from the web app using list_web_app_application_settings_output
2. Add the new settings to the existing settings
3. Update the app settings using WebAppApplicationSettings
from pulumi_azure_native import web

web_app = web.WebApp(
    'my-web-app-test123',
    ...
)

current_apps_settings = web.list_web_app_application_settings_output(
    name=web_app.name,
    resource_group_name='my-resource-group',
    opts=ResourceOptions(depends_on=[web_app])
).properties

my_new_setting = {'mySetting': 123456}

new_app_settings = Output.all(current=current_apps_settings).apply(
    lambda args: my_new_setting.update(args['current'])
)

web.WebAppApplicationSettings(
    'myappsetting',
    name=web_app.name,
    resource_group='my-resource-group',
    properties=new_app_settings,
    opts=ResourceOptions(depends_on=[web_app])
)
However, this doesn't work either and throws the following error during pulumi up:
Exception: invoke of azure-native:web:listWebAppApplicationSettings failed: invocation of azure-native:web:listWebAppApplicationSettings returned an error: request failed /subscriptions/--------------/resourceGroups/pulumi-temp2/providers/Microsoft.Web/sites/my-web-app-test123/config/appsettings/list: autorest/azure: Service returned an error. Status=404 Code="ResourceNotFound" Message="The Resource 'Microsoft.Web/sites/my-web-app-test123' under resource group 'pulumi-temp2' was not found. For more details please go to https://aka.ms/ARMResourceNotFoundFix"
error: an unhandled error occurred: Program exited with non-zero exit code: 1
Is there a way to add a new app setting to an Azure Web App using Pulumi without changing/removing the existing settings?
Here's a suboptimal workaround: use App Configuration and enable Azure Function dynamic configuration.
As far as I can tell, it comes with some drawbacks:
- cold start time may increase
- additional costs
- care must be taken to avoid redundant calls (costly)
- additional boilerplate code needed for every function app
Maybe there's a better way (I hope there is); I just haven't found it yet either.
After some searching and reaching out to the pulumi-azure-native people, I found an answer:
the Azure REST API doesn't currently support this feature, i.e., updating a single Web App setting apart from the others, so there isn't such a feature in pulumi-azure-native either.
As a workaround, I kept all the app settings I needed added, updated, or removed in a dictionary throughout my Python script, and then passed them to the web.WebAppApplicationSettings resource at the end of the script so they are applied all at once to the Web App resource. This is how I solved my problem.

Spring Boot/Micrometer sending metrics to GCP Stackdriver

I'm trying to implement a simple solution to send HTTP request metrics to Stackdriver in GCP from my API hosted on a Compute Engine instance.
I'm using a recent version of Spring Boot (2.1.5). I've also pulled in the actuator and micrometer-registry-stackdriver packages; actuator works for the health endpoint at the moment, but I am unclear on how to implement metrics.
In the past (separate project, different stack), I mostly used the auto-configured elements with Influx. Using management.metrics.export.influx.enabled=true and some other properties in a properties file, it was a pretty simple setup (though it is quite possible the lead on my team did some of the heavy lifting while I wasn't aware).
Despite pulling in the Stackdriver dependency, I don't see any properties for Stackdriver. The documentation is all generalized, so I'm unclear on how to do this for my use case. I've searched for examples and can find none.
From the docs: "Having a dependency on micrometer-registry-{system} in your runtime classpath is enough for Spring Boot to configure the registry."
I'm a bit of a noob, so I'm not sure what I need to do to get this to work. I don't need any custom metrics really; I'm just trying to get some metrics data to show up.
Does anyone have or know of any examples in setting this up to work with Stackdriver?
It seems the feature for enabling Stackdriver Monitoring on COS is currently in alpha. If you are down to try a GCE COS VM with the agent, you can request access via this form. Curiously, I was able to install the monitoring agent during instance creation as a test. I used the COS image Container-Optimized OS 75-12105.97.0 stable.
Inspecting COS, the collectd agent seems to be installed under /etc/stackdriver/monitoring.config.d, and inspecting my Monitoring agent dashboard, I can see activity from the VM (CPU usage, etc.). I'm not sure if this is what you're trying to achieve, but hopefully it points you in the right direction.
From my understanding, you are trying to monitor third-party software that you built and get the results into GCP Stackdriver. If that's right, I would suggest implementing the Stackdriver monitoring agent [1] on your VM instance, including the Stackdriver API output plugin. This agent gathers system and third-party application metrics and pushes them to a monitoring system like Stackdriver.
The Stackdriver monitoring agent is based on the open-source collectd daemon, so let me also share the documentation from its website [2].
Prior to Spring Boot 2.3, Stackdriver is not supported out of the box, but it doesn't take much configuration to make it work.
@Bean
StackdriverConfig stackdriverConfig() {
    return new StackdriverConfig() {
        @Override
        public String projectId() {
            return MY_PROJECT_ID;
        }

        @Override
        public String get(String key) {
            return null;
        }
    };
}

@Bean
StackdriverMeterRegistry meterRegistry(StackdriverConfig stackdriverConfig) {
    return StackdriverMeterRegistry.builder(stackdriverConfig).build();
}
https://micrometer.io/docs/registry/stackdriver
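Once those two beans are in place, meters registered through Micrometer's normal API should flow to Stackdriver. A minimal sketch, assuming the stackdriverConfig bean above (the meter name and tag are made up for illustration):

import io.micrometer.core.instrument.Counter;
import io.micrometer.stackdriver.StackdriverMeterRegistry;

// Build (or inject) the registry from the config bean above.
StackdriverMeterRegistry registry = StackdriverMeterRegistry.builder(stackdriverConfig).build();

// Register a counter and bump it; the registry pushes to Stackdriver on its
// configured publishing interval.
Counter requests = Counter.builder("http.server.custom.requests") // illustrative name
        .tag("endpoint", "/api")                                  // illustrative tag
        .register(registry);
requests.increment();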

Wirecloud FI-Ware Testbed compatibility

I was wondering if WireCloud offers complete support for object storage with the FI-WARE Testbed instead of FI-LAB. I have successfully integrated WireCloud with the Testbed and have developed a set of widgets that are able to upload/download files to specific containers in the Testbed with success. However, the same widgets do not seem to work in FI-LAB, as I get a 500 error when trying to retrieve the auth tokens (also with the well-known object-storage-test widget), containing the following response:
SyntaxError: Unexpected token
at Object.parse (native)
at create (/home/fiware/fi-ware-keystone-proxy/controllers/Token.js:343:25)
at callbacks (/home/fiware/fi-ware-keystone-proxy/node_modules/express/lib/router/index.js:164:37)
at param (/home/fiware/fi-ware-keystone-proxy/node_modules/express/lib/router/index.js:138:11)
at pass (/home/fiware/fi-ware-keystone-proxy/node_modules/express/lib/router/index.js:145:5)
at Router._dispatch (/home/fiware/fi-ware-keystone-proxy/node_modules/express/lib/router/index.js:173:5)
at Object.router (/home/fiware/fi-ware-keystone-proxy/node_modules/express/lib/router/index.js:33:10)
at next (/home/fiware/fi-ware-keystone-proxy/node_modules/express/node_modules/connect/lib/proto.js:195:15)
at Object.handle (/home/fiware/fi-ware-keystone-proxy/server.js:31:5)
at next (/home/fiware/fi-ware-keystone-proxy/node_modules/express/node_modules/connect/lib/proto.js:195:15)
I noticed that the token provided in the beginning (to start the transaction) is:
token: Object
    id: "%fiware_token%"
Any idea regarding what might have gone wrong?
The WireCloud instance available at FI-WARE's Testbed is always the latest stable version, while the FI-LAB instance is currently outdated; we're working on updating it as soon as possible. One of the things that changed between those versions is the Object Storage API, so sorry for the inconvenience: you will not be able to use widgets/operators that use the Object Storage in both environments.
Anyway, the response you are getting seems to indicate that the object storage instance you are accessing is not working properly, so you will need to send an email to one of the available mailing lists for help (fiware-testbed-help or fiware-lab-help), telling them what is happening (remember to include your account information, as there are several object storage nodes and some can be up while others are down).
Regarding the strange request body:
"token": {
    "id": "%fiware_token%"
}
This behaviour is normal, as the WireCloud client code has no direct access to the user's IdM token. It's WireCloud's proxy that replaces the %fiware_token% pattern with the real value.

Google Drive/OAuth - Can't figure out how to get re-usable GoogleCredentials

I've successfully installed and run the Google Drive Quick Start application called DriveCommandLine. I've also adapted it a little to GET file info for one of the files in my Drive account.
What I would like to do now is save the credentials somehow and re-use them without the user having to visit a web page each time to get an authorization code. I have checked out this page with instructions to Retrieve and Use OAuth 2.0 credentials. In order to use the example class (MyClass), I have modified the line in DriveCommandLine where the Credential object is instantiated:
Credential credential = MyClass.getCredentials(code, "");
This results in the following exception being thrown:
java.lang.NullPointerException
at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:187)
at com.google.api.client.json.jackson.JacksonFactory.createJsonParser(JacksonFactory.java:84)
at com.google.api.client.json.JsonFactory.fromInputStream(JsonFactory.java:247)
at com.google.api.client.googleapis.auth.oauth2.GoogleClientSecrets.load(GoogleClientSecrets.java:168)
at googledrive.MyClass.getFlow(MyClass.java:145)
at googledrive.MyClass.exchangeCode(MyClass.java:166)
at googledrive.MyClass.getCredentials(MyClass.java:239)
at googledrive.DriveCommandLine.<init>(DriveCommandLine.java:56)
at googledrive.DriveCommandLine.main(DriveCommandLine.java:115)
I've been looking at these APIs (Google Drive and OAuth) for 2 days now and have made very little progress. I'd really appreciate some help with the above error and the problem of getting persistent credentials in general.
This whole structure seems unnecessarily complicated to me. Anybody care to explain why I can't just create a simple Credential object by passing in my Google username and password?
Thanks,
Brian O Carroll, Dublin, Ireland
* Update *
Ok, I've just gotten around the above error and now I have a new one.
The way I got around the first problem was by modifying MyClass.getFlow(). Instead of creating a GoogleClientSecrets object from a JSON file, I used a different version of GoogleAuthorizationCodeFlow.Builder that lets you pass the client ID and client secret directly as Strings:
flow = new GoogleAuthorizationCodeFlow.Builder(
        httpTransport, jsonFactory, "<MY CLIENT ID>", "<MY CLIENT SECRET>", SCOPES)
    .setAccessType("offline")
    .setApprovalPrompt("force")
    .build();
The problem I have now is that I get the following error when I try to use flow (GoogleAuthorizationCodeFlow object) to exchange the authorization code for the Credentials object:
An error occurred: com.google.api.client.auth.oauth2.TokenResponseException: 400 Bad Request
{
"error" : "invalid_scope"
}
googledrive.MyClass$CodeExchangeException
at googledrive.MyClass.exchangeCode(MyClass.java:185)
at googledrive.MyClass.getCredentials(MyClass.java:262)
at googledrive.DriveCommandLine.<init>(DriveCommandLine.java:56)
at googledrive.DriveCommandLine.main(DriveCommandLine.java:115)
Is there some other scope I should be using for this? I am currently using the list of scopes provided with MyClass:
private static final List<String> SCOPES = Arrays.asList(
        "https://www.googleapis.com/auth/drive.file",
        "https://www.googleapis.com/auth/userinfo.email",
        "https://www.googleapis.com/auth/userinfo.profile");
Thanks!
I feel your pain. I'm two months in and still getting surprised.
Some of my learnings...
When you request user permissions, specify "offline=true". This will (sometimes, sic) return a refresh token, which is as good as a password with restricted permissions. You can store this and reuse it at any time (until the user revokes it) to fetch an access token.
My feeling is that the Google SDKs are more of a hindrance than a help. One by one, I've stopped using them and now call the REST API directly.
On your last point: you can (just about) use the Google ClientLogin protocol to access the previous generation of APIs. However, this is totally deprecated and will shortly be turned off. OAuth is designed to give fine-grained control of authorisation, which is intrinsically complex. So although I agree it's complicated, I don't think it's unnecessarily so. We live in a complicated world :-)
Your experience and mine show that the development community still needs a consolidated document and recipes to get this stuff into our rear-view mirrors so we can focus on the task at hand.
Oauth2Scopes is imported as follows:
import com.google.api.services.oauth2.Oauth2Scopes;
You need the jar file google-api-services-oauth2-v2-rev15-1.8.0-beta.jar on your classpath to access that package. It can be downloaded here.
No, I don't know how to get credentials without visiting the authorization URL at least once and copying the code. I've modified MyClass to store and retrieve credentials from a database (in my case, a simple table containing userid, accesstoken and refreshtoken). This way I only have to get the authorization code once; after I get the access/refresh tokens, I can reuse them to make a GoogleCredential object. Here's how I make the GoogleCredential object:
GoogleCredential credential = new GoogleCredential.Builder()
        .setJsonFactory(jsonFactory)
        .setTransport(httpTransport)
        .setClientSecrets(clientid, clientsecret)
        .build();
credential.setAccessToken(accessToken);
credential.setRefreshToken(refreshToken);
Just enter your clientid, clientsecret, accessToken and refreshToken above.
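For what it's worth, once that credential is rebuilt it can be handed straight to the Drive client builder. A rough sketch, assuming the Drive v2 client from the quick start (the file ID and application name are placeholders):

import com.google.api.services.drive.Drive;
import com.google.api.services.drive.model.File;

// httpTransport and jsonFactory are the same objects used to build the credential.
Drive service = new Drive.Builder(httpTransport, jsonFactory, credential)
        .setApplicationName("DriveCommandLine") // placeholder application name
        .build();

// Fetch metadata for a file to confirm the stored tokens still work.
File metadata = service.files().get("someFileId").execute();
System.out.println(metadata.getTitle());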
I don't really have a whole lot of time to separate and tidy up my entire code to post it here, but if you're still having problems, let me know and I'll see what I can do. Although you are effectively asking a blind man for directions: my understanding of this whole system is very sketchy!
Cheers,
Brian
OK, I've finally solved the second problem above, and I now have a working GoogleCredential object with an access token and a refresh token.
I kept trying to solve the scopes problem by modifying the list of scopes in MyClass (the one that manages credentials). In the end I needed to adjust the scopes in my modified version of DriveCommandLine (the one originally used to get an authorization code). I added two scopes from Oauth2Scopes:
GoogleAuthorizationCodeFlow flow = new GoogleAuthorizationCodeFlow.Builder(
        httpTransport, jsonFactory, CLIENT_ID, CLIENT_SECRET,
        Arrays.asList(DriveScopes.DRIVE, Oauth2Scopes.USERINFO_EMAIL, Oauth2Scopes.USERINFO_PROFILE))
    .setAccessType("offline")
    .setApprovalPrompt("force")
    .build();
Adding the scopes for user information allowed me to get the userid later in MyClass. I can now use the userid to store the credentials in a database for re-use (without having to get the user to go to a URL each time). I also set the access type to "offline" as suggested by pinoyyid.
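For completeness, a sketch of the one-time round trip through that flow before the tokens go into the database (REDIRECT_URI and the readCodeFromConsole helper are placeholders, not part of the original post):

import com.google.api.client.googleapis.auth.oauth2.GoogleTokenResponse;

// Send the user to the consent page once; they paste back the authorization code.
String url = flow.newAuthorizationUrl().setRedirectUri(REDIRECT_URI).build();
System.out.println("Open this URL, grant access, then paste the code here: " + url);
String code = readCodeFromConsole(); // hypothetical helper

// Exchange the code for tokens, then store them per userid as described above.
GoogleTokenResponse tokens = flow.newTokenRequest(code).setRedirectUri(REDIRECT_URI).execute();
String accessToken = tokens.getAccessToken();
String refreshToken = tokens.getRefreshToken(); // only returned with access type "offline"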