Where can I find reference documentation for OpenShift YAML definitions?

I can find fragments of OpenShift YAML definitions in the OpenShift documentation, for example in this document.
But where can I find the complete reference documentation for OpenShift YAML definitions?

The API reference. For example, https://docs.openshift.com/container-platform/4.11/rest_api/index.html
That said, because optional components and custom CRDs extend the list of APIs, you should also learn how to access the API definitions via oc api-resources and oc explain: these list the APIs available on your particular cluster.

Related

How do I quickly list all Google Cloud projects in an organization?

I would like to quickly list all Google Cloud projects in an organization, without the Apps Script folders.
gcloud projects list can be very slow. This documentation is about speeding it up, but it does not show how to retrieve the Apps Script folder that is used for filtering. Can that be done from the command line?
Also, gcloud projects list has no way to filter by organization. It seems that this is impossible, as projects are not linked to their organization except through a tree of folders.
The documentation shows a way of walking the tree, apparently with the Resource Manager API, which might do the job, but only pseudocode is shown. How can this be done with gcloud -- or else with Python or another language?
And if there is no way to accelerate this: how do I page through results using gcloud projects list? The documentation shows that page-size can be set, but does not show how to step through page by page (presumably by sending a page number with each command).
See below for a reference to code I wrote, which is imperfect but the best solution I could find.
Unfortunately, there isn’t a native Apps Script resource available to work with the Cloud Resource Manager API.
However, it is possible to make an HTTP call directly to the Resource Manager API projects.list() endpoint with the help of the UrlFetchApp service.
Alternatively, using Python as mentioned, the recommended Google APIs client library for Python supports calls to the Resource Manager API. You can find the specific projects.list() method documentation here.
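Regarding the paging question: projects.list() returns results in pages linked by a nextPageToken. A minimal sketch of the accumulation loop, where fetch_page is a hypothetical stand-in for service.projects().list(pageToken=...).execute():

```python
def list_all_projects(fetch_page):
    """Collect projects across all pages of a projects.list-style response.

    fetch_page(page_token) must return a dict shaped like the Resource
    Manager response: {"projects": [...], "nextPageToken": "..."},
    where nextPageToken is absent on the last page.
    """
    projects, token = [], None
    while True:
        response = fetch_page(token)
        projects.extend(response.get("projects", []))
        token = response.get("nextPageToken")
        if not token:
            return projects
```

As far as I can tell, gcloud does this token walk for you internally (page-size only changes the size of each request), which is why there is no flag to step through page by page yourself.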
As an additional note, if you happen to use a Cloud project to generate credentials and authenticate the API call, you may want to enable the Cloud Resource Manager API on your project by following this URL.
I’d also recommend submitting a new Feature Request using this template.
Here is some code that lists projects in an organization as quickly as possible. It is in Clojure, but it uses Java APIs and you can translate it easily.
Key steps
Query all accessible projects with CloudResourceManager projects(), using setQuery to accelerate the query by filtering out, for example, the hundreds of sys- projects often generated by Apps Script. The query uses paging.
From the results:
Accept those that are the child of the desired org.
Reject those that are the child of another org.
For those that are the child of a folder, do this (concurrently, for speed): use gcloud projects get-ancestors $PROJECT_ID to determine whether the project is in your organization. (I don't see a way to do that in Java, and so I call the CLI.)
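The steps above can be sketched as follows, with the project list passed in as plain data and get_ancestors as a hypothetical callable standing in for shelling out to gcloud projects get-ancestors:

```python
from concurrent.futures import ThreadPoolExecutor

def in_org(project, org_id, get_ancestors):
    """Decide whether one project belongs to the organization.

    project: dict with a "parent" entry, e.g. {"type": "organization",
    "id": "123"} -- the shape projects.list returns.
    get_ancestors: callable returning the ancestor resource IDs of a
    project (in real life, `gcloud projects get-ancestors PROJECT_ID`).
    """
    parent = project.get("parent", {})
    if parent.get("type") == "organization":
        # Direct child of an org: accept or reject immediately.
        return parent.get("id") == org_id
    # Child of a folder: walk the ancestry to find the owning org.
    return org_id in get_ancestors(project["projectId"])

def org_projects(projects, org_id, get_ancestors):
    # Ancestor lookups are slow CLI calls, so run them concurrently.
    with ThreadPoolExecutor() as pool:
        flags = pool.map(lambda p: in_org(p, org_id, get_ancestors), projects)
        return [p for p, keep in zip(projects, flags) if keep]
```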

Google Drive API key that is read-only

Our server is customer-deployed and uses a Google Drive API key to obtain a tutorial file listing via
https://www.googleapis.com/drive/v3/files?q=%27FILE_ID%27+in+parents+and+trashed=false&maxResults=1000&key=API_KEY&fields=files(name,webViewLink,id,kind,mimeType)
and file contents via
https://www.googleapis.com/drive/v3/files/FILE_ID/export?key=API_KEY
It is unclear, though, how we can set that API key to be read-only.
I do not see anything on these pages, for example:
https://developers.google.com/drive/api/guides/about-auth
https://cloud.google.com/docs/authentication/api-keys
The restrictions you can set on an API key can be found here; none of them make a key read-only, so it is not possible to do it this way. Instead, the way to achieve what you are trying to do is to set up the project with the correct OAuth scopes and use read-only scopes, though that can limit your implementation, as sometimes the API needs more scopes.
For example, if you were trying to list users using the Directory API, you can see the list of scopes needed here. If you check the list, you will see that there is a read-only scope listed there.
So, in your project you could use just this specific scope for your implementation, but again, some actions require more than this one scope to work, so you could be limited by that as well, depending on what your implementation is doing.
The same applies to the Drive API in your case. The list of scopes for the Files: list method is here, and it also includes read-only scopes, as you can see in the image below.
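In other words, to make the credential itself read-only, request only the drive.readonly scope when authorizing. A sketch of building the consent URL (the client ID and redirect URI are placeholders; the endpoint and scope strings are Google's published values):

```python
from urllib.parse import urlencode

# Real Drive read-only scope; listing and exporting work, writes are refused.
DRIVE_READONLY = "https://www.googleapis.com/auth/drive.readonly"

def consent_url(client_id, redirect_uri):
    """Build an OAuth 2.0 consent URL that requests only read access."""
    params = {
        "client_id": client_id,          # placeholder value in the test below
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": DRIVE_READONLY,
        "access_type": "offline",
    }
    return "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)
```

The resulting access token can then only perform read operations against Drive, which an API key restriction cannot express.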

Is there any systematic way to find the minimum access right or role required for each Azure CLI command?

I am working on a project in which I need to define the exact minimum security role for each operation.
Is there any systematic way, or documentation, to find the minimum access right or role required for each Azure CLI command?
Well, there is no systematic way or doc to find it directly; it takes some experience and testing. You could refer to the approach below, which applies to most situations.
Azure CLI commands essentially call the Azure REST API. You can use the --debug parameter with a CLI command to find the API the command calls.
For example, I use the az vm list to list all the VMs in a resource group.
az vm list -g <group-name> --debug
Then you will find it calls the Virtual Machines - List API. Next, search for the resource provider and resource type, i.e. Microsoft.Compute/virtualMachines, in this doc; we can easily find Microsoft.Compute/virtualMachines/read. Here you need some experience; in my view, that action permission should be correct.
Then you can create a custom role with this action and test it, changing the permissions depending on the result. In most situations, if you don't have enough permissions to do the operation, the error message will include the action permission you need.
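To put the discovered action into a custom role, a sketch (the role name and subscription scope are placeholders; the dict matches the JSON shape that az role definition create expects):

```python
import json

def custom_role(name, assignable_scope, actions):
    """Build a least-privilege custom role definition for Azure RBAC."""
    return {
        "Name": name,
        "IsCustom": True,
        "Description": "Minimum permissions discovered via --debug",
        "Actions": actions,        # e.g. the action found for az vm list
        "NotActions": [],
        "AssignableScopes": [assignable_scope],
    }

role = custom_role("VM Reader (minimal)",
                   "/subscriptions/<subscription-id>",
                   ["Microsoft.Compute/virtualMachines/read"])
print(json.dumps(role, indent=2))
```

Save the JSON to a file, create the role with az role definition create --role-definition @role.json, assign it to a test principal, and iterate on the Actions list based on any permission errors.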

AWS WorkDocs SDK - How to Search a Folder?

Given the ID for a folder in AWS WorkDocs, how can I search that folder for a file or sub-folder with a specific name, using the SDK? And can such a search be recursive (deep) rather than shallow?
Is there a better way besides fetching the metadata for all of the items and stopping once there's a match? It appears not, from this quote on a page that provides a Python example:
Note that this code snippet searches for the file only in the current folder in the user’s MyDocs. For files in subfolders, the code needs to iterate over the subfolders by calling describe_folder_contents() on them and performing the lookup.
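That iteration can be sketched as a depth-first walk. Here client stands in for boto3.client("workdocs"); the response keys used (Folders, Documents, LatestVersionMetadata, Marker) are the ones describe_folder_contents returns:

```python
def find_by_name(client, folder_id, name, recursive=True):
    """Depth-first search of a WorkDocs folder for an item named `name`.

    Returns the matching folder/document metadata dict, or None.
    Paginates with Marker, as describe_folder_contents does.
    """
    marker = None
    while True:
        kwargs = {"FolderId": folder_id}
        if marker:
            kwargs["Marker"] = marker
        page = client.describe_folder_contents(**kwargs)
        for folder in page.get("Folders", []):
            if folder.get("Name") == name:
                return folder
        for doc in page.get("Documents", []):
            # Document names live on the latest version's metadata.
            if doc.get("LatestVersionMetadata", {}).get("Name") == name:
                return doc
        if recursive:
            for folder in page.get("Folders", []):
                hit = find_by_name(client, folder["Id"], name)
                if hit:
                    return hit
        marker = page.get("Marker")
        if not marker:
            return None
```

This is still the fetch-everything-and-match approach the quote describes; the SDK offers no server-side name search on folder contents as far as I can find.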
I see that the pricing schedule mentions search ...
• $58/10K calls for SEARCH requests ($0.0058/call)
... but neither the API reference nor the FAQ mentions search in the answer to "What specific actions can be taken on Amazon WorkDocs content programmatically using the Amazon WorkDocs SDK?" -- The FAQ says:
The Amazon WorkDocs SDK allows you to perform create, read, update, and delete (CRUD) actions on WorkDocs’ users, folders, files, and permissions. You can access and modify file attributes, tag files, and manage comments associated with files.
In addition to API actions, you can also subscribe to notifications that Amazon WorkDocs sends with Amazon SNS. The detailed information, including syntax, responses and data types for the above actions, is available in the WorkDocs API Reference Documentation.
The labelling API might be the answer...
The labelling API allows you to tag files and folders so that you can better organize them, and to use tags when searching for files programmatically.
... but I'm having trouble finding an example, or even which classes comprise the "labelling API". Are they referring to the package software.amazon.awssdk.services.resourcegroupstaggingapi?
Description
Resource Groups Tagging API
A tag is a label that you assign to an AWS resource. A tag consists of a key and a value, both of which you define. For example, if you have two Amazon EC2 instances, you might assign both a tag key of "Stack." But the value of "Stack" might be "Testing" for one and "Production" for the other.
Tagging can help you organize your resources and enables you to simplify resource management, access management and cost allocation.
You can use the resource groups tagging API operations to complete the following tasks:
Tag and untag supported resources located in the specified Region for the AWS account.
Use tag-based filters to search for resources located in the specified Region for the AWS account.
List all existing tag keys in the specified Region for the AWS account.
List all existing values for the specified key in the specified Region for the AWS account.
In the list of supported resources on that page, it lists S3 (buckets only) and WorkSpaces, but there's no mention of WorkDocs. Is this what I'm looking for?

How to create proxies with wild card paths in Azure api manager?

Good Afternoon,
I have a situation where three swagger files have different resources but belong to the same domain. I can't merge them into a single swagger, as we have many such scenarios, and managing a single swagger and a single API proxy would be a big overhead.
For example:
I have 3 APIs with the following paths and resources:
/supermarket/v1/aisles/{aisleId}/itemcategories
/supermarket/v1/aisles/{aisleId}/itemcategories/{itemcategoryId}/seasonedvegetabletypes
/supermarket/v1/aisles/{aisleId}/itemcategories/{itemcategoryId}/seasonedvegetabletypes/{vegetablestypeId}/apples
All the above 3 should be in 3 different swagger files, so I need to create 3 api proxies for the above.
Since the path suffix is the same for all of them ("/supermarket"), Azure API Management will not allow creating another API proxy with the same path suffix, as it must be unique.
To achieve this in Apigee Edge (Google's API management product), I would have the base paths as below:
/supermarket/v1
/supermarket/v1/aisles/*/itemcategories/*
/supermarket/v1/aisles/*/itemcategories/*/seasonedvegetabletypes
so that I can avoid the unique path constraint also achieve creating 3 api proxies.
But Azure API Management does not accept "wildcard" entries in the API Path Suffix field when creating the API proxy.
Note:
You may suggest that combining the 3 APIs into a single swagger file would solve the issue, but the example I gave above is only 30% of the swagger, and we have many such paths that fall into a single business domain, so we must keep them in different swagger files and different API proxies.
We should be able to deploy different API proxies with the same path suffix, by allowing wildcards or regex in the API Path Suffix.
Your help to resolve this is highly appreciated. Thanks.
At this point this is a limitation that can't be worked around. The only way to make APIM serve those URIs is to put all of them under a single API, which is not what you want, unfortunately.