Retrieve the credentials.zip file from GenerateAutonomousDataWarehouseWalletDetails

We are trying to download the wallet credentials.zip file for an Autonomous Data Warehouse via the Python SDK.
When we do the same operation using the OCI CLI, there is a --file option:
oci db autonomous-data-warehouse generate-wallet --autonomous-data-warehouse-id <ocid> --password <my_admin_password> --file <filename.zip>
We are trying the same thing using the Python SDK, but we do not see an option to download the zip file. We are executing the code below:
wallet = database_client.generate_autonomous_data_warehouse_wallet("<ocid>", password)
We get a 200 response.
But how do we download the zip file?
We tried wallet.data and wallet.headers, but we're not sure which attribute to use.
Would be great if someone could help us on this!

According to the Python SDK API reference, this operation returns a "Response object with data of type stream."
So all you need to do is save the response body (wallet.data in your example) to a file with the proper file extension.

Try something like this:
wallet = database_client.generate_autonomous_data_warehouse_wallet(<OCID>, <password>)
with open('<wallet_file>.zip', 'wb') as f:
    for chunk in wallet.data.raw.stream(1024 * 1024, decode_content=False):
        f.write(chunk)
The response object (your wallet) has a data field that needs to be streamed into a zip file.
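For completeness, here is a fuller end-to-end sketch. It assumes the default SDK config file at ~/.oci/config, and that (per the operation name in the question's title) the password is wrapped in a GenerateAutonomousDataWarehouseWalletDetails model rather than passed as a bare string; the OCID and password values are placeholders.

import oci

config = oci.config.from_file()  # assumes the default config at ~/.oci/config
database_client = oci.database.DatabaseClient(config)

# The wallet password goes inside the details model, not as a raw string
details = oci.database.models.GenerateAutonomousDataWarehouseWalletDetails(
    password='<wallet_password>')

wallet = database_client.generate_autonomous_data_warehouse_wallet(
    '<adw_ocid>', details)

with open('wallet.zip', 'wb') as f:
    for chunk in wallet.data.raw.stream(1024 * 1024, decode_content=False):
        f.write(chunk)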

Related

How can I store JSON in Drone and write it to a file without it getting malformed?

Here's the context of what I'm trying to do.
I would like to have a Drone step that runs database migrations against a Google Cloud SQL Postgres instance.
I need to use Cloud SQL Proxy in order to access the database. Cloud SQL Proxy requires you provide a credential file to the proxy.
The problem I'm having is that when I try to echo or printf the environment variable to a file (as suggested here) the JSON comes out malformed.
Note: I've tried adding the JSON via Drone GUI and Drone CLI.
The best solution I found to this problem is to simply base64 encode the JSON before putting it into Drone.
Decode the base64 when you need it in your step.
Example commands:
Encode: base64 data.txt > data.b64
Decode: echo "$CREDS_B64" | base64 --decode > sql-deploy-creds.json
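If you prefer to script the round trip, here is a minimal Python equivalent of the same encode/decode steps (the file names are illustrative):

import base64

# Encode: produce an ASCII string that survives Drone's secret handling
with open('sql-deploy-creds.json', 'rb') as f:
    creds_b64 = base64.b64encode(f.read()).decode('ascii')

# Decode: recover the original JSON, byte for byte
with open('decoded-creds.json', 'wb') as f:
    f.write(base64.b64decode(creds_b64))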

Azure Logic Apps: Read XML from file and Write a JSON with data

I'm new to this and trying to understand Azure Logic Apps.
I would like to create a Logic App that:
Looks for new XML-Files
and for each file:
Read the XML
Check if Node "attachment" is present
and for each Attachment:
Read the Filename
Get the File from FTP and do BASE64-encoding
End for each Attachment.
Write JSON File (I have a schema)
Do an HTTP POST to the API with the JSON file as "application/json"
Is this possible with Logic Apps?
Yes, you can.
Check if a node is present with an xpath expression (e.g. xpath(xml(item()),'string(//Part/@ref)'))
For Get File from FTP, use the action FTP - Get File Content
Write JSON File, use the action Data Operations - Compose. If you need transformations, you have to use an Integration Account and Maps.
Do HTTP Post to API, use the action HTTP
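All of this happens in the Logic Apps designer, but to make the flow concrete, here is a rough Python sketch of the same steps. The file name, XML layout, and API URL are made-up placeholders, and it reads local files where the Logic App would use FTP - Get File Content:

import base64
import json
import urllib.request
import xml.etree.ElementTree as ET

tree = ET.parse('incoming.xml')                          # read the XML
attachments = tree.getroot().findall('.//attachment')    # check if "attachment" nodes are present

parts = []
for att in attachments:                                  # for each attachment
    filename = att.get('filename')                       # read the filename
    with open(filename, 'rb') as f:                      # get the file and BASE64-encode it
        parts.append({'name': filename,
                      'content': base64.b64encode(f.read()).decode('ascii')})

payload = json.dumps({'attachments': parts}).encode('utf-8')   # build the JSON
req = urllib.request.Request('https://example.com/api', data=payload,
                             headers={'Content-Type': 'application/json'})
urllib.request.urlopen(req)                              # HTTP POST as application/json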

Read and write to JSON on x10Hosting

I have an angular 2 project, how can I read and write to a JSON file on my server?
I can do what I want within my code itself, but I don't want to have to change my code, recompile, and upload my website every time.
Any help? Examples are greatly appreciated.
Angular can read the remote JSON file using the HTTP Client but it can't directly write to the remote file.
For writing, you can use a server-side script such as PHP (supported by x10Hosting) to expose a URL that Angular can post to (also using the HTTP Client) to update the JSON.
For example something like this PHP:
$data = json_decode(file_get_contents('./data.json')); // read and decode the JSON
$data->something = $_POST['something']; // update the something property
file_put_contents('./data.json', json_encode($data)); // write back to data.json
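To test such an endpoint without redeploying the Angular app, a quick sketch with Python's standard library works (the URL and field name are placeholders):

import urllib.parse
import urllib.request

# $_POST on the PHP side expects form-encoded data, which urlencode produces
data = urllib.parse.urlencode({'something': 'new value'}).encode('ascii')
urllib.request.urlopen('https://example.com/update.php', data=data)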

how to import strongloop based api into json or yaml specification

I have an API that was generated using LoopBack/StrongLoop, and it is running.
I want to export the generated API into YAML or JSON so that I can reuse it in another application. I'm looking for the swagger.json file.
In Swagger you get your JSON from your running API by going to localhost:3300/api-docs. How do I get it here?
You can do that with
localhost:3300/explorer/resources to get a list of all resources, and localhost:3300/explorer/resource/ModelPluralName to get the swagger for a specific resource.
Also you can click on "Raw" link in your API explorer.
*Assuming of course that your application is running on localhost port 3300.
You can do that simply by going to
localhost:3300/explorer/swagger.json
then download the JSON file and load it into editor.swagger.io using file import, and finally download it as YAML to feed into your REST API application.
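If you'd rather skip the manual editor.swagger.io step, a small Python sketch can fetch the spec and convert it to YAML directly (this assumes the app is on localhost:3300 and that PyYAML is installed):

import json
import urllib.request

import yaml  # PyYAML

with urllib.request.urlopen('http://localhost:3300/explorer/swagger.json') as resp:
    spec = json.load(resp)

with open('swagger.yaml', 'w') as f:
    yaml.safe_dump(spec, f)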

How can I display an XML page instead of JSON, for a dataset

I am using the pycsw extension to produce a CSW file. I have harvested data from one CKAN instance [1] into another [2], and am now looking to run the pycsw 'paster load' command:
paster ckan-pycsw load -p /etc/ckan/default/pycsw.cfg -u [CKAN INSTANCE]
I get the error:
Could not pass xml doc from [ID], Error: Start tag expected, '<' not found, line 1, column 1
I think it is because when I visit this url:
[CKAN INSTANCE 2]/harvest/object/[ID]
It comes up with a JSON file as opposed to an XML (which it is expecting)
I have run the pycsw load command on other CKAN instances and have had no problems with them. They also display an XML file at the URL stated above, so I wanted to know: how do I get CKAN to serve an XML file instead of JSON?
Thanks in advance for any help!
As you've worked out, your datasets need to be in ISO(XML) format to load into a CSW server. A CKAN instance only has a copy of a dataset in ISO(XML) format if it harvested it from a CSW.
If you use the CKAN(-to-CKAN) harvester in the chain then the ISO(XML) record doesn't get transferred with it. So you'd either need to add this functionality to the CKAN(-to-CKAN) harvester, or get rid of the CKAN-to-CKAN harvest step.
Alternatively if the record originated in a CKAN, then it has no ISO(XML) version anyway, and you'd need to create that somehow.
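As a quick way to confirm which format a harvest object URL is serving before running the load, something like this (the URL parts are placeholders) prints the content type and the first bytes:

import urllib.request

resp = urllib.request.urlopen('http://<ckan-instance-2>/harvest/object/<id>')
print(resp.headers.get('Content-Type'))  # e.g. application/json vs text/xml
print(resp.read(100))                    # starts with b'{' for JSON, b'<' for XML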