How to write an object to gcp object store with x-goog-if-generation-match from a cloud function - google-cloud-functions

I'd like to write an object to the GCP object store while using the x-goog-if-generation-match feature. Using the @google-cloud/storage npm library, the file object does not seem to have an option for setting the required object generation.
What are the alternatives?

As you noticed, the @google-cloud/storage npm library doesn't support generation and metageneration preconditions.
As an alternative, you may use either the Storage XML API or the Storage JSON API, both of which support them. Depending on which one you choose, you'll set the preconditions via HTTP headers or via query string parameters. You'll find the whole list of those here.
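To make that concrete, here is a minimal sketch (not an official recipe) of an upload through the Storage JSON API with the ifGenerationMatch precondition passed as a query parameter; the bucket, object name, content type and the google-auth-library token flow are assumptions you would adapt to your setup.

```ts
// Hedged sketch: upload via the Storage JSON API with an ifGenerationMatch precondition.
// Bucket, object name, content type and the auth flow below are illustrative assumptions.
import { GoogleAuth } from 'google-auth-library';

async function uploadIfGenerationMatches(
  bucket: string,
  objectName: string,
  contents: string,
  expectedGeneration: string
): Promise<void> {
  const auth = new GoogleAuth({ scopes: 'https://www.googleapis.com/auth/devstorage.read_write' });
  const token = await auth.getAccessToken();

  const url =
    `https://storage.googleapis.com/upload/storage/v1/b/${bucket}/o` +
    `?uploadType=media&name=${encodeURIComponent(objectName)}` +
    `&ifGenerationMatch=${expectedGeneration}`; // the write is rejected with 412 if the generation differs

  const res = await fetch(url, {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
    body: contents,
  });

  if (res.status === 412) throw new Error('Precondition failed: the object generation has changed');
  if (!res.ok) throw new Error(`Upload failed with status ${res.status}`);
}
```

With the XML API, the same precondition would travel as the x-goog-if-generation-match request header instead of a query parameter.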

Another alternative is to use some kind of optimistic locking (a sketch follows this list):
get the generation id
write the object
get the generation id again
repeat until the generation after = the generation before + 1
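Below is a rough sketch of that loop using @google-cloud/storage; the bucket and object names and the retry limit are illustrative, and the +1 check is taken literally from the steps above.

```ts
// Hedged sketch of the optimistic-locking loop described above, using @google-cloud/storage.
// Bucket/object names and the retry limit are illustrative.
import { Storage } from '@google-cloud/storage';

async function writeWithGenerationCheck(contents: string): Promise<void> {
  const file = new Storage().bucket('my-bucket').file('my-object.json');

  for (let attempt = 0; attempt < 5; attempt++) {
    // 1. read the current generation
    const [before] = await file.getMetadata();
    const generationBefore = Number(before.generation);

    // 2. write the object
    await file.save(contents);

    // 3. read the generation again
    const [after] = await file.getMetadata();
    const generationAfter = Number(after.generation);

    // 4. the steps above assume the generation increments by exactly one per write; GCS generation
    //    numbers are not guaranteed to be sequential, so in practice you may prefer to check that
    //    the generation seen after the write is the one your own write produced
    if (generationAfter === generationBefore + 1) return;
    // otherwise another writer interleaved with us: try again
  }
  throw new Error('Gave up after repeated concurrent writes');
}
```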

Related

dynamically update the request json and send it as multipart form data in karate [duplicate]

In my Karate tests I need to write response IDs to txt files (or any other file format, such as JSON). I was wondering if Karate has any capability to do this; I haven't seen it in the documentation. If not, is there a simple JavaScript function to do so?
Try the karate.write(value, filename) API, but we don't encourage it. Also, the file will be written only to the current "build" directory, which will be target for Maven projects or for the stand-alone JAR.
value can be any data-type, and Karate will write the bytes (or plain-text) out. There is no built-in support for any other format.
Here is an example.
EDIT: for others coming across this answer in the future, the right thing to do is:
don't write files in the first place. You never need to do this; this question is typically asked by inexperienced folks who think that the only way to "save" a response before validation is to write it to a file. Please don't waste your time: just match against the response. You can save it (or parts of it) to variables while you make other HTTP requests. And do not write your tests so that scenarios (or features) depend on other scenarios; this is a very bad practice. Also note that by default, Karate will dump all HTTP requests and responses in the log file (typically target/karate.log) and also in the HTML report.
see if karate.write() works for you, as per this answer (a minimal sketch follows below)
write a custom Java class (or a JS function that uses the JVM) to do what you want, using Java interop
Also note that you can use karate.toCsv() to convert JSON into CSV if needed.
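If karate.write() is enough, the calls can be as simple as the two lines inside the helper below. This is only a hedged sketch: in a real suite the karate object is injected by Karate itself into feature files and JS helpers, the declaration here exists only to keep the snippet self-contained, and the file names are placeholders.

```ts
// Hedged sketch only: `karate` is provided by Karate at runtime inside feature files and *.js helpers;
// the declaration below just keeps this snippet self-contained. File names are placeholders.
declare const karate: {
  write(value: unknown, filename: string): void; // writes under the build directory (e.g. target/)
  toCsv(list: unknown): string;                  // converts a list of JSON objects to CSV text
};

function saveArtifacts(responseId: string, rows: Array<Record<string, unknown>>): void {
  karate.write(responseId, 'ids.txt');           // ends up as target/ids.txt for a Maven project
  karate.write(karate.toCsv(rows), 'rows.csv');  // JSON converted to CSV, as mentioned above
}
```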
My justification for writing to a file is different. I am using Karate explicitly to implement a mock. I want to expose an endpoint to which the upstream system will send some basic data in a JSON payload using POST/PUT; Karate will construct the subsequent payload file and store it in a specific folder, and this newly created payload file will be exposed through another GET call.

Couchbase java-client vs couchbase-client

What is the difference between Couchbase java-client and couchbase-client?
I can see a bulk get operation in java-client but not in couchbase-client.
Is it possible to do a bulk get operation if we use couchbase-client?
By couchbase-client, do you mean the Couchbase REST API?
If so: the Java SDK client connects to the database directly and has many functions to get, create and update documents.
You can map a document to a Java class, and if you want to put your Example.java class directly into a document, Spring has documentation on how to implement this.
Alternatively, with the REST API you can send a N1QL query, and the response contains the documents in JSON format (a sketch follows).
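Here is a hedged sketch of that REST route, fetching several documents in one N1QL query through the query service. The host, port (8093 is the default query-service port), credentials, and bucket name are placeholders for your own deployment.

```ts
// Hedged sketch: bulk-fetch documents over the Couchbase query REST service with a N1QL statement.
// Host, port, credentials and bucket name are placeholders.
async function fetchDocumentsByIds(ids: string[]): Promise<unknown[]> {
  const res = await fetch('http://localhost:8093/query/service', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: 'Basic ' + Buffer.from('username:password').toString('base64'),
    },
    body: JSON.stringify({
      statement: 'SELECT META(b).id, b.* FROM `my-bucket` b USE KEYS $ids',
      $ids: ids, // named parameter carrying the list of document keys
    }),
  });
  const body = await res.json();
  return body.results; // the documents come back as JSON, as described above
}
```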
It depends on the implementation you want.
I recommend that you use the Java SDK.
Good luck.

Skipping precaching: Cannot read property 'concat' of null

Here's my question: How might I try to get rid of the 'skipping precaching' and cache everything that comes in from https://laoadventist.info/beta/r as the precache list?
Also, is it correct for me to set precache="https://laoadventist.info/beta/r" or should I be setting that to a function that grabs the data and returns it instead?
Skipping precaching: Cannot read property 'concat' of null
comes out on the console when using my Polymer app
<platinum-sw-cache default-cache-strategy="fastest" cache-config-file="cache-config.json" precache="https://laoadventist.info/beta/r">
Am I correct in assuming that I can precache a URL like this?
I am trying to load a JSON result from Laravel 5.1 to set what my precache should be. I know it's not the most elegant approach, but I'm new to Polymer, caching, service workers, etc., and I'm using this app as a learning opportunity. It'll be a bit different at the end of the day, but for now I just want to load everything. :)
I want to precache all of the data so that a user can fully utilize this app when offline (though later I'll set it up so that they don't have to precache loads and loads of json requests, only the ones they want, like per book - but that's for later).
If you have an array of resource URLs that you want precached, the cleanest way to specify them is to use the cacheConfigFile option and point to a JSON file that contains your array as its precache property. Take a look at the example in the docs for cacheConfigFile: https://elements.polymer-project.org/elements/platinum-sw?active=platinum-sw-cache
You shouldn't have to use the precache attribute on the element if you're using cacheConfigFile.
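For illustration, a minimal cache-config.json along those lines could contain nothing more than a precache array; the single entry below is just the URL from the question, and you would list every resource you want precached there.

```json
{
  "precache": [
    "https://laoadventist.info/beta/r"
  ]
}
```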
It sounds like you're using Polymer Starter Kit, and that will create the JSON config file and populate it for you with an array corresponding to your local resources. But if you'd like to specify additional resources to be precached, you can modify the build process to append those URLs to the auto-generated list.
The reason you're seeing that error is because you're pointing to a JSON config file that is effectively empty, and is just meant for the development environment.

How to use a JSON API without documentation

This might be a weird question but I am open for all the suggestions.
The background: I want to use a script to automatically deploy/remove Docker containers on Jelastic, but unfortunately this part is not well covered in the official Jelastic API documentation. Jelastic provided me with a piece of sample code demonstrating how to use bash to create a new environment with a new Docker container, but it is not enough; I still don't know how to create/remove a Docker container by looking at the sample code.
Since Jelastic uses a standard JSON API, I am wondering: is there any tool that can automatically retrieve/detect the parameters I can use with the Jelastic JSON API?
If you were me, how would you get around this when there is no documentation to use as a reference?
I am keen to use Jelastic, but this issue has stopped me from onboarding. Many thanks.
J.
All the parameters that can be used with the Jelastic JSON API are specified at http://docs.jelastic.com/api/.
To use a JSON API without documentation, I suggest you check out the Postman API tool: https://www.getpostman.com/. This application lets you see all the sent/received data and lets you pass JSON values without any documentation or additional steps.
The simplest scenario for beginners (sketched in code below): in the API docs, under Users > Authorization, use the Signin method to obtain the session value, which is necessary for almost all further actions. Then get information about the environment under Environment > Environment: first execute the GetEnvs method, then, using the application identifier of the environment obtained from the previous command, execute the GetEnvInfo method. As a result of this scenario you will get all the parameters and values that can be used with the Jelastic JSON API for a certain type of environment.
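A sketch of that Signin, GetEnvs, GetEnvInfo scenario is below. The base URL, request paths and response field names are assumptions modeled on the structure of docs.jelastic.com/api and should be verified there; the login, password and platform domain are placeholders.

```ts
// Hedged sketch of the Signin -> GetEnvs -> GetEnvInfo scenario described above.
// The base URL, request paths and response field names are assumptions; verify them at
// http://docs.jelastic.com/api/. Login, password and platform domain are placeholders.
const BASE = 'https://app.your-jelastic-platform.com/1.0';

async function callJelastic(path: string, params: Record<string, string>): Promise<any> {
  const res = await fetch(`${BASE}/${path}?` + new URLSearchParams(params));
  return res.json();
}

async function inspectEnvironments(): Promise<void> {
  // 1. Signin (Users section): obtain the session value used by almost every other call
  const signin = await callJelastic('users/authentication/rest/signin', {
    login: 'you@example.com',
    password: 'your-password',
  });
  const session: string = signin.session;

  // 2. GetEnvs (Environment section): list the environments of this account
  const envs = await callJelastic('environment/control/rest/getenvs', { session });

  // 3. GetEnvInfo: details for one environment, identified by the value returned from GetEnvs
  //    (the field and parameter names here are assumptions; inspect the actual response and docs)
  const envName: string = envs.infos[0].env.envName;
  const info = await callJelastic('environment/control/rest/getenvinfo', { envName, session });

  console.log(JSON.stringify(info, null, 2));
}
```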

how to pass a value to Odata connector

I am trying to pass a value to my OData connector that I built in VS 2013, so that it returns results based on that value (a number from a table that I have). Could you tell me some steps to resolve this?
You might find the following article on MSDN to be of use: http://msdn.microsoft.com/en-us/library/ff478141.aspx
A more comprehensive set of documentation on OData URL conventions can be found here: http://www.odata.org/documentation/odata-version-3-0/url-conventions (a sketch of this approach follows). Note: this assumes that your OData connection is being instantiated via a URL inside HTML Client JavaScript code.
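As an illustration of those URL conventions, the sketch below passes a number through a $filter query option. The service root, entity set ("Orders") and property ("CategoryId") are placeholders; substitute whatever your connector actually exposes.

```ts
// Hedged sketch: pass a number to an OData service through standard URL conventions ($filter).
// The service root, entity set ("Orders") and property ("CategoryId") are placeholders.
async function getByCategory(categoryId: number): Promise<unknown[]> {
  const serviceRoot = 'https://example.com/MyService.svc';
  const url = `${serviceRoot}/Orders?$filter=CategoryId eq ${categoryId}&$format=json`;

  const res = await fetch(url);
  const body = await res.json();
  return body.value ?? body.d?.results ?? []; // OData v4 returns "value"; older JSON formats use "d.results"
}
```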
If you are using an external application, simply install the Microsoft.OData.Client binary using NuGet and you can write LINQ queries against the connection. See http://msdn.microsoft.com/en-us/library/dd673933(v=vs.110).aspx for additional information about this pattern.
Hope that helps!