I am using MATLAB to write data to Firebase, using the following lines of code:
thingSpeakURL = 'https://hybrid-cabinet-265907.firebaseio.com/Ship A/Time Stamp.json';
lat = num2str(42);
lon = num2str(42);
data = struct('lat',lat,'lon',lon);
webwrite(thingSpeakURL,data)
The data is successfully written to Firebase, but it makes my original JSON data a child of a random string generated at run time.
For example, my JSON string is {lat: '40', lon: '40'}, but instead Firebase creates a random string, say "Mxkkllslsll-1112", makes that random string the parent, and writes something like {"Mxkkllslsll-1112": {lat: '40', lon: '40'}} to the database.
Please have a look at the following image. It shows that the data I wrote from MATLAB for Ship A is not written properly (the problem I described above). I want it to look like the data written for Ship B.
I want to write the data without any random string being made the parent. Kindly assist me with that.
This is because webwrite uses the HTTP POST method by default.
As shown in the Firebase Realtime Database REST API documentation, if you do a POST you will push the data and therefore automatically generate a unique key every time a new child is added to the specified Firebase reference (the -MDJVMk..... value we can see in your question).
You need to use the PUT method.
I don't know MATLAB, but a quick look at the documentation shows that you need to use the RequestMethod option with a put value in the weboptions object.
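In MATLAB terms, that would look something like the following (an untested sketch, adapting the snippet from the question):
url = 'https://hybrid-cabinet-265907.firebaseio.com/Ship A/Time Stamp.json';
data = struct('lat', num2str(42), 'lon', num2str(42));
% PUT writes to the path itself instead of pushing under a generated key
options = weboptions('RequestMethod', 'put');
webwrite(url, data, options)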
The above pushed me in the right direction (thanks!), and I had success with the following.
CAUTION: The following will overwrite everything in your database!
url = 'https://***.firebaseio.com/.json';
data.users(1) = struct('first','John','last','Locke');
data.users(2) = struct('first','Thomas','last','Hobbes');
data.users(3) = struct('first','Rene','last','Descartes');
headers = {'Content-Type' 'application/json'; 'Accept' 'application/json'};
options = weboptions('RequestMethod', 'put', 'HeaderFields', headers, 'ArrayFormat', 'json');
response = webwrite(url, data, options);
If your data is stored in a .json file (i.e., you don't want to create structures manually in MATLAB), you can read it using fileread and pass the data in as a string (instead of a structure).
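For example, a minimal sketch (assuming a data.json file in the current folder):
% Read the raw JSON text and send it as-is with a PUT request
jsonString = fileread('data.json');
options = weboptions('RequestMethod', 'put', 'MediaType', 'application/json');
response = webwrite(url, jsonString, options);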
I have a JSON file in Azure Blob storage that I need to parse, inserting rows into SQL using a Logic App.
I am using "Get Blob Content", and my first attempt was to pass the result to "Parse JSON". It returns an error: "InvalidTemplate. Unable to process template language expressions in action 'Parse_JSON' inputs at line '1' and column '2856'".
I found some discussion indicating that the content needs to be converted to a string, so I used "Compose" and edited the code as suggested to
"inputs": "@base64ToString(body('Get_blob_content').$content)"
This works, but then the InvalidTemplate issue gets pushed to the Parse function and I get the InvalidTemplate error there. I have tried wrapping the output in a JSON expression and a few other things, but I just can't get it to parse.
If I take a sample, or even the entire JSON, and put it into the INPUT of the Parse function, it works without issue, but it will not accept the blob content as JSON.
The only thing I have been able to do successfully with the blob content is to take it as a string and update a row in SQL, to later use OPENJSON in SQL... but I run into an issue there that is for another post.
I am at a loss as to what to do.
You don't post much information about your Logic App actions, so maybe you can refer to my flow design. I tested with JSON data containing an array.
Below is a picture of my flow. I'm not using a Compose action; I use decodeBase64(body('Get_blob_content')['$content']) as the Parse JSON content.
And if you select a property from the JSON, you need to set the array index. I set a variable to get a value with body('Parse_JSON')[1]['name'].
You could have a try with this; if it still fails, please provide more information or some sample data so we can test.
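For reference, here is roughly what the Parse JSON action looks like in code view with that expression as its content (the action names match the flow above; the empty schema is a placeholder for your own):
"Parse_JSON": {
    "type": "ParseJson",
    "inputs": {
        "content": "@decodeBase64(body('Get_blob_content')['$content'])",
        "schema": {}
    }
}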
I have a POST API, and I want the methods of JsonSchemaValidator to be used, as I want the whole response to be validated rather than selected parts of the response via assertions.
I have tried to use
matchesJsonSchemaInClasspath("my file name") and
matchesJsonSchema(my file object)
My response comes back as true and the method passes, but there is no checking or validation against my schema file.
public void directLoginWihSchemaValiadtor(){
    File file = new File("C:/Users/abeey/git/SlingAppWebService/Configurations/JsonSchemaValidator_DirectLogin_AWS.json");
    jsonasmap.put("contactNo", "some number");
    jsonasmap.put("loginType", "0");
    jsonasmap.put("appid", "2");
    jsonasmap.put("co*****ode", "IN");
    JsonSchemaFactory jsonSchemaFactory = JsonSchemaFactory.newBuilder()
            .setValidationConfiguration(ValidationConfiguration.newBuilder().freeze())
            .freeze();
    given().contentType(ContentType.JSON).body(jsonasmap)
    .when()
            .post("https://60i*****.execute-api.us-west-2.amazonaws.com/some-api")
    .then().assertThat()
            .body(JsonSchemaValidator.matchesJsonSchema(file))
            .log().all();
    jsonasmap.clear();
}
//body(JsonSchemaValidator.matchesJsonSchemaInClasspath("JsonSchemaValidator_DirectLogin_AWS.json").using(jsonSchemaFactory))
I tried to use jsonSchemaFactory to do this, but I couldn't figure out what to set as the draft version or where to get it from.
I am new to this, so please bear with me if you find this question too simple.
For such cases I usually do the following:
use a schema generator to create the schema for the JSON body (I use the json-schema-generator tool)
put the generated JSON schemas on the classpath (for example in test/resources)
use this code as part of the REST Assured test:
.body(matchesJsonSchemaInClasspath("your_schema_name.json"))
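Putting it together, a minimal sketch of such a test (the endpoint and payload below are placeholders, not taken from the question):
import static io.restassured.RestAssured.given;
import static io.restassured.module.jsv.JsonSchemaValidator.matchesJsonSchemaInClasspath;

// Validate the whole response body against a schema stored in test/resources
given()
    .contentType("application/json")
    .body("{\"loginType\": \"0\"}")
.when()
    .post("/some-api")
.then()
    .assertThat()
    .body(matchesJsonSchemaInClasspath("your_schema_name.json"));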
If you want to make sure that schema validation is working, you can edit schema file and change the type of any required field to something else and see that your test will fail.
You can refer to this post of mine to see some code samples.
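If you do need an explicit JsonSchemaFactory, for instance to pin the draft version asked about above, a sketch along these lines (using the com.github.fge json-schema-validator library that REST Assured wraps) should work:
import com.github.fge.jsonschema.SchemaVersion;
import com.github.fge.jsonschema.cfg.ValidationConfiguration;
import com.github.fge.jsonschema.main.JsonSchemaFactory;

// Pin the schema draft used for validation (DRAFTV4 here; DRAFTV3 also exists)
JsonSchemaFactory factory = JsonSchemaFactory.newBuilder()
        .setValidationConfiguration(
                ValidationConfiguration.newBuilder()
                        .setDefaultVersion(SchemaVersion.DRAFTV4)
                        .freeze())
        .freeze();

// then: .body(matchesJsonSchemaInClasspath("your_schema_name.json").using(factory))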
I'm new to creating JSON and have basic knowledge of Java.
I'm trying to convert database table data to JSON.
I have the option to store the table data in any file format before converting it into JSON.
Here is my table data.
Table: PKGS
Price, pd, Id, Level
1, 266, 59098, 5
2, 247, 59098, 5
I want my table data in this JSON format. It's just an example, to show the nesting level in JSON:
"Id":59098
"pd":266
"Level":5
"price":1
"Id":59098
"pd":247
"Level":5
"price":2
In this JSON there are two loops going on, if I am not wrong. I was able to do it for one loop in ETL, but couldn't do it for two loops.
I am not getting values for reimbursementId and packageId.
I have googled a lot but couldn't find any code or approach for this that I could properly understand.
Here is the code I have tried so far:
import java.io.FileInputStream;
import org.apache.poi.ss.usermodel.*; // Workbook, WorkbookFactory, Sheet
import org.json.*;                    // JSONObject, JSONArray

FileInputStream inp = new FileInputStream("D:/json.xlsx");
Workbook workbook = WorkbookFactory.create(inp);
Sheet sheet = workbook.getSheetAt(0); // first sheet holds the table data
JSONObject json = new JSONObject();
JSONArray rows = new JSONArray();
but I don't know what to do next!
Can anyone tell me how to do this?
I advise you to use a dedicated tool; this is not a new problem.
Try Talend Open Studio. It is not so complicated to use if you want to convert a file (CSV, JSON, a database directly, etc.) to another format. Please see TalendForge for the basics.
In your case, you can connect to your database and send all the data as JSON.
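As a rough sketch of the job layout (the component names are from memory and depend on your database, so treat them as assumptions and check your Talend palette):
tMysqlInput --row--> tFileOutputJSON
tMysqlInput runs a SELECT against the PKGS table, and tFileOutputJSON writes the incoming rows out as a JSON file.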
Edit:
Your representation does not follow the usual logic of JSON. Here is how I see it (and this is probably wrong, because I can't be sure what you intend):
If you just want Excel to JSON without any changes:
{
  "rows": [
    {
      "Price": "1",
      "pd": "266",
      "Id": "59098",
      "Level": "5"
    },
    {
      "Price": "1",
      "pd": "266",
      "Id": "59098",
      "Level": "5"
    },
    //and again and again
    {
      "Price": "2",
      "pd": "247",
      "Id": "59098",
      "Level": "5"
    }
  ]
}
If you want to reorganize the data, then define what you want. Try to imagine a sample of your data as a Java object using ArrayList, int, String and subclasses, or even better as a JavaScript object.
For the last example it will give you:
public class myJson{
    ArrayList<myObject> rows;
}
with
public class myObject{
    String Price;
    String pd;
    String Id;
    String Level; // or int, or date, or whatever
}
If you want to reorganize your data model, please give us this model.
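If you would rather stay with plain Java and Apache POI, as in the question, a minimal sketch of the missing loop could look like this (untested; it assumes the first row holds the headers and no cells are blank):
import java.io.FileInputStream;
import org.apache.poi.ss.usermodel.*;
import org.json.*;

public class ExcelToJson {
    public static void main(String[] args) throws Exception {
        try (FileInputStream inp = new FileInputStream("D:/json.xlsx")) {
            Workbook workbook = WorkbookFactory.create(inp);
            Sheet sheet = workbook.getSheetAt(0);
            Row header = sheet.getRow(0); // Price, pd, Id, Level
            JSONArray rows = new JSONArray();
            // One JSON object per data row, keyed by the header cells
            for (int r = 1; r <= sheet.getLastRowNum(); r++) {
                Row row = sheet.getRow(r);
                if (row == null) continue;
                JSONObject obj = new JSONObject();
                for (int c = 0; c < header.getLastCellNum(); c++) {
                    obj.put(header.getCell(c).getStringCellValue(),
                            row.getCell(c).toString()); // everything as text, for simplicity
                }
                rows.put(obj);
            }
            JSONObject json = new JSONObject();
            json.put("rows", rows);
            System.out.println(json.toString(2)); // pretty-print with a 2-space indent
        }
    }
}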
Converting Excel data to JSON format is a somewhat complex process; it depends on the structure of the data, and so far there is no exact online tool for it.
Custom code is required. There are various technologies available, but the best suited may be VBA, because VBA runs inside Excel and can generate the JSON file quickly, and the code is more flexible to edit than any other technology that has to automate Excel, import the data, and then process it.
There are various websites that provide code to generate JSON from Excel data; here is one such site that looks to have expertise in this area: http://www.xlvba.net/tools/excel-automation-to-convert-excel-data-to-json-format.html
I have an AngularJS/Web API wizard-style app where you go step by step, type in some information, and at the end get an answer.
I would like to automate this using a data source, but the problem is that every page passes a giant JSON object back to the server, with the parameters changing as the user goes through the app.
So, for example, one of the parameters entered is a ZIP code. But to use the data source concept as demoed, I would need to create a CSV file which would be like
, , etc..
And that Body/String Body does not have an option to add a data source...
Any ideas?
The StringBody fields can contain context parameters. They can be edited via the properties panel of the relevant part of the request in Visual Studio. They can be set to text in the style
some text {{ContextParameter1}} more text {{ContextParameter2}} even more text
The items with doubled curly braces will be replaced with the named context parameters; the rest is taken from the original string body. Values from the data source are made available as context parameters and so can be included. You may need to set the "Select columns" property of the data source to "Select all columns" to make all values available; the default is just those that are explicitly bound.
Use this method to parameterise sections of the recorded string bodies.
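For example, a recorded string body such as
{"zipCode": "30301", "age": 42}
could be edited to
{"zipCode": "{{ZipCode}}", "age": {{Age}}}
where ZipCode and Age are hypothetical column names from the attached data source.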
It is also possible to edit the ".webtest" file directly; it is just an XML representation of the test. However, all the string bodies I have seen are 16-bit values (i.e., 16 bits per character) that are then base64-encoded.
You should be able to set up an extraction rule on the first response. This value will be stored in a context variable for use in subsequent requests.
e.g.
Request1, with either no variables or data-sourced variables
Response1, from which myVariable is extracted
Request2, utilising {{myVariable}} extracted previously
Response2, from which further values are extracted, and so on
In each subsequent request, replace the relevant section of the JSON in the string body with {{myVariable}}.
I'm trying to pass data from a FSharp.Data.CsvProvider to a Deedle.Frame.
I'm fairly new to F#, and I need to convert some CSV files from the "it-IT" culture to "en-US" so I can use the data.
I found Deedle, and I want to learn how to use it, but I was not able to load the data from a CSV file directly into Deedle (at least, that is what the output in F# Interactive suggests).
I noticed that the CsvProvider performs the conversion, but after some days of attempts I am still not able to pass the data across.
I believe that Deedle should be able to deal with CSV files that use a non-US culture:
let frame = Frame.ReadCsv("C:\\test.csv", culture="it-IT")
That said, if you want to use the CSV type provider for some reason, you can use:
let cs = new CsvProvider<"C:/data/fb.csv">()
cs.Rows
|> Frame.ofRecords
|> Frame.indexColsWith cs.Headers.Value
This uses Frame.ofRecords, which creates a data frame from any .NET collection and expands the properties of the objects as columns. The CSV provider represents data as tuples, so this does not name the headers correctly - but the Frame.indexColsWith function lets you name the headers explicitly.
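If the end goal is just re-culturing the CSV files, a possible sketch is below (one assumption to verify: I believe SaveCsv takes a CultureInfo for its culture parameter, unlike ReadCsv's string culture argument):
open System.Globalization
open Deedle

// Read using the Italian culture, then write back out using the US culture
let df = Frame.ReadCsv("C:\\test.csv", culture = "it-IT")
df.SaveCsv("C:\\test-en.csv", culture = CultureInfo.GetCultureInfo("en-US"))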