As part of a Revit addin that I am running in Design Automation, I need to extract some data from the file, send it in JSON format to an external server for analysis, and use the result to update my Revit file with new features. I was able to satisfy this requirement by following the approach described in https://forge.autodesk.com/blog/communicate-servers-inside-design-automation, which worked as I needed. The problem arises when the size of the data to send for the analysis grows; it results in the following error:
[11/12/2020 07:54:08] Error: Payload for "onProgress" callback exceeds 5120 bytes limit.
When checking my data, it turns out that the payload is around 27000 bytes. Are there other ways to send data from Design Automation for payloads larger than 5120 bytes?
I was unable to find documentation related to the use of ACESAPI: acesHttpOperation
There is no other way at the moment to send data from your work item to another server.
So you would either have to split the data into multiple 5120-byte parts and send them like that, or use two work items: one for getting the data from the file before doing the analysis and one for updating the file afterwards.
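If you go the splitting route, the idea is simply to break the serialized JSON into parts that each stay under the limit, tagged so the receiving server can reassemble them. The addin itself is .NET code, so the sketch below (in JavaScript, to keep one language across this thread) only illustrates the splitting scheme; the chunk size, the part markers and the sendChunk helper are assumptions, not part of the Design Automation API.

// Sketch: split a large JSON payload into parts below the 5120-byte limit,
// tagging each part so the server can put them back together.
function splitIntoChunks(jsonString, maxBytes) {
  const chunks = [];
  let current = "";
  for (const ch of jsonString) {
    // count UTF-8 bytes, not characters, against the limit
    if (Buffer.byteLength(current + ch, "utf8") > maxBytes) {
      chunks.push(current);
      current = ch;
    } else {
      current += ch;
    }
  }
  if (current.length > 0) chunks.push(current);
  return chunks;
}

// hypothetical stand-in for however a single small payload leaves the work item
function sendChunk(message) {
  console.log("part", message.index + 1, "of", message.total, "-",
    Buffer.byteLength(message.data, "utf8"), "bytes");
}

const extractedData = { elements: ["...roughly 27000 bytes of extracted data..."] }; // sample
const payload = JSON.stringify(extractedData);
const parts = splitIntoChunks(payload, 4000); // leave headroom below 5120

parts.forEach((part, index) => {
  sendChunk({ jobId: "analysis-1", index, total: parts.length, data: part });
});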
I am now using JMeter to run tests of APIs.
The situation is that I have a login API which returns a token inside the response. I use a JSON Extractor to save the token as a variable. Then, I use the ${token} in the headers of other requests.
However, I found that when I tried to run 40-50 threads, the ${token} in some threads would be empty, which caused a high error rate.
Therefore, may I ask whether there is any method to solve this, and why it happens?
Thanks very much.
Try saving the full response from the login API; most probably your server gets overloaded, cannot return the token and returns some error message instead.
There are the following options:
If you're running JMeter in command-line non-GUI mode, you can amend JMeter's Results File Configuration to store the results in XML form and include the response data. Add the following lines to the user.properties file:
jmeter.save.saveservice.output_format=xml
jmeter.save.saveservice.response_data=true
and when you run your test next time the .jtl results file will contain response bodies for all the requests.
Another option is using a Listener like the Simple Data Writer, configured to write the response data to a file such as responses.xml; when you run the test, the responses.xml file will contain the response data.
Both the .jtl results file and responses.xml can be inspected using the View Results Tree listener.
More information: How to Save Response Data in JMeter
We are using a data acquisition system as a device and send some signal values via the MQTT protocol into a container which is assigned to an IoT hub. The connection between the device and the IoT hub works well, and we receive some JSON data. When we open the JSON data, we cannot read the temperature values in the "Body" field, since they are encoded. I would be thankful if you could tell us how we should automatically convert the JSON data to a proper format so that we can read the values as numbers.
Please find below three lines of our JSON data. The rest of the lines are the same, except that their bodies are encoded differently.
{"EnqueuedTimeUtc":"2022-02-09T10:00:30.8600000Z","Properties":{"Sensor":""},"SystemProperties":{"connectionDeviceId":"Iba","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637799949903534194","enqueuedTime":"2022-02-09T10:00:30.8600000Z"},"Body":"My42MjI3NTQ="}
{"EnqueuedTimeUtc":"2022-02-09T10:00:30.8750000Z","Properties":{"Sensor":""},"SystemProperties":{"connectionDeviceId":"Iba","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637799949903534194","enqueuedTime":"2022-02-09T10:00:30.8750000Z"},"Body":"My42ODEyNDY="}
{"EnqueuedTimeUtc":"2022-02-09T10:00:30.9070000Z","Properties":{"Sensor":""},"SystemProperties":{"connectionDeviceId":"Iba","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637799949903534194","enqueuedTime":"2022-02-09T10:00:30.9070000Z"},"Body":"My43Mzk1OTI="}
Thanks in advance!
Br
Masoud
You should add two parameters to the message topic, the content type (ct) and the content encoding (ce), as shown in the following example:
devices/device1/messages/events/$.ct=application%2Fjson&$.ce=utf-8
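For reference, the Body values in the captured messages above are just base64-encoded UTF-8 text (the first sample decodes to 3.622754), so on the receiving side they can be decoded directly. A minimal Node.js sketch, with the message trimmed down to the relevant field:

// decode the base64 "Body" field of a captured IoT Hub message
const message = {
  EnqueuedTimeUtc: "2022-02-09T10:00:30.8600000Z",
  Body: "My42MjI3NTQ="
};

const decoded = Buffer.from(message.Body, "base64").toString("utf8");
console.log(decoded);                    // "3.622754"
const temperature = parseFloat(decoded); // numeric value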
I am running a load test with a minimum of 1000 threads in JMeter in command-line mode. But at the end of the execution, I am getting only an aggregated result. What I actually want is the time taken by each thread, in a CSV or in a graph.
Note: the request and response are in JSON.
Please check the Simple Data Writer.
It will help you save the data as CSV. From the listener configuration, you can control what to capture. From the CSV, you can find the time taken per thread.
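If you prefer to drive this through properties rather than the listener GUI (since the test already runs in command-line mode), the relevant result-file settings can go into user.properties; the fields below keep the thread name and the elapsed time of every sample in the CSV, so the times can be grouped per thread afterwards (these particular fields are usually on by default):

jmeter.save.saveservice.output_format=csv
jmeter.save.saveservice.thread_name=true
jmeter.save.saveservice.time=true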
I'm implementing a web service that needs to query a JSON file (size: ~100 MB; format: [{},{},...,{}]) about 70-80 times per second, and the JSON file is updated every hour. "Querying the JSON file" means checking whether there is a JSON object in the file that has an attribute with a certain value.
Currently I think I will implement the service in Node.js and import (mongoimport) the JSON file into a collection in MongoDB. When a request comes in, the service will query the MongoDB collection instead of reading and looking up the file directly. In the Node.js server, there should also be a timer service which every hour checks whether the JSON file has been updated, and if it has, "repopulates" the collection with the data from the new file.
The JSON file is retrieved by sending a request to an external API. The API has two methods: methodA lets me download the entire JSON file; methodB is actually just an HTTP HEAD call, which simply tells whether the file has been updated. I cannot get the incrementally updated data from the API.
My problem is with the hourly update. With the service running, requests are coming in constantly. When the timer detects that there is an update to the JSON file, it will download it, and when the download finishes it will try to re-import the file into the collection, which I think will take at least a few minutes. Is there a way to do this without interrupting the queries to the collection?
Above is my first idea for approaching this. Is there anything wrong with the process? Looking up the file directly just seems too expensive, especially with requests coming in about 100 times per second.
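To make the plan concrete, here is a rough Node.js sketch of the flow described above: an hourly timer calls methodB (the HEAD check), downloads the new file with methodA when something changed, and then repopulates the collection. The URLs, database and collection names are placeholders, and the repopulate step at the end is exactly the part the question worries about, since queries keep arriving while it runs.

// rough sketch of the described flow (placeholder URLs and names throughout)
const { MongoClient } = require("mongodb");

const API_URL = "https://example.com/data";       // methodA: full download
const client = new MongoClient("mongodb://localhost:27017");
let lastModified = null;

async function checkAndReload() {
  // methodB: HEAD request that only tells us whether the file has changed
  const head = await fetch(API_URL, { method: "HEAD" });
  const modified = head.headers.get("last-modified");
  if (modified === lastModified) return;          // nothing new this hour
  lastModified = modified;

  // methodA: download the whole ~100 MB JSON file
  const docs = await (await fetch(API_URL)).json();

  // repopulate the collection -- the slow step the question asks about,
  // because lookups keep hitting the same collection while it runs
  const col = client.db("service").collection("items");
  await col.deleteMany({});
  await col.insertMany(docs);
}

// the query path, hit ~70-80 times per second
function lookup(attribute, value) {
  return client.db("service").collection("items").findOne({ [attribute]: value });
}

client.connect().then(() => {
  setInterval(checkAndReload, 60 * 60 * 1000);    // check once an hour
});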
I am developing an iOS app that uses a single-context architecture. I make frequent calls to my API (PHP) and I want to "cache" the output for as long as the session is active. Right now I am saving the output to a variable that is defined in app.js.
var contacts = {
contactsData: null
};
So I do this to save the output. Is it really a good idea? Will it slow things down?
contacts.contactsData = output;
Thankful for all input!
It depends on how big the JSON file is in MB. If the device has enough RAM, it is the best way. Also be sure you save the decoded JSON, not just the raw request response, so you will not decode it every time.
If the JSON data is too big, you must think about some kind of local storage. If the JSON is always the same (no need to sync every time), save it locally.
If you need to update it often, you can fetch the most urgently needed part with one limited request (API configuration needed) and the other data with a second background request.
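As a small illustration of the first point, the idea is to keep the decoded object in the session variable and only call the API (and JSON.parse) when it is empty; fetchFromApi below is a hypothetical wrapper around your existing API call:

// keep the decoded JSON in memory for the session and parse it only once
var contacts = {
  contactsData: null
};

function cacheContacts(rawResponse) {
  // store the decoded object, not the raw response string
  contacts.contactsData = JSON.parse(rawResponse);
}

function getContactsData(fetchFromApi, callback) {
  if (contacts.contactsData !== null) {
    callback(contacts.contactsData);   // cache hit: no network call, no re-parse
    return;
  }
  fetchFromApi(function (rawResponse) {
    cacheContacts(rawResponse);
    callback(contacts.contactsData);
  });
}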