URL template parameters not working in APIM - azure-api-management

I have a scenario:
I want to connect to my backend APIs by providing the API endpoints as the path.
For example, the APIs would look like the following:
/Measure/Test/Calories?q=*
/Measure/Test/Weight
/Food/Test/IntakeAmount/
/v1/Food/Test/Summary
When I provide the absolute path to the API endpoints it does work, but providing the endpoints via URL template parameters throws a 404 Not Found error.
Also, when I check the trace, the inbound request is not able to find the operation:
> api-inspector (0.008 ms) {
>   "configuration": {
>     "api": {
>       "from": "/testapi",
>       "to": {
>         "scheme": "http",
>         "host": "dev-foodmeasures-summary.com",
>         "port": 80,
>         "path": "/",
>         "queryString": "",
>         "query": {},
>         "isDefaultPort": true
>       },
>       "version": null,
>       "revision": "1"
>     },
>     **"operation": "-"**,
>     "user": {
>       "id": "1",
>       "groups": [
>         "Administrators",
>         "Developers"
>       ]
>     },
>     "product": {
>       "id": "unlimited"
>     }
>   }
> }
[Snapshot of the path parameter configuration]
Thanks!
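
A likely explanation, offered as a hedged note: an APIM URL template parameter matches exactly one path segment, so an operation template such as /{endpoint} (under the /testapi base path shown in the trace) can never match a multi-segment value like /Measure/Test/Calories; no operation matches, hence "operation": "-" and the 404. A sketch of the two template styles (these templates are illustrative, not the asker's actual configuration):

/{endpoint}   matches a single segment only, e.g. /testapi/Measure
/*            wildcard operation: matches /testapi/Measure/Test/Calories and deeper paths

If wildcard operations are available on your instance, defining one operation with the /* template lets arbitrary endpoint paths flow through to the backend.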

Related

Autodesk Forge API - Patch Item Input Error

I am trying to use the PATCH projects/:project_id/items/:item_id endpoint in order to update the "displayName" attribute of an item in BIM360. There is an example of how to do exactly that in the Forge documentation, but for some reason I am getting a 400 error.
Here is my payload:
{
  "jsonapi": {
    "version": "1.0"
  },
  "data": {
    "type": "items",
    "id": "ITEM_URN",
    "attributes": {
      "displayName": "NEW_ITEM_NAME.EXTENSION"
    }
  }
}
Here is the error I get:
{
  "jsonapi": {
    "version": "1.0"
  },
  "errors": [
    {
      "id": "a4c43bbb-9e34-4973-9f9c-58a7e1d7bdb6",
      "status": "400",
      "code": "BAD_INPUT",
      "title": "One or more input values in the request were bad",
      "detail": "Request input is invalid for this operation."
    }
  ]
}
I successfully use the same endpoint to change the parent folder of this same item (as described in this post's answer: Autodesk Forge; Pragmatically changing file location without new upload), so the problem must be in the "displayName" update portion. Here is the successful payload sample that returns a 200 response:
{
  "jsonapi": {
    "version": "1.0"
  },
  "data": {
    "type": "items",
    "id": "ITEM_URN",
    "relationships": {
      "parent": {
        "data": {
          "type": "folders",
          "id": "DESTINATION_FOLDER_URN"
        }
      }
    }
  }
}
Forge Documentation with example: https://forge.autodesk.com/en/docs/data/v2/reference/http/projects-project_id-items-item_id-PATCH/
What am I missing in order to update the "displayName" attribute?
If you want to change the file name, you can change the tip version name and title. Creating a new version is required, but you don't have to upload the file again. Please try the API at https://forge.autodesk.com/en/docs/data/v2/reference/http/projects-project_id-versions-POST/ :
POST /versions?copyFrom={tip_version_urn}
{
  "jsonapi": {
    "version": "1.0"
  },
  "data": {
    "type": "versions",
    "attributes": {
      "name": "newName"
    }
  }
}
A new tip version will be created with the updated name.
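
For illustration, a hedged Node.js sketch of that call (Node 18+ global fetch, run as an ES module; the project ID, version URN, and token are placeholders, and the data/v1 base URL is an assumption -- check the linked reference page for the exact route):

const PROJECT_ID = 'YOUR_PROJECT_ID';          // placeholder
const TIP_VERSION_URN = 'TIP_VERSION_URN';     // placeholder
const ACCESS_TOKEN = 'YOUR_ACCESS_TOKEN';      // placeholder
const base = 'https://developer.api.autodesk.com/data/v1'; // assumed base URL

const res = await fetch(
  `${base}/projects/${PROJECT_ID}/versions?copyFrom=${encodeURIComponent(TIP_VERSION_URN)}`,
  {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${ACCESS_TOKEN}`,
      'Content-Type': 'application/vnd.api+json'  // JSON:API, matching the payload above
    },
    body: JSON.stringify({
      jsonapi: { version: '1.0' },
      data: { type: 'versions', attributes: { name: 'newName' } }
    })
  }
);
console.log(res.status, await res.json());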

How can I create an activity for data conversion in Design Automation API?

I'm prototyping a web service to convert data using the Design Automation API in Autodesk Forge.
My approach is to invoke an activity that executes a script to import a target data file (in a format such as STEP or IGES).
As an example, I created an activity to convert a STEP file to DWG as follows:
{
  "HostApplication": "",
  "RequiredEngineVersion": "22.0",
  "Parameters": {
    "InputParameters": [{
      "Name": "Source",
      "LocalFileName": "input.stp"
    }, {
      "Name": "HostDwg",
      "LocalFileName": "$(HostDwg)"
    }],
    "OutputParameters": [{
      "Name": "Result",
      "LocalFileName": "output.dwg"
    }]
  },
  "Instruction": {
    "CommandLineParameters": null,
    "Script": "import\ninput.stp\nsaveas\n\noutput.dwg\n"
  },
  "Version": 1,
  "Id": "Step2Dwg"
}
The workitem to invoke this activity was executed without errors, but the output file (output.dwg) had nothing imported from the input file (input.stp).
Perhaps this is because some fields (e.g., AllowedChildProcesses) were missing from the definition of the activity "Step2Dwg", but I do not know how to fix it.
My questions are:
How do I fix the definition of the activity "Step2Dwg" so that it converts data successfully?
Is there any other approach for creating an activity that converts data successfully?
You can use the activity "Translate-STEP2DWG". It takes a .stp file as input and generates result.dwg as output. This is a public activity that anybody can send workitems against.
The activity is defined like this:
{
  "Id": "Translate-STEP2DWG",
  "AppPackages": [],
  "HostApplication": "AcTranslators.exe",
  "RequiredEngineVersion": "22.0",
  "Parameters": {
    "InputParameters": [
      {
        "Name": "HostDwg",
        "LocalFileName": "source.stp"
      }
    ],
    "OutputParameters": [
      {
        "Name": "Result",
        "LocalFileName": "result.dwg"
      }
    ]
  },
  "Instruction": {
    "CommandLineParameters": "-i source.stp -o result.dwg",
    "Script": ""
  },
  "AllowedChildProcesses": [],
  "IsPublic": true,
  "Version": 1,
  "Description": ""
}
Here is a sample workitem request body:
{
  "ActivityId": "Translate-STEP2DWG",
  "Arguments": {
    "InputArguments": [
      {
        "Resource": "https://s3.amazonaws.com/AutoCAD-Core-Engine-Services/TestDwg/3DStep.stp",
        "Name": "HostDwg"
      }
    ],
    "OutputArguments": [
      {
        "Name": "Result",
        "HttpVerb": "POST"
      }
    ]
  }
}
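
If it is useful, here is a hedged Node.js sketch (Node 18+ fetch, run as an ES module) of submitting that workitem; the WorkItems endpoint URL is an assumption based on the legacy AutoCAD I/O flavor of this API, and the token is a placeholder:

const ACCESS_TOKEN = 'YOUR_ACCESS_TOKEN'; // placeholder
// Assumed legacy AutoCAD I/O (Design Automation v2) endpoint -- verify against the current docs.
const endpoint = 'https://developer.api.autodesk.com/autocad.io/us-east/v2/WorkItems';

const res = await fetch(endpoint, {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${ACCESS_TOKEN}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    ActivityId: 'Translate-STEP2DWG',
    Arguments: {
      InputArguments: [{
        Resource: 'https://s3.amazonaws.com/AutoCAD-Core-Engine-Services/TestDwg/3DStep.stp',
        Name: 'HostDwg'
      }],
      OutputArguments: [{ Name: 'Result', HttpVerb: 'POST' }]
    }
  })
});
console.log(res.status, await res.json()); // then poll the workitem status until it completes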

ElasticSearch Date Range Aggregation over different fields

I want to be able to fetch documents over a certain date range provided by the user and then display the documents. The query that I have currently constructed looks like this:
Query:
{
  "aggs": {
    "range": {
      "date_range": {
        "field": "metadata.o2r.temporal.begin",
        "ranges": [
          { "from": "2017-03-30T12:35:41.142Z", "to": "2017-08-02T22:00:00.000Z", "key": "quarter_01" }
        ],
        "keyed": true
      }
    }
  }
}
The temporal part of the JSON documents that I am trying to fetch looks like the following:
"temporal": {
"begin": "2017-08-01T22:00:00.000Z",
"end": "2017-03-30T12:35:41.142Z"
},
Currently I can query either "begin" or "end", but I want to modify the query in such a way that begin becomes the value for "from" and end becomes the value for "to". The catch here is that I do not want my original JSON to be modified.
Updated Query
curl -XGET 'localhost:9201/test/_search?size=0&pretty' -H 'Content-Type: application/json' -d'
> {
>   "query": {
>     "bool": {
>       "must": [
>         {
>           "range": {
>             "metadata.o2r.temporal.begin": {
>               "from": "2016-01-01T12:35:41.142Z"
>             }
>           }
>         },
>         {
>           "range": {
>             "metadata.o2r.temporal.end": {
>               "to": "2016-12-30T22:00:00.000Z"
>             }
>           }
>         }
>       ]
>     }
>   }
> }
> '
Response (note that hits is empty because the request was sent with size=0, so only the total count is returned):
{
  "took" : 1678,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 1,
    "max_score" : 0.0,
    "hits" : [ ]
  }
}
This might help you:
{
  "query": {
    "bool": {
      "must": [
        {
          "range": {
            "metadata.o2r.temporal.begin": {
              "from": "2017-03-30T12:35:41.142Z"
            }
          }
        },
        {
          "range": {
            "metadata.o2r.temporal.end": {
              "to": "2017-08-02T22:00:00.000Z"
            }
          }
        }
      ]
    }
  }
}
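
Read as a pair of range filters, this matches documents whose begin falls on or after the user's "from" date and whose end falls on or before the user's "to" date. A hedged Node.js sketch (Node 18+ fetch, run as an ES module) of sending it to the cluster from the question; leaving out size=0 lets the matching documents show up in hits:

// Host, port, and index name are taken from the question's curl command.
const res = await fetch('http://localhost:9201/test/_search?pretty', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    query: {
      bool: {
        must: [
          // begin on/after the user's "from" date
          { range: { 'metadata.o2r.temporal.begin': { from: '2017-03-30T12:35:41.142Z' } } },
          // end on/before the user's "to" date
          { range: { 'metadata.o2r.temporal.end': { to: '2017-08-02T22:00:00.000Z' } } }
        ]
      }
    }
  })
});
console.log(JSON.stringify(await res.json(), null, 2));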

expected [END_OBJECT] but got [FIELD_NAME], possibly too many query clauses error in kibana

When I try to compose a compound bool query that has a fuzzy must requirement and several should requirements, one of them being a wildcard, I run into this error message. So far, no alterations to the syntax have helped me resolve this issue.
The query:
{
  "query": {
    "bool": {
      "must": {
        "fuzzy": {
          "message": "<fuzzy string>",
          "fuzziness": "auto"
        }
      },
      "should": [
        { "query": { "message": "<string>" } },
        { "query": { "message": "<string>" } },
        { "wildcard":
          { "query": { "message": "<partial string*>" } }
        }
      ],
      "minimum_should_match": "50%"
    }
  }
}
The text inside <> is replaced with my searched string.
You need to replace query with match in your bool/should clause, and give the fuzzy and wildcard queries their standard shape (the field name is the key; extra parameters such as fuzziness are nested under it):
> { "query": {
> "bool": {
> "must": {
> "fuzzy": {
> "message": "<fuzzy string>",
> "fuzziness": "auto"
> }
> },
> "should": [
> {"match": {"message": "<string>"}}, <-- here
> {"match": {"message": "<string>"}}, <-- and here
> {"wildcard": {"query": {"message": "<partial string*>"}}}
> ],
> "minimum_should_match": "50%"
> } } }

NodeJS deserialization from a stream

I am having an issue deserializing from a stream in Node (specifically the pricing feed from the Bitcoin GOX exchange). Basically, a chunk arrives that is well-formed, complete, verified JSON. Here is the code:
var gox = require('goxstream');
var fs = require('fs');

// Subscribe to the AUD ticker feed only (no market depth).
var options = {
  currency: 'AUD',
  ticker: true,
  depth: false
};

var goxStream = gox.createStream(options);

// Parse each incoming chunk as JSON -- this is where the error is thrown.
goxStream.on('data', function(chunk) {
  console.log(JSON.parse(chunk));
});
When trying to parse it, I get the following:
undefined:0
^
SyntaxError: Unexpected end of input
Any ideas? I have included a sample chunk:
> {"channel": "eb6aaa11-99d0-4f64-9e8c-1140872a423d", "channel_name":
> "ticker.BTCAUD", "op": "private", "origin": "broadcast", "private":
> "ticker", "ticker": {
> "high": {
> "value": "121.51941",
> "value_int": "12151941",
> "display": "AU$121.51941",
> "display_short": "AU$121.52",
> "currency": "AUD"
> },
> "low": {
> "value": "118.00001",
> "value_int": "11800001",
> "display": "AU$118.00001",
> "display_short": "AU$118.00",
> "currency": "AUD"
> },
> "avg": {
> "value": "119.58084",
> "value_int": "11958084",
> "display": "AU$119.58084",
> "display_short": "AU$119.58",
> "currency": "AUD"
> },
> "vwap": {
> "value": "119.80280",
> "value_int": "11980280",
> "display": "AU$119.80280",
> "display_short": "AU$119.80",
> "currency": "AUD"
> },
> "vol": {
> "value": "249.73550646",
> "value_int": "24973550646",
> "display": "249.73550646\u00a0BTC",
> "display_short": "249.74\u00a0BTC",
> "currency": "BTC"
> },
> "last_local": {
> "value": "118.50000",
> "value_int": "11850000",
> "display": "AU$118.50000",
> "display_short": "AU$118.50",
> "currency": "AUD"
> },
> "last_orig": {
> "value": "108.99500",
> "value_int": "10899500",
> "display": "$108.99500",
> "display_short": "$109.00",
> "currency": "USD"
> },
> "last_all": {
> "value": "118.79965",
> "value_int": "11879965",
> "display": "AU$118.79965",
> "display_short": "AU$118.80",
> "currency": "AUD"
> },
> "last": {
> "value": "118.50000",
> "value_int": "11850000",
> "display": "AU$118.50000",
> "display_short": "AU$118.50",
> "currency": "AUD"
> },
> "buy": {
> "value": "118.50000",
> "value_int": "11850000",
> "display": "AU$118.50000",
> "display_short": "AU$118.50",
> "currency": "AUD"
> },
> "sell": {
> "value": "119.99939",
> "value_int": "11999939",
> "display": "AU$119.99939",
> "display_short": "AU$120.00",
> "currency": "AUD"
> },
> "item": "BTC",
> "now": "1376715241731341" }}
You can verify it here: http://jsonlint.com
It is also probably worth mentioning that I have already tried parsing after removing the escaped characters, and that I have tried a couple of different serializers, with the same results.
You are getting the data chunk by chunk, and the chunks themselves may not be complete JSON objects. Either buffer all of the data, use something to do it for you (say, the request module), or, if you need to parse a long stream, take a look at the JSONparse module.
You are getting two separate chunks (or at least, that's what I am getting when re-creating your issue). One (the first) is a valid JSON object, while the other (the second) is "almost empty": it is a 1-byte string containing just an LF (ASCII 0x0a).
The second one fails parsing, of course.
Read my first answer: this is exactly such a case. If you concatenate the two chunks you get a complete JSON object with a trailing LF, which easily passes JSON.parse(). If you try to parse the chunks separately, though, the first one succeeds (a trailing LF is not mandatory) while the second one fails (an LF by itself is not a valid JSON object).
For your case, you would have to:
1) Either assume Mt.Gox always sends data "this way", ignore those "almost empty" chunks, and parse only the non-empty chunks.
2) Or use JSONparse, which parses JSON streams.
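
A minimal sketch of option 1, assuming (as observed above) that each message ends with an LF and that stray LF-only chunks can arrive; it buffers incoming data, splits on newlines, and skips empty lines:

var gox = require('goxstream');

var goxStream = gox.createStream({ currency: 'AUD', ticker: true, depth: false });

var buffer = '';
goxStream.on('data', function (chunk) {
  buffer += chunk.toString();
  // Messages are newline-delimited; keep any trailing partial message in the buffer.
  var lines = buffer.split('\n');
  buffer = lines.pop();
  lines.forEach(function (line) {
    if (line.trim() === '') return; // skip the "almost empty" LF-only chunks
    console.log(JSON.parse(line));
  });
});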