My Couchbase version: Couchbase Server Enterprise Edition 6.0.3 build 2895.
My query:
SELECT t1.text
FROM bucket_name AS t1
WHERE SEARCH(t1, {
"explain": false,
"fields": [
"*"
],
"highlight": {},
"query": {
"match": "earth",
"field": "text",
"analyzer": "standard"
},
"size" : 10
})
I ran that query via the Couchbase web UI, and this error showed up:
"Invalid function SEARCH. - at )",
I have no idea what the "at )" part means.
The SEARCH() function in N1QL is supported in 6.5.0, not in 6.0.3. You can use the 6.5.0 beta, or in 6.0.3 use the CURL() function described in section 2.1 of the following post: https://blog.couchbase.com/n1ql-and-search-how-to-leverage-fts-index-in-n1ql-query/
https://blog.couchbase.com/tag/fts/
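For reference, a rough 6.0.3 equivalent of your query can call the FTS REST endpoint through CURL(). This is only a sketch: the index name text_index, the localhost:8094 endpoint, and the credentials are assumptions, and the query service must be configured to allow CURL() to reach that URL.
/* Sketch: call the FTS index over REST and extract the matching
   document ids; index name, endpoint and credentials are assumptions. */
SELECT CURL("http://localhost:8094/api/index/text_index/query",
            {"request": "POST",
             "header": "Content-Type: application/json",
             "user": "Administrator:password",
             "data": '{"query": {"match": "earth", "field": "text"}, "size": 10}'
            }).hits[*].id AS ids;
The returned ids can then be fed back to the bucket with USE KEYS to fetch the matching documents, as the blog post shows.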
https://management.azure.com/subscriptions/{subscription Id}/resourceGroups/{Resource Group}/providers/Microsoft.Sql/servers/{servername}/providers/microsoft.insights/metrics?api-version=2017-05-01-preview&$filter=(name.value eq 'dtu_consumption_percent' ) and startTime eq 2017-09-10 and endTime eq 2017-09-11 and timeGrain eq duration'PT1H'
The above is the URL I am passing to get the data for SQL Server metrics.
The response I am getting is:
{
"cost": 0,
"timespan": "2017-09-12T03:56:27Z/2017-09-12T04:56:27Z",
"interval": "PT1M",
"value": [
{
"id": "/subscriptions/{subscription Id}/resourceGroups/{Resource Group}/providers/Microsoft.Sql/servers/{server name}/providers/Microsoft.Insights/metrics/dtu_consumption_percent",
"type": "Microsoft.Insights/metrics",
"name": {
"value": "dtu_consumption_percent",
"localizedValue": "DTU percentage"
},
"unit": "Percent",
"timeseries": []
}
]}
There is no data in time series.
What could be the issue?
If I do not specify the filters, the dtu_consumption_percent (default metric) is returned when I use "2017-05-01-preview" as the API version, and the timeseries data is populated.
If I specify the filters as yours and use "2017-05-01-preview" as the API version, I get a 400 error.
Under the “Retrieve Metric Values” section in this article, I find:
Note
To retrieve metric values using the Azure Monitor REST API, use "2016-06-01" as the API version.
You can try to use "2016-06-01" as the API version to retrieve metric values.
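As a minimal sketch of that call from Python (assuming the requests library; the placeholders and bearer token are yours to fill in, and the filter is the one from your question):
import requests

# Placeholders -- substitute your own subscription, resource group,
# server name and Azure AD bearer token.
subscription = "{subscription Id}"
resource_group = "{Resource Group}"
server = "{servername}"
token = "{bearer token}"

url = (f"https://management.azure.com/subscriptions/{subscription}"
       f"/resourceGroups/{resource_group}/providers/Microsoft.Sql"
       f"/servers/{server}/providers/microsoft.insights/metrics")

params = {
    # The older API version that returns metric values.
    "api-version": "2016-06-01",
    "$filter": ("(name.value eq 'dtu_consumption_percent') "
                "and startTime eq 2017-09-10 and endTime eq 2017-09-11 "
                "and timeGrain eq duration'PT1H'"),
}

resp = requests.get(url, params=params,
                    headers={"Authorization": f"Bearer {token}"})
print(resp.json())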
After asking this question: topic restHeart, which got a satisfactory answer, I have a new question.
I used this HTTP RESTHeart request:
PATCH http://test:8081/purge/test3
{ "rts": [
    {
      "name": "addRequestProperties",
      "phase": "REQUEST",
      "scope": "CHILDREN",
      "args": { "log": [ "dateTime", "epochTimeStamp" ] }
    }
] }
and now when I insert some JSON data, MongoDB automatically adds dateTime and epochTimeStamp, like this:
"invoiceNumber": "6666"
"log": {
"dateTime": "[23/Mar/2016:16:24:24 +0000]"
"epochTimeStamp": 1458750264
}
My problem now is building the query.
I tried something like this, but it does not work:
http://test:8081/purge/test3?filter={"log":{"epochTimeStamp":{"$lte":"1458750378"}}}
In the end, my query retrieves nothing...
Versions: MongoDB 3.2 / RESTHeart 1.2.
Hope you can help me :)
You are passing a string to the $lte operator.
You need to pass a number:
http://test:8081/purge/test3?filter={"log.epochTimeStamp":{"$lte": 1458750378}}
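Note that the working filter also uses dot notation ("log.epochTimeStamp") to reach the field nested inside the log subdocument. A minimal sketch of the same request from Python (assuming the requests library and your test3 collection):
import json
import requests

# The bound is a JSON number, not a string, and dot notation addresses
# the field nested inside the "log" subdocument.
flt = {"log.epochTimeStamp": {"$lte": 1458750378}}

resp = requests.get("http://test:8081/purge/test3",
                    params={"filter": json.dumps(flt)})
print(resp.json())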
This question is very similar to Missing attributes on Orion CB Entity when registering device through IDAS, but I found no definitive answer there.
I have been trying to get FIWARE UL2.0 via IDAS to the Orion CB working in the FIWARE Lab environment:
using the latest GitHub scripts: https://github.com/telefonicaid/fiware-figway/tree/master/python-IDAS4
following the tutorials, in particular http://www.slideshare.net/FI-WARE/fiware-iotidasintroul20v2
I have a FI-WARE Lab account with a token generated, and I adapted the config.ini file:
[user]
# Please, configure here your username at FIWARE Cloud and a valid Oauth2.0 TOKEN for your user (you can use get_token.py to obtain a valid TOKEN).
username=MY_USERNAME
token=MY_TOKEN
[contextbroker]
host=130.206.80.40
port=1026
OAuth=no
# Here you need to specify the ContextBroker database you are querying.
# Leave it blank if you want the general database or the IDAS service if you are looking for IoT devices connected by you.
# fiware_service=
fiware_service=bus_auto
fiware-service-path=/
[idas]
host=130.206.80.40
adminport=5371
ul20port=5371
OAuth=no
# Here you need to configure the IDAS service your devices will be sending data to.
# By default the OpenIoT service is provided.
# fiware-service=fiwareiot
fiware-service=bus_auto
fiware-service-path=/
#apikey=4jggokgpepnvsb2uv4s40d59ov
apikey=4jggokgpepnvsb2uv4s40d59ov
[local]
#Choose here your System type. Examples: RaspberryPI, MACOSX, Linux, ...
host_type=MACOSX
# Here please add a unique identifier for you. Suggestion: the 3 lower hexa bytes of your Ethernet MAC. E.g. 79:ed:af
# Also you may use your e-mail address.
host_id=a0:11:00
I used the SENSOR_TEMP template, adding the 'protocol' field (PDI-IoTA-UltraLight), which was the first problem I stumbled upon:
{
  "devices": [
    {
      "device_id": "DEV_ID",
      "entity_name": "ENTITY_ID",
      "entity_type": "thing",
      "protocol": "PDI-IoTA-UltraLight",
      "timezone": "Europe/Amsterdam",
      "attributes": [
        {
          "object_id": "otemp",
          "name": "temperature",
          "type": "int"
        }
      ],
      "static_attributes": [
        {
          "name": "att_name",
          "type": "string",
          "value": "value"
        }
      ]
    }
  ]
}
Now I can register the device OK, like:
python RegisterDevice.py SENSOR_TEMP NexusPro Temp-Otterlo
and see it in Device List:
python ListDevices.py
I can send observations like:
python SendObservation.py Temp-Otterlo 'otemp|17'
But in the Context Broker I see the entity but never the measurements. For example,
python GetEntity.py Temp-Otterlo
gives:
* Asking to http://130.206.80.40:1026/ngsi10/queryContext
* Headers: {'Fiware-Service': 'bus_auto', 'content-type': 'application/json', 'accept': 'application/json', 'X-Auth-Token': 'NULL'}
* Sending PAYLOAD:
{
"entities": [
{
"type": "",
"id": "Temp-Otterlo",
"isPattern": "false"
}
],
"attributes": []
}
...
* Status Code: 200
* Response:
{
"contextResponses" : [
{
"contextElement" : {
"type" : "thing",
"isPattern" : "false",
"id" : "Temp-Otterlo",
"attributes" : [
{
"name" : "TimeInstant",
"type" : "ISO8601",
"value" : "2015-10-03T14:04:44.663133Z"
},
{
"name" : "att_name",
"type" : "string",
"value" : "value",
"metadatas" : [
{
"name" : "TimeInstant",
"type" : "ISO8601",
"value" : "2015-10-03T14:04:44.663500Z"
}
]
}
]
},
"statusCode" : {
"code" : "200",
"reasonPhrase" : "OK"
}
}
]
}
Strangely, I get a TimeInstant attribute. I tried playing with settings in the .ini like fiware-service=fiwareiot, but to no avail. I am out of ideas. The documentation in the catalogue for IDAS4 talks about sending observations to port 8002 and setting the "OpenIoT" service, but that failed as well.
Any help appreciated.
You should run "python SendObservation.py NexusPro 'otemp|17'" instead of "python SendObservation.py Temp-Otterlo 'otemp|17'".
The reason is that you are providing an observation at the southbound, and there the DEV_ID should be used.
The entity does not include an attribute until an observation is received, so it is normal that you are not able to see it. Once you try the command above, it should all work.
Cheers,
I'm trying to explore Apache Drill. I'm not a data analyst, just an infra support guy, and I find the documentation on Apache Drill too limited.
I need some details about the custom data storage that can be used with Apache Drill:
Is it possible to query HDFS without Hive, using Apache Drill just like the dfs plugin does?
Is it possible to query traditional RDBMSs like MySQL and Microsoft SQL Server?
Thanks in advance
Update:
My HDFS storage definition gives an error (Invalid JSON mapping):
{
"type":"file",
"enabled":true,
"connection":"hdfs:///",
"workspaces":{
"root":{
"location":"/",
"writable":true,
"storageformat":"null"
}
}
}
If I replace hdfs:/// with file:///, it seems to accept it.
I copied all the library files from the folder
<drill-path>/jars/3rdparty to <drill-path>/jars/
I cannot make it work. Please help. I'm not a dev at all, I'm an infra guy.
Thanks in advance
Yes.
Drill directly recognizes the schema of the file based on its metadata. Refer to this link for more info:
https://cwiki.apache.org/confluence/display/DRILL/Connecting+to+Data+Sources
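For example, once a file storage plugin pointing at HDFS is enabled, a raw file can be queried directly, with no Hive metastore involved (the plugin name hdfs and the file path below are illustrative assumptions):
-- Drill infers the schema from the file itself.
SELECT *
FROM hdfs.root.`logs/events.json`
LIMIT 10;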
Not Yet.
There is a MapR driver that lets you achieve the same, but it is not inherently supported in Drill right now. There have been several discussions around this, and it might be there soon.
YES, it is possible for Drill to communicate with both the Hadoop system and RDBMS systems together. In fact, you can have queries joining both systems.
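As a sketch of such a cross-system query (an assumption-heavy example: it presumes a Drill version with the JDBC storage plugin available, registered here under the name mysql, plus the hdfs plugin configured as below; table and column names are made up):
-- Join a JSON file on HDFS with a MySQL table via the JDBC storage plugin.
SELECT e.name, d.dept_name
FROM hdfs.root.`employees.json` AS e
JOIN mysql.hr.departments AS d
  ON CAST(e.dept_id AS INTEGER) = d.id;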
The HDFS storage plugin can be configured like this:
{
"type": "file",
"enabled": true,
"connection": "hdfs://xxx.xxx.xxx.xxx:8020/",
"workspaces": {
"root": {
"location": "/user/cloudera",
"writable": true,
"defaultInputFormat": null
},
"tmp": {
"location": "/tmp",
"writable": true,
"defaultInputFormat": null
}
},
"formats": {
"parquet": {
"type": "parquet"
},
"psv": {
"type": "text",
"extensions": [
"tbl"
],
"delimiter": "|"
},
"csv": {
"type": "text",
"extensions": [
"csv"
],
"delimiter": ","
},
"tsv": {
"type": "text",
"extensions": [
"tsv"
],
"delimiter": "\t"
},
"json": {
"type": "json"
}
}
}
The connection URL will be your MapR/Cloudera URL, with port number 8020 by default. You should be able to spot it in the Hadoop configuration on your system under the configuration key "fs.defaultFS".
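For reference, that key lives in Hadoop's core-site.xml, and its value is exactly what goes into the plugin's "connection" field (host and port below are examples):
<!-- core-site.xml -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://xxx.xxx.xxx.xxx:8020</value>
</property>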
I have a working Django project that serves JSON via rest_framework and its viewsets. Now I would like to write the client using Ember. Here is my setup:
Django 1.6.5
Ember 1.6.1
Ember-Data 1.0.0-beta.8.2a68c63a
jQuery 2.1.1
My Django server runs on the default port 8000 under localhost. I test my Ember application by opening index.html in the browser. Therefore, I customised the ApplicationAdapter like so:
App.ApplicationAdapter = DS.RESTAdapter.extend({
host: 'http://localhost:8000',
});
I try to fetch a list of artists from http://localhost:8000/artists. The specified route is:
App.ArtistsRoute = Ember.Route.extend({
model: function() {
this.store.find('artists');
}
});
And the response I get back from the server when I open the mentioned URL in the browser is:
[
{
"id": 1,
"name": "Kollegah",
"origin": "Germany",
"genre": "German Rap"
},
{
"id": 2,
"name": "Peter Fox",
"origin": "Germany",
"genre": "Hip-Hop"
},
{
"id": 3,
"name": "Farid Bang",
"origin": "Germany",
"genre": "German Rap"
},
{
"id": 4,
"name": "Eko Fresh",
"origin": "Germany",
"genre": "German Rap"
}
]
When fetching the data I get these two Ember errors:
-Error while processing route: artists No model was found for 'artists' Error: No model was found for 'artists'
-No model was found for 'artists' Error: No model was found for 'artists'
The problem is that I have already specified a model:
var attr = DS.attr;
App.Artist = DS.Model.extend({
name: attr,
origin: attr,
genre: attr,
});
I suppose the problem is the missing root element at the beginning of each JSON response. I expect it should look like this example from the Ember Guides:
{
"post": {
"id": 1,
"title": "Rails is omakase",
"comments": ["1", "2"],
"user" : "dhh"
},
"comments": [{
"id": "1",
"body": "Rails is unagi"
}, {
"id": "2",
"body": "Omakase O_o"
}]
}
After searching a short while, I found similar problems with Rails. I tried out the solution for Django mentioned in another Stack Overflow question, but I got the same errors.
Does anybody know a server-side solution for this problem? The Ember Data Django Adapter could be one for the client side. Unfortunately, it is designed as a Node plugin, and at the moment I don't use it in my project.
The Ember Data Django Adapter is the simplest way of solving your problem, and if you pay attention to the docs, there are instructions for using it without ember-cli.
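For a globals-based app like yours (no ember-cli), the idea is to include the adapter's pre-built script after ember-data and wire it up roughly like this (a sketch only; the DS.DjangoRESTAdapter / DS.DjangoRESTSerializer names come from that project's README, so verify them against the version you download):
// Load order in index.html: jquery, ember, ember-data, then the
// adapter's built file, e.g.:
// <script src="js/libs/ember-data-django-rest-adapter.js"></script>

// Serializer that understands rest_framework's flat, root-less JSON.
App.ApplicationSerializer = DS.DjangoRESTSerializer.extend();

App.ApplicationAdapter = DS.DjangoRESTAdapter.extend({
  host: 'http://localhost:8000'
});
Independently of the adapter, note that this.store.find('artists') looks up a model named artists; with App.Artist defined, the lookup key is the singular 'artist', which is what the "No model was found for 'artists'" error points at.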