non-string key Missing field in Couchbase - couchbase

For migration I used the following Couchbase query:
INSERT INTO `bucket`
(KEY _k, VALUE _v, OPTIONS {"expiration": ttl})
SELECT REPLACE(META().id, "2_", "3_", 1) _K, META(_v).expiration AS ttl
from `bucket` _v
where _v._class="classname"
AND META().id LIKE "2_%";
While running it, I get the following error:
{
"code": 5070,
"msg": "Cannot INSERT non-string key Missing field or index _k. of type value.missingValue."
}

The problem was that different cases (lower and upper) were used in different places: the key is declared as _k but aliased as _K in the SELECT. Changing _K to lowercase solved the problem.

Related

Couchbase Full Text search with combination of dynamic fields and N1QL

Consider the following documents, and assume a full-text index has been created over them:
{
"email" : "A",
"data" : {
"dynamic_property" : "ANY_TYPE",
"dynamic_property2" : {
"property" : "searchableValue"
},
"field" : "VALUE"
}
},
{
"email" : "B",
"data" : {
"other_dynamic_prop" : "test-searchableValue-2"
}
},
{
"email" : "A",
"data" : {
"thirdDynamicProp" : {
"childProp" : "this should be searchableValue!"
}
}
}
The goal: create an N1QL query that matches all documents that have the given email address AND whose data property contains a given substring.
Basically the following:
SELECT * FROM `bucket` WHERE `email` = 'A' AND `data` LIKE '%searchableValue%';
The expected result is the first and third documents, because they match both criteria. But the query does not work, because data is not a text type but an object. If the data property looked like:
{"data" : "this should be searchableValue!" }
the query would return the expected result.
The question is:
How can I create an N1QL query that returns the expected result?
I know that Couchbase cannot compare substrings within text directly, but with a full-text index it should be possible since Couchbase 4.5+.
Couchbase 4.6 and 5.0 have more/better options (explained below). In Couchbase 4.5, you can use array indexing to solve this:
https://developer.couchbase.com/documentation/server/4.5/n1ql/n1ql-language-reference/indexing-arrays.html
https://www.couchbase.com/blog/2016/october/n1ql-functionality-enhancements-in-couchbase-server-4.5.1
For instance, using the travel-sample sample bucket, the following array index and query do the kind of substring search you want:
create index tmp_geo on `travel-sample`(DISTINCT ARRAY x FOR x IN object_values(geo) END) where type = "airport";
select meta().id, geo from `travel-sample` where type = "airport"
and ANY x IN object_values(geo) SATISFIES to_string(x) LIKE "12%" END;
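As a rough Python sketch of what that query does (OBJECT_VALUES yields an object's top-level values, and TO_STRING(x) LIKE '12%' is a stringified prefix test; the airport document below is hypothetical, shaped like travel-sample's geo sub-object):

```python
def object_values(obj):
    """Rough analogue of N1QL's OBJECT_VALUES(): an object's top-level values."""
    return list(obj.values())

def like_prefix(value, prefix):
    """Rough analogue of TO_STRING(x) LIKE 'prefix%'."""
    return str(value).startswith(prefix)

# Hypothetical document shaped like travel-sample's "geo" sub-object.
airport = {"geo": {"lat": 12.5, "lon": 120.1, "alt": 3.0}}

matches = any(like_prefix(v, "12") for v in object_values(airport["geo"]))
print(matches)  # True: both 12.5 and 120.1 stringify to a "12..." prefix
```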
N1QL introduced a TOKENS() function in 4.6, which can help you create a functional index on tokenized sub-objects (instead of the array index in the example above):
https://developer.couchbase.com/documentation/server/4.6/n1ql/n1ql-language-reference/string-functions.html
https://dzone.com/articles/more-than-like-efficient-json-search-with-couchbas
And the Couchbase 5.0 developer build (https://blog.couchbase.com/2017/january/introducing-developer-builds) has the N1QL function CURL(), which allows you to access any HTTP/REST endpoint as part of an N1QL query (and hence the FTS endpoint). See the following blogs for more details and examples:
- https://blog.couchbase.com/2017/january/developer-release--curl-n1ql
- https://dzone.com/articles/curl-comes-to-n1ql-querying-external-json-data
Btw, can you clarify if you want partial tokens or only full tokens in the query?
-Prasad
Here are the specific queries, based on the answer from @prasad.
Using Couchbase 4.5:
CREATE INDEX idx_email ON `bucket`( email );
SELECT *
FROM `bucket`
WHERE
`email` = 'A'
AND ANY t WITHIN `data` SATISFIES t LIKE '%searchableValue%' END;
Using Couchbase 4.6:
CREATE INDEX idx_email ON `bucket`( email );
CREATE INDEX idx_tokens ON `bucket`( DISTINCT ARRAY t FOR t IN TOKENS( `data` ) END );
SELECT *
FROM `bucket`
WHERE
`email` = 'A'
AND ANY t IN TOKENS( `data` ) SATISFIES t = 'searchableValue' END;
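As a rough Python sketch of the difference between the two queries (ANY ... WITHIN visits every nested value, while TOKENS() matches whole tokens; the tokenizer below is a crude stand-in for N1QL's real one):

```python
import re

def within_values(value):
    """Rough analogue of ANY ... WITHIN: yield every nested leaf value."""
    if isinstance(value, dict):
        for v in value.values():
            yield from within_values(v)
    elif isinstance(value, list):
        for v in value:
            yield from within_values(v)
    else:
        yield value

def tokens(value):
    """Crude stand-in for N1QL's TOKENS(): split nested strings on non-word chars."""
    return [t for leaf in within_values(value) if isinstance(leaf, str)
            for t in re.split(r"\W+", leaf) if t]

doc = {"email": "A",
       "data": {"thirdDynamicProp": {"childProp": "this should be searchableValue!"}}}

# The 4.5-style query: LIKE '%searchableValue%' against every nested value.
print(any("searchableValue" in str(v) for v in within_values(doc["data"])))  # True
# The 4.6-style query: exact match against the token list.
print("searchableValue" in tokens(doc["data"]))  # True
```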

how to check whether "knownlanguages" key contains value ="English" from json datatype of mysql using query

{
"Actor": {
"knownlanguages": [
"English"
]
}
}
This JSON is stored in a MySQL column of the JSON type, named data.
My question is: how can I check, using a query, whether the knownlanguages key contains the value English?
Easy:
SELECT * FROM table WHERE JSON_CONTAINS(json, '"English"', "$.Actor.knownlanguages")
or (depending on what you have to do):
SELECT JSON_CONTAINS(json, '"English"', "$.Actor.knownlanguages") FROM table
reference: https://dev.mysql.com/doc/refman/5.7/en/json-search-functions.html
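The check JSON_CONTAINS performs can be sketched roughly in Python (json_contains here is a hypothetical helper taking the path as a key list rather than MySQL's "$.Actor.knownlanguages" syntax):

```python
import json

def json_contains(doc_json, candidate_json, path_keys):
    """Rough analogue of MySQL's JSON_CONTAINS(json_doc, candidate, path)."""
    target = json.loads(doc_json)
    for key in path_keys:  # e.g. ["Actor", "knownlanguages"] for $.Actor.knownlanguages
        target = target[key]
    candidate = json.loads(candidate_json)
    # For an array target, containment means the candidate is one of its elements.
    return candidate in target if isinstance(target, list) else candidate == target

row = '{"Actor": {"knownlanguages": ["English"]}}'
print(json_contains(row, '"English"', ["Actor", "knownlanguages"]))  # True
```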
You can use the LIKE keyword with the % wildcard:
SELECT * FROM table WHERE data LIKE "%English%"
NOTE:
this won't search only the JSON field knownlanguages, but the whole data column

Couchbase N1qlQuery: use key value from select

I have this queries:
select otherDocKey from bucket use keys '1234'
update bucket use keys 'here I need the result of the first query' set ...
I want to do something like that:
update bucket use keys (select otherDocKey from bucket use keys '1234') set kuku = 3
but the response I get is:
[
{
"code": 5030,
"msg": "Missing or invalid primary key map[otherDocKey:\"56443\"] of type map[string]interface {}."
}
]
is there a way to do that in one query?
I am using couchbase version 4.5
The problem with your query is that the nested subquery returns a JSON result. I.e., the query:
select otherDocKey from bucket use keys '1234'
will return a result that looks like:
{"otherDocKey":"This_is_the_key_for_the_other_doc"}
But you don't want json, you just want the value from the json. For that you need to use 'select raw'. E.g.,
select raw otherDocKey from bucket use keys '1234'
That should give you a result that looks like:
["This_is_the_key_for_the_other_doc"]
When the subquery returns that kind of result, the "use keys" should work properly.
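The shape difference between the two result forms can be sketched in Python (the key string is hypothetical):

```python
# A plain SELECT wraps each projected value in a row object;
# SELECT RAW returns the bare value instead.
subquery_row = {"otherDocKey": "This_is_the_key_for_the_other_doc"}

plain_result = [subquery_row]               # select otherDocKey from bucket use keys '1234'
raw_result = [subquery_row["otherDocKey"]]  # select raw otherDocKey from bucket use keys '1234'

# USE KEYS needs a plain list of key strings, which only the RAW form provides.
print(raw_result)  # ['This_is_the_key_for_the_other_doc']
```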

MariaDB COLUMN_JSON query returns binary

I've been trying to use dynamic columns with an instance of MariaDB v10.1.12.
First, I send the following query:
INSERT INTO savedDisplays (user, name, body, dataSource, params) VALUES ('Marty', 'Hey', 'Hoy', 'temp', COLUMN_CREATE('type', 'tab', 'col0', 'champions', 'col1', 'averageResults'));
The params column was defined as a blob, just as the documentation suggests.
The query is accepted, the table updated. If I COLUMN_CHECK the results, it tells me it's fine.
But when I try to select:
SELECT COLUMN_JSON(params) AS params FROM savedDisplays;
I get a {type: "Buffer", data: Array} containing binary data, instead of the {"type":"tab", "col0":"champions", "col1":"averageResults"} I expect.
EDIT: I can use COLUMN_GET just fine, but I need every column inside the params field, and I need to check the type property first to know what kind of and how many columns there are in the JSON / params field. I could probably make it work still, but that would require multiple queries, as opposed to only one.
Any ideas?
Try:
SELECT CONVERT(COLUMN_JSON(params) USING utf8) AS params FROM savedDisplays
In MariaDB 10 this works on any table:
SELECT CONVERT(COLUMN_JSON(COLUMN_CREATE('t', text, 'v', value)) USING utf8)
as json FROM test WHERE 1 AND value LIKE '%12345%' LIMIT 10;
Output in Node.js:
[ TextRow { json: '{"t":"test text","v":"0.5339044212345805"}' } ]
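The same decoding can also be done client-side; here is a Python sketch of what CONVERT(... USING utf8) achieves (the byte payload is a hypothetical stand-in for the Buffer the driver returns):

```python
import json

# Hypothetical raw bytes, standing in for the Buffer returned by the driver
# when COLUMN_JSON()'s blob result is not converted server-side.
raw = '{"type": "tab", "col0": "champions", "col1": "averageResults"}'.encode("utf-8")

params = json.loads(raw.decode("utf-8"))  # decode to utf-8 text, then parse the JSON
print(params["type"])  # tab
```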

PostgreSQL casting with type specified in string variable

I am experimenting with the new JSON/JSONB objects in the latest (9.4) PostgreSQL. First, I will show you my test table:
CREATE TABLE "JSONtest" (
data jsonb
);
COPY "JSONtest" (data) FROM stdin;
{"id": 1, "age": 24, "male": false, "name": "Martha"}
{"id": 2, "age": 49, "male": true, "name": "Jim"}
\.
ALTER TABLE ONLY "JSONtest"
ADD CONSTRAINT "JSONtest_data_key" UNIQUE (data);
From this, I try to get the data and the type of certain fields:
SELECT "data"#>'{"male"}' FROM "JSONtest"; -- this returns true|false
SELECT jsonb_typeof("data"#>'{"male"}') FROM "JSONtest"; -- this returns 'boolean'
As the documentation explains, PostgreSQL can currently return data as the json/jsonb type if you use the single-arrow operator in your query, or as text if you use the double-arrow operator:
SELECT '{"a":1}'::jsonb->'a' -- this will be of type jsonb
SELECT '{"a":1}'::jsonb->>'a' -- this will be of type string
But I need the actual type of the data in the JSON. What I tried was using the CAST function:
SELECT CAST('{"id":19}'::jsonb->>'id' AS integer) -- this gives back an int
SELECT CAST('{"id":19}'::jsonb->>'id' AS jsonb_typeof('{"id":19}'::jsonb->'id')) -- this obviously gives an ERROR, because the target type is supplied as a string
So my question is:
Can you cast with the target type specified as a string?
I could bypass this with a stored function, because there are only six types jsonb_typeof can return (object, array, boolean, string, number and null), but if there is a better way, I would happily go for that.
Thank you all for the answers in advance!
--- edit #1 ---
Here is what I came up with today as an experiment, but it caused an error with the following text: CASE types integer and text cannot be matched
SELECT
CASE jsonb_typeof("data"#>"query")
WHEN 'number' THEN ("data"#>>"query")::int
WHEN 'string' THEN "data"#>>"query"
WHEN 'boolean' THEN ("data"#>>"query")::boolean
END
FROM (
SELECT
'{"name" : "Jim", "dogs" : ["Barks", "Fluffy", "Spocky"], "id" : 4}'::jsonb AS "data",
'{"id"}'::text[] AS "query"
) AS "input"
Functions would cause the same trouble, because I would need to specify their return type, which in this case cannot be determined.
json_populate_record() and json_to_record() also need the type specified.
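A SQL CASE must produce a single column type, which is why the branches above cannot be matched; client-side the dispatch is trivial, because a client value need not have one static type. A rough Python sketch (typed_get is a hypothetical helper):

```python
import json

def typed_get(doc, *path):
    """Follow a key path and return the value with its natural JSON-derived type.

    json.loads already maps number/string/boolean/null/array/object to native
    Python types -- exactly what the six-branch CASE tries to reconstruct in SQL.
    """
    value = doc
    for key in path:
        value = value[key]
    return value

doc = json.loads('{"name": "Jim", "dogs": ["Barks", "Fluffy", "Spocky"], "id": 4}')
print(type(typed_get(doc, "id")).__name__)    # int
print(type(typed_get(doc, "name")).__name__)  # str
```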