The problem is that the new index is not created and the following error message is shown:
[
  {
    "code": 5000,
    "msg": "GSI CreatePrimaryIndex() - cause: Create index or Alter replica cannot proceed due to rebalance in progress, another concurrent create index request, network partition, node failover, indexer failure, or presence of duplicate index name.",
    "query": "CREATE PRIMARY INDEX customer_requests_indx ON customer_requests\n"
  }
]
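One thing worth checking (a suggested diagnostic, not part of the original post) is whether an index with that name already exists or is still being built, and what state the indexer reports for it:

SELECT name, state, keyspace_id
FROM system:indexes
WHERE name = "customer_requests_indx";

A duplicate entry, or an index stuck in a non-online state while a rebalance is running, matches the causes listed in the error message.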
For the migration I used the following Couchbase query:
INSERT INTO `bucket`
(KEY _k, VALUE _v, OPTIONS {"expiration": ttl})
SELECT REPLACE(META().id, "2_", "3_", 1) _K, META(_v).expiration AS ttl
from `bucket` _v
where _v._class="classname"
AND META().id LIKE "2_%";
While running it, I get the following error:
{
"code": 5070,
"msg": "Cannot INSERT non-string key Missing field or index _k. of type value.missingValue."
}
The problem was that different cases (lower and upper) were used in different places: the INSERT clause declares the key alias as _k, but the SELECT projects it as _K, and aliases are case-sensitive. Changing the alias to lowercase _k solved the problem.
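For reference, the query with a consistent lowercase alias looks like this (same bucket, class name, and key prefix as in the question):

INSERT INTO `bucket`
(KEY _k, VALUE _v, OPTIONS {"expiration": ttl})
SELECT REPLACE(META().id, "2_", "3_", 1) AS _k, META(_v).expiration AS ttl
FROM `bucket` _v
WHERE _v._class = "classname"
AND META().id LIKE "2_%";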
I have a bucket in Couchbase called mybucket. When I select Documents and then choose my bucket, it has an option to retrieve the documents. When I choose the first one, the Couchbase web console shows me the content of that document:
{
"type": "activity",
"version": "1.0.0",
....
}
So, with that, I am sure that I can see some documents that have "type" = "activity" in my bucket. However, when I want to retrieve them using the Query editor and the following N1QL query:
select * from `mybucket` where `type` = "activity" limit 10;
I get the following response:
[
{
"code": 4000,
"msg": "No index available on keyspace `default`:`mybucket` that matches your query. Use CREATE PRIMARY INDEX ON `default`:`mybucket` to create a primary index, or check that your expected index is online.",
"query": "select * from `mybucket` where `type` = \"activity\" limit 10;"
}
]
The Documents screen retrieves documents via a DCP stream, whereas the Query Editor uses N1QL, which requires a secondary index or a primary index. You have two options:
Option 1) CREATE INDEX ix1 ON `mybucket`(`type`);
OR
Option 2) CREATE PRIMARY INDEX ON `mybucket`;
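Either way, the index may take a moment to build; a quick check (a small addition, not from the original answer) is to look at its state in system:indexes before re-running the query:

SELECT name, state FROM system:indexes WHERE keyspace_id = "mybucket";

/* once state = "online", the original query should work */
SELECT * FROM `mybucket` WHERE `type` = "activity" LIMIT 10;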
I have about 8 million documents in my collection, and I want to remove the special characters in one of the fields.
I will post my statement below.
I am using the mongo shell in the MongoDB Compass tool.
The update runs for about 30-50 minutes and then throws the following error:
MongoServerError: Error on remote shard thisisjustforstack.com:27000 :: caused by :: cursor id 1272890412590646833 not found
I can also see that, after this error is thrown, not all documents have been updated.
db.getCollection('TEST_Collection').aggregate([
    {
        $match: {
            '1List.Comment': { $exists: true }
        }
    },
    {
        $project: {
            '1List.Comment': 1
        }
    }
]).forEach(function (doc) {
    // property names starting with a digit need bracket notation in the shell
    var cleaned = doc['1List'].Comment.replace(/[^a-zA-Z 0-9 ]/g, '');
    db.TEST_Collection.updateMany(
        { "_id": doc._id },
        { "$set": { "1List.Comment": cleaned } }
    );
});
Can somebody please help me get this update statement working without running into some sort of timeout? I have read something about noCursorTimeout(), but I am not sure how to use it with my statement in the shell.
Thank you all!
Cursor timeout can't be disabled on individual aggregation cursors.
But you can set it globally as a server parameter:
mongod --setParameter cursorTimeoutMillis=3600000  # 1 hour
Anyway, I think dividing the task into small batches is a better option; see the sketch below.
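As a rough illustration of the batched approach (my own sketch, not tested against your data; the batch size is arbitrary and it assumes 1List is an embedded document rather than an array), you can page through the matching documents by _id and apply the replacements with bulkWrite, so no single cursor has to stay open for the whole 8 million documents:

// process matching documents in batches of 1000, keyed by _id
var batchSize = 1000;
var lastId = null;

while (true) {
    var filter = { '1List.Comment': { $exists: true } };
    if (lastId !== null) {
        filter._id = { $gt: lastId };
    }

    var docs = db.TEST_Collection.find(filter, { '1List.Comment': 1 })
        .sort({ _id: 1 })
        .limit(batchSize)
        .toArray();

    if (docs.length === 0) {
        break;
    }

    var ops = docs.map(function (doc) {
        return {
            updateOne: {
                filter: { _id: doc._id },
                update: {
                    $set: {
                        '1List.Comment': doc['1List'].Comment.replace(/[^a-zA-Z 0-9 ]/g, '')
                    }
                }
            }
        };
    });

    db.TEST_Collection.bulkWrite(ops, { ordered: false });
    lastId = docs[docs.length - 1]._id;
}

Each iteration opens a fresh, short-lived cursor, so the cursor idle timeout never comes into play, and an interrupted run can be resumed by starting again from the last processed _id.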
From the latest Couchbase docs, I can see that an FTS index can be created/updated using:
PUT /api/index/{indexName}
Creates/updates an index definition.
I created an index with the name fts-idx, and that succeeded.
But it looks like updating the index via the REST API fails.
Response:
responseMessage : ,{"error":"rest_create_index: error creating index: fts-idx, err: manager_api: cannot create index because an index with the same name already exists: fts-idx"
Is there anything I have missed here?
I was able to replicate this issue, and I think I've figured it out. It's not a bug, but it should really be documented better.
You need to pass in the index's UUID as part of the PUT (I think this is a concurrency check). You can get the index's current UUID via GET /api/index/fts-index (it's in indexDef->uuid).
And once you have that, make it part of your update PUT body:
{
"name": "fts-index",
"type": "fulltext-index",
"params": {
// ... etc ...
},
"sourceType": "couchbase",
"sourceName": "travel-sample",
"sourceUUID": "307a1042c094b7314697980312f4b66b",
"sourceParams": {},
"planParams": {
// ... etc ...
},
"uuid": "89a125824b012319" // <--- right here
}
Once I did that, the update PUT went through just fine.
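Putting it together with curl (the host, port, credentials, and file name here are placeholders, not from the original answer), the flow looks roughly like this:

# fetch the current definition and read indexDef->uuid from the response
curl -u Administrator:password \
  http://localhost:8094/api/index/fts-index

# re-issue the PUT with that uuid included in the JSON body
curl -u Administrator:password -X PUT \
  -H "Content-Type: application/json" \
  http://localhost:8094/api/index/fts-index \
  -d @updated-index-definition.json

Port 8094 is the default Search (FTS) REST port on a single node; on a cluster, target a node that runs the Search service.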
I have an index with a couple of fields of type Edm.String and Collection(Edm.String). I want to have another index with the same fields plus another field of type Edm.Double. When I create such an index and try to add the same values (plus the newly added Edm.Double value) as I did to the first index, I'm getting the following error:
{
"error": {
"code": "",
"message": "The request is invalid. Details: parameters : An unexpected 'StartArray' node was found when reading from the JSON reader. A 'PrimitiveValue' node was expected.\r\n"
}
}
Does anyone know what this error means? I tried looking for it on the Internet but I couldn't find anything related to my situation. A sample request I'm sending to the new index looks like this:
POST https://myservicename.search.windows.net/indexes/newindexname/docs/index?api-version=2016-09-01
{
"value": [{
"#search.action": "upload",
"keywords": ["red", "lovely", "glowing", "cute"],
"name": "sample document",
"weight": 0.5,
"id": "67"
}]
}
The old index is the same but it doesn't have the "weight" parameter.
Edit: I created the index using the portal, so I don't have the exact JSON to create the index, but the fields are roughly like this:
Field     Type                    Attributes                           Analyzer
-----------------------------------------------------------------------------------
id        Edm.String              Key, Retrievable
name      Edm.String              Searchable, Filterable, Retrievable  Eng-Microsoft
keywords  Collection(Edm.String)  Searchable, Filterable, Retrievable  Eng-Microsoft
weight    Edm.Double              Filterable, Sortable
The reason I got the error was that I had made a mistake: I was sending a Collection(Edm.String) value (a JSON array) for a field whose actual type on the new index was Edm.String.
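To illustrate (a hedged sketch, not the exact definition used; the attribute flags and analyzer name only approximate the portal settings above), the field that receives the JSON array has to be declared as a collection when the index is created:

PUT https://myservicename.search.windows.net/indexes/newindexname?api-version=2016-09-01
{
  "name": "newindexname",
  "fields": [
    { "name": "id",       "type": "Edm.String",             "key": true },
    { "name": "name",     "type": "Edm.String",             "searchable": true, "filterable": true, "analyzer": "en.microsoft" },
    { "name": "keywords", "type": "Collection(Edm.String)", "searchable": true, "filterable": true, "analyzer": "en.microsoft" },
    { "name": "weight",   "type": "Edm.Double",             "filterable": true, "sortable": true }
  ]
}

With keywords declared as Collection(Edm.String), the upload payload from the question is accepted as-is.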