Error loading JSON file on Elasticsearch AWS

I've just set up an Elasticsearch domain using the Elasticsearch Service from AWS.
Now I want to feed it a JSON file using:
curl -XPOST 'my-aws-domain-here/_bulk/' --data-binary @base_enquete.json
according to the documentation here.
My JSON file looks like the following:
[{"INDID": "10040","DATENQ": "29/7/2013","Name": "LANDIS MADAGASCAR SA"},
{"INDID": "10050","DATENQ": "14/8/2013","Name": "MADAFOOD SA","M101P": ""}]
which gives me this error:
{"error":"ActionRequestValidationException[Validation Failed: 1: no requests added;]","status":400}
I tried without [ and ], same error!
Note that I already set up the access policy to be open to the world for dev-stage purposes.
Any help of any kind will be helpful :)

This is because the data is in the wrong format.
Please go through the documentation here.
It should be in the following format:
action_and_meta_data\n
optional_source\n
action_and_meta_data\n
optional_source\n
....
action_and_meta_data\n
optional_source\n
This means the content of the file you are sending should be in the following format:
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{"INDID": "10040","DATENQ": "29/7/2013","Name": "LANDIS MADAGASCAR SA"}
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "2" } }
{"INDID": "10050","DATENQ": "14/8/2013","Name": "MADAFOOD SA","M101P": ""}

Related

Mapping definition for [suggest] has unsupported parameters: [payloads : true]

I am using an example right from the Elasticsearch documentation here, using the Completion Suggester, but I am getting an error saying payloads: true is an unsupported parameter. It obviously is supported, unless the docs are wrong? I have the latest Elasticsearch installed (5.3.0).
Here is my cURL:
curl -X PUT localhost:9200/search/pages/_mapping -d '{
  "pages" : {
    "properties": {
      "title": {
        "type" : "string"
      },
      "suggest" : {
        "type" : "completion",
        "analyzer" : "simple",
        "search_analyzer" : "simple",
        "payloads" : true
      }
    }
  }
}';
And the error:
{
"error" : {
"root_cause" : [
{
"type" : "mapper_parsing_exception",
"reason" : "Mapping definition for [suggest] has unsupported parameters: [payloads : true]"
}
],
"type" : "mapper_parsing_exception",
"reason" : "Mapping definition for [suggest] has unsupported parameters: [payloads : true]"
},
"status" : 400
}
The payload parameter has been removed in Elasticsearch 5.3.0 by the following commit: Remove payload option from completion suggester. Here is the commit message:
The payload option was introduced with the new completion
suggester implementation in v5, as a stop gap solution
to return additional metadata with suggestions.
Now we can return associated documents with suggestions
(#19536) through fetch phase using stored field (_source).
The additional fetch phase ensures that we only fetch
the _source for the global top-N suggestions instead of
fetching _source of top results for each shard.
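So the fix is to drop the payloads option from the mapping. A minimal sketch of the same request for 5.3 (note that string has also been replaced by text in 5.x, so the title field is adjusted here too):
curl -X PUT localhost:9200/search/pages/_mapping -d '{
  "pages" : {
    "properties" : {
      "title" : {
        "type" : "text"
      },
      "suggest" : {
        "type" : "completion",
        "analyzer" : "simple",
        "search_analyzer" : "simple"
      }
    }
  }
}'
If you still need per-suggestion metadata, you can fetch it from the _source of the returned documents instead, as the commit message describes.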

Moving mapping from old ElasticSearch to latest ES (5)

I've inherited a pretty old (v2.something) Elasticsearch instance running in the cloud somewhere and need to get the data out, starting with the mappings, to a local instance of the latest ES (v5). Unfortunately, it fails with the following error:
% curl -X PUT 'http://127.0.0.1:9200/easysearch?pretty=true' --data @easysearch_mapping.json
{
"error" : {
"root_cause" : [
{
"type" : "illegal_argument_exception",
"reason" : "unknown setting [index.easysearch.mappings.espdf.properties.abstract.type] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
}
],
"type" : "illegal_argument_exception",
"reason" : "unknown setting [index.easysearch.mappings.espdf.properties.abstract.type] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
},
"status" : 400
}
The mapping I got from the old instance does contain some fields of this kind:
"espdf" : {
"properties" : {
"abstract" : {
"type" : "string"
},
"document" : {
"type" : "attachment",
"fields" : {
"content" : {
"type" : "string"
},
"author" : {
"type" : "string"
},
"title" : {
"type" : "string"
},
This "espdf" thing probably comes from Meteor's "EasySearch" component, but I have more structures like this in the mapping and new ES rejects each of them (I tried editing the mapping and deleting the "espdf" key and value).
How can I get the new ES to accept the mapping? Is this some legacy issue from 2.x ES and I should somehow convert this to new 5.x ES format?
The reason it fails is that the older ES had a plugin installed called mapper-attachments, which added the attachment mapping type to ES.
In ES 5, this plugin has been replaced by the ingest-attachment plugin, which you can install like this:
bin/elasticsearch-plugin install ingest-attachment
After running this command in your ES_HOME folder, restart your ES cluster and it should go better.
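Note that ingest-attachment works differently from the old mapper-attachments plugin: there is no attachment mapping type any more; instead you define an ingest pipeline with an attachment processor and index the base64-encoded file through it. A rough sketch (the pipeline name and the data field are assumptions, and the easysearch/espdf names are just taken from this question):
curl -X PUT 'localhost:9200/_ingest/pipeline/attachment' -d '{
  "description" : "Extract attachment information",
  "processors" : [
    { "attachment" : { "field" : "data" } }
  ]
}'
curl -X PUT 'localhost:9200/easysearch/espdf/1?pipeline=attachment' -d '{
  "data" : "base64-encoded file content goes here"
}'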

Elasticsearch - Sense - Indexing JSON files?

I'm trying to load some JSON files to my local ES instance via Sense, but I can't seem to figure the code out. I know ES has the Bulk API and the Index API, but I can't seem to bring the code together. How can I upload/index JSON files to my local ES instance using Sense? Thank you!
Yes, ES has a bulk API to upload JSON files to the ES cluster. I don't think that API is exposed in low-level clients such as Sense, which is just JavaScript running in the browser. High-level clients are available in Java or C#, which expose more control over the ES cluster. I don't think the Chrome browser will support execution of this command.
To upload a JSON file to Elasticsearch using the bulk API:
1) This command uploads JSON documents from a JSON file.
curl -s -XPOST localhost:9200/_bulk --data-binary @path_to_file;
2) The JSON file should be formatted as follows:
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{ "field1" : "value1" }
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{ "field1" : "value3" }
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{ "doc" : {"field2" : "value2"} }
Here each source line (the plain JSON object, or the doc object for an update) holds the document data, and the action line before it carries the metadata for that particular document, such as the document id, the type within the index, and the index name.
Link to the bulk upload documentation.
You can also refer to my previous answer.
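That said, if you want to stay inside Sense, it cannot read a file from disk, but you can paste the bulk body straight into the request. A small sketch, reusing the placeholder index and type names from above:
POST _bulk
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{ "field1" : "value1" }
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "2" } }
{ "field1" : "value3" }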

Invalid request error in AWS::Route53::RecordSet when creating stack with AWS CloudFormation json

Invalid request error in AWS::Route53::RecordSet when creating stack with AWS CloudFormation json. Here is the error:
CREATE_FAILED AWS::Route53::RecordSet ApiRecordSet Invalid request
Here is the ApiRecordSet:
"ApiRecordSet" : {
"Type" : "AWS::Route53::RecordSet",
"Properties" : {
"AliasTarget" :{
"DNSName": {"Fn::GetAtt" : ["RestELB", "CanonicalHostedZoneName"]},
"HostedZoneId": {"Fn::GetAtt": ["RestELB", "CanonicalHostedZoneNameID"]}
},
"HostedZoneName" : "some.net.",
"Comment" : "A records for my frontends.",
"Name" : {"Fn::Join": ["", ["api",{"Ref": "Env"},".some.net."]]},
"Type" : "A",
"TTL" : "300"
}
}
What is wrong/invalid in this request?
The only thing I see immediately wrong is that you are using both an AliasTarget and TTL at the same time. You can't do that since the record uses the TTL defined in the AliasTarget. For more info check out the documentation on RecordSet here.
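As a sketch, the same ApiRecordSet with the TTL removed (everything else unchanged from the question) would be:
"ApiRecordSet" : {
  "Type" : "AWS::Route53::RecordSet",
  "Properties" : {
    "AliasTarget" : {
      "DNSName": {"Fn::GetAtt" : ["RestELB", "CanonicalHostedZoneName"]},
      "HostedZoneId": {"Fn::GetAtt": ["RestELB", "CanonicalHostedZoneNameID"]}
    },
    "HostedZoneName" : "some.net.",
    "Comment" : "A records for my frontends.",
    "Name" : {"Fn::Join": ["", ["api",{"Ref": "Env"},".some.net."]]},
    "Type" : "A"
  }
}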
I also got this error and fixed it by removing the "SetIdentifier" field on record sets where it was not needed.
It is only needed when the "Name" and "Type" fields of multiple records are the same.
Documentation on AWS::Route53::RecordSet

How to index CouchDB from the Elasticsearch server with the Elasticsearch river plugin and get JSON data

I am working on a graphical representation of data. The graph accepts JSON data, hence I need to fetch the required data from CouchDB. I am using an Elasticsearch server to index CouchDB and retrieve the required data.
I am using the Elasticsearch river plugin to connect CouchDB and the Elasticsearch server.
I have created the CouchDB database 'testdb' and created some test documents in it.
Set up Elasticsearch with the database.
On testing this with a curl GET command using the default search criteria, we should get 'total hits' greater than 0, and 'hits' should contain some response values for the searched criteria.
But we are getting 'total hits' as 0 and 'hits': [] (i.e. empty).
Procedures I followed.
1) Downloaded and installed couchdb latest version
2) Verified CouchDB is running
curl localhost:5984
I got a response that starts with:
{"couchdb":"Welcome"...
3) Downloaded ElasticSearch and installed service
service.bat install
curl http://127.0.0.1:9200
I got a response like:
{ "ok" : true, "status" : 200,.....
4) Installed the CouchDB River Plugin for ElasticSearch 1.4.2
plugin -install elasticsearch/elasticsearch-river-couchdb/2.4.1
5) To Create the CouchDB Database and ElasticSearch Index
curl -X PUT "http://127.0.0.1:5984/testdb"
6) To Create some test documents:
curl -X PUT "http://127.0.0.1:5984/testdb/1" -d "{\"name\":\"My
Name 1\"}"
curl -X PUT "http://127.0.0.1:5984/testdb/2" -d
"{\"name\":\"My Name 2\"}"
curl -X PUT
"http://127.0.0.1:5984/testdb/3" -d "{\"name\":\"My Name 3\"}"
curl
-X PUT "http://127.0.0.1:5984/testdb/4" -d "{\"name\":\"My Name 4\"}"
7) To Setup ElasticSearch with the Database
curl -X PUT "127.0.0.1:9200/_river/testdb/_meta" -d "{ \"type\" :
\"couchdb\", \"couchdb\" : { \"host\" : \"localhost\", \"port\" :
5984, \"db\" : \"testdb\", \"filter\" : null }, \"index\" : {
\"index\" : \"testdb\", \"type\" : \"testdb\", \"bulk_size\" :
\"100\", \"bulk_timeout\" : \"10ms\" } }"
8) To test it
curl "http://127.0.0.1:9200/testdb/testdb/_search?pretty=true"
On testing, we should get this:
{
"took" : 4,
"timed_out" : false,
"_shards" : {
"total" : 5,
"successful" : 5,
"failed" : 0
},
"hits" : {
"total" : 4,
"max_score" : 1.0,
"hits" : [ {
"_index" : "testdb",
"_type" : "testdb",
"_id" : "4",
"_score" : 1.0, "_source" : {"_rev":"1-7e9376fc8bfa6b8c8788b0f408154584","_id":"4","name":"My Name 4"}
}, {
"_index" : "testdb",
"_type" : "testdb",
"_id" : "1",
"_score" : 1.0, "_source" : {"_rev":"1-87386bd54c821354a93cf62add449d31","_id":"1","name":"My Name"}
}, {
"_index" : "testdb",
"_type" : "testdb",
"_id" : "2",
"_score" : 1.0, "_source" : {"_rev":"1-194582c1e02d84ae36e59f568a459633","_id":"2","name":"My Name 2"}
}, {
"_index" : "testdb",
"_type" : "testdb",
"_id" : "3",
"_score" : 1.0, "_source" : {"_rev":"1-62a53c50e7df02ec22973fc802fb9fc0","_id":"3","name":"My Name 3"}
} ]
}
}
But I got something like this
{
"error" : "IndexMissingException[[testdb] missing]",
"status" : 404
}
This curl string doesn't need the additional testdb type. This:
curl "http://127.0.0.1:9200/testdb/testdb/_search?pretty=true"
Should be this:
curl 'http://localhost:9200/testdb/_search?pretty=true'
You can view all your indices by running the following and ensuring your search is against one of them:
curl -X GET 'localhost:9200/_cat/indices'
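If the index still does not show up, it may also be worth checking that the river document from step 7 was actually registered, for example:
curl 'http://127.0.0.1:9200/_river/testdb/_meta?pretty=true'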