I want to index and search a MySQL database using Elasticsearch, and I followed this tutorial:
elasticsearch-river-jdbc
First I downloaded Elasticsearch and installed river-jdbc in its plugin folder.
Then I added the MySQL JDBC driver inside ES_HOME/elasticsearch-0.90.1/plugins/river-jdbc/.
Then I started Elasticsearch, opened another terminal window,
and created a new JDBC river named my_jdbc_river with this curl command:
curl -XPUT 'localhost:9200/_river/my_jdbc_river/_meta' -d '{
    "type" : "jdbc",
    "jdbc" : {
        "driver" : "com.mysql.jdbc.Driver",
        "url" : "jdbc:mysql://localhost:3306/bablool",
        "user" : "root",
        "password" : "babloo",
        "sql" : "select * from details"
    },
    "index" : {
        "index" : "jdbc",
        "type" : "jdbc"
    }
}'
Then when I run this command: curl -XGET 'localhost:9200/jdbc/jdbc/_search?pretty&q=*'
I get the following error:
"error": "IndexMissingException[[jdbc] missing]", "status" : 404
Please help.
I bumped into the same error following the tutorial mentioned in the question. After re-reading the documentation, I noticed that I had forgotten to restart Elasticsearch after installing the river-jdbc plugin.
This tutorial is also interesting if you are starting to learn about indexing SQL databases.
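If it helps, here is a minimal sketch of the fix, assuming a default local install (the service name and paths are assumptions):
sudo service elasticsearch restart    # or restart it from ES_HOME/bin
# then check that the river definition exists and query again
curl -XGET 'localhost:9200/_river/my_jdbc_river/_meta?pretty'
curl -XGET 'localhost:9200/jdbc/jdbc/_search?pretty&q=*'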
I want to get data dynamically from MySQL tables into my Elasticsearch index. For that I used the following links, but did not get a proper result.
I used the following code:
echo '{
    "type" : "jdbc",
    "jdbc" : {
        "url" : "jdbc:mysql://localhost:3306/CDFL",
        "user" : "root",
        "password" : "root",
        "useSSL" : "false",
        "sql" : "SELECT * FROM event",
        "index" : "event",
        "type" : "event",
        "autocommit" : "true",
        "metrics" : {
            "enabled" : true
        },
        "elasticsearch" : {
            "cluster" : "servercluster",
            "host" : "localhost",
            "port" : 9300
        }
    }
}' | java -cp "/etc/elasticsearch/elasticsearch-jdbc-2.3.4.0/lib/*" "-Dlog4j.configurationFile=file:////etc/elasticsearch/elasticsearch-jdbc-2.3.4.0/bin/log4j2.xml" "org.xbib.tools.Runner" "org.xbib.tools.JDBCImporter"
To find a solution I referred to the following links:
ElasticSearch how to integrate with Mysql
https://github.com/jprante/elasticsearch-jdbc
Fetching changes from table with ElasticSearch JDBC river
https://github.com/logstash-plugins/logstash-input-jdbc
I have found an answer to this question:
Create a file called event.sh in the root directory and put the following code in it.
event.sh
curl -XDELETE 'localhost:9200/event'
bin=/etc/elasticsearch/elasticsearch-jdbc-2.3.4.0/bin
lib=/etc/elasticsearch/elasticsearch-jdbc-2.3.4.0/lib
echo '{
    "type" : "jdbc",
    "jdbc" : {
        "url" : "jdbc:mysql://localhost:3306/CDFL",
        "user" : "root",
        "password" : "root",
        "useSSL" : "false",
        "sql" : "SELECT * FROM event",
        "index" : "event",
        "type" : "event",
        "poll" : "6s",
        "autocommit" : "true",
        "metrics" : {
            "enabled" : true
        },
        "elasticsearch" : {
            "cluster" : "servercluster",
            "host" : "localhost",
            "port" : 9300
        }
    }
}' | java -cp "$lib/*" "-Dlog4j.configurationFile=file:///$bin/log4j2.xml" "org.xbib.tools.Runner" "org.xbib.tools.JDBCImporter"
echo "sleeping while importer should run..."
sleep 10
curl -XGET 'localhost:9200/event/_refresh'
Run that file from the command line by typing the following command:
sh elasticSearch/event.sh
That works fine.
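To confirm the import actually worked, a quick check like the following should show the documents (assuming the importer wrote to the local node, as configured above):
curl -XGET 'localhost:9200/event/_count?pretty'
curl -XGET 'localhost:9200/event/event/_search?pretty&size=5'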
I've inherited a pretty old (v2.something) Elasticsearch instance running in the cloud somewhere and need to get the data out, starting with the mappings, to a local instance of the latest ES (v5). Unfortunately, it fails with the following error:
% curl -X PUT 'http://127.0.0.1:9200/easysearch?pretty=true' --data @easysearch_mapping.json
{
"error" : {
"root_cause" : [
{
"type" : "illegal_argument_exception",
"reason" : "unknown setting [index.easysearch.mappings.espdf.properties.abstract.type] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
}
],
"type" : "illegal_argument_exception",
"reason" : "unknown setting [index.easysearch.mappings.espdf.properties.abstract.type] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
},
"status" : 400
}
The mapping I got from old instance does contain some fields of this kind:
"espdf" : {
"properties" : {
"abstract" : {
"type" : "string"
},
"document" : {
"type" : "attachment",
"fields" : {
"content" : {
"type" : "string"
},
"author" : {
"type" : "string"
},
"title" : {
"type" : "string"
},
This "espdf" thing probably comes from Meteor's "EasySearch" component, but I have more structures like this in the mapping and new ES rejects each of them (I tried editing the mapping and deleting the "espdf" key and value).
How can I get the new ES to accept the mapping? Is this some legacy issue from 2.x ES and I should somehow convert this to new 5.x ES format?
The reason it fails is that the older ES had a plugin installed called mapper-attachments, which added the attachment mapping type to ES.
In ES 5, this plugin has been replaced by the ingest-attachment plugin, which you can install like this:
bin/elasticsearch-plugin install ingest-attachment
After running this command in your ES_HOME folder, restart your ES cluster and it should go better.
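Once the plugin is installed, attachments go through an ingest pipeline instead of a mapping type. A minimal sketch, assuming the base64-encoded file is sent in the "document" field (the pipeline name and sample value here are illustrative, not taken from the original mapping):
curl -XPUT 'localhost:9200/_ingest/pipeline/attachment' -d '{
  "description" : "extract attachment content",
  "processors" : [ { "attachment" : { "field" : "document" } } ]
}'
curl -XPUT 'localhost:9200/easysearch/espdf/1?pipeline=attachment' -d '{
  "document" : "e1xydGYxXGFuc2kNCkxvcmVtIGlwc3VtIGRvbG9yIHNpdCBhbWV0DQpccGFyIH0="
}'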
I've just set up an elasticsearch domain using Elastic search service from aws.
Now I want to feed it with some json file using:
curl -XPOST 'my-aws-domain-here/_bulk/' --data-binary @base_enquete.json
according to the documentation here.
My json file looks like the following:
[{"INDID": "10040","DATENQ": "29/7/2013","Name": "LANDIS MADAGASCAR SA"},
{"INDID": "10050","DATENQ": "14/8/2013","Name": "MADAFOOD SA","M101P": ""}]
which gives me this error:
{"error":"ActionRequestValidationException[Validation Failed: 1: no requests added;]","status":400}
I tried without [ and ] same error!
Note that I already set up access policy to be open to the world for dev stage purpose.
Any help of any kind will be helpful :)
This is because the data is in the wrong format.
Please go through the documentation here.
It should be in the following format:
action_and_meta_data\n
optional_source\n
action_and_meta_data\n
optional_source\n
....
action_and_meta_data\n
optional_source\n
This means the content of the file you are sending should be in the following format:
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{"INDID": "10040","DATENQ": "29/7/2013","Name": "LANDIS MADAGASCAR SA"}
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "2" } }
{"INDID": "10050","DATENQ": "14/8/2013","Name": "MADAFOOD SA","M101P": ""}
I am working on a graphical representation of data. The graph accepts JSON data, hence I need to fetch the required data from CouchDB. I am using an Elasticsearch server for indexing CouchDB and thereby retrieving the required data.
I am using the Elasticsearch river plugin to connect CouchDB and the Elasticsearch server.
I have created the CouchDB database 'testdb' and created some test documents in it.
I set up Elasticsearch with the database.
On testing with a curl GET command and the default search criteria, we should get 'total hits' greater than 0 and the 'hits' should contain some response for the searched criteria.
But we are getting 'total hits' as 0 and 'hits':[] (i.e. empty).
Procedure I followed:
1) Downloaded and installed couchdb latest version
2) Verified CouchDB is running
curl localhost:5984
I got a response that starts with:
{"couchdb":"Welcome"...
3) Downloaded ElasticSearch and installed service
service.bat install
curl http://127.0.0.1:9200
I got a response like:
{ "ok" : true, "status" : 200,.....
4) Installed the CouchDB River Plugin for ElasticSearch 1.4.2
plugin -install elasticsearch/elasticsearch-river-couchdb/2.4.1
5) To Create the CouchDB Database and ElasticSearch Index
curl -X PUT "http://127.0.0.1:5984/testdb"
6) To Create some test documents:
curl -X PUT "http://127.0.0.1:5984/testdb/1" -d "{\"name\":\"My
Name 1\"}"
curl -X PUT "http://127.0.0.1:5984/testdb/2" -d
"{\"name\":\"My Name 2\"}"
curl -X PUT
"http://127.0.0.1:5984/testdb/3" -d "{\"name\":\"My Name 3\"}"
curl
-X PUT "http://127.0.0.1:5984/testdb/4" -d "{\"name\":\"My Name 4\"}"
7) To Setup ElasticSearch with the Database
curl -X PUT "127.0.0.1:9200/_river/testdb/_meta" -d "{ \"type\" :
\"couchdb\", \"couchdb\" : { \"host\" : \"localhost\", \"port\" :
5984, \"db\" : \"testdb\", \"filter\" : null }, \"index\" : {
\"index\" : \"testdb\", \"type\" : \"testdb\", \"bulk_size\" :
\"100\", \"bulk_timeout\" : \"10ms\" } }"
8) To test it
curl "http://127.0.0.1:9200/testdb/testdb/_search?pretty=true"
On testing, we should get this:
{
"took" : 4,
"timed_out" : false,
"_shards" : {
"total" : 5,
"successful" : 5,
"failed" : 0
},
"hits" : {
"total" : 4,
"max_score" : 1.0,
"hits" : [ {
"_index" : "testdb",
"_type" : "testdb",
"_id" : "4",
"_score" : 1.0, "_source" : {"_rev":"1-7e9376fc8bfa6b8c8788b0f408154584","_id":"4","name":"My Name 4"}
}, {
"_index" : "testdb",
"_type" : "testdb",
"_id" : "1",
"_score" : 1.0, "_source" : {"_rev":"1-87386bd54c821354a93cf62add449d31","_id":"1","name":"My Name"}
}, {
"_index" : "testdb",
"_type" : "testdb",
"_id" : "2",
"_score" : 1.0, "_source" : {"_rev":"1-194582c1e02d84ae36e59f568a459633","_id":"2","name":"My Name 2"}
}, {
"_index" : "testdb",
"_type" : "testdb",
"_id" : "3",
"_score" : 1.0, "_source" : {"_rev":"1-62a53c50e7df02ec22973fc802fb9fc0","_id":"3","name":"My Name 3"}
} ]
}
}
But I got something like this:
{
"error" : "IndexMissingException[[testdb] missing]",
"status" : 404
}
This curl string doesn't need the additional testdb. This:
curl "http://127.0.0.1:9200/testdb/testdb/_search?pretty=true"
Should be this:
curl 'http://localhost:9200/testdb/_search?pretty=true'
You can view all your indices by running the following and ensuring your search is against one of them:
curl -X GET 'localhost:9200/_cat/indices'
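If testdb does not appear in that list, the river itself probably never started. A quick way to check, assuming the names from the question (the _status document is written by the river plugin):
curl -XGET 'localhost:9200/_cat/indices?v'
curl -XGET 'localhost:9200/_river/testdb/_status?pretty'
curl -XGET 'localhost:9200/testdb/_search?pretty=true'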
curl -XPUT 'localhost:9200/_river/my_jdbc_river/_meta' -d '{
    "type" : "jdbc",
    "jdbc" : {
        "driver" : "com.mysql.jdbc.Driver",
        "url" : "jdbc:mysql://localhost:3306/springtest",
        "user" : "root",
        "password" : "root",
        "sql" : "select * from register",
        "index" : "my_register",
        "type" : "my_register_type"
    }
}'
The connection is successful, but an error is generated like:
error: NoClassSettingsException[Failed to load class with value [jdbc]]; nested:
ClassNotFoundException[jdbc];
I presume you need to add
"driver": "com.mysql.jdbc.Driver"
to your jdbc definition.
Also, check that you have done all the steps mentioned there: https://github.com/jprante/elasticsearch-river-jdbc/wiki/Quickstart
(especially steps 4 and 5, related to registering the MySQL driver with your Elasticsearch instance)
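As a rough sketch of those two steps (the connector jar version and ES_HOME path are assumptions, not from the question):
# copy the MySQL JDBC driver jar into the river-jdbc plugin folder, then restart Elasticsearch
cp mysql-connector-java-5.1.25-bin.jar $ES_HOME/plugins/river-jdbc/
$ES_HOME/bin/elasticsearch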