logstash: Unknown setting '"index"' for elasticsearch - mysql

I'm new to Elasticsearch and I'm trying to connect it to MySQL.
I followed multiple tutorials to install everything, but I'm getting these errors:
Unknown setting '"index"' and '"hosts"' for elasticsearch
The output of
sudo -Hu root /usr/share/logstash/bin/logstash --path.settings /etc/logstash/
returns:
> Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
> [2019-04-20T17:48:47,293][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.0.0"}
> [2019-04-20T17:48:53,873][ERROR][logstash.outputs.elasticsearch] Unknown setting '"document_type"' for elasticsearch
> [2019-04-20T17:48:53,878][ERROR][logstash.outputs.elasticsearch] Unknown setting '"hosts"' for elasticsearch
> [2019-04-20T17:48:53,878][ERROR][logstash.outputs.elasticsearch] Unknown setting '"index"' for elasticsearch
> [2019-04-20T17:48:53,891][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:86:in `config_init'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:60:in `initialize'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:232:in `initialize'", "org/logstash/config/ir/compiler/OutputDelegatorExt.java:48:in `initialize'", "org/logstash/config/ir/compiler/OutputDelegatorExt.java:30:in `initialize'", "org/logstash/plugins/PluginFactoryExt.java:239:in `plugin'", "org/logstash/plugins/PluginFactoryExt.java:137:in `buildOutput'", "org/logstash/execution/JavaBasePipelineExt.java:50:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:23:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}
> [2019-04-20T17:48:54,190][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
> [2019-04-20T17:48:59,066][INFO ][logstash.runner ] Logstash shut down.
Here is the content of the logstash.conf file:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/archief"
    # The user we wish to execute our statement as
    jdbc_user => "root"
    jdbc_password => "pswxxx"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/usr/share/java/mysql-connector-java-8.0.15.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    statement => "SELECT * FROM archief"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    "hosts" => ["localhost:9200"]
    "index" => "archief"
  }
}

There should be no double quotes around the option names:
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "archief"
  }
}

I had the same issue and solved it by downgrading Logstash from 7.4.2 to 6.3.2.
Logstash 6.3.2 link
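Once the quotes are removed, the pipeline file can be syntax-checked before restarting the service; assuming the same package layout as the command at the top of the question, something along the lines of:

```shell
sudo -Hu root /usr/share/logstash/bin/logstash --path.settings /etc/logstash/ --config.test_and_exit
```

This only validates the configuration and exits, so a quoting mistake like the one above shows up without starting the pipeline.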

Related

How correctly push data from Logstash to Elasticsearch server?

I am new to ELK. I need to visualize data from a PostgreSQL database in Kibana. I ran into a little problem and need some help.
I use:
Elasticsearch 6.4.1
Kibana 6.4.1
Logstash 6.4.1
When I run the following logstash.conf file, it doesn't send the correct data to the Elasticsearch server. What do I need to change in my configuration file?
logstash.conf:
input
{
  jbdc_connection_string => "path_to_database"
  jdbc_user => "postgres"
  jdbc_password => "postgres"
  jdbc_driver_library => "/path_to/postgresql-42.2.5.jar"
  jdbc_driver_class => "org.postgresql.Driver"
  statement => "SELECT * from documents"
}
output
{
  elasticsearch
  {
    hosts => ["localhost:9200"]
    index => "documents"
  }
}
Only when I use the following output configuration do I see the correct data in the terminal:
strout
{
  codes => json_lines
}
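For what it's worth, the snippet as posted also has a few transcription-level problems: the input settings are not wrapped in a jdbc { } block, and jbdc_connection_string, strout and codes are presumably meant to be jdbc_connection_string, stdout and codec. A sketch of the corrected file, assuming those are the only issues (the connection string is the placeholder from the question):

```conf
input {
  jdbc {
    jdbc_connection_string => "path_to_database"
    jdbc_user => "postgres"
    jdbc_password => "postgres"
    jdbc_driver_library => "/path_to/postgresql-42.2.5.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from documents"
  }
}
output {
  # keep stdout while debugging; remove it once the index looks right
  stdout { codec => json_lines }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "documents"
  }
}
```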

logstash: not able to connect mysql with logstash?

I am trying to connect to MySQL using Logstash and write into Elasticsearch. Below is my code in the conf file:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://192.168.2.24:3306/test"
    # The user we wish to execute our statement as
    jdbc_user => "uname"
    jdbc_password => "pass"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/usr/local/Cellar/logstash/6.2.4/mysql-connector-java-8.0.11.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    statement => "SELECT * FROM report_table"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "mysqlsample"
    document_type => "record"
  }
}
On running the above, I get the following error:
Error: com.mysql.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Stack:
/usr/local/Cellar/logstash/6.2.4/libexec/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:162:in `open_jdbc_connection'
/usr/local/Cellar/logstash/6.2.4/libexec/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:220:in `execute_statement'
/usr/local/Cellar/logstash/6.2.4/libexec/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:264:in `execute_query'
/usr/local/Cellar/logstash/6.2.4/libexec/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:250:in `run'
/usr/local/Cellar/logstash/6.2.4/libexec/logstash-core/lib/logstash/pipeline.rb:514:in `inputworker'
/usr/local/Cellar/logstash/6.2.4/libexec/logstash-core/lib/logstash/pipeline.rb:507:in `block in start_input'
Sounds like it's an issue with jdbc_driver_library => "/usr/local/Cellar/logstash/6.2.4/mysql-connector-java-8.0.11.jar".
Are you sure that's a valid path, and that it's the correct connector? Maybe try the one the documentation mentions: mysql-connector-java-5.1.36-bin.jar
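If the 8.x jar is kept instead, note that Connector/J 8 renamed the driver class, which can also produce a "not loaded" error even when the path is right. A hedged sketch of the two settings that would change (the jar path is the one from the question and still needs to be verified on disk):

```conf
input {
  jdbc {
    # Connector/J 8.x registers the driver as com.mysql.cj.jdbc.Driver;
    # com.mysql.jdbc.Driver is the legacy 5.x class name.
    jdbc_driver_library => "/usr/local/Cellar/logstash/6.2.4/mysql-connector-java-8.0.11.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    # remaining jdbc settings unchanged from the question
  }
}
```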

Migration from MySQL to Elasticsearch using Logstash

I am new to the ELK stack. I am working on data migration from MySQL to Elasticsearch. I am following this tutorial:
https://qbox.io/blog/migrating-mysql-data-into-elasticsearch-using-logstash
I have installed and configured MySQL and Elasticsearch, but I could not configure Logstash.
I don't know where to find logstash.conf, so I created a file named logstash.conf in the conf.d directory of the Logstash folder. I wrote the following in logstash.conf:
input {
  jdbc {
    jdbc_driver_library => "usr/share/java/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/books"
    jdbc_user => "root"
    jdbc_password => "root"
    statement => "SELECT * FROM authors"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    "hosts" => "localhost:9200"
    "index" => "my-authors"
    "document_type" => "data"
  }
}
But when I run the command bin/logstash -f logstash.conf from the /etc/logstash/conf.d folder in the Ubuntu terminal, it gives an error stating that bin/logstash does not exist.
Please help me with this issue.
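On Debian/Ubuntu package installs, the Logstash binary lives under /usr/share/logstash rather than /etc/logstash, which is why bin/logstash is not found there (the first question on this page invokes it from that location). Assuming such an install, the command would be along the lines of:

```shell
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf
```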

I can't connect my MySQL Database with jdbc in logstash

I want to load some data from a MySQL database with Logstash.
Here is my jdbc.conf:
input {
  jdbc {
    jdbc_driver_library => "/mysql-connector-java-5.1.40/mysql-connector-java-5.1.40-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://111.11.11.111:3306/dbname"
    jdbc_user => "user"
    jdbc_password => "****"
    statement => "SELECT title from test"
  }
}
output {
  stdout { codec => json }
}
The username, password, host, database name and column name are fake, and the output is just for testing.
My database is on the same VPS server.
--configtest passes cleanly. However, I got this error:
/opt/logstash/bin/logstash -f /opt/logstash/bin/config/jdbc.conf
Settings: Default pipeline workers: 4
Pipeline aborted due to error {
:exception=>"LogStash::ConfigurationError",
:backtrace=>[
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-3.1.0/lib/logstash/plugin_mixins/jdbc.rb:159:in `prepare_jdbc_connection'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-3.1.0/lib/logstash/inputs/jdbc.rb:187:in `register'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:330:in `start_inputs'",
"org/jruby/RubyArray.java:1613:in `each'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:329:in `start_inputs'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:180:in `start_workers'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"],
:level=>:error}
stopping pipeline {:id=>"main"}
I got a LogStash::ConfigurationError. What's wrong with my config?
I finally figured it out.
It was a bug in the JDBC driver.
When I hit the bug, I was using version 5.1.40 downloaded from the MySQL web page.
After I changed it to mysql-connector-java-5.1.17 via yum, it works.
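When the connector comes from the distribution repositories (yum or apt) rather than a manual download, the jar is typically installed under /usr/share/java, so jdbc_driver_library would point there. A sketch assuming that layout (the exact jar filename varies by distro and version, so verify it on your system):

```conf
input {
  jdbc {
    # path used by the packaged connector; check the actual filename under /usr/share/java
    jdbc_driver_library => "/usr/share/java/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # remaining jdbc settings as in the question
  }
}
```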

Mnesia DB elasticsearch

I am using an ejabberd server for chat communication. I'd like to be able to dynamically search my archived messages. Right now I'm using Elasticsearch and Logstash, but that only works with a MySQL database. This is my logstash config:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/ejabberd"
    jdbc_user => "ejabber"
    jdbc_password => "password"
    jdbc_driver_library => "mysql-connector-java-5.1.39-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM ejabberd.archive"
  }
}
output {
  # stdout { codec => json_lines }
  elasticsearch {
    index => "muc_room"
    hosts => ["localhost:9200"]
  }
}
I need to use the Mnesia DB, the default database for ejabberd. How can I connect the Mnesia DB to Logstash, or is there another way to add a search engine on top of the Mnesia DB? Thank you.
I would send the data directly to Elasticsearch from ejabberd. That way you don't have two separate things that need to be updated if you change storage engines. There's an Erlang package for talking to Elasticsearch; its documentation isn't great, but it's a pretty simple interface anyway.