I am trying to run a conf file that reads from a MySQL database as input and produces JSON lines as output, but an error occurs.
I'm including the error and the conf file in this question.
Sending Logstash's logs to C:/logstash-6.0.1/logs which is now configured via log4j2.properties
[2017-12-13T17:33:43,978][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/logstash-6.0.1/modules/fb_apache/configuration"}
[2017-12-13T17:33:43,978][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x1896b5d @module_name="fb_apache", @directory="C:/logstash-6.0.1/modules/fb_apache/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2017-12-13T17:33:43,978][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/logstash-6.0.1/modules/netflow/configuration"}
[2017-12-13T17:33:43,978][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x18b75ca @module_name="netflow", @directory="C:/logstash-6.0.1/modules/netflow/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2017-12-13T17:33:44,038][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"C:/logstash-6.0.1/config/pipelines.yml"}
[2017-12-13T17:33:44,038][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-13T17:33:44,068][DEBUG][logstash.agent ] Agent: Configuring metric collection
[2017-12-13T17:33:44,068][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-13T17:33:44,088][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-13T17:33:44,138][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-13T17:33:44,148][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-13T17:33:44,148][DEBUG][logstash.agent ] starting agent
[2017-12-13T17:33:44,158][DEBUG][logstash.agent ] Starting puma
[2017-12-13T17:33:44,158][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2017-12-13T17:33:44,168][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["C:/logstash-6.0.1/bin/BCHARTS-MTGOXUSD.csv", "C:/logstash-6.0.1/bin/cpdump", "C:/logstash-6.0.1/bin/ingest-convert.sh", "C:/logstash-6.0.1/bin/logstash", "C:/logstash-6.0.1/bin/logstash-plugin", "C:/logstash-6.0.1/bin/logstash-plugin.bat", "C:/logstash-6.0.1/bin/logstash.bat", "C:/logstash-6.0.1/bin/logstash.conf", "C:/logstash-6.0.1/bin/logstash.lib.sh", "C:/logstash-6.0.1/bin/mysql-connector-java-5.1.38.jar", "C:/logstash-6.0.1/bin/mysql-connector-java-5.1.45-bin.jar", "C:/logstash-6.0.1/bin/ruby", "C:/logstash-6.0.1/bin/setup.bat", "C:/logstash-6.0.1/bin/system-install"]}
[2017-12-13T17:33:44,178][DEBUG][logstash.api.service ] [api-service] start
[2017-12-13T17:33:44,178][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"C:/logstash-6.0.1/bin/db.conf"}
[2017-12-13T17:33:44,208][DEBUG][logstash.agent ] Converging pipelines
[2017-12-13T17:33:44,208][DEBUG][logstash.agent ] Needed actions to converge {:actions_count=>1}
[2017-12-13T17:33:44,218][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2017-12-13T17:33:44,298][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-12-13T17:33:44,768][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"jdbc", :type=>"input", :class=>LogStash::Inputs::Jdbc}
[2017-12-13T17:33:44,778][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"ArgumentError", :message=>"invalid byte sequence in UTF-8", :backtrace=>["org/jruby/RubyString.java:2541:in `gsub'", "C:/logstash-6.0.1/logstash-core/lib/logstash/util/environment_variables.rb:28:in `replace_env_placeholders'", "C:/logstash-6.0.1/logstash-core/lib/logstash/util/environment_variables.rb:18:in `deep_replace'", "C:/logstash-6.0.1/logstash-core/lib/logstash/config/mixin.rb:109:in `block in config_init'", "org/jruby/RubyHash.java:1343:in `each'", "C:/logstash-6.0.1/logstash-core/lib/logstash/config/mixin.rb:108:in `config_init'", "C:/logstash-6.0.1/logstash-core/lib/logstash/inputs/base.rb:62:in `initialize'", "C:/logstash-6.0.1/logstash-core/lib/logstash/pipeline.rb:152:in `plugin'", "(eval):8:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "C:/logstash-6.0.1/logstash-core/lib/logstash/pipeline.rb:82:in `initialize'", "C:/logstash-6.0.1/logstash-core/lib/logstash/pipeline.rb:215:in `initialize'", "C:/logstash-6.0.1/logstash-core/lib/logstash/pipeline_action/create.rb:35:in `execute'", "C:/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:335:in `block in converge_state'", "C:/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "C:/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "C:/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "C:/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "C:/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:90:in `execute'", "C:/logstash-6.0.1/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", "C:/logstash-6.0.1/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2017-12-13T17:33:44,798][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Stopping
[2017-12-13T17:33:44,798][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Stopping
[2017-12-13T17:33:44,798][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Stopping
The conf file:
input {
jdbc {
jdbc_driver_library => "C:\logstash-6.0.1\bin\mysql-connector-java-5.1.45-bin.jar"
jdbc_driver_class => "com.mysql.jdbc.Driver"
jdbc_connection_string => "jdbc:mysql://localhost:3306/database"
jdbc_user => "root"
jdbc_password => "root"
statement => "SELECT * FROM table"
}
}
output {
stdout { codec => rubydebug }
}
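As an aside, since the goal stated above is JSON-lines output: once the pipeline starts, swapping the stdout codec is enough. A minimal sketch of the same output section with only the codec changed:
output {
  # emits one JSON document per line instead of the rubydebug pretty-print
  stdout { codec => json_lines }
}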
EDIT
This is the first row of the data (header line, then one record):
cod,cod_veiculo,cod_destino,cod_origem
5530041,16555,84661,1187
Related
I am relatively new to Logstash and Elasticsearch.
I installed Logstash and Elasticsearch using Homebrew on macOS Mojave (10.14.2):
brew install logstash
brew install elasticsearch
When I check for these versions:
brew list --versions
I receive the following output:
elasticsearch 6.5.4
logstash 6.5.4
When I open up Google Chrome and type this into the URL Address field:
localhost:9200
This is the JSON response that I receive:
{
"name" : "9oJAP16",
"cluster_name" : "elasticsearch_local",
"cluster_uuid" : "PgaDRw8rSJi-NDo80v_6gQ",
"version" : {
"number" : "6.5.4",
"build_flavor" : "oss",
"build_type" : "tar",
"build_hash" : "d2ef93d",
"build_date" : "2018-12-17T21:17:40.758843Z",
"build_snapshot" : false,
"lucene_version" : "7.5.0",
"minimum_wire_compatibility_version" : "5.6.0",
"minimum_index_compatibility_version" : "5.0.0"
},
"tagline" : "You Know, for Search"
}
Inside:
/usr/local/etc/logstash/logstash.yml
are the following settings:
path.data: /usr/local/Cellar/logstash/6.5.4/libexec/data
pipeline.workers: 2
path.config: /usr/local/etc/logstash/conf.d
log.level: info
path.logs: /usr/local/var/log
Inside:
/usr/local/etc/logstash/pipelines.yml
are the following settings:
- pipeline.id: main
path.config: "/usr/local/etc/logstash/conf.d/*.conf"
I have set up the following logstash_etl.conf file underneath:
/usr/local/etc/logstash/conf.d
Its contents:
input {
jdbc {
jdbc_connection_string => "jdbc:mysql://myapp-production.crankbftdpmc.us-west-2.rds.amazonaws.com:3306/products"
jdbc_user => "products_admin"
jdbc_password => "products123"
jdbc_driver_library => "/etc/logstash/mysql-connector/mysql-connector-java-5.1.21.jar"
jdbc_driver_class => "com.mysql.jdbc.driver"
schedule => "*/5 * * * *"
statement => "select * from products"
use_column_value => false
clean_run => true
}
}
# sudo /usr/share/logstash/bin/logstash-plugin install logstash-output-exec
output {
if ([purge_task] == "yes") {
exec {
command => "curl -XPOST 'localhost:9200/_all/products/_delete_by_query?conflicts=proceed' -H 'Content-Type: application/json' -d'
{
\"query\": {
\"range\" : {
\"#timestamp\" : {
\"lte\" : \"now-3h\"
}
}
}
}
'"
}
}
else {
stdout { codec => json_lines}
elasticsearch {
"hosts" => "localhost:9200"
"index" => "product_%{product_api_key}"
"document_type" => "%{[#metadata][index_type]}"
"document_id" => "%{[#metadata][index_id]}"
"doc_as_upsert" => true
"action" => "update"
"retry_on_conflict" => 7
}
}
}
When I do this:
brew services start logstash
I receive the following inside my /usr/local/var/log/logstash-plain.log file:
[2019-01-15T14:51:15,319][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x399927c7 run>"}
[2019-01-15T14:51:15,663][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-15T14:51:16,514][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-15T14:57:31,432][ERROR][logstash.inputs.jdbc ] Unable to connect to database. Tried 1 times {:error_message=>"Java::ComMysqlCjJdbcExceptions::CommunicationsException: Communications link failure\n\nThe last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server."}
[2019-01-15T14:57:31,435][ERROR][logstash.inputs.jdbc ] Unable to connect to database. Tried 1 times {:error_message=>"Java::ComMysqlCjJdbcExceptions::CommunicationsException: Communications link failure\n\nThe last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server."}
What am I possibly doing wrong?
Is there a way to obtain a dump (e.g., something like mysqldump) from an Elasticsearch server (stage or production) and then reimport it into a local Elasticsearch instance without using Logstash?
This is the same configuration file that works inside an Amazon EC2 production instance, so I don't know why it isn't working on my local macOS Mojave machine.
You may be encountering the RDS SSL issue, since:
If you use either the MySQL Java Connector v5.1.38 or later, or the MySQL Java Connector v8.0.9 or later to connect to your databases, even if you haven't explicitly configured your applications to use SSL/TLS when connecting to your databases, these client drivers default to using SSL/TLS. In addition, when using SSL/TLS, they perform partial certificate verification and fail to connect if the database server certificate is expired.
as described in the AWS RDS documentation.
To overcome this, either set up the trust store for Logstash, which is also described in the link above,
or take the risk of disabling SSL in the connection string, like:
jdbc_connection_string => "jdbc:mysql://myapp-production.crankbftdpmc.us-west-2.rds.amazonaws.com:3306/products?sslMode=DISABLED"
For some reason Logstash with the Elastic Stack X-Pack is breaking. I believe it has to do with the recent renaming of the MySQL connector, which hasn't been updated in the various config files. However, I can't tell from this error log which file the error originates in. Additionally, if anyone knows how to rename the actual MySQL connector class from com.mysql.cj.jdbc.Driver to com.mysql.jdbc.Driver, that should fix everything.
Error Log:
C:\Program Files\logstash-6.3.2\bin>logstash -f sql.conf
Sending Logstash's logs to C:/Program Files/logstash-6.3.2/logs which is now configured via log4j2.properties
[2018-08-31T15:01:52,502][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-08-31T15:01:53,031][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2018-08-31T15:01:55,217][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-08-31T15:02:06,342][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-08-31T15:02:06,342][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-08-31T15:02:06,539][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-08-31T15:02:06,575][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-08-31T15:02:06,592][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-08-31T15:02:06,609][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-08-31T15:02:06,625][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-08-31T15:02:06,659][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-08-31T15:02:06,909][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x598b8674 sleep>"}
[2018-08-31T15:02:06,996][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-08-31T15:02:07,359][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-08-31T15:02:07,895][ERROR][logstash.inputs.jdbc ] Unable to connect to database. Tried 1 times {:error_message=>"Java::JavaSql::SQLNonTransientConnectionException: Cannot load connection class because of underlying exception: com.mysql.cj.exceptions.WrongArgumentException: Failed to parse the host:port pair 'localhost:3306;user=test;password=test123;databaseName=test;integratedSecurity=true;'."}
[2018-08-31T15:02:07,916][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Jdbc jdbc_connection_string=>"jdbc:mysql://localhost:3306;user=test;password=test123;databaseName=test;integratedSecurity=true;", jdbc_driver_class=>"com.mysql.cj.jdbc.Driver", jdbc_user=>"doesntmatterwithauthentication", statement=>"SELECT * FROM phones", id=>"de31f73d4505e1de7e76bce4917c48b412909473f3872288edd51acccf0e0be6", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_ad3ad9d5-a3e8-492a-9231-1e729e8c4190", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 01:00:00 +0100}, last_run_metadata_path=>"C:\\Users\\ross.massie/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: Java::JavaSql::SQLNonTransientConnectionException: Cannot load connection class because of underlying exception: com.mysql.cj.exceptions.WrongArgumentException: Failed to parse the host:port pair 'localhost:3306;user=test;password=test123;databaseName=test;integratedSecurity=true;'.
Exception: Sequel::DatabaseConnectionError
Stack: com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(com/mysql/cj/jdbc/exceptions/SQLError.java:110)
com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(com/mysql/cj/jdbc/exceptions/SQLError.java:97)
com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(com/mysql/cj/jdbc/exceptions/SQLError.java:89)
com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(com/mysql/cj/jdbc/exceptions/SQLError.java:63)
com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(com/mysql/cj/jdbc/exceptions/SQLError.java:73)
com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(com/mysql/cj/jdbc/exceptions/SQLExceptionsMapping.java:79)
com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(com/mysql/cj/jdbc/exceptions/SQLExceptionsMapping.java:131)
com.mysql.cj.jdbc.NonRegisteringDriver.connect(com/mysql/cj/jdbc/NonRegisteringDriver.java:227)
java.lang.reflect.Method.invoke(java/lang/reflect/Method)
org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:423)
org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:290)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.adapters.jdbc.connect(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/adapters/jdbc.rb:215)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.adapters.jdbc.RUBY$method$connect$0$__VARARGS__(C_3a_/Program_20_Files/logstash_minus_6_dot_3_dot_2/vendor/bundle/jruby/$2_dot_3_dot_0/gems/sequel_minus_5_dot_10_dot_0/lib/sequel/adapters/C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/adapters/jdbc.rb)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.connection_pool.make_new(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/connection_pool.rb:127)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.connection_pool.RUBY$method$make_new$0$__VARARGS__(C_3a_/Program_20_Files/logstash_minus_6_dot_3_dot_2/vendor/bundle/jruby/$2_dot_3_dot_0/gems/sequel_minus_5_dot_10_dot_0/lib/sequel/C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/connection_pool.rb)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.connection_pool.threaded.assign_connection(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/connection_pool/threaded.rb:206)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.connection_pool.threaded.RUBY$method$assign_connection$0$__VARARGS__(C_3a_/Program_20_Files/logstash_minus_6_dot_3_dot_2/vendor/bundle/jruby/$2_dot_3_dot_0/gems/sequel_minus_5_dot_10_dot_0/lib/sequel/connection_pool/C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/connection_pool/threaded.rb)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.connection_pool.threaded.acquire(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/connection_pool/threaded.rb:138)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.connection_pool.threaded.RUBY$method$acquire$0$__VARARGS__(C_3a_/Program_20_Files/logstash_minus_6_dot_3_dot_2/vendor/bundle/jruby/$2_dot_3_dot_0/gems/sequel_minus_5_dot_10_dot_0/lib/sequel/connection_pool/C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/connection_pool/threaded.rb)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.connection_pool.threaded.hold(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/connection_pool/threaded.rb:90)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.database.connecting.synchronize(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/database/connecting.rb:270)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.database.connecting.test_connection(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/database/connecting.rb:279)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.database.connecting.connect(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/database/connecting.rb:58)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.sequel_minus_5_dot_10_dot_0.lib.sequel.core.connect(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/sequel-5.10.0/lib/sequel/core.rb:116)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.plugin_mixins.jdbc.block in jdbc_connect(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:114)
org.jruby.RubyKernel.loop(org/jruby/RubyKernel.java:1292)
org.jruby.RubyKernel$INVOKER$s$0$0$loop.call(org/jruby/RubyKernel$INVOKER$s$0$0$loop.gen)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.plugin_mixins.jdbc.jdbc_connect(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:111)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.plugin_mixins.jdbc.RUBY$method$jdbc_connect$0$__VARARGS__(C_3a_/Program_20_Files/logstash_minus_6_dot_3_dot_2/vendor/bundle/jruby/$2_dot_3_dot_0/gems/logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9/lib/logstash/plugin_mixins/C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.plugin_mixins.jdbc.open_jdbc_connection(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:164)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.plugin_mixins.jdbc.RUBY$method$open_jdbc_connection$0$__VARARGS__(C_3a_/Program_20_Files/logstash_minus_6_dot_3_dot_2/vendor/bundle/jruby/$2_dot_3_dot_0/gems/logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9/lib/logstash/plugin_mixins/C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.plugin_mixins.jdbc.execute_statement(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:220)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.plugin_mixins.jdbc.RUBY$method$execute_statement$0$__VARARGS__(C_3a_/Program_20_Files/logstash_minus_6_dot_3_dot_2/vendor/bundle/jruby/$2_dot_3_dot_0/gems/logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9/lib/logstash/plugin_mixins/C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.inputs.jdbc.execute_query(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:264)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.inputs.jdbc.RUBY$method$execute_query$0$__VARARGS__(C_3a_/Program_20_Files/logstash_minus_6_dot_3_dot_2/vendor/bundle/jruby/$2_dot_3_dot_0/gems/logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9/lib/logstash/inputs/C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.inputs.jdbc.run(C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:250)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9.lib.logstash.inputs.jdbc.RUBY$method$run$0$__VARARGS__(C_3a_/Program_20_Files/logstash_minus_6_dot_3_dot_2/vendor/bundle/jruby/$2_dot_3_dot_0/gems/logstash_minus_input_minus_jdbc_minus_4_dot_3_dot_9/lib/logstash/inputs/C:/Program Files/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.logstash_minus_core.lib.logstash.pipeline.inputworker(C:/Program Files/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:512)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.logstash_minus_core.lib.logstash.pipeline.RUBY$method$inputworker$0$__VARARGS__(C_3a_/Program_20_Files/logstash_minus_6_dot_3_dot_2/logstash_minus_core/lib/logstash/C:/Program Files/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb)
C_3a_.Program_20_Files.logstash_minus_6_dot_3_dot_2.logstash_minus_core.lib.logstash.pipeline.block in start_input(C:/Program Files/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:505)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:246)
sql.conf:
input {
jdbc {
jdbc_connection_string => "jdbc:mysql://localhost:3306/test?useSSL=false&serverTimezone=GMT&DatabaseName=test"
jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
jdbc_driver_library => "C:\Program Files (x86)\MySQL\Connector J 8.0\mysql-connector-java-8.0.12.jar"
jdbc_user => "test"
jdbc_password => "test123"
statement => "SELECT * FROM phones"
}
}
output {
elasticsearch {
hosts => ["localhost:9200"]
index => "phones"
}
}
Rewriting the whole connection string worked after a few tweaks. At the time of the issue, if memory serves, a few driver references I found within X-Pack were pointing at the wrong driver and needed changing.
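To make the contrast concrete: the string in the failing log uses SQL Server-style semicolon-separated options, which Connector/J cannot parse, while MySQL expects ?key=value&... query parameters, with credentials kept in the plugin's own settings. A side-by-side sketch using the values from the question:
# Fails: semicolon-separated options are SQL Server syntax, not MySQL
jdbc_connection_string => "jdbc:mysql://localhost:3306;user=test;password=test123;databaseName=test;integratedSecurity=true;"

# Works: database in the URL path, options as query parameters, credentials separate
jdbc_connection_string => "jdbc:mysql://localhost:3306/test?useSSL=false&serverTimezone=GMT"
jdbc_user => "test"
jdbc_password => "test123"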
When I first created a Logstash JDBC conf file to import my MySQL data into Elasticsearch, it worked fine. But suddenly the same file that worked before stopped working and now gives the error "Error registering plugin".
Here is my sms-logstash.conf file
input {
jdbc {
jdbc_connection_string => "jdbc:mysql://localhost:3306/sms"
# The user we wish to execute our statement as
jdbc_user => "root"
jdbc_password => ""
# The path to our downloaded jdbc driver
jdbc_driver_library => "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/bin/mysql-connector-java-5.1.45-bin.jar"
jdbc_driver_class => "com.mysql.jdbc.Driver"
# our query
statement => "SELECT * FROM salon_reg"
}
}
output {
stdout { codec => rubydebug }
elasticsearch {
"hosts" => "localhost:9200"
"index" => "sms"
"document_type" => "salon_reg"
}
}
When I run this command as bin/logstash -f sms-logstash.conf,
it gives the following error:
C:\Users\robesh\Downloads\logstash-6.2.3\logstash-6.2.3\bin>logstash -f sms-logstash.conf
Sending Logstash's logs to C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/logs which is now configured via log4j2.properties
[2018-04-15T15:05:46,900][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/modules/fb_apache/configuration"}
[2018-04-15T15:05:47,028][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/modules/netflow/configuration"}
[2018-04-15T15:05:47,665][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-04-15T15:05:49,635][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-04-15T15:05:51,303][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-04-15T15:06:04,935][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch hosts=>[//localhost:9200], index=>"sms", document_type=>"salon_reg", id=>"7eecf64f77b050d7ebba1e645e2de1d988a4f3d4b88814c75044d6e6c4606a2b", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_cbc6f6b2-287c-44bf-8771-3a951d7ceabf", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-04-15T15:06:05,141][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-04-15T15:06:06,405][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-04-15T15:06:06,426][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-04-15T15:06:07,079][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-04-15T15:06:07,280][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-15T15:06:07,289][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-15T15:06:07,336][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-15T15:06:07,424][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-15T15:06:07,533][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-04-15T15:06:08,863][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::Jdbc jdbc_connection_string=>\"jdbc:mysql://localhost:3306/sms\", jdbc_user=>\"root\", jdbc_driver_library=>\"C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/bin/mysql-connector-java-5.1.46-bin.jar\", jdbc_driver_class=>\"com.mysql.jdbc.Driver\", statement=>\"SELECT * FROM salon_reg\", id=>\"0c99246377cb88117db974a51d7bdcb982e8fe882ab825575c8ebdc3c890fb5a\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_d687f831-1ac5-4480-b23d-e7fc976f5e9a\", enable_metric=>true, charset=>\"UTF-8\">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>\"info\", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, last_run_metadata_path=>\"C:\\\\Users\\\\robesh/.logstash_jdbc_last_run\", use_column_value=>false, tracking_column_type=>\"numeric\", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>", :error=>"(<unknown>): 'reader' unacceptable code point '\u0000' (0x0) special characters are not allowed\nin \"'reader'\", position 0 at line 0 column 0", :thread=>"#<Thread:0x1ad70417 run>"}
[2018-04-15T15:06:09,386][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Psych::SyntaxError: (<unknown>): 'reader' unacceptable code point ' ' (0x0) special characters are not allowed
in "'reader'", position 0 at line 0 column 0>, :backtrace=>["org/jruby/ext/psych/PsychParser.java:231:in `parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/psych.rb:377:in `parse_stream'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/psych.rb:325:in `parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/psych.rb:252:in `load'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.5/lib/logstash/plugin_mixins/value_tracking.rb:102:in `read'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.5/lib/logstash/plugin_mixins/value_tracking.rb:78:in `get_initial'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.5/lib/logstash/plugin_mixins/value_tracking.rb:36:in `initialize'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.5/lib/logstash/plugin_mixins/value_tracking.rb:29:in `build_last_value_tracker'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.5/lib/logstash/inputs/jdbc.rb:216:in `register'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:341:in `register_plugin'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:352:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:352:in `register_plugins'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:502:in `start_inputs'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:393:in `start_workers'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:289:in `run'", "C:/Users/robesh/Downloads/logstash-6.2.3/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:249:in `block in start'"], :thread=>"#<Thread:0x1ad70417 run>"}
[2018-04-15T15:06:09,506][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}
C:\Users\robesh\Downloads\logstash-6.2.3\logstash-6.2.3\bin>
The interesting thing here is that this same file was working fine previously and indexed my data nicely into Elasticsearch, but suddenly it's now giving an error.
It may be an encoding problem; ensure that the Logstash config file has UTF-8 encoding.
Make sure your config is correct:
bin/logstash --config.test_and_exit -f <path_to_config_file>
If your config is valid, then look at the $USER_HOME/.logstash_jdbc_last_run file; that file probably exists but isn't valid YAML. Fix what's broken about the file, or just delete it.
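If the tracker file keeps causing trouble, the jdbc input can also be told to ignore or relocate it. A minimal sketch of the relevant settings, which go inside the existing jdbc { } block (the path shown is only an example):
jdbc {
  # ... connection settings as before ...
  clean_run => true          # ignore any stored sql_last_value and start fresh
  record_last_run => false   # don't write the tracker file at all
  # or keep tracking but move the file somewhere you control:
  # last_run_metadata_path => "C:/logstash/.logstash_jdbc_last_run"
}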
I appreciate this is probably not a JHipster-specific issue, but rather one with how some of my dependencies are installed. It happens on 3.0.0 and 3.1.0; 2.27.0, on the other hand, is fine.
This happens when I run yo jhipster.
...
Server app generated successfully.
Client app generated successfully.
(node:38364) fs: re-evaluating native module sources is not supported. If you are using the graceful-fs module, please update it to a more recent version.
[21:00:11] Using gulpfile ~/dev/projects/blog/gulpfile.js
[21:00:11] Starting 'install'...
[21:00:11] Starting 'wiredep:test'...
[21:00:11] Starting 'wiredep:app'...
[21:00:11] Starting 'ngconstant:dev'...
[21:00:11] 'ngconstant:dev' errored after 12 ms
[21:00:11] Error in plugin 'gulp-tslint-log'
TypeError: Path must be a string. Received null
at assertPath (path.js:7:11)
at Object.dirname (path.js:1324:5)
at getFilePath (/Users/me/dev/projects/blog/node_modules/gulp-ng-constant-fork/index.js:95:27)
at DestroyableTransform.objectStream [as _transform] (/Users/alberto/dev/projects/blog/node_modules/gulp-ng-constant-fork/index.js:60:25)
at DestroyableTransform.Transform._read (/Users/alberto/dev/projects/blog/node_modules/gulp-ng-constant-fork/node_modules/readable-stream/lib/_stream_transform.js:184:10)
at DestroyableTransform.Transform._write (/Users/alberto/dev/projects/blog/node_modules/gulp-ng-constant-fork/node_modules/readable-stream/lib/_stream_transform.js:172:12)
at doWrite (/Users/alberto/dev/projects/blog/node_modules/gulp-ng-constant-fork/node_modules/readable-stream/lib/_stream_writable.js:237:10)
at writeOrBuffer (/Users/alberto/dev/projects/blog/node_modules/gulp-ng-constant-fork/node_modules/readable-stream/lib/_stream_writable.js:227:5)
at DestroyableTransform.Writable.write (/Users/alberto/dev/projects/blog/node_modules/gulp-ng-constant-fork/node_modules/readable-stream/lib/_stream_writable.js:194:11)
at DestroyableTransform.Writable.end (/Users/alberto/dev/projects/blog/node_modules/gulp-ng-constant-fork/node_modules/readable-stream/lib/_stream_writable.js:352:10)
at ngConstantPlugin (/Users/alberto/dev/projects/blog/node_modules/gulp-ng-constant-fork/index.js:33:16)
at Gulp.<anonymous> (/Users/alberto/dev/projects/blog/gulpfile.js:164:12)
at module.exports (/Users/alberto/dev/projects/blog/node_modules/orchestrator/lib/runTask.js:34:7)
at Gulp.Orchestrator._runTask (/Users/alberto/dev/projects/blog/node_modules/orchestrator/index.js:273:3)
at Gulp.Orchestrator._runStep (/Users/alberto/dev/projects/blog/node_modules/orchestrator/index.js:214:10)
at Gulp.Orchestrator.start (/Users/alberto/dev/projects/blog/node_modules/orchestrator/index.js:134:8)
Then when I start the application I get this on the browser's console:
Uncaught Error: [$injector:modulerr] Failed to instantiate module blogApp due to:
Error: [$injector:nomod] Module 'blogApp' is not available! You either misspelled the module name or forgot to load it. If registering a module ensure that you specify the dependencies as the second argument.
http://errors.angularjs.org/1.5.2/$injector/nomod?p0=badgeritoApp
at http://127.0.0.1:8080/bower_components/angular/angular.js:68:12
at http://127.0.0.1:8080/bower_components/angular/angular.js:2034:17
at ensure (http://127.0.0.1:8080/bower_components/angular/angular.js:1958:38)
at module (http://127.0.0.1:8080/bower_components/angular/angular.js:2032:14)
at http://127.0.0.1:8080/bower_components/angular/angular.js:4524:22
at forEach (http://127.0.0.1:8080/bower_components/angular/angular.js:321:20)
at loadModules (http://127.0.0.1:8080/bower_components/angular/angular.js:4508:5)
at createInjector (http://127.0.0.1:8080/bower_components/angular/angular.js:4430:19)
at doBootstrap (http://127.0.0.1:8080/bower_components/angular/angular.js:1710:20)
at bootstrap (http://127.0.0.1:8080/bower_components/angular/angular.js:1731:12)
Here is the content of .yo-rc.json
{
"generator-jhipster": {
"jhipsterVersion": "3.0.0",
"baseName": "blog",
"packageName": "com.albertofaci.blog",
"packageFolder": "com/albertofaci/blog",
"serverPort": "8080",
"authenticationType": "session",
"hibernateCache": "no",
"clusteredHttpSession": "no",
"websocket": "no",
"databaseType": "mongodb",
"devDatabaseType": "mongodb",
"prodDatabaseType": "mongodb",
"searchEngine": "no",
"buildTool": "gradle",
"enableSocialSignIn": false,
"rememberMeKey": "94cf21b11aff8d8d4a9b9b3724834876b995e5b1",
"useSass": false,
"applicationType": "monolith",
"testFrameworks": [
"gatling"
],
"enableTranslation": true,
"nativeLanguage": "en",
"languages": [
"en",
"es"
]
}}
node version: v6.0.0
npm version: 3.8.7
java 8
Any hints or ideas?
I had exactly the same problem as the OP. I changed to the LTS version of Node, as Gaël Marziou suggested, and this fixed the problem.
I'm trying to configure the shovel plugin via the config file (running in docker) but I get this error:
BOOT FAILED
===========
Error description:
{error,{failed_to_cluster_with,[rabbit@dalmacpmfd57],
"Mnesia could not connect to any nodes."}}
The config is set up this way because the destination for the shovel will be created on demand when a dev environment is spun up; the source is a permanent RabbitMQ instance that the new dev environment will attach to.
Here are the config file contents:
[
{rabbitmq_shovel,
[{shovels,
[{indexer_replica_static,
[{sources,
[{broker, [ "amqp://guest:guest#rabbitmq/newdev" ]},
{declarations,
[{'queue.declare', [{queue, <<"Indexer_Replica_Static">>}, durable]},
{'queue.bind',[ {exchange, <<"Indexer">>}, {queue, <<"Indexer_Replica_Static">>}]}
]
}
]
},
{destinations,
[{broker, "amqp://"},
{declarations, [ {'exchange.declare', [ {exchange, <<"Indexer_Replica_Static">>}
, {type, <<"fanout">>}, durable]},
{'queue.declare', [
{queue, <<"Indexer_Replica_Static">>},
durable]},
{'queue.bind',
[ {exchange, <<"Indexer_Replica_Static">>}
, {queue, <<"Indexer_Replica_Static">>}
]}
]
}
]
},
{queue, <<"Indexer_Replica_Static">>},
{prefetch_count, 0},
{ack_mode, on_confirm},
{publish_properties, [ {delivery_mode, 2} ]},
{reconnect_delay, 2.5}
]
}
]
},
{reconnect_delay, 2.5}
]
}
].
[UPDATE]
This is being run in Docker, but since I couldn't debug the issue there, I tried booting up RabbitMQ locally with the same config file. I noticed that the config environment variable I set (RABBITMQ_CONFIG_FILE) isn't reflected in the log, and the shovel settings haven't been applied (no surprise, huh). I verified the variable with an echo statement, and the correct path is displayed: /dev/rabbitmq_server-3.3.4/rabbitmq
=INFO REPORT==== 3-Sep-2014::15:30:37 ===
node : rabbit@dalmacpmfd57
home dir : /Users/e002678
config file(s) : (none)
cookie hash : n6vhh8tY7Z+uR2DV6gcHUg==
log : /usr/local/rabbitmq_server-3.3.4/sbin/../var/log/rabbitmq/rabbit@dalmacpmfd57.log
sasl log : /usr/local/rabbitmq_server-3.3.4/sbin/../var/log/rabbitmq/rabbit@dalmacpmfd57-sasl.log
database dir : /usr/local/rabbitmq_server-3.3.4/sbin/../var/lib/rabbitmq/mnesia/rabbit@dalmacpmfd57
Thanks!