Datomic datalog: ArrayIndexOutOfBoundsException when using aggregates

Given that db is a database value, the following query throws an exception:
(d/q '[:find ?concert (count-distinct ?demand)
       :in $ ?campaignId
       :with ?concert
       :where
       [?c :campaign/id ?campaignId]
       [?concert :crowd-concert/campaign ?c]
       [?demand :demand/concert ?concert]]
     db "546b7e0f2348f10200abf5ea")
(The information model is that several concerts belong to a campaign and demands belong to concerts; we want to fetch each concert and the number of demands for it.)
The stack trace is here:
java.lang.Exception: processing rule: (q__25764 ?concert ?demand ?concert), message: processing clause: [?demand :demand/concert ?concert], message: java.lang.ArrayIndexOutOfBoundsException: 2
at datomic.datalog$eval_rule$fn__6161.invoke(datalog.clj:1441)
at datomic.datalog$eval_rule.invoke(datalog.clj:1421)
at datomic.datalog$eval_query.invoke(datalog.clj:1464)
at datomic.datalog$qsqr.invoke(datalog.clj:1553)
at datomic.datalog$qsqr.invoke(datalog.clj:1510)
at datomic.query$q.invoke(query.clj:674)
at datomic.api$q.doInvoke(api.clj:35)
at clojure.lang.RestFn.invoke(RestFn.java:439)
at bs.routes.campaigns$eval25762.invoke(NO_SOURCE_FILE:1)
at clojure.lang.Compiler.eval(Compiler.java:6782)
at clojure.lang.Compiler.eval(Compiler.java:6745)
at clojure.core$eval.invoke(core.clj:3081)
at clojure.main$repl$read_eval_print__7099$fn__7102.invoke(main.clj:240)
at clojure.main$repl$read_eval_print__7099.invoke(main.clj:240)
at clojure.main$repl$fn__7108.invoke(main.clj:258)
at clojure.main$repl.doInvoke(main.clj:258)
at clojure.lang.RestFn.invoke(RestFn.java:1523)
at clojure.tools.nrepl.middleware.interruptible_eval$evaluate$fn__1025.invoke(interruptible_eval.clj:53)
at clojure.lang.AFn.applyToHelper(AFn.java:152)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.core$apply.invoke(core.clj:630)
at clojure.core$with_bindings_STAR_.doInvoke(core.clj:1868)
at clojure.lang.RestFn.invoke(RestFn.java:425)
at clojure.tools.nrepl.middleware.interruptible_eval$evaluate.invoke(interruptible_eval.clj:51)
at clojure.tools.nrepl.middleware.interruptible_eval$interruptible_eval$fn__1067$fn__1070.invoke(interruptible_eval.clj:183)
at clojure.tools.nrepl.middleware.interruptible_eval$run_next$fn__1060.invoke(interruptible_eval.clj:152)
at clojure.lang.AFn.run(AFn.java:22)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.Exception: processing clause: [?demand :demand/concert ?concert], message: java.lang.ArrayIndexOutOfBoundsException: 2
at datomic.datalog$eval_clause$fn__6135.invoke(datalog.clj:1387)
at datomic.datalog$eval_clause.invoke(datalog.clj:1350)
at datomic.datalog$eval_rule$fn__6161.invoke(datalog.clj:1436)
... 29 more
Caused by: java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException: 2
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at clojure.core$deref_future.invoke(core.clj:2186)
at clojure.core$deref.invoke(core.clj:2207)
at clojure.core$mapv$fn__6727.invoke(core.clj:6616)
at clojure.lang.PersistentVector.reduce(PersistentVector.java:333)
at clojure.core$reduce.invoke(core.clj:6518)
at clojure.core$mapv.invoke(core.clj:6616)
at datomic.datalog$fn__5678.invoke(datalog.clj:588)
at datomic.datalog$fn__5536$G__5508__5551.invoke(datalog.clj:51)
at datomic.datalog$join_project_coll.invoke(datalog.clj:116)
at datomic.datalog$fn__5607.invoke(datalog.clj:219)
at datomic.datalog$fn__5515$G__5510__5530.invoke(datalog.clj:51)
at datomic.datalog$eval_clause$fn__6135.invoke(datalog.clj:1356)
... 31 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 2
at clojure.lang.RT.aset(RT.java:2326)
at datomic.datalog$fn__5678$project__5749.invoke(datalog.clj:480)
at datomic.datalog$fn__5678$join__5767.invoke(datalog.clj:578)
at datomic.datalog$fn__5678$fn__5772$fn__5773.invoke(datalog.clj:588)
at clojure.lang.AFn.applyToHelper(AFn.java:152)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.core$apply.invoke(core.clj:630)
at clojure.core$with_bindings_STAR_.doInvoke(core.clj:1868)
at clojure.lang.RestFn.invoke(RestFn.java:425)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.core$apply.invoke(core.clj:634)
at clojure.core$bound_fn_STAR_$fn__4439.doInvoke(core.clj:1890)
at clojure.lang.RestFn.invoke(RestFn.java:397)
at clojure.lang.AFn.call(AFn.java:18)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
Am I doing something wrong?

Removing the :with clause solved the problem.
I don't know Datalog well enough to say exactly why, but I assume it's because having ?concert in both the :find and :with clauses is redundant.
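For reference, here is the same query with the redundant :with clause dropped; ?concert already appears in :find, so the grouping is unchanged (this is just the original query minus one line, not an officially documented workaround):

;; Same query as above, without the redundant :with clause.
(d/q '[:find ?concert (count-distinct ?demand)
       :in $ ?campaignId
       :where
       [?c :campaign/id ?campaignId]
       [?concert :crowd-concert/campaign ?c]
       [?demand :demand/concert ?concert]]
     db "546b7e0f2348f10200abf5ea")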

Related

Store Pig result into Hive table using Hcat - ClassNotFoundException: org.datanucleus.NucleusContext

I want to load simple JSON data into a Hive table without using a JSON SerDe, so I tried to load the JSON file in Pig and store the result into a Hive table:
A = LOAD '/data/pig_files/test.json' Using JsonLoader('merchant_id: chararray, platform_id: chararray, last_cdd_review_date: chararray, risk_factors:{(area:chararray,risk:chararray)}, expected_annual_transaction_number: chararray');
STORE A INTO 'Test_tbl' USING org.apache.hive.hcatalog.pig.HCatStorer();
It worked fine, but after server patching I get the error below.
Pig Stack Trace
---------------
ERROR 1115: org.apache.hive.hcatalog.common.HCatException : 2001 :
Error setting output information. Cause :
com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException:
Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1002: Unable to store alias A
at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1779)
at org.apache.pig.PigServer.registerQuery(PigServer.java:708)
at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1110)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:512)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:564)
at org.apache.pig.Main.main(Main.java:175)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:313)
at org.apache.hadoop.util.RunJar.main(RunJar.java:227)
Caused by: org.apache.pig.impl.plan.VisitorException: ERROR 1115: <line 3, column 0> Output Location Validation Failed for: 'test_tbl More info to follow:
org.apache.hive.hcatalog.common.HCatException : 2001 : Error setting output information.
Cause : com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException:
Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at org.apache.pig.newplan.logical.visitor.InputOutputFileValidatorVisitor.visit(InputOutputFileValidatorVisitor.java:64)
at org.apache.pig.newplan.logical.relational.LOStore.accept(LOStore.java:99)
at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:64)
at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
at org.apache.pig.newplan.DepthFirstWalker.walk(DepthFirstWalker.java:53)
at org.apache.pig.newplan.PlanVisitor.visit(PlanVisitor.java:52)
at org.apache.pig.newplan.logical.relational.LogicalPlan.validate(LogicalPlan.java:234)
at org.apache.pig.PigServer$Graph.compile(PigServer.java:1852)
at org.apache.pig.PigServer$Graph.access$300(PigServer.java:1528)
at org.apache.pig.PigServer.execute(PigServer.java:1441)
at org.apache.pig.PigServer.access$500(PigServer.java:119)
at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1774)
... 14 more
Caused by: org.apache.pig.PigException: ERROR 1115: org.apache.hive.hcatalog.common.HCatException : 2001 :
Error setting output information. Cause : com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException:
Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at org.apache.hive.hcatalog.pig.HCatStorer.setStoreLocation(HCatStorer.java:196)
at org.apache.pig.newplan.logical.visitor.InputOutputFileValidatorVisitor.visit(InputOutputFileValidatorVisitor.java:57)
... 26 more
Caused by: org.apache.hive.hcatalog.common.HCatException : 2001 : Error setting output information. Cause :
com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException:
Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:220)
at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:70)
at org.apache.hive.hcatalog.pig.HCatStorer.setStoreLocation(HCatStorer.java:191)
... 27 more
Caused by: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2263)
at com.google.common.cache.LocalCache.get(LocalCache.java:4000)
at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4789)
at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:229)
at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:204)
at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:563)
at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:90)
... 29 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1775)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:115)
at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:234)
at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:229)
at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4792)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257)
... 35 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1773)
... 45 more
Caused by: java.lang.NoClassDefFoundError: org/datanucleus/NucleusContext
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.getClass(MetaStoreUtils.java:1741)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:64)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:688)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:654)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:648)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:717)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:420)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:7036)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:254)
at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:334)
... 50 more
Caused by: java.lang.ClassNotFoundException: org.datanucleus.NucleusContext
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 64 more
================================================================================
Please help me resolve this, or suggest other ways to load JSON data into a Hive table without a SerDe.
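The root cause at the bottom of the trace is ClassNotFoundException: org.datanucleus.NucleusContext, i.e. the DataNucleus jars that the Hive metastore client needs are no longer on Pig's classpath after the patching. A sketch of one way to restore them, assuming a conventional Hive layout under /usr/lib/hive (the paths and the script name load_json.pig are placeholders; adjust them to your install):

# Assumed paths -- point HIVE_HOME at your actual Hive installation.
export HIVE_HOME=/usr/lib/hive
for jar in "$HIVE_HOME"/lib/datanucleus-*.jar; do
  PIG_CLASSPATH="$PIG_CLASSPATH:$jar"
done
export PIG_CLASSPATH
pig -useHCatalog load_json.pig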

Hive throws ORC does not support type conversion from file type error when create table as select with to_json UDF

With the following simple SQL:
ADD JAR ivy://com.klout:brickhouse:0.6.+?transitive=false;
CREATE TEMPORARY FUNCTION to_json AS 'brickhouse.udf.json.ToJsonUDF';
create table test (b boolean);
insert into table test values (true);
create table test1 as select to_json(b) as b from test;
I got the following exception.
Diagnostic Messages for this Task:
Error: java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:265)
at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:212)
at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:332)
at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:724)
at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:438)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:177)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:171)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:251)
... 11 more
Caused by: org.apache.hadoop.hive.ql.io.orc.SchemaEvolution$IllegalEvolutionException: ORC does not support type conversion from file type boolean (1) to reader type string (1)
at org.apache.hadoop.hive.ql.io.orc.SchemaEvolution.buildConversionFileTypesArray(SchemaEvolution.java:254)
at org.apache.hadoop.hive.ql.io.orc.SchemaEvolution.buildConversionFileTypesArray(SchemaEvolution.java:214)
at org.apache.hadoop.hive.ql.io.orc.SchemaEvolution.<init>(SchemaEvolution.java:94)
at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.<init>(RecordReaderImpl.java:225)
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.rowsOptions(ReaderImpl.java:550)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.createReaderFromFile(OrcInputFormat.java:240)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.<init>(OrcInputFormat.java:164)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getRecordReader(OrcInputFormat.java:1067)
at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:67)
... 16 more
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
After days of struggle, I figured out the issue: to_json(b) as b.
I reused the same column name after the value was converted to a string with to_json. Hive seems to use the original table's schema to derive the column type for the new table, and when a value of a different type is stored into that column, it throws an exception. What a weird behavior.
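A workaround consistent with that explanation is simply to give the converted column a different name (the alias b_json below is just an illustrative choice):

-- Workaround sketch: avoid reusing the original column name for the
-- string produced by to_json; any new alias works.
create table test1 as select to_json(b) as b_json from test;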

MySQL Sink Connector - JsonConverter - DataException: Unknown schema type: null

I have created a MySQL sink connector and run it successfully; I can see the logs and got a 200 response. But the sink connector is not able to push the data into the MySQL DB (the data is available in the topic).
{
  "name": "mysql-sink-connector",
  "config": {
    "tasks.max": "2",
    "batch.size": "1000",
    "batch.max.rows": "1000",
    "poll.interval.ms": "500",
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:mysql://mysql.azure.com:3306/db_test_dev",
    "table.name.format": "tbl_clients_merchants",
    "topics": "createorder",
    "connection.user": "user",
    "connection.password": "password",
    "auto.create": "true",
    "auto.evolve": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": "true"
  }
}
I am getting this error:
[2021-07-29 09:41:03,157] ERROR WorkerSinkTask{id=jdbc-mysql-sink-connector-1} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.DataException: Converting byte[] to Kafka Connect data failed due to serialization error:
at org.apache.kafka.connect.json.JsonConverter.toConnectData(JsonConverter.java:344)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.DataException: Unknown schema type: null
at org.apache.kafka.connect.json.JsonConverter.convertToConnect(JsonConverter.java:743)
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
... 13 more
[2021-07-29 13:26:11,347] ERROR WorkerSinkTask{id=mysql-sink-connector-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:178)
As the error suggests, you have null tombstone records in the topic, which cannot be mapped to any MySQL columns since there are no field names.
Option 1) Fix your producer to not send these messages
Option 2) Filter them in Connect
For example,
"transforms": "Filter",
"transforms.Filter.type": "org.apache.kafka.connect.transforms.Filter",
"transforms.Filter.predicate" : "DropNull",
"predicates" : "DropNull",
"predicates.DropNull.type": "org.apache.kafka.connect.transforms.predicates.RecordIsTombstone"

Impossible to start PhpStorm 2016.1 after update

I've just updated PhpStorm to v2016.1 and I can't start it; it crashes right at startup...
Here is the error log.
Internal error. Please report to http://jb.gg/ide/critical-startup-errors
com.intellij.diagnostic.PluginException: cannot create class 'com.jetbrains.php.actions.PhpStormNewProjectStep' [Plugin: com.intellij]
at com.intellij.openapi.actionSystem.impl.ActionManagerImpl.a(ActionManagerImpl.java:194)
at com.intellij.openapi.actionSystem.impl.ActionManagerImpl.convertStub(ActionManagerImpl.java:166)
at com.intellij.openapi.actionSystem.impl.ActionManagerImpl.a(ActionManagerImpl.java:511)
at com.intellij.openapi.actionSystem.impl.ActionManagerImpl.a(ActionManagerImpl.java:491)
at com.intellij.openapi.actionSystem.impl.ActionManagerImpl.getAction(ActionManagerImpl.java:484)
at com.intellij.openapi.actionSystem.impl.ActionManagerImpl.preloadActions(ActionManagerImpl.java:1263)
at com.intellij.openapi.actionSystem.impl.ActionPreloader.preload(ActionPreloader.java:31)
at com.intellij.openapi.application.Preloader$2$1.run(Preloader.java:78)
at com.intellij.openapi.progress.impl.CoreProgressManager$2.run(CoreProgressManager.java:142)
at com.intellij.openapi.progress.impl.CoreProgressManager.a(CoreProgressManager.java:446)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:392)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:54)
at com.intellij.openapi.progress.impl.CoreProgressManager.runProcess(CoreProgressManager.java:127)
at com.intellij.openapi.application.Preloader$2.run(Preloader.java:74)
at com.intellij.util.concurrency.BoundedTaskExecutor$2.run(BoundedTaskExecutor.java:187)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at com.intellij.util.ReflectionUtil.newInstance(ReflectionUtil.java:520)
at com.intellij.openapi.actionSystem.impl.ActionManagerImpl.convertStub(ActionManagerImpl.java:154)
... 16 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at com.intellij.util.ReflectionUtil.newInstance(ReflectionUtil.java:484)
... 17 more
Caused by: java.lang.IllegalStateException: #NotNull method io/j99/idea/vue/module/VueProjectGenerator.getName must not return null
at io.j99.idea.vue.module.VueProjectGenerator.getName(VueProjectGenerator.java:26)
at com.intellij.ide.util.projectWizard.ProjectSettingsStepBase.<init>(ProjectSettingsStepBase.java:62)
at com.jetbrains.php.actions.PhpStormProjectSpecificSettingsStep.<init>(PhpStormProjectSpecificSettingsStep.java:16)
at com.jetbrains.php.actions.PhpStormNewProjectStep$Customization.createProjectSpecificSettingsStep(PhpStormNewProjectStep.java:42)
at com.intellij.ide.util.projectWizard.AbstractNewProjectStep$Customization.getActions(AbstractNewProjectStep.java:112)
at com.intellij.ide.util.projectWizard.AbstractNewProjectStep$Customization.getActions(AbstractNewProjectStep.java:99)
at com.intellij.ide.util.projectWizard.AbstractNewProjectStep.<init>(AbstractNewProjectStep.java:65)
at com.jetbrains.php.actions.PhpStormNewProjectStep.<init>(PhpStormNewProjectStep.java:15)
... 22 more
Any idea?
One of your plugins is incompatible with the new 2016.1 version; please either disable it or remove the plugin file completely.
The faulty plugin is vue-for-idea.
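Since the IDE won't start at all, disabling it from the UI isn't possible, so the plugin has to be deleted from disk. A sketch for Linux, assuming the default pre-2020 JetBrains config location (the directory differs per OS; on macOS look under ~/Library/Application Support/PhpStorm2016.1 instead):

# Assumed default plugin location for PhpStorm 2016.1 on Linux.
rm -rf ~/.PhpStorm2016.1/config/plugins/vue-for-idea*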

Problematic results after Elasticsearch upgrade

I've updated Elasticsearch from 0.19.4 to 0.20.6 and I'm not getting the desired results. I'm doing a dynamic search after key presses, and code that was working fine before behaves differently now.
The Search:
'' (empty field) = fine
a = 9400 hits
ab = 126 hits ERROR
abc = 2 hits ERROR
abcd = 0 hits, fine
Dependencies I've changed:
runtime 'org.elasticsearch:elasticsearch-lang-groovy:1.1.0' --> runtime 'org.elasticsearch:elasticsearch-lang-groovy:1.3.0'
runtime 'org.elasticsearch:elasticsearch:0.19.4' --> runtime 'org.elasticsearch:elasticsearch:0.20.6'
runtime 'org.xerial.snappy:snappy-java:1.0.4.1' (new)
Stack:
2013-04-09 10:47:58,130 [http-bio-8080-exec-2] DEBUG xxxx.SearchController - result stuff is: [hits:org.elasticsearch.search.internal.InternalSearchHits#9b0d61b]
2013-04-09 10:47:58,137 [http-bio-8080-exec-2] ERROR xxxx.SearchController - Problem...
org.codehaus.groovy.grails.web.converters.exceptions.ConverterException: Error converting Bean with class org.elasticsearch.search.internal.InternalSearchHit
at grails.converters.JSON.value(JSON.java:199)
at grails.converters.JSON.convertAnother(JSON.java:162)
at grails.converters.JSON.value(JSON.java:199)
at grails.converters.JSON.convertAnother(JSON.java:162)
at grails.converters.JSON.value(JSON.java:199)
at grails.converters.JSON.convertAnother(JSON.java:162)
at grails.converters.JSON.value(JSON.java:199)
at grails.converters.JSON.render(JSON.java:134)
at grails.converters.JSON.render(JSON.java:150)
at xxx.xxxx.xxxx.SearchController.autocomplete(SearchController.groovy:514)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.reflect.InvocationTargetException
... 13 more
Caused by: java.lang.NullPointerException
at org.elasticsearch.common.compress.lzf.LZFCompressor.isCompressed(LZFCompressor.java:76)
at org.elasticsearch.common.compress.CompressorFactory.compressor(CompressorFactory.java:124)
at org.elasticsearch.common.compress.CompressorFactory.uncompressIfNeeded(CompressorFactory.java:174)
at org.elasticsearch.search.internal.InternalSearchHit.sourceRef(InternalSearchHit.java:172)
at org.elasticsearch.search.internal.InternalSearchHit.getSourceRef(InternalSearchHit.java:181)
... 13 more
Where the code fails:
try {
    log.debug("result stuff is: ${results}");
    render results as JSON
}
catch (Exception e) {
    log.error("Problem...", e);
}
This was due to a slight change in the ES internal structure; I had to rewrite my conversion method :)
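For anyone hitting the same thing: the converter fails because it walks the InternalSearchHit bean, whose internals changed between 0.19 and 0.20. A sketch of the kind of rewrite that avoids this, serializing each hit's _source as a plain map instead of the hit object itself (the shape of results follows the debug line above; sourceAsMap() is the standard SearchHit accessor):

// Sketch: render the hit sources, not the InternalSearchHit beans.
def payload = results.hits.hits.collect { hit ->
    hit.sourceAsMap()   // plain Map of the document _source
}
render payload as JSON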