Mallet LDA: ArrayIndexOutOfBoundsException while training the model

I am trying to build a model with 500 or 1000 topics on a 1M-document dataset with Mallet LDA. After 60 iterations I get an ArrayIndexOutOfBoundsException. The error output is:
<60> LL/token: -7.64386
overflow on type 8
java.lang.ArrayIndexOutOfBoundsException: 500
at cc.mallet.topics.WorkerRunnable.buildLocalTypeTopicCounts(WorkerRunnable.java:208)
at cc.mallet.topics.WorkerRunnable.run(WorkerRunnable.java:280)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
overflow on type 8
The command I am running is:
bin/mallet train-topics \
    --input data.mallet \
    --output-model lda.model \
    --inferencer-filename topic-inferencer-model.mallet \
    --output-topic-keys topic-keys.txt \
    --topic-word-weights-file topic-word-weights.txt \
    --word-topic-counts-file word-topic-counts-file.txt \
    --output-doc-topics doc-topics.txt \
    --num-topics 500 \
    --num-threads 16 \
    --num-iterations 1500 \
    --use-symmetric-alpha FALSE
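For reference, bin/mallet train-topics wraps cc.mallet.topics.ParallelTopicModel, so the run above is roughly equivalent to the Java sketch below (the alphaSum/beta values are assumed defaults, and setSymmetricAlpha is assumed to mirror --use-symmetric-alpha); driving it from the API makes it easier to attach a debugger at WorkerRunnable.buildLocalTypeTopicCounts:
import java.io.File;
import java.io.IOException;
import cc.mallet.topics.ParallelTopicModel;
import cc.mallet.types.InstanceList;

public class TrainLda {
    public static void main(String[] args) throws IOException {
        InstanceList instances = InstanceList.load(new File("data.mallet"));

        // 500 topics; alphaSum = 50.0 and beta = 0.01 are assumed defaults
        ParallelTopicModel model = new ParallelTopicModel(500, 50.0, 0.01);
        model.addInstances(instances);
        model.setNumThreads(16);
        model.setNumIterations(1500);
        model.setSymmetricAlpha(false); // mirrors --use-symmetric-alpha FALSE
        model.estimate();               // the exception surfaces in here
    }
}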
Any suggestion is much appreciated.

Related

Store Pig result into Hive table using Hcat - ClassNotFoundException: org.datanucleus.NucleusContext

I want to load simple JSON data into a Hive table without using a JSON SerDe, so I tried loading the JSON file in Pig and storing the result into a Hive table:
A = LOAD '/data/pig_files/test.json' Using JsonLoader('merchant_id: chararray, platform_id: chararray, last_cdd_review_date: chararray, risk_factors:{(area:chararray,risk:chararray)}, expected_annual_transaction_number: chararray');
STORE A INTO 'Test_tbl' USING org.apache.hive.hcatalog.pig.HCatStorer();
It worked fine, but after server patching I get the error below:
Pig Stack Trace
---------------
ERROR 1115: org.apache.hive.hcatalog.common.HCatException : 2001 :
Error setting output information. Cause :
com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException:
Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1002: Unable to store alias A
at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1779)
at org.apache.pig.PigServer.registerQuery(PigServer.java:708)
at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1110)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:512)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:564)
at org.apache.pig.Main.main(Main.java:175)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:313)
at org.apache.hadoop.util.RunJar.main(RunJar.java:227)
Caused by: org.apache.pig.impl.plan.VisitorException: ERROR 1115: <line 3, column 0> Output Location Validation Failed for: 'test_tbl More info to follow:
org.apache.hive.hcatalog.common.HCatException : 2001 : Error setting output information.
Cause : com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException:
Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at org.apache.pig.newplan.logical.visitor.InputOutputFileValidatorVisitor.visit(InputOutputFileValidatorVisitor.java:64)
at org.apache.pig.newplan.logical.relational.LOStore.accept(LOStore.java:99)
at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:64)
at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
at org.apache.pig.newplan.DepthFirstWalker.walk(DepthFirstWalker.java:53)
at org.apache.pig.newplan.PlanVisitor.visit(PlanVisitor.java:52)
at org.apache.pig.newplan.logical.relational.LogicalPlan.validate(LogicalPlan.java:234)
at org.apache.pig.PigServer$Graph.compile(PigServer.java:1852)
at org.apache.pig.PigServer$Graph.access$300(PigServer.java:1528)
at org.apache.pig.PigServer.execute(PigServer.java:1441)
at org.apache.pig.PigServer.access$500(PigServer.java:119)
at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1774)
... 14 more
Caused by: org.apache.pig.PigException: ERROR 1115: org.apache.hive.hcatalog.common.HCatException : 2001 :
Error setting output information. Cause : com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException:
Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at org.apache.hive.hcatalog.pig.HCatStorer.setStoreLocation(HCatStorer.java:196)
at org.apache.pig.newplan.logical.visitor.InputOutputFileValidatorVisitor.visit(InputOutputFileValidatorVisitor.java:57)
... 26 more
Caused by: org.apache.hive.hcatalog.common.HCatException : 2001 : Error setting output information. Cause :
com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException:
Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:220)
at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:70)
at org.apache.hive.hcatalog.pig.HCatStorer.setStoreLocation(HCatStorer.java:191)
... 27 more
Caused by: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2263)
at com.google.common.cache.LocalCache.get(LocalCache.java:4000)
at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4789)
at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:229)
at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:204)
at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:563)
at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:90)
... 29 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1775)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:115)
at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:234)
at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:229)
at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4792)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257)
... 35 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1773)
... 45 more
Caused by: java.lang.NoClassDefFoundError: org/datanucleus/NucleusContext
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.getClass(MetaStoreUtils.java:1741)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:64)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:688)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:654)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:648)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:717)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:420)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:7036)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:254)
at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:334)
... 50 more
Caused by: java.lang.ClassNotFoundException: org.datanucleus.NucleusContext
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 64 more
================================================================================
Please help me resolve this, or suggest another way to load JSON data into a Hive table without a SerDe.
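Since the trace bottoms out in ClassNotFoundException: org.datanucleus.NucleusContext, the DataNucleus jars have most likely dropped off the client classpath during the patching. A minimal diagnostic sketch (hypothetical class, not part of Pig or Hive) to confirm that before hunting for the jars:
// Run with the same classpath the Pig job uses; a ClassNotFoundException
// here confirms the DataNucleus jars are missing from it.
public class DataNucleusCheck {
    public static void main(String[] args) throws Exception {
        Class<?> c = Class.forName("org.datanucleus.NucleusContext");
        System.out.println("Loaded from: "
                + c.getProtectionDomain().getCodeSource().getLocation());
    }
}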

Datomic datalog: ArrayIndexOutOfBounds Exception when using aggregates

Given that db is a database value, the following query throws an exception:
(d/q
'[:find ?concert (count-distinct ?demand)
:in $ ?campaignId
:with ?concert
:where
[?c :campaign/id ?campaignId]
[?concert :crowd-concert/campaign ?c]
[?demand :demand/concert ?concert]
]
db "546b7e0f2348f10200abf5ea")
(The information model: several concerts belong to a campaign, and each demand references a concert; we want to fetch each concert and the number of distinct demands for it.)
The stack trace is:
java.lang.Exception: processing rule: (q__25764 ?concert ?demand ?concert), message: processing clause: [?demand :demand/concert ?concert], message: java.lang.ArrayIndexOutOfBoundsException: 2
at datomic.datalog$eval_rule$fn__6161.invoke(datalog.clj:1441)
at datomic.datalog$eval_rule.invoke(datalog.clj:1421)
at datomic.datalog$eval_query.invoke(datalog.clj:1464)
at datomic.datalog$qsqr.invoke(datalog.clj:1553)
at datomic.datalog$qsqr.invoke(datalog.clj:1510)
at datomic.query$q.invoke(query.clj:674)
at datomic.api$q.doInvoke(api.clj:35)
at clojure.lang.RestFn.invoke(RestFn.java:439)
at bs.routes.campaigns$eval25762.invoke(NO_SOURCE_FILE:1)
at clojure.lang.Compiler.eval(Compiler.java:6782)
at clojure.lang.Compiler.eval(Compiler.java:6745)
at clojure.core$eval.invoke(core.clj:3081)
at clojure.main$repl$read_eval_print__7099$fn__7102.invoke(main.clj:240)
at clojure.main$repl$read_eval_print__7099.invoke(main.clj:240)
at clojure.main$repl$fn__7108.invoke(main.clj:258)
at clojure.main$repl.doInvoke(main.clj:258)
at clojure.lang.RestFn.invoke(RestFn.java:1523)
at clojure.tools.nrepl.middleware.interruptible_eval$evaluate$fn__1025.invoke(interruptible_eval.clj:53)
at clojure.lang.AFn.applyToHelper(AFn.java:152)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.core$apply.invoke(core.clj:630)
at clojure.core$with_bindings_STAR_.doInvoke(core.clj:1868)
at clojure.lang.RestFn.invoke(RestFn.java:425)
at clojure.tools.nrepl.middleware.interruptible_eval$evaluate.invoke(interruptible_eval.clj:51)
at clojure.tools.nrepl.middleware.interruptible_eval$interruptible_eval$fn__1067$fn__1070.invoke(interruptible_eval.clj:183)
at clojure.tools.nrepl.middleware.interruptible_eval$run_next$fn__1060.invoke(interruptible_eval.clj:152)
at clojure.lang.AFn.run(AFn.java:22)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.Exception: processing clause: [?demand :demand/concert ?concert], message: java.lang.ArrayIndexOutOfBoundsException: 2
at datomic.datalog$eval_clause$fn__6135.invoke(datalog.clj:1387)
at datomic.datalog$eval_clause.invoke(datalog.clj:1350)
at datomic.datalog$eval_rule$fn__6161.invoke(datalog.clj:1436)
... 29 more
Caused by: java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException: 2
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at clojure.core$deref_future.invoke(core.clj:2186)
at clojure.core$deref.invoke(core.clj:2207)
at clojure.core$mapv$fn__6727.invoke(core.clj:6616)
at clojure.lang.PersistentVector.reduce(PersistentVector.java:333)
at clojure.core$reduce.invoke(core.clj:6518)
at clojure.core$mapv.invoke(core.clj:6616)
at datomic.datalog$fn__5678.invoke(datalog.clj:588)
at datomic.datalog$fn__5536$G__5508__5551.invoke(datalog.clj:51)
at datomic.datalog$join_project_coll.invoke(datalog.clj:116)
at datomic.datalog$fn__5607.invoke(datalog.clj:219)
at datomic.datalog$fn__5515$G__5510__5530.invoke(datalog.clj:51)
at datomic.datalog$eval_clause$fn__6135.invoke(datalog.clj:1356)
... 31 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 2
at clojure.lang.RT.aset(RT.java:2326)
at datomic.datalog$fn__5678$project__5749.invoke(datalog.clj:480)
at datomic.datalog$fn__5678$join__5767.invoke(datalog.clj:578)
at datomic.datalog$fn__5678$fn__5772$fn__5773.invoke(datalog.clj:588)
at clojure.lang.AFn.applyToHelper(AFn.java:152)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.core$apply.invoke(core.clj:630)
at clojure.core$with_bindings_STAR_.doInvoke(core.clj:1868)
at clojure.lang.RestFn.invoke(RestFn.java:425)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.core$apply.invoke(core.clj:634)
at clojure.core$bound_fn_STAR_$fn__4439.doInvoke(core.clj:1890)
at clojure.lang.RestFn.invoke(RestFn.java:397)
at clojure.lang.AFn.call(AFn.java:18)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
Am I doing something wrong?
Removing the :with clause solved the problem. I don't know Datalog well enough to say exactly why, but I assume it's because having ?concert in both the :find and :with clauses is redundant.
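For completeness, here is the working query (the redundant :with clause removed) as a minimal sketch through Datomic's Java Peer API; the class and method names are illustrative only:
import java.util.Collection;
import java.util.List;
import datomic.Peer;

public class ConcertDemand {
    // Same query as above, minus :with; ?concert already appears in :find.
    public static Collection<List<Object>> demandCounts(Object db, String campaignId) {
        return Peer.q(
            "[:find ?concert (count-distinct ?demand)"
          + " :in $ ?campaignId"
          + " :where"
          + " [?c :campaign/id ?campaignId]"
          + " [?concert :crowd-concert/campaign ?c]"
          + " [?demand :demand/concert ?concert]]",
            db, campaignId);
    }
}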

Export more than 100,000 rows to CSV in Vaadin

I am trying to export more than 100,000 records from a Vaadin table to a CSV file, because that format supports more than 1 million rows. I tried this code:
CsvExport csvExport = new CsvExport(exporttable, exportedReportName, exportedReportName);
csvExport.excludeCollapsedColumns();
csvExport.setExportFileName(exportedReportName + ".csv");
csvExport.setDisplayTotals(false);
csvExport.export();
I am using tableexport-for-vaadin-1.5.1.5 and poi-3.10-FINAL.jar, and I get this exception:
Caused by: java.lang.IllegalArgumentException: Invalid row number (65536) outside allowable range (0..65535)
at org.apache.poi.hssf.usermodel.HSSFRow.setRowNum(HSSFRow.java:239)
at org.apache.poi.hssf.usermodel.HSSFRow.<init>(HSSFRow.java:87)
at org.apache.poi.hssf.usermodel.HSSFRow.<init>(HSSFRow.java:71)
at org.apache.poi.hssf.usermodel.HSSFSheet.createRow(HSSFSheet.java:232)
at org.apache.poi.hssf.usermodel.HSSFSheet.createRow(HSSFSheet.java:68)
at com.vaadin.addon.tableexport.ExcelExport.addDataRow(ExcelExport.java:518)
at com.vaadin.addon.tableexport.ExcelExport.addDataRows(ExcelExport.java:469)
at com.vaadin.addon.tableexport.ExcelExport.convertTable(ExcelExport.java:264)
at com.vaadin.addon.tableexport.TableExport.export(TableExport.java:80)
at com.mobinets.fm.gui.views.AlarmStatusView$4.buttonClick(AlarmStatusView.java:605)
at sun.reflect.GeneratedMethodAccessor277.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.vaadin.event.ListenerMethod.receiveEvent(ListenerMethod.java:508)
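The trace shows CsvExport still writing through POI's HSSF (binary .xls) code path, which is hard-capped at 65,536 rows per sheet, so the export hits that limit regardless of the target file name. A minimal sketch that sidesteps POI entirely (assuming a Vaadin Table like exporttable above; the class name and output path are illustrative):
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import com.vaadin.data.Container;
import com.vaadin.ui.Table;

public final class PlainCsvExport {
    // Streams every row of the table's container straight to a CSV file,
    // so there is no POI workbook and no 65,536-row sheet limit.
    public static void export(Table table, String path) throws IOException {
        Container container = table.getContainerDataSource();
        PrintWriter out = new PrintWriter(new FileWriter(path));
        try {
            for (Object itemId : container.getItemIds()) {
                StringBuilder line = new StringBuilder();
                for (Object propertyId : container.getContainerPropertyIds()) {
                    if (line.length() > 0) line.append(',');
                    Object value = container.getItem(itemId)
                            .getItemProperty(propertyId).getValue();
                    line.append(value == null ? "" : escape(value.toString()));
                }
                out.println(line);
            }
        } finally {
            out.close();
        }
    }

    // Minimal CSV escaping: quote fields containing commas, quotes or newlines.
    private static String escape(String s) {
        if (s.contains(",") || s.contains("\"") || s.contains("\n")) {
            return '"' + s.replace("\"", "\"\"") + '"';
        }
        return s;
    }
}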

Getting deadlock when using C3P0, OneJar and MySQL

I am getting a deadlock when using C3P0, OneJar and MySQL. The C3P0 settings are as follows:
pooledDataSource = new ComboPooledDataSource();
// database properties (actual values elided in the original post):
pooledDataSource.setDriverClass("com.mysql.jdbc.Driver"); // throws PropertyVetoException
pooledDataSource.setJdbcUrl("jdbc:mysql://host:3306/db");
pooledDataSource.setUser("user");
pooledDataSource.setPassword("password");
pooledDataSource.setMaxPoolSize(100);
pooledDataSource.setMinPoolSize(10);
pooledDataSource.setInitialPoolSize(10);
pooledDataSource.setAcquireIncrement(4);
pooledDataSource.setIdleConnectionTestPeriod(900);
pooledDataSource.setTestConnectionOnCheckin(true);
pooledDataSource.setMaxIdleTimeExcessConnections(300);
pooledDataSource.setNumHelperThreads(6);
pooledDataSource.setMaxAdministrativeTaskTime(60);
pooledDataSource.setUnreturnedConnectionTimeout(120);
pooledDataSource.setDebugUnreturnedConnectionStackTraces(true);
Thread dump is as follows:
Found one Java-level deadlock:
=============================
"com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#3":
waiting to lock monitor 0x00007f443abe32a0 (object 0x00000000c6071690, a com.simontuffs.onejar.JarClassLoader),
which is held by "main"
"main":
waiting to lock monitor 0x00007f443abe2b68 (object 0x00000000c615e950, a com.simontuffs.onejar.JarClassLoader$1),
which is held by "com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#3"
Java stack information for the threads listed above:
===================================================
"com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#3":
at java.lang.ClassLoader.loadClass(ClassLoader.java:403)
- waiting to lock <0x00000000c6071690> (a com.simontuffs.onejar.JarClassLoader)
at com.simontuffs.onejar.JarClassLoader.loadClass(JarClassLoader.java:630)
at java.lang.ClassLoader.loadClass(ClassLoader.java:410)
- locked <0x00000000c615e950> (a com.simontuffs.onejar.JarClassLoader$1)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at com.simontuffs.onejar.JarClassLoader$1.loadClass(JarClassLoader.java:296)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
- locked <0x00000000c615e950> (a com.simontuffs.onejar.JarClassLoader$1)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at com.simontuffs.onejar.JarClassLoader$1.loadClass(JarClassLoader.java:296)
at com.mysql.jdbc.DatabaseMetaData$9.forEach(DatabaseMetaData.java:5276)
at com.mysql.jdbc.DatabaseMetaData$9.forEach(DatabaseMetaData.java:5239)
at com.mysql.jdbc.IterateBlock.doForAll(IterateBlock.java:50)
at com.mysql.jdbc.DatabaseMetaData.getTables(DatabaseMetaData.java:5238)
at com.mchange.v2.c3p0.impl.DefaultConnectionTester.activeCheckConnectionNoQuery(DefaultConnectionTester.java:185)
at com.mchange.v2.c3p0.impl.DefaultConnectionTester.activeCheckConnection(DefaultConnectionTester.java:62)
at com.mchange.v2.c3p0.AbstractConnectionTester.activeCheckConnection(AbstractConnectionTester.java:67)
at com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool$1PooledConnectionResourcePoolManager.testPooledConnection(C3P0PooledConnectionPool.java:368)
at com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool$1PooledConnectionResourcePoolManager.refurbishResourceOnCheckin(C3P0PooledConnectionPool.java:301)
at com.mchange.v2.resourcepool.BasicResourcePool.attemptRefurbishResourceOnCheckin(BasicResourcePool.java:1606)
at com.mchange.v2.resourcepool.BasicResourcePool.access$200(BasicResourcePool.java:32)
at com.mchange.v2.resourcepool.BasicResourcePool$1RefurbishCheckinResourceTask.run(BasicResourcePool.java:1228)
at com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread.run(ThreadPoolAsynchronousRunner.java:547)
"main":
at java.lang.ClassLoader.loadClass(ClassLoader.java:403)
- waiting to lock <0x00000000c615e950> (a com.simontuffs.onejar.JarClassLoader$1)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at com.simontuffs.onejar.JarClassLoader$1.loadClass(JarClassLoader.java:296)
at com.simontuffs.onejar.JarClassLoader.findClass(JarClassLoader.java:642)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
- locked <0x00000000c6071690> (a com.simontuffs.onejar.JarClassLoader)
at com.simontuffs.onejar.JarClassLoader.loadClass(JarClassLoader.java:630)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at org.hibernate.cfg.SettingsFactory.createBatcherFactory(SettingsFactory.java:443)
at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:195)
at org.hibernate.cfg.Configuration.buildSettingsInternal(Configuration.java:2863)
at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2859)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1870)
at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:906)
at org.hibernate.ejb.HibernatePersistence.createContainerEntityManagerFactory(HibernatePersistence.java:74)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:268)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:310)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1514)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
- locked <0x00000000fa3ce208> (a java.util.concurrent.ConcurrentHashMap)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:591)
- locked <0x00000000fa3ce258> (a java.util.concurrent.ConcurrentHashMap)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
- locked <0x00000000fa3ce358> (a java.lang.Object)
at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:139)
at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:93)
at com.goldensource.activegateway.backoffice.launcher.BackOfficeJobLauncher.sendNotification(BackOfficeJobLauncher.java:464)
at com.goldensource.activegateway.backoffice.launcher.BackOfficeJobLauncher.main(BackOfficeJobLauncher.java:186)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.simontuffs.onejar.Boot.run(Boot.java:340)
at com.simontuffs.onejar.Boot.main(Boot.java:166)
Found 1 deadlock.
But when I use jTDS to connect to a SQL Server database, everything works fine. Can anyone tell me what's going wrong?
So, this looks like an issue with c3p0's helper threads trying to load classes via the OneJar classloader. Try updating to the latest pre-release of c3p0-0.9.5 (currently c3p0-0.9.5-pre6) and setting the configuration property contextClassLoaderSource to library or none. See http://www.mchange.com/projects/c3p0/#contextClassLoaderSource
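A minimal sketch of wiring that up (assumption: c3p0 reads any config param from a system property with the c3p0. prefix, so this should be equivalent to putting contextClassLoaderSource=library in c3p0.properties):
// Must run before the pool is constructed, e.g. first thing in main().
// "library" makes c3p0's helper threads inherit the classloader that
// loaded c3p0 itself rather than the OneJar context classloader.
System.setProperty("c3p0.contextClassLoaderSource", "library");

ComboPooledDataSource pooledDataSource = new ComboPooledDataSource();
// ... same pool settings as above ...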

Problematic results after Elasticsearch upgrade

I've upgraded Elasticsearch from 0.19.4 to 0.20.6 and I'm not getting the desired results. I'm doing a dynamic search on key presses, and code that worked fine before now behaves differently.
The search:
''   = (empty field)  fine
a    = 9400 hits
ab   = 126 hits       ERROR
abc  = 2 hits         ERROR
abcd = 0 hits         fine
Dependencies I've changed:
runtime 'org.elasticsearch:elasticsearch-lang-groovy:1.1.0' --> runtime 'org.elasticsearch:elasticsearch-lang-groovy:1.3.0'
runtime 'org.elasticsearch:elasticsearch:0.19.4' --> runtime 'org.elasticsearch:elasticsearch:0.20.6'
runtime 'org.xerial.snappy:snappy-java:1.0.4.1' (new)
Stack:
2013-04-09 10:47:58,130 [http-bio-8080-exec-2] DEBUG xxxx.SearchController - result stuff is: [hits:org.elasticsearch.search.internal.InternalSearchHits@9b0d61b]
2013-04-09 10:47:58,137 [http-bio-8080-exec-2] ERROR xxxx.SearchController - Problem...
org.codehaus.groovy.grails.web.converters.exceptions.ConverterException: Error converting Bean with class org.elasticsearch.search.internal.InternalSearchHit
at grails.converters.JSON.value(JSON.java:199)
at grails.converters.JSON.convertAnother(JSON.java:162)
at grails.converters.JSON.value(JSON.java:199)
at grails.converters.JSON.convertAnother(JSON.java:162)
at grails.converters.JSON.value(JSON.java:199)
at grails.converters.JSON.convertAnother(JSON.java:162)
at grails.converters.JSON.value(JSON.java:199)
at grails.converters.JSON.render(JSON.java:134)
at grails.converters.JSON.render(JSON.java:150)
at xxx.xxxx.xxxx.SearchController.autocomplete(SearchController.groovy:514)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.reflect.InvocationTargetException
... 13 more
Caused by: java.lang.NullPointerException
at org.elasticsearch.common.compress.lzf.LZFCompressor.isCompressed(LZFCompressor.java:76)
at org.elasticsearch.common.compress.CompressorFactory.compressor(CompressorFactory.java:124)
at org.elasticsearch.common.compress.CompressorFactory.uncompressIfNeeded(CompressorFactory.java:174)
at org.elasticsearch.search.internal.InternalSearchHit.sourceRef(InternalSearchHit.java:172)
at org.elasticsearch.search.internal.InternalSearchHit.getSourceRef(InternalSearchHit.java:181)
... 13 more
Where the code fails:
try {
    log.debug("result stuff is: ${results}");
    render results as JSON
}
catch (Exception e) {
    log.error("Problem...", e);
}
This turned out to be due to a slight change in the ES response structure; I had to rewrite my conversion method :)
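For anyone hitting the same NullPointerException: it is thrown while the Grails JSON converter reflects over InternalSearchHit's getters. A minimal sketch of the rewrite idea in Java (assumptions: response is the SearchResponse, the hits carry _source, SearchHit.sourceAsMap() is available as in 0.20.x, and the class name is illustrative):
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.search.SearchHit;

public class HitFlattener {
    // Flatten hits to plain maps before JSON conversion, so the converter
    // never reflects over the hit bean's getters.
    public static List<Map<String, Object>> flatten(SearchResponse response) {
        List<Map<String, Object>> rows = new ArrayList<Map<String, Object>>();
        for (SearchHit hit : response.getHits().getHits()) {
            rows.add(hit.sourceAsMap()); // plain Map view of _source
        }
        return rows;
    }
}
Then render the returned list as JSON instead of the raw hits object.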