JMeter CLI report generation fails - org.apache.jmeter.report.dashboard.GenerationException: Data exporter "json" is unable to export data

JMeter 5.3
I use the CLI as follows:
C:\Users\guyl\OneDrive - xxxxLTD\Guy\apache-jmeter-5.3\bin>jmeter -n -t "C:\Users\guyl\OneDrive - xxxxLTD\Guy\JMeter\DCS DB threaded test.jmx" -l"C:\Users\guyl\OneDrive - xxxxLTD\Guy\JMeter\db_report.csv" -Jthreads=5 -Jloops=100 -e -o"C:\Users\guyl\OneDrive - xxxxLTD\Guy\JMeter\output\19082020\"
Creating summariser <summary>
Created the tree successfully using C:\Users\guyl\OneDrive - xxxLTD\Guy\JMeter\DCS DB threaded test.jmx
Starting standalone test # Wed Aug 26 14:22:56 IDT 2020 (1598440976507)
Waiting for possible Shutdown/StopTestNow/HeapDump/ThreadDump message on port 4445
Generate Summary Results + 91 in 00:00:03 = 32.0/s Avg: 35 Min: 0 Max: 1031 Err: 0 (0.00%) Active: 2 Started: 2 Finished: 0
summary + 91 in 00:00:03 = 32.1/s Avg: 35 Min: 0 Max: 1031 Err: 0 (0.00%) Active: 2 Started: 2 Finished: 0
summary + 104409 in 00:00:29 = 3621.4/s Avg: 1 Min: 0 Max: 448 Err: 50400 (48.27%) Active: 0 Started: 5 Finished: 5
summary = 104500 in 00:00:32 = 3299.8/s Avg: 1 Min: 0 Max: 1031 Err: 50400 (48.23%)
Generate Summary Results + 104409 in 00:00:29 = 3620.9/s Avg: 1 Min: 0 Max: 448 Err: 50400 (48.27%) Active: 0 Started: 5 Finished: 5
Generate Summary Results = 104500 in 00:00:32 = 3299.1/s Avg: 1 Min: 0 Max: 1031 Err: 50400 (48.23%)
Tidying up ... # Wed Aug 26 14:23:28 IDT 2020 (1598441008843)
Error generating the report: org.apache.jmeter.report.dashboard.GenerationException: Data exporter "json" is unable to export data.
... end of run
At the end, you can see we have:
Error generating the report: org.apache.jmeter.report.dashboard.GenerationException: Data exporter "json" is unable to export data.
... end of run
Note that it didn't matter whether the report's extension was CSV or JTL.
I was able to generate the JTL report, and then run jmeter -g <my JTL file>, but I'd like the -e option to work.
Update: Now I get errors with the -g option as well:
C:\Users\guyl\OneDrive - xxxLTD\Guy\apache-jmeter-5.3\bin>jmeter -g "C:\Users\guyl\OneDrive - xxxLTD\Guy\JMeter\db_report" -o"C:\Users\guyl\OneDrive - xxxLTD\Guy\JMeter\output\19082020\"
An error occurred: Data exporter "json" is unable to export data.
errorlevel=1
Press any key to continue . . .
Here is what I found in the jmeter.log file:
2020-08-26 14:37:51,812 INFO o.a.j.r.p.AbstractSampleConsumer: class org.apache.jmeter.report.processor.FilterConsumer#stopProducing(): nameFilter produced 1567500 samples
2020-08-26 14:37:51,812 INFO o.a.j.r.p.AbstractSampleConsumer: class org.apache.jmeter.report.processor.FilterConsumer#stopProducing(): dateRangeFilter produced 313500 samples
2020-08-26 14:37:51,812 INFO o.a.j.r.p.AbstractSampleConsumer: class org.apache.jmeter.report.processor.NormalizerSampleConsumer#stopProducing(): normalizer produced 104500 samples
2020-08-26 14:37:51,813 INFO o.a.j.r.p.CsvFileSampleSource: produce(): 104500 samples produced in 6s 70 ms on channel 0
2020-08-26 14:37:51,813 INFO o.a.j.r.d.ReportGenerator: Exporting data using exporter:'json' of className:'org.apache.jmeter.report.dashboard.JsonExporter'
2020-08-26 14:37:51,814 INFO o.a.j.r.d.JsonExporter: Found data for consumer statisticsSummary in context
2020-08-26 14:37:51,814 INFO o.a.j.r.d.JsonExporter: Creating statistics for overall
2020-08-26 14:37:51,815 INFO o.a.j.r.d.JsonExporter: Creating statistics for other transactions
2020-08-26 14:37:51,815 INFO o.a.j.r.d.JsonExporter: Checking output folder
2020-08-26 14:37:51,816 ERROR o.a.j.JMeter: An error occurred:
org.apache.jmeter.report.dashboard.GenerationException: Data exporter "json" is unable to export data.
at org.apache.jmeter.report.dashboard.ReportGenerator.exportData(ReportGenerator.java:385) ~[ApacheJMeter_core.jar:5.3]
at org.apache.jmeter.report.dashboard.ReportGenerator.generate(ReportGenerator.java:258) ~[ApacheJMeter_core.jar:5.3]
at org.apache.jmeter.JMeter.start(JMeter.java:545) [ApacheJMeter_core.jar:5.3]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_241]
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_241]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_241]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:1.8.0_241]
at org.apache.jmeter.NewDriver.main(NewDriver.java:252) [ApacheJMeter.jar:5.3]
Caused by: org.apache.jmeter.report.dashboard.ExportException: Error creating output folder C:\Users\guyl\OneDrive - xxx LTD\Guy\JMeter\output\19082020"
at org.apache.jmeter.report.dashboard.JsonExporter.checkAndGetOutputFolder(JsonExporter.java:112) ~[ApacheJMeter_core.jar:5.3]
at org.apache.jmeter.report.dashboard.JsonExporter.export(JsonExporter.java:77) ~[ApacheJMeter_core.jar:5.3]
at org.apache.jmeter.report.dashboard.ReportGenerator.exportData(ReportGenerator.java:379) ~[ApacheJMeter_core.jar:5.3]
... 7 more
Caused by: java.io.IOException: Unable to create directory C:\Users\guyl\OneDrive - xxxLTD\Guy\JMeter\output\19082020"
at org.apache.commons.io.FileUtils.forceMkdir(FileUtils.java:2491) ~[commons-io-2.6.jar:2.6]
at org.apache.jmeter.report.dashboard.JsonExporter.checkAndGetOutputFolder(JsonExporter.java:110) ~[ApacheJMeter_core.jar:5.3]
at org.apache.jmeter.report.dashboard.JsonExporter.export(JsonExporter.java:77) ~[ApacheJMeter_core.jar:5.3]
at org.apache.jmeter.report.dashboard.ReportGenerator.exportData(ReportGenerator.java:379) ~[ApacheJMeter_core.jar:5.3]
... 7 more
So it all started from:
Caused by: java.io.IOException: Unable to create directory C:\Users\guyl\OneDrive - xxx LTD\Guy\JMeter\output\19082020"
It's supposed to create that folder if it doesn't exist, isn't it?

First, JMeter will only create one level of folders, not the full hierarchy.
Second, try to avoid a folder path with spaces in it.
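A minimal sketch of a workaround, reusing the output path from the question: pre-create the whole hierarchy yourself (cmd's mkdir creates intermediate folders by default), so JMeter only has to write into an existing, empty folder - recent JMeter versions accept the -o folder as long as it is empty:
REM Create the full output hierarchy up front; leave it empty for the dashboard
mkdir "C:\Users\guyl\OneDrive - xxxLTD\Guy\JMeter\output\19082020"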

I was getting this error as well, and the solution to my problem was that in this command:
jmeter -n -t "location of test file" -l "location of your result file" -e -o "location of reports folder"
the third option, "location of reports folder", needed to be a brand-new, non-existing folder. Once the command was left to create the folder itself, my HTML reports were generated successfully!

I've just experienced the same error message, using JMeter 5.4.1.
In my case, I ended up investigating the problem using ProcMon.
ProcMon showed that the closing double quote " was included in the folder path, which is an invalid character in Windows folder/file names. In the command from the question, that happens because the path passed to -o ends with a backslash right before the closing quote, which the Windows command line treats as escaping the quote.
I was fortunate enough that my path did not have any spaces in it, so removing the double quote did not affect me. Try to use folders without spaces in their names, or alternatively use the 8.3 short names in the paths instead.
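Under that reading, a sketch of the corrected invocation from the question - identical except that the trailing backslash before the closing quote of -o is dropped (quoted paths with spaces are otherwise fine):
REM No backslash right before the closing quote, so the quote is not escaped into the path
jmeter -n -t "C:\Users\guyl\OneDrive - xxxLTD\Guy\JMeter\DCS DB threaded test.jmx" ^
  -l "C:\Users\guyl\OneDrive - xxxLTD\Guy\JMeter\db_report.csv" ^
  -Jthreads=5 -Jloops=100 ^
  -e -o "C:\Users\guyl\OneDrive - xxxLTD\Guy\JMeter\output\19082020"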

Related

How to read/translate macOS 12 (Monterey) .ips crash files?

Starting in macOS 12 (Monterey), the system apparently writes crash files as .ips files, instead of the traditional .crash file format.
The file appears to contain JSON data:
{"app_name":"Helper","timestamp":"2021-10-30 18:49:32.00 +0100","app_version":"3.0.0(66) beta","slice_uuid":"673198dd-94ac-31a7-9e81-09fe6c781255","build_version":"3.0.0.66","platform":1,"bundleID":"com.dislt.helper","share_with_app_devs":0,"is_first_party":0,"bug_type":"309","os_version":"macOS 12.0.1 (21A559)","incident_id":"CC03C2EC-C1D4-4F6E-AA1F-6C4EC555D6B8","name":"Helper"}
{
"uptime" : 91000,
"procLaunch" : "2021-10-30 18:49:29.7791 +0100",
"procRole" : "Unspecified",
"version" : 2,
"userID" : 501,
"deployVersion" : 210,
"modelCode" : "MacBookPro14,3",
"procStartAbsTime" : 91844701503187,
"coalitionID" : 1244,
"osVersion" : {
"train" : "macOS 12.0.1",
"build" : "21A559",
"releaseType" : "User"
},
"captureTime" : "2021-10-30 18:49:32.4572 +0100",
"incident" : "92A89610-D70A-4D93-A974-A9018BB5C72A",
"bug_type" : "309",
"pid" : 77765,
"procExitAbsTime" : 91847378271126,
"cpuType" : "X86-64",
"procName" : "Helper",
...
When I preview the file or open it in the Console app, a traditional crash report is automatically generated:
-------------------------------------
Translated Report (Full Report Below)
-------------------------------------
Process: Helper [77765]
Path: /Users/USER/Library/Application Support/Helper.app/Contents/MacOS/Helper
Identifier: com.distl.helper
Version: 3.0.0(66) beta (3.0.0.66)
Code Type: X86-64 (Native)
Parent Process: TestBead [77726]
Responsible: TestBead [77726]
User ID: 501
Date/Time: 2021-10-30 18:49:32.4572 +0100
OS Version: macOS 12.0.1 (21A559)
Report Version: 12
Bridge OS Version: 3.0 (14Y908)
Anonymous UUID: CC03C2EC-C1D4-4F6E-AA1F-6C4EC555D6B8
Time Awake Since Boot: 91000 seconds
System Integrity Protection: enabled
Crashed Thread: 1 Dispatch queue: com.apple.NSXPCConnection.user.anonymous.77726
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x00007f780071a000
Exception Codes: 0x0000000000000001, 0x00007f780071a000
Exception Note: EXC_CORPSE_NOTIFY
Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [77765]
...
I have customer support and development tools that scan these crash report files automatically, and I'd like to find out whether there's a way to automate translating the JSON data back into the traditional crash report format.
I'd like to do this to (a) avoid rewriting my crash report scanning tools (although that wouldn't be impossible), and (b) automatically translate these files into a human-readable format without resorting to opening each file in the Console app.
I've run into the same problem. I haven't tried it myself yet, but someone has already created an ips2crash command, available on GitHub. As the name implies, it should convert an .ips file to the (now) legacy crash report format.
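Separately, if your scanning tools only need a few fields rather than a full translation: as shown in the question, the first line of an .ips file is a standalone JSON header and the remainder is a second JSON document, so the two parts can be pulled apart with standard tools. A small shell sketch, assuming a file named Helper.ips and that jq is installed:
# The metadata header is the first line: app name, bug type, OS version, ...
head -n 1 Helper.ips | jq -r '.app_name, .bug_type, .os_version'
# The rest of the file is the detailed report as a second JSON document
tail -n +2 Helper.ips | jq -r '.procName, .captureTime, .cpuType'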

WebSphere Liberty Profile blocking on Datasource lookup

I'm trying to configure a datasource in IBM WebSphere Liberty Profile (16.0.0.3) and this is what I've done so far:
server.xml
<authData id="dbuser" password="{xor}blablabla" user="MY_USER"/>
<dataSource id="Oracle" isolationLevel="TRANSACTION_READ_COMMITTED"
jdbcDriverRef="OracleDriver"
jndiName="EPMS_DS"
recoveryAuthDataRef="dbuser"
type="javax.sql.ConnectionPoolDataSource">
<properties.oracle databaseName="DBNAME" portNumber="1521" serverName="SERVERNAME"/>
</dataSource>
<jdbcDriver id="OracleDriver"
javax.sql.ConnectionPoolDataSource="oracle.jdbc.pool.OracleConnectionPoolDataSource"
libraryRef="shared-library"/>
web.xml
<resource-env-ref>
<description>The Oracle DS</description>
<resource-env-ref-name>jdbc/OracleDS</resource-env-ref-name>
<resource-env-ref-type>javax.sql.DataSource</resource-env-ref-type>
</resource-env-ref>
ibm-web-bnd.xml
<resource-ref name="jdbc/OracleDS" binding-name="EPMS_DS">
<authentication-alias name="dbuser" />
</resource-ref>
However, besides the application server taking more than 2 minutes to start up, my application seems to freeze on the following instruction:
ctx = new InitialContext();
ctx.lookup("java:comp/env/jdbc/OracleDS");
The log doesn't show any errors, the last line it shows is an application's debug message indicating it is going to do a JNDI lookup.
I've also tried different configurations in server.xml, without <authData> and explicitly defining user and password on the datasource, but with identical results:
<dataSource id="Oracle" isolationLevel="TRANSACTION_READ_COMMITTED" jdbcDriverRef="OracleDriver" jndiName="EPMS_DS" type="javax.sql.ConnectionPoolDataSource">
<properties.oracle URL="jdbc:oracle:thin:@SERVERNAME:1521:DBNAME" password="{xor}blablabla" user="MY_USER"/>
</dataSource>
Sadly, Liberty Profile doesn't seem to provide a way to test the DB connection, but everything seems correctly configured (I can assure you the credentials are correct, as well as the server name and port). What am I missing here?
EDIT #1
Following njr's suggestion, I've performed a thread dump and here is a summary:
- waiting on com.ibm.tx.jta.impl.EventSemaphore#737eaefc
- waiting on com.ibm.ws.objectManager.FileLogOutput$FlushHelper#19d51071
- waiting on com.ibm.ws.objectManager.FileLogOutput$NotifyHelper#2fa0da91
- waiting on com.ibm.ws.objectManager.ObjectManagerState$CheckpointHelper#5b0919fc
- waiting on com.ibm.ws.sib.msgstore.persistence.dispatcher.SpillDispatcher$DispatchingLock#1620db94
(8 occurrences, but different instances)
...
- waiting on com.ibm.ws.threading.internal.BoundedBuffer$GetQueueLock#c8a05b6
(56 occurrences, but different instances)
...
- waiting on java.lang.Object#4c1d5897
- waiting on java.lang.ref.Reference$Lock#5448da4c
- waiting on java.lang.ref.ReferenceQueue$Lock#f91b025
- waiting on java.util.LinkedList#5b213416
- waiting on java.util.LinkedList#6cb46e1f
- waiting on java.util.TaskQueue#f50561c
(14 occurrences, but different instances)
...
- waiting on java.util.concurrent.atomic.AtomicReference#5476d077
- waiting on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#4da17c93
- waiting on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#513339c6
- waiting on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#5dc2ae0f
- waiting on org.eclipse.osgi.framework.eventmgr.EventManager$EventThread#236970be
- waiting on org.eclipse.osgi.framework.eventmgr.EventManager$EventThread#6dfdd5
- waiting on org.eclipse.osgi.framework.eventmgr.EventManager$EventThread#72ce4e1c
- blocked on com.ibm.tx.jta.embeddable.impl.EmbeddableTMHelper#5748c911
- blocked on com.ibm.tx.jta.embeddable.impl.EmbeddableTMHelper#5748c911
Can someone help me to interpret this?
EDIT #2
Here's the complete stack trace of the blocked threads:
LargeThreadPool-thread-148 [217] (BLOCKED)
com.ibm.tx.jta.embeddable.impl.EmbeddableTMHelper.start line: 63
com.ibm.tx.jta.util.TxTMHelper.start line: 461
com.ibm.tx.util.TMHelper.start line: 74
com.ibm.tx.jta.util.TxTMHelper.checkTMState line: 500
com.ibm.tx.util.TMHelper.checkTMState line: 116
com.ibm.tx.jta.impl.TranManagerSet.registerResourceInfo line: 270
com.ibm.ws.transaction.services.TransactionManagerService.registerResourceInfo line: 260
com.ibm.ejs.j2c.ConnectionManager.registerXAResourceInfo line: 2537
com.ibm.ejs.j2c.ConnectionManager.<init> line: 509
com.ibm.ejs.j2c.ConnectionManagerServiceImpl.getConnectionManager line: 407
com.ibm.ejs.j2c.ConnectionManagerServiceImpl.getConnectionManager line: 54
com.ibm.ws.jca.cm.AbstractConnectionFactoryService.createResource line: 146
com.ibm.ws.injectionengine.osgi.internal.IndirectJndiLookupObjectFactory.createResourceWithFilter line: 346
com.ibm.ws.injectionengine.osgi.internal.IndirectJndiLookupObjectFactory.createResource line: 319
com.ibm.ws.injectionengine.osgi.internal.IndirectJndiLookupObjectFactory.getObjectInstance line: 133
com.ibm.ws.injectionengine.osgi.internal.IndirectJndiLookupObjectFactory.getObjectInstance line: 99
com.ibm.wsspi.injectionengine.InjectionBinding.getInjectionObjectInstance line: 1556
com.ibm.wsspi.injectionengine.InjectionBinding.getInjectionObject line: 1433
com.ibm.wsspi.injectionengine.InjectionBinding.getInjectionObject line: 1389
com.ibm.ws.injectionengine.osgi.internal.naming.InjectionJavaColonHelper.getObjectInstance line: 116
com.ibm.ws.jndi.url.contexts.javacolon.internal.JavaURLContext.lookup line: 333
com.ibm.ws.jndi.url.contexts.javacolon.internal.JavaURLContext.lookup line: 371
org.apache.aries.jndi.DelegateContext.lookup line: 161
javax.naming.InitialContext.lookup line: 417
pt.sibs.epms.persistence.utils.EntityManagerFactoryController.jndiLookupUsed line: 264
pt.sibs.epms.persistence.utils.EntityManagerFactoryController.checkConfiguration line: 115
pt.sibs.epms.persistence.utils.EntityManagerFactoryController.<init> line: 95
pt.sibs.epms.persistence.utils.EntityManagerFactoryController.<init> line: 51
pt.sibs.epms.persistence.utils.EntityManagerFactoryController$SingletonHolder.<clinit> line: 81
pt.sibs.epms.persistence.utils.EntityManagerFactoryController.getInstance line: 88
pt.sibs.epms.util.logging.LoggerConfiguration.<clinit> line: 33
pt.sibs.epms.ecc.renderer.HtmlFormRenderer.<clinit> line: 25
java.lang.Class.forName0 line: not available [native method]
java.lang.Class.forName line: 348
com.ibm.ws.webcontainer.osgi.webapp.WebApp.addClassToHandlesTypesStartupSet line: 1104
com.ibm.ws.webcontainer.osgi.webapp.WebApp.scanForHandlesTypesClasses line: 1038
com.ibm.ws.webcontainer.webapp.WebApp.initializeServletContainerInitializers line: 2493
com.ibm.ws.webcontainer.webapp.WebApp.initialize line: 1037
com.ibm.ws.webcontainer.webapp.WebApp.initialize line: 6545
com.ibm.ws.webcontainer.osgi.DynamicVirtualHost.startWebApp line: 466
com.ibm.ws.webcontainer.osgi.DynamicVirtualHost.createRunnableHandler line: 264
com.ibm.ws.webcontainer.osgi.DynamicVirtualHost.createRunnableHandler line: 329
com.ibm.ws.http.internal.VirtualHostImpl.discriminate line: 251
com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink.ready line: 301
com.ibm.ws.http.channel.internal.inbound.HttpInboundLink.handleDiscrimination line: 471
com.ibm.ws.http.channel.internal.inbound.HttpInboundLink.handleNewRequest line: 405
com.ibm.ws.http.channel.internal.inbound.HttpInboundLink.processRequest line: 285
com.ibm.ws.http.channel.internal.inbound.HttpICLReadCallback.complete line: 66
com.ibm.ws.channel.ssl.internal.SSLReadServiceContext$SSLReadCompletedCallback.complete line: 1777
com.ibm.ws.tcpchannel.internal.WorkQueueManager.requestComplete line: 504
com.ibm.ws.tcpchannel.internal.WorkQueueManager.attemptIO line: 574
com.ibm.ws.tcpchannel.internal.WorkQueueManager.workerRun line: 929
com.ibm.ws.tcpchannel.internal.WorkQueueManager$Worker.run line: 1018
java.util.concurrent.ThreadPoolExecutor.runWorker line: 1142
java.util.concurrent.ThreadPoolExecutor$Worker.run line: 617
java.lang.Thread.run line: 745
And the second thread:
LargeThreadPool-thread-3 [33] (BLOCKED)
com.ibm.tx.jta.embeddable.impl.EmbeddableTMHelper.start line: 63
com.ibm.tx.jta.util.TxTMHelper.start line: 461
com.ibm.tx.util.TMHelper.start line: 74
com.ibm.tx.jta.util.TxTMHelper.checkTMState line: 500
com.ibm.tx.util.TMHelper.checkTMState line: 116
com.ibm.tx.jta.impl.TranManagerSet.begin line: 167
com.ibm.ejs.csi.TranStrategy.beginGlobalTx line: 593
com.ibm.ejs.csi.Required.preInvoke line: 56
com.ibm.ejs.csi.TransactionControlImpl.preInvoke line: 222
com.ibm.ejs.container.EJSContainer.preInvokeActivate line: 3176
com.ibm.ejs.container.EJSContainer.EjbPreInvoke line: 2576
com.ibm.ejs.container.TimedObjectWrapper.invokeCallback line: 84
com.ibm.ejs.container.TimerNpRunnable.doWork line: 196
com.ibm.ejs.container.TimerNpRunnable.run line: 103
java.util.concurrent.Executors$RunnableAdapter.call line: 511
java.util.concurrent.FutureTask.run line: 266
java.util.concurrent.ThreadPoolExecutor.runWorker line: 1142
java.util.concurrent.ThreadPoolExecutor$Worker.run line: 617
java.lang.Thread.run line: 745
Edit #3
The thread that is WAITING and, apparently, blocking the other two threads:
LargeThreadPool-thread-38 [103] (WAITING)
java.lang.Object.wait line: not available [native method]
java.lang.Object.wait line: 502
com.ibm.tx.jta.impl.EventSemaphore.waitEvent line: 71
com.ibm.tx.jta.impl.RecoveryManager.waitForReplayCompletion line: 1273
com.ibm.tx.jta.impl.TxRecoveryAgentImpl.initiateRecovery line: 413
com.ibm.ws.recoverylog.spi.RecoveryDirectorImpl.directInitialization line: 751
com.ibm.ws.recoverylog.spi.RecoveryDirectorImpl.driveLocalRecovery line: 1240
com.ibm.ws.recoverylog.spi.RecLogServiceImpl.start line: 125
com.ibm.tx.jta.embeddable.impl.EmbeddableTMHelper.start line: 130
com.ibm.tx.jta.util.TxTMHelper.start line: 461
com.ibm.tx.util.TMHelper.start line: 74
com.ibm.tx.jta.util.TxTMHelper.checkTMState line: 500
com.ibm.tx.util.TMHelper.checkTMState line: 116
com.ibm.tx.jta.impl.TranManagerSet.begin line: 167
com.ibm.ws.transaction.services.TransactionManagerService.begin line: 281
com.ibm.ws.concurrent.persistent.internal.PersistentExecutorImpl$PollingTask.run line: 2239
Not sure if it is related, but ffdc is showing the following exception:
------Start of DE processing------ = [09-11-2016 14:41:09:006 GMT]
Exception = com.ibm.ws.recoverylog.spi.LogIncompatibleException
Source = com.ibm.ws.recoverylog.spi.LogHandle
probeid = 326
Stack Dump = com.ibm.ws.recoverylog.spi.LogIncompatibleException
at com.ibm.ws.recoverylog.spi.LogFileHandle.fileOpen(LogFileHandle.java:522)
at com.ibm.ws.recoverylog.spi.LogHandle.openLog(LogHandle.java:324)
at com.ibm.ws.recoverylog.spi.MultiScopeRecoveryLog.openLog(MultiScopeRecoveryLog.java:602)
at com.ibm.ws.recoverylog.spi.RecoveryLogImpl.openLog(RecoveryLogImpl.java:77)
at com.ibm.tx.jta.impl.RecoveryManager.run(RecoveryManager.java:1835)
at java.lang.Thread.run(Thread.java:745)
In your Edit #3, the thread that is waiting in
com.ibm.tx.jta.impl.RecoveryManager.waitForReplayCompletion
will have spawned another RecoveryManager thread whose role is to access the transaction log files in your filesystem. That other thread should do the minimal amount of file processing necessary before signalling to the waiting thread that it may continue. Can you see another thread with a stack containing
com.ibm.tx.jta.impl.RecoveryManager.run ?
I am concerned about the LogIncompatibleException. It suggests that the transaction log files on your filesystem are corrupt. This should not cause the server to hang, and I believe you've hit a product defect.
If you need to make progress quickly, it may be appropriate in your scenario to delete the transaction log files. Please note that this is something we only suggest to customers with extreme care, as the transaction logs ensure the integrity of distributed transactions. In a production environment we'd generally recommend that such action only be taken under the guidance of IBM Level 3 Service, but in a test/evaluation scenario it can be applicable.
The Liberty transaction log info is stored in the /wlp/usr/servers/<server name>/tranlog directory. If appropriate, the tranlog and partnerlog subdirectories may be deleted and the server restarted.
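For the test/evaluation case described above, a minimal sketch of that cleanup, assuming a server named defaultServer and a standard wlp install root (adjust both to your environment):
# Stop the server first so the recovery logs are not in use
./wlp/bin/server stop defaultServer
# Remove the transaction and partner logs (test/evaluation environments only)
rm -rf ./wlp/usr/servers/defaultServer/tranlog/tranlog \
       ./wlp/usr/servers/defaultServer/tranlog/partnerlog
# Restart the server; recovery then starts from clean logs
./wlp/bin/server start defaultServer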

OpenShift V3 and incremental builds

I have some issues using incremental builds with the image ruby-22-centos7.
I added the following "save-artifacts" script to the .sti/bin directory:
#!/bin/sh -e
pushd ${HOME} >/dev/null
if [ -d ./bundle/ruby ]; then
tar cf - bundle/ruby
fi
popd >/dev/null
I get this error during the build steps:
I0330 13:53:05.022524 1 sti.go:213] Using assemble from image:///usr/libexec/s2i
I0330 13:53:05.022544 1 sti.go:213] Using run from image:///usr/libexec/s2i
I0330 13:53:05.022551 1 sti.go:213] Using save-artifacts from upload/src/.sti/bin
I0330 13:53:05.024552 1 sti.go:142] Existing image for tag 172.30.22.77:5000/blog/blog:latest detected for incremental build
I0330 13:53:05.024570 1 sti.go:147] Performing source build from file:///tmp/s2i-build462497527/upload/src
I0330 13:53:05.024654 1 sti.go:350] Saving build artifacts from image 172.30.22.77:5000/blog/blog:latest to path /tmp/s2i-build462497527/upload/artifacts
I0330 13:53:05.026788 1 docker.go:374] Both scripts and untarred source will be placed in '/tmp'
I0330 13:53:05.026820 1 docker.go:510] Creating container using config: {Hostname: Domainname: User: Memory:0 MemorySwap:0 CPUShares:0 CPUSet: AttachStdin:false AttachStdout:true AttachStderr:false PortSpecs:[] ExposedPorts:map[] Tty:false OpenStdin:false StdinOnce:false Env:[] Cmd:[/tmp/scripts/save-artifacts] DNS:[] Image:172.30.22.77:5000/blog/blog:latest Volumes:map[] VolumeDriver: VolumesFrom: WorkingDir: MacAddress: Entrypoint:[] NetworkDisabled:false SecurityOpts:[] OnBuild:[] Mounts:[] Labels:map[]}
I0330 13:53:05.685226 1 docker.go:524] Attaching to container
I0330 13:53:05.686542 1 docker.go:530] Starting container
E0330 13:53:10.836202 1 tar.go:207] Error reading next tar header: io: read/write on closed pipe
W0330 13:53:10.859154 1 sti.go:150] Clean build will be performed because of error saving previous build artifacts
I0330 13:53:10.859172 1 sti.go:152] ERROR: timeout waiting for tar stream
Any help would be greatly appreciated!

Unable to use JMockit with OpenJDK 1.7

While trying to use JMockit (1.21) with JUnit (4.8) test cases, I ran into an issue with OpenJDK (1.7). I'm using Eclipse. After searching on SO, I found the solution of adding the '-javaagent:path/to/JMockit/jar' argument to the JVM and putting the JMockit dependency before JUnit in Maven. But after adding that argument, my test won't run and instead I get the following error. Has anyone had this issue, and how did you solve it? It works if I use the Oracle JDK, but I'm looking for a solution where it runs with OpenJDK.
#
# A fatal error has been detected by the Java Runtime Environment:
#
# EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x0000000051cfbbe8, pid=9268, tid=2272
#
# JRE version: OpenJDK Runtime Environment (7.0) (build 1.7.0-45-asmurthy_2014_01_10_19_46-b00)
# Java VM: Dynamic Code Evolution 64-Bit Server VM (24.45-b06 mixed mode windows-amd64 compressed oops)
# Problematic frame:
# V [jvm.dll+0x6bbe8]
#
# Failed to write core dump. Minidumps are not enabled by default on client versions of Windows
#
# If you would like to submit a bug report, please visit:
# http://bugreport.sun.com/bugreport/crash.jsp
#
--------------- T H R E A D ---------------
Current thread (0x000000000270c000): VMThread [stack: 0x00000000074e0000,0x00000000075e0000] [id=2272]
siginfo: ExceptionCode=0xc0000005, reading address 0x0000000000000000
Registers:
RAX=0x0000000000000000, RBX=0x0000000000000000, RCX=0x0000000000000000, RDX=0x00000007fae04720
RSP=0x00000000075df190, RBP=0x00000000523a16d8, RSI=0x0000000002abe2b0, RDI=0x00000000523a16e0
R8 =0x0000000000000000, R9 =0x0000000000000100, R10=0x0000000000041999, R11=0x0000000008eab1f0
R12=0x000000000270c000, R13=0x00000007fb208040, R14=0x0000000000000001, R15=0x00000000000003d8
RIP=0x0000000051cfbbe8, EFLAGS=0x0000000000010206
Top of Stack: (sp=0x00000000075df190)
0x00000000075df190: 00000007fb208050 0000000002abe200
0x00000000075df1a0: 00000007fb208040 000000000270c000
0x00000000075df1b0: 00000000523a16e0 0000000051e0ecbc
0x00000000075df1c0: 0000000000000000 00000000523a16d8
0x00000000075df1d0: 0000000002abe2b0 000000000270c000
0x00000000075df1e0: 00000000521ea7b8 0000000052450100
0x00000000075df1f0: 0000000000000000 00000000521ea7a0
0x00000000075df200: 0000000000001000 00000000075df1e0
0x00000000075df210: 0000000000000100 0000000000000000
0x00000000075df220: 00000000073158d8 00000000000003d8
0x00000000075df230: 00000000073158d8 0000000002701ac0
0x00000000075df240: 00000000073154f0 0000000051e74dc7
0x00000000075df250: 00000000026c3de0 0000000000000001
0x00000000075df260: 0000000002abe2b0 0000000007315500
0x00000000075df270: 0000000007315500 00000000073154f0
0x00000000075df280: 0000000002701ac0 0000000051e74072
Instructions: (pc=0x0000000051cfbbe8)
0x0000000051cfbbc8: cc cc cc cc cc cc cc cc 48 89 5c 24 08 57 48 83
0x0000000051cfbbd8: ec 20 48 8b 05 17 20 69 00 48 8b 0d c0 e5 68 00
0x0000000051cfbbe8: 48 63 18 e8 60 c3 fa ff 33 ff 48 85 db 7e 37 66
0x0000000051cfbbf8: 0f 1f 84 00 00 00 00 00 48 8b 05 f1 1f 69 00 48
Register to memory mapping:
RAX=0x0000000000000000 is an unknown value
RBX=0x0000000000000000 is an unknown value
RCX=0x0000000000000000 is an unknown value
RDX=0x00000007fae04720 is an oop
{instance class}
- klass: {other class}
RSP=0x00000000075df190 is an unknown value
RBP=0x00000000523a16d8 is an unknown value
RSI=0x0000000002abe2b0 is pointing into the stack for thread: 0x00000000026c7000
RDI=0x00000000523a16e0 is an unknown value
R8 =0x0000000000000000 is an unknown value
R9 =0x0000000000000100 is an unknown value
R10=0x0000000000041999 is an unknown value
R11=0x0000000008eab1f0 is an unknown value
R12=0x000000000270c000 is an unknown value
R13=0x00000007fb208040 is an oop
{instance class}
- klass: {other class}
R14=0x0000000000000001 is an unknown value
R15=0x00000000000003d8 is an unknown value
Stack: [0x00000000074e0000,0x00000000075e0000], sp=0x00000000075df190, free space=1020k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
V [jvm.dll+0x6bbe8]
VM_Operation (0x0000000002abe2b0): RedefineClasses, mode: safepoint, requested by thread 0x00000000026c7000
--------------- P R O C E S S ---------------
Java Threads: ( => current thread )
0x000000000739c800 JavaThread "Attach Listener" daemon [_thread_blocked, id=6668, stack(0x0000000007c60000,0x0000000007d60000)]
0x000000000739b000 JavaThread "Signal Dispatcher" daemon [_thread_blocked, id=5620, stack(0x0000000007aa0000,0x0000000007ba0000)]
0x0000000007379800 JavaThread "Finalizer" daemon [_thread_blocked, id=9832, stack(0x00000000078c0000,0x00000000079c0000)]
0x0000000007370000 JavaThread "Reference Handler" daemon [_thread_blocked, id=7516, stack(0x0000000007680000,0x0000000007780000)]
0x00000000026c7000 JavaThread "main" [_thread_blocked, id=6580, stack(0x00000000029c0000,0x0000000002ac0000)]
Other Threads:
=>0x000000000270c000 VMThread [stack: 0x00000000074e0000,0x00000000075e0000] [id=2272]
VM state:at safepoint (normal execution)
VM Mutex/Monitor currently owned by a thread: ([mutex/lock_event])
[0x00000000026c3e60] Threads_lock - owner thread: 0x000000000270c000
[0x00000000026c4360] Heap_lock - owner thread: 0x00000000026c7000
[0x00000000026c4b60] RedefineClasses_lock - owner thread: 0x00000000026c7000
Heap
def new generation total 118016K, used 21035K [0x000000067ae00000, 0x0000000682e00000, 0x00000006fae00000)
eden space 104960K, 20% used [0x000000067ae00000, 0x000000067c28af18, 0x0000000681480000)
from space 13056K, 0% used [0x0000000681480000, 0x0000000681480000, 0x0000000682140000)
to space 13056K, 0% used [0x0000000682140000, 0x0000000682140000, 0x0000000682e00000)
tenured generation total 262144K, used 0K [0x00000006fae00000, 0x000000070ae00000, 0x00000007fae00000)
the space 262144K, 0% used [0x00000006fae00000, 0x00000006fae00000, 0x00000006fae00200, 0x000000070ae00000)
compacting perm gen total 21248K, used 4129K [0x00000007fae00000, 0x00000007fc2c0000, 0x0000000800000000)
the space 21248K, 19% used [0x00000007fae00000, 0x00000007fb208478, 0x00000007fb208600, 0x00000007fc2c0000)
No shared spaces configured.
Card table byte_map: [0x0000000005e60000,0x0000000006a90000] byte_map_base: 0x0000000002a89000
Polling page: 0x0000000000150000
Code Cache [0x0000000002da0000, 0x0000000003010000, 0x0000000005da0000)
total_blobs=187 nmethods=0 adapters=156 free_code_cache=48761Kb largest_free_block=49932032
Compilation events (0 events):
No events
GC Heap History (0 events):
No events
Deoptimization events (0 events):
No events
Internal exceptions (10 events):
Event: 1.210 Thread 0x00000000026c7000 Threw 0x000000067c036fa0 at C:\openjdk\jdk7u\hotspot\src\share\vm\interpreter\interpreterRuntime.cpp:347
Event: 1.211 Thread 0x00000000026c7000 Threw 0x000000067c03ae78 at C:\openjdk\jdk7u\hotspot\src\share\vm\prims\jvm.cpp:1244
Event: 1.212 Thread 0x00000000026c7000 Threw 0x000000067c045f10 at C:\openjdk\jdk7u\hotspot\src\share\vm\prims\jvm.cpp:1244
Event: 1.213 Thread 0x00000000026c7000 Threw 0x000000067c057398 at C:\openjdk\jdk7u\hotspot\src\share\vm\prims\jvm.cpp:1244
Event: 1.214 Thread 0x00000000026c7000 Threw 0x000000067c066510 at C:\openjdk\jdk7u\hotspot\src\share\vm\prims\jvm.cpp:1244
Event: 1.215 Thread 0x00000000026c7000 Threw 0x000000067c072ac0 at C:\openjdk\jdk7u\hotspot\src\share\vm\prims\jvm.cpp:1244
Event: 1.216 Thread 0x00000000026c7000 Threw 0x000000067c084678 at C:\openjdk\jdk7u\hotspot\src\share\vm\prims\jvm.cpp:1244
Event: 1.217 Thread 0x00000000026c7000 Threw 0x000000067c08c7b0 at C:\openjdk\jdk7u\hotspot\src\share\vm\prims\jvm.cpp:1244
Event: 1.218 Thread 0x00000000026c7000 Threw 0x000000067c0934a8 at C:\openjdk\jdk7u\hotspot\src\share\vm\prims\jvm.cpp:1244
Event: 1.218 Thread 0x00000000026c7000 Threw 0x000000067c09cac8 at C:\openjdk\jdk7u\hotspot\src\share\vm\prims\jvm.cpp:1244
Events (10 events):
Event: 1.215 loading class 0x0000000008956520 done
Event: 1.216 loading class 0x0000000008959de0
Event: 1.216 loading class 0x0000000008959de0 done
Event: 1.217 loading class 0x0000000008ea91d0
Event: 1.217 loading class 0x0000000008ea91d0 done
Event: 1.218 loading class 0x0000000008d643e0
Event: 1.218 loading class 0x0000000008d643e0 done
Event: 1.218 loading class 0x0000000008958d20
Event: 1.218 loading class 0x0000000008958d20 done
Event: 1.219 Executing VM operation: RedefineClasses
Dynamic libraries:
0x000000013f7c0000 - 0x000000013f7f1000 S:\OpenJDK\bin\javaw.exe
0x0000000076ef0000 - 0x0000000077099000 C:\WINDOWS\SYSTEM32\ntdll.dll
0x0000000076cb0000 - 0x0000000076dcf000 C:\WINDOWS\system32\kernel32.dll
0x000007fefcde0000 - 0x000007fefce4b000 C:\WINDOWS\system32\KERNELBASE.dll
0x0000000074990000 - 0x0000000074a19000 C:\WINDOWS\System32\SYSFER.DLL
0x000007feff0f0000 - 0x000007feff1cb000 C:\WINDOWS\system32\ADVAPI32.dll
0x000007fefed40000 - 0x000007fefeddf000 C:\WINDOWS\system32\msvcrt.dll
0x000007fefe660000 - 0x000007fefe67f000 C:\WINDOWS\SYSTEM32\sechost.dll
0x000007fefefc0000 - 0x000007feff0ed000 C:\WINDOWS\system32\RPCRT4.dll
0x0000000076df0000 - 0x0000000076eea000 C:\WINDOWS\system32\USER32.dll
0x000007fefe270000 - 0x000007fefe2d7000 C:\WINDOWS\system32\GDI32.dll
0x000007fefe360000 - 0x000007fefe36e000 C:\WINDOWS\system32\LPK.dll
0x000007fefe370000 - 0x000007fefe43a000 C:\WINDOWS\system32\USP10.dll
0x000007fefb460000 - 0x000007fefb654000 C:\WINDOWS\WinSxS\amd64_microsoft.windows.common-controls_6595b64144ccf1df_6.0.7601.18837_none_fa3b1e3d17594757\COMCTL32.dll
0x000007fefee30000 - 0x000007fefeea1000 C:\WINDOWS\system32\SHLWAPI.dll
0x000007fefe630000 - 0x000007fefe65e000 C:\WINDOWS\system32\IMM32.DLL
0x000007fefeeb0000 - 0x000007fefefb9000 C:\WINDOWS\system32\MSCTF.dll
0x0000000052420000 - 0x00000000524f2000 S:\OpenJDK\jre\bin\msvcr100.dll
0x0000000051c90000 - 0x000000005241e000 S:\OpenJDK\jre\bin\server\jvm.dll
0x000007fef82b0000 - 0x000007fef82b9000 C:\WINDOWS\system32\WSOCK32.dll
0x000007fefede0000 - 0x000007fefee2d000 C:\WINDOWS\system32\WS2_32.dll
0x000007feff1d0000 - 0x000007feff1d8000 C:\WINDOWS\system32\NSI.dll
0x000007fefaca0000 - 0x000007fefacdb000 C:\WINDOWS\system32\WINMM.dll
0x00000000770c0000 - 0x00000000770c7000 C:\WINDOWS\system32\PSAPI.DLL
0x000007feece20000 - 0x000007feece2f000 S:\OpenJDK\jre\bin\verify.dll
0x000007fee0320000 - 0x000007fee0348000 S:\OpenJDK\jre\bin\java.dll
0x000007fee6ee0000 - 0x000007fee6f03000 S:\OpenJDK\jre\bin\instrument.dll
0x000007fee06e0000 - 0x000007fee06f5000 S:\OpenJDK\jre\bin\zip.dll
0x000007fef1f90000 - 0x000007fef20b5000 C:\WINDOWS\system32\dbghelp.dll
VM Arguments:
jvm_args: -javaagent:S:\.m2\org\jmockit\jmockit\1.21\jmockit-1.21.jar -Dfile.encoding=ISO-8859-1
java_command: org.eclipse.jdt.internal.junit.runner.RemoteTestRunner -version 3 -port 61294 -testLoaderClass org.eclipse.jdt.internal.junit4.runner.JUnit4TestLoader -loaderpluginname org.eclipse.jdt.junit4.runtime -classNames com.examples.JMockitTest
Launcher Type: SUN_STANDARD
Environment Variables:
JRE_HOME=C:\Program Files (x86)\IBM\RationalSDLC\Common\Java5.0\jre
PATH=C:\Python27\;C:\Python27\Scripts;C:\Program Files (x86)\IBM\RationalSDLC\Clearquest\cqcli\bin;C:\PERL51001\Perl\site\bin;C:\PERL51001\Perl\bin;C:\Program Files (x86)\RSA SecurID Token Common;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\Program Files (x86)\Microsoft Application Virtualization Client;C:\Program Files (x86)\java\jre6\bin\;C:\Perl64\bin;C:\Program Files (x86)\Perforce;C:\Program Files (x86)\IBM\RationalSDLC\ClearCase\bin;C:\Program Files (x86)\IBM\RationalSDLC\common;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\Program Files\TortoiseGit\bin;C:\Program Files\Microsoft SQL Server\110\Tools\Binn\;C:\Program Files (x86)\Microsoft SDKs\TypeScript\1.0\;C:\Program Files\Microsoft SQL Server\120\Tools\Binn\;C:\Program Files\nodejs\
Thanks

Error in configuring object in hive

I am trying to build a sentiment analysis project using tweets and Hive in Hadoop. I have created a table on top of the tweets' JSON data, and I am able to view the tweets after querying SELECT * FROM tweets;. Then I created the following VIEW, but it does not retrieve any data when querying SELECT * FROM tweets_simple;
hive> CREATE VIEW tweets_simple AS
> SELECT
> id,
> cast ( from_unixtime( unix_timestamp(concat( '2014 ', substring(created_at,5,15)), 'yyyy MMM dd hh:mm:ss')) as timestamp) ts,
> text,
> user.time_zone
> FROM tweets;
OK
Time taken: 0.949 seconds
hive> Select * from tweets_simple limit 10;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201503010134_0001, Tracking URL = http://master:50030/jobdetails.jsp?jobid=job_201503010134_0001
Kill Command = /usr/lib/hadoop/bin/hadoop job -kill job_201503010134_0001
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2015-03-01 02:21:28,124 Stage-1 map = 0%, reduce = 0%
2015-03-01 02:22:25,083 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201503010134_0001 with errors
Error during job, obtaining debugging information...
Job Tracking URL: http://master:50030/jobdetails.jsp?jobid=job_201503010134_0001
Examining task ID: task_201503010134_0001_m_000002 (and more) from job job_201503010134_0001
Task with the most failures(4):
-----
Task ID:
task_201503010134_0001_m_000000
URL:
http://master:50030/taskdetails.jsp?jobid=job_201503010134_0001&tipid=task_201503010134_0001_m_000000
-----
Diagnostic Messages for this Task:
java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:413)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.ja
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1 HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
The data is in JSON format and you have created a table on top of it, so each time you query the data Hive needs a deserializer (SerDe) to interpret the JSON into a Hive-supported data format. So before running the query, add the jar:
ADD JAR /usr/lib/hive/lib/hive-serdes-1.0-SNAPSHOT.jar;
This jar contains the JSONSerDe class, which deserializes the JSON objects.
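A minimal sketch of the resulting session, assuming the jar lives at the path from the answer (adjust it to wherever your Hive install keeps the SerDe jar):
# Register the JSON SerDe for this session, then re-run the failing query
hive -e "
ADD JAR /usr/lib/hive/lib/hive-serdes-1.0-SNAPSHOT.jar;
SELECT * FROM tweets_simple LIMIT 10;
"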