New to WSO2 DAS

I installed WSO2 DAS on Linux with Oracle RAC and followed all the steps (I think!).
When starting it for the first time, I don't get any errors:
TID: [-1234] [] [2017-04-25 15:28:17,964] INFO {org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor} - Started Spark CLIENT in the cluster pointing to MASTER local with the application name : CarbonAnalytics and UI port : 4040 {org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor}
TID: [-1234] [] [2017-04-25 15:28:17,987] INFO {org.wso2.carbon.ml.core.internal.MLCoreDS} - H2O Server will start in local mode. {org.wso2.carbon.ml.core.internal.MLCoreDS}
TID: [-1234] [] [2017-04-25 15:28:18,655] INFO {org.wso2.carbon.ml.core.impl.H2OServer} - H2o Server has started. {org.wso2.carbon.ml.core.impl.H2OServer}
TID: [-1234] [] [2017-04-25 15:28:18,659] INFO {org.wso2.carbon.ml.core.internal.MLCoreDS} - Machine Learner Wizard URL : https://172.17.9.67:9443/ml {org.wso2.carbon.ml.core.internal.MLCoreDS}
TID: [-1234] [] [2017-04-25 15:28:18,660] INFO {org.wso2.carbon.ml.core.internal.MLCoreDS} - ML core bundle activated successfully. {org.wso2.carbon.ml.core.internal.MLCoreDS}
TID: [-1234] [] [2017-04-25 15:28:19,229] INFO {org.wso2.carbon.ntask.core.impl.AbstractQuartzTaskManager} - Task scheduled: [-1234][ANALYTICS_SPARK_EVENTING][STORE_EVENT_ROUTER_TASK] {org.wso2.carbon.ntask.core.impl.AbstractQuartzTaskManager}
TID: [-1234] [] [2017-04-25 15:28:19,315] INFO {org.wso2.carbon.core.init.JMXServerManager} - JMX Service URL : service:jmx:rmi://localhost:11111/jndi/rmi://localhost:9999/jmxrmi {org.wso2.carbon.core.init.JMXServerManager}
TID: [-1234] [] [2017-04-25 15:28:19,358] INFO {org.wso2.carbon.core.internal.StartupFinalizerServiceComponent} - Server : WSO2 Data Analytics Server-3.1.0 {org.wso2.carbon.core.internal.StartupFinalizerServiceComponent}
TID: [-1234] [] [2017-04-25 15:28:19,360] INFO {org.wso2.carbon.core.internal.StartupFinalizerServiceComponent} - WSO2 Carbon started in 40 sec {org.wso2.carbon.core.internal.StartupFinalizerServiceComponent}
TID: [-1234] [] [2017-04-25 15:28:19,983] INFO {org.wso2.carbon.ui.internal.CarbonUIServiceComponent} - Mgt Console URL : https://172.17.9.67:9443/carbon/ {org.wso2.carbon.ui.internal.CarbonUIServiceComponent}
TID: [-1] [] [2017-04-25 15:28:45,332] INFO {org.wso2.carbon.event.processor.manager.core.internal.CarbonEventManagementService} - Starting polling event receivers {org.wso2.carbon.event.processor.manager.core.internal.CarbonEventManagementService}
But I'm not able to open the management console; nothing is shown when loading the URL.
I also tried 172.17.9.67:9763/carbon/ after uncommenting the AllowHttp element.
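Independent of the server log, a quick way to tell whether the console ports are reachable at all is a TCP/TLS probe. A minimal sketch in Python; the host and ports are taken from the log above and are assumptions about this particular setup:

```python
import socket
import ssl

def tcp_reachable(host, port, timeout=3.0):
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        socket.create_connection((host, port), timeout=timeout).close()
        return True
    except OSError:
        return False

def tls_reachable(host, port, timeout=3.0):
    """Return True if a TCP connection and TLS handshake both succeed.

    Certificate verification is disabled because Carbon ships with a
    self-signed certificate by default.
    """
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except OSError:
        return False

# Example (host/ports from the log above):
#   tls_reachable("172.17.9.67", 9443)   # HTTPS management console
#   tcp_reachable("172.17.9.67", 9763)   # HTTP transport, if enabled
```

If the probe fails from your workstation but succeeds on the server itself, the problem is a firewall or interface-binding issue rather than Carbon failing to start.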

Related

WSO2, datamapper mediator fails with "[base64 string] is not defined in <eval> at line number 1"

On both Integration Studio 8.0.0 and Enterprise Integrator 6.6.0 I'm getting an error when using a datamapper mediator to convert a JSON payload into another JSON payload.
The error message contains the base64 encoding of the datamapper's .dmc file.
A full log mediator placed before the datamapper shows that I received a correct JSON response:
TID: [-1234] [] [2021-09-08 13:43:01,076] DEBUG {org.apache.synapse.mediators.builtin.LogMediator} - Start : Log mediator
TID: [-1234] [] [2021-09-08 13:43:01,076] INFO {org.apache.synapse.mediators.builtin.LogMediator} - To: http://www.w3.org/2005/08/addressing/anonymous, WSAction: , SOAPAction: , MessageID: urn:uuid:8f511c89-c468-4f24-a067-d0f476a63fb7, Direction: response, Payload: (the whole json response)
TID: [-1234] [] [2021-09-08 13:43:01,077] DEBUG {org.apache.synapse.mediators.builtin.LogMediator} - End : Log mediator
The full error is:
TID: [-1234] [] [2021-09-08 13:43:01,077] DEBUG {org.apache.synapse.mediators.base.SequenceMediator} - Building message. Sequence <SequenceMediator> is content aware
TID: [-1234] [] [2021-09-08 13:43:01,078] DEBUG {org.apache.synapse.transport.passthru.util.RelayUtils} - Content Type is application/json; charset=UTF-8
TID: [-1234] [] [2021-09-08 13:43:01,078] INFO {org.wso2.carbon.mediation.dependency.mgt.DependencyTracker} - Local entry : gov:datamappers/ricercaUoDataMapper.dmc was added to the Synapse configuration successfully
TID: [-1234] [] [2021-09-08 13:43:01,082] INFO {org.wso2.carbon.mediation.dependency.mgt.DependencyTracker} - Local entry : gov:datamappers/ricercaUoDataMapper_inputSchema.json was added to the Synapse configuration successfully
TID: [-1234] [] [2021-09-08 13:43:01,084] INFO {org.wso2.carbon.mediation.dependency.mgt.DependencyTracker} - Local entry : gov:datamappers/ricercaUoDataMapper_outputSchema.json was added to the Synapse configuration successfully
TID: [-1234] [] [2021-09-08 13:43:01,100] DEBUG {org.apache.synapse.config.SynapsePropertiesLoader} - Retrieving synapse properties from the cache
TID: [-1234] [] [2021-09-08 13:43:01,102] DEBUG {org.apache.synapse.commons.json.JsonReadOnlyStream} - #close
TID: [-1234] [] [2021-09-08 13:43:01,580] ERROR {org.wso2.carbon.mediator.datamapper.DataMapperMediator} - DataMapper mediator : mapping failed Error while reading input stream. Script engine unable to execute the script javax.script.ScriptException: ReferenceError: "bWFwX1Nfcm9vdF9TX3Jvb3QgPSBmdW5jdGlvbigpeyAK..." [base64-encoded .dmc script, truncated here] is not defined in <eval> at line number 1
at org.wso2.carbon.mediator.datamapper.engine.input.readers.JSONInputReader.read(JSONInputReader.java:62)
at org.wso2.carbon.mediator.datamapper.engine.input.InputBuilder.buildInputModel(InputBuilder.java:59)
at org.wso2.carbon.mediator.datamapper.engine.core.mapper.MappingHandler.doMap(MappingHandler.java:90)
at org.wso2.carbon.mediator.datamapper.DataMapperMediator.transform(DataMapperMediator.java:390)
at org.wso2.carbon.mediator.datamapper.DataMapperMediator.mediate(DataMapperMediator.java:301)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:109)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:71)
at org.apache.synapse.mediators.base.SequenceMediator.mediate(SequenceMediator.java:158)
at org.apache.synapse.rest.Resource.process(Resource.java:331)
at org.apache.synapse.rest.API.process(API.java:380)
at org.apache.synapse.rest.RESTRequestHandler.apiProcessNonDefaultStrategy(RESTRequestHandler.java:149)
at org.apache.synapse.rest.RESTRequestHandler.dispatchToAPI(RESTRequestHandler.java:95)
at org.apache.synapse.rest.RESTRequestHandler.process(RESTRequestHandler.java:58)
at org.apache.synapse.core.axis2.Axis2SynapseEnvironment.injectMessage(Axis2SynapseEnvironment.java:327)
at org.apache.synapse.core.axis2.SynapseCallbackReceiver.handleMessage(SynapseCallbackReceiver.java:578)
at org.apache.synapse.core.axis2.SynapseCallbackReceiver.receive(SynapseCallbackReceiver.java:195)
at org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:180)
at org.apache.synapse.transport.passthru.ClientWorker.run(ClientWorker.java:284)
at org.apache.axis2.transport.base.threads.NativeWorkerPool$1.run(NativeWorkerPool.java:172)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
TID: [-1234] [] [2021-09-08 13:43:02,307] WARN {org.apache.synapse.FaultHandler} - ERROR_CODE : 0
TID: [-1234] [] [2021-09-08 13:43:02,308] WARN {org.apache.synapse.FaultHandler} - ERROR_MESSAGE : DataMapper mediator : mapping failed
TID: [-1234] [] [2021-09-08 13:43:02,308] WARN {org.apache.synapse.FaultHandler} - ERROR_DETAIL : org.apache.synapse.SynapseException: DataMapper mediator : mapping failed
at org.apache.synapse.mediators.AbstractMediator.handleException(AbstractMediator.java:367)
at org.wso2.carbon.mediator.datamapper.DataMapperMediator.transform(DataMapperMediator.java:444)
at org.wso2.carbon.mediator.datamapper.DataMapperMediator.mediate(DataMapperMediator.java:301)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:109)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:71)
at org.apache.synapse.mediators.base.SequenceMediator.mediate(SequenceMediator.java:158)
at org.apache.synapse.rest.Resource.process(Resource.java:331)
at org.apache.synapse.rest.API.process(API.java:380)
at org.apache.synapse.rest.RESTRequestHandler.apiProcessNonDefaultStrategy(RESTRequestHandler.java:149)
at org.apache.synapse.rest.RESTRequestHandler.dispatchToAPI(RESTRequestHandler.java:95)
at org.apache.synapse.rest.RESTRequestHandler.process(RESTRequestHandler.java:58)
at org.apache.synapse.core.axis2.Axis2SynapseEnvironment.injectMessage(Axis2SynapseEnvironment.java:327)
at org.apache.synapse.core.axis2.SynapseCallbackReceiver.handleMessage(SynapseCallbackReceiver.java:578)
at org.apache.synapse.core.axis2.SynapseCallbackReceiver.receive(SynapseCallbackReceiver.java:195)
at org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:180)
at org.apache.synapse.transport.passthru.ClientWorker.run(ClientWorker.java:284)
at org.apache.axis2.transport.base.threads.NativeWorkerPool$1.run(NativeWorkerPool.java:172)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: Error while reading input stream. Script engine unable to execute the script javax.script.ScriptException: ReferenceError: "bWFwX1Nfcm9vdF9TX3Jvb3QgPSBmdW5jdGlvbigpeyAK..." [same base64-encoded .dmc script, truncated here] is not defined in <eval> at line number 1
at org.wso2.carbon.mediator.datamapper.engine.input.readers.JSONInputReader.read(JSONInputReader.java:62)
at org.wso2.carbon.mediator.datamapper.engine.input.InputBuilder.buildInputModel(InputBuilder.java:59)
at org.wso2.carbon.mediator.datamapper.engine.core.mapper.MappingHandler.doMap(MappingHandler.java:90)
at org.wso2.carbon.mediator.datamapper.DataMapperMediator.transform(DataMapperMediator.java:390)
... 18 more
TID: [-1234] [] [2021-09-08 13:43:02,309] WARN {org.apache.synapse.FaultHandler} - ERROR_EXCEPTION : org.apache.synapse.SynapseException: DataMapper mediator : mapping failed
TID: [-1234] [] [2021-09-08 13:43:02,309] WARN {org.apache.synapse.FaultHandler} - FaultHandler : org.apache.synapse.mediators.MediatorFaultHandler@11271f01
TID: [-1234] [] [2021-09-08 13:43:02,309] WARN {org.apache.synapse.mediators.MediatorFaultHandler} - Executing fault handler mediator : org.apache.synapse.mediators.base.SequenceMediator
TID: [-1234] [] [2021-09-08 13:43:02,309] DEBUG {org.apache.synapse.mediators.base.SequenceMediator} - Start : Sequence <anonymous>
TID: [-1234] [] [2021-09-08 13:43:02,309] DEBUG {org.apache.synapse.mediators.base.SequenceMediator} - Sequence <SequenceMediator> :: mediate()
TID: [-1234] [] [2021-09-08 13:43:02,309] DEBUG {org.apache.synapse.mediators.base.SequenceMediator} - Mediation started from mediator position : 0
TID: [-1234] [] [2021-09-08 13:43:02,310] DEBUG {org.apache.synapse.mediators.base.SequenceMediator} - End : Sequence <anonymous>
The whole interaction happens with an Accept: application/json header, but the error occurs even with an XML-to-JSON data mapping.
If the payload is JSON, why are you using an XML-to-JSON data mapping? Shouldn't it be a JSON-to-JSON data mapping?
An Accept: application/json header means the payload would be JSON, so ideally you should use a JSON-to-JSON mapping.
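As an aside, the "undefined" reference the engine complains about is the base64-encoded body of the generated .dmc mapping script itself. Decoding it locally shows exactly what the script engine was asked to evaluate. A sketch in Python; the blob below is only the prefix of the string quoted in the log:

```python
import base64

# Prefix of the base64 blob from the ReferenceError above; paste the full
# string from your own log to see the complete generated script.
blob = "bWFwX1Nfcm9vdF9TX3Jvb3QgPSBmdW5jdGlvbigpeyAKdmFyIG91dHB1dHJvb3Q9e307"

script = base64.b64decode(blob).decode("utf-8")
print(script)  # the generated mapping function: map_S_root_S_root = function(){ ...
```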

WSO2 ESB dblookup missing parameter

I set the properties before using dblookup to select a result. The parameters are:
mc.setProperty("year","2019");
mc.setProperty("month","11");
mc.setProperty("date","7");
The dblookup configuration is:
<dblookup>
  <connection>
    <pool>
      <password>XXXX</password>
      <driver>com.mysql.jdbc.Driver</driver>
      <url>jdbc:mysql://XXXX</url>
      <user>XXXX</user>
    </pool>
  </connection>
  <statement>
    <sql><![CDATA[
      SELECT IFNULL(max(ID_),0) as MAX_ID_ FROM KETTLE WHERE YEAR(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ? AND MONTH(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ? AND DAYOFMONTH(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ?;
    ]]></sql>
    <parameter expression="get-property('year')" type="VARCHAR"/>
    <parameter expression="get-property('month')" type="VARCHAR"/>
    <parameter expression="get-property('date')" type="VARCHAR"/>
    <result column="MAX_ID_" name="MAX_ID_"/>
  </statement>
</dblookup>
Sometimes it works and sometimes it doesn't.
When it succeeds, the log is:
TID: [-1234] [] [2019-11-07 15:11:40,072] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - Setting as parameter : 1 value : 2019 as JDBC Type : 12(see java.sql.Types for valid types) {org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 15:11:40,073] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - Setting as parameter : 2 value : 11 as JDBC Type : 12(see java.sql.Types for valid types) {org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 15:11:40,073] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - Setting as parameter : 3 value : 7 as JDBC Type : 12(see java.sql.Types for valid types) {org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 15:11:40,073] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - Successfully prepared statement :
SELECT IFNULL(max(ID_),0) as MAX_ID_ FROM KETTLE WHERE YEAR(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ? AND MONTH(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ? AND DAYOFMONTH(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ?;
against DataSource : jdbc:XXXX{org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 15:11:40,235] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - Processing the first row returned :
SELECT IFNULL(max(ID_),0) as MAX_ID_ FROM KETTLE WHERE YEAR(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ? AND MONTH(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ? AND DAYOFMONTH(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ?;
{org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 15:11:40,235] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - Column : MAX_ID_ returned value : 0 Setting this as the message property : MAX_ID_ {org.apache.synapse.mediators.db.DBLookupMediator}
When it fails, the log is:
TID: [-1234] [] [2019-11-07 09:29:50,073] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - Getting a connection from DataSource jdbc:mysql://XXXX and preparing statement :
SELECT IFNULL(max(ID_),0) as MAX_ID_ FROM KETTLE WHERE YEAR(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ? AND MONTH(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ? AND DAYOFMONTH(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ?;
{org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 09:29:50,074] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - [ DB Connection : org.apache.commons.dbcp.PoolableConnection@1ff8a7ec ] {org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 09:29:50,075] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - [ DB Connection instance identifier : 1ff8a7ec ] {org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 09:29:50,075] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - [ Number of Active Connection : 1 ] {org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 09:29:50,075] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - [ Number of Idle Connection : 0 ] {org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 09:29:50,075] DEBUG {org.apache.synapse.mediators.db.DBLookupMediator} - Setting as parameter : 1 value : 2019 as JDBC Type : 12(see java.sql.Types for valid types) {org.apache.synapse.mediators.db.DBLookupMediator}
TID: [-1234] [] [2019-11-07 09:29:50,075] ERROR {org.apache.synapse.mediators.db.DBLookupMediator} - SQL Exception occurred while executing statement :
SELECT IFNULL(max(ID_),0) as MAX_ID_ FROM KETTLE WHERE YEAR(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ? AND MONTH(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ? AND DAYOFMONTH(FROM_UNIXTIME(DETECTION_TIME_ / 1000)) = ?;
against DataSource : jdbc:mysql:XXXXX{org.apache.synapse.mediators.db.DBLookupMediator}
java.sql.SQLException: No operations allowed after statement closed.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
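The failure ("No operations allowed after statement closed") points at the pooled connection going stale rather than at the query itself. The query logic can be reproduced outside the ESB to rule it out; a sketch using sqlite3, with strftime standing in for MySQL's YEAR/MONTH/DAYOFMONTH, and a made-up table row:

```python
import sqlite3

# Made-up in-memory stand-in for the KETTLE table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE KETTLE (ID_ INTEGER, DETECTION_TIME_ INTEGER)")
conn.execute("INSERT INTO KETTLE VALUES (42, 1573128000000)")  # 2019-11-07 12:00 UTC, in ms

# strftime(..., 'unixepoch') mirrors MySQL's YEAR/MONTH/DAYOFMONTH over
# FROM_UNIXTIME(DETECTION_TIME_ / 1000).
sql = """
    SELECT IFNULL(MAX(ID_), 0) AS MAX_ID_
    FROM KETTLE
    WHERE strftime('%Y', DETECTION_TIME_ / 1000, 'unixepoch') = ?
      AND strftime('%m', DETECTION_TIME_ / 1000, 'unixepoch') = ?
      AND strftime('%d', DETECTION_TIME_ / 1000, 'unixepoch') = ?
"""
# Note: sqlite's strftime zero-pads, so the month/day parameters must be
# '11' and '07' here, whereas the properties in the question use '7'.
row = conn.execute(sql, ("2019", "11", "07")).fetchone()
print(row[0])  # 42
```

If the query itself is fine, the usual remedy for stale pooled connections is to enable connection validation on the pool; check the WSO2 dblookup documentation for the DBCP-style pool properties it supports.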

Couchbase rebalance failing with error - Rebalance exited with reason {badmatch,failed}

I am setting up a cluster and tried to join 3 nodes, but the rebalance failed with the error below. I extracted some info from debug.log but am unable to identify the exact issue. Any help is appreciated.
=========================CRASH REPORT=========================
crasher:
initial call: service_agent:-spawn_connection_waiter/2-fun-0-/0
pid: <0.18486.7>
registered_name: []
exception exit: {no_connection,"index-service_api"}
in function service_agent:wait_for_connection_loop/3 (src/service_agent.erl, line 305)
ancestors: ['service_agent-index',service_agent_children_sup,
service_agent_sup,ns_server_sup,ns_server_nodes_sup,
<0.170.0>,ns_server_cluster_sup,<0.89.0>]
messages: []
links: [<0.18481.7>,<0.18490.7>]
dictionary: []
trap_exit: false
status: running
heap_size: 987
stack_size: 27
reductions: 1195
neighbours:
[ns_server:error,2018-02-12T13:54:43.531-05:00,ns_1#xuodf9.firebrand.com:service_agent-index<0.18481.7>:service_agent:terminate:264]Terminating abnormally
[ns_server:debug,2018-02-12T13:54:43.531-05:00,ns_1#xuodf9.firebrand.com:<0.18487.7>:ns_pubsub:do_subscribe_link:145]Parent process of subscription {ns_config_events,<0.18481.7>} exited with reason {linked_process_died,
<0.18486.7>,
{no_connection,
"index-service_api"}}
[error_logger:error,2018-02-12T13:54:43.531-05:00,ns_1#xuodf9.firebrand.com:error_logger<0.6.0>:ale_error_logger_handler:do_log:203]** Generic server 'service_agent-index' terminating
** Last message in was {'EXIT',<0.18486.7>,
{no_connection,"index-service_api"}}
** When Server state == {state,index,
{dict,6,16,16,8,80,48,
{[],[],[],[],[],[],[],[],[],[],[],[],[],[],[],[]},
{{[[{uuid,<<"55a14ec6b06d72205b3cd956e6de60e7">>}|
'ns_1#xuodf7.firebrand.com']],
[],
[[{uuid,<<"c5e67322a74826bef8edf27d51de3257">>}|
'ns_1#xuodf8.firebrand.com']],
[],
[[{uuid,<<"3b55f7739e3fe85127dcf857a5819bdf">>}|
'ns_1#xuodf9.firebrand.com']],
[],
[[{node,'ns_1#xuodf7.firebrand.com'}|
<<"55a14ec6b06d72205b3cd956e6de60e7">>],
[{node,'ns_1#xuodf8.firebrand.com'}|
<<"c5e67322a74826bef8edf27d51de3257">>],
[{node,'ns_1#xuodf9.firebrand.com'}|
<<"3b55f7739e3fe85127dcf857a5819bdf">>]],
[],[],[],[],[],[],[],[],[]}}},
undefined,undefined,<0.18626.7>,#Ref<0.0.5.56873>,
<0.18639.7>,
{[{<0.18646.7>,#Ref<0.0.5.56891>}],[]},
undefined,undefined,undefined,undefined,undefined}
** Reason for termination ==
** {linked_process_died,<0.18486.7>,{no_connection,"index-service_api"}}
[error_logger:error,2018-02-12T13:54:43.532-05:00,ns_1#xuodf9.firebrand.com:error_logger<0.6.0>:ale_error_logger_handler:do_log:203]
=========================CRASH REPORT=========================
crasher:
initial call: service_agent:init/1
pid: <0.18481.7>
registered_name: 'service_agent-index'
exception exit: {linked_process_died,<0.18486.7>,
{no_connection,"index-service_api"}}
in function gen_server:terminate/6 (gen_server.erl, line 744)
ancestors: [service_agent_children_sup,service_agent_sup,ns_server_sup,
ns_server_nodes_sup,<0.170.0>,ns_server_cluster_sup,
<0.89.0>]
messages: [{'EXIT',<0.18639.7>,
{linked_process_died,<0.18486.7>,
{no_connection,"index-service_api"}}}]
links: [<0.18487.7>,<0.4805.0>]
dictionary: []
trap_exit: true
status: running
heap_size: 28690
stack_size: 27
reductions: 6334
neighbours:
[error_logger:error,2018-02-12T13:54:43.533-05:00,ns_1#xuodf9.firebrand.com:error_logger<0.6.0>:ale_error_logger_handler:do_log:203]
=========================SUPERVISOR REPORT=========================
Supervisor: {local,service_agent_children_sup}
Context: child_terminated
Reason: {linked_process_died,<0.18486.7>,
{no_connection,"index-service_api"}}
Offender: [{pid,<0.18481.7>},
{name,{service_agent,index}},
{mfargs,{service_agent,start_link,[index]}},
{restart_type,permanent},
{shutdown,1000},
{child_type,worker}]
[ns_server:error,2018-02-12T13:54:43.533-05:00,ns_1#xuodf9.firebrand.com:service_rebalancer-index<0.18626.7>:service_rebalancer:run_rebalance:80]Agent terminated during the rebalance: {'DOWN',#Ref<0.0.5.56860>,process,
<0.18481.7>,
{linked_process_died,<0.18486.7>,
{no_connection,"index-service_api"}}}
[error_logger:info,2018-02-12T13:54:43.534-05:00,ns_1#xuodf9.firebrand.com:error_logger<0.6.0>:ale_error_logger_handler:do_log:203]
=========================PROGRESS REPORT=========================
supervisor: {local,service_agent_children_sup}
started: [{pid,<0.20369.7>},
{name,{service_agent,index}},
{mfargs,{service_agent,start_link,[index]}},
{restart_type,permanent},
{shutdown,1000},
{child_type,worker}]
[ns_server:error,2018-02-12T13:54:43.534-05:00,ns_1#xuodf9.firebrand.com:service_agent-index<0.20369.7>:service_agent:handle_call:186]Got rebalance-only call {if_rebalance,<0.18626.7>,unset_rebalancer} that doesn't match rebalancer pid undefined
[ns_server:error,2018-02-12T13:54:43.534-05:00,ns_1#xuodf9.firebrand.com:service_rebalancer-index<0.18626.7>:service_agent:process_bad_results:815]Service call unset_rebalancer (service index) failed on some nodes:
[{'ns_1#xuodf9.firebrand.com',nack}]
[ns_server:warn,2018-02-12T13:54:43.534-05:00,ns_1#xuodf9.firebrand.com:service_rebalancer-index<0.18626.7>:service_rebalancer:run_rebalance:89]Failed to unset rebalancer on some nodes:
{error,{bad_nodes,index,unset_rebalancer,
[{'ns_1#xuodf9.firebrand.com',nack}]}}
[error_logger:error,2018-02-12T13:54:43.535-05:00,ns_1#xuodf9.firebrand.com:error_logger<0.6.0>:ale_error_logger_handler:do_log:203]
=========================CRASH REPORT=========================
crasher:
initial call: service_rebalancer:-spawn_monitor/6-fun-0-/0
pid: <0.18626.7>
registered_name: 'service_rebalancer-index'
exception exit: {linked_process_died,<0.18486.7>,
{no_connection,"index-service_api"}}
in function service_rebalancer:run_rebalance/7 (src/service_rebalancer.erl, line 92)
ancestors: [cleanup_process,ns_janitor_server,ns_orchestrator_child_sup,
ns_orchestrator_sup,mb_master_sup,mb_master,<0.4893.0>,
ns_server_sup,ns_server_nodes_sup,<0.170.0>,
ns_server_cluster_sup,<0.89.0>]
messages: [{'EXIT',<0.18640.7>,
{linked_process_died,<0.18486.7>,
{no_connection,"index-service_api"}}}]
links: []
dictionary: []
trap_exit: true
status: running
heap_size: 2586
stack_size: 27
reductions: 6359
neighbours:
[ns_server:error,2018-02-12T13:54:43.536-05:00,ns_1#xuodf9.firebrand.com:cleanup_process<0.18625.7>:service_janitor:maybe_init_topology_aware_service:84]Initial rebalance for `index` failed: {error,
{initial_rebalance_failed,index,
{linked_process_died,<0.18486.7>,
{no_connection,
"index-service_api"}}}}
[ns_server:debug,2018-02-12T13:54:43.536-05:00,ns_1#xuodf9.firebrand.com:menelaus_cbauth<0.4796.0>:menelaus_cbauth:handle_cast:95]Observed json rpc process {"projector-cbauth",<0.5099.0>} needs_update
[ns_server:debug,2018-02-12T13:54:43.538-05:00,ns_1#xuodf9.firebrand.com:menelaus_cbauth<0.4796.0>:menelaus_cbauth:handle_cast:95]Observed json rpc process {"goxdcr-cbauth",<0.479.0>} needs_update
[ns_server:debug,2018-02-12T13:54:43.539-05:00,ns_1#xuodf9.firebrand.com:menelaus_cbauth<0.4796.0>:menelaus_cbauth:handle_cast:95]Observed json rpc process {"cbq-engine-cbauth",<0.5124.0>} needs_update
[ns_server:debug,2018-02-12T13:54:43.540-05:00,ns_1#xuodf9.firebrand.com:menelaus_cbauth<0.4796.0>:menelaus_cbauth:handle_cast:95]Observed json rpc process {"fts-cbauth",<0.5129.0>} needs_update
This is a blocker for cluster creation at this point.
The rebalance error is caused by the index service. Check indexer.log for errors and verify that the process is able to bootstrap correctly.
Please make sure the communication ports are open, as described here: https://developer.couchbase.com/documentation/server/current/install/install-ports.html
A blocked projector_port (9999) can lead to this.
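The port check suggested above can be scripted. A minimal sketch in Python; the port list is a small subset relevant to index-service traffic and should be verified against the install-ports page for your server version:

```python
import socket

# Subset of Couchbase ports involved in index-service traffic; verify the
# full list against the install-ports documentation for your version.
INDEX_SERVICE_PORTS = [8091, 9100, 9101, 9999]  # 9999 = projector_port

def blocked_ports(host, ports, timeout=2.0):
    """Return the ports on `host` that refuse or time out on TCP connect."""
    blocked = []
    for port in ports:
        try:
            socket.create_connection((host, port), timeout=timeout).close()
        except OSError:
            blocked.append(port)
    return blocked

# Example, run from each node against the other nodes:
#   blocked_ports("xuodf9.firebrand.com", INDEX_SERVICE_PORTS)
```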

JsonLoader throws an error in Pig

I am unable to load this simple JSON and I don't know what I am doing wrong.
Please help me with this Pig script.
I have to load the data below, which is in JSON format.
3.json
{
"id": 6668,
"source_name": "National Stock Exchange of India",
"source_code": "NSE"
}
and my Pig script is:
a = LOAD '3.json' USING org.apache.pig.builtin.JsonLoader ('id:int, source_name:chararray, source_code:chararray');
dump a;
The error I get is given below:
2015-07-23 13:40:08,715 [LocalJobRunner Map Task Executor #0] INFO org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local1664361500_0001_m_000000_0
2015-07-23 13:40:08,775 [LocalJobRunner Map Task Executor #0] INFO org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : [ ]
2015-07-23 13:40:08,780 [LocalJobRunner Map Task Executor #0] INFO org.apache.hadoop.mapred.MapTask - Processing split: Number of splits :1
Total Length = 88
Input split[0]:
Length = 88
Locations:
-----------------------
2015-07-23 13:40:08,793 [LocalJobRunner Map Task Executor #0] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader - Current split being processed file:/home/hariprasad.sudo/3.json:0+88
2015-07-23 13:40:08,844 [LocalJobRunner Map Task Executor #0] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map - Aliases being processed per job phase (AliasName[line,offset]): M: a[1,4] C: R:
2015-07-23 13:40:08,861 [Thread-5] INFO org.apache.hadoop.mapred.LocalJobRunner - map task executor complete.
2015-07-23 13:40:08,867 [Thread-5] WARN org.apache.hadoop.mapred.LocalJobRunner - job_local1664361500_0001
java.lang.Exception: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream#61a79110; line: 1, column: 0])
at [Source: java.io.ByteArrayInputStream#61a79110; line: 1, column: 3]
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
Caused by: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream#61a79110; line: 1, column: 0])
at [Source: java.io.ByteArrayInputStream#61a79110; line: 1, column: 3]
at org.codehaus.jackson.JsonParser._constructError(JsonParser.java:1291)
at org.codehaus.jackson.impl.JsonParserMinimalBase._reportError(JsonParserMinimalBase.java:385)
at org.codehaus.jackson.impl.JsonParserMinimalBase._reportInvalidEOF(JsonParserMinimalBase.java:318)
at org.codehaus.jackson.impl.JsonParserBase._handleEOF(JsonParserBase.java:354)
at org.codehaus.jackson.impl.Utf8StreamParser._skipWSOrEnd(Utf8StreamParser.java:1841)
at org.codehaus.jackson.impl.Utf8StreamParser.nextToken(Utf8StreamParser.java:275)
at org.apache.pig.builtin.JsonLoader.readField(JsonLoader.java:180)
at org.apache.pig.builtin.JsonLoader.getNext(JsonLoader.java:164)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.nextKeyValue(PigRecordReader.java:211)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:533)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
2015-07-23 13:40:09,179 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2015-07-23 13:40:09,179 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_local1664361500_0001 has failed! Stop running all dependent jobs
2015-07-23 13:40:09,179 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2015-07-23 13:40:09,180 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2015-07-23 13:40:09,180 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Detected Local mode. Stats reported below may be incomplete
2015-07-23 13:40:09,181 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.3.0-cdh5.1.3 0.12.0-cdh5.1.3 hariprasad.sudo 2015-07-23 13:40:07 2015-07-23 13:40:09 UNKNOWN
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_local1664361500_0001 a MAP_ONLY Message: Job failed! file:/tmp/temp-65649055/tmp1240506051,
Input(s):
Failed to read data from "file:///home/hariprasad.sudo/3.json"
Output(s):
Failed to produce result in "file:/tmp/temp-65649055/tmp1240506051"
Job DAG:
job_local1664361500_0001
2015-07-23 13:40:09,181 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2015-07-23 13:40:09,186 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias a
Details at logfile: /home/hariprasad.sudo/pig_1437673203961.log
grunt> 2015-07-23 13:40:14,754 [communication thread] INFO org.apache.hadoop.mapred.LocalJobRunner - map > map
Please help me in understanding what is wrong.
Thanks,
Hari
Put the compact (single-line) version of the JSON in 3.json. Pig's JsonLoader expects one complete JSON record per line, so a record pretty-printed across several lines hits an unexpected end-of-input mid-object, which is exactly what the stack trace shows. You can use http://www.jsoneditoronline.org to compact it.
3.json
{"id":6668,"source_name":"National Stock Exchange of India","source_code":"NSE"}
With this we are able to dump the data:
(6668,National Stock Exchange of India,NSE)
Ref: Error from Json Loader in Pig, where a similar issue is discussed. Extract from the referenced answer:
Pig doesn't usually like "human readable" json. Get rid of the spaces and/or indentations, and you're good.
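If the file was produced pretty-printed in the first place, you can compact it programmatically instead of by hand. A small sketch (the file names are just examples):

```python
import json

def compact_json_file(src_path, dst_path):
    """Rewrite a (possibly pretty-printed) JSON file as one compact
    record on a single line, the layout Pig's JsonLoader expects."""
    with open(src_path) as src:
        record = json.load(src)
    with open(dst_path, "w") as dst:
        # separators=(",", ":") drops the spaces json.dumps adds by default
        dst.write(json.dumps(record, separators=(",", ":")) + "\n")
```

For a file containing many records, the same idea applies per record: each one must end up as a single line in the output file.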

Which namespace to use in a BPEL assign operation - selectionFailure, no results for expression

The situation is that I have a fairly simple BPEL process that invokes a service. I want to access the response message elements and assign them to another service invocation (or even to the result of the BPEL process itself, to return to the client). The issue I am having is that the imported WSDL for the invoked service declares a namespace prefix, e.g. ldap, and all the XSD elements imported by that WSDL have the same ldap namespace declared.
<definitions
xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd"
xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/" xmlns:ldap="http://webservices.hrldaplookup.ecis.police.uk/"
xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.xmlsoap.org/wsdl/"
targetNamespace="http://webservices.hrldaplookup.ecis.police.uk/" name="LDAPLookupServiceImpl">
This is then imported into my BPEL process, again using the ldap namespace.
<bpel:process name="HRLDAPProces"
targetNamespace="http://ldap.ecis.police.uk/Person/process"
suppressJoinFailure="yes"
xmlns:tns="http://ldap.ecis.police.uk/Person/process"
xmlns:bpel="http://docs.oasis-open.org/wsbpel/2.0/process/executable"
xmlns:ldap="http://webservices.hrldaplookup.ecis.police.uk/" xmlns:ns1="http://www.w3.org/2001/XMLSchema" xmlns:ns0="http://uk.police.ecis.police.uk/athena/services/ConstrainedValueService" xmlns:ns="http://webservices.cvmanagement.athena.ecis.police.uk/">
<!-- Import the client WSDL -->
<bpel:import namespace="http://webservices.cvmanagement.athena.ecis.police.uk/" location="ConstrainedValueService.wsdl" importType="http://schemas.xmlsoap.org/wsdl/"></bpel:import>
<bpel:import namespace="http://uk.police.ecis.police.uk/athena/services/ConstrainedValueService" location="ConstrainedValueService_1.wsdl" importType="http://schemas.xmlsoap.org/wsdl/"></bpel:import>
<bpel:import namespace="http://webservices.hrldaplookup.ecis.police.uk/" location="LDAPLookupServiceImpl.wsdl" importType="http://schemas.xmlsoap.org/wsdl/"></bpel:import>
<bpel:import location="HRLDAPProcesArtifacts.wsdl" namespace="http://ldap.ecis.police.uk/Person/process"
importType="http://schemas.xmlsoap.org/wsdl/" />
When the service is invoked, the response message has its own arbitrary namespace prefixes assigned to the elements.
<getPersonnelResponse xmlns="http://webservices.hrldaplookup.ecis.police.uk/" xmlns:S="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ns2="http://webservices.hrldaplookup.ecis.police.uk/" xmlns:ns3="http://ldap.ecis.police.uk/Person" xmlns:ns4="http://ecis.police.uk/ldaplookupservice"><personnelData xmlns="http://ecis.police.uk/ldaplookupservice"><detail xmlns="http://ldap.ecis.police.uk/Person">
When I want to assign variable parameter parts to something else I don't know which namespace to use.
$LDAPLookupResponse.parameters/ldap:personnelData/ldap:detail/item[1]
or
$LDAPLookupResponse.parameters/ns2:personnelData/ns4:detail/ns4:item[1]
Neither seems to work.
I'm sure I am just missing something simple; I just need pointing in the right direction.
Thanks
I'm using WSO2 Business Process Server.
The full BPEL process is here, as requested by Thilini Ishaka - thanks!
The log output for the error is:
TID: [0] [BPS] [2013-01-21 16:22:47,750] DEBUG {org.wso2.carbon.bpel.messagetrace} - Service invocation completed: MEXId: hqejbhcnphr7xlanvn6p6t :: {http://webservices.hrldaplookup.ecis.police.uk/}LDAPLookupServiceImpl.getPersonnel {org.wso2.carbon.bpel.messagetrace}
TID: [0] [BPS] [2013-01-21 16:22:47,750] TRACE {org.wso2.carbon.bpel.messagetrace} - Response message: MEXId: hqejbhcnphr7xlanvn6p6t :: <?xml version='1.0' encoding='utf-8'?><S:Envelope xmlns:S="http://schemas.xmlsoap.org/soap/envelope/"><S:Body><ns2:getPersonnelResponse xmlns:ns2="http://webservices.hrldaplookup.ecis.police.uk/" xmlns:ns4="http://ecis.police.uk/ldaplookupservice" xmlns:ns3="http://ldap.ecis.police.uk/Person"><ns4:personnelData><ns3:detail><ns3:item title="Managers Name">Bob NELSON PSE 56619</ns3:item><ns3:item title="Fullname">Conrad CRAMPTON PSE 52704</ns3:item><ns3:item title="Rank">PSE</ns3:item><ns3:item title="Collar Number">46052704</ns3:item><ns3:item title="Location">Headquarters</ns3:item><ns3:item title="Email address">conrad.crampton#kent.pnn.police.uk</ns3:item><ns3:item title="Last Name">Crampton</ns3:item><ns3:item title="Force Number">52704</ns3:item><ns3:item title="Managers Force Number">56619</ns3:item><ns3:item title="First Name">Conrad</ns3:item></ns3:detail></ns4:personnelData></ns2:getPersonnelResponse></S:Body></S:Envelope> {org.wso2.carbon.bpel.messagetrace}
TID: [0] [BPS] [2013-01-21 16:22:47,750] INFO {org.apache.ode.bpel.runtime.ASSIGN} - Assignment Fault: {http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure,lineNo=322,faultExplanation={http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure: R-Value expression "{OXPath10Expression $LDAPLookupResponse.parameters//ldap:item[#title = 'Rank']}" did not select any nodes. {org.apache.ode.bpel.runtime.ASSIGN}
TID: [0] [BPS] [2013-01-21 16:22:47,765] WARN {org.apache.ode.bpel.engine.BpelProcess} - Instance 3652 of {http://ldap.ecis.police.uk/Person/process}HRLDAPProces-31 has completed with fault: FaultData: [faultName={http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure, faulType=null ({http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure: R-Value expression "{OXPath10Expression $LDAPLookupResponse.parameters//ldap:item[#title = 'Rank']}" did not select any nodes.)] #322 {org.apache.ode.bpel.engine.BpelProcess}
TID: [0] [BPS] [2013-01-21 16:22:47,859] DEBUG {org.wso2.carbon.bpel.messagetrace} - Reply Sent: HRLDAPProces.{http://ldap.ecis.police.uk/Person/process}process {org.wso2.carbon.bpel.messagetrace}
TID: [0] [BPS] [2013-01-21 16:22:47,859] TRACE {org.wso2.carbon.bpel.messagetrace} - Response message: <?xml version='1.0' encoding='utf-8'?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"><soapenv:Header xmlns:wsa="http://www.w3.org/2005/08/addressing"><wsa:Action>http://ldap.ecis.police.uk/Person/process/HRLDAPProces/processResponse</wsa:Action><wsa:RelatesTo>http://identifiers.wso2.com/messageid/1358785364081/1999227541</wsa:RelatesTo></soapenv:Header><soapenv:Body><soapenv:Fault><faultcode>soapenv:Server</faultcode><faultstring xmlns:axis2ns2="http://docs.oasis-open.org/wsbpel/2.0/process/executable">axis2ns2:selectionFailure</faultstring><detail/></soapenv:Fault></soapenv:Body></soapenv:Envelope> {org.wso2.carbon.bpel.messagetrace}
TID: [0] [BPS] [2013-01-21 16:23:17,875] INFO {org.wso2.carbon.core.services.util.CarbonAuthenticationUtil} - 'admin#carbon.super [-1234]' logged in at [2013-01-21 16:23:17,875+0000] {org.wso2.carbon.core.services.util.CarbonAuthenticationUtil}
The problem could be a namespace conflict in your process file.
Ideally it should work with:
$LDAPLookupResponse.parameters/ldap:personnelData/ldap:detail/item[1]
Can you please post the full BPEL config and the full error log, so we can check whether there are any namespace conflicts in the configuration?
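One quick way to see which namespace each element is actually in (outside of BPEL entirely) is to run a namespace-qualified path over a trimmed copy of the response from the trace log. This Python sketch uses the stdlib ElementTree; note that in the logged response, personnelData and detail/item are in different namespaces from http://webservices.hrldaplookup.ecis.police.uk/, the one the ldap: prefix is bound to in the process file, which would explain why the ldap:-qualified expression selects nothing:

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the response payload from the trace log above.
RESPONSE = """\
<ns2:getPersonnelResponse
    xmlns:ns2="http://webservices.hrldaplookup.ecis.police.uk/"
    xmlns:ns4="http://ecis.police.uk/ldaplookupservice"
    xmlns:ns3="http://ldap.ecis.police.uk/Person">
  <ns4:personnelData>
    <ns3:detail>
      <ns3:item title="Rank">PSE</ns3:item>
    </ns3:detail>
  </ns4:personnelData>
</ns2:getPersonnelResponse>"""

# Bind local prefixes to the namespaces the elements are ACTUALLY in;
# the prefix names here are arbitrary, only the URIs matter.
NS = {
    "svc": "http://ecis.police.uk/ldaplookupservice",
    "per": "http://ldap.ecis.police.uk/Person",
}

root = ET.fromstring(RESPONSE)
items = root.findall("svc:personnelData/per:detail/per:item", NS)
print([(i.get("title"), i.text) for i in items])  # prints [('Rank', 'PSE')]
```

The same principle applies in the BPEL XPath: the prefix must be declared for the namespace the element is really in, not the prefix that happened to appear in the WSDL or the wire message.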