Unable to connect to host MySQL database on application deployed to CloudBees

I followed the instructions here, but when I attempted it I got the following error:
hudson.util.IOException2: remote file operation failed: /scratch/jenkins/workspace/Xinco Demo Publish/Xinco/target/Xinco-2012-08-30_00-20-05.war at hudson.remoting.Channel@1fc6bdea:s-50b0ae50
at hudson.FilePath.act(FilePath.java:783)
at hudson.FilePath.act(FilePath.java:769)
at com.cloudbees.plugins.deployer.DeployPublisher.perform(DeployPublisher.java:108)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:707)
at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:682)
at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:660)
at hudson.model.Build$RunnerImpl.post2(Build.java:162)
at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:629)
at hudson.model.Run.run(Run.java:1433)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:238)
Caused by: hudson.remoting.ProxyException: hudson.util.IOException2: Server.InternalError - Invalid WEB-INF/cloudbees-web.xml: resource
at com.cloudbees.plugins.deployer.deployables.Deployable.deployFile(Deployable.java:151)
at com.cloudbees.plugins.deployer.deployables.Deployable$DeployFileCallable.invoke(Deployable.java:342)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2048)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:287)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: hudson.remoting.ProxyException: com.cloudbees.api.BeesClientException: Server.InternalError - Invalid WEB-INF/cloudbees-web.xml: resource
at com.cloudbees.api.BeesClient.readResponse(BeesClient.java:850)
at com.cloudbees.api.BeesClient.applicationDeployArchive(BeesClient.java:435)
at com.cloudbees.plugins.deployer.deployables.Deployable.deployFile(Deployable.java:123)
... 11 more
Build step 'Deploy to CloudBees' marked build as failure
The full output can be seen here.

Caused by: hudson.remoting.ProxyException: com.cloudbees.api.BeesClientException: Server.InternalError - Invalid WEB-INF/cloudbees-web.xml: resource
Your cloudbees-web.xml doesn't follow the correct format.
See http://wiki.cloudbees.com/bin/view/RUN/CloudBeesWebXml - the contents of cloudbees-web.xml need to be wrapped in an outer <cloudbees-web-app> element.
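For illustration, a minimal file of the expected shape might look like this (a sketch only; the appid, resource name, and parameter values are placeholders, not taken from your build):
<?xml version="1.0" encoding="UTF-8"?>
<cloudbees-web-app xmlns="http://www.cloudbees.com/xml/webapp/1">
  <!-- placeholders: replace with your account/app and database details -->
  <appid>myaccount/myapp</appid>
  <resource name="jdbc/mydb" auth="Container" type="javax.sql.DataSource">
    <param name="username" value="dbuser"/>
    <param name="password" value="dbpass"/>
    <param name="url" value="jdbc:cloudbees://mydatabase"/>
  </resource>
</cloudbees-web-app>
A <resource> element that is not wrapped this way would be consistent with the "Invalid WEB-INF/cloudbees-web.xml: resource" message above.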

You also don't have to use cloudbees-web.xml if you don't want to - if you bind your app to a DB, it will be made available as a named datasource automatically (see the bees app:bind command).
You only have to do this once - the app will then know about the datasource.
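A binding looks roughly like this (a sketch of the CloudBees SDK syntax of the time; the app and database names are placeholders):
bees app:bind -a myaccount/myapp -db mydatabase
After this one-time step, the database should be available to the app as a named datasource.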
http://developer.cloudbees.com/bin/view/RUN/Resource+Management
and
https://developer.cloudbees.com/bin/view/RUN/DatabaseGuide
(sorry, still working on docs).

Related

Error while trying to create an external table in Hive

I am trying to create an external table using Hive with Hadoop, but somehow it fails. This is the error I get when I try to run my queries.
02:23:29.516 [HiveServer2-Background-Pool: Thread-39] ERROR hive.ql.exec.DDLTask - org.apache.hadoop.hive.ql.metadata.HiveException: Cannot validate serde: org.openx.data.jsonserde.JsonSerDe
at org.apache.hadoop.hive.ql.exec.DDLTask.validateSerDe(DDLTask.java:3858)
at org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:700)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3960)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:333)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1858)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1562)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1313)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1084)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1077)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:235)
at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:90)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:299)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:312)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: Class org.openx.data.jsonserde.JsonSerDe not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2329)
at org.apache.hadoop.hive.ql.exec.DDLTask.validateSerDe(DDLTask.java:3852)
... 22 more
How can I solve it?
The exception says
java.lang.ClassNotFoundException: Class org.openx.data.jsonserde.JsonSerDe not found
Install the JSON SerDe (download the JARs from http://www.congiu.net/hive-json-serde/ and put them into hive/lib); read the instructions here: Hive-JSON-Serde
Also, instead of putting the JARs into hive/lib, you can try adding them in the Hive session:
ADD JAR /usr/lib/hive/lib/json-serde-1.3.8-jar-with-dependencies.jar;
ADD JAR /usr/lib/hive/lib/json-udf-1.3.8-jar-with-dependencies.jar;
Alternatively, you can try the native Hive JSON SerDe, org.apache.hive.hcatalog.data.JsonSerDe - just change the SerDe class name in the table DDL. It should already be installed. Read more about the differences here: https://docs.aws.amazon.com/athena/latest/ug/json-serde.html
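For example, a table definition using the native SerDe might look like this (a sketch; the table name, columns, and location are illustrative):
-- illustrative table name, columns, and location
CREATE EXTERNAL TABLE my_json_table (
  id INT,
  name STRING
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION '/user/hive/warehouse/my_json_table';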
I had the same issue when I used the hive command in cmd, but it works normally when I use Beeline with a HiveServer2 connection.

Facing an issue while implementing XA transactions with Tomcat, MSSQL, and ActiveMQ

We have a project which is not XA-enabled and are currently trying to implement XA. I have followed these steps for the implementation:
In tomee.xml, I changed the resource from
<Resource id="paymentsDS" type="javax.sql.DataSource">
to
<Resource id="paymentsDS" type="javax.sql.XADataSource" class-name="com.microsoft.sqlserver.jdbc.SQLServerXADataSource">
and followed the same steps as in the provided document: XA with Microsoft SQL Server requires the MS DTC to be configured correctly, and sqljdbc_xa.dll to be installed. For instructions, please see this Microsoft article: https://learn.microsoft.com/en-us/sql/connect/jdbc/understanding-xa-transactions?view=sql-server-2017
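For context, a fuller XA resource definition in tomee.xml might look like the following sketch (server, port, database, and credentials are placeholders; the property names follow the SQLServerXADataSource bean setters):
<Resource id="paymentsDS" type="javax.sql.XADataSource"
          class-name="com.microsoft.sqlserver.jdbc.SQLServerXADataSource">
  <!-- illustrative values only -->
  ServerName mssql-host.example.com
  PortNumber 1433
  DatabaseName payments
  User dbuser
  Password dbpass
</Resource>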
In the Tomcat logs I'm observing:
Caused by: java.sql.SQLSyntaxErrorException: user lacks privilege or object not found: REPORT_INSTANCE
at org.hsqldb.jdbc.JDBCUtil.sqlException(Unknown Source)
at org.hsqldb.jdbc.JDBCUtil.sqlException(Unknown Source)
Caused by: org.hsqldb.HsqlException: user lacks privilege or object not found: REPORT_INSTANCE
but I don't have any HSQLDB; I'm using MSSQL.
Does anyone have an idea how to resolve this issue, and do I need to make any further changes?

Data Analytics Server 3.1.0 throwing exceptions

I am using WSO2 API Manager 2.0.0 and WSO2 Data Analytics Server 3.1.0.
I have made the following configurations:
Enabled Analytics in api-manager.xml
Directed it to my DAS Server Port
Added DAS_AGENT to log4j properties
The servers started properly
In DAS' management console, I uploaded the APIM_Realtime_Analytics.car
All this was in accordance with:
https://docs.wso2.com/display/AM200/Running+the+Product#RunningtheProduct-AccessingtheManagementConsole
https://docs.wso2.com/display/AM200/Configuring+APIM+Analytics
docs.wso2.com/display/DAS310/Quick+Start+Guide
But I am getting the following error:
org.wso2.carbon.databridge.core.exception.EventConversionException: Error when converting org.wso2.apimgt.statistics.request:1.1.0 of event bundle with events 1
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:181)
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.toEventList(ThriftEventConverter.java:90)
at org.wso2.carbon.databridge.core.internal.queue.QueueWorker.run(QueueWorker.java:73)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.wso2.carbon.databridge.core.exception.EventConversionException: No StreamDefinition for streamId org.wso2.apimgt.statistics.request:1.1.0 present in cache
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:166)
... 7 more
[2016-10-08 16:05:49,621] ERROR {org.wso2.carbon.databridge.core.internal.queue.QueueWorker} - Dropping wrongly formatted event sent for -1234
org.wso2.carbon.databridge.core.exception.EventConversionException: Error when converting org.wso2.apimgt.statistics.execution.time:1.0.0 of event bundle with events 1
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:181)
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.toEventList(ThriftEventConverter.java:90)
at org.wso2.carbon.databridge.core.internal.queue.QueueWorker.run(QueueWorker.java:73)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.wso2.carbon.databridge.core.exception.EventConversionException: No StreamDefinition for streamId org.wso2.apimgt.statistics.execution.time:1.0.0 present in cache
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:166)
... 7 more
[2016-10-08 16:05:49,625] ERROR {org.wso2.carbon.databridge.core.internal.queue.QueueWorker} - Dropping wrongly formatted event sent for -1234
org.wso2.carbon.databridge.core.exception.EventConversionException: Error when converting org.wso2.apimgt.statistics.response:1.1.0 of event bundle with events 1
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:181)
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.toEventList(ThriftEventConverter.java:90)
at org.wso2.carbon.databridge.core.internal.queue.QueueWorker.run(QueueWorker.java:73)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.wso2.carbon.databridge.core.exception.EventConversionException: No StreamDefinition for streamId org.wso2.apimgt.statistics.response:1.1.0 present in cache
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:166)
Since the server wasn't getting certain stream definitions, I also tried deploying APIM_Realtime_Analytics_REST.car (from a previous version of DAS), but to no avail; I'm getting similar exceptions for that as well.
How do I rectify this?
Thanks in advance!
As mentioned in the document you are referring to, APIM now has its own Analytics Server, which is a customized DAS, so you only have a few configurations to do to see API stats. That distribution already has the required CApps installed, so you don't need to install them manually.
But, as I understand it, you are using a vanilla DAS server instead of the APIM Analytics Server. If possible, please try with that. If you can't for some reason, take the car file from that distribution and install it in DAS. That should solve your issue.

Error deploying WAR with MySQL driver to Glassfish4 on CloudBees

I'm trying to deploy a WAR on CloudBees Glassfish4 server. I've followed the instructions at the bottom of http://developer.cloudbees.com/bin/view/RUN/Glassfish4 to include the jar in the META-INF/lib directory.
When I deploy with:
bees app:deploy target/app.war -a myDomain/app -t glassfish4-full
I get the error:
ERROR: Server.InternalError - java.lang.IllegalArgumentException: Platform error -
plugin_setup_error: glassfish4-full 1 [main] INFO com.cloudbees.clickstack.glassfish.Setup - Setup clickstack com.cloudbees.clickstack:glassfish-clickstack:4-full-1.0.2 - 2013-12-12T13:06:29.572+0100, current dir /mnt/genapp/apps/1cabb3f9/.
[main] INFO com.cloudbees.clickstack.glassfish.Setup - Setup: Environment{,
appUser='app_1cabb3f9',
appId='1cabb3f9',
appPort=8336,
appDir=/var/genapp/apps/1cabb3f9,
logDir=/var/genapp/apps/1cabb3f9/.genapp/log,
genappDir=/var/genapp/apps/1cabb3f9/.genapp,
controlDir=/var/genapp/apps/1cabb3f9/.genapp/control,
clickstackDir=/mnt/genapp-tmp/genapp-remote-plugin-1389871636905879,
packageDir=/mnt/genapp-tmp/stax-genapp-1389871636.236927/app,
}, com.cloudbees.clickstack.domain.metadata.Metadata@385cbbb1
Exception in thread "main" java.lang.Exception: Exception deploying on 10.159.35.35
at com.cloudbees.clickstack.glassfish.Setup.main(Setup.java:147)
Caused by: java.lang.IllegalArgumentException
at com.sun.nio.zipfs.ZipPath.relativize(ZipPath.java:238)
at com.cloudbees.clickstack.util.Files2$3.visitFile(Files2.java:188)
at com.cloudbees.clickstack.util.Files2$3.visitFile(Files2.java:184)
at java.nio.file.FileTreeWalker.walk(Unknown Source)
at java.nio.file.FileTreeWalker.walk(Unknown Source)
at java.nio.file.FileTreeWalker.walk(Unknown Source)
at java.nio.file.Files.walkFileTree(Unknown Source)
at java.nio.file.Files.walkFileTree(Unknown Source)
at com.cloudbees.clickstack.util.Files2.unzipSubDirectoryIfExists(Files2.java:184)
at com.cloudbees.clickstack.util.ApplicationUtils.extractContainerExtraLibs(ApplicationUtils.java:49)
at com.cloudbees.clickstack.glassfish.Setup.installApplication(Setup.java:259)
at com.cloudbees.clickstack.glassfish.Setup.setup(Setup.java:154)
at com.cloudbees.clickstack.glassfish.Setup.main(Setup.java:139)
I got a reply from CloudBees support.
The documentation at http://developer.cloudbees.com/bin/view/RUN/Glassfish4 was wrong; you don't need to include the MySQL connector in your project.
As I replied to you on our support platform, we fixed the bug on both "glassfish4-full" and "glassfish4" (web profile) ClickStacks.
Sorry for the inconvenience,
Cyrille
Clickstacks release notes:
https://github.com/CloudBees-community/glassfish4-clickstack/releases/tag/v4-web-1.0.1
https://github.com/CloudBees-community/glassfish4-clickstack/releases/tag/v4-full-1.0.3

Error during Sonar configuration (using the default Derby DB)

I changed my Sonar DB from Oracle to the default Derby. I successfully configured the Sonar server; however, I get an error during the integration with Hudson.
Caused by: java.sql.SQLException: SQL driver not found oracle.jdbc.OracleDriver
at org.sonar.jpa.session.DriverDatabaseConnector.getConnection(DriverDatabaseConnector.java:91)
at org.sonar.jpa.session.AbstractDatabaseConnector.testConnection(AbstractDatabaseConnector.java:185)
... 41 more
Caused by: java.lang.ClassNotFoundException: oracle.jdbc.OracleDriver
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
at org.sonar.jpa.session.DriverDatabaseConnector.getConnection(DriverDatabaseConnector.java:88)
... 42 more
The error states that it can't find OracleDriver, which I should not be using anymore.
In my Hudson configuration, I have removed my Oracle configuration and replaced it with these:
Any idea on what I configured wrongly?
Fixed by changing the driver to "org.apache.derby.jdbc.ClientDriver". It turns out that the remark "Do not set if you use default embedded" is misleading.
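For reference, the working JDBC settings would look roughly like this (a sketch assuming Sonar's default Derby network setup; adjust host, port, and credentials to your install):
# illustrative values for the default Derby network server
sonar.jdbc.driverClassName=org.apache.derby.jdbc.ClientDriver
sonar.jdbc.url=jdbc:derby://localhost:1527/sonar
sonar.jdbc.username=sonar
sonar.jdbc.password=sonar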