I'm trying to get Slick 3.0 running with MySQL. I've made the following changes to the hello-slick-3.0 activator project:
In application.conf I've removed the h2mem1 entry and replaced it with:
horridDBStuff = {
  url = "utterlyhorriddb.blahblahblah.us-west-2.rds.amazonaws.com:3306"
  driver = com.mysql.jdbc.Driver
  connectionPool = disabled
  keepAliveConnection = true
}
I've replaced each Database.forConfig("h2mem1") entry in the Scala
code with Database.forConfig("horridDBStuff")
I've replaced each import slick.driver.H2Driver.api._ with
import slick.driver.MySQLDriver.api._
In build.sbt I've added to libraryDependencies the item
"mysql" % "mysql-connector-java" % "5.1.35"
It compiles fine, but running it gives the error Exception in thread "main" java.sql.SQLException: No suitable driver, coming from the line val db = Database.forConfig("horridDBStuff").
How can I get Slick 3.0 running with MySQL? Am I missing something simple here, or are there any other working examples? Thanks.
Fixed it. The URL in application.conf was in the wrong format. It should be
url = "jdbc:mysql://utterlyhorriddb.blahblah.us-west-2.rds.amazonaws.com/aardvark_schema"
where you've already created aardvark_schema in your database.
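Putting it together, the corrected application.conf block would look like this (a sketch using the host and schema names from above, not tested against your database):

```
horridDBStuff = {
  url = "jdbc:mysql://utterlyhorriddb.blahblah.us-west-2.rds.amazonaws.com/aardvark_schema"
  driver = com.mysql.jdbc.Driver
  connectionPool = disabled
  keepAliveConnection = true
}
```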
(That fixes the db access issue that I was asking about, but you'll still get a key-spec error. To fix that you need to remove the O.PrimaryKey entry from Tables.scala for "COF_NAME", which is described here: How to get around Slick 3.0 schema creation getting errors due to key specs without length.)
I am using the following versions in my project:
Django==2.2.7
djangorestframework==3.10.3
mysqlclient==1.4.5
In my database I work with geometric types, and for this I have installed the library GDAL-3.0.2-cp38-cp38-win32.
To load this library, I have to include these variables in the Django settings file:
GEOS_LIBRARY_PATH
GDAL_LIBRARY_PATH
Now on my models, I do the following import:
from django.contrib.gis.db import models
For types like:
coordinates = models.GeometryField(db_column='Coordinates', blank=True, null=True)
The queries seem to work correctly, but when creating a new element, I get the following error:
GDAL_ERROR 1: b'PROJ: proj_as_wkt: Cannot find proj.db '
But after this error, the object is persisted correctly in the database.
I would like to know how to solve this error.
I have not found any information online; I have only tried declaring a new variable in the Django settings file:
PROJ_LIB = 'APP/backend/env/Lib/site-packages/osgeo/data/proj/proj.db'
But the error still appears, and it may cause problems in the production environment, an openSUSE image.
Why can't you find proj.db?
How do I solve it?
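For what it's worth, here is a sketch of how these paths are usually declared in settings.py (the DLL paths below are hypothetical placeholders; note that the PROJ_LIB environment variable conventionally names the directory containing proj.db rather than the proj.db file itself):

```python
import os

# Hypothetical library locations -- adjust to your environment
GEOS_LIBRARY_PATH = r"C:\OSGeo4W\bin\geos_c.dll"
GDAL_LIBRARY_PATH = r"C:\OSGeo4W\bin\gdal300.dll"

# PROJ_LIB points at the directory holding proj.db, not at proj.db itself
os.environ["PROJ_LIB"] = "APP/backend/env/Lib/site-packages/osgeo/data/proj"
```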
This post is marked for deletion, as the issue was with the IDE not creating the proper jar, hence the issues with the code interaction.
I have a small Flink application that reads from a Kafka topic,
and needs to check whether the input from the topic (x) exists in a column of a MySQL database before processing it (not ideal, but it's the current requirement).
When I run the application through the IDE (IntelliJ), it works.
However, when I submit the job to the Flink server, it fails to open a connection based on the driver.
Error from Flink Server
// ERROR
java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
// ---------------------
// small summary of MAIN
// ---------------------
// get data from source (x)
source.map(x => {
// open connection (fails to open)
// check if data exists in db
})
// -------------------------------------
// open connection function (Scala Code)
// -------------------------------------
import java.sql.DriverManager

def openConnection(): Boolean = {
try {
// - register the MySQL driver
Class.forName("com.mysql.jdbc.Driver")
// - make the connection
connection = DriverManager.getConnection(url, user, pswd)
// - set status flag
connection_open = true
}
catch {
// - log the error and flag the connection as closed
case e: Throwable =>
e.printStackTrace()
connection_open = false
}
// return result
connection_open
}
Question
1) What's the correct way to interface with a MySQL database from a Flink application?
2) At a later stage I will also have to do similar interaction with MongoDB. What's the correct way to interact with MongoDB from Flink?
Unbelievable: IntelliJ does not update dependencies on the rebuild command.
In IntelliJ, you have to delete and re-create your artifact configuration for all dependencies to be added. (Build, Clean, Rebuild, Delete) does not update its settings.
I deleted and recreated the artifact file, and it works.
Apologies for the unnecessary inconvenience (as you can imagine my frustration), but it's a word of caution for those developing in IntelliJ: manually delete and recreate your artifacts.
Solution:
(File -> Project Structure -> Artifacts -> (-) delete previous one -> (+) create new one -> Select Main Class)
I encountered an error while doing a full-import in Solr 6.6.0.
I am getting the exception below.
This happens when I set
batchSize="-1" in my db-config.xml
If I change this value to, say, batchSize="100" then the import runs without any error.
But the recommended value for this is "-1".
Any suggestion why Solr is throwing this exception?
By the way, the data I am trying to import is not huge: just 250 documents.
Stack trace:
org.apache.solr.handler.dataimport.DataImportHandlerException: java.sql.SQLException: Operation not allowed after ResultSet closed
at org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:61)
at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:464)
at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:377)
at org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:133)
at org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:75)
at org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:267)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:475)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:516)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:414)
at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:329)
at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:232)
at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:415)
at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:474)
at org.apache.solr.handler.dataimport.DataImporter.lambda$runAsync$0(DataImporter.java:457)
at java.lang.Thread.run(Thread.java:745)
By the way, I am getting one more warning:
Could not read DIH properties from /configs/state/dataimport.properties :class org.apache.zookeeper.KeeperException$NoNodeException
This happens when the config directory is not writable.
How can we make the config directory writable in SolrCloud mode?
I am using ZooKeeper as a watchdog. Can we go ahead and change the permissions of the config files that are in ZooKeeper?
Your help is greatly appreciated.
Using batchSize="-1" is only recommended if you have problems running without it. Its behaviour is up to the JDBC driver, but the reason people assume it is recommended is this sentence from the old wiki:
DataImportHandler is designed to stream row one-by-one. It passes a fetch size value (default: 500) to Statement#setFetchSize which some drivers do not honor. For MySQL, add batchSize property to dataSource configuration with value -1. This will pass Integer.MIN_VALUE to the driver as the fetch size and keep it from going out of memory for large tables.
Unless you're actually seeing issues with the default values, leave the setting alone and assume your JDBC driver does the correct thing (which it might not do with -1 as the value).
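For context, the batchSize property under discussion sits on the dataSource element of the DIH configuration; a sketch with placeholder connection details:

```
<dataSource type="JdbcDataSource"
            driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost:3306/mydb"
            user="user"
            password="pass"
            batchSize="-1"/>
```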
The reason for dataimport.properties having to be writable is that it writes a property for the last time the import ran to the file, so that you can perform delta updates by referencing the time of the last update in your SQL statement.
You'll have to make the directory writable for the client (solr) if you want to use this feature. My guess would be that you can ignore the warning if you're not using delta imports.
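If you do rely on delta imports, the timestamp stored in dataimport.properties is what you reference in the entity's deltaQuery; a sketch with made-up table and column names:

```
<entity name="item"
        query="SELECT id, title FROM item"
        deltaQuery="SELECT id FROM item WHERE updated_at &gt; '${dataimporter.last_index_time}'"
        deltaImportQuery="SELECT id, title FROM item WHERE id = '${dih.delta.id}'"/>
```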
I'm trying to connect to a MySQL database with Slick 1.0.0.
What I've done so far:
in Build.scala I've added
val appDependencies = Seq(
anorm,
"mysql" % "mysql-connector-java" % "5.1.24",
"com.typesafe.slick" % "slick_2.10" % "1.0.0",
"org.slf4j" % "slf4j-nop" % "1.6.4"
)
in application.conf
db.default.driver=com.mysql.jdbc.Driver
db.default.url="url to mysql db"
db.default.user=user
db.default.pass=password
and now I'm trying to read an entry from the DB. For this I have a model
package models
import scala.slick.driver.MySQLDriver.simple._
import Database.threadLocalSession
object Organisations extends Table[(Int, String)]("Organisation")
{
def id = column[Int]("id", O.PrimaryKey)
def name = column[String]("name")
def * = id ~ name
}
and now I'd like to just output the entries
val orgs =
for { o <- Organisations } yield o.name
println("Length" + orgs.toString())
But it doesn't work. I'm sure I've made plenty of errors, but there don't seem to be any Slick tutorials with MySQL.
Thank you for your patience and I hope my explanations are clear.
Using Slick requires a bit of boilerplate, creating a session and all that; check out the Play-Slick plugin written by Fredrik Ekholdt (Typesafe)!
It does all that plumbing for you, and there are good examples on the wiki on how to use it.
https://github.com/freekh/play-slick/
The new Slick 2.0 also features a Code Generator that can be used together with Play Framework evolutions.
This means you don't have to write the boilerplate for Slick anymore. Just write your database changes using evolutions files, and immediately access the new tables from your code.
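Evolutions then describe the schema in plain SQL; a first evolutions file (conf/evolutions/default/1.sql) for the Organisation table from the question above might look like this sketch:

```
# --- !Ups
CREATE TABLE Organisation (
  id INT NOT NULL PRIMARY KEY,
  name VARCHAR(255) NOT NULL
);

# --- !Downs
DROP TABLE Organisation;
```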
You can find a complete example using MySQL here:
https://github.com/papauschek/play-slick-evolutions
And more information on how it works:
http://blog.papauschek.com/2013/12/slick-2-0-code-generator-play-framework-evolutions/
The Play team has also been working on a Slick benchmark for TechEmpower. It is a work in progress, but we'll shortly be raising a PR on the completed version (in the next 24 hours, I suspect):
https://github.com/nraychaudhuri/FrameworkBenchmarks/tree/adding_missing_slickness/play-slick
This is the code I have in Bootstrap:
public function _initRegistry()
{
$systemConfigModel = new Application_Model_DbTable_SystemConfig();
Zend_Registry::set('config', $systemConfigModel->getSystemConfig());
}
And this an exception I am getting:
( ! ) Fatal error: Uncaught exception 'Zend_Db_Table_Exception' with message 'No adapter found for Application_Model_DbTable_SystemConfig' in /usr/share/php5/Zend/Db/Table/Abstract.php on line 755
( ! ) Zend_Db_Table_Exception: No adapter found for Application_Model_DbTable_SystemConfig in /usr/share/php5/Zend/Db/Table/Abstract.php on line 755
It works just fine if I call it within my BaseController. It just looks like the PDO adapter that I specify in application.ini has not been initialized at the time Bootstrap is executed (strange?). What should I do to make the code work in Bootstrap? Is it necessary to create and set an adapter with Zend_Db_Table::setDefaultAdapter()?
I am asking because if the code is not in Bootstrap, it needs to be duplicated in two different places and it also kind of looks like it belongs to Bootstrap.
You are correct, during bootstrap, the Zend Application resource for your database has not yet been initialized.
Try changing your bootstrap method as follows so you explicitly bootstrap the db resource.
public function _initRegistry()
{
$this->bootstrap('db'); // Bootstrap the db resource from configuration
$db = $this->getResource('db'); // get the db object here, if necessary
// now that you have initialized the db resource, you can use your dbtable object
$systemConfigModel = new Application_Model_DbTable_SystemConfig();
Zend_Registry::set('config', $systemConfigModel->getSystemConfig());
}
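For reference, the db resource that $this->bootstrap('db') initializes is typically declared in application.ini along these lines (all connection values below are placeholders):

```
resources.db.adapter = "PDO_MYSQL"
resources.db.params.host = "localhost"
resources.db.params.username = "user"
resources.db.params.password = "secret"
resources.db.params.dbname = "mydb"
resources.db.isDefaultTableAdapter = true
```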