How can I connect to a MySQL database using Scala?

I'm working on a little project where I'd like to parse some data, and then put it into a database. I'm not working with Lift, and I haven't been able to find a standard way to do this.
I'm fine writing the queries myself, but I'm not sure what to use to actually connect to the DB.

You can use JDBC - the standard means of getting Java to talk to databases. You'll need the appropriate MySQL JDBC driver. Apache DbUtils provides some utility classes surrounding JDBC and would be useful.
If you want a higher level API which takes some of the boilerplate out, then check out Spring's JDBC integration.
If you want an ORM (object-relational mapping), then Hibernate is a good choice.
I've used all three in Scala with success.
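For illustration, here is a minimal plain-JDBC sketch in Scala (assuming MySQL Connector/J is on the classpath; the URL, credentials, and table are made-up placeholders, not something from the original answer):
import java.sql.DriverManager
// Hypothetical connection details; adjust the URL, user and password for your setup.
val conn = DriverManager.getConnection("jdbc:mysql://localhost:3306/myschema", "user", "password")
try {
  val ps = conn.prepareStatement("INSERT INTO mytable (name, year) VALUES (?, ?)")
  try {
    ps.setString(1, "example")
    ps.setInt(2, 2010)
    ps.executeUpdate()
  } finally ps.close()
} finally conn.close()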

Of course you can use any Java option that works with JDBC (Hibernate, Spring, etc.), but to make better use of the Scala language I recommend a Scala-specific framework, which will have a nicer DSL.
ScalaQuery is an API/DSL (domain-specific language) built on top of JDBC for accessing relational databases in Scala.
Squeryl is a Scala ORM and DSL for talking to databases with minimum verbosity and maximum type safety.
SORM is a Scala ORM framework designed to eliminate boilerplate code and solve scalability problems with a high-level abstraction and a functional programming style.
Slick is a Typesafe-backed project offering Functional Relational Mapping (a minimal sketch follows below).
Check out more about these frameworks at https://stackoverflow.com/questions/1362748/looking-for-a-comparison-of-scala-persistence-frameworks
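To give a feel for the DSL style, here is a minimal Slick 3 sketch of a table definition and an insert; the table, columns, connection URL and credentials below are invented placeholders, not taken from any of these projects' docs:
import slick.jdbc.MySQLProfile.api._
import scala.concurrent.Await
import scala.concurrent.duration._
// Hypothetical mapping for a table MYTABLE(NAME, YEAR)
class MyTable(tag: Tag) extends Table[(String, Int)](tag, "MYTABLE") {
  def name = column[String]("NAME")
  def year = column[Int]("YEAR")
  def * = (name, year)
}
val myTable = TableQuery[MyTable]
val db = Database.forURL(
  "jdbc:mysql://localhost:3306/myschema",
  user = "user", password = "password",
  driver = "com.mysql.jdbc.Driver")
// Inserts are ordinary collection-like operations turned into database actions.
Await.result(db.run(myTable += (("example", 2010))), 10.seconds)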

I've actually written a SQL command shell, in Scala, that talks to any arbitrary database for which a JDBC driver exists. As Brian Agnew notes, it works perfectly. In addition, there are tools like Querulous, Squeryl and O/R Broker that provide Scala-friendly database layers. They sit on top of JDBC, but they provide some additional semantics (via DSLs, in some cases) to make things easier for you.

Try O/R Broker:
case class MyObj(name: String, year: Int)
val ds = new com.mysql.jdbc.jdbc2.optional.MysqlDataSource
// set properties on ds
import org.orbroker._
val builder = new BrokerBuilder(ds)
val broker = builder.build
val myObj: MyObj = // Parse stuff to create MyObj instance
broker.transaction() { session =>
  session.execute("INSERT INTO MYTABLE VALUES(:obj.name, :obj.year)", "obj" -> myObj)
}
val myObjs: Seq[MyObj] = // Parse stuff to create sequence of MyObj instances
broker.transaction() { session =>
  session.executeBatch("INSERT INTO MYTABLE VALUES(:obj.name, :obj.year)", "obj" -> myObjs)
}

For completeness, also check out RichSQL. It's demo code showing how to wrap JDBC to provide more Scala-like operations, but it's actually quite usable. It has the advantage of being simple and small, so you can easily study the source to see what's going on. Don't forget to close() your PreparedStatements.
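To illustrate that last point, one way to guarantee close() in Scala 2.13+ is scala.util.Using; this is just a sketch (not RichSQL itself), and the query and table name are assumptions:
import java.sql.Connection
import scala.util.Using
// Using.resource closes the statement and the result set even if the body throws.
def countRows(conn: Connection): Int =
  Using.resource(conn.prepareStatement("SELECT COUNT(*) FROM mytable")) { ps =>
    Using.resource(ps.executeQuery()) { rs =>
      rs.next()
      rs.getInt(1)
    }
  }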

I just discovered ScalikeJDBC, which offers a Scala-like API wrapper for JDBC.
(I found ScalikeJDBC when researching how to use Anorm without the Play Framework. Now it looks like I won't be needing Anorm for my project.)
Here is a simple example, though it offers many interesting features not shown here:
import scalikejdbc._
Class.forName("com.mysql.jdbc.Driver")
ConnectionPool.singleton("jdbc:mysql://localhost:3306/myschema", "user", "password")
DB.localTx { implicit session =>
  val data = sql"select mystringcol, myintcolumn from mytable".map {
    rs => (rs.string("mystringcol"), rs.int("myintcolumn"))
  }.list().apply()
  println(data)
}
Some documentation links:
Documentation on running queries
Connection pool and loan pattern (a small sketch follows below)
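As a small, hedged example of the loan pattern those docs describe (the table and column are again made up), a read-only block borrows a session from the pool and returns it automatically:
import scalikejdbc._
// DB.readOnly borrows a connection from the pool and releases it when the block ends.
val names: List[String] = DB.readOnly { implicit session =>
  sql"select mystringcol from mytable".map(_.string("mystringcol")).list().apply()
}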

Related

Recommended way to deserialize a JSON from an httpClient response

I am trying to understand the recommended way of parsing JSON into an object, particularly from HttpClient responses (but my question may also relate to parsing JSON from streams in general).
I have scoured the Internet reading many blog posts, and this is what I have come up with:
I understand that parsing a stream to a string and then parsing the string to an object is a big no-no in terms of memory usage.
According to many blog posts I have come across, the traditional way of doing it is (or used to be) to work with streams, using the Newtonsoft.Json package, as follows:
using var streamReader = new StreamReader(stream);
using var jsonTextReader = new JsonTextReader(streamReader);
var myDeserializedObject = new JsonSerializer().Deserialize<MyObject>(jsonTextReader);
But then I have come across another way of doing it:
If you are using .NET Core 3 and above (not so sure about the version) you have a built-in way of deserializing the stream using System.Text.JSON:
var myDeserializedObject = await JsonSerializer.DeserializeAsync<MyObject>(stream);
and particularly for HttpClient requests (if you are using .NET 5 and above, if I am not mistaken)
you can do:
var myDeserializedObject = await httpClient.GetFromJsonAsync<MyObject>(requestUri); // requestUri is a placeholder for your endpoint URL
Please if someone could explain the ups and downs (if there are any) of each approach, especially in terms of performance and memory usage.
My company has been discussing something of a similar situation. We ended up with the following considerations.
Using Newtonsoft.JSON:
Pros:
A widely used library with a lot of features, including serializing and deserializing JSON.
Provides good control over the serialization and deserialization process.
Cons:
Can consume more memory due to string conversion of the JSON stream.
May have performance overhead when serializing and deserializing large JSON payloads.
Using System.Text.Json:
Pros:
Built-in and optimized for performance in .NET Core 3 and above.
Consumes less memory compared to Newtonsoft.JSON.
Has improved performance compared to Newtonsoft.JSON.
Cons:
Has limited options for customizing the serialization and deserialization process.
May not have all the features available in Newtonsoft.JSON.
For most cases, System.Text.Json should be sufficient for deserializing JSON payloads. However, for more complex JSON serialization/deserialization requirements, Newtonsoft.JSON may still be the preferred choice. Ultimately, the choice depends on the specific requirements, guidelines, and limitations of your project.

Writing unit tests for Solr plugin using JUnit4, including creating collections

I wrote a plugin for Solr which contains new stream expressions.
Now, I'm trying to understand the best way to write unit tests for them:
The unit tests need to include creating collections in Solr, so I will be able to check whether my new stream expressions return the data they are supposed to.
I saw over the web that there is a class called "SolrTestCaseJ4", but I didn't find out how to use it to create new collections in Solr, add data to them, and so on...
Can you please recommend which class I should use for that purpose, or any other way to test my new classes?
BTW, we are using Solr 7.1 in cloud mode and JUnit4.
Thanks in advance.
You could use MiniSolrCloudCluster.
Here is an example how to create collections (all for unit test):
https://github.com/lucidworks/solr-hadoop-common/blob/159cce044c1907e646c2644083096150d27c5fd2/solr-hadoop-testbase/src/main/java/com/lucidworks/hadoop/utils/SolrCloudClusterSupport.java#L132
Eventually I found a better class, which simplifies everything and implements more functionality than MiniSolrCloudCluster (it actually contains a MiniSolrCloudCluster as a member).
This class is called SolrCloudTestCase, and as you can see here, even Solr's own source code uses it in its unit tests.

Mapping Json type from Postgresql in Scala code

I have a table with a json column in PostgreSQL and I must choose a suitable mapping for this field in a Scala case class (I don't use Slick, where I could directly set something like sql.json). I am considering the following options:
java.lang.Object with subsequent validation.
play.libs.Json or io.circe.Json
But I can't choose between them, and I don't know of any other candidates for this.
UPD: For interacting with the DB I use Quill.
As Sarvesh mentioned, it highly depends on the library you are going to use. There is a nice slick-pg extension (https://github.com/tminglei/slick-pg) that supports most of the JSON libraries.
As for other libraries, I did not find any relevant resources. It is, however, possible to insert JSON with plain SQL. Find a resource here
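As an illustration of the plain-SQL route (a sketch only; the table and column names are invented, and it assumes the PostgreSQL JDBC driver is on the classpath), you can bind the JSON as a string and cast it on the server side:
import java.sql.DriverManager
val conn = DriverManager.getConnection("jdbc:postgresql://localhost:5432/mydb", "user", "password")
try {
  // CAST(? AS json) makes PostgreSQL validate and store the bound string as a json value.
  val ps = conn.prepareStatement("INSERT INTO documents (payload) VALUES (CAST(? AS json))")
  try {
    ps.setString(1, """{"name": "example", "year": 2010}""")
    ps.executeUpdate()
  } finally ps.close()
} finally conn.close()
When reading it back, fetch the column as text and parse it with play-json or circe, whichever you settle on.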

Using app.config from F# Script

I have seen several related questions here on Stack Overflow, however none appear to have answers. I will pose the question and then include links to related SO questions I have found.
I have a Core domain library written in C# that leverages Entity Framework. As such, EF requires the DbContext to pass a connection string to the base (DbContext). In my case the connection string lives in the app.config (or web.config, depending on the top-level project, of course) and its name is "AnnotatorModel".
I need to instantiate the DBContext from within my F# script to test a few queries and analytics.
I have added this to the app.config in my F# project and tried a few of the answers on SO with no success. Does anybody know a good, easy, straightforward way to accomplish this?
Here is the code; realize it breaks on attempting to instantiate the DbContext, AnnotatorModel.
let PredictBestEntities (number:int) (tokens:string seq) =
let db = new AnnotatorModel()
tokens
|> Seq.map ...etc etc
Thanks,
~David
Related questions:
Get and use connection string from App.config F# AppSettings provider
App.config and F# Interactive not working
https://social.msdn.microsoft.com/Forums/vstudio/en-US/5b4dba22-28ec-4bbd-bc53-c5102d0b938f/using-fsi-and-an-appconfig?forum=fsharpgeneral
This is not what you're asking, but I'll add this answer anyway:
Add a constructor overload to AnnotatorModel that enables you to pass a connection string. This will enable you to write:
let db = new AnnotatorModel("some connection string")
Relying exclusively on a connection string in app.config tightly couples a library to that single source of configuration. This isn't good library design. Not only are you having trouble with using it from FSI, but it also makes it difficult to change 'configuration values' at run-time, load them from a database instead of a file, etc.
Libraries shouldn't be coupled to app.config. Only applications should use app.config.

Finding it difficult to process JSON in delphi

I'm currently working on an application that will get data of your character from the WoW armory.
Example character: My WoW Character(link)
I will get all the info I want by calling the API provided by Blizzard and I will get the response in JSON.
Example JSON: JSON response for the character above(link)
At first I tried to get the data from the JSON by string manipulation.
That meant splitting my strings and searching for keywords in them to find positions, then formatting that into individual pieces of data such as talents and stats.
This worked great at the beginning, but as I wanted more data it became harder; with the many functions I ran on all the strings, it became one big blur and it was unclear what I was doing at any given moment.
Is there a good way to process my JSON?
I was thinking about getting the JSON and creating an empty class.
While working through the JSON it would generate properties and store the values in there.
But I have no idea if and how it's possible to generate properties dynamically.
In the future I would like to get even more data but first I want to get this up and running before even thinking about that.
Does anyone have any ideas/advice on this?
Thanks in advance.
Your JSON seems rather short and basic. It does not seem you need special speed or exotic features. http://jsonviewer.stack.hu/#http://eu.battle.net/api/wow/character/moonglade/Xaveak?fields=stats,talents
And while since Delphi XE2 you do have a stock JSON parser as part of the DB-Express suite, there are still concerns:
1. It is said to cause problems with both speed and reliability.
2. It would make your program dependent on the DB-Express package (why, if you are not actually using it for DB access?).
3. It would bind your future to the Enterprise edition of Delphi.
So you'd better try some 3rd-party library.
One of the fastest would probably be the Synopse JSON parser, a side project of their mORMot library. It is generally good code, with great attention to speed, and the developers actively help on their forum.
One more known and used library would be Henri Gourvest's SuperObject.
It claimed to be the fastest parser for Delphi, and while, given the above, that is probably no longer true, its speed is quite adequate for most tasks. Henri himself is not actively supporting his past projects, always doing something new, so the scarce documentation (also duplicated in the install package) is all you have officially, plus there is a forum where other users might help you. On the other hand, the main idea behind SuperObject's design was uniformity, and while some tasks could really be documented better, that is mostly due to uncertainty over whether a given task will really work in a uniform manner without any special treatment. But usually it does.
PS. Since that is a wiki, you may try to enhance it for future users ;-)
So coming back to documentation, you would need:
1) to load the whole JSON into the library. You can do that by creating a TStream from your HTTP library or by providing a string buffer with the data: that is the "Parsing a JSON data structure" section of the manual;
2) to read values like "name" and "level", described in the "How to read a property value of an object?" section there;
3) to enumerate arrays like "talents", described in the "Browsing data structure" section.
XE3 has "built in" JSON support (see docwiki), but I have heard (haven't used it myself) that it isn't very well optimised.
So perhaps look for some thirdparty option like SuperObject.
Your task is easily achievable using TSvSerializer, which is included in my delphi-oop library. You only need to declare your model type and deserialize it from your JSON string. Your model (a very simplified, incomplete, and untested version) should look something like this:
type
  TStats = class
  private
    Fhealth: Integer;
  public
    property health: Integer read Fhealth write Fhealth;
    ...
  end;

  TTalent = class
  private
    Ftier: Integer;
  public
    property tier: Integer read Ftier write Ftier;
    ...
  end;

  TMainTalent = class
  private
    Fselected: Boolean;
    Ftalents: TObjectList<TTalent>;
  public
    property selected: Boolean read Fselected write Fselected;
    property talents: TObjectList<TTalent> read Ftalents write Ftalents;
  end;

  TWowCharacter = class
  private
    FlastModified: Int64;
    Fname: string;
    Fstats: TStats;
    Ftalents: TObjectList<TMainTalent>;
  public
    property lastModified: Int64 read FlastModified write FlastModified;
    property name: string read Fname write Fname;
    ...
    property stats: TStats read Fstats write Fstats;
    property talents: TObjectList<TMainTalent> read Ftalents write Ftalents;
    ...
  end;
Then you just need to do:
uses
  SvSerializer;
var
  LWowCharacter: TWowCharacter;
begin
  LWowCharacter := TWowCharacter.FromJson(YourJsonString);
  ...
You can find my contact email in delphi-oop project, ask me if something's unclear, I'll try to help you in my spare time.