Read existing MySQL tables using Grails by generating the domain classes

I'm trying Grails 3.0.8 for the first time. I'll use it to create web services for mobile development.
I already have a MySQL database with a lot of tables. I found that I can use the "db-reverse-engineer:0.5.1" plugin to generate the domain classes from the tables, but for some reason I cannot install the plugin and it doesn't work. I think it has something to do with the new version of Grails, which is 3.0.8.
As there is not a lot of documentation on this version, I was wondering if there is a way to generate domain classes from an existing MySQL database.
If not, is it possible to use the database without having to create domain classes for the tables?

The db-reverse-engineer plugin is for Grails 2; it's not compatible with Grails 3. See Grails 3 reverse engineer database to domain objects.
You can run database queries if you get hold of a Hibernate session; in Grails you can get one by injecting the sessionFactory bean into a service or controller.
With a Hibernate session, you can use the Session.createSQLQuery(String) method to create a SQLQuery instance. Then just execute the SQLQuery.list() method to run the query. Here's an example of running an arbitrary query against an H2 database.
def session = sessionFactory.currentSession // sessionFactory is injected by Grails
def q = session.createSQLQuery 'select * from INFORMATION_SCHEMA.COLUMNS'
q.list() // Runs the query and returns the rows.

Related

Building a dynamic data table with a Node.js backend for a MySQL database

I would like to build a data table that provides features like searching, filtering, free-text search and so on (if anyone has more ideas for a modern data table, please share). My backend has to be in Node.js, and the front end could be simple HTML and CSS. If I need to create any middleware to make the data load faster, how would that work? Any suggestion regarding this will be very much appreciated.
P.S.: I have a MySQL database.
Building a dynamic data table with a Node.js backend for a MySQL database can be a challenge, but a few libraries can help you get the job done.
One option is the mysqljs/mysql library, a Node.js driver for MySQL: you open a connection, run queries, and close the connection when finished.
Another is the node-mysql2/promise library, which offers the same driver functionality with promise support, so it works naturally with async/await.
Finally, there is the sequelize library, an ORM with a higher-level interface: you define models and let it generate the queries for you.
Any of these can back a dynamic data table; the table features themselves map directly onto SQL, with WHERE ... LIKE for free-text search, ORDER BY for sorting, and LIMIT/OFFSET for paging, as in the sketch below.
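As a concrete sketch, here is a minimal paginated, searchable endpoint using Express and mysql2/promise; the connection settings, table name (users), and column names are assumptions for illustration:

const express = require('express');
const mysql = require('mysql2/promise');

const app = express();
// A pool keeps connections open between requests instead of reconnecting each time.
const pool = mysql.createPool({ host: 'localhost', user: 'app', password: 'secret', database: 'mydb' });

app.get('/users', async (req, res) => {
  const page = Number(req.query.page) || 1;
  const size = Number(req.query.size) || 25;
  const search = '%' + (req.query.q || '') + '%';
  // Placeholders guard against SQL injection; LIMIT/OFFSET implement the paging.
  const [rows] = await pool.query(
    'SELECT id, name FROM users WHERE name LIKE ? ORDER BY name LIMIT ? OFFSET ?',
    [search, size, (page - 1) * size]
  );
  res.json(rows);
});

app.listen(3000);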

Domain classes and database are out of sync in Grails

I have this weird problem. I have a Grails app in which some database changelog files are missing, so the database has gone out of sync with the domain classes. I've made some changes to the domain classes, and when I run the database migration plugin, it creates a diff between the current domain classes and the database and tries to execute SQL commands that have already been run, which causes errors in the commands that I actually want to execute.
Is there a solution to this problem?
If I understand your problem correctly, you can re-create all of the missing changelogs using dbm-generate-changelog. This will create changelogs based on the current data model. Then you can use dbm-changelog-sync to mark those changelogs as EXECUTED (which will populate the DATABASECHANGELOG table). Once the DATABASECHANGELOG table is in sync with the current data model, you can use dbm-gorm-diff to make sure you're not missing any other data model changes.
https://grails-plugins.github.io/grails-database-migration/1.4.0/ref/Maintenance%20Scripts/dbm-changelog-sync.html
NOTE: My answer assumes you're using Grails 2.x and Database Migration plugin 1.4.x, but I believe the process is similar in Grails 3.x with Database Migration Plugin 2.x or 3.x.
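For reference, the sequence might look like this (a sketch for Grails 2.x with plugin 1.4.x; the file names are placeholders, and command names may differ slightly in later versions):

grails dbm-generate-changelog changelog.groovy
grails dbm-changelog-sync
grails dbm-gorm-diff pending-changes.groovy --add

The --add flag registers the generated diff file in the main changelog, so any remaining data model changes are tracked alongside the re-created changelogs.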

Configuring a MySQL linked server with DB2

I have two database servers, one MySQL and one DB2, running on different machines. I want to fetch records from tables in both databases using a join. I have studied the linked server concept, but the problem is that I couldn't find any example of creating a linked server with DB2 (all I can find is for SSMS, i.e. SQL Server Management Studio), while my case is MySQL and DB2, and I need to create a linked server on one of them (or vice versa).
Please suggest how I can achieve this.
Thanks in advance!
In DB2, there is a feature called federation (part of Information Integration) that allows you to present external resources to DB2 through wrappers and nicknames; you can query those external resources from DB2, and you can even join different sources (other DB2 databases; Informix, MSSQL Server, Oracle, or MySQL databases; flat files; etc.).
Querying non-IBM external resources requires a special license. However, if you want to query other DB2 or Informix databases, no extra license is required, because support for the IBM databases is included for free.
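For illustration, the wrapper/nickname setup for the free DB2-to-DB2 case looks roughly like this (a sketch only; all names are placeholders, and a MySQL source would instead need the ODBC wrapper with different server options, so check the federation documentation for your version):

CREATE WRAPPER drda;
CREATE SERVER remote_svr TYPE db2/udb VERSION '9.7' WRAPPER drda OPTIONS (DBNAME 'REMOTEDB');
CREATE USER MAPPING FOR USER SERVER remote_svr OPTIONS (REMOTE_AUTHID 'ruser', REMOTE_PASSWORD 'rpassword');
CREATE NICKNAME myschema.remote_orders FOR remote_svr.rschema.orders;
-- The nickname now joins like any local table:
SELECT c.name, o.total FROM myschema.customers c JOIN myschema.remote_orders o ON o.customer_id = c.id;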
On the other hand, there is an option called table functions. These functions return a table when they are called, and you can then join the returned data with other tables. They can be developed in SQL PL (the IBM procedural language), C, or Java.
With this second option, you can create a table function in Java that queries the MySQL table and then returns the data to DB2.
I have written an example of how to query a 'topic' on Twitter and return that data to DB2. You have to do almost the same, but instead of querying Twitter, you query your other database.
http://angocadb2.blogspot.fr/2012/02/accediendo-tweeter-desde-db2-table.html
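Here is a rough sketch of the MySQL-access half of such a function, using plain JDBC (connection settings and table/column names are assumptions; the DB2 side, i.e. the CREATE FUNCTION registration and the row-at-a-time plumbing of the com.ibm.db2.app.UDF class, is version-specific and omitted, so see the post above and the DB2 documentation):

import java.sql.*;
import java.util.*;

public class MySqlRows {
    // Fetches the MySQL rows that the DB2 table function would hand back one at a time.
    public static List<String[]> fetch() throws SQLException {
        List<String[]> rows = new ArrayList<>();
        try (Connection c = DriverManager.getConnection("jdbc:mysql://mysql-host/mydb", "user", "password");
             Statement s = c.createStatement();
             ResultSet r = s.executeQuery("SELECT id, name FROM mytable")) {
            while (r.next()) {
                rows.add(new String[] { r.getString(1), r.getString(2) });
            }
        }
        return rows;
    }
}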
@AngocA it doesn't work, but thanks for your suggestion.
After a long search, I came up with an answer to my own post and thought of posting it here, as it will be helpful for others in any scenario where we need to fetch data from two different database servers (remote or local in nature) and the linked server concept fails.
We may use a third-party JAR called UnityJDBC, which we can use in our Java code in the same simple manner as plain JDBC: load the driver, then get a connection.

Class.forName("unity.jdbc.UnityDriver"); // 1) load the driver
Connection conn = DriverManager.getConnection("jdbc:unity://test/xspec/mysqldb2.xml"); // 2) get a connection
// 3) run your DDL/DML statements through conn as usual
conn.close(); // 4) close the connection
One can visit UnityJDBC at http://www.unityjdbc.com/
Using this JDBC driver, our code actually loads an XML file that holds the definition of the data sources we require.
Once everything is set up, one can then easily form a query joining two tables from two different remote databases. Syntax: dbname.tablename.fieldname
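For example, assuming sources named mysqldb and db2db defined in the XML file (the source, table, and column names here are illustrative), a cross-database join reads like an ordinary SQL join:

SELECT m.name, d.amount FROM mysqldb.customers m JOIN db2db.orders d ON d.customer_id = m.id;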
Additionally, we don't need to handle any further XML configuration for closing the internal connections that get created; they are closed along with the outer connection.
If you run into any issues, write back.

What's the appropriate way to test code that uses MySQL-specific queries internally

I am collecting data and storing it in a MySQL database using Java. Additionally, I use Maven for building the project, TestNG as a test framework, and Spring-Jdbc for accessing the database. I've implemented a DAO layer that encapsulates the access to the database. Besides adding data using the DAO classes, I want to execute some queries which aggregate the data and store the results in some other tables (like materialized views).
Now, I would like to write some test cases which check whether the DAO classes are working as they should. Therefore, I thought of using an in-memory database populated with some test data. Since I am also using MySQL-specific SQL queries for aggregating data, I ran into some trouble:
Firstly, I thought of simply using the embedded-database functionality provided by Spring-Jdbc to instantiate an embedded database, and I decided on the H2 implementation. There I ran into trouble because the aggregation queries use MySQL-specific features (e.g. time-manipulation functions like DATE()). Another disadvantage of this approach is that I need to maintain two DDL files: the actual DDL file defining the tables in MySQL (where I define the encoding and add comments to tables and columns, both MySQL-specific features), and the test DDL file that defines the same tables but without comments etc., since H2 does not support comments.
I found a description of using MySQL as an embedded database within test cases (http://literatitech.blogspot.de/2011/04/embedded-mysql-server-for-junit-testing.html). That sounded really promising to me. Unfortunately, it didn't work: a MissingResourceException occurred, "Resource '5-0-21/Linux-amd64/mysqld' not found". It seems the driver is not able to find the database daemon on my local machine, but I don't know what to look for to find a solution to that issue.
Now I am a little bit stuck, and I am wondering if I should have created the architecture differently. Does anyone have tips on how I should set up an appropriate system? I have two other options in mind:
Instead of using an embedded database, go with a native MySQL instance and set up a database that is only used for the test cases. This option sounds slow; I might want to set up a CI server later on, and I thought an embedded database would be more appropriate since the tests run faster.
Erase all the MySQL-specific stuff from the SQL queries and use H2 as an embedded database for testing. If this option is the right choice, I would need to find another way to test the SQL queries that aggregate the data into materialized views.
Or is there a 3rd option which I don't have in mind?
I would appreciate any hints.
Thanks,
XComp
I've created a Maven plugin exactly for this purpose: jcabi-mysql-maven-plugin. It starts a local MySQL server in the pre-integration-test phase and shuts it down in post-integration-test.
If it is not possible to get the in-memory MySQL database to work, I suggest using the H2 database for the "simple" tests and a dedicated MySQL instance to test the MySQL-specific queries.
Additionally, the tests for the real MySQL database can be configured as integration tests in a separate Maven profile so that they are not part of the regular Maven build. On the CI server you can create an additional job that runs the MySQL tests periodically, e.g. daily or every few hours. With such a setup you can keep and test your product-specific queries while your regular build does not slow down. You can also run a normal build even if the test database is not available.
There is a nice Maven plugin for integration tests called maven-failsafe-plugin. It provides pre- and post-integration-test steps that can be used to set up the test data before the tests and to clean up the database afterwards.
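A sketch of such a profile (the profile id is illustrative; by default failsafe picks up test classes named *IT and runs them in the integration-test phase):

<profile>
  <id>mysql-it</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-failsafe-plugin</artifactId>
        <executions>
          <execution>
            <goals>
              <goal>integration-test</goal>
              <goal>verify</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>

Activate it only when the MySQL test instance is available, e.g. mvn verify -Pmysql-it, so the regular build stays fast.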

Cucumber with Database_Cleaner for MySQL and MongoDB

I am building a Rails app that uses MySQL for some models and MongoDB for others (through the mongo_mapper gem).
We have started building out Cucumber tests (with Capybara and WebDriver) for the app and are running into trouble with referenced IDs that don't exist. I believe I have tracked this down to old data in MongoDB.
At this point, database_cleaner is doing its job with the MySQL records, but not the MongoDB ones.
There is a discussion at the cucumber-rails project about using MongoDB, but I believe it assumes that you are only using MongoDB, not both MongoDB and MySQL together.
Is there a way to have the database_cleaner clean both MySQL and MongoDB? Or is it only one or the other?
I found this article on how to drop all of the MongoDB content before running the tests, but I believe this will delete all data including the records I am using for local development...
Thanks.
Assuming you are doing something like this when you tell Rails which Mongo DB to talk to:
MongoDatabase = "mongodb://localhost/yourdb_#{Rails.env}"
Then in your tests, do:
/spec/spec_helper.rb
MongoMapper.database.collections.select { |c| c.name != 'system.indexes' }.each(&:drop)
(The above is for MongoMapper, but the idea is the same for Mongoid -- just drop down to the database level and drop all collections.)
This will only drop data in your test database, not your dev DB. Used in conjunction with the DB cleaner, you're good to go.
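Since the question uses Cucumber rather than RSpec, the same cleanup can run in a Cucumber Before hook instead (a sketch; the file path is the usual Cucumber convention, and the drop line is the same as above):

features/support/hooks.rb
Before do
  # Runs before every scenario: drop all MongoDB collections except the indexes.
  MongoMapper.database.collections.select { |c| c.name != 'system.indexes' }.each(&:drop)
end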