How to execute MySQL scripts in Cloud Foundry

I have a .sql file containing initial SQL scripts. I recently deployed my application to Cloud Foundry, and I need to run these scripts to make the application work; they will update more than 5 db tables.
Is there a way to run the MySQL scripts from the Grails application on startup, or does Cloud Foundry provide any way to run them?

You have several options here.
The first one (which I recommend) is to use something like http://liquibase.org/ (there is a Grails plugin for it: http://grails.org/plugin/liquibase). This tool will make sure that any script you give it runs before the app starts, without running the same script twice, and so on. It is great for keeping track of your database changes.
This works independently of Cloud Foundry and would help anyone installing your app to have an up-to-date schema.
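For illustration, a minimal Liquibase changelog that replays an existing SQL file exactly once might look like this (the changeset id, author, and file name are placeholders, and where the changelog lives depends on how the plugin is configured):

    <databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
                            http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-2.0.xsd">

        <!-- Each changeSet runs exactly once; Liquibase records it in its DATABASECHANGELOG table -->
        <changeSet id="initial-data" author="yourname">
            <!-- Path to your existing script, relative to the changelog -->
            <sqlFile path="initial-scripts.sql"/>
        </changeSet>
    </databaseChangeLog>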
The second option would be to tunnel to the Cloud Foundry database and run the script against it. Have a look at http://docs.cloudfoundry.com/tools/vmc/caldecott.html, or, even easier, do it from STS: http://blog.cloudfoundry.com/2012/07/31/cloud-foundry-integration-for-eclipse-now-supports-tunneling-to-services/
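Roughly, the vmc route looks like this (the service name, local port, and script name below are placeholders; vmc prints the actual port and credentials when the tunnel opens):

    vmc tunnel mysql-myapp
    # choose "none" (or "mysql") when prompted for a client; vmc prints host, port and credentials
    mysql --protocol=TCP --host=127.0.0.1 --port=10000 --user=<tunnel-user> --password=<tunnel-password> <tunnel-db> < initial-scripts.sql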

Yup, what ebottard said! :-) Although, personally, I would opt for using the tunnel feature in VMC, but that said, I am a Ruby guy!
Be wary of the fact that there are timeouts on queries in MySQL if you are bootstrapping your database with large datasets!

Related

How do I migrate a SQL Anywhere 9 db running on a remote server into a MySQL server on my machine?

I am working at a company that has some CRM software running on a remote Windows XP server that uses a SQL Anywhere 9 db to store its data; I have access to this remote server with an administrator account.
I would like to extract the db into a .sql file so that I can run the db locally on my machine without affecting the running db in the server (since it is key for the company's day to day operation).
The reason I need this is that we are going to test some BI Software and we need data from this database to test it, but we don't know the structure of the database since the developers of the CRM software didn't give us any documentation on it. So we need to have the database locally so that, without affecting the running CRM, we can:
understand the structure by looking at the DDL
make queries to it to get sample data
I researched a bit, and the most common solution to my problem was to use dbunload on the remote server to unload the db into a reload.sql file that contained what I needed. But most tutorials on the subject mention that I have to stop the db first (which would be catastrophic). If this is the only option, then I guess I am willing to do it on the weekend when the CRM is not used, but I wanted to know if there was another solution first.
If there is no other solution, can you point me to where I can find the proper and safer way to do this?
I have researched a lot, but prior to this day I had never even heard of SQL Anywhere, so I really need all the help I can get. My main concern is doing something that negatively impacts the CRM software.
Thank you.
You can run dbunload across the network, you just have to tell it to do an "external" unload. The default is to do an internal unload which would only work from the machine where the database server is running.
I don't have SQL Anywhere 9 documentation right now to look up the exact switch, but dbunload -? should show you all the possible switches.
Edit:
-an will create a new database and load the schema and data from the other database
-xi will do an external unload and an internal reload
-c specifies the parameters for connecting to your remote database
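Put together, the invocation might look roughly like this (server, database, credentials, and file paths are placeholders; verify the exact version 9 switches with dbunload -?):

    dbunload -c "ENG=crm_server;DBN=crm_db;UID=dba;PWD=sql;LINKS=tcpip(HOST=remote-host)" -an c:\local\crm_copy.db -xi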

Where are databases stored in the rails server?

I'm having a lot of fun with Ruby, creating some basic web apps. When looking at the logs of the rails server in the terminal, I see SQL queries.
Refinery::User Load (0.2ms) SELECT "refinery_users".* FROM "refinery_users" WHERE "refinery_users"."id" = 1 LIMIT 1
These relate to databases that I've created, but where do these databases exist? In the rails server? Where is the rails server stored in OS X? Can I browse what's inside, specifically the databases?
Thanks, I know this doesn't have much practical use, but I want to understand the concepts behind what's going on, rather than just having superficial knowledge.
By default, Rails uses SQLite3. The database files are stored in the /db directory in the root of your app. There should be a file called development.sqlite3.
To access that database, open a terminal session, go to the root directory of your app and enter sqlite3 db/development.sqlite3. More info on the sqlite shell here: http://www.sqlite.org/sqlite.html
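For example, from the root of your app (.tables lists the tables, .schema shows the DDL for one of them, and plain SQL works too; refinery_users is just the table from your log):

    $ sqlite3 db/development.sqlite3
    sqlite> .tables
    sqlite> .schema refinery_users
    sqlite> SELECT * FROM refinery_users LIMIT 5;
    sqlite> .quit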
Rather than messing around in the SQLite shell, I think you'd be better off 1) looking at /db/schema.rb to see the structure of your database and 2) using rails console to look at the data.
If you want to know, for any given Rails app, what database it's using, look at /config/database.yml. That gives you the adapter, name of the database, location, etc.
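For a default app the development entry looks something like this:

    development:
      adapter: sqlite3
      database: db/development.sqlite3
      pool: 5
      timeout: 5000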
Also, SQLite is generally just for kicking off development. I wouldn't recommend using it when your schema starts getting more complex. Personally, I never use it; I immediately set up a MySQL database for any new Rails project.

rails 3.1 - moving data from local data processing server to heroku production server

I'm building a Rails 3.1 app that requires a load of data to be crunched and processed on a local server (using a bunch of non-Rails tools and writing to MySQL), and then for the refined results to be punted up to a Heroku production server (front end). Because the data-crunching aspect of the process needs to be run in batches, my first instinct was simply to upload the results table to production using something like "heroku db:push --tables data", but the problem is that it is sslloowww and the app is without data for about 40 minutes at a time. The crunching batches need to be run about 4x per day, so it looks like this approach isn't really going to work. Any suggestions on how to speed this process up, or any alternative schemes for getting the data up to the production server less obtrusively? Thanks!
Sounds like you may have to rearchitect. Or what about running your Rails app on EC2 and ditching Heroku? I think Heroku is great if your app is simple or you can make do with the plugins that they have, but when your app gets complex I think Heroku may become too limiting.
Heroku makes it clear that you cannot access their databases from outside. What you can do, however (if you want to stay on Heroku), is use another database (like RDS, or roll your own) and have your application connect to it. Then upload data to that database directly.
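As a sketch, the production entry in config/database.yml could point at an externally hosted MySQL instance (the host, database, and credentials below are placeholders), and the local crunching batches would then load their results into that same instance directly:

    production:
      adapter: mysql2
      host: myapp.abc123.us-east-1.rds.amazonaws.com
      database: myapp_production
      username: app_user
      password: <%= ENV['PRODUCTION_DB_PASSWORD'] %>
      pool: 5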

I'm trying to migrate SQL Server Express to MySQL

I have a somewhat small database in SQL Server Express 2005 that I really need to migrate over to a MySQL install on my hosting service (DreamHost). After reading for a couple of days, everything pointed to the MySQL Migration Toolkit, which is unfortunately EOL. I was able to find an archive and install it on my server running SQL Server. I set the source database, and set my DreamHost MySQL as the destination. For whatever reason I get tons of permission errors trying to migrate, although the user I'm connecting to MySQL as has full permissions (I'm working with DreamHost on this).
Is there a better way to do this? I've heard that I should use some third-party tools (like dbtools), and then I heard NOT to use third-party tools.
Like I said the database is small, with a few views, a few functions, and a few stored procs, which I can manually move over if needed.
What are my options? Thank you!
Export your SQL Server database to a downloadable package (SSIS?)
Install SQL Server Express locally.
Install MySQL locally.
Run the Migration Toolkit locally.
Dump the MySQL database
Upload and run the dump file at DreamHost (via phpMyAdmin if possible); the dump-and-load commands are sketched below.
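Roughly, the last two steps might look like this (database names and credentials are placeholders, and DreamHost's MySQL hostname is whatever its panel shows you); the upload could also be done through phpMyAdmin's Import tab instead of the mysql client:

    mysqldump --user=root --password --routines my_local_db > my_local_db.sql
    mysql --host=mysql.example.dreamhost.com --user=dh_user --password dh_dbname < my_local_db.sql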
For such a small database you may spend more time trying to get a 3rd party tool to work for your situation than it would take you to just move the stuff manually. If you used standard SQL and little to nothing proprietary to SQL Server, creating the objects manually in MySQL should be easy enough... you just have to be aware of the slight syntax differences between the two platforms. Once the structures are created, generating insert statements to populate the data should also be trivial.
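As an illustration of the kind of syntax difference to watch for, here is the same (hypothetical) table defined on each platform:

    -- SQL Server
    CREATE TABLE customers (
        id INT IDENTITY(1,1) PRIMARY KEY,
        name NVARCHAR(100) NOT NULL,
        created_at DATETIME DEFAULT GETDATE()
    );

    -- MySQL equivalent: AUTO_INCREMENT instead of IDENTITY,
    -- TIMESTAMP DEFAULT CURRENT_TIMESTAMP instead of DATETIME DEFAULT GETDATE()
    CREATE TABLE customers (
        id INT AUTO_INCREMENT PRIMARY KEY,
        name VARCHAR(100) NOT NULL,
        created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    ) ENGINE=InnoDB;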

Remote backup of MySQL database

Our Java server application logs data to a SQL database, which may or may not be on the same machine. Currently we use MS SQL Server, and we're now porting to MySQL. A user configures database backup parameters on our app server, e.g. time of day to run a backup, and the app server executes SQL Server's BACKUP DATABASE command at the appropriate time, via a sproc. It does incremental backups daily and full backups weekly.
MySQL lacks an equivalent feature for telling the database, from a client connection, to back itself up. Options we're considering are:
Create a UDF to shell out to mysqldump (or copy database files), which can be called from our app server via a sproc. Essentially we'd be implementing a version of BACKUP DATABASE for MySQL.
Create a service to run on the MySQL box that can get the backup settings from the app server and run mysqldump (or file copy) locally.
Create a backup sproc to mimic mysqldump, e.g. SHOW CREATE TABLE and SELECT INTO OUTFILE for each table (sketched below).
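For the third option, the per-table statements would be along these lines (the customers table and the path are placeholders; note that INTO OUTFILE writes to the database server's filesystem, which is what we want here):

    SHOW CREATE TABLE customers;
    SELECT * INTO OUTFILE 'C:/backups/customers.txt'
        FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\r\n'
        FROM customers;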
Setting up a cron job, Perl script, third-party app or other tricks that'd work great in a data center aren't preferred; this is a shrink-wrap package that needs to be pretty robust and hands off.
Database sizes can range from roughly 10MB to 10GB.
I'm aware of the binary logs for the incremental piece. I figure the general solution will probably apply to them as well, if we decide to use them.
This is all on Windows 2003 32-bit or 2008R2 64-bit, MySQL 5.1.
The UDF option seems the best to me. The UDF Repository (http://www.mysqludf.org/) has mysqludf_sys, which may be all we need, but I thought I'd ask for opinions since after extensive googling it doesn't seem like others have reached the same conclusion, or maybe our needs are just out of the ordinary. Our app is the only thing in MySQL, so I'm not worried about other users having access to our UDF.
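For reference, the kind of call we have in mind with lib_mysqludf_sys installed is roughly this (credentials and the backup path are placeholders; the sproc would build the command string from the settings the app server passes in):

    -- runs mysqldump on the database host itself, triggered from a client connection;
    -- sys_exec returns the command's exit code
    SELECT sys_exec('mysqldump --user=backup_user --password=secret our_db > C:\\backups\\our_db.sql');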
Any solutions I'm overlooking? Any experience with using UDFs in such a way?
Thanks,
Eric
For this and other reasons we decided to colocate our application with the database, so this problem became moot.