I have a courses_material MySQL database. It has many tables, such as
courses, lessons, exercises, quizzes, exams, questions
I'm running my application on two servers: one is the TEST SERVER and the other is the PRODUCTION SERVER.
Basically, we create the material for a course XYZ on the test server and check it in the application for look and feel, course content, etc.
If everything is fine, we move the entire material of course XYZ to the production database.
The test server may also contain other courses.
I want to select a course and dump all the data of that course into a .sql file in the form of INSERT queries. Then I can run the SQL file on the production site.
I have to do this using PHP; I'm using the CodeIgniter (MVC) framework.
What is the best way to do it?
I need some suggestions.
CodeIgniter has a DB utility class with a backup function. You could try using that, or take its source and adapt it to your needs.
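As far as I know, the backup function dumps whole tables rather than filtered rows, so for a single course you may be better off building the INSERT statements yourself with insert_string(). A rough sketch, assuming each child table carries a course_id column (your real schema may differ; questions, for example, might hang off quizzes or exams instead):

```php
<?php
// Sketch of a CodeIgniter controller method, not production code.
// Assumes every child table has a course_id column -- adjust to your schema.
class Export extends CI_Controller
{
    public function course($course_id)
    {
        $this->load->database();
        $this->load->helper('download');

        // table => column that links the row to the course
        $tables = array(
            'courses'   => 'id',
            'lessons'   => 'course_id',
            'exercises' => 'course_id',
            'quizzes'   => 'course_id',
            'exams'     => 'course_id',
            'questions' => 'course_id',
        );

        $sql = '';
        foreach ($tables as $table => $fk) {
            $rows = $this->db->get_where($table, array($fk => $course_id))->result_array();
            foreach ($rows as $row) {
                // insert_string() builds a properly escaped INSERT statement
                $sql .= $this->db->insert_string($table, $row) . ";\n";
            }
        }

        // Offer the result as a downloadable .sql file
        force_download('course_' . $course_id . '.sql', $sql);
    }
}
```

On the production side you can then import the file with the mysql command-line client or phpMyAdmin.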
I'm kind of new to this kind of problem. I'm developing a web app and changing the DB design as I try to improve it and add new tables.
Until a few days ago we hadn't published the app,
so I could simply dump all the tables on the server and import my local version. But now we've passed version 1 and users are starting to use it,
so I can't just overwrite the server database, yet I still need to update the server DB's design whenever I publish a new version. What are the best practices here?
I'd like to know how I can manage the differences between the local and server databases in MySQL.
I need to preserve the data on the server and only change the design; the data in the local DB is only for testing.
Before this, all my apps were small and I would change a single table or column by hand, but I can't keep track of all the changes now, since I might revert many of them later, and coordinating all team members on this manually is impossible.
Assuming you are not using a framework that provides a database migration tool, you need to keep track of the changes manually.
Create a folder sql_upgrades (or whatever name you like) in your code repository.
Whenever a team member updates the SQL schema, they create a file in this folder with the corresponding ALTER statements, and possibly UPDATE, CREATE TABLE, etc. So basically the file contains all the statements used to update the dev database.
Name the files so that they are easy to manage and statements for the same feature are grouped together. I suggest something like YYYYMMDD-description.sql, e.g. 20150825-queries-for-feature-foobar.sql
When you push to production, execute the files to upgrade your SQL schema in production. Only execute the files that have been created since your last deployment, and execute them in the order they were created (a minimal runner sketch follows this answer).
Should you need to roll back a file, check the queries it contains and write queries to undo what was done (drop added columns, re-create dropped columns, etc.). Note that this is "non-trivial", as many changes cannot be fully rolled back (e.g. you can re-create a dropped column, but the data it held is lost).
Many web frameworks (such as Ruby on Rails) have tools that do exactly this process for you, usually in tandem with the ORM provided by the framework. Keeping track of the changes manually in SQL works just as well.
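For the manual workflow above, a minimal PHP runner could look like the following; the applied_migrations bookkeeping table, the connection details, and the naive statement splitting are all my own assumptions:

```php
<?php
// apply_migrations.php -- run once per deployment. A sketch, not a full tool.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'dbuser', 'dbpass',
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

// Bookkeeping table so the same file is never applied twice.
$pdo->exec('CREATE TABLE IF NOT EXISTS applied_migrations (
    filename VARCHAR(255) PRIMARY KEY,
    applied_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP)');

$applied = $pdo->query('SELECT filename FROM applied_migrations')
               ->fetchAll(PDO::FETCH_COLUMN);

$files = glob(__DIR__ . '/sql_upgrades/*.sql');
sort($files); // the YYYYMMDD prefix makes lexical order chronological

foreach ($files as $file) {
    $name = basename($file);
    if (in_array($name, $applied, true)) {
        continue; // already applied to this database
    }
    echo "Applying $name\n";
    // Naive split on ';' -- fine for plain ALTER/UPDATE files, but it will
    // break on semicolons inside string literals or stored routines.
    foreach (array_filter(array_map('trim', explode(';', file_get_contents($file)))) as $stmt) {
        $pdo->exec($stmt);
    }
    $pdo->prepare('INSERT INTO applied_migrations (filename) VALUES (?)')
        ->execute(array($name));
}
```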
We would like to be able to publish FileMaker data on our WordPress website. The website is up and running and the FileMaker database is set up. We do not need a live connection between the two systems, so we chose to export the FM data to .csv, import it into the MySQL database on the server, and display it on the website from there.
Now for my questions, since this kind of development is new to us:
Can I set up an automated import into the MySQL database from a source like Dropbox or something similar? For example, can we make MySQL import and overwrite the existing data every 24 hours from a .csv file located somewhere? We need this automated overwrite option because the FM data changes often and we need up-to-date info on the website.
How can we display the data from the MySQL database on the WP frontend?
I've been looking into this myself and couldn't find any clear answers or guides. Can you point me in the right direction?
(By the way, I know there are table plugins I can use for WP, but they do not fulfill our needs, and I think it's exciting to do it all ourselves with help from this great community.)
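To make the first question more concrete, here is the rough shape of the nightly import we have in mind (to be run from cron); the file path, table, and column names are just placeholders:

```php
<?php
// nightly_import.php -- run via cron, e.g.: 0 3 * * * php /path/to/nightly_import.php
// Placeholder path: wherever the FM export lands (a synced Dropbox folder, etc.)
$csvFile = '/home/user/Dropbox/fm_export/products.csv';

$pdo = new PDO('mysql:host=localhost;dbname=wp_site;charset=utf8mb4', 'dbuser', 'dbpass',
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

// Rebuild the table from scratch so the site always mirrors the latest export.
$pdo->exec('TRUNCATE TABLE fm_products');

$insert = $pdo->prepare('INSERT INTO fm_products (sku, name, price) VALUES (?, ?, ?)');

$handle = fopen($csvFile, 'r');
fgetcsv($handle); // skip the header row
$pdo->beginTransaction();
while (($row = fgetcsv($handle)) !== false) {
    $insert->execute($row); // column order must match the FM export
}
$pdo->commit();
fclose($handle);
```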
Update 01
I've successfully connected FM to my MySQL DB using ODBC and can now select tables from the MySQL DB in FM's relationships graph.
I was wondering how I can write the data from my existing FM file to the MySQL DB using ODBC. Can anybody help me with this?
I would like to get the data into some MySQL tables so I can fetch it using PHP on my website.
Thanks!
It is possible to write directly into (and read from) a remote MySQL database from FileMaker via ODBC.
You need a MySQL account that allows remote access; there are providers that do not allow this.
The ODBC driver needs to be installed on the local box. On Windows you can use the open-source version (http://dev.mysql.com/downloads/connector/odbc/); on Mac it works better with the Actual Technologies drivers (http://www.actualtech.com/de/product_opensourcedatabases.php).
Set up an ODBC system DSN (not a user DSN). Be sure to use the 32-bit ODBC manager on Windows.
Now you can create the external data source within FileMaker and read from and write to MySQL tables.
Once you have made the connection to the MySQL database and you can see the shadow tables, you can write to the fields directly via FileMaker layouts. It's as simple as that.
Once the layout contains the fields from the MySQL database, you can move through records and perform finds, all as if the data were native to your FM database. Of course, for more automated processing, you can create scripts, relationships, etc. and manipulate/synchronise the data. Be warned, though: the connection speed can limit complex relationships and large databases. I would advise baby steps.
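As for getting the data onto the website once it is in MySQL: inside WordPress you can reuse the existing connection via $wpdb. A minimal sketch; the fm_products table name is an assumption, and the table must live in the WordPress database:

```php
<?php
// Inside a WordPress template or shortcode handler.
// fm_products is a placeholder table name in the WordPress database.
global $wpdb;
$rows = $wpdb->get_results('SELECT sku, name, price FROM fm_products ORDER BY name');

echo '<table>';
foreach ($rows as $row) {
    printf('<tr><td>%s</td><td>%s</td><td>%s</td></tr>',
        esc_html($row->sku), esc_html($row->name), esc_html($row->price));
}
echo '</table>';
```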
I am trying to synchronize two different types of databases. Here is a better explanation of what I am trying to do:
I have a MySQL database on a server holding the main database, and an application installed on multiple computers, each with its own local copy.
I need the main database to be updated with the modifications made on the computers, and I need the computer versions to receive the updates from the main database.
I have looked at MySQL's replication option, but it's not exactly what I want to do. I have seen other things like REPLACE INTO, but I still don't see a clear solution.
I'm not asking for a full solution, but maybe some good pseudocode or a useful technique I can try to implement. This will be used in my end-of-school project.
I obviously have a timestamp on each row, so I can detect changes.
This is how I would do it for an app that has to run in different SQL environments.
It is cleaner to route the communication between your app and the DB through a PHP class. The class instance is created based on the user's SQL version; the code in the class then decides how to connect to the DB.
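A hedged sketch of one sync cycle for a single table, using the row timestamps you already have; the updated_at column, the globally unique ids, and the use of REPLACE INTO are all assumptions, and conflict resolution is deliberately left out:

```php
<?php
// One sync cycle for a single table. A sketch under several assumptions:
// every row has a globally unique primary key and an updated_at column.
function syncTable(PDO $local, PDO $server, string $table, string $lastSync): void
{
    // 1. Push local changes made since the last sync up to the server.
    $stmt = $local->prepare("SELECT * FROM $table WHERE updated_at > ?");
    $stmt->execute(array($lastSync));
    $stmt->setFetchMode(PDO::FETCH_ASSOC);
    foreach ($stmt as $row) {
        upsert($server, $table, $row);
    }

    // 2. Pull server-side changes down to this computer.
    $stmt = $server->prepare("SELECT * FROM $table WHERE updated_at > ?");
    $stmt->execute(array($lastSync));
    $stmt->setFetchMode(PDO::FETCH_ASSOC);
    foreach ($stmt as $row) {
        upsert($local, $table, $row);
    }

    // Conflicts (the same row changed on both sides since $lastSync) still
    // need a rule, e.g. "latest updated_at wins" -- not handled here.
}

function upsert(PDO $db, string $table, array $row): void
{
    $cols = array_keys($row);
    $placeholders = implode(',', array_fill(0, count($cols), '?'));
    // REPLACE INTO deletes and re-inserts the row; simple, but watch out
    // for ON DELETE CASCADE foreign keys.
    $sql = "REPLACE INTO $table (" . implode(',', $cols) . ") VALUES ($placeholders)";
    $db->prepare($sql)->execute(array_values($row));
}
```

After a successful cycle, store the current server time as the new $lastSync value so the next run only picks up newer changes.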
My brother uses an application called Kennel Connection - http://www.kennelconnection.com/screen.html that has a Microsoft Access backend.
He wants me to create some reports and display them on a webpage. I know some PHP but freely (well, maybe not freely) admit that I use Dreamweaver as a crutch to speed along my PHP development. Most of my PHP experience is with a MySQL backend.
The database has about 20 tables, though only about 5 are used on a regular basis. Is there any reason I couldn't import those 5 tables into MySQL and create linked tables inside the application pointing to those MySQL tables? In the short term I would port just the key tables, with the longer-term goal of porting all the tables to MySQL.
My biggest concern is that I don't know whether the application would have any hiccups writing to a linked MySQL table rather than an internal Access table. Does anyone have any experience with this?
You would probably have to rewrite some queries at the very least. It does not seem likely that you need real-time data, and it is not difficult to write a query that updates a MySQL table from MS Access. This could be run either through Task Scheduler or manually.
I am a SQL Server developer, and my current assignment is a little different from what I have done in the past. I found Stack Overflow very promising for my problem. I am working on a SQL Server 2005 database for my client's internal application, and the client also has a public-facing web application with a MySQL database. I do not have any details about this web application, but I have been assigned to update the MySQL database (on the public domain) from the SQL Server database (on the internal domain) daily, as an automated process. How can I achieve this through SQL Server?
You might want to try Pentaho Data Integration.
http://wiki.pentaho.com/display/EAI/Latest+Pentaho+Data+Integration+%28aka+Kettle%29+Documentation
The product allows you to speak to both data technologies (MSSQL + MySQL). You will find it similar to DTS, and you may be able to construct your solution with little to no code.
SSIS will do this just fine. The hard part is determining how you want to transform the data from one structure to the other (I assume the two table designs are not exactly alike).
Basically, you create a dataflow task, connect to the SQL Server for the source data and use a query to define what data you are going to copy, then apply any transformations needed to make the data fit the MySQL structure, and connect to a MySQL destination.
Repeat this process for the multiple data sets you want to send to different places.
Once the SSIS package is done, set up configurations so that you can run the package on the production server (you will want to test development-to-development first, of course!), then schedule the package to run at an appropriate time.
Depending on how different the two databases are and how much data you need to move, this can be a relatively simple process or a very complicated one.