I have two copies of the same application running on dev.example.com and beta.example.com, with different databases. Everything is set up to run under Passenger, with Apache as the web server.
What I did was copy the code from one directory to the other (myapp_dev and myapp_beta), and everything seemed to work fine until I had to migrate a table. I get an error saying the migration I want to run has already been run, so both apps are probably migrating against the same database. Maybe I have to reconfigure the way the applications run, but I don't know what to change. Any hints are appreciated.
Thanks!
You can change which database each application uses in its config/database.yml.
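For example (the paths and database names below are placeholders for wherever the two copies are deployed), each copy's config/database.yml should name a different database, and each copy is then migrated on its own:

    # hypothetical paths -- adjust to where the two copies actually live
    grep 'database:' /var/www/myapp_dev/config/database.yml    # e.g. myapp_dev_production
    grep 'database:' /var/www/myapp_beta/config/database.yml   # e.g. myapp_beta_production

    # once the two files point at different databases, migrate each app separately
    cd /var/www/myapp_dev  && RAILS_ENV=production rake db:migrate
    cd /var/www/myapp_beta && RAILS_ENV=production rake db:migrate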
What I want to do is migrate my tables and data from phpMyAdmin/MySQL on a local server to a production server.
For example, if I make a change to the database locally, how do I also push those changes to the production or testing server so they are reflected there? Do Laravel's migration functions handle this in a handy way? I know this can be difficult with WordPress, so I'm wondering whether Laravel has a handy tool that can do this.
Thanks.
Hi IneedToAskQuestions
Have a look at migrations, as suggested by #Devon and #Peter:
https://laravel.com/docs/5.7/migrations
I use migrations to change data in the database the same way you would in a controller.
It might not be best practice, but it works well.
Hope this helps.
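For example (the migration name and table below are hypothetical), the usual flow is to create and run the migration locally, deploy the new migration file along with the rest of your code, and then run the migrate command on the production or testing server. Laravel records in each database which migrations have already run, so only the new ones are applied there:

    # on your local machine
    php artisan make:migration add_status_to_orders_table --table=orders
    php artisan migrate

    # after deploying the code (including the new migration file) to production/testing
    php artisan migrate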
I am working at a company that has some CRM software running on a remote Windows XP server, which uses a SQL Anywhere 9 database to store its data; I have access to this remote server with an administrator account.
I would like to extract the database into a .sql file so that I can run it locally on my machine without affecting the running database on the server (since it is key to the company's day-to-day operation).
The reason I need this is that we are going to test some BI Software and we need data from this database to test it, but we don't know the structure of the database since the developers of the CRM software didn't give us any documentation on it. So we need to have the database locally so that, without affecting the running CRM, we can:
understand the structure by looking at the DDL
make queries to it to get sample data
I researched a bit, and the most common solution to my problem was to use dbunload on the remote server to unload the db into a reload.sql file that contained what I needed. But most tutorials on the subject mention that I have to stop the db first (which would be catastrophic). If this is the only option, then I guess I am willing to do it on the weekend when the CRM is not used, but I wanted to know if there was another solution first.
If there is no other solution, can you point me to where I can find the proper and safer way to do this?
I have researched a lot, but before today I had never even heard of SQL Anywhere, so I really need all the help I can get. My main concern is doing something that negatively impacts the CRM software.
Thank you.
You can run dbunload across the network; you just have to tell it to do an "external" unload. The default is an internal unload, which only works from the machine where the database server is running.
I don't have SQL Anywhere 9 documentation right now to look up the exact switch, but dbunload -? should show you all the possible switches.
Edit:
-an will create a new database and load the schema and data from another database.
-xi will do an external unload and an internal reload.
-c specifies the parameters used to connect to your remote database.
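A rough sketch of what that command line can look like, using the switches listed above (the engine name, database name, credentials, and host are placeholders; run dbunload -? on your version to confirm the exact switch spelling):

    # unload the remote database (via -c) and reload it into a new local database file (via -an)
    dbunload -c "ENG=crm_eng;DBN=crm;UID=dba;PWD=sql;LINKS=tcpip(HOST=remote-xp-server)" \
             -an local_copy.db

Depending on the version, you may need -xi instead of, or in addition to, -an; dbunload -? will tell you.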
I'm trying to migrate a web site to a friend's server. The site uses MySQL, but he didn't previously have it set up. He has installed the package now and suggested that I could run my own instance of it. I'm at a loss for how to do so. Is it even possible? If so, how?
Some relevant information:
The OS in use is CentOS 5.9
Using MySQL 5.0.95
I only need the DB to be read locally via PHP when serving my site.
I have no root privileges on this system (although I do have a full shell), but I am close friends with the owner/administrator if that's necessary.
To clarify:
It's the daemon that I want to run my own instance of. So I guess what I want to know is if it's possible to have multiple users on the system running their own instances of mysqld containing different databases.
All I need this for is serving a web page. If I have to break down and switch to a pseudo-database using CSV files I will, but I'd much rather stick to MySQL if I can.
If MySQL is set up on that machine, then yes, you can run your own instance of it.
You can have it set up so that your site is given its own database within that MySQL instance, and a dedicated MySQL account can be used to access the tables and everything else involved with your website.
If he is only installing MySQL on the machine and having you set everything up, you will just need a MySQL account; from there you can log in and create all the items your site needs to fully function.
EDIT
In response to your comment: you can, you just can't bind to the same port or use configurations that conflict with each other. As long as it isn't too much of a performance hit and you configure a different user to start each instance, you should be able to do something like the sketch below.
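A rough sketch of what that can look like on a shared machine, assuming the MySQL 5.0 server binaries are already installed system-wide and on your PATH (all paths and the port below are placeholders; pick ones you're allowed to use):

    # initialize a private data directory under your home directory
    mysql_install_db --datadir=$HOME/mysql/data

    # start your own mysqld with its own data dir, socket, port and pid file,
    # so it doesn't clash with the system instance
    mysqld_safe --datadir=$HOME/mysql/data \
                --socket=$HOME/mysql/mysql.sock \
                --port=3307 \
                --pid-file=$HOME/mysql/mysqld.pid &

    # connect to your instance through its socket rather than the system default
    mysql --socket=$HOME/mysql/mysql.sock -u root

Your PHP code would then connect through that socket (or that port) instead of the default one.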
I'm having a lot of fun with Ruby, creating some basic web apps. When looking at the logs of the Rails server in the terminal, I see MySQL queries.
Refinery::User Load (0.2ms) SELECT "refinery_users".* FROM "refinery_users" WHERE "refinery_users"."id" = 1 LIMIT 1
These relate to databases that I've created, but where do these databases exist? In the Rails server? Where is the Rails server stored on OS X? Can I browse what's inside them, specifically the databases?
Thanks, I know this doesn't have much practical use, but I want to understand the concepts behind what's going on, rather than just having superficial knowledge.
By default, Rails uses SQLite3. The database files are stored in the /db directory in the root of your app. There should be a file called development.sqlite3.
To access that database, open a terminal session, go to the root directory of your app and enter sqlite3 db/development.sqlite3. More info on the sqlite shell here: http://www.sqlite.org/sqlite.html
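For example, a quick look around might go like this, run from the root of the app (refinery_users is just the table from the log line above); the dot commands list the tables, print the DDL for one of them, and then exit:

    sqlite3 db/development.sqlite3
    .tables
    .schema refinery_users
    SELECT * FROM refinery_users LIMIT 5;
    .quit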
Rather than messing around in the SQLite shell, I think you'd be better off 1) looking at /db/schema.rb to see the structure of your database and 2) using rails console to look at the data.
If you want to know, for any given Rails app, what database it's using, look at /config/database.yml. That gives you the adapter, name of the database, location, etc.
Also, SQLite is generally just for kicking off development. I wouldn't recommend using it once your schema starts getting more complex. Personally, I never use it; I immediately set up a MySQL database for any new Rails project.
I am kind of an amateur at web development, but it seems like most people develop on their local machines and then upload to their remote servers when everything is ready. I want to start doing this. I've installed XAMPP (Apache) on my local machine, but for this to really work, I need the MySQL databases that already exist on my remote server to be "synched" or "duplicated" on my local machine. I'm finding this somewhat hard to figure out.
First, should I be using MySQL's replication feature, with my remote server as master and my local machine as slave? Or is there a better way to do this? Should I be synchronizing instead of replicating?
Second, is anyone willing to give me a quick description of how to achieve this replication or synchronization?
Thanks
It may be tempting, but it's hopeless to try to keep the database in two places. Instead, always keep the database on the server, because it's much easier to develop and debug your code if it's in just one place. Resyncing and replicating and all that business is just too much trouble, as you are discovering. The DB is going to end up on the server anyway, so you may as well put it there right now.
Also, you will not need a web server on your local machine, which will unburden you.
This next part applies if you are writing CGI. If you aren't sure whether or not you are writing CGI, then you are not (well, probably not). {
If you maintain just one database and it lives on your server, you'll be able to write one piece of portable code that will run equally well on your local machine and on your server. This is a huge win, take my word for it.
To get this working, you will need the MySQL client library on your local machine; no other MySQL component is needed there. The MySQL server will run on your server only.
Read up on the MySQL "connector" for the language you're using; a quick way to check the connection from your local machine is sketched below.
}
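As a quick sanity check (the host, user, and database name below are placeholders), you can confirm from your local machine that you can reach the database on your server before pointing your local code at it; the MySQL account does need to be allowed to connect from remote hosts:

    mysql -h www.example.com -u myuser -p mydatabase -e "SHOW TABLES;"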