How to copy only the content of a Drupal installation (migration)? - mysql

I have been told to upgrade a Drupal website from a very old version (probably 4) to the newest one (7). I tried different approaches, without success, due to various incompatibilities. I guess the best way to do that is to freshly install Drupal 7, copy the template over, and then the database. However, if I just copy the database from the old version it probably won't work, because even the table names are different. So my question is: is there a way to copy somehow only the nodes and upload them to the new installation? Or maybe some other way? Has anybody approached this problem in the past? Any advice will be appreciated.

There is a very similar question here: Is it possible to upgrade Drupal from a very old version (4) to the newest one (7)?

Related

How to setup Git with Mediawiki?

I recently made my own personal MediaWiki and I would like it to be available on different computers. I set it up with XAMPP, so currently, what I did was make two repositories:
one for xampp\htdocs\(my-wiki)
one for xampp\mysql\data\(my-sql-folder)
Then I cloned those repositories to the same folders on another computer. However, when I go to localhost/(my-wiki) on that computer, I get the error "Sorry! This site is experiencing technical difficulties. (Cannot access the database)."
Whenever I make changes to the wiki, xampp\htdocs\(my-wiki) does not change at all, while xampp\mysql\data\(my-sql-folder) frequently shows edits. What am I doing wrong?
Edit: After looking at the internal error data, it appears that none of the tables in the wiki exist anymore (Table xxx doesn't exist in engine). I'm unsure of why this would be!
There are two things that change when you use a wiki: the uploads directory and the database, so for any sort of decentralized wiki you need to replicate both. Uploads are simple (you could use git, some shared central storage like NFS, or some decentralized file store - Wikipedia, for example, uses Swift). As for the database, there are a few experimental tools that use git as a storage engine (e.g. git-mediawiki), but nothing I would rely on. If your computers run all the time, you can use database replication, but that's not a beginner-level setup. In practice you'll probably be best off just using database dumps. Or buy a server on the internet (a decent VPS is pretty cheap these days) and use that as the wiki's DB backend so you can reach it from all your machines. (Or, at that point, you could just put the whole wiki on the internet.)
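A minimal sketch of the dump-based approach, assuming a database named wikidb and a MySQL user wikiuser (both placeholders for your own names). The commands are only built and printed here, so you can inspect them before running anything:

```python
# Sketch: sync a MediaWiki database between machines via dumps.
# "wikidb" and "wikiuser" are assumed names -- adjust to your setup.

def dump_command(db, user, outfile):
    """Build the mysqldump invocation that exports the whole wiki DB."""
    return ["mysqldump", "-u", user, "-p", "--single-transaction", db,
            "--result-file", outfile]

def restore_command(db, user):
    """Build the mysql invocation that re-imports the dump (feed the file on stdin)."""
    return ["mysql", "-u", user, "-p", db]

print(" ".join(dump_command("wikidb", "wikiuser", "wiki.sql")))
print(" ".join(restore_command("wikidb", "wikiuser")), "< wiki.sql")
```

Committing the resulting wiki.sql file to git (instead of the raw xampp\mysql\data folder) avoids the missing-ib_logfile problem entirely, since a dump is a plain, self-contained SQL file.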
Figured it out. I was missing the files ib_logfile0, ib_logfile1, and ibdata1 from the xampp/mysql/data folder. This, however, makes my Git setup even more awkward. If anybody has any suggestions for a better way to set up my wiki and make it available across different computers, it'd be much appreciated! Thanks

MySql databases corrupted after upgrade?

I have been dealing with this issue for a while now. For some reason, when I went to run an Ubuntu upgrade, the mysql-server package upgrade failed. This was on about 8/10. This had happened before due to a "DATADIR" link (I won't go into detail on that just now). I went through hell trying to get the package to upgrade, and eventually succeeded by creating a new MySQL database structure (after moving mine somewhere else). Once I did that (with some steps involved), the package upgrade completed.
Then, when I tried to replace the "new" databases with my old ones, the service wouldn't start. I came to find out that the "mysql" (system) database folder was just completely gone.
So, I took the "new" database and overlaid it on my "old" database files. This got me in! Of course, old users, and anything else in the system database, were gone. So I started to rebuild them.
The problem occurred when I tried to go into some old databases. About half of them report that the table does not exist when I try to load it. Mostly it is all of the tables in a particular database, but there are a few databases where some tables "don't exist" and others do.
The thing is that the tables do exist. I believe they are simply corrupt.
So, I'm really trying here, but I can't seem to figure out how to get all of the tables to load. I have a backup from the 13th, presumably made after the upgrade failed but before I really started messing with things. I'm going to try to use that, but if anyone knows how or why some tables are suddenly corrupted while others are not, and especially if someone knows how to fix this, that would be absolutely wonderful.
Unfortunately, my regular backups haven't been working for months, and the latest backup I currently have access to is 2 years old. Quite a bit has changed in the database since then, but as a last-ditch effort, I may try to import that data, run "mysql_upgrade" to restore it, then overlay any new databases I have created since then into the directory structure and see if they import that way.
Thanks for any suggestions you may offer.
--mobrien
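Before restoring anything, it may help to establish exactly which tables MySQL considers damaged. A minimal sketch using the standard mysqlcheck tool (the root credentials are placeholders; note that for InnoDB tables a "doesn't exist in engine" error usually points to a missing or mismatched ibdata1/tablespace file rather than something a repair pass can fix). The command is only built and printed here:

```python
# Sketch: build a mysqlcheck invocation that scans every database for
# corrupt tables. Credentials are placeholders; nothing is executed here.

def check_all_command(user):
    """mysqlcheck --check walks every table and reports the broken ones."""
    return ["mysqlcheck", "--all-databases", "--check", "-u", user, "-p"]

print(" ".join(check_all_command("root")))
```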
I believe this was due to a permissions issue that had some files locked, and when I fixed the permissions issue, the tables that had been accessible were corrupted. I restored the same backup again and this time it worked. The only folder that was missing was the "mysql" folder; for that I recreated a new one, patched it in, then created new user permissions for the existing tables. This was working, but then I ran into another issue, so I will open a new question for that. This has been a nightmare, and the moral of the story is: keep better backups and test them!

Joomla 2.5 core installation database problems

I'm in quite a bit of a bind. I've got a copy of Joomla 2.5 that I've been working with as a test environment for a site redesign. However, there was an issue, and one of the developers on my team "accidentally" (don't even get me started here) deleted a row from one of the installation's database tables. Now, all of the content edit screens are essentially broken.
I have NO IDEA what specifically was deleted, nor do I know from what table, and we don't have a workable backup from before this screw-up.
From the front end, nothing at all appears amiss (every page with every bit of content is just fine), so it was definitely a row from one of the core installation tables that was deleted.
Is there any (free) solution that anyone could suggest to remedy this situation? Would it be possible to simply copy the existing content from the database to a fresh installation, along with the core files (and if so, could someone shed some light on which tables we should be copying to the fresh installation)?
I don't think there is a ready-made solution to remedy your situation, but I guess you could do it by hand by following these steps:
Transfer all the new content tables to the new installation.
Pay attention to core Joomla tables such as #__extensions, #__assets, #__schema ...
Try to merge the old Joomla tables with the new ones.
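A minimal sketch of the first step, assuming the default table prefix jos_ and a content-table list that is only a guess (check your site's actual prefix and which tables it really uses). The mysqldump command is built and printed, not executed:

```python
# Sketch: build a mysqldump command that exports only the likely content
# tables from the old Joomla database. Prefix and table list are assumptions.

def content_dump_command(db, user, prefix, tables):
    """Dump just the named tables instead of the whole database."""
    return ["mysqldump", "-u", user, "-p", db] + [prefix + t for t in tables]

tables = ["content", "categories", "menu"]
print(" ".join(content_dump_command("old_joomla", "user", "jos_", tables)))
```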

Keeping a database structure up to date in a project where code is on subversion?

I have been working with Subversion for a while now, and it's been incredible for managing my projects, and it even helps with deployment to several different servers, but there is just one thing that still annoys me. Whenever I make any changes to the database structure, I need to update every server manually, and I have to keep track of every change I made; because some of my servers run branches of the project (modifications that are still being worked on, or that were made for different purposes), it's a bit awkward.
Until now, I've been using a "database.sql" file, which is a dump of the database structure for a specific revision. But it just seems like such a bad way to manage this.
And I was wondering, how does everyone else manage their MySQL databases when they're working on a project and using Subversion?
Here is what my team currently does:
We only have one branch: the trunk, which is where every developer checks in their changes.
When we want to release a new version of our solution, we create a new branch from the trunk (after stabilizing it a bit).
For each release, we also have a file to migrate the schema of our databases from version n-1 to version n. We also have a script to roll back from n to n-1. So when we start a new release, we create new migration & rollback files, which are committed to the trunk.
Thus we are able to rebuild the database corresponding to any version of our solution starting from any "version" of a given schema.
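The scheme described above can be sketched as a tiny migration runner. This uses sqlite3 purely for illustration (the real setup would keep the migrations as plain SQL files targeting MySQL), and the table and migration statements are made up:

```python
# Sketch of one-migration-per-release: apply scripts in order to bring
# a schema from its current version up to a target version.
import sqlite3

# In the real setup these would be files like migrate_1.sql, migrate_2.sql
# committed to the trunk, each paired with a rollback script.
MIGRATIONS = {
    1: "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)",
    2: "ALTER TABLE users ADD COLUMN email TEXT",
}

def migrate(conn, current, target):
    """Apply every migration between `current` (exclusive) and `target`."""
    for version in range(current + 1, target + 1):
        conn.execute(MIGRATIONS[version])
    return target

conn = sqlite3.connect(":memory:")
version = migrate(conn, 0, 2)
cols = [row[1] for row in conn.execute("PRAGMA table_info(users)")]
print(version, cols)  # prints: 2 ['id', 'name', 'email']
```

Storing the schema version in a one-row table inside the database itself (instead of passing `current` by hand) is the usual refinement, so any server can be brought up to date without knowing its history.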
Actually, we also had a lot of debates on this question and this is finally what we chose to do. But if you guys have some ideas to help us to improve, let us know :)
Liquibase might be something useful for you.
I've played around with it quite a bit, although not to the point of using it in anger.
Basically you define your database and scripts in its syntax, and it generates upgrade and from-scratch scripts for various databases for you.
It takes a bit of getting used to, but works quite well.

How to add a version number to an Access file in a .msi

I'm building an install using VS 2003. The install has an Excel workbook and two Access databases. I need to force the Access files to load regardless of the create/modification date of the existing databases on the user's computer. I currently use Orca to force in a version number on the two files, but would like to find a simpler, more elegant solution (hand-editing a .msi file is not something I see as "best practice").
Is there a way to add a version number to the databases using Access that would then be used in the install?
Is there a better way for me to do this?
#LanceSc
I don't think the MsiFileHash table will help here. See this excellent post by Aaron Stebner. Most likely the last-modified date of the Access database on the client computer will differ from its creation date, so Windows Installer will correctly assume that the file has changed since installation and will not replace it.
The right way to solve this (as question author pointed out) is to set Version field in File table.
Unfortunately setup projects in Visual Studio are very limited. You can create simple VBS script that would modify records in File table (using SQL) but I suggest looking at alternative setup authoring tools instead, such as WiX, InstallShield or Wise. WiX in my opinion is the best.
Since it sounds like you don't have properly versioned resources, have you tried changing the REINSTALLMODE property?
IIRC, in the default value of 'omus', it's the 'o' flag that only allows a file to be installed over an older version. You may try changing this from 'o' to 'e'. Be warned that this will overwrite missing, older AND equally versioned files.
Manually adding in versions was the wrong way to start, but this should ensure that you don't have to manually bump up the version numbers to get the files to install.
Look into Build Events for your project. It may be possible to rev the versions of the files during a build event. [Just don't quote me on that]. I am not sure if you can or not, but that would be the place I would start investigating first.
You should populate the MsiFileHash table for these files. Look at WiFilVer.vbs, which is part of the Microsoft Platform SDK, to see how to do this.
My other suggestion would be to look at WiX instead of Visual Studio 2003 for authoring installs. Visual Studio 2003 has very limited MSI support, and you can end up spending a lot of time fighting it rather than getting useful work done.