Joomla 2.5 core installation database problems - MySQL

I'm in quite a bind. I've got a copy of Joomla 2.5 that I've been working with as a test environment for a site redesign. However, there was an issue, and one of the developers on my team "accidentally" (don't even get me started) deleted a row from one of the database tables for the installation. Now all of the content edit screens are essentially broken.
I have NO IDEA what specifically was deleted or from which table, and we don't have a workable backup from before this screw-up.
From the front end, nothing at all appears amiss (every page with every bit of content is just fine), so it was definitely a core installation database row that was deleted.
Is there any (free) solution that anyone could suggest to remedy this situation? Would it be possible to simply copy the existing content from the database to a fresh installation, along with the core files (if so, could someone shed some light as to which tables we should be copying to the fresh installation)?

I don't think there is an automatic fix for your situation, but you could do it by hand by following these steps:
Transfer all of the content tables to the new installation.
Pay attention to the core Joomla tables (#__extensions, #__assets, #__schemas, ...).
Try to merge the old Joomla tables with the new ones.
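If you go that route, a rough sketch of the copy could look like this (the jos_ table prefix, database names and the exact table list are assumptions -- adjust them to your site, and remember that copied content also needs matching #__assets rows, which is why the tables above need attention):

    # Dump only the content-related tables from the broken installation.
    mysqldump -u root -p broken_joomla \
        jos_content jos_categories jos_menu jos_modules \
        > content-only.sql

    # Load them into the freshly installed Joomla 2.5 database.
    mysql -u root -p fresh_joomla < content-only.sql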

Related

How to properly update Docker AzerothCore with customizations to both code (scripts), modules and database (added quests, vendors, items)

I'm running AzerothCore-WotLK inside a Docker container.
I would like to update the server since I read there's an important security fix.
However I never updated the server since I first installed it last year (December 2019). Since then, I have customized the server in several ways:
I have customized a few boss scripts to work properly with two players.
I have installed a few modules, including one that also required some extra code to be compiled, and some SQL queries to be run.
I have modified the database myself, adding Quests, NPCs, Vendors and Items.
As such, I'm extremely concerned I would end up messing everything up. I would appreciate your assistance on how to update the server to the latest version while keeping all the customizations I have made.
I'm especially concerned about the database changes, as I figure I could back up the updated boss scripts, do a git pull and put them back before building (I should have made a fork; I didn't think of it at the time)...
But in any case, I would be extremely thankful if you could guide me step by step along the way, considering I am using a Docker installation.
For anything database-related I use HeidiSQL, so I could use that for any database procedure. I'm not very proficient with SQL queries, but I should be able to import .sql files as needed.
I realize I'm asking a lot, so please don't feel pressured to answer right away. I will be most thankful if you could help me whenever you have the chance.
Thank you for your time :)
I'll try to answer all points you mentioned:
1. The boss scripts.
The worst thing that can happen is that you get merge conflicts while pulling the latest changes with git, so you would have to resolve them manually. It's not necessarily difficult, especially in your case: boss scripts are by nature quite self-contained, so you can be fairly confident you won't break anything else while messing with them.
2. Modules
The modules should not be a problem at all. Modules exist exactly for this reason: they stay isolated and do not cause issues when you update the core.
My only concern here is the module that required a core change. I don't know which module you installed, but normally this shouldn't happen: a proper AzerothCore module should not require any core change.
However, again, the worst you can run into is some git merge conflicts, hopefully nothing too big (it depends on how large and invasive the changes required by the module were).
3. Custom database changes.
The golden rule is: always store your custom SQL queries somewhere, in a way that they can be easily re-applied. For example, always use DELETE before INSERT, prefer UPDATE when possible, etc...
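A minimal sketch of that rule (the acore_world database name is the usual AzerothCore default; the file name, NPC/item IDs and credentials are made up):

    # my_custom_changes.sql -- keep a file like this under version control; the
    # statements are written so that the file can be run any number of times:
    #
    #   -- delete our custom vendor rows before inserting them again
    #   DELETE FROM npc_vendor WHERE entry = 90000;
    #   INSERT INTO npc_vendor (entry, item) VALUES (90000, 2589), (90000, 2592);
    #   -- prefer UPDATE when tweaking an existing row (the entry is a placeholder)
    #   UPDATE creature_template SET minlevel = 80, maxlevel = 80 WHERE entry = 12345;

    # Re-apply the file to the world database whenever needed.
    mysql -u acore -p acore_world < my_custom_changes.sql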
So all you need to have is a file (or a bunch of files) containing all your SQL code corresponding to the custom changes you made. If you don't have it, you can still extract it from your DB.
Then you can always re-apply them after you update your core, if you feel it's needed. It might also be the case that you don't need to re-apply them at all. Or maybe you want to start from a fresh AzerothCore world database and re-apply your changes. This really depends on the specific case, but anyway you will be fine (as long as you keep your changes in SQL files).
You can use Keira3 to edit your database, or just extract your changes in case you need to. For example, you can open an entity and copy its "full query".
Backup first
Before starting the upgrade procedure, create a backup of:
your DB
the source files that you have modified (e.g. bosses, etc...)
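For example (the database names are the AzerothCore defaults; everything else is a placeholder, and if MySQL runs inside a container you would prefix the dump commands with docker exec <your-db-container>):

    # Dump the three AzerothCore databases before touching anything.
    mysqldump -u acore -p acore_auth       > backup_auth.sql
    mysqldump -u acore -p acore_characters > backup_characters.sql
    mysqldump -u acore -p acore_world      > backup_world.sql

    # Archive the source files you have modified (paths are examples).
    tar czf my_customizations.tar.gz src/server/scripts/ modules/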
Update frequently!
However I never updated the server since I first installed it last year (December 2019).
This is not recommended at all! You are supposed to update your AzerothCore frequently (at least once a week). There are a lot of good reasons to do so, one of them being that updating is much easier when you do it often.
How to update AzerothCore when using Docker
A generic question about updating AC with Docker has been asked already here: How to update azerothcore-wotlk docker container
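In short, the flow looks roughly like this (a hedged outline only -- service names and the exact steps depend on your compose setup, so follow the linked answer for details):

    cd azerothcore-wotlk
    git pull               # fetch the latest core; resolve any merge conflicts
    docker-compose build   # rebuild the images with the updated sources
    docker-compose up -d   # restart; pending SQL updates are applied on startup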

MySQL databases corrupted after upgrade?

I have been dealing with this issue for a while now. For some reason, when I went to run an Ubuntu upgrade, the mysql-server package upgrade failed. This was on about 8/10. It had happened before due to a "DATADIR" link (I won't go into detail on that just now). I went through hell trying to get the package to upgrade, and eventually managed it by creating a new MySQL data directory structure (after moving mine somewhere else). Once I did that (with some steps involved), the package upgrade completed.
Then, when I tried to replace the "new" databases with my old ones, the service wouldn't start. I came to find out that the "mysql" (system) database folder was just completely gone.
So I took the "new" database and overlaid it on my "old" database files. This got me in! Of course, the old users, and anything else in the system database, were gone, so I started to rebuild them.
The problem occurred when I tried to go into some of the old databases. About half of them report that a table does not exist when I try to load it. Usually it is all of the tables in a particular database, but there are a few databases where some tables "don't exist" and others do.
The thing is that the tables do exist. I believe they are simply corrupt.
So I'm really trying here, but I can't seem to figure out how to get all of the tables to load. I have a backup from the 13th, presumably after the upgrade failed but before I really started messing with things, and I'm going to try to use that. But if anyone knows how or why some tables are suddenly corrupted while others are not, and especially if someone knows how to fix this, that would be absolutely wonderful.
Unfortunately, my regular backups haven't been working for months, and the latest backup I currently have access to is two years old. Quite a bit has changed in the database since then, but as a last-ditch effort, I may try to import that data, run "mysql_upgrade" to restore it, then overlay any databases I have created since then into the directory structure and see if they import that way.
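For reference, the kind of commands I plan to try first (as far as I understand them; credentials are placeholders):

    # See which tables MySQL itself thinks are damaged.
    mysqlcheck -u root -p --all-databases --check

    # Attempt an in-place repair (this mainly helps MyISAM tables; InnoDB tables
    # that "don't exist" usually point at missing .frm/.ibd files instead).
    mysqlcheck -u root -p --all-databases --auto-repair

    # After importing the old dump, bring the system tables up to date.
    mysql_upgrade -u root -p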
Thanks for any suggestions you may offer.
--mobrien
I believe this was due to a permissions issue that had some files locked, and when I fixed the permissions issue, the tables that had been accessible were corrupted. I restored the same backup again and this time it worked. The only folder that was missing was the "mysql" folder; for that, I created a new one, patched it in, and then created new user permissions for the existing tables. This was working, but then I ran into another issue, so I will open a new question for that. This has been a nightmare, and the moral of the story is: keep better backups and test them!
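For anyone hitting the same thing, the permissions part boiled down to roughly this (Ubuntu default paths; adjust to your datadir, and the user/grant names are just examples):

    # Make sure the whole data directory belongs to the mysql user again.
    sudo chown -R mysql:mysql /var/lib/mysql
    sudo service mysql restart

    # Then recreate the application users/grants that lived in the lost
    # "mysql" schema.
    mysql -u root -p -e "CREATE USER 'appuser'@'localhost' IDENTIFIED BY 'secret';
      GRANT ALL PRIVILEGES ON mydb.* TO 'appuser'@'localhost';
      FLUSH PRIVILEGES;"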

How to copy only the content of a Drupal installation (migration)?

I have been told to upgrade a Drupal website from a very old version (probably 4) to the newest one (7). I tried different approaches, without success, due to various incompatibilities. I guess the best way to do it is to freshly install Drupal 7, copy the template and then the database. However, if I just copy the database from the old version it probably won't work, because even the table names are different. So my question is: is there a way to do that, i.e. to somehow copy only the nodes and upload them to the new installation? Or maybe some other way? Has anybody approached this problem in the past? Any advice will be appreciated.
There is a very similar question here: Is it possible to upgrade Drupal from very old version(4) to the newest one(7)?

Keeping a database structure up to date in a project where code is on Subversion?

I have been working with Subversion for a while now, and it's been incredible for managing my projects, and even for managing deployment to several different servers, but there is one thing that still annoys me. Whenever I make any changes to the database structure, I need to update every server manually and keep track of every change I made; and because some of my servers run branches of the project (modifications that are still being worked on, or that were made for different purposes), it's a bit awkward.
Until now, I've been using a "database.sql" file, which is a dump of the database structure for a specific revision. But it just seems like such a bad way to manage this.
And I was wondering, how does everyone else manage their MySQL databases when they're working on a project and using Subversion?
In my team here is what we currently do:
We only have one branch: the trunk, which is where every developer checks in their changes.
When we want to release a new version of our solution, we create a new branch from the trunk (after stabilizing it a bit).
For each release, we also have a file to migrate the schema of our databases from version n-1 to version n, along with a script to roll back from n to n-1. So when we start a new release, we create new migration and rollback files, which are committed in the trunk.
Thus we are able to rebuild the database corresponding to any version of our solution starting from any "version" of a given schema.
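To make that concrete, a sketch of what such a pair of files and their use might look like (file names, table and credentials are illustrative):

    # Committed to the trunk for release 1.5 (contents kept deliberately tiny):
    #   db/migrate_1.4_to_1.5.sql  -> ALTER TABLE customer ADD COLUMN loyalty_points INT NOT NULL DEFAULT 0;
    #   db/rollback_1.5_to_1.4.sql -> ALTER TABLE customer DROP COLUMN loyalty_points;

    # Upgrading a server's schema from 1.4 to 1.5:
    mysql -u app -p app_db < db/migrate_1.4_to_1.5.sql

    # Rolling the same server back to 1.4:
    mysql -u app -p app_db < db/rollback_1.5_to_1.4.sql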
Actually, we also had a lot of debates on this question and this is finally what we chose to do. But if you guys have some ideas to help us to improve, let us know :)
Liquibase might be something useful for you.
I've played around with this quite a bit, although not to the point of using it in anger.
Basically you define your database and scripts in its syntax, and it generates upgrade and from-scratch scripts for various databases for you.
Takes a bit of getting used to, but works quite well.
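To give a flavour, here is a tiny changelog in Liquibase's "formatted SQL" style and how you would apply it; everything below (names, credentials, URL) is a placeholder, and depending on the Liquibase version you may also need --driver and --classpath pointing at the MySQL JDBC jar:

    # changelog.sql -- a changelog in Liquibase's "formatted SQL" flavour
    # (XML is the more common format) might contain:
    #   --liquibase formatted sql
    #   --changeset alice:1
    #   CREATE TABLE customer (id INT PRIMARY KEY, name VARCHAR(100));
    #   --rollback DROP TABLE customer;

    # Apply all pending changesets to a target database.
    liquibase --changeLogFile=changelog.sql \
              --url="jdbc:mysql://localhost/app_db" \
              --username=app --password=secret \
              update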

MySQL Version Control - Subversion

Wondering if it is possible to have version control for a MySQL database.
I realize this question has been asked before; however, the newest one is almost a year old, and at the rate things change...
The problem is that each developer has Apache/MySQL/PHP on their own computer and sometimes edits the database locally. It's rather inconvenient if they have to send an email to all the other developers and then manually edit the test server's database.
How do you deal with this problem?
Thanks
This is not a MySQL-specific solution in itself, but we've had a lot of success with a product called Liquibase (http://www.liquibase.org/).
It's a migration solution which covers many different database vendors, allowing all database changes to be coded in configuration files, all of which are kept in Subversion. Since all configuration is kept in XML files, it's easy to merge other people's changes into the mainline script and it plays well with tags and branches.
The database can be brought up to the current revision level by running the "update database" command. Most changes can also be rolled back, which can be helpful too. I would recommend making sure your working copy is current before running the migration, as that is likely the easiest approach.
Finally, when it comes to a production delivery, you can choose to have all the database changes output as a full SQL script so it can allow DBAs to run it and maintain a separation of duties.
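Concretely, that last step is Liquibase's updateSQL mode, roughly like this (connection settings and file names are placeholders):

    # Write the SQL for every pending changeset to a file instead of running it,
    # so a DBA can review and execute it during the production delivery.
    liquibase --changeLogFile=changelog.xml \
              --url="jdbc:mysql://prod-db-host/app_db" \
              --username=app --password=secret \
              updateSQL > release.sql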
So far, it's worked like a charm.
Well, we use Rails, which keeps all the changes in migration files. I know that a couple of PHP frameworks do the same thing - Symfony, for instance. So when all the changes are merged into our repository (we use Mercurial), we can see all the migrations that need to be, or were, applied to the development database. Then the person responsible for production rolls the code out after a full backup is made. However, if you don't use a PHP framework that takes care of this, then awied's suggestion sounds very interesting - I hadn't heard of Liquibase before, but I will definitely check it out.
There is a tool called iBatis, now called MyBatis, that handles database versioning very well.
It takes a little work to keep all your changes in scripts instead of using a graphical tool, but if you are familiar with coding, it's not a problem.
When you have multiple databases (like dev, test and prod), you just create three environment files, and you can update one environment with a single command-line instruction.
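For illustration, the command line of MyBatis Migrations (the schema-migration sub-project) looks roughly like this; the environment names below are just the conventional ones:

    # ./environments holds one properties file per target database, e.g.
    # development.properties, test.properties and production.properties.
    migrate --env=development status   # list applied and pending scripts
    migrate --env=test up              # apply every pending script to test
    migrate --env=production up        # the same single command for production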