How to set up Git with MediaWiki? - mysql

I recently made my own personal MediaWiki and I would like it to be available on different computers. I set it up with XAMPP, so what I did was make two repositories:
one for xampp\htdocs\(my-wiki)
one for xampp\mysql\data\(my-sql-folder)
Then I cloned those repositories to the same folders on another computer. However, when I go to localhost/(my-wiki) on that computer, I get the error "Sorry! This site is experiencing technical difficulties. (Cannot access the database)."
Whenever I make changes to the wiki, xampp\htdocs\(my-wiki) does not change at all, while xampp\mysql\data\(my-sql-folder) frequently shows edits. What am I doing wrong?
Edit: After looking at the internal error data, it appears that none of the tables in the wiki exist anymore ("Table xxx doesn't exist in engine"). I'm not sure why this would happen!

There are two things that change when you use a wiki: the uploads directory and the database, so for any sort of decentralized wiki you need to replicate both. Uploads are simple (you could use git, some shared central storage like NFS, or a decentralized file store - Wikipedia, for example, uses Swift). As for the database, there are a few experimental tools that use git as a storage engine (e.g. git-mediawiki), but nothing I would rely on. If your computers run all the time, you can use database replication, but that's not a beginner-level setup. In practice you'll probably be best off just using database dumps. Or buy a server on the internet (a decent VPS is pretty cheap these days) and use that as the wiki's DB backend so you can reach it from all your machines. (Or I guess you can just put your whole wiki on the internet at that point.)
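If you go the dump route, a minimal sketch of the workflow might look like this (the repo path, database name, and passwordless XAMPP root account are all hypothetical placeholders):

```python
import pathlib
import subprocess

# Rough sketch of the dump-based approach: export the wiki database as
# plain SQL text inside the git repo, then commit the snapshot.
REPO = pathlib.Path("my-wiki-backup")
DUMP_FILE = REPO / "wiki.sql"

# mysqldump writes text, which git can diff and store efficiently.
dump = subprocess.run(
    ["mysqldump", "-u", "root", "my_wiki"],
    capture_output=True, check=True)
DUMP_FILE.write_bytes(dump.stdout)

subprocess.run(["git", "-C", str(REPO), "add", "wiki.sql"], check=True)
subprocess.run(["git", "-C", str(REPO), "commit", "-m", "wiki snapshot"],
               check=True)
```

On the other machine you would pull and load the dump with `mysql -u root my_wiki < wiki.sql`. Recreating the tables from SQL this way also sidesteps the missing-ibdata1/ib_logfile problem you hit when copying raw InnoDB data files around.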

Figured it out. I was missing the files ib_logfile0, ib_logfile1, and ibdata1 from the xampp/mysql/data folder. This, however, makes my Git setup even more annoying. If anybody has any suggestions for a better way to set up my wiki and make it available across different computers, it'd be much appreciated! Thanks

Related

MySQL & GIT - Is it possible to use GIT versioning to merge MySQL databases? [duplicate]

I am a WordPress Designer/Developer, who is getting more and more heavily involved with using version control, notably Git, though I do use SVN for some projects. I am currently using Beanstalk for my remote repo.
Adding all of the WordPress files to my repo is no problem. If I wanted to, I know I could .gitignore the wp-config file, but since I'm currently the only developer and these projects are closed source, it really makes little sense.
WordPress relies heavily on the database, as any CMS does, to keep textual content and many settings, depending on the specific plugin/theme configuration I'm using. I'm wondering what the best way of using version control on the database would be, if it's even possible. I guess I could do a SQL dump, though my MySQL server is running on Windows (read as: I don't know how to do it), and then add the SQL dump to my repository. But when I push something live, that poses huge security threats.
Is there an accepted practice of doing this?
You can back up your database within a git repository. Of course, if you place the data into git in a binary form, you will lose all of git's ability to efficiently store the data using diffs (changes). So the number one best practice is this: store the data in a text serialised format.
mysqldump is a suitable program to help you do this. It isn't perfect, though. If anything disturbs the serialisation order of items (e.g. as a result of creating new tables), artificial breaks will enter the diff, which decreases the efficiency of storage. You could write a custom serialiser to serialise changes only -- but then you are doing the hard work that git is already good at. Just use the sql dump.
That being said, what you want to do isn't what devs normally mean when they talk about putting the database in git. For instance, if you read the link posted by @eggyal (link to codinghorror) you will see that what is actually placed in git are the scripts needed to generate the initial database. There may be additional scripts, like those to populate the database with a clean state, or to populate it with testing data. All such sql scripts are text files, and pretty much the same format as the sql dump you would get from mysqldump. So there's no reason you can't do it that way with your day-to-day data as well.
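For example, a couple of mysqldump options make the dump friendlier to git's line-based diffs. A sketch, assuming a hypothetical database "myblog_db" and a passwordless local root account:

```python
import subprocess

# --skip-extended-insert writes one INSERT per row, so a one-row edit
# becomes a one-line diff; --skip-dump-date omits the timestamp comment
# that would otherwise change on every dump.
cmd = [
    "mysqldump",
    "-u", "root",
    "--skip-extended-insert",
    "--skip-dump-date",
    "myblog_db",
]
with open("myblog_db.sql", "w") as out:
    subprocess.run(cmd, stdout=out, check=True)
```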
There is not much software available for version controlling databases like MySQL and MongoDB.
But one is under development, and the beta version is about to be launched soon. Check out Klonio - Version Control for databases.
The article How to Sync A Local & Remote WordPress Blog Using Version Control gives advice on how to automate syncing between two instances (development, production) of a WordPress blog using Mercurial. It mentions that for this scenario, Git and Mercurial are very similar.
Step 4 (Synchronizing The Databases) is of interest here.
The database content will be exported to a file that is tracked by revision control. Each time we pull changes, the database content will be replaced by this file, making our database up-to-date.
Then, it elaborates on conflicts and the scripting part of the job.
There is a Mercurial version control tutorial out there, if you're not familiar with it.
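The restore half of that sync is equally small. A sketch, with a hypothetical database name and tracked dump file, and Mercurial as in the article:

```python
import subprocess

# After pulling the latest tracked dump, replace the local database
# content with it, as described in Step 4 of the article.
subprocess.run(["hg", "pull", "-u"], check=True)   # or: git pull
with open("blog_db.sql", "rb") as dump:
    subprocess.run(["mysql", "-u", "root", "blog_db"],
                   stdin=dump, check=True)
```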
If you are only interested in keeping schema changes under version control, there is a nice tool called SqlRog. It extracts the schema into project files that can be put under git.
Be aware that WordPress stores all news feed content in the database, so even if you don't make any changes, there will be a lot of changing content.


Updating web content locally?

I'm really new to owning a website, I just bought mine today!
But I've found pretty fast that updating files can be a nuisance.
Make a change locally, save, re-upload.
I really prefer to code locally, as the IDE is much better. Otherwise I'd have no problem coding with the host's "notepad".
So, my question is: is there software of any kind that will sync your local files to those hosted on the web host? Or is this dependent on the web host?
I use WinSCP and tell it to upload changed files, so once I'm logged in, any changes I make locally are automatically uploaded. There are other FTP clients that do this as well.
You can use a file versioning system like SVN to download and update files, and push new versions. But that isn't really any different from what you are doing now, except that SVN offers the ability to keep track of previous versions and keeps people from overwriting each other's work. I think maybe you mean something more like "when I open and edit file.txt in my editor and save, it is automatically updated on the server"? If that is the case, there are a lot of editors that will let you open a file remotely and, when you save it, save it back remotely. The one I personally use is HTML-Kit.
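If you'd rather script the upload yourself than rely on a particular client, here is a rough sketch with Python's built-in ftplib (host, credentials, and paths are hypothetical placeholders):

```python
import os
from ftplib import FTP

HOST, USER, PASSWORD = "ftp.example.com", "user", "secret"
LOCAL_ROOT, REMOTE_ROOT = "site", "/public_html"

def upload_tree(ftp, local_root, remote_root):
    """Mirror the local directory tree onto the FTP server."""
    for dirpath, _dirnames, filenames in os.walk(local_root):
        rel = os.path.relpath(dirpath, local_root)
        remote_dir = (remote_root if rel == "."
                      else remote_root + "/" + rel.replace(os.sep, "/"))
        try:
            ftp.mkd(remote_dir)     # create the remote directory
        except Exception:
            pass                    # it probably already exists
        for name in filenames:
            with open(os.path.join(dirpath, name), "rb") as f:
                ftp.storbinary("STOR " + remote_dir + "/" + name, f)

ftp = FTP(HOST)
ftp.login(USER, PASSWORD)
upload_tree(ftp, LOCAL_ROOT, REMOTE_ROOT)
ftp.quit()
```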

Should I use registry or a flat file to save a program's state?

We have a lot of products that are saving their "states" on the registry.
What is the best practice on saving program states? What are the advantages/disadvantages of saving program states as a registry entry or saving program states to a flat file such as XML?
Thanks!
The obvious answer would be that storing those states in a normal file makes it easier for users to back up/restore the state manually.
Also consider that the registry has some keys that are special for each user in the system.
I think the registry is the best option for storing user-specific information that can be discarded and recovered easily (e.g., the last username used to log in). Other data should be in a settings file that can be backed up.
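On Windows, that kind of throwaway per-user value is a few lines with Python's winreg module (the key path and value name are made up for the example):

```python
import winreg  # standard library, Windows only

KEY_PATH = r"Software\ExampleCompany\ExampleApp"  # hypothetical key

def save_last_username(name):
    # CreateKey opens the key, creating it first if it doesn't exist.
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        winreg.SetValueEx(key, "LastUsername", 0, winreg.REG_SZ, name)

def load_last_username():
    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
            value, _type = winreg.QueryValueEx(key, "LastUsername")
            return value
    except FileNotFoundError:
        return None  # nothing saved yet -- safe to discard and recover

save_last_username("alice")
print(load_last_username())
```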
For years programmers had their app settings stored in config files. Then the times changed, and for years they used the registry instead - many of them used it badly, and it caused issues when Vista and its UAC came on the scene.
Nowadays, especially in the .NET world, Windows developers are moving back to storing stuff in config files again. Personally I think that is the best way: if you need to move your app to another machine, or reinstall your OS, all you have to do to retain your settings is save your config file.
There are things you may still want to store in the registry, though, such as (encrypted) licensing info. For everything else, config files are good. Do pay attention to UAC and file virtualisation, though, so that you don't run into trouble further down the track.
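A sketch of the config-file approach that stays on UAC's good side by writing under %APPDATA% instead of Program Files (the app and file names are hypothetical):

```python
import configparser
import os

CONFIG_DIR = os.path.join(os.environ.get("APPDATA", "."), "ExampleApp")
CONFIG_PATH = os.path.join(CONFIG_DIR, "settings.ini")

def save_settings(settings):
    os.makedirs(CONFIG_DIR, exist_ok=True)  # per-user, always writable
    parser = configparser.ConfigParser()
    parser["general"] = settings
    with open(CONFIG_PATH, "w") as f:
        parser.write(f)

def load_settings():
    parser = configparser.ConfigParser()
    parser.read(CONFIG_PATH)                # a missing file is skipped
    return dict(parser["general"]) if "general" in parser else {}

save_settings({"theme": "dark", "window_width": "1024"})
print(load_settings())
```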
Personally I'd go for the flat file.
(I am assuming that "registry" means the Windows registry?)
A flat file allows you (or even the user) to inspect and even manually modify the values.
Depending on your situation, this could be helpful for debugging, repairing mis-saved data, etc.
Unless you think you want the data to be "opaque" and therefore "hard to find/manipulate", the registry offers little in terms of benefits. Maybe it's faster, but if you have lots of state to save, you'd be better off using an embedded DB instead of a flat file.
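For that "lots of state" case, a sketch using Python's built-in SQLite (the table layout is made up for the example):

```python
import sqlite3

conn = sqlite3.connect("app_state.db")  # a single-file embedded DB
conn.execute(
    "CREATE TABLE IF NOT EXISTS state (key TEXT PRIMARY KEY, value TEXT)")

def set_state(key, value):
    with conn:  # the context manager commits (or rolls back) for us
        conn.execute("INSERT OR REPLACE INTO state VALUES (?, ?)",
                     (key, value))

def get_state(key, default=None):
    row = conn.execute("SELECT value FROM state WHERE key = ?",
                       (key,)).fetchone()
    return row[0] if row else default

set_state("last_open_file", r"C:\docs\report.txt")
print(get_state("last_open_file"))
```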
I used to follow Redmond doctrines. My programs used .INI files. Then I dutifully switched to the registry - and users started complaining. So, I bucked the trend and switched back to .INI files.
Some want to edit them (good/bad?). Some want to back them up, or transfer them to a new machine. Some don't want to lose them if they reinstall Windows.
As a user, I have multiple partitions: Windows/programs/data/swap (and a few others). No programs go onto c:\program files; they all go into the programs partition. No data which I can control goes into c:\user data; it all goes into the data partition (use the TweakUI PowerToy, or regedit, to change the defaults - but not all programs are well behaved and read the registry for those paths; some just hard-code them).
Bottom line - when Windows gets its panties in a fankle, I do a total re-install (approx. every three months), and I format the C: drive.
By formatting the Windows partition, I get a clean install. My data and programs are safe, though I may need to reinstall a few programs, which is why I go with portable versions wherever possible.
IMO, the registry is the biggest evil ever perpetrated on Windows - a single point of failure.
My advice? Locally stored config files. INI if the user is allowed to edit, serialized or binary format if not.
Or, you could offer a choice ...
Personally I go for a flat file, whether it's an INI file or XML file makes no difference to me. However in my line of work, we've had customers prefer the registry instead due to issues relating to deployment. It depends on who your client base is, and what the person keeping your product working prefers.
I always use regular files because it's much easier to develop =)
Simple file I/O vs. having to remember the registry read/write API.
Simple file copy/paste vs. exporting/importing keys for backups, and it's easy to keep multiple versions of a config around for testing.
Note that all of these advantages also translate into deployment strategies and everyday client usage of the configurations.
Depends how heavy deployment is. Most of my applications are XCopy-deployable, that is, they don't need an installer and can just be copied/unzipped. So I use .ini files (using my own INI file parser, as .NET has no built-in one).
However, if your application needs to be centrally manageable (for example, using Windows Group Policies) or if you have a "heavy" installer anyway, the registry is the prime choice. This is because applications are normally installed to C:\Program Files, and normal users do not have write access to this directory. Sure, there are alternatives (%APPDATA%, or Isolated Storage, which has to be used when the application is a Silverlight app), but you can just as well "go with the flow".
Of course, if your application is supposed to run on Mono, you can rule out the registry anyway and should go with flat files.

MySQL Version Control - Subversion

Wondering if it is possible to have version control for a MySQL database.
I realize this question has been asked before however the newest is almost a year ago, and at the rate things change...
The problem is that each developer has Apache/MySQL/PHP on their own computer, where they sometimes edit the database. It's rather inconvenient if they have to send an email to all the other developers and then manually edit the test server's database.
How do you deal with this problem?
Thanks
This is not a MySQL-related solution in itself, but we've had a lot of success with a product called liquibase. (http://www.liquibase.org/)
It's a migration solution which covers many different database vendors, allowing all database changes to be coded in configuration files, all of which are kept in Subversion. Since all configuration is kept in XML files, it's easy to merge other people's changes into the mainline script and it plays well with tags and branches.
The database can be brought up to the current revision level by running the "update database" command. Most changes can also be rolled back, which can be helpful too. I would recommend making sure you are current before running the migration, as this is likely easiest.
Finally, when it comes to a production delivery, you can choose to have all the database changes output as a full SQL script so it can allow DBAs to run it and maintain a separation of duties.
So far, it's worked like a charm.
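To make the migration idea concrete, here is a bare-bones sketch of what such a runner does: apply versioned SQL scripts in order and remember which ones have run. The directory layout, credentials, and database name are hypothetical, and liquibase does all of this far more robustly:

```python
import pathlib
import subprocess

MIGRATIONS_DIR = pathlib.Path("migrations")   # e.g. 001_create_users.sql
STATE_FILE = pathlib.Path(".applied_migrations")
MYSQL_CMD = ["mysql", "-u", "dev", "-pdevpass", "myapp_db"]

# Load the names of migrations that already ran on this machine.
applied = set(STATE_FILE.read_text().split()) if STATE_FILE.exists() else set()

for script in sorted(MIGRATIONS_DIR.glob("*.sql")):
    if script.name in applied:
        continue                              # already applied here
    print("applying", script.name)
    with script.open("rb") as sql:
        subprocess.run(MYSQL_CMD, stdin=sql, check=True)
    applied.add(script.name)
    STATE_FILE.write_text("\n".join(sorted(applied)))
```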
Well, we use Rails, which keeps all the changes in migration files. I know that a couple of PHP frameworks do the same thing - Symfony, for instance. So when all the changes are merged into our repository (we use Mercurial), we can see all the migration changes that need to be, or were, applied to the database in development. Then the person responsible for production rolls out the code after a full backup is made. However, if you don't use a PHP framework that takes care of this, then awied's suggestion sounds very interesting - I hadn't heard of liquibase before, but I will definitely check it out.
There is a tool called iBATIS, now called MyBatis, that handles database versioning perfectly.
It takes a little work to keep all your changes in scripts instead of using a graphical tool, but if you are familiar with coding, it's not a problem.
When you have multiple databases (like dev-test-prod), you just make 3 environment files and you can update one environment with only one command-line instruction.