Is it possible to add an options screen to a Wordpress MU theme (options being saved for each user, so blogwide, not sitewide) ?
I'm used to programming WordPress themes, but I'm a bit puzzled as to how to make customization happen in a multi-user environment...
Wow, no one answered you after all this time? Okay, here's the answer. If you inspect your MySQL database after you get a few WPMU blogs up and running, you'll notice that each blog has a separate table prefix: wp_1_ belongs to the main admin blog, and wp_2_ and so on belong to the non-main blogs you create in the wp-admin system. Getting at the prefix from code is easy -- the base prefix set in wp-config.php is available via "global $table_prefix;", and the current blog's full prefix is $wpdb->prefix. As a side note, WordPress exposes a ton of useful global variables, and you can see them all by doing "print_r($GLOBALS); die();" in a plugin or theme file.
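For illustration, here's a quick diagnostic sketch you could drop into a theme or plugin file (purely exploratory -- remove it afterwards; the prefix values in the comments are just examples):

global $wpdb, $table_prefix;
echo $table_prefix;   // base prefix from wp-config.php, e.g. "wp_"
echo $wpdb->prefix;   // the current blog's full table prefix, e.g. "wp_2_"
// Dump everything WordPress exposes in global scope:
print_r( $GLOBALS );
die();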
But anyway, the answer to your question is that if you look into the MySQL database, you'll find that each blog in WPMU gets its own options table -- separate, not sitewide, but blogwide, just as you desired. And when you use the standard WordPress Options API, it accesses the right options table automatically, without you having to touch the $wpdb global object or the $table_prefix global string.
So if you are using get_option(), update_option(), add_option(), and delete_option(), these will all still work in a WPMU environment. And even though the plugins folder is shared among all blogs, a plugin's settings are not -- they are exclusive to each blog.
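So a per-blog options screen for a theme can be built with nothing more than the Options API. Here is a minimal sketch for a theme's functions.php -- the option name, page slug, and markup are made up for illustration, and it assumes a reasonably current WordPress/PHP:

// Register an options page under Appearance; values are saved per blog.
add_action( 'admin_menu', function () {
    add_theme_page( 'Theme Options', 'Theme Options', 'manage_options', 'my-theme-options', 'my_theme_options_page' );
} );

function my_theme_options_page() {
    // update_option() writes to this blog's own options table.
    if ( isset( $_POST['my_theme_color'] ) && check_admin_referer( 'my_theme_options' ) ) {
        update_option( 'my_theme_color', sanitize_text_field( $_POST['my_theme_color'] ) );
    }
    $color = get_option( 'my_theme_color', '#ffffff' ); // per-blog value
    ?>
    <div class="wrap">
        <h1>Theme Options</h1>
        <form method="post">
            <?php wp_nonce_field( 'my_theme_options' ); ?>
            <label>Accent color: <input type="text" name="my_theme_color" value="<?php echo esc_attr( $color ); ?>" /></label>
            <p><input type="submit" class="button-primary" value="Save" /></p>
        </form>
    </div>
    <?php
}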
Now, if you are not using the WP Options API from the Codex but are going at it with the $wpdb global object, be aware that you need to build table names with the blog's prefix. There are some cases where this is desirable, such as having a LOT of data that you need to store in a custom table -- for instance, storing HTTP_REFERER and user agent info for incoming connections.
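For that second case, a rough sketch of a write into a custom per-blog table (the table name and columns are hypothetical, and the table is assumed to already exist):

global $wpdb;
$table = $wpdb->prefix . 'referrer_log'; // e.g. wp_2_referrer_log on blog 2
$wpdb->insert(
    $table,
    array(
        'referer'    => esc_url_raw( isset( $_SERVER['HTTP_REFERER'] ) ? $_SERVER['HTTP_REFERER'] : '' ),
        'user_agent' => sanitize_text_field( isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '' ),
        'logged_at'  => current_time( 'mysql' ),
    ),
    array( '%s', '%s', '%s' ) // value formats
);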
We are planning to use mediawiki as the basis for our products documentation. Access control will be used to grant customers access to content.
We would also like to use mediawiki for some of our internal documentation, stuff that customers should not access.
Is it possible to configure one installation of mediawiki such that one group of users sees certain wiki content and that another group of users sees other wiki content? If so, please point me to the appropriate documentation as I am not even sure what this would be called (thus I am uncertain where to look).
Thank you.
If by one installation you mean one database, it is sort of possible but extremely unwise. See this section of the manual for explanation and Category:Page specific user rights extensions (especially the Lockdown extension) if you decide to try it anyway.
Using the same installation directory (i.e. the PHP files) but separate databases is fine. The manual page about wiki farms describes a few ways to do it.
If you mean that you want to restrict the "view" permission for certain pages to a specific group, then the answer is: kind of, maybe. With a default MediaWiki installation that is not possible, as MediaWiki is designed to be "open" to all users (at least for the view permission). You can only restrict whether a certain group can read at all, and that restriction always applies to all pages.
Maybe your problem can be solved by running two wikis instead of holding two "sections" in one wiki. For this you would need:
One MediaWiki installation on your file system (unzipping the mediawiki tarball release), e.g. /var/www/html/mediawiki/
Two MySQL databases (or two database prefixes)
Two different urls (e.g. example.com/wiki1 and example.com/wiki2 or wiki1.example.com and wiki2.example.com)
A bit more complex MediaWiki configuration
Now, you first need to create two virtual hosts in your webserver, both pointing to the MediaWiki installation directory (/var/www/html/mediawiki/). In the next step you need to create a configuration that differs depending on which wiki the user requested (i.e. which URL was used). This is a bit tricky and mostly undocumented in MediaWiki, but in essence it works like this:
You create a wgConf object
You fill this wgConf object with valid wikis (usually you use a unique name, e.g. the dbname)
You let wgConf extract all settings (using the name of the wiki, e.g. the dbname)
This part is more or less documented at the wgConf manual page. The trickier part is parsing the URL correctly and setting all the information you need. The Wikimedia Foundation uses a script called MultiVersion. This tool does a bit more than just parsing the URL to identify the wiki, but that's fine. With MultiVersion you would then set the configuration variable wgDBname, which you then use to load the wgConf data. For more information, ask specific questions and look into the git repository of the Wikimedia Foundation's configuration. I use a similar approach with just two wikis and a much smaller MultiVersion (based on the WMF's idea), so maybe this will help you understand how to configure the wikis, too.
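As a rough LocalSettings.php sketch of the idea, without MultiVersion (wiki names, hostnames, and settings below are placeholders, not a tested configuration):

// One code base, two wikis, chosen by hostname.
$wgConf = new SiteConfiguration();
$wgConf->wikis = array( 'wiki_public', 'wiki_internal' );
$wgConf->settings = array(
    'wgServer' => array(
        'wiki_public'   => 'https://wiki1.example.com',
        'wiki_internal' => 'https://wiki2.example.com',
    ),
    'wgSitename' => array(
        'wiki_public'   => 'Product Documentation',
        'wiki_internal' => 'Internal Documentation',
    ),
    'wgDBname' => array(
        'wiki_public'   => 'wiki_public',
        'wiki_internal' => 'wiki_internal',
    ),
);
// Tiny stand-in for MultiVersion: pick the wiki from the requested host.
$host     = isset( $_SERVER['HTTP_HOST'] ) ? $_SERVER['HTTP_HOST'] : '';
$wikiName = ( $host === 'wiki2.example.com' ) ? 'wiki_internal' : 'wiki_public';
// Pull all per-wiki settings for the selected wiki into global scope.
$wgConf->extractAllGlobals( $wikiName );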
You probably also want to make sure that the wikis can create interwiki links, so that, e.g., your internal wiki can link to documentation in your public wiki and vice versa. And you probably want some database tables shared between the two wikis, so your users only need to register once to access both (set the internal wiki's read permission for anonymous users to false, so that you have to grant access explicitly). See $wgSharedDB and the manual page on shared databases. The configuration of my two wikis uses this feature to share the user tables.
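For the shared-login part, something along these lines on the internal wiki (assuming both wikis live on the same database server and the user tables should stay in the public wiki's database; adjust names to your setup):

$wgSharedDB     = 'wiki_public';                      // database holding the shared tables
$wgSharedTables = array( 'user', 'user_properties' ); // the usual minimum
// Internal wiki only: nobody may read without an account you granted.
$wgGroupPermissions['*']['read'] = false;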
I have seen many questions related to database migration, but none which clearly state: When editing a database which of the tables do I actually need to edit?
As a force of habit I edit the entire MySQL database, and usually that works out, though on occasion this can mess up URLs left in user comments, for instance. It would be good to know specifically which tables I actually need to edit in order to complete a migration correctly.
EDIT: I already understand how a migration works and which tool to use, and I have read the codex entry on migration. I am not having a specific problem migrating.
This is really more of a best practices question.
What I am looking for is a definitive list of what tables I can exclude from my search and replace. For instance I know that the basic URL info is in wp_options, I know that (some) image paths are stored in wp_postmeta. Basically I want to exclude every table I possibly can, while still preserving the site's widgets, images, settings, etc.
The only references that HAVE to change are in the wp_options table: one is the home URL and the other is the siteurl. Changing these will let you log into the admin and view the front end. However, you will still have to update your permalinks and rewrite rules from the admin.
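If you just need the site reachable before doing a proper search/replace, you can also override those two options from wp-config.php; the constants take precedence over the home and siteurl rows in wp_options (the domain below is a placeholder):

define( 'WP_HOME',    'https://new-domain.example' );
define( 'WP_SITEURL', 'https://new-domain.example' );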
I would still suggest an automated solution, however. I use this tool to find and replace database references. It is specifically made for WordPress, but it will work for any database. It also lets you select which tables to update and handles serialized strings, so you should be able to avoid errors in the comments section.
Simply drop the file in the root of your WordPress install and run through the prompts. Make sure to deselect the wp_comments and wp_commentmeta tables. Also, make sure you remove the file on production, as it presents a potential security threat.
UPDATE BASED ON COMMENTS
Other than the two spots above, there are several places in the database where URLs are stored. Most plugins will store their options in the wp_options table. Typically, plugins will also serialize that data to avoid a ton of queries. You can't simply change the URL inside the serialized data, however, because the serialized string records each value's length. So if your current URL is 15 characters long and the new one is 20, you need to update both the URL and the stored string length. If you don't, unserialize() fails and PHP simply discards the value. I believe this strictness also helps avoid code injection.
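A quick illustration of what those length references look like, in plain PHP with made-up values:

$options = array( 'site_url' => 'http://old.example' );
echo serialize( $options );
// prints: a:1:{s:8:"site_url";s:18:"http://old.example";}
// The s:18 must match the replacement URL's byte length,
// otherwise unserialize() fails and the whole option is lost.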
For assets in the media library, the URLs are stored in wp_posts under the post type attachment. If you are hardcoding absolute links in your posts, you may need to parse those as well (if this is the case, you can probably just parse the entire table). If you are using any sort of custom field plugin, or doing anything with post meta that involves URLs, you will also want to go through wp_postmeta.
One other thing to mention is that some plugins add their own database tables. These are obviously handled on a per-case basis, but a good rule of thumb is to run a query for "%http%" against any string columns to see if there are hardcoded URLs. Here is the query I use:
SELECT * FROM `table` WHERE `column` LIKE '%http%'
Download the file from the following link: http://interconnectit.com/products/search-and-replace-for-wordpress-databases/.
Put it in the root folder, open it in your browser, follow the steps, and replace the URLs.
I have been using it for quite a long time without any issues. Hope it helps!
I'm hacking WordPress quite a lot for a current project. I'm storing data input by the user into two custom database tables (still inside the wp database though).
Along with the information I need to collect, I'm also saving the WP user ID into the database. I've generated a URL I'd like each user to be able to visit to see the results of their input. The structure of this URL is simple: http://domainname/username/searchname
Now obviously I can output this as a URL and it can be clicked, but WordPress just spits out the 404 template instead.
I kind of need to emulate the same functionality that the post pages have in terms of being able to visit this link and have data spit out based on a template.
I realise that custom post types sound perfect for this kind of thing, but I need users to be able to submit data from the front end, and I'm not sure that's possible with Custom Post Types?
I think the WP_Rewrite class is what you're looking for:
WP_Rewrite is WordPress' class for managing the rewrite rules that allow you to use Pretty Permalinks feature. It has several methods that generate the rewrite rules from values in the database. It is used internally when updating the rewrite rules, and also to find the URL of a specific post, Page, category archive, etc..
There are several examples in its documentation page. For a more hands-on tutorial check this blog article.
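As a rough sketch of how that could look for your /username/searchname URLs, using the add_rewrite_rule() and query-var helpers that sit on top of WP_Rewrite (the query var names and the template file are hypothetical, and the pattern is deliberately loose):

// In a plugin or the theme's functions.php.
add_action( 'init', function () {
    // Map /username/searchname onto two custom query vars.
    add_rewrite_rule(
        '^([^/]+)/([^/]+)/?$', // very greedy -- tighten this for production
        'index.php?search_user=$matches[1]&search_name=$matches[2]',
        'top'
    );
} );

add_filter( 'query_vars', function ( $vars ) {
    // Tell WordPress the custom query vars exist.
    $vars[] = 'search_user';
    $vars[] = 'search_name';
    return $vars;
} );

add_action( 'template_redirect', function () {
    // Serve a custom template instead of the 404 page.
    if ( get_query_var( 'search_user' ) && get_query_var( 'search_name' ) ) {
        include get_stylesheet_directory() . '/search-results.php'; // your own template
        exit;
    }
} );

Remember to flush the permalinks once after adding the rule (visit Settings > Permalinks, or call flush_rewrite_rules() on plugin activation).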
From what I've read so far, using custom database tables should be reserved for situations where the default tools can't handle the task.
The benefit of the default tools is that you don't have to create custom functionality (and all the details that come with it) to search, manipulate and display the data.
You could have a CPT called User Feedback, configured as hierarchical, where each parent post corresponds to one user and the child posts hold that user's input data.
Or it could be non-hierarchical and a custom taxonomy would make the bridge between posts and users.
After submission on the front end, you simply use wp_insert_post to add the info to the database as a post of that type (use the title and content as holding fields if needed), with the other fields stored as its associated meta data.
If you set the CPT supports argument to false, it won't show anything in the editing screen but the Publish meta box. And all the submitted information stored as meta data can be displayed by custom meta boxes.
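In rough code, that setup might look something like this (the post type name, meta keys, and capability handling are illustrative only, not a drop-in solution):

// Register the CPT; 'supports' => false leaves only the Publish box in the editor.
add_action( 'init', function () {
    register_post_type( 'user_feedback', array(
        'label'        => 'User Feedback',
        'public'       => false,
        'show_ui'      => true,
        'hierarchical' => true,  // one parent post per user, children for each submission
        'supports'     => false,
    ) );
} );

// Handle an already-validated front-end submission.
function my_save_feedback( $user_id, $title, $fields, $parent_id = 0 ) {
    $post_id = wp_insert_post( array(
        'post_type'   => 'user_feedback',
        'post_title'  => sanitize_text_field( $title ),
        'post_status' => 'publish',
        'post_author' => (int) $user_id,
        'post_parent' => (int) $parent_id, // the user's parent post, if you use the hierarchical layout
    ) );
    // Everything else goes into post meta, displayable later via custom meta boxes.
    foreach ( $fields as $key => $value ) {
        update_post_meta( $post_id, sanitize_key( $key ), sanitize_text_field( $value ) );
    }
    return $post_id;
}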
After having this setup, if you need to fine tune the URLs, it's time to use WP_Rewrite.
That's how, in general terms [1], JetPack manages the feedback from its Contact Form.
[1] Checking the code, there are actually some interesting techniques used that are worth replicating.
When developing WordPress themes for a site with a large amount of posts, how can I dynamically pull existing post data from the live version of the site onto my testing site? I already know about WordPress's export feature, but that's one-and-done, not dynamically queried.
Plan A:
Proposed Solution:
Create read-only user in live site's database
PRECAUTION: change test site's prefix from "wp_" to "test_"
Problems:
Settings (like current theme) on test site cannot be changed, thanks to read-only user
No posts found in "test_posts", even though I'd like it to search "wp_posts"
Is there an easier way or existing solution to avoid rewriting WordPress system files on the test site? I'd really rather not rewrite WordPress's database interface...
Similar: Linking themes across WP installations
Just duplicate the DB, rename it, and point to the new database in wp-config.php!
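In wp-config.php on the test site that's roughly this (database name and credentials are placeholders):

define( 'DB_NAME',     'wp_live_copy' );   // the duplicated database
define( 'DB_USER',     'test_user' );
define( 'DB_PASSWORD', 'test_password' );
define( 'DB_HOST',     'localhost' );
$table_prefix = 'wp_';                     // keep the live prefix so the copied tables are found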
I'm assuming these instructions are for "normal" WordPress... will this work with WPMU, or do I need to modify them? Is there anything I should watch for?
http://www.mydigitallife.info/2007/10/01/how-to-move-wordpress-blog-to-new-domain-or-location/
See Moving WordPress « WordPress Codex and WordPress Serialized PHP Search Replace Tool
The biggest issue you face is changing URLs in the database, and you will need to change those URLs in each blog's set of tables in a Multisite install. Check out the find/replace tool linked above; it correctly deserializes/serializes data in the database. If you change URLs in a text database dump, or with SQL queries as in the link you listed in your question, you risk breaking data such as theme options and widget settings.
1/09/2016 Note: WordPress MU doesn't exist anymore. MU was rolled into WordPress core in version 3.0.