Getting MediaWiki notifications for all changes

Some users would like to get notifications of changes to a relatively small MediaWiki instance. RSS feeds are not ideal for a number of reasons.
My ideal would be a daily (or other configurable period) email of all changes that have been made. Features I am looking for include:
batching of changes to a single page during the period (often a page changes a number of times and you don't need every blow-by-blow change)
inclusion by category
exclusion of specific categories
Is there any such solution (maybe a plugin or standalone program) that can be recommended? I am an administrator of the wiki and sysadmin of its hosting machine.

If you are an admin of this wiki, you can add
$wgUsersNotifiedOnAllChanges = array( 'UserName' );
into LocalSettings.php and get all possible notifications.
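For context, here is what that could look like in LocalSettings.php; the $wgEnableEmail/$wgEnotif* lines are assumptions about a typical setup rather than part of the answer above:
// Email must work before any notification can go out
$wgEnableEmail = true;
$wgEnotifWatchlist = true;    // notify watchers about changes to watched pages
$wgEnotifMinorEdits = false;  // cut down noise by skipping minor edits
// Send a notification for every change on the wiki to these accounts
$wgUsersNotifiedOnAllChanges = array( 'UserName' );
Note that the built-in notifications are sent per change; they are not batched into a daily digest.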

MediaWiki has email notification built in.
With the CategoryWatch extension you can watch pages in a particular category and get emailed on changes.
I don't know of any way to batch changes (other than excluding minor edits) or to exclude specific categories.
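As a rough sketch (not a verified configuration; how CategoryWatch is loaded depends on the version you install):
// Built-in email notification
$wgEnableEmail = true;
$wgEnotifWatchlist = true;
// CategoryWatch: watching a category also notifies you about changes to its member pages
// (recent versions load via wfLoadExtension, older ones via require_once of the extension's entry file)
wfLoadExtension( 'CategoryWatch' );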

Related

What would be the best way to version control JSON objects?

I am tasked with creating a version control system (of sorts) for lay people. The data is stored as arbitrary JSON objects, but the user interface is not that low-level; it is a nice, pretty GUI.
Each type of versioned data will have a master copy, and when a user edits the master version a copy will be created in their own workspace; a quasi-local branch.
Because of the nature of our users, any conflicts that arise will need to be handled as seamlessly as possible. It can't be complicated: we can't hand the user a git-style conflict and expect them to learn how to resolve it. Conflicts will arise because after each person has finished their work they will submit it for review, and if it is accepted it will be merged into a master-like branch.
My first thought is to keep a "meta" list of changes associated with each object, however deeply nested, and whenever a change occurs to add that change operation to the object's list. When users make workspace changes, a query will check whether other users' changes conflict, and a notification will be sent so the user can contact the author of the other side of the conflict.
Is this a practical approach? Are there any glaring shortcomings besides being a large project? Are there better options for writing a version control system for non-tech-savvy people?
You might be able to assemble something reasonable around xudiff.
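Independently of the diff tool, here is a minimal sketch of the per-object change log and conflict check the question describes (all function and field names are invented for illustration):
// Append one change operation to an object's change log.
function recordChange(array &$changeLog, $user, $path, $newValue) {
    $changeLog[] = array(
        'user'  => $user,
        'path'  => $path,      // dotted path into the JSON object, e.g. "address.city"
        'value' => $newValue,
        'ts'    => time(),
    );
}

// Two pending changes conflict when one path equals or contains the other
// and they come from different users.
function findConflicts(array $pendingA, array $pendingB) {
    $conflicts = array();
    foreach ($pendingA as $a) {
        foreach ($pendingB as $b) {
            $overlap = strpos($a['path'] . '.', $b['path'] . '.') === 0
                    || strpos($b['path'] . '.', $a['path'] . '.') === 0;
            if ($overlap && $a['user'] !== $b['user']) {
                $conflicts[] = array($a, $b);
            }
        }
    }
    return $conflicts;
}
The review step then only has to surface the conflicting paths to the two users involved, rather than a raw textual diff.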

From a newbie: Can one MediaWiki installation have two wikis?

We are planning to use MediaWiki as the basis for our product documentation. Access control will be used to grant customers access to content.
We would also like to use MediaWiki for some of our internal documentation, stuff that customers should not access.
Is it possible to configure one installation of MediaWiki such that one group of users sees certain wiki content and another group of users sees other wiki content? If so, please point me to the appropriate documentation, as I am not even sure what this would be called (thus I am uncertain where to look).
Thank you.
If by one installation you mean one database, it is sort of possible but extremely unwise. See this section of the manual for an explanation, and Category:Page specific user rights extensions (especially the Lockdown extension) if you decide to try it anyway.
Using the same installation directory (i.e. PHP files) but separate databases is fine. The manual page about wiki farms describes a few ways to do it.
If you mean that you want to restrict the "view" permission for certain pages to a specific group, then the answer is: kind of, maybe. With a default MediaWiki installation that is not possible, as MediaWiki is designed to be "open" to all users (at least as far as the view permission goes). You can restrict whether a certain group can or can't read, but that will always apply to all pages.
Maybe your problem can be solved by actually running two wikis, instead of keeping two "sections" in one wiki. For this you would need:
One MediaWiki installation on your file system (unpack the MediaWiki tarball release), e.g. /var/www/html/mediawiki/
Two MySQL databases (or two database prefixes)
Two different URLs (e.g. example.com/wiki1 and example.com/wiki2, or wiki1.example.com and wiki2.example.com)
A somewhat more complex MediaWiki configuration
Now, you first need to create two virtual hosts in your webserver. Both should point to the installation directory of your MediaWiki (e.g. /var/www/html/mediawiki/). In the next step you need to create a configuration that differs depending on which wiki the user requested (i.e. depending on which URL was used). This is a bit tricky and mostly undocumented in MediaWiki, but in fact it works like this:
You create a $wgConf object
You fill this $wgConf object with the valid wikis (usually keyed by a unique name, e.g. the database name)
You let $wgConf extract all settings (using the name of the wiki, e.g. the database name)
This part is more or less documented on the $wgConf manual page. The trickier part is to parse the URL correctly and set all the information you need. The Wikimedia Foundation uses a script called MultiVersion. This tool does a bit more than just parse the URL to identify the wiki, but the idea is the same: with MultiVersion you set the configuration variable $wgDBname, which you then use to load the $wgConf data. For more information, you should ask specific questions and look into the git repository of the Wikimedia Foundation's configuration. I use a similar approach with just two wikis and a much smaller MultiVersion (based on the WMF's idea), so maybe this will help you understand how to configure the wikis, too.
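A rough LocalSettings.php sketch of that idea (hostnames, database names, and settings are placeholders; see the Manual:$wgConf page for the exact behaviour in your MediaWiki version):
// Pick the wiki from the requested hostname; this stands in for MultiVersion here
$wikiByHost = array(
    'wiki1.example.com' => 'wiki1',
    'wiki2.example.com' => 'wiki2',
);
$wgDBname = $wikiByHost[ $_SERVER['SERVER_NAME'] ];

$wgConf->wikis = array( 'wiki1', 'wiki2' );
$wgConf->settings = array(
    'wgSitename' => array(
        'wiki1' => 'Public Docs',
        'wiki2' => 'Internal Docs',
    ),
    'wgServer' => array(
        'wiki1' => 'https://wiki1.example.com',
        'wiki2' => 'https://wiki2.example.com',
    ),
);
// Turn the per-wiki settings into ordinary globals for this request
$wgConf->extractAllGlobals( $wgDBname );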
You will probably also want to make sure that the wikis can create interwiki links, e.g. to link from your internal wiki to a page of the documentation in the public wiki and vice versa. And you probably want some database tables to be shared between the two wikis, so that your users only need to register once to get access to both (and set the read permission on the internal wiki to false by default, so that you have to grant access explicitly). See $wgSharedDB and the manual page about shared databases. The configuration of my two wikis uses this feature to share the user tables.
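For the shared login part, a sketch along these lines (database and table names are examples; see the manual pages for $wgSharedDB and shared databases for the details):
// In the internal wiki's settings: no anonymous reading, no self-registration
$wgGroupPermissions['*']['read'] = false;
$wgGroupPermissions['*']['createaccount'] = false;

// Share the user tables so one registration works on both wikis
$wgSharedDB = 'wiki1';   // the database that holds the master user table
$wgSharedTables = array( 'user', 'user_properties' );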

How to avoid the incremental ID security gap in MySQL

What is the best practice, from both a security and a performance standpoint, to avoid letting users see incrementally IDed data from a database or other dataset?
Main concern is to avoid urls such as
www.myweb.com/user/123
This of course applies to posts, users, files or messages.
Implement permissions so that only authorised people can see/change/delete data.
If you still want to hide the incremental ID from users or API consumers, you could add a hash column to your database and index it, then expose that instead of the incremental ID.
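As a sketch of that second point (table and column names are invented, and an existing PDO connection in $pdo is assumed), the exposed identifier can be a random value generated once per row and indexed:
// Illustrative schema change: an indexed public_id column next to the auto-increment id
// ALTER TABLE users ADD COLUMN public_id CHAR(32) NOT NULL, ADD UNIQUE INDEX (public_id);

// When creating a row, store a random token alongside the incremental id
$publicId = bin2hex(random_bytes(16));
$stmt = $pdo->prepare('INSERT INTO users (name, public_id) VALUES (:name, :pid)');
$stmt->execute(array(':name' => $name, ':pid' => $publicId));

// URLs then look like /user/9f2c4... instead of /user/123
$stmt = $pdo->prepare('SELECT * FROM users WHERE public_id = :pid');
$stmt->execute(array(':pid' => $_GET['id']));
$user = $stmt->fetch();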
Hiding Incremental IDs
Depending on your programming language, the unique ID you assign your users may not need to be displayed in the URL. For example, with PHP, you can use the $_SESSION[] array to store values on your server for each user. Those variables will never be seen by the user, but the server will be able to identify each user appropriately (via PHP cookies) and serve them the correct page dynamically.
For example, when a user signs in to your site, after authenticating, your script might do something like:
// Look up the user's numeric ID once, then keep it server-side in the session
// (assumes an existing PDO connection in $pdo)
$stmt = $pdo->prepare('SELECT id FROM user_table WHERE name = :username');
$stmt->execute(array(':username' => $username));
$_SESSION['user_id'] = $stmt->fetchColumn();
Now, whenever the user wants to visit their own page, your server will know which information to fill your home page template with -- and the URL will appear the same to every user.
If a user wants to visit another user's page, you could do something similar: upon choosing a specific user page to visit, your script could set a $_SESSION['visit_user'] variable. Thus, you would be able to fill a visit page template with the appropriate information, and your user will be none the wiser.
This same tactic can be applied to posts, files, etc. that are assigned incremental IDs.
But Is This Necessary?
As you yourself mentioned in your previous post, there are plenty of examples of sites that use incremental IDs -- and with no qualms about displaying them. While this does give a malicious user the ability to view other users' IDs, it doesn't necessarily pose a threat to your site's security. If you follow basic security principles (require strong passwords, watch your MySQL users' and files' permissions, sanitize user input, etc.), it doesn't matter if malicious users can guess auto_incremented IDs. Those IDs aren't valuable information unless your site can be exploited in some other manner.

Unifying the K2 component's data source for multiple Joomla! websites hosted on the same server

I am responsible for a few web sites of my organization.
I use Joomla! 2.5.9 for those web sites. They are all running on the same server.
I use the K2 component for content management.
I have a general website which shows all the staff information on its 'Staff' page. Some of those people and their content are also shown on another department's website. So there is a separate database for each web site.
For example:
On the general website (let's say general.org), when I click on the 'Staff' menu item, the page shows all of the people who work at my organization, across the different departments.
On another web site (e.g. education.general.org), when I click on the 'Staff' menu item, it shows the people who work in the education department.
But each web site has its own user accounts, which means a modification on one of them does not affect the others. If one of the education staff changes his profile picture on the education web site, he also has to do it on the general web site.
And sometimes one person might be working in two departments, so he has to edit his data three times.
Is it possible to merge the records for all websites? In other words, I want everyone to insert/update their data on the general web site, and the other web sites will be updated automatically.
You would have to have one Joomla site to do this. Each subdomain would have its own template/style, but would run on the same Joomla installation. The subdomains then just map to a specific menu item on the general site. That would be one way to do it.
Another way would involve coding a custom user plugin which updates the tables of the other Joomla installations after a profile is edited (a rough sketch follows at the end of this answer). If you're familiar with PHP you could probably do this yourself, otherwise you need someone with coding knowledge to do it for you.
Or you could set up Joomla to use authentication based on an LDAP directory (http://docs.joomla.org/LDAP). However I'm not sure how well it works with password and profile changes.
That's about the solutions I would see.
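For the custom plugin route mentioned above, a very rough sketch of the idea (the plugin name, the other site's database credentials, and the copied columns are all placeholders; Joomla 2.5 user plugins receive the onUserAfterSave event):
// plugins/user/profilesync/profilesync.php (illustrative only)
defined('_JEXEC') or die;

class PlgUserProfilesync extends JPlugin
{
    // Fired after a user record is saved on this site
    public function onUserAfterSave($user, $isnew, $success, $msg)
    {
        if (!$success) {
            return;
        }

        // Connection details of the other site's database (placeholders)
        $otherDb = JDatabase::getInstance(array(
            'driver'   => 'mysqli',
            'host'     => 'localhost',
            'user'     => 'general_db_user',
            'password' => 'secret',
            'database' => 'general_site_db',
            'prefix'   => 'jos_',
        ));

        // Copy the fields you care about, matching rows by email address
        $query = $otherDb->getQuery(true)
            ->update($otherDb->quoteName('#__users'))
            ->set($otherDb->quoteName('name') . ' = ' . $otherDb->quote($user['name']))
            ->where($otherDb->quoteName('email') . ' = ' . $otherDb->quote($user['email']));
        $otherDb->setQuery($query);
        $otherDb->query();
    }
}
K2 keeps its own profile table (and profile images on disk), so a real implementation would also have to copy that data, which is why getting a developer involved is sensible if PHP isn't your thing.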

Tracking data access

Backstory
I work for a company that has an online site that allows users to text personal information for collection. We collect the data and make it available online. Users can choose to share the data with other users.
Going Forward
At some point, this may become classified as an FDA-governed medical tool. In anticipation, we'd like to have in place a logging system that shows each time someone accesses our users' data, whether it be the user themselves, another authorized user, or a support person.
Current Architecture
We are currently running Ruby/Rails, and using a MySQL database. The personal information is encrypted in the database.
Data Access for Support
Today, support personnel can access data in one of three ways:
Admin site - The admin site is limited to whatever screens we develop. While we don't currently, we could easily add logging to keep an audit trail of who accessed which data using the admin tool.
SQL client - I use MySQL Workbench to access production. However, when connected this way, all personal information (user name, cell number, etc.) is encrypted.
Ruby/Rails console - Finally, support can log into one of the production boxes and use the Ruby/Rails console from the command line. Ruby will decrypt the data, so we can do simple things such as
u = User.find_all_by_state('active')
and it will return the recordset of all users with state = 'active', with their personal information decrypted in the result set.
Holy Grail
logging
easy access for support
I'd love to have a way to allow easy support access (once authenticated) to the data, but that would log everything that is accessed (read or updated). That way, if I'm checking out my buddy's ex-wife's data, for example, it gets logged to a place where I can't get in and clean up the audit trail. (See Google firing a Gmail employee for an example of employees breaching data policies.)
Anyone have ideas, thoughts, experiences, suggestions with this issue?
Hey devguy. This was an issue for me a couple of months back. We ended up centralizing our MySQL queries so that we could track all information coming in and out. Unfortunately the class I wrote is in PHP, but the idea behind it could make it very easy to start logging.
https://code.google.com/p/php-centralized-mysql-controller/
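That class isn't reproduced here, but the idea translates easily enough; a minimal sketch of a centralized query wrapper that writes an audit row for every query (the access_log table and field names are invented):
// Every query goes through one method, which also records who ran what and when.
class LoggingDb
{
    private $pdo;
    private $supportUser;

    public function __construct(PDO $pdo, $supportUser)
    {
        $this->pdo = $pdo;
        $this->supportUser = $supportUser;
    }

    public function query($sql, array $params = array())
    {
        // Write the audit row first, so even a failing query leaves a trace
        $audit = $this->pdo->prepare(
            'INSERT INTO access_log (support_user, query_text, accessed_at) VALUES (?, ?, NOW())'
        );
        $audit->execute(array($this->supportUser, $sql));

        $stmt = $this->pdo->prepare($sql);
        $stmt->execute($params);
        return $stmt->fetchAll();
    }
}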
Try stored procedures. Make all code use the stored procedures for CRUD activities. This defines an API that your developers can use while business rules are globally enforced (don't return entire SSN values, only the last 4 digits, etc.).
This serves as the basis for an external API as well.
If you want logging/auditing, you put it in the procedure.
This protects you from everyone except the DBAs.
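A sketch of how that could look with MySQL and PHP (procedure, table, and column names are invented; the masking and logging rules would be whatever your compliance requirements dictate):
// One-time definition, e.g. from a migration; assumes an audit_log table exists
$pdo->exec("
    CREATE PROCEDURE get_user_for_support(IN p_user_id INT, IN p_support_user VARCHAR(64))
    BEGIN
        INSERT INTO audit_log (support_user, accessed_user_id, accessed_at)
        VALUES (p_support_user, p_user_id, NOW());

        SELECT id, name, CONCAT('***-**-', RIGHT(ssn, 4)) AS ssn_last4
        FROM users
        WHERE id = p_user_id;
    END
");

// Application code and support tooling only ever call the procedure
$stmt = $pdo->prepare('CALL get_user_for_support(:id, :support_user)');
$stmt->execute(array(':id' => 123, ':support_user' => 'devguy'));
$row = $stmt->fetch(PDO::FETCH_ASSOC);
In practice you would also revoke direct table access from the support accounts, so the procedure is the only path to the data and every read leaves an audit row.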