I recently installed a wiki on my server using MediaWiki. Whenever I add a page to a category, the category is linked correctly at the bottom of the page. However, if I open the category page, my content page is just not listed; it says "This category currently contains no pages or media." even though the link is correct. Did I make a mistake? Reinstalling several times didn't help, and the category appears under "unused categories".
One thing that might help is to purge the category page (for example by appending ?action=purge to its URL).
Another possible solution is the maintenance script rebuildall.php. It fixed empty categories for me a few days ago after I had imported content into one of my wikis.
I installed the CMS MediaWiki, and I want all pages and categories in the database to be displayed on the main page, as a list or by some other method. I tried to find information on Google but didn't really find anything. Can you please tell me where I can find instructions, or has someone already done this?
You can access all the categories in the database via the page Special:Categories and all the pages via Special:AllPages. I believe that with Extension:DynamicPageList you can list all the pages/categories on the main page by selecting everything in the relevant namespaces (main and Category, respectively).
I have recently restored two wikis from a backup after a hard disk failure. The wikis make use of a single MySQL database, each of them using prefixed tables. One of the wikis is running just fine after the restore; the other one is not.
The issue is that all of the content on the malfunctioning wiki's main page is gone; the page just displays the sample text "There is currently no text in this page. You can search ..." Even worse, I cannot edit the wiki's main page: clicking the edit button when logged in brings up the message "No such section: You tried to edit a section that does not exist. Since there is no section, there is no place to save your edit."
What's also confusing is that all of the other pages seem to be fine; I can search for pages, and clicking on "Random page" does what it's supposed to do. I can also edit all other pages except for the main page.
Has anyone come across anything remotely similar?
I am pretty sure it has to be an error within the database. If I export an XML dump of all the pages, the main page is not among the pages in the XML. However, the page table contains an entry for the main page. How can I test the validity of that entry?
If the page is really present in the database as you say (which requires not only page table but also text table, as Tgr said), but is not accessible, the most likely reason would be an incorrect namespace. Does the page title contain a colon?
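One way to test the validity of that entry is to follow page_latest through to the stored text and see which step breaks. Here is a sketch in Python against an SQLite stand-in; the real wiki uses MySQL, and the page → revision → text layout shown is the pre-1.35 MediaWiki schema, so adjust the table prefix and column names to your install:

```python
import sqlite3

def check_page(conn, title, ns=0):
    """Follow page.page_latest -> revision -> text and report what is missing."""
    row = conn.execute(
        "SELECT p.page_id, r.rev_id, t.old_text "
        "FROM page p "
        "LEFT JOIN revision r ON r.rev_id = p.page_latest "
        "LEFT JOIN text t ON t.old_id = r.rev_text_id "
        "WHERE p.page_title = ? AND p.page_namespace = ?",
        (title, ns),
    ).fetchone()
    if row is None:
        return "no page row"
    if row[1] is None:
        return "page row exists, but its latest revision is missing"
    if row[2] is None:
        return "revision exists, but its text row is missing"
    return "ok"

# Tiny fixture mimicking the three tables involved.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE page (page_id INT, page_namespace INT, page_title TEXT, page_latest INT);
    CREATE TABLE revision (rev_id INT, rev_text_id INT);
    CREATE TABLE text (old_id INT, old_text TEXT);
    INSERT INTO page VALUES (1, 0, 'Main_Page', 10);
    INSERT INTO revision VALUES (10, 100);
    INSERT INTO text VALUES (100, 'welcome');
    INSERT INTO page VALUES (2, 0, 'Broken', 11);
""")
```

A healthy entry returns "ok"; a broken main page would likely stop at one of the two missing steps, which tells you whether the revision or the text table needs repair.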
Try running php maintenance/namespaceDupes.php --fix.
Is it possible to format a category page on a wiki (I'm working on MediaWiki, but I suppose it's the same for all) so that it shows all the entries in one column instead of the default three columns?
If not, is there a way to create another kind of page that dynamically updates its content the way a category page does? I couldn't find an example on Wikipedia.
Both Extension:Semantic MediaWiki and Extension:DynamicPageList are pretty potent tools that allow the creation of pages with dynamic lists, tables, and so on.
We have a wiki installed in our organization, and I want to start using it.
I failed to find answers to the following two basic questions:
How do I configure the entry page to show a list of all existing pages?
How do I create a new page? I only succeeded by typing the URL of a non-existent page; I guess there are nicer methods for this.
Thanks
Gidi
For how to show a list of all pages, look at the DynamicPageList extension for MediaWiki. (There's a more advanced third-party version, but it's not needed for such a simple task.)
Creating a new page really is exactly as you said: type a URL and save an edit. Most beginning editors will first add a link to an existing page and then follow that link to create the new page, so that they don't accidentally forget the spelling and lose the page to the ether. (Of course, it would still show up in Recent changes and other special pages.)
This is more of a webapps.stackexchange.com question though.
What I mean by autolinking is the process by which wiki links inlined in page content are rendered as either a hyperlink to the page (if it exists) or a create link (if it doesn't).
With the parser I am using, this is a two-step process: first, the page content is parsed and all of the links to wiki pages are extracted from the source markup. Then I feed an array of the existing pages back to the parser before the final HTML markup is generated.
What is the best way to handle this process? It seems as if I need to keep a cached list of every single page on the site, rather than having to extract the index of page titles each time. Or is it better to check each link separately to see if it exists? This might result in a lot of database lookups if the list wasn't cached. Would this still be viable for a larger wiki site with thousands of pages?
In my own wiki I check all the links (without caching), but my wiki is only used by a few people internally. You should benchmark stuff like this.
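As a rough sketch of the two strategies being compared here (one query per link versus one cached title set), assuming a hypothetical pages(title) table:

```python
import sqlite3

def resolve_per_link(conn, links):
    """One database lookup per link (no caching)."""
    out = {}
    for title in links:
        hit = conn.execute("SELECT 1 FROM pages WHERE title = ?", (title,)).fetchone()
        out[title] = "view" if hit else "create"
    return out

def resolve_with_cached_set(conn, links):
    """Fetch every title once, then resolve all links in memory."""
    existing = {t for (t,) in conn.execute("SELECT title FROM pages")}
    return {t: ("view" if t in existing else "create") for t in links}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (title TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO pages VALUES (?)", [("Main Page",), ("Sandbox",)])

links = ["Main Page", "Missing", "Sandbox"]
```

Both return the same mapping; which one is faster depends on links per page versus total page count, which is exactly what a benchmark would tell you.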
In my own wiki system the caching is pretty simple: when a page is updated, it checks the page's links to make sure they are valid and applies the correct formatting/target to those that aren't. The cached page is saved as an HTML file in my cache root.
Pages that are marked as 'not created' during the page update are inserted into a database table that holds the missing page name together with a CSV list of the pages that link to it.
When someone creates that page it initiates a scan to look through each linking page and re-caches the linking page with the correct link and formatting.
If you weren't interested in highlighting non-created pages, however, you could just check whether the page exists when someone attempts to access it, and if not, redirect to the creation page. Then just link to pages as normal in other articles.
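The update-time bookkeeping described above can be sketched like this; in-memory dicts stand in for the missing-pages table and the HTML cache, and the markup and helper names are made up for illustration:

```python
import re

WIKILINK = re.compile(r"\[\[([^\]]+)\]\]")

pages = {"Home": "See [[Setup]] for details."}   # page title -> source text
cache = {}                                        # page title -> cached HTML
wanted = {}                                       # missing target -> linking pages

def render(title):
    """Re-cache one page, recording any links to not-yet-created pages."""
    def link(m):
        target = m.group(1)
        if target in pages:
            return '<a href="/%s">%s</a>' % (target, target)
        wanted.setdefault(target, set()).add(title)
        return '<a class="new" href="/create/%s">%s</a>' % (target, target)
    cache[title] = WIKILINK.sub(link, pages[title])

def create_page(title, text):
    """Create a page, then re-cache every page that linked to it."""
    pages[title] = text
    render(title)
    for linker in wanted.pop(title, ()):
        render(linker)

render("Home")                          # caches Home with a 'create' link
create_page("Setup", "Install steps.")  # flips Home's link to a view link
```

The key property is that a page's cache is only touched when it is edited or when one of its dangling targets comes into existence.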
I tried to do this once and it was a nightmare! My solution was a nasty loop in an SQL procedure, and I don't recommend it.
One thing that gave me trouble was deciding which link to use for a multi-word phrase. Say you have some text saying "I am using Stack Overflow" and your wiki has three pages called "stack", "overflow" and "stack overflow": which part of the phrase gets linked where? It will happen!
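One common way out of that ambiguity is to prefer the longest existing title starting at each position. A greedy sketch, with a hypothetical set of page titles:

```python
def link_phrase(words, titles):
    """Greedily link the longest page title starting at each word."""
    out, i = [], 0
    while i < len(words):
        # Try the longest candidate phrase first, shrinking toward one word.
        for j in range(len(words), i, -1):
            phrase = " ".join(words[i:j]).lower()
            if phrase in titles:
                out.append("[[%s]]" % phrase)
                i = j
                break
        else:
            out.append(words[i])    # no title starts here; keep the word
            i += 1
    return " ".join(out)

titles = {"stack", "overflow", "stack overflow"}
link_phrase("I am using Stack Overflow".split(), titles)
# -> "I am using [[stack overflow]]"
```

So "stack overflow" wins over the two shorter titles, and the shorter pages are only linked where the longer phrase doesn't match.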
My idea would be to query the titles with something like SELECT title FROM articles and simply check whether each wikilink is in that set of strings. If it is, you link to the page; if not, you link to the create page.
In a personal project I made with Sinatra, after running the content through Markdown I do a gsub to replace wiki words and other patterns (like [[Here is my link]]) with proper links, checking for each whether the page exists and linking to the create or view action accordingly.
It's not the best, but I didn't build this app with caching/speed in mind; it's a simple, low-resource wiki.
If speed were more important, you could wrap the app in something that caches it. For example, Sinatra can be wrapped with Rack caching middleware such as Rack::Cache.
Based on my experience developing Juli, an offline personal wiki with autolinking, generating static HTML may fix your issue.
As you suspect, generating autolinked wiki pages takes a long time. With static HTML generation, however, pages only need to be regenerated when a wiki page is newly added or deleted (in other words, not when a page is merely updated), and the regeneration can run in the background, so it usually doesn't matter how long it takes. Users only ever see the generated static HTML.
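To make the "only on add or delete" rule concrete, here is a sketch of deciding which static files to rebuild after an event; the function and event names are my own, not Juli's:

```python
def pages_to_regenerate(kind, title, backlinks):
    """Which static pages must be rebuilt after a change to `title`.

    kind is "edit", "add", or "delete"; backlinks maps a title to the
    set of pages whose text links to it.
    """
    if kind == "edit":
        return {title}          # an edit cannot flip autolinks elsewhere
    # Adding or deleting a page flips its autolinks on every linking page.
    affected = set(backlinks.get(title, ()))
    if kind == "add":
        affected.add(title)     # the new page itself also needs HTML
    return affected
```

Edits stay cheap (one page), while the expensive fan-out only happens on the rarer add/delete events, which is what makes the background regeneration tolerable.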