What's the difference between Cargo and SMW? - MediaWiki

The page on the MediaWiki extension "Cargo" states that it is similar to SMW and stores template content/data in database tables. Besides the technical and UI parts, what is the difference from SMW?
I am looking at SMW because of the notion of "properties" as "typed" hyperlinks in a graph of "categorized" wiki pages.
Is this functionality provided by Cargo as well?

A comparison between Cargo, SMW and Wikibase can be found here: https://www.mediawiki.org/wiki/Manual:Managing_data_in_MediaWiki

You can find an alternative comparison between SMW and Cargo, including some recommendations, at https://professional.wiki/en/articles/managing-data-in-mediawiki
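To make the difference concrete, here is a rough wikitext sketch of how each extension might model a typed authorship link (the property, table, and field names are made up for illustration; check each extension's documentation for the exact syntax):

    <!-- SMW: annotate a typed link directly in the page text -->
    [[Has author::Jimmy Wales]]

    <!-- SMW: query pages by that property -->
    {{#ask: [[Category:Books]] [[Has author::Jimmy Wales]] |?Has author }}

    <!-- Cargo: a template declares a table, stores a row, and queries it -->
    {{#cargo_declare:_table=Books|author=Page}}
    {{#cargo_store:_table=Books|author={{{author|}}}}}
    {{#cargo_query:tables=Books|fields=_pageName,author|where=author='Jimmy Wales'}}

So Cargo can represent the same page-to-page relationships, but the "typing" lives in the table schema a template declares, whereas in SMW each property is itself a wiki page with a datatype, which is closer to the typed-hyperlink graph you describe.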


How can I restrict search results to certain sections of an RTD manual?

We use RTD to produce the documentation for our project. The documentation consists of two broad parts: book-style chapters & sections on the one hand, and auto-generated documentation extracted from the code on the other.
We find that often one wants to search only one of the two, but not both at the same time. Is there a way to set this up, or a clever use of an existing mechanism that would do this? A kind of keyword or specifier in the search box would be ideal.
For reference: the full documentation and the auto-generated part.
Currently this isn't possible, but we are taking it into consideration: https://github.com/readthedocs/readthedocs.org/issues/5966.
In the meantime, Steve's solution to this very question (How can I restrict search results to certain sections of an RTD manual?) is a great workaround. It uses a patched version of Sphinx's searchtools.js: https://github.com/plone/documentation/blob/7dc58219ee75129831a481f24849defcd8b290bd/docs/_static/searchtools.js#L225-L231.
Another solution, similar to the above, is to use our API (https://docs.readthedocs.io/en/stable/server-side-search.html#get--api-v2-search-) and filter the results client-side, using the path attribute of each result.
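A minimal sketch of that client-side filtering (the project slug, version parameter, and exact response shape are assumptions based on the API docs linked above; adjust to what your instance actually returns):

    // Query the Read the Docs search API, then keep only hits whose
    // path falls under a given prefix (e.g. the auto-generated part).
    async function searchSection(query, pathPrefix) {
      const url = new URL('https://readthedocs.org/api/v2/search/');
      url.searchParams.set('q', query);
      url.searchParams.set('project', 'my-project'); // hypothetical slug
      url.searchParams.set('version', 'latest');

      const response = await fetch(url);
      const data = await response.json();

      // Each result is expected to expose a `path` attribute, as noted
      // above; filter on it so only one part of the manual shows up.
      return data.results.filter(hit => hit.path.startsWith(pathPrefix));
    }

    // e.g. restrict hits to the API reference section:
    searchSection('widget', '/api/').then(hits => console.log(hits));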

Multilingual MediaWiki installation using a wiki family vs. a single multilingual MediaWiki extension

I am trying to set up a multilingual encyclopedia (4 languages), where I can have both:
Articles that are translations of articles in other languages, and
Articles that exist in a specific language only.
As the wiki grows, I understand that the content of each language can vary.
However, I want to be able to work as fluently as possible between languages.
I checked this article, dating back to 2012, which has a comment from Tgr that basically condemns both solutions.
I also checked this MediaWiki help article, but it gives no explanation of the differences between the two systems.
My questions are:
1. What is the preferred option now for a multilingual wiki environment that gives the most capabilities and the best user experience, given that some of the languages I want are right-to-left and some are left-to-right?
I want internationalization of category names, I need to link categories to their corresponding translations, and I want users to see the interface in the language the article is written in.
So basically it would be as if I had 4 encyclopedias, but with the articles linked to their corresponding translations.
2. Which system would give me a main page per language, so that English readers would see an English homepage, French readers a French homepage, etc.?
EDIT:
I have a dedicated server, so the limitation of shared hosting is not there.
Thank you very much.
The Translate extension is meant for maintaining identical translations and tracking up-to-date status while other solutions (interwiki links, Wikibase, homegrown language templates) typically just link equivalent pages together. Translate is useful for things like documentation, but comes with lots of drawbacks (for example, WYSIWYG editing becomes pretty much impossible and even source editing requires very arcane syntax). It's best used for content which is created once and then almost never changes.
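To give a sense of that syntax: a page prepared for Translate is wrapped in translation-unit markers, roughly like this (a minimal sketch; the numbered markers are assigned by the extension when the page is marked for translation and must be preserved on every edit):

    <translate>
    <!--T:1-->
    This paragraph is one translation unit.

    <!--T:2-->
    Each unit is tracked separately, which is what makes plain source editing feel so arcane.
    </translate>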
You cannot get internationalized category names in a single wiki as far as I know. (Maybe if you wait a year or so... there is ongoing work to fix that, by more powerful Wikibase integration.) Large multi-language wikis like Wikimedia Commons just do that manually (create a separate category page for each category in each language).
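In a wiki-family setup, that manual linking is done with ordinary interlanguage links on each category page, roughly like this (a sketch; the category names are invented, and the language prefixes must be configured in the family's interwiki table):

    <!-- On [[Category:Paintings]] in the English wiki -->
    [[fr:Catégorie:Peintures]]
    [[ar:تصنيف:لوحات]]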

What is better: WebSocket-Node or ws? And is there a standard interface for Node.js WebSockets?

I want to move away from socket.io to plain WebSockets to take advantage of binary data transfers and get rid of the base64 encoding.
There seem to be two main WebSocket libraries for Node.js, both on GitHub:
Worlize/WebSocket-Node
einaros/ws
Both seem to get regular updates, and both claim to support the RFC 6455 standard.
Does anyone have experience with either or both of these and can share it and/or make recommendations? Or does anyone know where I can find a recent comparison of them?
Furthermore, are there any plans for an official server-side WebSocket interface standard? These two libraries seem to have different APIs. I did find this, but it is clearly for the client side only, and significantly newer than the date on the RFC standard.
I have been looking through every variation of Google search I can think of, and many related StackOverflow questions, but none seem to answer my question, and even the top Google results on the subject are several years out of date. Some related but insufficient StackOverflow threads include:
which-websocket-library-to-use-with-node-js
are-websockets-really-meant-to-be-handled-by-web-servers
web-sockets-server-side-implementation-for-nodejs
einaros/ws works great. However, WebSocket-Node comes with routing support, which is quite handy for non-trivial implementations.
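For reference, a minimal einaros/ws echo server handling binary frames looks roughly like this (a sketch against the classic ws API; check the version you install, since the API has evolved over the years):

    const WebSocket = require('ws');

    const wss = new WebSocket.Server({ port: 8080 });

    wss.on('connection', (socket) => {
      socket.on('message', (data) => {
        // Binary frames arrive as a Buffer -- no base64 round-trip,
        // which is the advantage over socket.io noted in the question.
        socket.send(data, { binary: true });
      });
    });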

GEDCOM to HTML and RDF

I was wondering if anyone knew of an application that would take a GEDCOM genealogy file and convert it to HTML for viewing and publishing on the web. I'd like to have separate HTML files for each individual, and perhaps additional files for other content as well. I know there are some tools out there, but I was wondering if anyone has used any of them and could advise. I'm not sure what form to look for such applications in: they could be Python or PHP files that one can edit, or even JavaScript (maybe), or just executables.
The next issue might be appropriate for a topic in itself: export of GEDCOM to RDF. My interest here would be to align the information with specific vocabularies, such as BIO or REL, both of which extend FOAF.
Thanks,
Bruce
Like Rob Kam said, Ged2Html was the most popular such program for a long time.
GRAMPS can also create static HTML sites and has the advantage of being free software and having a native XML format which you could easily modify to fit your needs.
Several years ago, I created a simple Java program to turn GEDCOM into XML. I then used XSLT to generate HTML and RDF. The HTML I generate is pretty rudimentary, so it would probably be better to look elsewhere for that, but the RDF might be useful to you:
http://jay.askren.net/Projects/SemWeb/
There are a number of these, all listed at http://www.cyndislist.com/gedcom/gedcom-to-web-page-conversion/
Ged2html used to be the most popular and most versatile, but it is no longer being developed. It's an executable, with output customisable through its own scripting syntax.
Family Historian (http://www.family-historian.co.uk) will create exactly what you are looking for, e.g. one file per person, using the built-in Web Site creator, as will a couple of the other major genealogy packages. I have not seen anything for the RDF part of your question.
I have since tried to produce a genealogy application using Semantic MediaWiki (MediaWiki, the software behind Wikipedia, plus Semantic MediaWiki and its various Semantic Web related extensions). I thought it was very easy to use, with the forms and the ability to upload a GEDCOM, but some feedback from people into genealogy was that it appeared too technical and didn't seem to offer anything new.
So now the issue is whether to stay with MediaWiki and make it more user-friendly, or to create an entirely new application that allows for adding and updating data in a triple store, as well as displaying it. I'm not sure how to generate a graphical family-tree view of the data, like on sites such as ancestry.com, where one can click on a box to see and update details about that person, or click a right or left arrow beside a box to navigate the tree. The data would come from SPARQL queries sent to the dataset/triple store, both when displaying the initial view and when navigating the tree, where an Ajax call is needed to fetch more data.
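For the navigation step, the query behind each Ajax call might look roughly like this (a sketch only: the person IRI and the endpoint are placeholders, and rel:parentOf / rel:childOf come from the REL vocabulary mentioned earlier):

    PREFIX foaf: <http://xmlns.com/foaf/0.1/>
    PREFIX rel:  <http://purl.org/vocab/relationship/>

    # Fetch a person's name plus immediate relatives to redraw the tree
    SELECT ?name ?relative ?relativeName WHERE {
      <http://example.org/person/123> foaf:name ?name .
      { <http://example.org/person/123> rel:parentOf ?relative . }
      UNION
      { <http://example.org/person/123> rel:childOf ?relative . }
      ?relative foaf:name ?relativeName .
    }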
Bruce

MediaWiki Extension for Q&A

Is anyone aware of an existing MediaWiki extension that would allow users to ask questions in a similar manner to this website or Yahoo Answers? I've been looking for an extension to let users ask and answer questions referring to specific pages on my wiki, and I was hoping someone might have already implemented this.
I've had no luck searching, since the keywords I've been using to describe what I'm looking for return a wide range of results.
I think there are currently these options:
The simplest is the talk page, available for every article. You could use the main page's talk page (see the Wikipedia example, though Wikipedia uses it only for discussion about main page content, not for general discussion) or the Village Pump / Community portal idea for general discussions. Not ideal, but it works.
You can improve on this somewhat using the LiquidThreads extension. This enhances the talk pages, but does not include the gamification elements found here (e.g. upvotes).
If you just want to get community views, you can use a poll like QPoll.
Finally, you can use a Chat room.
FWIW - I asked a similar question over on meta.
(You may also be interested in Wikis and Wikipedia.)
Currently there is Wikia Answers, which is essentially a MediaWiki-based website with additional extensions for Q&A. Many of its extensions are not yet available directly for download (you may find the site's extensions at https://github.com/Wikia).
I would question why you're using MediaWiki for a project like this in the first place. It simply isn't designed for it. I think you'd get more out of looking at a system that treats a website as a collection of data chunks, rather than as a collection of documents (which is what MediaWiki is great at). Something like Drupal is going to get you where you want to be far faster than MediaWiki; in fact, it's already been done: https://drupal.org/project/answers. I wouldn't be surprised if something similar exists for Joomla, or even WordPress.
If you're looking at MediaWiki because you're already using it for part of your site, there's no reason why you can't run another CMS alongside it.