MediaWiki categories not updating automatically

I have a MediaWiki instance running on a Linux server that does not seem to automatically update its categories whenever a page edit is submitted.
I have never seen this on any other wikis I've worked with, so I'm wondering if there's some script that's not configured correctly.
Periodically running the rebuildAll.php or refreshLinks.php script isn't really viable when we have hundreds of edits happening per hour and need the categories to be populated instantaneously.
Running rebuildAll.php seems to take hours anyway.
Has anyone else encountered this?
Thanks!

What's $wgJobRunRate set to? Try running maintenance/runJobs.php.
It can sometimes take a while (minutes perhaps) for a page to appear in a category, but for most wikis it shouldn't be more than that.
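
Category membership in MediaWiki is refreshed through the job queue, so if nothing is servicing the queue, the category tables never catch up after an edit. A minimal sketch of the two usual fixes, assuming a stock install where LocalSettings.php is editable and cron is available (the rate, schedule, and paths below are illustrative, not recommendations):

// In LocalSettings.php: process up to one queued job per page request.
// A value of 0 disables in-request job processing entirely, which leaves
// categories stale until runJobs.php is run by hand.
$wgJobRunRate = 1;

# Or service the queue out of band from cron, e.g. every five minutes:
*/5 * * * * php /path/to/mediawiki/maintenance/runJobs.php --maxjobs 1000

Running the queue from cron has the added benefit of keeping job-processing latency off your readers' page requests.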

Related

Site compromised: ZmEu attack

My site has been compromised by ZmEu attacks. In the logs I find suspicious user agents named "ZmEu".
The site returns a 500 internal server error. There are no related error logs in the Apache error log.
There were several dummy files scattered all over my server. I removed all of them.
But the site is still down.
What is the main target of such attacks? (What files are modified, and how do I get them back?)
Where should I look to fix the issues?
If anyone has been through this situation, please give your advice.
Update: it's a WordPress site that is not working. Other apps in subdirectories are working fine.
Thanks in advance,
You restore from a backup in this situation.
It will be tough to sort through and reverse everything 100%. The hacker could even have changed the modification times on the files, so you'll never be able to tell what has or hasn't been accessed without combing through every line.

InfoPath 2010 not returning data from Access DB

I have been working on a submission form for two of our employees to enter tax data. It is an InfoPath 2010 form connecting to an Access 2010 .accdb. The purpose of the form is to pull related data from two source tables (one from the old DB that was used, and the other from APX, which houses additional information) to prefill as many fields as possible.

Everything works fine when running it from my computer, or directly off our server. The problem I am running into now is that our two users have access to the files and can open them with InfoPath Filler, but upon opening they get "InfoPath cannot connect to the data source...". The funny thing is that last week they were able to connect and submit data with no problem (then one day I came back from lunch and it no longer worked).

To set them up originally, I created a certificate, made the forms full trust, and added both user IDs with read/write access. When I run the form from my desktop it works without a hitch. I even tried remapping to an .mdb to see if it was a version issue. The DB is stored on a shared domain, \testdomain\, for argument's sake. The users access the form via the same directory. Just to note, SharePoint is not connected in any way.

All the searches I have done have yielded no solutions. I am thinking it is a networking issue, and I have a meeting with the network admin in a few hours. But what I can't figure out is how it worked before and not now. Does anyone have any suggestions or thoughts on what could be causing this? Just to clarify, I can run the same .xsn form they run, from the same server location, without having the issue they do. I truly appreciate anything anyone might be able to offer!
OK, I finally figured out what was causing the problem (I feel totally dumb). Apparently the servers get backed up at random intervals, and when they do, the permissions get overwritten. If anyone else runs into this, review the access permissions for the directory and all of the files. Thanks, Nathan, for your advice!
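
In case it helps anyone checking for the same thing: on a Windows share, the directory and file ACLs can be reviewed and re-granted from a command prompt. A rough sketch with placeholder share paths and account names (adjust to your own domain):

rem List the ACLs on the form directory and everything beneath it
icacls "\\testdomain\forms" /T

rem Re-grant a user Modify rights that files and subfolders inherit
icacls "\\testdomain\forms" /grant "DOMAIN\user1:(OI)(CI)M"

If backups keep clobbering the permissions, a scheduled task that re-applies the grant after each backup window is one way to stay ahead of it.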

How do you write a Rails cron job with the Whenever gem?

The Whenever gem is installed. I have seen all kinds of cron and Whenever tutorials about scheduling tasks with minutes, hours, days, etc., and I watched the RailsCasts episode on cron/Whenever.
But I have yet to find any examples of how to write the job itself, other than Rake tasks.
I want to schedule a task that checks the database for changes. The idea is that the database will have tags that tell you which particular row has changed. Whenever should poll the database periodically to check for these changes. Then it hopefully can push, or let the client know it needs to update the page dynamically using ajax.
If I were doing this manually, I'd use commands like:
rails dbconsole
select blah from blah;
Is there a way to write MySQL commands in Whenever? Is this the correct/best way to poll the database for changes?
I know there are ways to poll a database from MySQL itself, but I've been specifically told to do it from the Rails side.
I'm a newbie to all of these technologies (Rails, databases, Ajax), so that's probably why the answer isn't clear to me.
On the client end, I have buttons that use jQuery to add/delete/change row data, just to assure myself I know how to change things in the table once I can get data from the database. Those buttons will eventually be removed.
Right now the page uses Ajax to refresh the entire HTML table, but they would like just a row-level refresh/update through Ajax.
Look at the RailsCast on cron/Whenever again. You'll notice an example line of code like this:
runner "MyModel.some_process"
The code in the string is evaluated and run, so whatever you want Whenever to run, write that code yourself and provide a way for it to be called.
So if you create a class named DatabaseWatcher, stored in lib, with a class-level method named .run, you'd do the following:
runner "DatabaseWatcher.run"
And that's it. Your .run method is where you'd put your logic. As for how to actually write that code, it depends on your requirements. Are you looking for rows whose updated_at time is within one minute of now? Do you store the time you last checked the DB, so you can see whether a row's updated_at is greater than that? Do you have a table that records every time the model changes? That all depends on you.
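
To make that concrete, a minimal sketch of the two files involved, assuming a hypothetical Article model with a needs_refresh boolean acting as the "changed" tag (all names here are illustrative, not from the question):

# config/schedule.rb
every 5.minutes do
  runner "DatabaseWatcher.run"
end

# lib/database_watcher.rb
class DatabaseWatcher
  # Poll for rows tagged as changed, hand them to whatever notifies
  # the client, then clear the tag so a row isn't reported twice.
  def self.run
    Article.where(needs_refresh: true).find_each do |article|
      # push/notify logic for the Ajax side goes here
      article.update_attribute(:needs_refresh, false)
    end
  end
end

After editing schedule.rb, run whenever --update-crontab to write the actual cron entry. Depending on your Rails version, lib/ may also need to be added to config.autoload_paths so the runner can find the class.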

Forcing an edit of an Access backend database

I have about 20 users who each have their own Access front end and write to a backend.
I need to do an update to the backend, and every time I try to do it off-hours, it seems that someone is still logged on, because the file is locked!
How do I get around this? I do tell the users to log off at the end of the day, but a lot of them forget! Is there a way to force them off?
Can I disconnect someone from an Access database?
Idle Detect/Inactivity Timeout
How to determine who is logged on to a database by using Microsoft Jet UserRoster in Access
I don't expect any points for my answer, as Remou already gave you these. But whenever reasonable, I prefer to give pages with links to the official KB articles rather than who knows how mangled code can get in a forum, including this one.
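
That said, the core of the UserRoster article is short enough to summarize; a rough VBA sketch run against the backend (the UNC path is a placeholder, and the GUID is the Jet UserRoster schema identifier documented in the KB article):

Sub ShowUserRoster()
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=\\server\share\backend.mdb" ' placeholder path
    ' -1 = adSchemaProviderSpecific; the GUID selects the Jet UserRoster
    Set rs = cn.OpenSchema(-1, , "{947bb102-5d43-11d1-bdbf-00c04fb92675}")
    Do Until rs.EOF
        ' Columns: COMPUTER_NAME, LOGIN_NAME, CONNECTED, SUSPECTED_STATE
        Debug.Print rs.Fields(0).Value, rs.Fields(1).Value
        rs.MoveNext
    Loop
End Sub

This only tells you who is holding the lock; actually forcing them off still comes down to the idle-detect approach in the first link.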

Drupal 6: heavy use of the Views module causing the site to go down because of too many MySQL connections

I have a HostGator Baby shared plan that I am developing a Drupal site on. Everything was fine at the beginning, but as I got further into development the site started to work really slowly. Now it is not working at all, giving MySQL errors like "Too many connections", etc.
I created so many blocks and pages with Views that my site depends heavily on the database. Should I not do that? Could that be the reason my site isn't working now?
Any help is appreciated!
Don't use HostGator. If you're looking for something in the same price range, try DreamHost -- they officially support Drupal on their hosting plans.
Ferran's answer is fine once you're done developing, but you shouldn't develop with the cache turned on, or else you won't be able to see your own changes.
Yes, Drupal uses the database heavily. However, you can cache the results of most Views (check the main screen of each View to enable it), and there is also an option to cache the blocks. It all depends on the content of your Views and whether users are mostly registered or anonymous.
You can also use modules like Boost, which saves static HTML pages so that for anonymous visits your database is not touched at all (except the first time, as the page has to be generated at some point...).
You might also want to check the Views filters so you're not fetching too many rows at a time (for example, fetching only a week back for the front page).