In my CodeIgniter project, an Excel file of around 5,000 records is uploaded and the rows are updated in the database. This action takes around 15 minutes to complete. Can anyone give me an idea of how to run this task in the background, so the user doesn't need to wait until the update queries finish?
I don't know of anything like Jobs or a Scheduler for CodeIgniter.
For your specific use case you could also look into PHP Generators.
https://www.php.net/manual/en/language.generators.overview.php
Generators can probably help significantly with the task you describe, mainly by letting you stream the rows instead of holding all of them in memory at once.
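For instance, if the uploaded file can be read as CSV (an Excel reader that exposes rows works the same way), a generator lets you process one row at a time instead of loading all 5,000 records into an array first. A minimal sketch; the file name and column handling are placeholders:

<?php
// Yield one row at a time; the whole file is never held in memory.
function readRows($path)
{
    $handle = fopen($path, 'r');
    while (($row = fgetcsv($handle)) !== false) {
        yield $row;
    }
    fclose($handle);
}

foreach (readRows('upload.csv') as $row) {
    // run the UPDATE for this row here
}

Note that generators mostly cut memory use; to cut the 15-minute runtime, wrapping the per-row UPDATEs in a single transaction (or batching them) usually helps far more.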
Related
The whenever gem is installed. I see all kinds of cron and whenever tutorials about scheduling tasks with minutes, hours, days etc. I watched the railscast video on cron/whenever.
But I've yet to find any examples about how to write a job itself, other than rake tasks.
I want to schedule a task that checks the database for changes. The idea is that the database will have tags that tell you which particular row has changed. Whenever should poll the database periodically to check for these changes. Then it can hopefully push, or otherwise let the client know, that it needs to update the page dynamically using ajax.
If I were doing this manually, I'd use commands like:
rails dbconsole
select blah from blah;
Is there a way to write mysql commands in whenever? Is this the correct/best way to poll the database for changes?
I know there are ways to poll a database from mysql itself, but I've been specifically told to do it from the rails side.
I'm a newbie to all of these technologies (Rails, databases, ajax) so that's probably why the answer isn't clear to me.
On the client end, I have buttons that use jQuery to add/delete/change row data, just to assure myself I know how to change things in the table once I can get stuff from the database. Those buttons will eventually be removed.
Right now, the page uses ajax to refresh the entire HTML table, but they would like just a single-row refresh/update through ajax.
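For what it's worth, the kind of single-row update I'm picturing on the client looks roughly like this (the URL, JSON shape, and row IDs are all made up):

// Poll every 5 seconds and replace only the rows the server says changed.
setInterval(function () {
    $.getJSON('/rows/changed', function (rows) {
        $.each(rows, function (i, row) {
            // assumes the server returns [{id: 7, html: "<tr id='row-7'>...</tr>"}, ...]
            $('#row-' + row.id).replaceWith(row.html);
        });
    });
}, 5000);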
Look at the RailsCast for cron/whenever again. You'll notice an example line of code like this:
runner "MyModel.some_process"
The code in the strings is evaluated and run. So whatever you want whenever to run, just write that code yourself and have a way for it to be called.
So if you create a class named DatabaseWatcher, store it in lib, and give it a class-level method named .run, you'd do the following:
runner "DatabaseWatcher.run"
And that's it. Your .run method is where you'd put your logic. As for how to actually write that code, that depends on your requirements. Are you looking for rows whose updated_at time is within one minute of now? Do you store the time you last checked the DB, so you can see whether updated_at is greater than that? Do you have a table that records every time the model is changed? That all depends on you.
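Putting the pieces together, a minimal sketch might look like this (the model name, the 5-minute interval, and the logging are all assumptions):

# config/schedule.rb (whenever's DSL)
every 5.minutes do
  runner "DatabaseWatcher.run"
end

# lib/database_watcher.rb -- hypothetical watcher class
class DatabaseWatcher
  def self.run
    # Look one polling interval back; a stored "last checked" timestamp
    # would be more robust than this fixed window.
    MyModel.where("updated_at > ?", 5.minutes.ago).find_each do |record|
      Rails.logger.info "Row #{record.id} changed at #{record.updated_at}"
      # notify clients / enqueue work here
    end
  end
end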
I'm planning to debug a Joomla site by logging each query and its execution time to a database table. I have more than 10 models, which run different queries. I'm pretty sure that all the queries go through a single place/class before executing, but I have no idea where/what that place/class is.
My question is: is there a central place I can edit to log each SQL query and its execution time? I mean editing a core file, just to log every SQL query and its execution time.
How can I get it done?
Have you considered using Joomla's built-in System Debug?
Rather than trying to do this programmatically with brute force, it seems it would be far easier and less intrusive to use a proper SQL benchmarking tool, such as the MySQL Benchmark Suite. Another possible non-brute-force option might be Toad World.
If you wanted to stay away from third-party tools, a slow query log might be the place to start.
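For example, the slow query log can be switched on at runtime (MySQL 5.1+); the threshold and path below are just examples:

-- Log anything slower than half a second to a file.
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 0.5;
SET GLOBAL slow_query_log_file = '/var/log/mysql/slow.log';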
If you really want to do it via Joomla (a hack):
Go to Joomla's database driver; for 3.3 that is libraries/joomla/database/driver.php.
Remove the setDebug function (in case some component sets it to 0).
At the start of the file, change the $debug = false; default so that $this->debug is always true.
Now, every query gets logged together with profile information.
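If you'd rather not touch core files at all, the same idea can be illustrated outside Joomla: funnel every query through one central function that times it and appends to a log. This is a generic PDO sketch, not Joomla's driver API, and the log path is a placeholder:

<?php
// One choke point for all queries: time each one and append it to a log.
function timedQuery(PDO $pdo, $sql)
{
    $start   = microtime(true);
    $stmt    = $pdo->query($sql);
    $elapsed = microtime(true) - $start;
    file_put_contents(
        '/tmp/query.log',                       // placeholder log path
        sprintf("%.6f  %s\n", $elapsed, $sql),
        FILE_APPEND
    );
    return $stmt;
}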
Can anyone tell me if this is possible, and if so, point me in the right direction on how to implement it?
Basically, I have a panel on my website that hits the database and outputs the data to the screen every time somebody hits the page. This is a heavy burden on the server, and what I would like to do is create an HTML file with the information every 30 minutes and then request that HTML file with an include on the website.
<!--#include file="myHTMLpanel.asp"-->
By the way, i'm using classic .ASP ;o(
ASP provides a great mechanism for generating web pages; there's no point in creating your own. Use caching, along the lines of the sketch below, and create a scheduled task that hits the server every 30 minutes and breaks the cache.
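A sketch of that idea in Classic ASP; BuildPanelHtml is a placeholder for your existing query/rendering code, and the 30-minute window comes from the question:

<%
' Cache the panel HTML in the Application object and rebuild it only
' when it is older than 30 minutes.
Dim cached, builtAt
cached  = Application("panelHtml")
builtAt = Application("panelBuiltAt")
If IsEmpty(builtAt) Then builtAt = CDate(0)  ' VBScript's Or doesn't short-circuit

If IsEmpty(cached) Or DateDiff("n", builtAt, Now()) >= 30 Then
    cached = BuildPanelHtml()  ' placeholder for the existing DB/render code
    Application.Lock
    Application("panelHtml")    = cached
    Application("panelBuiltAt") = Now()
    Application.Unlock
End If

Response.Write cached
%>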
Use VBScript to write a script that runs your query and writes out the HTML page, then schedule it to run every 30 minutes via a scheduled task.
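A rough sketch of such a script (the connection string, query, and output path are placeholders):

' panelbuilder.vbs -- run by a scheduled task every 30 minutes.
Dim conn, rs, fso, file, html
Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=myServer;Initial Catalog=myDb;Integrated Security=SSPI;"
Set rs = conn.Execute("SELECT Name, Value FROM PanelData")

html = "<table>"
Do While Not rs.EOF
    html = html & "<tr><td>" & rs("Name") & "</td><td>" & rs("Value") & "</td></tr>"
    rs.MoveNext
Loop
html = html & "</table>"

' Overwrite the file the site includes.
Set fso = CreateObject("Scripting.FileSystemObject")
Set file = fso.CreateTextFile("C:\inetpub\wwwroot\myHTMLpanel.asp", True)
file.Write html
file.Close

rs.Close
conn.Close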
You might want to investigate a pre-built caching solution on the server rather than writing it manually.
A product such as Varnish will make it virtually effortless.
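For instance, a fragment of Varnish VCL (4.x syntax) along these lines would cache a page, or a panel exposed at its own URL, for 30 minutes; the URL here is borrowed from the include above:

# Varnish VCL fragment; assumes the panel is served at its own URL.
sub vcl_backend_response {
    if (bereq.url == "/myHTMLpanel.asp") {
        set beresp.ttl = 30m;
    }
}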
I need a local copy of our production database, and I need to refresh it every few days so testing and development aren't working with terribly stale data. A few days old is just fine. Here is the pseudo plan:
Write a script on the production server that runs mysqldump and gzips the output (sketched below).
Add a cron job to run the script every other day during non-peak hours.
Write a script on the workstation that rsyncs the gzipped dump over and loads it up.
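A minimal sketch of those three steps; host names, credentials, and paths are placeholders:

# 1) dump.sh on the production server
mysqldump --single-transaction -u backup -pSECRET mydb | gzip > /var/backups/mydb.sql.gz

# 2) crontab entry on production: 03:30 every other day
30 3 */2 * * /usr/local/bin/dump.sh

# 3) on the workstation: pull the dump and load it
rsync -avz prod:/var/backups/mydb.sql.gz /tmp/
gunzip -c /tmp/mydb.sql.gz | mysql -u dev -pSECRET mydb_dev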
Is there any better, cleaner, or safer way of doing this?
EDIT: Just to add clarity: we still have known test data in place, along with our test library (test-driven development). Once THOSE tests pass, it's on to the (more) real stuff.
You may wish to consider MySQL replication. It isn't a thing to be trifled with, but it may be what you are looking for. More information here: http://dev.mysql.com/doc/refman/5.0/en/replication-features.html (I don't personally know anything about it other than that it can be done).
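In outline, the setup looks roughly like this; treat it as a sketch and follow the linked manual for the real procedure:

# On the master's my.cnf (restart MySQL afterwards):
#   server-id = 1
#   log_bin   = mysql-bin
# On the replica's my.cnf:
#   server-id = 2
# Then on the replica, point it at the master and start:
CHANGE MASTER TO
  MASTER_HOST='prod.example.com',
  MASTER_USER='repl',
  MASTER_PASSWORD='SECRET',
  MASTER_LOG_FILE='mysql-bin.000001',
  MASTER_LOG_POS=4;
START SLAVE;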
Testing should be working with "known" data, not production data. You should have scripts to load "test" data into the system to achieve this. Test/dev shouldn't have to deal with a moving target of data. Besides, if you have any sensitive data in production (doesn't everyone?), your dev/test teams shouldn't have access to it.
Some suggestions for creating test data:
1) Excel spreadsheets with VBA behind them that generate SQL to run against the DB
2) Raw SQL scripts (see the example below)
3) Data-creation programs that generate data in a known pattern.
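For option 2, even a tiny script gives you known, repeatable data; the table and values here are purely illustrative:

-- seed.sql: known, repeatable test data
TRUNCATE TABLE customers;
INSERT INTO customers (id, name, email) VALUES
  (1, 'Test User One', 'one@example.test'),
  (2, 'Test User Two', 'two@example.test');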
For security reasons, I'm in an environment where third-party apps can't access my DB. For this reason I need some service/tool/script (I don't know what yet... I'm open to the best option, still reading to see what I'm going to do...)
which enables me to generate, on a regular basis (daily, weekly, monthly), a CSV file with all new/modified records for a certain application.
I should be able to automate this process and also export a new file at any time.
So it should keep track, for each application, of which records it still needs.
Each application will need the data in a different format (csv/xls/sql), and some fields will be needed by one application but not by another... It should be fairly flexible...
What is the best option for me? Creating some custom tables for each application, and extracting the modified data based on those?
I think your best bet here, assuming you have access to the server to set this up, is to make a small command-line program that can do the relatively simple task you need. Languages like Perl are good for this sort of thing, I do believe.
Once you have that 'tool' made, you can schedule it through the server's OS to run every set amount of time: either a Scheduled Task for a Windows server or a cron job for a Linux server.
You can also (without having to set up the scheduled task, if you don't want to or can't) enable this small command-line application to be called via CGI. This is a special way of letting applications on the server be executed at will by a web user. If you do enable this, though, I suggest you add some sort of locking system so that it can only be run every so often, and to stop it being run five times at once.
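A rough Perl sketch of such a tool, with a lock file so overlapping runs can't collide (the DSN, table, columns, and high-water-mark date are all assumptions):

#!/usr/bin/perl
# Export rows modified since the last run to a (naive, unquoted) CSV.
use strict;
use warnings;
use Fcntl qw(:flock);
use DBI;

# The lock file stops cron and CGI invocations from running at once.
open my $lock, '>', '/tmp/export.lock' or die $!;
flock($lock, LOCK_EX | LOCK_NB) or die "Another export is already running\n";

my $dbh = DBI->connect('dbi:mysql:mydb', 'exporter', 'SECRET', { RaiseError => 1 });

# Assumes a modified_at column; the date is a placeholder for a stored
# per-application high-water mark.
my $rows = $dbh->selectall_arrayref(
    'SELECT id, name, modified_at FROM records WHERE modified_at > ?',
    undef, '2014-01-01 00:00:00'
);

open my $csv, '>', '/tmp/app1_export.csv' or die $!;
print {$csv} join(',', @$_), "\n" for @$rows;
close $csv;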
EDIT
You might also want to just look into database replication or adding read-only users. This saves a whole lot of arseing around. Try to find a solution that does not split or duplicate data. You can set up users that are only able to access certain parts of the database system in certain ways, such as SELECTing data.
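In MySQL, for example, a read-only account takes only a couple of statements (the names here are placeholders):

-- A read-only account limited to SELECT on one schema.
CREATE USER 'thirdparty'@'%' IDENTIFIED BY 'SECRET';
GRANT SELECT ON mydb.* TO 'thirdparty'@'%';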