I'm planning to debug a Joomla site by logging each query and its execution time to a database table. I have more than 10 models with different queries. I'm pretty sure that all the queries go through a single place/class before executing, but I have no idea where that place/class is.
My issue is: is there a central place I can edit to log each database query and its execution time? I mean something like editing a core file just to log every SQL query and its execution time.
How can I get it done?
Have you considered using Joomla's built-in System Debug?
Rather than trying to do this programmatically with brute force, it would be far easier and less intrusive to use a proper SQL benchmarking tool such as the MySQL Benchmark Suite. Another possible non-brute-force option might be Toad World.
If you wanted to stay away from third-party tools, a slow query log might be the place to start.
If you really want to do it via Joomla (a core hack):
Go to Joomla's database driver; for 3.3 that is libraries/joomla/database/driver.php
Empty out the setDebug() function so it no longer does anything (in case some component sets debug back to 0)
Near the top of the class, change the property declaration $debug = false; to $debug = true;
Now, every query gets logged together with profile information.
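Once debug is on, the driver collects every executed query, so you can dump them from your own code as well. A minimal sketch of reading that log, assuming Joomla 3.x's JDatabaseDriver API (setDebug() and getLog()) — adapt the log category to your needs:

```php
// Sketch for Joomla 3.x: with debug enabled, JDatabaseDriver records
// every query it executes; getLog() returns that list of SQL strings.
$db = JFactory::getDbo();
$db->setDebug(true);   // enable query logging for this request

// ... let your models run their queries ...

foreach ($db->getLog() as $i => $sql) {
    // 'queries' is an arbitrary category name chosen for this example
    JLog::add(($i + 1) . ': ' . $sql, JLog::DEBUG, 'queries');
}
```

From there you could just as easily INSERT each string into your own logging table instead of using JLog.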
Related
I have reviewed previous questions and haven't found the answer to the following:
Is there a database tool available in MS Access to identify the queries that are NOT being used in my database? We have lots of queries that are no longer used, and I need to clean the database and get rid of them.
Access does have a built-in “dependency” feature. The result is a very nice tree view of those dependencies, and you can even launch objects from that tree view to “navigate” the application, so to speak.
The option is found under database tools and is appropriately called Object Dependencies.
Note that this feature requires the Track Name AutoCorrect option to be turned on, even if you don't actually want AutoCorrect making changes. If this is a large application, the first run will incur a significant delay; after that, the results can be viewed instantly. Most developers keep Track Name AutoCorrect turned off (it is often jokingly referred to as “track name auto destroy”), but it must be enabled for this feature to work.
Unfortunately, you have to go query by query, but at least it will display the dependencies for each one (forms or reports). However, VBA code that creates SQL on the fly and uses such queries? It will not catch that case. So, at the end of the day, a query you delete may well still be used in code, and if that code creates SQL on the fly (as a LOT of VBA code does), you can never really be sure that the query is not used some place in the application.
So, the dependency checker can easily determine whether another query, a form/subform, or a report uses a given query; it does a rather nice job of that.
However, VBA code is a different matter: how VBA code runs and what it does cannot be determined until the code is actually run. In effect, a dependency checker would have to execute the VBA code, and even then, code will sometimes choose at runtime which query to run or use. I suppose you could do a quick search, since search is global for VBA (all code in modules, reports, and forms can be searched). That would find most uses of the query, but not all, since, as noted, VBA code often can and does create SQL on the fly.
I have a vague recollection that part of Access Analyzer from FMS Inc has this functionality built in.
Failing that, I can see 2 options that may work.
Firstly, you could use the built-in Database Documenter. This creates a report that you can export to Excel. You would then need to import that into the database and write some code that loops over the queries to see if they appear in this table.
Alternatively, you could use the undocumented SaveAsText feature to loop over all Forms/Reports/Macros/Modules in your database, as well as looping over the QueryDefs and saving their SQL into text files. You would then write some VBA to loop over the queries, open each of the text files, and check for the existence of the query name.
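A rough sketch of that export step (hedged: SaveAsText is officially undocumented, the procedure name and file-name prefixes here are made up for illustration, and this assumes a DAO reference):

```vba
' Sketch: export every form/report/module, plus each saved query's SQL,
' to text files under outFolder. You would then search those files for
' each query name. SaveAsText is undocumented; use at your own risk.
Public Sub ExportAllObjects(ByVal outFolder As String)
    Dim obj As AccessObject
    Dim qd As DAO.QueryDef
    Dim f As Integer

    For Each obj In CurrentProject.AllForms
        Application.SaveAsText acForm, obj.Name, outFolder & "\frm_" & obj.Name & ".txt"
    Next
    For Each obj In CurrentProject.AllReports
        Application.SaveAsText acReport, obj.Name, outFolder & "\rpt_" & obj.Name & ".txt"
    Next
    For Each obj In CurrentProject.AllModules
        Application.SaveAsText acModule, obj.Name, outFolder & "\mod_" & obj.Name & ".txt"
    Next

    ' Dump each saved query's SQL as well
    For Each qd In CurrentDb.QueryDefs
        f = FreeFile
        Open outFolder & "\qry_" & qd.Name & ".txt" For Output As #f
        Print #f, qd.SQL
        Close #f
    Next
End Sub
```

The checking pass would then loop the QueryDefs again, open each exported file, and use InStr() to look for the query name.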
Either way, rather than just deleting any unused queries, rename them to something like "old_Query" and leave them in the database for a month or so, just in case!
I have multiple forms with specific buttons to show/hide data (e.g. data from last year vs. all data). I've been thinking for a while about query performance, and I can't really find a good source that explains how Access handles this.
In particular, I was wondering whether there is a performance difference between using a saved query (for a specific case, like 'data from last year') and appending parameters to a query at runtime (say, adding 'WHERE xy = -1' to a query with no constraints). Will both queries be executed in the backend, or will any constraints I add to an existing query via VBA be applied in the frontend?
Should I create a query for every possible scenario or use only one query where I add parameters during runtime?
FYI, I'm using Jet and my database is split into frontend and backend (Access 2007). The backend will be located in a network folder.
Check the answers to this question: does stored procedure enhance the performance in an acess databank,why?
Mostly covers what you are asking for.
As Access is a file-based system rather than a dedicated server, absolutely nothing is run in the backend. Even if a query is stored there, everything is executed locally in whichever copy you have open, namely the frontend or the backend.
I have a classic ASP application with an Access database, and I am new to it. I need to log all the SQL queries that happen in the system, since the system seems to be losing data. Can anyone help me with this?
Thanks
MS Access does not support transaction logging like the more advanced database servers, so the only way to do this would be to control the environment in which any SQL is run, in this case the front-end application. You could then trigger a routine to save the executed SQL string. However, if people have direct access to the database and can run their own queries whenever they like, you have little chance of logging those.
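One hedged sketch of that "control the environment" idea in classic ASP: route every query through a single helper that writes the SQL text to a log table before executing it. The LogAndExecute name and the QueryLog table are made up here for illustration; it assumes an already-open ADODB.Connection:

```vbscript
' Hypothetical helper for classic ASP (VBScript): record the SQL string
' and a timestamp in a QueryLog table (LogTime, SqlText columns), then
' execute the query itself and return its recordset.
Function LogAndExecute(conn, sql)
    conn.Execute "INSERT INTO QueryLog (LogTime, SqlText) VALUES (Now(), '" & _
                 Replace(sql, "'", "''") & "')"
    Set LogAndExecute = conn.Execute(sql)
End Function
```

This only helps if every page calls the helper instead of conn.Execute directly; queries run outside the application will still go unlogged, as noted above.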
For some security reasons I'm in an environment where third-party apps can't access my DB. For this reason I need some service/tool/script (I don't know what yet; I'm open to the best option and still reading to see what I'm going to do)
which enables me to generate, on a regular basis (daily, weekly, monthly), a CSV file with all new/modified records for a certain application.
I should be able to automate this process and also export at any time a new file.
So it should keep track, for each application, of which records it still needs.
Each application will need the data in a different format (CSV/XLS/SQL), and some fields will be needed by one application but not another... It should be fairly flexible.
What is the best option for me? Creating some custom tables for each application and extracting the modified data based on those?
I think your best bet here, assuming you have access to the server to set this up, is to make a small command-line program that can do the relatively simple task you need. Languages like Perl are good for this sort of thing, I do believe.
Once you have that 'tool' made, you can schedule it through the OS of the server to run every set amount of time: either a Scheduled Task for a Windows server or a cron job for a Linux server.
You can also (without having to set up the scheduled task, if you don't want to or can't) enable this small command-line application to be called via CGI; this is a way of letting applications on the server be executed at will by a web user. If you do enable this, though, I suggest you add some sort of locking system so that it can only be run every so often, and to stop it being run five times at once.
EDIT
You might also want to just look into database replication or adding read-only users. This saves a whole lot of messing around. Try to find a solution that does not split or duplicate data. You can set up users so they are only able to access certain parts of the database system in certain ways, such as SELECTing data.
We have a single SQL Log for storing errors from multiple applications. We have disabled the elmah.axd page for each one of our applications and would like to have a new application that specifically displays errors from all of the apps that report errors to the common SQL log.
As of now, even though the new application is using the common SQL log, it only displays errors from the current application. Has anyone done this before? What in the ELMAH code might need to be tweaked?
I assume by "SQL Log" you mean MSSQL Server... If so, probably the easiest way of accomplishing what you want would be to edit the stored procedures created in the SQL Server database that holds your errors.
To get the error list, the ELMAH DLL calls the ELMAH_GetErrorsXML proc with the application name as a parameter, and the proc then filters the result with a WHERE [Application] = @Application clause.
Just remove the WHERE clause from the ELMAH_GetErrorsXML proc, and all errors should be returned regardless of application.
To get a single error record properly, you'll have to do the same with the ELMAH_GetErrorXML proc, as it also filters by application.
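For illustration, the change inside ELMAH_GetErrorsXML would look roughly like this. This is a sketch only — the real procedure body is longer (paging, XML shaping), so edit your own copy of the proc rather than pasting this:

```sql
-- Inside ELMAH_GetErrorsXML, the result set is filtered per application:
--     WHERE [Application] = @Application
-- Commenting out that clause returns every application's errors:
SELECT [ErrorId], [Application], [Message], [TimeUtc]
FROM [ELMAH_Error]
-- WHERE [Application] = @Application   -- removed: show all applications
ORDER BY [TimeUtc] DESC
```

The same edit applies to ELMAH_GetErrorXML for single-record retrieval.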
This, of course, will affect any application retrieving errors out of this particular database, but I assume in your case you'll only ever have the one, so this should be good.
CAVEAT: I have not tried this, so I can't guarantee the results...
It's not a problem to override the default Elmah handler factory so that it will filter Elmah logs by applications. I wrote a sample app that shows how to do it with MySql: http://diagnettoolkit.codeplex.com/releases/view/103931. You may as well check a post on my blog where I explain how it works.
Yes, it works easily. However, you can't see the app name in Elmah/Default.aspx. I haven't found whether that is configurable - it would just mean displaying one more column.