CodeIgniter MySQL views vs query

I am developing with CodeIgniter, and when it comes to complicated database queries I use
$this->db->query('my complicated query');
and then cast the result to an array of objects using $query->result();
So far this has been very good and useful.
Now my question is: if I create a MySQL view and select from it, will
$this->db->from('mysql_view')
treat the view as if it were a table?
And if I do that, will there be any difference in performance? Are views faster than normal queries?
Also, what is best practice with CodeIgniter and a MySQL database when dealing with complicated queries? As I understand it, Active Record is just a query builder, and according to some tests it is even a little slower.
Thanks in advance for your advice.

MySQL views are queried the same way as tables; as a side note, you can't have a table and a view share the same name.
Whether a view is faster depends on the query it is defined on; since the results can end up cached, in the long run it may well be faster.
Best practice in this case is to use whatever you and your team find easy to work with. I personally stick to $this->db->query(); because I find it easier to extend a simple query of this kind with advanced functionality such as sub-queries or other things that are hard or impossible to do with the CI query builder. My advice is to stick to one style of query: if you use ->query(), use it everywhere; if you use the query builder, use it wherever the result can be achieved with it.
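For illustration, here is a minimal sketch of both styles against a view inside a CodeIgniter model; the view, model and column names below are hypothetical, and the CREATE VIEW statement would normally be run once in MySQL (or in a migration):

<?php
// Hypothetical model; the view is created once in MySQL, e.g.:
//   CREATE VIEW order_totals_view AS
//     SELECT c.id, c.name, SUM(o.amount) AS total
//     FROM customers c JOIN orders o ON o.customer_id = c.id
//     GROUP BY c.id, c.name;
class Report_model extends CI_Model {

    // Query builder: the view is referenced exactly like a table.
    public function totals_via_builder()
    {
        return $this->db
            ->select('id, name, total')
            ->from('order_totals_view')
            ->order_by('total', 'DESC')
            ->get()
            ->result();   // array of objects, same as with a real table
    }

    // Equivalent raw query through $this->db->query().
    public function totals_via_query()
    {
        $sql = 'SELECT id, name, total FROM order_totals_view ORDER BY total DESC';
        return $this->db->query($sql)->result();
    }
}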

Related

improving performance of mysql stored function

I have a stored function I ported from SQL Server that is super slow for MySQL. Since my application needs to support both SQL Server and MySQL (via ODBC drivers), I sort of require this function. The stored function is normally used in the "where" clause, but it is slow even when it is only in the "select" clause.
Are there any tricks for improving the performance of functions? Maybe any helpful tools that point out potential stored function problems or slow points?
They are complex functions. So while one statement in the function is relatively quick, everything put together is slow.
Also, views are used inside these functions. Don't know how much that impacts things. The views themselves seem to run in a reasonable time.
I know I am not giving much specifics, but I am more looking for a performance tool or some high level stored function performance tips.
I did see various posts online advising people not to use functions in "where" clauses, but I am sort of stuck, since I want to go back and forth between SQL Server and MySQL with the same executable, and the functions are too complex to embed directly into the SQL in my application.
Edit, based on Gruber's answer, to clarify things with an example:
Here is an example query:
SELECT count(*) FROM inv USE INDEX(inv_invsku) WHERE invsku >= 'PBM116-01' AND WHMSLayer_IS_LOC_ACCESSIBLE( '', 'PFCSYS ', invloc ) > 0 ORDER BY invsku;
If I get rid of the call to IS_LOC_ACCESSIBLE, it is considerably faster. IS_LOC_ACCESSIBLE is just one of 3 such functions. It has a bunch of IF statements, queries of other tables and views, etc. That is why I call it "complex", because of all that extra logic and conditional paths.
You can try using a query profiler, for instance the one included in dbForge Studio, and try to figure out exactly what the bottlenecks are.
Maybe your tables aren't properly indexed?
You can sometimes achieve good improvements in MySQL (and SQL Server) by moving things around in your queries, creating identical output but changing the internal execution path. For instance, try taking the function call out of the complex WHERE clause: expose its result as a column of an inner SELECT, then apply the WHERE condition to that column in an outer query.
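A rough sketch of that restructuring, assuming a PDO connection with hypothetical credentials and reusing the example query from the question; whether it actually helps depends on how MySQL materializes the derived table, so time both forms:

<?php
// Compare the original query with a restructured version that moves the
// stored function call out of the WHERE clause. Connection details are
// placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=warehouse', 'user', 'pass');

// Original form: the stored function is evaluated inside the WHERE clause.
$original = "
    SELECT COUNT(*) FROM inv USE INDEX(inv_invsku)
    WHERE invsku >= 'PBM116-01'
      AND WHMSLayer_IS_LOC_ACCESSIBLE('', 'PFCSYS ', invloc) > 0";

// Restructured form: the function becomes a column of an inner SELECT and the
// filter moves to the outer query. (The ORDER BY is dropped; it has no effect
// on a COUNT(*).)
$restructured = "
    SELECT COUNT(*) FROM (
        SELECT invsku,
               WHMSLayer_IS_LOC_ACCESSIBLE('', 'PFCSYS ', invloc) AS accessible
        FROM inv USE INDEX(inv_invsku)
        WHERE invsku >= 'PBM116-01'
    ) AS t
    WHERE t.accessible > 0";

foreach (array('original' => $original, 'restructured' => $restructured) as $label => $sql) {
    $start = microtime(true);
    $count = $pdo->query($sql)->fetchColumn();
    printf("%s: count=%d, %.3f s\n", $label, $count, microtime(true) - $start);
}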

Efficient and fastest way to fetch huge 50,000+ rows in Zf2

I need to fetch a huge amount of data in my code. I am using ZF2 and currently use Doctrine 2 to deal with my database, but I just found that ORM queries take much more time than plain MySQL queries. Could you please suggest the best way to fetch huge amounts of data from the database in ZF2?
There are a few things to consider here:
By default, Doctrine will hydrate the database records into PHP objects (entities): it fills the entities with the data from your query. This is done by "smart guessing", so to speak, which is relatively slow. Hydrate to arrays instead and you get a much faster response. Read more about hydration in the manual.
By default, Doctrine does not use any caching to parse your mappings or when transforming the DQL into SQL. You can use caching to speed things up. Also, Doctrine is faster dealing with read-only entities than making them read/write. Read more about Doctrine performance issues in the manual.
Rendering 50,000 rows in an HTML table (or whatever) is a pain for your render engine (Zend\View / PHP) and for the DOM in the browser. Even if the query is optimized and loads fairly quickly, rendering all these results into a view and displaying them in a browser will still be really slow.
Using pagination will decrease the size of your dataset, speeding up the query and the rendering at the same time.
Using an ORM to help you with mapping, database abstraction and so on is really nice, but it comes with a trade-off: performance. Using a library like Doctrine inherently increases execution time: you will never reach the performance of the plain PHP MySQL functions with Doctrine.
So:
Try to decrease the number of results from one query, to speed up the database query, hydration and rendering.
Try tuning Doctrine's performance by using array hydration, caching and read-only entities (see the sketch below).
If you need the fastest way possible, don't use Doctrine: try Zend\Db or write all the SQL yourself.
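As a rough illustration of the first two points, here is a minimal sketch that assumes a Doctrine 2 EntityManager in $em and a hypothetical AppModule\Entity\Order entity; it combines offset/limit pagination with array hydration for a read-only listing:

<?php
use Doctrine\ORM\Query;

$page     = 1;
$pageSize = 500;

$query = $em->createQuery('SELECT o FROM AppModule\Entity\Order o ORDER BY o.id')
    ->setFirstResult(($page - 1) * $pageSize)  // offset
    ->setMaxResults($pageSize);                // limit

// HYDRATE_ARRAY skips building full entity objects and returns plain arrays,
// which is noticeably faster than object hydration for large result sets.
$rows = $query->getResult(Query::HYDRATE_ARRAY);

foreach ($rows as $row) {
    // render / export $row ...
}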

Converting MYSQL queries to HQL

We just migrated from MySQL to Oracle 11g, and I have to change some queries since I had coded them for MySQL specifically. So we are planning to use HQL for our queries. Is there any particular approach to doing this conversion? In many places I have used LIMIT and other MySQL-specific keywords. (We are using Grails/Groovy for development, so it's a domain-class-based model.) Is there any specific tool which can do the conversion?
I don't know of any particular tool that would do this for you automatically. However, HQL can easily accommodate your MySQL queries.
I think for complex queries the best option is to use executeQuery, i.e. plain HQL queries.
To make queries more readable, use createCriteria.
I tend to use both in my Grails applications, but whenever possible I use createCriteria to keep things readable.

MySQL: Data query from multiple tables with views

I want to create a query result page for a simple search, and I don't know whether I should use views in my DB, or whether it would be better to write the query in my code with the same syntax I would use to create the view.
What is the better solution for merging 7 tables when I want to build a search module for a site with lots of users and page loads?
(I'm searching in several tables at the same time.)
You would be better off using a plain query with joins instead of a view; views in MySQL are not particularly well optimized. Be sure to have your tables properly indexed on the fields used in the joins.
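A minimal sketch of that plain-joins approach; the table and column names (posts, users, post_tags, tags) are hypothetical stand-ins for the 7 tables, and the comments mark the columns that should be indexed for the joins:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=site;charset=utf8', 'user', 'pass');

$search = isset($_GET['q']) ? $_GET['q'] : '';

$sql = "
    SELECT p.id, p.title, u.username, t.name AS tag
    FROM posts p
    JOIN users u      ON u.id = p.user_id      -- index on posts.user_id
    JOIN post_tags pt ON pt.post_id = p.id     -- index on post_tags.post_id
    JOIN tags t       ON t.id = pt.tag_id      -- primary key lookup
    WHERE p.title LIKE :term OR t.name LIKE :term2
    ORDER BY p.created_at DESC
    LIMIT 50";

$stmt = $pdo->prepare($sql);
$stmt->execute(array(':term' => "%$search%", ':term2' => "%$search%"));
$results = $stmt->fetchAll(PDO::FETCH_ASSOC);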
If you always use all 7 tables, I think you should use views. Be aware that MySQL rewrites your original query when creating the view, so it's always good practice to save the query somewhere else.
Also, remember you can tweak MySQL's query cache variables so that it stores more data, making your queries respond faster. However, I would suggest using some other method of caching, such as memcached. The paid version of MySQL supports memcached natively, but I'm sure you can implement it in the application layer without a problem.
Good luck!

Is there some kind of "strict performance mode" for MySQL?

I'd like to setup one instance of MySQL to flat-out reject certain types of queries. For instance, any JOINs not using an index should just fail and die and show up on the application stack trace, instead of running slow and showing up on the slow_query_log with no easy way to tie it back to the actual test case that caused it.
Also, I'd like to disallow "*" (as in "SELECT * FROM ...") and have that throw essentially a syntax error. Anything which is questionable or dangerous from a MySQL performance perspective should just cause an error.
Is this possible? Other than hacking up MySQL internals... is there an easy way?
If you really want to control what users/programmers do via SQL, you have to put a layer between MySQL and your code that restricts access, like an ORM that only allows for certain tables to be accessed, and only certain queries. You can then also check to make sure the tables have indexes, etc.
You won't be able to know for sure if a query uses an index or not though. That's decided by the query optimizer layer in the database and the logic can get quite complex.
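As a hedged sketch of that "layer between your code and MySQL" idea, here is a thin hypothetical wrapper that rejects SELECT * and refuses SELECTs whose EXPLAIN plan contains a large full table scan; EXPLAIN output varies by MySQL version, so treat it as a lint step rather than a guarantee:

<?php
class StrictPdo
{
    private $pdo;

    public function __construct(PDO $pdo)
    {
        $this->pdo = $pdo;
    }

    public function query($sql)
    {
        if (preg_match('/\bSELECT\s+\*/i', $sql)) {
            throw new RuntimeException('SELECT * is not allowed: ' . $sql);
        }

        // Only SELECTs can be EXPLAINed reliably on older MySQL versions.
        if (preg_match('/^\s*SELECT/i', $sql)) {
            foreach ($this->pdo->query('EXPLAIN ' . $sql, PDO::FETCH_ASSOC) as $row) {
                // type = ALL means a full table scan; the 1000-row cutoff is arbitrary.
                if ($row['type'] === 'ALL' && $row['rows'] > 1000) {
                    throw new RuntimeException('Full table scan detected: ' . $sql);
                }
            }
        }

        return $this->pdo->query($sql);
    }
}

// Usage: the exception surfaces in the application stack trace right away,
// instead of the query quietly ending up in the slow query log.
// $db = new StrictPdo(new PDO('mysql:host=localhost;dbname=test', 'user', 'pass'));
// $rows = $db->query('SELECT * FROM orders')->fetchAll();  // throws RuntimeException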
Impossible.
What you could do to make things work better is create views that you have optimized yourself and give users access only to those views. Then you can be reasonably sure the relevant SELECTs will use indexes.
But users can still destroy performance: one crazy JOIN across a few views and performance is gone.
As far as I'm aware there's nothing baked into MySQL that provides this functionality, but any answer of "Impossible", or similar, is incorrect. If you really want to do this then you could always download the source and add the functionality yourself; unfortunately, that would certainly class as "hacking up the MySQL internals".