Converting MySQL queries to HQL

We just migrated from MySQL to Oracle 11g, and I have to change some queries since I had coded them specifically for MySQL. We are planning to use HQL for our queries. Is there a particular approach to doing this conversion? In many places I have used LIMIT and other MySQL-specific keywords. (We are using Grails/Groovy for development, so it's a domain-class-based model.) Is there a specific tool which can do the conversion?

I don't know of any particular tool that would automatically do this for you. However, HQL can easily accommodate your MySQL queries.
For complex queries, the best option is executeQuery, i.e. plain HQL queries.
To make queries more readable, use createCriteria.
I tend to use both in my Grails applications, but whenever possible I use createCriteria for readability.
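As a sketch of the two approaches, assuming a hypothetical Book domain class (the class and property names are illustrative, not from the question): note that MySQL's LIMIT has no HQL keyword; in Grails you pass max/offset as pagination parameters instead.

```groovy
// MySQL: SELECT * FROM book WHERE author = ? ORDER BY title LIMIT 10 OFFSET 20

// HQL via executeQuery -- LIMIT becomes the max/offset pagination map
def books = Book.executeQuery(
        "from Book b where b.author = :author order by b.title",
        [author: 'Tolkien'],
        [max: 10, offset: 20])

// The same query via createCriteria -- often more readable for dynamic filters
def sameBooks = Book.createCriteria().list(max: 10, offset: 20) {
    eq('author', 'Tolkien')
    order('title', 'asc')
}
```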

Related

Efficient and fastest way to fetch huge 50,000+ rows in Zf2

I need to fetch a huge amount of data in my code. I am using ZF2, and currently I am using Doctrine 2 to deal with my database. But I just found that ORM queries take much more time than plain MySQL queries. Could you please suggest the best way to fetch huge amounts of data from the database in ZF2?
There are a few things to consider here:
By default, Doctrine hydrates the database records into PHP objects (entities): it fills the entities with the data from your query. This is done by "smart guessing", so to speak, which is relatively slow. Hydrate to arrays instead, and you get a much faster response. Read more about hydration in the manual.
By default, Doctrine does not use any caching when parsing your mappings or when transforming DQL into SQL. You can use caching to speed things up. Doctrine is also faster dealing with read-only entities than with read/write ones. Read more about Doctrine performance issues in the manual.
Rendering 50,000 rows in an HTML table (or whatever) is a pain for your render engine (Zend\View / PHP) and for the DOM in the browser. Even if the query is optimized and loads fairly quickly, rendering all these results into a view and displaying them in a browser will still be really slow.
Using pagination decreases the size of your dataset, speeding up the query and the rendering at the same time.
Using an ORM to help you with mapping, database abstraction and so on is really nice, but it comes with a trade-off: performance. Using a library like Doctrine inherently increases execution time: you will never reach the performance of the plain PHP MySQL functions with Doctrine.
So:
Try to decrease the number of results per query, to speed up the database query, the hydration and the rendering
Try tuning Doctrine's performance by using array hydration, caching and read-only entities
If you need the fastest way possible, don't use Doctrine: try out Zend\Db or write the SQL yourself
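A minimal sketch of the tuning options above, assuming a Doctrine 2 EntityManager in `$em` and an illustrative `AppEntity\Row` entity (both names are assumptions, not from the question):

```php
<?php
use Doctrine\ORM\Query;

// Build the query with pagination to keep the result set small
$query = $em->createQuery('SELECT r FROM AppEntity\Row r')
            ->setFirstResult(0)      // offset
            ->setMaxResults(1000);   // page size instead of 50,000 rows at once

// Cache the parsed DQL -> SQL translation across requests
$query->useQueryCache(true);

// Array hydration: skips building entity objects entirely
$rows = $query->getResult(Query::HYDRATE_ARRAY);
// equivalent shorthand: $rows = $query->getArrayResult();
```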

CodeIgniter MySQL views vs query

I am developing with CodeIgniter, and when it gets to complicated database queries
I am using
$this->db->query('my complicated query');
and then casting to an array of objects using $query->result();
and so far it has been very good and useful.
Now my question is:
if I want to create a MySQL view and select from it, will
$this->db->from('mysql_view')
treat the MySQL view as if it were a table or not?
And if I do that, will there be any difference in performance? Are views faster than a normal database query?
What would be best practice with CodeIgniter and a MySQL database when dealing with complicated queries? As I understand it, ActiveRecord is just a query builder, and according to some tests it's even a little slower.
Thanks in advance for your advice.
MySQL views are queried the same way as tables. On a side note, a table and a view cannot share the same name.
It depends on the query you use in the view; views can be cached internally, so in the long run, yes, they are faster.
Best practice in this case is to use whatever you find easy to use for yourself and your team. I personally stick to $this->db->query(), as I find it easier to extend a simple query of this kind with advanced functionality like sub-queries or other things that are hard and/or impossible to do with the CI query builder. My advice would be to stick to one style of queries: if you use ->query(), use it everywhere; if you use the query builder, use it wherever it is possible to achieve the result with it.
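As a sketch (the view and column names are illustrative, not from the question): the view is created once in MySQL and then queried from CodeIgniter exactly like a table.

```php
<?php
// One-time setup (could also be run directly in MySQL):
$this->db->query("
    CREATE VIEW order_totals AS
    SELECT c.id, c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id, c.name
");

// The view is then queried like any table:
$query = $this->db->from('order_totals')
                  ->where('total >', 100)
                  ->get();
$rows = $query->result();   // array of objects, same as with a real table
```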

MySQL: Data query from multiple tables with views

I want to create a query result page for a simple search, and I don't know whether I should use views in my DB, or whether it would be better to write the query in my code with the same syntax I would use to create the view.
What is the better solution for merging 7 tables when I want to build a search module for my site, which has lots of users and page loads?
(I'm searching in several tables at the same time.)
You would be better off using a plain query with joins instead of a view. Views in MySQL are not optimized. Be sure to have your tables properly indexed on the fields used in the joins.
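A sketch of that advice with illustrative table names (your schema will differ): index the join columns, then join directly rather than going through a view.

```sql
-- Index the columns used in the joins and in the search filter
CREATE INDEX idx_posts_user_id ON posts (user_id);
CREATE INDEX idx_comments_post_id ON comments (post_id);

-- A plain join query instead of SELECT ... FROM some_view
SELECT u.name, p.title, c.body
FROM users u
JOIN posts p    ON p.user_id = u.id
JOIN comments c ON c.post_id = p.id
WHERE p.title LIKE '%search term%';
```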
If you always use all 7 tables, I think you should use views. Be aware that MySQL rewrites your original query when creating the view, so it is always good practice to save your query elsewhere.
Also, remember that you can tweak MySQL's query cache variables so that it stores more data, making your queries respond faster. However, I would suggest using some other method for caching, like memcached. The paid version of MySQL supports memcached natively, but I'm sure you can implement it in the application layer no problem.
Good luck!

Ruby on Rails ActiveRecord-generated SQL on Postgres

Why does Ruby on Rails generate more queries in the background on Postgres than on MySQL? I haven't tried deploying Rails to production with Postgres yet, but I am afraid these generated queries will affect performance. Do you find Rails with Postgres slower than with MySQL, knowing that it produces more queries in the background? Or is it relatively the same?
I don't think that Postgres will hurt your performance, even though it uses more resources, since it is a full RDBMS implementation. In the long term, Postgres will be the best option if you are looking for performance.
The number of SQL queries is relative: MySQL and Postgres each have their own implementations, so they may need different queries to produce the same results. Just to give an example, MySQL does not implement a full procedural SQL language.
So, the bottom line: the number of queries is irrelevant, and Postgres will help you sleep better at night :-)
A good reference on this subject: http://www.amazon.com/Enterprise-Rails-Dan-Chak/dp/0596515200

MySQL to PostgreSQL

We are trying to switch our application from talking to MySQL to Postgres. I wish we had some kind of subroutine (in VB6) where I could input the MySQL query and get out a string containing the equivalent Postgres query.
Does anybody have a solution or is interested working with us on that?
I would consult the "Comparison of Different SQL Implementations"; it's a most useful reference when converting queries from one RDBMS to another. WikiBooks also has a page entitled "Converting MySQL to PostgreSQL," which has a short list of the big differences between the two.
I don't know of any (free/open source) utility to translate queries, but unless you have really big, complicated queries, you shouldn't have much difficulty translating them (and, if you do have big, complicated queries, an automated tool probably won't help).
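For a sense of what a manual translation involves, here are a few common MySQL-to-Postgres substitutions (an illustrative, far-from-complete list):

```sql
-- Identifier quoting: MySQL backticks become double quotes
SELECT `name` FROM `users`;              -- MySQL
SELECT "name" FROM "users";              -- Postgres

-- LIMIT with offset: MySQL's two-argument form is not valid in Postgres
SELECT * FROM users LIMIT 20, 10;        -- MySQL (skip 20 rows, return 10)
SELECT * FROM users LIMIT 10 OFFSET 20;  -- Postgres (MySQL accepts this too)

-- Auto-increment keys
CREATE TABLE t (id INT AUTO_INCREMENT PRIMARY KEY);  -- MySQL
CREATE TABLE t (id SERIAL PRIMARY KEY);              -- Postgres
```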
I don't believe that there's any 'silver bullet' tool that would convert all of your queries from being MySQL to Postgres compatible.
What you do need is:
a reference for the differences between the two RDBMSs (see James McNellis's answer)
a good test plan for your application that will put it through its paces to ensure that your converted database backend works
a good reason to go through all this trouble: performance? a management edict? etc.