PoWA-like web profiler for MySQL

PoWA is a profiling tool for PostgreSQL that uses the pg_stat_statements module and provides a very complete set of data in a web interface.
I'm looking for an equivalent for MySQL, with a web interface (or a separate tool that provides one), to profile query performance and general database performance, but I can't find a good tool for MySQL.
Does anyone know of a good profiling tool for MySQL?

Related

MySQL metadata (usage and performance statistics)

I am working on a web application that is based on a MySQL database. I need to collect and analyse usage and performance statistics. The statistics will be aimed at non-technical personnel.
How can I implement this feature? You should treat my question as a programming question, but if you know of a tool or extension that would be suitable, please mention it.
The official MySQL client, MySQL Workbench, has included a feature to visualize the Performance Schema since version 6.1. It's in the Performance section of the application.
Read more at: http://mysqlworkbench.org/
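If you want the raw numbers behind what Workbench shows, the Performance Schema can also be queried directly. Here is a minimal sketch using the MySQL C API; the digest summary table assumes MySQL 5.6 or later, and the host, user and password are placeholders:

// Sketch: read statement digests from performance_schema with the MySQL C API.
// Connection parameters are placeholders; adjust for your server.
#include <mysql/mysql.h>
#include <cstdio>

int main() {
    MYSQL *conn = mysql_init(nullptr);
    if (!mysql_real_connect(conn, "localhost", "monitor", "secret",
                            "performance_schema", 3306, nullptr, 0)) {
        std::fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
        return 1;
    }

    // Top 10 statement digests by total wait time (table exists since MySQL 5.6).
    const char *sql =
        "SELECT DIGEST_TEXT, COUNT_STAR, SUM_TIMER_WAIT "
        "FROM events_statements_summary_by_digest "
        "ORDER BY SUM_TIMER_WAIT DESC LIMIT 10";

    if (mysql_query(conn, sql) != 0) {
        std::fprintf(stderr, "query failed: %s\n", mysql_error(conn));
        mysql_close(conn);
        return 1;
    }

    MYSQL_RES *res = mysql_store_result(conn);
    MYSQL_ROW row;
    while ((row = mysql_fetch_row(res)) != nullptr) {
        std::printf("%-60.60s  calls=%s  total_wait=%s\n",
                    row[0] ? row[0] : "NULL", row[1], row[2]);
    }
    mysql_free_result(res);
    mysql_close(conn);
    return 0;
}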

Is Neo4j better than MySQL?

I am working on a network project and am already using MySQL as my backend. The project is coded in C++. When my data becomes pretty large, it takes a lot of time to retrieve data from MySQL, so I have been exploring other databases and came across Neo4j. After reading a lot about Neo4j on the internet, I have a few questions. The core requirements of my project are high performance and availability, which I am not getting once my database becomes huge.
My questions:
I am a little hesitant to use Neo4j, since I have read in places on the internet that it does not perform better than MySQL. Is that true?
There are no C++ Neo4j drivers, so it can be accessed only via REST APIs. Will that make my project even slower, since every call will now be an HTTP request and response?
Can we run Neo4j on Solaris? The server for the project will be Solaris.
Disclaimer: My answer might be biased, since I work for Neo Technology. Nevertheless, I will stay as objective as possible.
Regarding your questions:
It totally depends on your use case whether a graph database or a relational database performs better. A graph database excels when you run local queries (e.g. "who are the friends of my friends?"). By a local query I mean one where you start at one or more bound nodes and then traverse through the graph. For global queries (e.g. "what is the average age of people in the db?") a graph database can perform at the same level as a relational database, but it will not be significantly faster. However, if your global queries need to do a lot of traversals, the benefit of a graph database will also be significant.
No, using your language's HTTP capabilities will not be slower compared to using a driver. Most of the drivers add some convenience layer(s) for creating the request and parsing the response, and maybe some caching.
Neo4j, being a JVM-based database, can run on any platform with a Java 7 capable JVM. However, Neo Technology's support offering currently covers Linux, Windows and HP-UX. If you need commercial-grade support for Solaris, please get in touch with me or my colleagues directly.
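Regarding point 2, here is a minimal sketch of sending a Cypher statement from C++ with libcurl; the endpoint path assumes the Neo4j 2.x transactional REST API, and the host, port and query are placeholders:

// Sketch: POST a Cypher statement to Neo4j's transactional endpoint (Neo4j 2.x).
#include <curl/curl.h>
#include <cstdio>
#include <string>

// Collect the HTTP response body into a std::string.
static size_t collect(char *data, size_t size, size_t nmemb, void *out) {
    static_cast<std::string *>(out)->append(data, size * nmemb);
    return size * nmemb;
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    // "Friends of friends" style local query, sent as JSON.
    const char *payload =
        "{\"statements\":[{\"statement\":"
        "\"MATCH (me:Person {name:'Alice'})-[:KNOWS]->()-[:KNOWS]->(fof) "
        "RETURN DISTINCT fof.name\"}]}";

    struct curl_slist *headers = nullptr;
    headers = curl_slist_append(headers, "Content-Type: application/json");
    headers = curl_slist_append(headers, "Accept: application/json");

    std::string response;
    curl_easy_setopt(curl, CURLOPT_URL,
                     "http://localhost:7474/db/data/transaction/commit");
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, payload);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, collect);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &response);

    CURLcode rc = curl_easy_perform(curl);
    if (rc != CURLE_OK)
        std::fprintf(stderr, "request failed: %s\n", curl_easy_strerror(rc));
    else
        std::printf("%s\n", response.c_str());

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}

The per-request cost is dominated by network latency, so batching several statements into one request (the statements array accepts more than one) usually matters more than whether you use a driver or raw HTTP.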

What is the best way to use a web database with Delphi?

Hi all.
I'm using DBExpress with C++ Builder (Delphi) 2007 and MySQL, Firebird, ...
I'd like to build a Win32 application that uses a database located on my web server.
I tried using DBExpress (TSQLConnection for MySQL), but it's very slow,
and I tried a local database with upload/download via Indy,
but that was not good and a little complicated.
So what is the best way to use a web-based database from a Win32 application?
Do you have any experience? Any document or comment would be very much appreciated.
Thanks a lot.
Database connections over an Internet link (with or without a VPN) are slow - you are perfectly right. The main reason, IMHO, is the "ping" delay of every request, which is very low on a local network and much higher over the Internet. So a direct connection is not a good idea.
In the latest versions of Delphi, you have the DataSnap components, which are the new "standard" (or Embarcadero-recommended) way of doing remote access (including web access). Even if it was a bit limited at first, the latest versions are perfectly usable and are becoming a key product for cross-platform application building with Delphi. But it is not available for Delphi 2007.
One much more mature product (and available for Delphi 2007) is Data Abstract:
Data Abstract is a framework for building database-driven applications
using the multi-tier data access model, for a variety of platforms.
Of course, this is not free, but this is a proven and efficient solution.
You may also take a look at our Client-Server ORM, which can connect to any DB and is able to implement a RESTful SOA architecture with Delphi 2007, even without using the ORM part - that is, you can keep your existing DBExpress-based source code and easily expose some web interfaces to the data. It is Open Source and uses JSON as the communication format over a secured authentication mechanism. There is a lot of documentation included (more than 700 pages of PDF), which also serves as an introduction to the SOA world.
Take a look at Datasnap: info
You need a data access library that offers the following features:
Thread safety. In general, you will need to use a dedicated connection for each thread.
Connection pooling. To make connection creation (which is needed for (1)) fast, there must be a connection pool (a sketch follows at the end of this answer).
Fast SQL command execution, result set opening and fetching capabilities.
Tracing. With any library you may run into performance issues. You need a tool to see what is going wrong. For that you will need to see and analyze the client and server communication.
Result set caching and the ability to read it simultaneously from different threads. You may have a few read-only tables, which you will fetch once and cache in your application. But you will need a mechanism to read this data from multiple threads - a kind of in-memory table cloning.
My answer is biased, but you may consider AnyDAC. It has all of these and many other features.
PS: dbExpress should work too. Try to find the reason for your performance issue first, rather than switching to a different library, because the same may happen with another library ...
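To illustrate points (1) and (2), here is a minimal, library-agnostic sketch of a thread-safe connection pool in C++; the Connection type is a placeholder for whatever your data access layer provides:

// Sketch of a thread-safe connection pool. "Connection" is a placeholder
// for whatever connection object your data access library provides.
#include <condition_variable>
#include <cstddef>
#include <functional>
#include <memory>
#include <mutex>
#include <queue>

struct Connection { /* wraps a real database connection */ };

class ConnectionPool {
public:
    ConnectionPool(std::size_t size,
                   std::function<std::unique_ptr<Connection>()> factory) {
        for (std::size_t i = 0; i < size; ++i)
            idle_.push(factory());
    }

    // Borrow a connection; blocks until one is available.
    std::unique_ptr<Connection> acquire() {
        std::unique_lock<std::mutex> lock(mutex_);
        available_.wait(lock, [this] { return !idle_.empty(); });
        auto conn = std::move(idle_.front());
        idle_.pop();
        return conn;
    }

    // Return a connection so other threads can reuse it.
    void release(std::unique_ptr<Connection> conn) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            idle_.push(std::move(conn));
        }
        available_.notify_one();
    }

private:
    std::mutex mutex_;
    std::condition_variable available_;
    std::queue<std::unique_ptr<Connection>> idle_;
};

Each worker thread calls acquire() before running its queries and release() when it is done, which gives every thread a dedicated connection without paying the connection-creation cost each time.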
DB applications over a slow link need a different approach than those using a fast link. You have to be careful about how much data you move around, and about how many round trips your application performs.
Usually an approach where the needed subset is cached on the client, modified, and then applied to the database is preferable (provided, of course, that changes do not need to be seen immediately and the chances of conflicts are low).
No middleware will help you much if the application is not designed with a slow link in mind.
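A rough sketch of that cache-modify-apply idea, with purely illustrative types:

// Sketch: cache a subset locally, modify it offline, then push only the
// changed rows back in one round trip. Types and fields are illustrative.
#include <string>
#include <utility>
#include <vector>

struct Row {
    long id;
    std::string value;
    bool dirty = false;   // marks rows changed while offline
};

class CachedSubset {
public:
    explicit CachedSubset(std::vector<Row> rows) : rows_(std::move(rows)) {}

    void update(long id, const std::string &value) {
        for (Row &r : rows_)
            if (r.id == id) { r.value = value; r.dirty = true; }
    }

    // Collect only the dirty rows so one batched request goes over the slow link.
    std::vector<Row> pendingChanges() const {
        std::vector<Row> out;
        for (const Row &r : rows_)
            if (r.dirty) out.push_back(r);
        return out;
    }

private:
    std::vector<Row> rows_;
};

Only the dirty rows travel back over the slow link, ideally in a single batched request.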

Alternatives to MySQL that do not have a GPL licence

We have built a new data fusion C++ algorithm which uses SQLite as an internal database.
However, we would like each of the multiple C++ threads to do parallel db writes, and SQLite cannot do that.
So we are now looking at MySQL, which allows each of the multiple C++ threads to do parallel db writes.
However, the MySQL non-GPL licence is too costly, and we don't want to rely on Oracle for MySQL support, since our data fusion C++ algorithm will soon have a US patent.
Are there any alternatives to MySQL that allow each of the multiple C++ threads to do parallel relational database writes and that do not have a costly licensing policy like Oracle's MySQL?
So far, I am starting to look at PostgreSQL's BSD licence and Sybase's open source relational database.
Could someone tell us whether PostgreSQL or Sybase is the right direction to go in?
PostgreSQL is definitely a very good alternative to MySQL.
In my opinion, PostgreSQL is actually the better choice anyway, looking at all the things that MySQL doesn't get right and the number of SQL features it still doesn't have.
But again, that's my personal opinion.
In terms of licensing, the PostgreSQL licence is indeed more flexible for commercial usage than the GPL.
The support from the PostgreSQL community on the mailing list is outstanding - I don't know if there is something comparable in the Sybase world (actually I didn't know that Sybase is now OpenSource).
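To give a feel for the C++ side of a PostgreSQL port, here is a minimal sketch of parallel writes using libpq with one connection per thread; the connection string and table are placeholders:

// Sketch: parallel inserts into PostgreSQL, one libpq connection per thread.
// The connection string and table name are placeholders.
#include <libpq-fe.h>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

static void writer(int thread_id) {
    PGconn *conn = PQconnectdb("host=localhost dbname=fusion user=app password=secret");
    if (PQstatus(conn) != CONNECTION_OK) {
        std::fprintf(stderr, "connect failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return;
    }
    for (int i = 0; i < 1000; ++i) {
        std::string sql = "INSERT INTO measurements(thread_id, seq) VALUES (" +
                          std::to_string(thread_id) + ", " + std::to_string(i) + ")";
        PGresult *res = PQexec(conn, sql.c_str());
        if (PQresultStatus(res) != PGRES_COMMAND_OK)
            std::fprintf(stderr, "insert failed: %s", PQerrorMessage(conn));
        PQclear(res);
    }
    PQfinish(conn);
}

int main() {
    std::vector<std::thread> threads;
    for (int t = 0; t < 4; ++t)
        threads.emplace_back(writer, t);   // each thread writes through its own connection
    for (auto &th : threads)
        th.join();
    return 0;
}

In real code you would use parameterized queries (PQexecParams) instead of string concatenation, but the key point is that each thread owns its own connection and the server handles the concurrent writes.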
There should be quite a few options. If you're not worried about being cross-platform, you could try SQL Server Express. You can use this in production subject to some limitations (I think the limits relate to the type of hardware you can install it on). There is also an Express edition of Oracle with similar usage constraints.
In the open source world, there is Firebird, which I believe you should be able to use in embedded mode (that is, without having to install a separate network server process). I haven't used this in production, but it has been around for many years and, looking through SO, it seems to be well regarded. It uses the MPL, so there should be no licensing risks.
For completeness, you could consider MaxDB from SAP and the Ingres Database System. MaxDB seems to be a very capable DBMS, but when I tried it years ago (version 7.6) it seemed to be extraordinarily difficult to work with. I've never worked with (or heard of anyone working with) Ingres, but apparently it's open source and can be freely used.
Like "a_horse_with_no_name", I'm not aware of there being an open source edition of Sybase although I might have just missed it.
Phil

Database choices

I have a prickly design issue regarding the choice of database technologies to use for a group of new applications. The final suite of applications would have the following database requirements...
Central databases (more than one database) using MySQL (it must be MySQL because of justhost.com).
An application to be written which accesses the multiple MySQL databases on the web host. This application will also write to a local serverless database (SQLite/Firebird/VistaDB/whatever).
Different flavors of this application will be created for Windows (.NET), Windows Mobile, Android if possible, and iPhone if possible.
So, the design task is to minimise the quantity of code needed to achieve this. This is going to be tricky, since the languages used are already C# / Java (Android) and Objective-C (iPhone). I'm not too worried about that, but can the work required to implement the various database access layers be minimised?
The serverless database will hold similar data to the MySQL server, so some kind of inheritance in the DAL would be useful.
I'm looking at Hibernate/NHibernate, and there is LINQ to whatever. So many choices!
Get a better host. Seriously - SQL Server hosts don't cost that much more. An hour of development time per month, possibly - and that is already non-conservative.
Otherwise - throw out stuff you do not need. Reduce the languages to one. If internet access is involved, check out OData for exposing data - a nice, language-independent protocol.
The rest is architecture. And LINQ (to SQL) sucks - compared to NHibernate ;)
but can the database access layer be reused?
Yes, it can be, but you have to carefully create a loosely coupled data layer with no dependencies on other parts.
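As a rough illustration of what loose coupling can look like (the names here are hypothetical, not from any particular framework), the application code depends only on an abstract interface and each platform wires in a concrete backend:

// Sketch: a loosely coupled data layer. Application code depends only on the
// abstract interface; each platform supplies a concrete backend.
#include <string>
#include <vector>

struct Customer {
    long id;
    std::string name;
};

// The only type the rest of the application is allowed to reference.
class CustomerRepository {
public:
    virtual ~CustomerRepository() = default;
    virtual std::vector<Customer> findAll() = 0;
    virtual void save(const Customer &c) = 0;
};

// Backend for the central MySQL databases on the web host (stubbed here).
class MySqlCustomerRepository : public CustomerRepository {
public:
    std::vector<Customer> findAll() override { /* run SELECT over the wire */ return {}; }
    void save(const Customer &c) override { /* INSERT/UPDATE over the wire */ (void)c; }
};

// Backend for the local serverless database, e.g. SQLite (stubbed here).
class SqliteCustomerRepository : public CustomerRepository {
public:
    std::vector<Customer> findAll() override { /* query the local file */ return {}; }
    void save(const Customer &c) override { /* write to the local file */ (void)c; }
};

// Application code receives whichever backend the platform wires in.
void syncNames(CustomerRepository &repo) {
    for (const Customer &c : repo.findAll())
        repo.save(c);
}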