Which database to store big metric data on a Raspberry Pi? [closed]

I want to use a Raspberry Pi as an independent sensor, which will measure some value each second and store this metric data in a local database. Then I would like to query the database by date range. Which database should I use, taking into account the limited resources of the RPi and that there will be roughly 31.5 million records per year (one per second)? Are there any RPi-specific lightweight database engines suited to this purpose?

I think SQLite will perform well in this role. You might need to tune the pragma settings a bit for the RPi (e.g. set journal_mode=WAL), but SQLite can easily handle multi-gigabyte databases. (SQLite's main weakness is concurrent access, but that won't be a problem for your application.)
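If it helps, here is a minimal sketch of that setup in Python; the table layout and pragma values are only illustrative.

    import sqlite3
    import time

    # Open (or create) the metrics database on local storage.
    conn = sqlite3.connect("metrics.db")

    # WAL journal mode plus a relaxed sync level cuts per-insert I/O,
    # which also helps limit SD-card wear on a Raspberry Pi.
    conn.execute("PRAGMA journal_mode=WAL")
    conn.execute("PRAGMA synchronous=NORMAL")

    # One row per second: an integer Unix timestamp as the primary key
    # gives an implicit index for date-range queries.
    conn.execute("""
        CREATE TABLE IF NOT EXISTS readings (
            ts    INTEGER PRIMARY KEY,   -- Unix timestamp (seconds)
            value REAL NOT NULL
        )
    """)

    def record(value):
        """Store one sensor reading against the current timestamp."""
        conn.execute("INSERT OR REPLACE INTO readings (ts, value) VALUES (?, ?)",
                     (int(time.time()), value))
        conn.commit()

    def query_range(start_ts, end_ts):
        """Return all readings whose timestamp falls in [start_ts, end_ts]."""
        cur = conn.execute(
            "SELECT ts, value FROM readings WHERE ts BETWEEN ? AND ? ORDER BY ts",
            (start_ts, end_ts))
        return cur.fetchall()

Committing once per second is what eventually wears the card; batching a minute of readings into one transaction is an easy mitigation.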
If you only need to store timestamp/value data, and only query on timestamp ranges, you might opt to use a key/value store such as LevelDB. You lose the flexibility of a SQL engine, but you gain performance.
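A rough sketch of that key/value route, assuming the plyvel LevelDB bindings are available (the key encoding is the part that matters; everything else is illustrative):

    import struct
    import time

    import plyvel  # third-party Python bindings for LevelDB

    db = plyvel.DB("metrics_ldb", create_if_missing=True)

    def key(ts):
        # Big-endian encoding keeps lexicographic key order identical to numeric
        # order, so LevelDB's sorted iteration doubles as a timestamp-range index.
        return struct.pack(">Q", ts)

    def record(value):
        db.put(key(int(time.time())), struct.pack(">d", value))

    def query_range(start_ts, end_ts):
        # plyvel iterators treat `stop` as exclusive, hence end_ts + 1.
        for k, v in db.iterator(start=key(start_ts), stop=key(end_ts + 1)):
            yield struct.unpack(">Q", k)[0], struct.unpack(">d", v)[0]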
What storage medium are you planning to use? An ACID database will write to the disk for every transaction. Continuous I/O like that can quickly kill an SD-card.

Related

What's the best Database solution for millions of records? [closed]

We use a MySQL database that contains tables with over 15,000,000 records, and we have noticed that queries are very slow (which is to be expected), so we are looking to optimize our services.
Bear in mind that we have indexed all the needed fields, and we use memcache to help.
Our server machine has the following configuration:
model name : Intel(R) Xeon(R) CPU E5630 @ 2.53GHz
So my question is: has anyone been in the same situation, and what is the best DBMS for my case?
I would be glad to receive your suggestions.
Thank you, all the best.
MySQL can certainly cope with 15M records. If there's a problem, it's not that you're using the wrong DBMS.
If you really have indexed everything you need to index, you could also check whether your machine has enough RAM in it. That's probably going to be more important than the CPU.
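A quick, hedged way to check both points from Python; the connection details, table, and column names below are made up for illustration.

    import mysql.connector  # MySQL Connector/Python

    conn = mysql.connector.connect(host="localhost", user="app",
                                   password="secret", database="mydb")
    cur = conn.cursor()

    # EXPLAIN shows whether a slow query really uses an index; a join type of
    # ALL (full table scan) means the index is missing or not usable.
    cur.execute("EXPLAIN SELECT * FROM orders WHERE customer_id = %s", (42,))
    for row in cur.fetchall():
        print(row)

    # The InnoDB buffer pool should be big enough to keep the hot working set
    # in RAM; on a dedicated database server it is often set to most of memory.
    cur.execute("SHOW VARIABLES LIKE 'innodb_buffer_pool_size'")
    print(cur.fetchone())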

Which Database is Best for Magento [closed]

Which database gives the best performance with Magento?
My Magento store has more than 80k products across 2k categories.
My questions are:
1) Will a MySQL database be enough to manage my product/category range?
2) Or should I use another database with Magento to keep the site loading fast?
Any suggestions are much appreciated.
You might also want to give Percona a call; other MySQL forks are available. I haven't used Percona, but they know a lot about tuning InnoDB. Whether there are real performance improvements over stock MySQL is a moot point, but read all the answers to this question for some background.
Whatever you do, caching is certainly your friend.
Your product set is large but traffic volume and traffic spikes have a big impact on tuning and hardware choice.
With that many products and categories your shop certainly won't be fast out of the box, but that also depends on your hardware and on the number of Magento StoreViews you will be using.
So the question you should ask is: what is the best caching mechanism to keep your frontend fast?
There are different approaches like using a combination of Varnish, Redis and nginx.
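Magento itself is PHP and ships its own cache backends, but the cache-aside idea behind the Redis layer looks roughly like this (a language-agnostic sketch shown in Python; all names are illustrative):

    import redis  # redis-py client

    r = redis.Redis(host="localhost", port=6379)

    def get_page(url, render):
        # Cache-aside: serve a rendered page from Redis when possible.
        # `render` stands in for whatever actually builds the page (templates,
        # MySQL queries, ...); Varnish and nginx apply the same idea one layer
        # further out, to whole HTTP responses.
        cached = r.get(url)
        if cached is not None:
            return cached.decode()
        html = render(url)           # the expensive part
        r.setex(url, 300, html)      # keep it for 5 minutes
        return html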

MySQL to PostgreSQL conversion and data synchronisation [closed]

I have a relatively large MySQL database (over 300 tables) which I desperately need to convert to PostgreSQL and keep synchronised with it, if not in real time then something close to it. Ideally I need a bi-directional data sync, or at least a one-directional MySQL-to-Postgres sync.
I have managed to convert the database and import the data, but synchronisation seems to be a real problem.
This solution from DBConvert should supposedly do exactly that. After many days of trying to make it work, I gave up. They don't even have a Linux client, which is strange considering that the vast majority of MySQL and Postgres databases run on Linux servers.
Is there an alternative to DBConvert's solution that would do the same?
Check out Pentaho's ETL tool Kettle and its client interface, Spoon: http://kettle.pentaho.com
Boy, you have a job ahead of you in terms of bidirectional synchronization. This is hard on the best of days, and it poses a lot of problems.
The tool I would look at first, actually, would be RubyRep. This gives you a basic framework for replication between your databases, and it supports a number of RDBMS's.
The second thing you have to think about is what you are actually doing and why this is a really bad idea. Your biggest issue is conflict resolution: managing what happens when two different people update the same record in the two different databases. This is not a trivial problem, and it requires thinking through the actual workflows and scenarios carefully.
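For the simpler one-directional case, here is a minimal sketch of a periodic MySQL-to-Postgres sync. It assumes a hypothetical table items(id, name, updated_at) present in both databases, with id as the primary key and updated_at as a change timestamp; connection details are placeholders.

    import mysql.connector
    import psycopg2

    my = mysql.connector.connect(host="mysql-host", user="app",
                                 password="secret", database="src")
    pg = psycopg2.connect(host="pg-host", user="app",
                          password="secret", dbname="dst")

    def sync_since(last_sync):
        # Copy rows changed in MySQL since `last_sync` into PostgreSQL (one-way).
        mcur = my.cursor()
        mcur.execute("SELECT id, name, updated_at FROM items WHERE updated_at > %s",
                     (last_sync,))
        pcur = pg.cursor()
        for id_, name, updated_at in mcur.fetchall():
            # Upsert keyed on id: the MySQL side always wins, which is exactly
            # the kind of conflict policy a bi-directional setup has to think
            # much harder about.
            pcur.execute(
                """INSERT INTO items (id, name, updated_at)
                   VALUES (%s, %s, %s)
                   ON CONFLICT (id) DO UPDATE
                   SET name = EXCLUDED.name, updated_at = EXCLUDED.updated_at""",
                (id_, name, updated_at))
        pg.commit()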

Database stored on a shared drive [closed]

I have to make a database for 250 students which will have around 100-200 columns. It is not possible to install MySQL or anything like that on the server; all we can get is a shared folder on the server. The client side can have anything installed. There will be around 5-10 clients who will add, edit, or delete records. I thought about SQLite as an option. Are there any security issues with it?
I need a database to be accessed by 5 to 10 clients. We do not have a full server per se, but rather a shared folder on a server. We therefore cannot install any server-side software, only client-side.
I would use SQLite. You could also use MS Access, but consider that problems with old MS Access databases are common in companies where Access was used back in the late nineties.
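For the shared-folder scenario, a minimal SQLite sketch (the UNC path and schema are placeholders): every client opens the same file, and a busy timeout makes a client wait briefly instead of erroring while another of the 5-10 clients is writing. Do test locking on the actual share, since file locking over network file systems is SQLite's known weak spot.

    import sqlite3

    # The database file lives in the shared folder; every client opens it directly.
    SHARED_DB = r"\\fileserver\shared\students.db"

    # Wait up to 10 seconds for another client's write to finish instead of
    # failing immediately with "database is locked".
    conn = sqlite3.connect(SHARED_DB, timeout=10)
    conn.execute("PRAGMA busy_timeout = 10000")

    conn.execute("""
        CREATE TABLE IF NOT EXISTS students (
            id   INTEGER PRIMARY KEY,
            name TEXT NOT NULL
            -- ...the remaining 100-200 columns go here
        )
    """)
    conn.commit()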
Have a look at the following questions:
https://superuser.com/questions/111798/small-database-recommendation-free
Free database for small datawarehouse
You will find enough information to get you started.
Otherwise, look at these:
SQLite
HSQLDB

Bad situation importing/exporting img files - MySQL database [closed]

I inherited a poorly created MySQL database and now I need to migrate the data to a new server.
Long story short, I need to keep it stored this way, and I use phpMyAdmin. Do you know of any tools to help migrate this 1.2GB MySQL table?
Hope I don't get slaughtered for this post...
MySQL Workbench (free as in beer, free as in speech) has dump and restore features.
http://dev.mysql.com/doc/workbench/en/wb-manage-server-data-dump-tab.html
phpMyAdmin and other admin tools also have those features, but web-based tools may not handle such a large table properly.
Dump your big table from your old server to a file on your desktop machine. Restore it to the new server. It may take overnight. So what? You only have to do it once (unless you mess it up the first time). Side benefit: you'll have a backup of your old table that you can put onto a DVD-RW and throw in your desk drawer.
You might have to segment the dump process by selecting rows a few million at a time. That's probably a good idea, because then you can restart the process if it crashes.
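One hedged way to script that segmentation, assuming the table has an auto-increment primary key id to slice on (the table, database, and size numbers below are placeholders):

    import subprocess

    TABLE = "big_table"
    DB = "mydb"
    CHUNK = 2_000_000        # rows per slice
    MAX_ID = 12_000_000      # roughly the highest id in the table

    # Dump the table definition once, without data, so the slices below can
    # skip it and be restored in any order without dropping each other.
    with open(f"{TABLE}_schema.sql", "w") as out:
        subprocess.run(["mysqldump", "--no-data", DB, TABLE],
                       stdout=out, check=True)

    for start in range(0, MAX_ID, CHUNK):
        where = f"id >= {start} AND id < {start + CHUNK}"
        outfile = f"{TABLE}_{start:09d}.sql"
        with open(outfile, "w") as out:
            # --where restricts mysqldump to matching rows, so each slice can
            # be restored (or retried after a crash) independently.
            # Connection options (host, user, password) are left to ~/.my.cnf.
            subprocess.run(["mysqldump", "--no-create-info",
                            "--where", where, DB, TABLE],
                           stdout=out, check=True)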
There are also tools (e.g. SQLyog) that can copy data directly from one server to another.
Happy data wrangling.