I am using a MySQL Server database that contains about 500GB of data, and performance is very slow: a search takes 15-20 minutes to show results. Please advise me on the best way to make searches return within seconds. Can I use Big Data / Hadoop or anything else? Please advise.
Thanks
If query speed is important, look into column-store databases. That can be either with or without Hadoop.
With Hadoop, it's important to choose the right file format; Parquet is a popular option (a rough export sketch follows at the end of this answer). Query these files with:
Cloudera Impala
Apache Drill
Without Hadoop:
HP Vertica
Amazon Redshift
Greenplum
https://en.wikipedia.org/wiki/Column-oriented_DBMS
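For illustration, here is a rough sketch of the export side: pulling a MySQL table into Parquet files that a columnar engine such as Impala or Drill could then query. It assumes pandas, SQLAlchemy (with a MySQL driver such as PyMySQL) and pyarrow are installed; the connection URL and the "orders" table name are made up.

```python
# Sketch: export a MySQL table to Parquet for a columnar engine to query.
# Assumes pandas, SQLAlchemy (with PyMySQL) and pyarrow are installed;
# the connection URL and the "orders" table are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("mysql+pymysql://app:secret@localhost/app_db")

# Pull the table in manageable chunks and write each chunk as a Parquet file.
for i, chunk in enumerate(
    pd.read_sql("SELECT * FROM orders", engine, chunksize=100_000)
):
    chunk.to_parquet(f"orders_part_{i:05d}.parquet", index=False)
```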
My company's site uses a MySQL database. One of our clients, who is trying to take advantage of our API, is only able to give us the data in the form of an MSSQL .bak file.
I have been trying to import the file using the migration tool built into MySQL Workbench, but have had no luck.
On top of that, I am trying to see if this can be done in PowerShell, as I would like to automate this process in the future.
Any suggestions or help would be appreciated.
You cannot. MS SQL Server backups are proprietary to MS SQL Server and cannot be used with any other RDBMS. You will need to restore this backup to SQL Server, then use an additional tool to transfer the data from SQL Server into MySQL.
Can you do that second portion through PowerShell? Probably. Though SSIS would probably be a better method.
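As a rough illustration of that second step (not PowerShell, but easy to adapt), here is a minimal Python sketch that copies rows from the restored SQL Server database into MySQL. It assumes pyodbc and mysql-connector-python are installed and that the .bak has already been restored to a local SQL Server instance; the connection details and the "customers" table are made up.

```python
# Sketch: copy rows from a restored SQL Server database into MySQL.
# Assumes pyodbc and mysql-connector-python are installed; connection
# details and the "customers" table are hypothetical.
import pyodbc
import mysql.connector

src = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=RestoredDb;Trusted_Connection=yes;"
)
dst = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="app_db"
)

src_cur = src.cursor()
dst_cur = dst.cursor()

src_cur.execute("SELECT id, name, email FROM dbo.customers")
for batch in iter(lambda: src_cur.fetchmany(1000), []):
    dst_cur.executemany(
        "INSERT INTO customers (id, name, email) VALUES (%s, %s, %s)",
        [tuple(row) for row in batch],
    )
    dst.commit()

src.close()
dst.close()
```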
So I have a lot of .dmp files that were used to make the tables in my SQL Developer (Oracle) database. Is there a way I could use those same dump files for my MySQL database tables?
I have never done that, but Googling around I found out that there are tools that make it possible.
OraDump-to-MySQL is a program that exports data from Oracle dump files into a MySQL, MariaDB or Percona database. The program reads directly from the dump, so an Oracle installation is not required. Command-line support allows you to script, automate and schedule the conversion process.
I'm not posting a link (so that no one calls it spamming); I guess you'll be able to find it yourself.
I have a website with lakhs (hundreds of thousands) of rows, and I am now planning to upgrade its database to MongoDB, a NoSQL database. Could anyone help me find an easy way to upgrade to MongoDB with any of the tools available on the internet?
You can export from MySQL in CSV format and then use the Mongify tool. I just had a few problems with handling dates and binary data.
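If Mongify gives you trouble, a hand-rolled load is also an option. Here is a minimal sketch using pymongo, assuming the CSV was exported with a header row and MongoDB runs locally; the file name "users.csv" and the database/collection names are made up.

```python
# Sketch: load a CSV exported from MySQL into MongoDB by hand.
# Assumes pymongo is installed and MongoDB is running locally;
# "users.csv" and the database/collection names are hypothetical.
import csv
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
collection = client["app_db"]["users"]

with open("users.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))  # header row becomes the field names

if rows:
    collection.insert_many(rows)  # each CSV row becomes one document

client.close()
```

Note that every value comes in as a string this way, which is exactly why dates and binary data need extra handling.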
I have been continuously looking for a decent tutorial on importing a MySQL database into Neo4j, but I haven't found one that is easily applicable. I am using Neo4j version 3.0.3 and MySQL version 8.2.
Can you help me with any good tutorial or a tool that can dump a MySQL database directly into Neo4j? Both should target Windows OS.
Thank you.
Hi,
Michael has made such a tool, but I have never used it.
Take a look at this repo: https://github.com/jexp/neo4j-rdbms-import
Usually I run some SQL queries and save the results to a CSV file (see "How to output MySQL query results in CSV format?").
Then I load them into Neo4j with LOAD CSV, as in the sketch below (see http://jexp.de/blog/2014/06/load-csv-into-neo4j-quickly-and-successfully/).
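For example, this is roughly how the LOAD CSV step can be driven from Python with the official Neo4j driver. It assumes the neo4j package is installed, Bolt is reachable on the default port, and people.csv sits in Neo4j's import directory; the credentials, the file name and the Person label are made up.

```python
# Sketch: run LOAD CSV against Neo4j from Python via the official driver.
# Assumes the "neo4j" package is installed and people.csv is in Neo4j's
# import directory; credentials, file name and label are hypothetical.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

load_csv = """
LOAD CSV WITH HEADERS FROM 'file:///people.csv' AS row
CREATE (:Person {id: row.id, name: row.name})
"""

with driver.session() as session:
    session.run(load_csv)

driver.close()
```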
Cheers
That's about it. I can always just dump it to CSV and read it in, but I was hoping to avoid that.
Since both InterBase and MySQL have ODBC drivers, how about using your favorite development environment to write an app that opens each table in the IB database and copies it into the MySQL database? There are various languages and IDEs that support data access using ODBC.
This would be nicer than using CSV because your code could also copy the schema while copying each table; a rough sketch follows below.
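To make the idea concrete, here is a rough Python/pyodbc sketch that walks the source tables through ODBC metadata, recreates them on the MySQL side, and copies the rows. It assumes pyodbc is installed and ODBC DSNs named interbase_src and mysql_dst exist; the naive type mapping is a deliberate simplification, so treat it as a starting point only.

```python
# Sketch: copy tables from InterBase to MySQL over ODBC.
# Assumes pyodbc and two configured DSNs ("interbase_src", "mysql_dst");
# the crude CREATE TABLE generation is a hypothetical simplification.
import pyodbc

src = pyodbc.connect("DSN=interbase_src")
dst = pyodbc.connect("DSN=mysql_dst")
meta_cur, dst_cur = src.cursor(), dst.cursor()

for tbl in meta_cur.tables(tableType="TABLE").fetchall():
    name = tbl.table_name
    cols = meta_cur.columns(table=name).fetchall()

    # Very rough schema copy; real code would map types carefully.
    col_defs = ", ".join(f"{c.column_name} {c.type_name}" for c in cols)
    dst_cur.execute(f"CREATE TABLE {name.lower()} ({col_defs})")

    data_cur = src.cursor()
    data_cur.execute(f"SELECT * FROM {name}")
    placeholders = ", ".join("?" for _ in cols)
    for batch in iter(lambda: data_cur.fetchmany(500), []):
        dst_cur.executemany(
            f"INSERT INTO {name.lower()} VALUES ({placeholders})",
            [tuple(row) for row in batch],
        )
    dst.commit()

src.close()
dst.close()
```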
You can use Database Workbench. It supports cross-database development: use its Schema Compare and Migration Tools to compare testing and deployed databases, and to migrate existing databases to different database systems.
PS: I don't know why you want to migrate from InterBase to MySQL, but you can also take a look at Firebird.