Load Excel files into MySQL automatically [duplicate]

This question already has answers here:
Automate transfer of csv file to MySQL
(3 answers)
Closed 8 years ago.
I would like to know the best way to automate loading an Excel file into a MySQL database.
The file would most likely be .csv, although if there is a solution for text files, I can live with that. The data in the file would have to replace what is already in the database table.
I am searching for a solution meanwhile, and have found several for doing approximately this manually, as in loading a file once, but I need this to happen every few minutes, if possible.

There is a native MySQL feature that imports a CSV file easily: LOAD DATA INFILE. All you need to do is declare your field and line separators correctly if the default settings do not match your input file.
Please note that a CSV file is not an Excel file. It is a file format that Excel happens to be able to read.
If you really want to import Excel files (a .xlsx file, that is), then you need an external library to parse the Excel file first, as MySQL cannot read it natively.
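A minimal sketch of the statement, assuming a hypothetical table prices and file /var/data/prices.csv (the server's secure_file_priv setting must allow reading from that directory):

    -- Replace the existing rows, as the question requires
    TRUNCATE TABLE prices;

    -- Separators declared explicitly; adjust them to match your file
    LOAD DATA INFILE '/var/data/prices.csv'
    INTO TABLE prices
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;  -- skip the header row, if the file has one

To run this every few minutes, the usual approach is a cron job (or Windows Task Scheduler) that invokes the mysql command-line client with a script like the one above, since LOAD DATA is not permitted inside stored procedures or events.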

Related

Appending data to SSIS flexible file storage

[SSIS] Do we have any settings for the SSIS Flexible File Destination to support appending data to an Apache Parquet file? I am writing my data to the file in a loop, and it looks like every write overwrites the existing data. I would appreciate any help. Thanks.
Writing data in a loop; expecting the data to be appended, but it is getting overwritten.
The Flexible File Task is the Azure equivalent of the File System Task, which is concerned with copying and deleting files and folders.
The Flexible File Destination is the sink for writing Parquet files. Based on this MSDN question from 2 years ago, Append has been a requested feature for about 4 years:
https://learn.microsoft.com/en-us/answers/questions/226219/flexible-file-destination-enable-appending-data-mo

Converting to parquet file format while load to hive tables [duplicate]

This question already has an answer here:
Is it possible to load parquet table directly from file?
(1 answer)
Closed 7 years ago.
We want to do real-time replication from MySQL to HDFS, with the files being stored in the Parquet format in the HDFS cluster.
As far as we know, we can do this using either
1) Tungsten Replicator, or
2) MySQL server's live replication to HDFS.
But our problem is that neither of them supports conversion to Parquet while loading data into HDFS.
So we just wanted to know whether there is any way to do real-time replication with the files being stored as Parquet in the HDFS cluster.
The second question: when you load a CSV file into a Hive table using "LOAD DATA INPATH", and the table has been defined with the Parquet file format, will Hive convert the file to Parquet, or do we need to write a utility to convert the file to Parquet and then load it?
Second question: the CREATE TABLE statement should specify the Parquet storage format with STORED AS PARQUET. Note that LOAD DATA INPATH only moves the file into the table's directory and does not convert formats, so the usual pattern is to load the CSV into a text-format staging table and then INSERT ... SELECT into the Parquet table, as sketched below.
It all boils down to the version of Hive; some versions do not support the Parquet format.
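A minimal HiveQL sketch of that staging pattern (the table names, columns, and HDFS path are hypothetical):

    -- Text-format staging table matching the CSV layout
    CREATE TABLE staging_csv (id INT, name STRING, amount DOUBLE)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE;

    -- LOAD DATA INPATH only moves the file; no conversion happens here
    LOAD DATA INPATH '/user/etl/input.csv' INTO TABLE staging_csv;

    -- Parquet-backed target table
    CREATE TABLE target_parquet (id INT, name STRING, amount DOUBLE)
    STORED AS PARQUET;

    -- The INSERT ... SELECT rewrites the rows, producing Parquet files
    INSERT OVERWRITE TABLE target_parquet
    SELECT * FROM staging_csv;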

Copy and Paste Data into PhpMyAdmin

After searching the internet for a bit, I'm pretty sure this hasn't been answered directly, so I'm asking here.
I am currently creating a Runescape (laugh at me all you want ;P) skilling calculator for a school programming project, and am creating databases for XP values with phpMyAdmin, using information that is already on the web.
Instead of having to manually type out approximately 6000 different entries, each with 3 columns, I would rather copy and paste them, saving time and reducing the chance of errors. For example, I want to copy and paste all the information from here:
http://www.tip.it/runescape/pages/view/divination_calc.htm into phpMyAdmin in bulk, not one entry at a time. I was wondering if this was possible in any way.
I would suggest copying and pasting the HTML table into Excel, tidying up the columns to match your database, saving it as a CSV, and importing it using phpMyAdmin's import function.
Here's an article I found on importing a CSV into phpMyAdmin: importing a CSV into phpmyadmin
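For reference, phpMyAdmin's "CSV using LOAD DATA" import option effectively runs a statement like the following; the table, columns, and file name here are hypothetical:

    -- Hypothetical table for the scraped XP values
    CREATE TABLE xp_values (level INT, xp INT, action VARCHAR(100));

    -- Roughly what the import executes behind the scenes
    LOAD DATA LOCAL INFILE 'divination_xp.csv'
    INTO TABLE xp_values
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n';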

How To Split Discogs Database Dump To Upload?

Discogs Database Dump
Please see the bottom of this page for the latest dump. I am trying to upload the likes of discogs_20150301_labels.xml, however my server only allows a maximum file upload of 500MB (I'm using the 1and1 shared server on the Advanced Plus package).
How can I split this file to upload it in chunks and continue my work?
The answer to the following related Stack Overflow question lists several XML split utilities for you to research:
XML Split of a Large file

HSQLDB: Easy way to have all created files deleted after closing connection?

I have an app/tool which reads from a CSV file and writes to another, processing it using HSQLDB.
I want the CSV file to be the only output, and the database files should disappear after the process finishes.
I tried to use mem storage, but that prevents HSQLDB from writing to the CSV file.
I also tried to DROP SCHEMA before closing the connection, but that does not remove the files.
I don't like deleting the files manually, as that's HSQLDB implementation-specific and can change over time.
Is there some systematic way to leave only the CSV file?
Ideally, I'd like some option which would allow HSQLDB to write the CSV file while using in-memory storage.
HSQLDB never deletes its own files. You can reduce the number of files it creates by using:
SET FILES LOG FALSE
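For reference, a sketch assuming HSQLDB 2.x (the table, columns, and file name are hypothetical). A TEXT table writes its rows directly to a delimited file, which is why it needs a file-backed database and does not work with mem: storage:

    -- Avoid the .log file
    SET FILES LOG FALSE;

    -- A TEXT table stores its rows in a CSV-style file
    CREATE TEXT TABLE result (id INT, name VARCHAR(100));
    SET TABLE result SOURCE 'output.csv;fs=,';

    -- Compact on shutdown to leave fewer leftover files
    SHUTDOWN COMPACT;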