Insert CSV file into MySQL with user id

I'm working on a membership site where users are able to upload a CSV file containing sales data. The file will then be read and parsed, and the data will be used to dynamically generate charts.
My question is how to handle this CSV upload. Should it be uploaded to a folder and stored for later, or should it be inserted directly into a MySQL table?

That depends on how much processing needs to be done, I'd say. If it's "short" data and processing is quick, then your upload-handling script should be able to take care of it.
If it's a large file and you'd rather not tie up the user's browser/session while the data is parsed, then go for the upload-now-and-deal-with-it-later option.
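For the deferred route, the upload handler can simply stash the file and record a row for a background job to pick up. A minimal sketch in Python, where the pending_uploads table, its columns, and the storage path are all made-up names:

    # Sketch: accept the upload quickly, defer parsing to a background job.
    # The pending_uploads table, its columns, and UPLOAD_DIR are made-up names.
    import os
    import shutil
    import uuid
    import mysql.connector

    UPLOAD_DIR = "/var/uploads"  # assumed storage location

    def handle_upload(tmp_path, user_id, db):
        """Save the raw CSV and queue it, then return to the user immediately."""
        dest = os.path.join(UPLOAD_DIR, "%s.csv" % uuid.uuid4())
        shutil.move(tmp_path, dest)
        cur = db.cursor()
        cur.execute(
            "INSERT INTO pending_uploads (user_id, path, status)"
            " VALUES (%s, %s, 'pending')",
            (user_id, dest),
        )
        db.commit()

    # A cron-driven worker would later SELECT rows WHERE status = 'pending',
    # parse each file, insert the sales rows, and mark the upload 'done'.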

It depends on how you think the users will use this site.
What do you estimate the size of these users' files will be?
How often would they (if ever) upload a file twice? Can they download the charts?
If the files are small and intended for one-off use, you could upload and process them on the fly; if they require repeated access and analysis, you will save the users time by importing the data into the database.

MySQL's LOAD DATA INFILE command handles imports like that really nicely. Create the table you want to load into and then run the command; it has worked great and been very fast for me. I've loaded several thousand rows of data in under 5 seconds with it.
http://dev.mysql.com/doc/refman/5.5/en/load-data.html
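For example, run from application code (a sketch only: the sales table, its columns, the connection details, and the user id are all assumptions, and LOCAL requires local_infile to be enabled on both client and server):

    # Sketch: bulk-load an uploaded CSV and tag each row with the uploader's id.
    # The sales table, its columns, and the connection details are hypothetical.
    import mysql.connector

    db = mysql.connector.connect(
        host="localhost", user="app", password="secret",
        database="membership",
        allow_local_infile=True,  # required for LOAD DATA LOCAL
    )
    cur = db.cursor()
    cur.execute("""
        LOAD DATA LOCAL INFILE '/var/uploads/sales.csv'
        INTO TABLE sales
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
        (sale_date, product, amount)
        SET user_id = 42  -- attach the uploading user's id to every row
    """)
    db.commit()

Because the parsing happens inside the server, this avoids a round trip per row, which is why it stays fast even for thousands of rows.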

Related

Ruby on Rails - how to update my Rails database (MySQL) with values from a CSV file on a daily basis?

I have a CSV file which contains very detailed data on the products my company sells, and it gets updated daily.
I want my Rails app to import the data from the CSV file and update my MySQL database whenever changes are found.
What's the best way to achieve this? Some people have mentioned MySQL for Excel. Would that be the way to go?
I would appreciate any guidance on this. Thank you.
I'm not going to walk through the details, specifically because you have given no details and attempted nothing at all, so I'm going to stick with an overview.
From a systems point of view, I would (assuming your Rails app is live and not local):
host the CSV file somewhere that you (or whoever needs to) can update it and the application can fetch it (Dropbox, an S3 bucket, your own server, whatever);
run a daily cron rake task that downloads the CSV file;
parse the CSV file and decide what to update.
The trickiest part will be deciding what to update from the CSV, and that will depend on how the file itself can change: whether only new lines can be added, whether lines can be removed, whether columns within a line can change, and so on.
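The download-parse-upsert step might look something like the following. This is a sketch in Python rather than Ruby, purely to illustrate the logic; the feed URL, the products table, and its sku unique key are all assumptions:

    # Sketch of the daily sync: download the feed, upsert changed rows.
    # The URL, table, and columns are made up; sku is assumed to have a
    # UNIQUE index so ON DUPLICATE KEY UPDATE can detect existing rows.
    import csv
    import io
    import urllib.request
    import mysql.connector

    FEED_URL = "https://example.com/products.csv"  # assumed feed location

    def daily_sync():
        raw = urllib.request.urlopen(FEED_URL).read().decode("utf-8")
        db = mysql.connector.connect(host="localhost", user="app",
                                     password="secret", database="shop")
        cur = db.cursor()
        for row in csv.DictReader(io.StringIO(raw)):
            # Insert new products; update existing ones whose values changed.
            # (Detecting *removed* products would need an extra pass.)
            cur.execute(
                """INSERT INTO products (sku, name, price)
                   VALUES (%s, %s, %s)
                   ON DUPLICATE KEY UPDATE name = VALUES(name),
                                           price = VALUES(price)""",
                (row["sku"], row["name"], row["price"]),
            )
        db.commit()

    # Run daily_sync() from the daily cron job (a rake task in the Rails app).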

Load txt file into SQL Server database

I am still learning SQL Server.
The scenario is that I have a lot of .txt files with name format like DIAGNOSIS.YYMMDDHHSS.txt and only the YYMMDDHHSS is different from file to file. They are all saved in folder Z:\diagnosis.
How could I write a stored procedure to upload all .txt files with a name in the format DIAGNOSIS.YYMMDDHHSS.txt from folder Z:\diagnosis? Each file may only be loaded once.
Thank you
I would not do it with a stored proc; I would use SSIS, which has a Foreach Loop container that can iterate over files. When a file has been loaded, I would move it to an archive location so that it doesn't get processed the next time. Alternatively, you could create a table that stores the names of successfully processed files and have the loop skip any file in that table, but then you just keep accumulating more and more files to loop through; it's better to move processed files to a different location if you can.
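If you did want to script the loop yourself instead of using SSIS, the skip-already-processed idea could look like this sketch (Python with pyodbc; the DSN, paths, and table names are all assumptions):

    # Sketch: load each DIAGNOSIS.*.txt exactly once, then archive it.
    # The DSN, paths, and both tables are assumptions.
    import glob
    import os
    import shutil
    import pyodbc

    SOURCE = r"Z:\diagnosis"
    ARCHIVE = r"Z:\diagnosis\archive"

    conn = pyodbc.connect("DSN=warehouse")  # assumed ODBC data source
    cur = conn.cursor()

    for path in glob.glob(os.path.join(SOURCE, "DIAGNOSIS.*.txt")):
        name = os.path.basename(path)
        # Skip files already recorded as processed.
        cur.execute("SELECT 1 FROM processed_files WHERE file_name = ?", name)
        if cur.fetchone():
            continue
        with open(path, encoding="utf-8") as fh:
            for line in fh:
                cur.execute("INSERT INTO diagnosis_staging (raw_line) VALUES (?)",
                            line.rstrip("\n"))
        cur.execute("INSERT INTO processed_files (file_name) VALUES (?)", name)
        conn.commit()
        shutil.move(path, os.path.join(ARCHIVE, name))  # prevents reprocessing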
Personally, I would also put the file data into a staging table before loading it into the final table. We use two of them: one for the raw data and one for the cleaned data. We then transform into staging tables that match the relational tables in production, to make sure the data will meet the requirements there before anything touches production, and we send records that can't be inserted for one reason or another to an exception table. Working in the health care environment, you will want to make sure your process meets any government regulations for the storage of patient records in your country (see HIPAA in the US). You may have to load directly to production, or severely limit access to the staging tables and files.

load many txt files to mysql db

I want to store 7000 records, each consisting of a txt file, as BLOBs in my database using MySQL Workbench.
But:
1. I don't know how to do this automatically. Should I put all the files in one directory and then write a script that takes them one by one and inserts them into the appropriate rows?
2. I am not sure whether BLOB is the right type for this kind of file storage. Later I want to connect my database to a GUI, so that clicking a record opens its txt file in a new window.
Could you advise me on how to solve this?
You should write a script, yes. If it's hard for you to put them all in one folder, there are scripts and tools to do that too.
You can use C#, PHP, or any other language to scan those files and then insert them into the database.
Bunch of tutorials:
http://www.csharp-examples.net/get-files-from-directory/
Inserting record in to MySQL Database using C#
http://www.codeproject.com/Articles/43438/Connect-C-to-MySQL
http://net.tutsplus.com/articles/news/scanning-folders-with-php/
http://www.homeandlearn.co.uk/php/php13p3.html
As for the column type: a plain BLOB only holds up to 64 KB, so for larger text files use MEDIUMBLOB (up to 16 MB) or LONGBLOB (up to 4 GB).
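A script along those lines might look like this (a Python sketch; the documents table and its columns are made up):

    # Sketch: walk one folder and insert each txt file as a BLOB row.
    # The documents table and its columns are made-up names.
    import os
    import mysql.connector

    SOURCE_DIR = "/data/txt_files"  # the single folder holding the 7000 files

    db = mysql.connector.connect(host="localhost", user="app",
                                 password="secret", database="docs")
    cur = db.cursor()

    for name in sorted(os.listdir(SOURCE_DIR)):
        if not name.endswith(".txt"):
            continue
        with open(os.path.join(SOURCE_DIR, name), "rb") as fh:
            data = fh.read()
        # Storing the file name lets the GUI fetch and open the right
        # document in a new window later.
        cur.execute("INSERT INTO documents (file_name, content) VALUES (%s, %s)",
                    (name, data))

    db.commit()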

How to transfer large data between pages in Perl/CGI?

I have worked with CGI pages a lot and dealt with cookies and storing the data in the /tmp directory in Linux.
Basically I am running a query for millions of records using SQL and saving the result in a hash. I want to transfer that data to an Ajax call (which will eventually perform some calculations and return a graph using a Google API).
Alternatively, I want to transfer that data to another CGI page somehow.
PS: The data I am talking about here is on the order of 10-100+ MB.
Until now, I've been saving that data to a file on the server, but again, it's a hassle to deal with that file for each query.
You don't mention why it's a hassle to deal with the data on the server for each query, but assuming the hassle is working with the file, DBM::Deep might make it relatively easy to write the hash out and get it back again. Once you have that, you could create a simple script to return it as JSON and access it as needed from JavaScript or other pages, although I suspect the browser might slow down with a 100 MB JSON data structure.
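To illustrate the idea (persist the big hash on disk once, then serve slices of it as JSON on demand), here is the same shape in Python's standard shelve module; in Perl, DBM::Deep plays this role:

    # Sketch of the idea only: persist the big hash on disk once, then
    # serve slices of it as JSON on demand. DBM::Deep fills this role in
    # Perl; Python's standard shelve module is the closest analogue here.
    import json
    import shelve

    def save_results(rows):
        """Write the query results out once, right after the big SQL query."""
        with shelve.open("/tmp/query_cache") as db:
            for key, values in rows.items():
                db[key] = values  # each value is pickled to disk, not held in RAM

    def fetch_json(key):
        """Called from a small CGI endpoint; returns one slice for the Ajax call."""
        with shelve.open("/tmp/query_cache", flag="r") as db:
            return json.dumps(db.get(key, []))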

What's the best way of backing up a rails app data?

I need to make a backup system for my Rails app, but this has to be a little special: it doesn't have to back up all the database info and files into a single file or folder, but it has to back up the database info and attachment files per user. I mean, each of these backups should be able to regenerate all the information and files for one single user.
My questions are:
Is this possible? What's the best way to do it? And, if it's impossible or a bad idea, why?
Note: The database is MySQL.
Note 2: I used Paperclip for the users' uploads.
I'm guessing you have an app that backs up data when a user clicks on something, right? I'm thinking: get all the info connected to the user (this depends on how you built your user model, so you may want a get_all_info method), then write it out in SQL format to a file that you save as .sql (using either File.new or Logger.new).
I would dump the entire user object and its related objects into a single XML file. As you build the XML, collect all the attachment files, write the XML plus the files into one directory, and then compress it.
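That approach might be sketched like this (Python for illustration only; in a Rails app this would live in a rake task or background job, and every name here is an assumption):

    # Sketch: gather one user's data plus attachment files into a single
    # compressed archive. All paths and the dump format are assumptions
    # (the answer above suggests XML; JSON is used here for brevity).
    import json
    import os
    import shutil
    import tarfile
    import tempfile

    def backup_user(user_record, attachment_paths, out_dir):
        """Write <out_dir>/user_<id>.tar.gz holding a data dump plus files."""
        with tempfile.TemporaryDirectory() as tmp:
            with open(os.path.join(tmp, "user.json"), "w") as fh:
                json.dump(user_record, fh, indent=2)
            for path in attachment_paths:  # e.g. the Paperclip attachment files
                shutil.copy(path, tmp)
            archive = os.path.join(out_dir, "user_%s.tar.gz" % user_record["id"])
            with tarfile.open(archive, "w:gz") as tar:
                tar.add(tmp, arcname="user_%s" % user_record["id"])
        return archive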
I think there are definitely use cases for a feature like this, but be sure to run it in a background process, and only when needed, so as not to bog down the web server. Take a look at http://github.com/tobi/delayed_job or http://github.com/defunkt/resque.