I want to allow file uploads (public and per-user), but I'm not sure how to do it properly.
I've read that it is not recommended to use MySQL for this, and that I should instead use the file system for the files and index them in the database. I remember reading a popular (heavily upvoted) Q&A about this on SO, but I can't find it (please send a link if you can).
So how should I do it? Should I use some inaccessible folder and store files there with an ID as their name, store that ID in the files table along with a user_id, and, when the user requests a file, check auth and then send the corresponding file?
There's no single answer to this question. It depends on how you want your application to work. It could be perfectly fine, for example, to put the files in a directory that is HTTP-accessible, if you don't need to restrict access. Then you don't need to pass the file through any code; you just link directly to it.
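If you do need to restrict access, the scheme you describe in the question is the usual one: store the file under an opaque ID in a directory outside the web root, keep the metadata (including user_id) in MySQL, and stream the file only after an auth check. A minimal sketch with Express and mysql2, where the files table, the column names, and the way userId comes out of your auth layer are all assumptions for illustration:

    import express from "express";
    import path from "node:path";
    import { createPool } from "mysql2/promise";

    const app = express();
    // Directory outside the web root; nothing in here is served directly.
    const STORAGE_DIR = "/var/app-storage/uploads";
    const pool = createPool({ host: "localhost", user: "app", database: "app" });

    app.get("/files/:id", async (req, res) => {
      // However your session/auth middleware exposes the current user.
      const userId = (req as any).userId;
      // Look up the file row and verify ownership in one query.
      const [rows] = await pool.execute(
        "SELECT id, original_name FROM files WHERE id = ? AND user_id = ?",
        [req.params.id, userId]
      );
      const file = (rows as { id: number; original_name: string }[])[0];
      if (!file) return res.status(404).end();
      // On disk the file is named by its opaque ID, not the user-supplied name.
      res.download(path.join(STORAGE_DIR, String(file.id)), file.original_name);
    });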
There are also legitimate reasons to store a file in the database. For instance, it's automatically included in backups, it is guaranteed to be deleted when you delete the database row, it obeys transaction semantics, and so on.
This has been asked frequently on Stack Overflow. Here are a few links to ones that I have answered.
Should I use MySQL blob field type? (2009)
What is difference between storing data in a blob, vs. storing a pointer to a file? (2012)
Save file as blob in MYSQL database or as file path (2018)
I also cover this in the chapter "Phantom Files" in my book, SQL Antipatterns Volume 1: Avoiding the Pitfalls of Database Programming.
tl;dr
In my node.js application I create PDF documents. What is the best/right way to save them? Right now I use the node.js file server and shell.js to do it.
I am working on a node.js web application to manage apartments and tenants for learning purposes, and at some point I create PDF documents that I want to save under a path like
/documents/building_name/apartment_name/tenant_name/year/example.pdf
Now if the user wants to change the building, apartment, or tenant name via an HTTP PUT request, I change the database, but I also want to change the path.
Both work, but I can't write good tests for these functions.
Now a friend told me that it's bad practice to save documents on a file server and that I should use BLOBs instead.
On the other hand, Google doesn't really agree on using BLOBs either.
So what is the right way to save documents?
Thanks
Amit
You should first define a source of truth. Unless you're legally obliged to keep copies of those files, and provided they are not accessed very often, I wouldn't bother storing them at all; just generate them upon request.
If you do store them, keep the DB clean: BLOBs will make it huge. Put the files into cold storage (again, assuming they are not accessed too frequently) without those paths. If the paths rely on information that changes often, that won't be performant for either the file server or your system.
Instead, store a revision number in your DB that the file can be found under, and limit the path structure to information that rarely changes.
Like {building}/{apartment}/{tenant}_{revision}.pdf
That, depending on your backup structure, will allow you to time-travel if necessary and doesn't force a re-index all the time.
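To illustrate, a path builder along those lines might look like the sketch below (all names are made up for the example):

    // Build a stable path from rarely-changing parts plus a revision
    // counter stored in the DB; every field here is hypothetical.
    interface DocumentRow {
      building: string;   // rarely changes
      apartment: string;  // rarely changes
      tenant: string;
      revision: number;   // bumped in the DB whenever a name changes
    }

    function documentPath(doc: DocumentRow): string {
      return `${doc.building}/${doc.apartment}/${doc.tenant}_${doc.revision}.pdf`;
    }

    // On a rename, bump the revision and write a new file rather than
    // moving the old one; old revisions stay put for time-travel.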
Note: I don't know too much about your use case.
Background:
I am making a website where I want modular administrative rights for read/write/edit privileges. My intent is to allow for any number of access-level types, and to base it on folder structure.
As an example, root admins would have read/write/edit for all site pages. Group A may have read/write/edit to all files in the path www.example.com/section1/ (including subfolders), Group B would have read/write/edit to all files in www.example.com/section2/, and so on.
I have considered two options to implement this. The first is to create a MySQL database that would hold:
Group Name (reference name for the access group)
Read (list of folders the group can read, separated by commas)
Write (list of folders the group can write new content to, separated by commas)
Edit (list of folders the group can change already-existing information in, separated by commas)
The other option I considered is creating a 'GroupAccess.txt' file somewhere and hand-jamming the information into that to reference.
Question: What are the advantages of each of these systems? Specifically, what do I gain from putting admin access information in a database versus a text file, and vice versa? (I'm looking for information on potential speed issues, ease of maintainability, and ease of editing/changing the information that will be stored.)
Note: I'm not looking for a 'which is better', I want to know specific advantages so I can make a better informed decision on what's best for me.
The first thing that comes to mind is that the database would be more secure than a text file, for the simple reason that a text file can be read over the internet: most web servers serve .txt files by default. That would let users with restricted access, and even non-users of the site, see the whole structure of your site, which in turn can leave you more open to attacks on certain areas of it.
Another benefit of using a database is that you can easily use a join to check whether a user has access to some content in the database, whereas with a file you'd need to read the file, get the permissions, and then go build the SQL and get the data from the database.
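For example, if you normalize the permissions into rows instead of comma-separated lists (the schema and names below are hypothetical, just to show the shape of the join), the access check becomes a single query:

    import type { Pool } from "mysql2/promise";

    // Hypothetical normalized schema:
    //   user_groups(user_id, group_id)
    //   group_permissions(group_id, folder, can_read, can_write, can_edit)
    async function canRead(pool: Pool, userId: number, folder: string): Promise<boolean> {
      const [rows] = await pool.execute(
        `SELECT 1
           FROM user_groups ug
           JOIN group_permissions gp ON gp.group_id = ug.group_id
          WHERE ug.user_id = ? AND gp.folder = ? AND gp.can_read = 1
          LIMIT 1`,
        [userId, folder]
      );
      return (rows as unknown[]).length > 0;
    }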
Those are just two of the things that have stuck out from reading your question, hope it helps.
Is it possible to store .doc/.txt or image files directly in the database by making the attributes of type BLOB? I need a good working example of storing files inside the database and also, obviously, retrieving them from the database in their original form.
The blog entry http://mirificampress.com/permalink/saving_a_file_into_mysql describes the overall process in quite a lot of detail for PHP.
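The linked post is PHP; the same idea in node.js with the mysql2 package looks roughly like this, assuming a table along the lines of files(id INT AUTO_INCREMENT PRIMARY KEY, name VARCHAR(255), data LONGBLOB):

    import { readFile, writeFile } from "node:fs/promises";
    import { createPool } from "mysql2/promise";

    const pool = createPool({ host: "localhost", user: "app", database: "app" });

    // Store: read the file into a Buffer and bind it to the BLOB column.
    async function storeFile(filePath: string, name: string): Promise<void> {
      const data = await readFile(filePath); // Buffer
      await pool.execute("INSERT INTO files (name, data) VALUES (?, ?)", [name, data]);
    }

    // Retrieve: the driver returns the BLOB as a Buffer; write it out
    // unchanged and you get the original file back byte-for-byte.
    async function fetchFile(id: number, outPath: string): Promise<void> {
      const [rows] = await pool.execute("SELECT data FROM files WHERE id = ?", [id]);
      const row = (rows as { data: Buffer }[])[0];
      if (!row) throw new Error("file not found");
      await writeFile(outPath, row.data);
    }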
I'm working on a membership site where users are able to upload a CSV file containing sales data. The file will then be read and parsed, and the data will be used to dynamically create charts.
My question is: how should I handle this CSV upload? Should it be uploaded to a folder and stored for later, or should it be inserted directly into a MySQL table?
Depends on how much processing needs to be done, I'd say. If it's "short" data and processing is quick, then your upload-handling script should be able to take care of it.
If it's a large file and you'd rather not tie up the user's browser/session while the data's parsed, then do the upload-now-and-deal-with-it-later option.
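One common shape for that second option (sketched here with Express and multer; the csv_uploads table and userId are assumptions for illustration) is to persist the raw file immediately, record a pending row, and let a separate worker parse it:

    import express from "express";
    import multer from "multer";
    import { createPool } from "mysql2/promise";

    const app = express();
    const pool = createPool({ host: "localhost", user: "app", database: "app" });
    // multer writes the upload to disk and hands us the stored path.
    const upload = multer({ dest: "/var/app-storage/csv" });

    app.post("/upload", upload.single("csv"), async (req, res) => {
      if (!req.file) return res.status(400).send("no file");
      // Mark the file pending; a background worker parses the CSV later
      // and fills the real data tables the charts read from.
      await pool.execute(
        "INSERT INTO csv_uploads (user_id, path, status) VALUES (?, ?, 'pending')",
        [(req as any).userId, req.file.path]
      );
      res.status(202).send("Upload received; your charts will be ready shortly.");
    });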
It depends on how you think the users will use this site.
What do you estimate the size of the files for these users to be?
How often (if ever) would they upload a file twice? Can they download the charts?
If the files are small and more for one-off use, you could upload and process them on the fly; if they require repeated access and analysis, then you will save the users time by importing the data into the database.
The LOAD DATA INFILE command in MySQL handles uploads like that really nicely. If you create the table you want to load into and then use that command, it works great and is super quick. I've loaded several thousand rows of data in under 5 seconds with it.
http://dev.mysql.com/doc/refman/5.5/en/load-data.html
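For example (the table, columns, and CSV layout below are made up; plain LOAD DATA INFILE reads a file on the database server and needs the FILE privilege plus a secure_file_priv-permitted location, while LOAD DATA LOCAL INFILE reads from the client if you enable it):

    import { createPool } from "mysql2/promise";

    const pool = createPool({ host: "localhost", user: "app", database: "app" });

    // Bulk-load an uploaded CSV straight into a table. Note: use query(),
    // not execute(), since LOAD DATA can't run as a prepared statement.
    async function importSales(csvPath: string): Promise<void> {
      await pool.query(
        `LOAD DATA INFILE ?
           INTO TABLE sales
           FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
           LINES TERMINATED BY '\\n'
           IGNORE 1 LINES
           (sale_date, product, amount)`,
        [csvPath]
      );
    }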
I need to build a backup system for my Rails app, but it has to be a little special: it doesn't have to back up all the database info and files into a single file or folder; instead, it has to back up the database info and attachment files per user. I mean, every one of these backups should be able to regenerate all the information and files for one single user.
My questions are:
Is this possible? What's the best way to do it? And, if it's impossible or a bad idea at all, why is it?
Note: The database is a MySQL one.
Note 2: I used Paperclip for the users' uploads.
I'm guessing you have an app that backs up data when a user clicks on something, right? I'm thinking: get all the info connected to the user (it depends on how you did your user model, so maybe you should have a get_all_info method), then write it out in SQL format to a file, which you save as .sql (using either File.new or Logger.new).
I would dump the entire user object and related objects into a single XML file. As you go through the creation of the XML, grab all the files and write the XML plus all the files into one directory, then compress it.
I think there are definitely use cases for a feature like this, but be sure to have it run in a background process, and only when needed, so it doesn't bog down the web server. Take a look at http://github.com/tobi/delayed_job or http://github.com/defunkt/resque.
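Your app is Rails, but the shape of such a background job is language-agnostic; here it is sketched in TypeScript with entirely hypothetical table names and paths, using mysqldump's --where filter for the per-user rows and tar for the archive:

    import { execFile } from "node:child_process";
    import { promisify } from "node:util";
    import { mkdir, cp } from "node:fs/promises";

    const run = promisify(execFile);

    // Per-user backup: dump only this user's rows, copy their Paperclip
    // attachments, then compress the directory. Names/paths are made up.
    async function backupUser(userId: number): Promise<string> {
      const dir = `/var/backups/user_${userId}`;
      await mkdir(dir, { recursive: true });

      // --where applies the same condition to every listed table, so this
      // assumes each of them carries a user_id column.
      await run("mysqldump", [
        "-u", "backup", "app_production",
        "documents", "payments", "notes",
        `--where=user_id=${userId}`,
        `--result-file=${dir}/data.sql`,
      ]);

      // Paperclip's default storage keys attachments by record; adjust the
      // source path to match your actual attachment layout.
      await cp(`/srv/app/public/system/users/${userId}`, `${dir}/files`, { recursive: true });

      await run("tar", ["-czf", `${dir}.tar.gz`, "-C", "/var/backups", `user_${userId}`]);
      return `${dir}.tar.gz`;
    }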