Could you offer a little consulting? A client would like to store a PDF as an attachment within each record of a database. Each file is around 1 MB, and we are talking about 5,000 - 10,000 records. MS Access (2007) is limited to 2 GB per database; however, I see another, earlier problem with an application like this: performance. Does anybody have experience with this? I have just learned that you can link to this binary data instead and store the files on a file server.
Thank you for any advice!
Regards, Urs
As an alternative, I've always stored the file on a file server, and I store the file path in a field of the record. That's always worked well for me.
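As a rough sketch of that pattern (assuming a C# helper on Windows, the ACE OLE DB provider, and made-up paths, table, and field names), the record stores only the UNC path while the PDF itself lives on the file server:

```csharp
using System.Data.OleDb; // Windows-only; assumes the ACE provider is installed
using System.IO;

class LinkPdfToRecord
{
    static void Main()
    {
        // Copy the PDF to the file server instead of embedding it.
        var source = @"C:\incoming\contract.pdf";           // made-up source
        var target = @"\\fileserver\docs\contract_4711.pdf"; // made-up share
        File.Copy(source, target, overwrite: true);

        using var conn = new OleDbConnection(
            @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\app.accdb;");
        conn.Open();

        // Store only the path; the database stays small and fast.
        // OleDb binds parameters by position, not by name.
        using var cmd = new OleDbCommand(
            "UPDATE Records SET PdfPath = ? WHERE ID = ?", conn);
        cmd.Parameters.AddWithValue("@p1", target);
        cmd.Parameters.AddWithValue("@p2", 4711);
        cmd.ExecuteNonQuery();
    }
}
```

With 5,000 - 10,000 files at roughly 1 MB each, the database itself stays far below the 2 GB limit this way, and fetching a PDF is just a file read off the share.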
Apologies in advance if this is a dumb question, but with Google's release of "File Stream", does anyone know how this would impact MS Access dbs stored on it?
My "setup" is that I have users that are routinely using an AccessDB that was built for them years ago, and I'm trying to replicate it in MS SQL. Not by my choice, the file is stored by the clients on a Google Drive: Which essentially means that I can only look at the vba and do testing when the clients are not using it. The clients also use Google's sync to access the file.
So my question is two-fold. First, the clients are looking to move to File Stream and have asked how this would impact their ability to use the Access file (I have no idea). Second, will File Stream make it possible to have multiple users on an MS Access database without causing conflicts or problems? Would I still need to make sure I have exclusive access to the file while making changes to the VBA code?
Thanks in advance!
UPDATE: Over 1k visitors have had the same question, so I'm updating the description as asked by the person who marked this as -1 (now marked -2, without feedback). The only part answered so far is that having multiple simultaneous users isn't sound. What remains is why even a single-user scenario can corrupt a file stored on File Stream: compact/repair fails, at least when the file is large.
Based on experience, you need to make sure only one person has the database open at a time. If two people open the database, then changes will not be saved. Try to find a different solution if possible.
The MS Access database engine is fundamentally built on the database primitives provided by the MS file system, including the record read/write/lock primitives. (Bet you didn't even know that MS operating systems had a native database system!)
Access records and tables are NOT filesystem records and tables, BUT the Access system of records and tables is built on top of the native system of record locking. (The native system doesn't have, and never had, anything smaller than file-level authentication and permissions, so Access built a whole new system on top.)
That means that any file system, or any network file system redirector, which doesn't support record locking the way the MS file system and the SMB protocol do, can't support simultaneous multi-user access to an Access database.
I'm not familiar with Google File Stream, but I assume it's a way to access binary objects through a database API? Normally that means the API doesn't look inside the binary object, so it can't lock individual sections of the file for read, write, or read/write. That means you can't read part of the object while somebody else writes a different part, which means you can't share access. ... And even if File Stream means something different to Google, the chance that they support SMB semantics inside it is slim to none, right?
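To make the record-locking point concrete, here is a minimal C# sketch of the byte-range locking that the Win32 file API and SMB expose, which is what Access builds its record locks on top of; the share path and region size are invented for illustration:

```csharp
using System.IO;

class ByteRangeLockDemo
{
    static void Main()
    {
        // Open a shared file WITHOUT an exclusive file-level lock,
        // the way Access opens an .mdb/.accdb for multi-user access.
        using var fs = new FileStream(@"\\server\share\demo.dat",
            FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite);

        // Lock only a 4 KB region (one "record"), not the whole file;
        // other users can still read and write the rest of it.
        fs.Lock(position: 0, length: 4096);
        try
        {
            fs.Write(new byte[4096], 0, 4096); // update "our" record
        }
        finally
        {
            fs.Unlock(0, 4096); // release the region for other users
        }
    }
}
```

A sync layer that only ever sees whole files has nothing with which to honor these region locks, which is exactly why concurrent Access use over such a layer breaks.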
I started working on a side project at night, and it requires me to store large numbers of images, similar to Instagram. What do you recommend using as a database for this?
Should I upload the files to a path and reference them from the database? Or should I use another type of database and upload the pictures there?
Thanks
"Should I upload the file to a path and referenced it from the database" - yes. Storing objects in mysql like that as a blob is bad practice. Put them in a directory somewhere and reference the path from your database of choice.. mysql is fine.
It won't be a good idea to store image directly in DB. DB size will increase. DB Hit will increase. So better store image as file in server and store the reference path in DB.
For Detailed Info :- Storing Images in DB - Yea or Nay?
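As a hedged sketch of that approach in C# (assuming the MySQL Connector/NET package and made-up paths, table, and column names), the upload handler writes the image to disk and stores only its path:

```csharp
using System;
using System.IO;
using MySql.Data.MySqlClient; // assumes the MySQL Connector/NET package

class StoreImageByPath
{
    static void Main()
    {
        var mediaRoot = "/var/www/media";        // made-up content root
        var fileName = Guid.NewGuid() + ".jpg";  // never trust user-supplied names
        var dbPath = "images/" + fileName;       // relative path kept in the DB

        // Write the uploaded bytes to disk, not into the database.
        Directory.CreateDirectory(Path.Combine(mediaRoot, "images"));
        var bytes = File.ReadAllBytes("upload.jpg");
        File.WriteAllBytes(Path.Combine(mediaRoot, "images", fileName), bytes);

        // Store only the reference path; table/column names are made up here.
        using var conn = new MySqlConnection(
            "Server=localhost;Database=app;Uid=app;Pwd=secret;");
        conn.Open();
        using var cmd = new MySqlCommand(
            "INSERT INTO photos (path) VALUES (@path);", conn);
        cmd.Parameters.AddWithValue("@path", dbPath);
        cmd.ExecuteNonQuery();
    }
}
```

Generating the file name server-side also sidesteps collisions and path-injection from user uploads.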
I have received a few files (.DBF) from a client. Each file is a different size (ranging from 40 KB to 2.2 GB).
I am using the MS Visual FoxPro driver/provider connection manager.
When I connect (to retrieve tables) to the folder where all these files are stored, the Table/View dropdown shows all tables except the one larger than 2 GB.
I am able to reproduce this scenario on other systems as well.
For example, if the DBF file is 1.5 GB, the table shows up in the dropdown.
Any help? Thanks in advance
The maximum size for a Visual FoxPro table (or almost any other VFP file) is 2 GB, so I would imagine that is your problem. A 64-bit driver will make no difference. I'm not sure how the client is creating files > 2 GB either, unless they're using something other than Fox.
Take a look at Sybase Advantage Local Server. Sybase adapted their database to handle a format similar to VFP's, so it can read these files directly and exceed the 2 GB file limit of 32-bit based applications. I've been using it for some time, having converted another system from VFP OLE DB. Personally, I'm using C# with Sybase's data provider to connect and read data. From that, you could probably get to what you are looking for.
Link for Sybase Local Server
It's royalty-free for the local server, with two concurrent users allowed for development/testing.
I want to store 7,000 records consisting of .txt files as BLOBs in my MySQL database using Workbench.
But:
1. I don't know how to do it automatically. Should I put all the files in one directory and then write a script that takes them one by one and inserts them into the appropriate rows?
2. I am not sure whether BLOB is the right type for this kind of file storage. Later I want to connect my DB to a GUI, so that clicking a record opens its .txt file in a new window.
Could you advise me how to solve this?
You should write a script, yes. If it's hard for you to put them all in one folder, there are scripts and tools to do that too.
You can use C#, PHP, or any other language to scan those files and then insert them into the database; a rough sketch follows after the links below.
Bunch of tutorials:
http://www.csharp-examples.net/get-files-from-directory/
Inserting record in to MySQL Database using C#
http://www.codeproject.com/Articles/43438/Connect-C-to-MySQL
http://net.tutsplus.com/articles/news/scanning-folders-with-php/
http://www.homeandlearn.co.uk/php/php13p3.html
A BLOB should do for small files, but note that a plain BLOB column only holds up to 64 KB of data; for larger text files use MEDIUMBLOB (up to 16 MB) or LONGBLOB (up to 4 GB).
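Here is a minimal C# sketch of such an import script, assuming the MySQL Connector/NET package and made-up folder, table, and column names:

```csharp
using System.IO;
using MySql.Data.MySqlClient; // assumes the MySQL Connector/NET package

class ImportTxtFiles
{
    static void Main()
    {
        using var conn = new MySqlConnection(
            "Server=localhost;Database=docs;Uid=app;Pwd=secret;");
        conn.Open();

        // Scan one folder and insert each .txt file as a BLOB row.
        // Table/column names are invented for the example.
        foreach (var path in Directory.GetFiles(@"C:\import", "*.txt"))
        {
            using var cmd = new MySqlCommand(
                "INSERT INTO documents (name, body) VALUES (@name, @body);", conn);
            cmd.Parameters.AddWithValue("@name", Path.GetFileName(path));
            cmd.Parameters.AddWithValue("@body", File.ReadAllBytes(path));
            cmd.ExecuteNonQuery();
        }
    }
}
```

For 7,000 files you may also want to wrap the loop in a single transaction so a mid-run failure doesn't leave a half-imported set.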
I'm working on a membership site where users are able to upload a CSV file containing sales data. The file will then be read and parsed, and the data will be charted, which will allow me to create charts dynamically.
My question is how to handle this CSV upload. Should it be uploaded to a folder and stored for later, or should it be inserted directly into a MySQL table?
Depends on how much processing needs to be done, I'd say. If it's "short" data and processing is quick, then your upload-handling script should be able to take care of it.
If it's a large file and you'd rather not tie up the user's browser/session while the data's parsed, then do the upload-now-and-deal-with-it-later option.
It depends on how you think the users will use this site.
What do you estimate the size of the files for these users to be?
How often (if ever) would they upload a file twice? Can they download the charts?
If the files are small and more for one-off use, you could upload and process them on the fly; if they require repeated access and analysis, then you will save the users time by importing the data into the database.
The LOAD DATA INFILE command in MySQL handles uploads like that really nicely. If you create the table you want to load into and then use that command, it works great and is super quick; I've loaded several thousand rows of data in under 5 seconds with it.
http://dev.mysql.com/doc/refman/5.5/en/load-data.html
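For completeness, here is a hedged C# sketch using MySqlBulkLoader, Connector/NET's wrapper around LOAD DATA [LOCAL] INFILE; the table and file names are made up, and recent Connector/NET versions require the AllowLoadLocalInfile connection-string option shown here:

```csharp
using System;
using MySql.Data.MySqlClient; // assumes the MySQL Connector/NET package

class CsvImport
{
    static void Main()
    {
        // Recent Connector/NET versions require opting in to LOCAL INFILE.
        using var conn = new MySqlConnection(
            "Server=localhost;Database=sales;Uid=app;Pwd=secret;AllowLoadLocalInfile=true;");
        conn.Open();

        // MySqlBulkLoader issues LOAD DATA [LOCAL] INFILE under the hood.
        var loader = new MySqlBulkLoader(conn)
        {
            TableName = "sales_data",          // made-up table
            FileName = @"C:\uploads\sales.csv", // made-up upload path
            FieldTerminator = ",",
            LineTerminator = "\n",
            NumberOfLinesToSkip = 1,           // skip the CSV header row
            Local = true                       // file is on the client, not the server
        };

        int rows = loader.Load();
        Console.WriteLine($"Imported {rows} rows.");
    }
}
```

This fits the upload-now-and-deal-with-it-later option above: save the uploaded CSV to a folder first, then bulk-load it into the table in one pass.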