MongoDB or store BSON as a file?

I have a special case where every JSON document is stored and loaded as a whole, so complex queries are not required. That could be a reason to store each document in object storage as BSON instead of in MongoDB. However, are there any drawbacks or considerations to that approach? Any performance or maintainability issues?
Thank you!

I am not sure I understood your question correctly, but MongoDB already stores documents as BSON (binary JSON), and if you just want to store JSON in MongoDB and retrieve it as a whole, there won't be any performance issue.
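For illustration, a minimal sketch with PyMongo showing a document stored and read back as a whole (the connection string, database, and field names are placeholders, not anything from the question):

from pymongo import MongoClient

# Placeholder connection string and names; adjust for your deployment.
client = MongoClient("mongodb://localhost:27017")
collection = client["mydb"]["documents"]

doc = {"_id": "order-42", "customer": "Ada", "items": [{"sku": "A1", "qty": 2}]}

# PyMongo converts the Python dict to BSON on insert...
collection.replace_one({"_id": doc["_id"]}, doc, upsert=True)

# ...and back to a dict on read, so the document round-trips as a whole.
loaded = collection.find_one({"_id": "order-42"})
print(loaded)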

Related

What's stopping me from using a standalone JSON file instead of a local db?

I need to store data for a native mobile app I'm writing and I was wondering: why do I need to bother with DB setup when I can just read/write a JSON file? All the interactions are basic and could most likely be handled as JSON objects rather than queried.
What are the advantages?
Databases are intended to work with standardized data or large data sets. If you know there are only a few properties to read and they aren't changing, a JSON file may be easier (see the sketch below), but if you have a list of items, a database can optimize queries with indexes or ensure consistency across multiple tables.
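A rough sketch of the "just read/write a JSON file" approach; the file name and structure here are made up for illustration:

import json
from pathlib import Path

STORE = Path("settings.json")  # hypothetical file name

def load_settings() -> dict:
    # A missing file simply means "nothing saved yet".
    if not STORE.exists():
        return {}
    return json.loads(STORE.read_text())

def save_settings(settings: dict) -> None:
    # The whole structure is rewritten on every save; fine for a few
    # properties, but there is no indexing or partial update as in a DB.
    STORE.write_text(json.dumps(settings, indent=2))

settings = load_settings()
settings["theme"] = "dark"
save_settings(settings)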

Should I use JSONField or FileField to store JSON data?

I am wondering how I should store my JSON data for the best performance and scalability.
I have two options:
The first would be to use a JSONField, which probably gives me an advantage in simplicity, performance, and data handling, since I don't have to read the data out of a file each time.
My second option would be to store the JSON data in FileFields as JSON files. This seems like the better option because the large quantity of JSON wouldn't be stored in the database (only the location of the file). In my opinion it's the best option for scalability, but maybe not for user-facing performance, since the file has to be read each time before the data is displayed in the template.
I would like to know if I am thinking about this reasonably: which of the two is the best way to store JSON data so it can be reused as quickly as possible, without complicating the database or hurting scalability?
JSONField will generally perform well because it can be indexed. A very good feature is native data access: you don't have to load and parse the JSON before querying, you can query directly against the model field. Since you have a large amount of JSON data, a file may seem the better option, but a file's only advantage is storage.
Quoting from an article found via a Google search:
A Postgres JSON field takes almost 11% more space than the JSON file on your file system; in one test, a 233 MB (formatted) JSON file took 268 MB in a JSON field.
Storing in a file has some cons, including reading the file, parsing the JSON, and then querying, which is time-consuming since these are disk-based operations. Scalability will not be an issue with a JSON field, although your database size will be larger, so moving the data might become harder for you.
So unless you have a shortage of database space, you should choose JSONField.
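A minimal Django sketch of the two options the question compares; the model and field names are invented, and JSONField in this form requires Django 3.1+:

from django.db import models

class ReportWithJSONField(models.Model):
    # Option 1: the JSON lives in the database and can be queried directly,
    # e.g. ReportWithJSONField.objects.filter(payload__status="done").
    payload = models.JSONField(default=dict)

class ReportWithFile(models.Model):
    # Option 2: only the file path is stored in the database; reading the
    # JSON means opening and parsing the file on every access.
    payload_file = models.FileField(upload_to="reports/")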

Storing a JSON file in Redis and retrieving it

I am storing the info contained in a JSON file in Redis, using the Node.js Redis driver. Do you think I am losing something by using a hash to store the info?
The info is simply a large array (several thousand elements, each with several fields, sometimes up to 50) in the data, plus a small set of properties in the meta.
I understand that you're storing those JSON strings as follows:
hset some-key some-sub-key <the json>
Actually there's another valid approach which involves using the global key space directly:
set some-key:sub-key <the json>
If you're just storing those JSON strings I would say that creating global space keys is the simplest and most effective approach in your case.
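The question uses the Node.js driver, but the two layouts look the same from any client; here is a sketch with Python's redis-py purely for illustration (key names follow the examples above, the payload is made up):

import json
import redis

r = redis.Redis()  # assumes a local Redis instance
payload = json.dumps({"fields": {"a": 1}, "meta": {"version": 2}})

# Hash layout: one hash key, each element under its own field.
r.hset("some-key", "some-sub-key", payload)
element = json.loads(r.hget("some-key", "some-sub-key"))

# Global key space layout: one string key per element.
r.set("some-key:sub-key", payload)
element = json.loads(r.get("some-key:sub-key"))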
What do you mean by losing something? Storing values (JSON) and retrieving them in Redis can be really fast. Plus, Redis comes with some very handy commands like TTL, FLUSHALL, etc.
Personally, I'm using Redis for my profile page. I store my image uploads in Redis and have never had an issue.
My profile page: http://fanjin.computer
Github repo: https://github.com/bfwg/relay-gallery
Although this question has been answered, for future reference some people might be asking the same question but looking for a different answer (like me).
If that's the case, I would suggest looking into RedisJSON, which adds a JSON data type to Redis.
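For completeness, a small sketch of what RedisJSON looks like from Python's redis-py; this assumes the RedisJSON module is loaded on the server and redis-py 4+ on the client, and the key and document are invented:

import redis

r = redis.Redis()  # server must have the RedisJSON module loaded

doc = {"data": [{"id": 1, "name": "first"}], "meta": {"count": 1}}

# Store the document as a native JSON value at the root path "$".
r.json().set("mydoc", "$", doc)

# Fetch only part of the document server-side instead of the whole blob.
meta = r.json().get("mydoc", "$.meta")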

Using mongodb to store a single but complex JSON object

I want to store a single, big, complex JSON object in MongoDB, and I want to be able to retrieve and modify specific parts of it. A simple solution would be to store it in a single document, but I'm not sure how that would play with multiple write requests. Another option would be to keep every node of the JSON in a different document, kind of like a pattern explained here in the MongoDB documentation. That way I can retrieve only parts of the whole object and work on them separately.
My question is: do I get anything out of the latter approach? I'm fairly new to MongoDB, but from what I've read it takes a database lock on write requests, so it would seem that taking my JSON apart like this achieves nothing when it comes to scaling.
If you are considering storing data larger than 16 MB, you should definitely split it across multiple documents in some way, as MongoDB has a 16 MB size limit on its documents.
From MongoDB Limits and Thresholds
The maximum BSON document size is 16 megabytes.
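If the object does fit within the limit, reading and modifying specific parts does not require splitting it up: projections and update operators work on nested paths, and updates to a single document are atomic. A sketch with PyMongo, with invented field names:

from pymongo import MongoClient

coll = MongoClient("mongodb://localhost:27017")["mydb"]["big_objects"]

# Read only one branch of the object instead of the whole document.
part = coll.find_one({"_id": "the-object"}, {"settings.display": 1})

# Modify a nested field in place; the update on this one document is atomic.
coll.update_one(
    {"_id": "the-object"},
    {"$set": {"settings.display.theme": "dark"}},
)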

Storing serialized ruby object in database

I would like to store very large sets of serialized Ruby objects in a database (MySQL).
1) What are the cons and pros?
2) Is there any alternative way?
3) What are technical difficulties if the objects are really big?
4) Will I face memory issues while serializing and de-serializing if the objects are really big?
Pros
Allows you to store arbitrarily complex objects
Simplifies your db schema (no need to represent those complex objects)
Cons
Complicates your models and data layer
Potentially need to handle multiple versions of serialized objects (changes to object definition over time)
Inability to directly query serialized columns
Alternatives
As the previous answer stated, an object database or document oriented database may meet your requirements.
Difficulties
If your objects are quite large, you may run into difficulties when moving data between your DBMS and your program. You could minimize this by separating the storage of the object data from the metadata related to the object (see the sketch after this answer).
Memory Issues
Running out of memory is definitely a possibility with large enough objects. It also depends on the type of serialization you use. To know how much memory you'd be using, you'd need to profile your app. I'd suggest ruby-prof, bleak_house or memprof.
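The question is about Ruby, but the "separate the object data from its metadata" idea in the Difficulties section is language-agnostic. A loose sketch in Python with SQLite, where only the metadata columns are queryable and the serialized payload is treated as an opaque blob (table and column names are invented):

import json
import sqlite3

conn = sqlite3.connect("objects.db")  # hypothetical database file
conn.execute(
    """CREATE TABLE IF NOT EXISTS objects (
           id TEXT PRIMARY KEY,
           kind TEXT,        -- queryable metadata
           created_at TEXT,  -- queryable metadata
           payload TEXT      -- opaque serialized object
       )"""
)

obj = {"kind": "report", "values": list(range(1000))}

# Non-binary serialization (JSON here) keeps the payload readable and
# avoids tying the stored bytes to one language's object format.
conn.execute(
    "INSERT OR REPLACE INTO objects VALUES (?, ?, ?, ?)",
    ("report-1", obj["kind"], "2024-01-01", json.dumps(obj)),
)
conn.commit()

# Metadata queries never touch (or deserialize) the large payload.
rows = conn.execute("SELECT id FROM objects WHERE kind = ?", ("report",)).fetchall()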
I'd suggest using a non-binary serialization wherever possible. You don't have to use only one type of serialization for your entire database, but that could get complex and messy.
If this is how you want to proceed, using an object-oriented DBMS like ObjectStore or a document-oriented DBMS like CouchDB would probably be your best option. They're better designed and targeted for object serialization.
As an alternative you could use any of the multitude of NoSQL databases. If you can serialize your object to JSON then it should be easily stored in CouchDB.
Bear in mind that serialized objects take far more disk space than data you save and load in your own format. I/O from the hard drive is very slow, and if you're looking at complex objects that take a lot of processing power, it may actually be faster to load the raw file(s) and process them on each startup, or to save the data in a form that is easy to load.