Uploading JSON data into Redis

I am working on a school project where we need to create a website and use Redis to search a database; in my case it will be a movie database. I have a JSON file with the names and ratings of 100 movies. I would like to upload this dataset into Redis instead of entering the entire dataset manually. The JSON file is saved on my desktop and I am using Ubuntu 20.04.
Is there a way to do it?
I have never used Redis, so my question might be very silly. I've been looking all over the internet and cannot find exactly what needs to be done. I might be googling the wrong question; maybe that's why I cannot find the answer.
Any help would be appreciated.

Write an appropriate program to do the job. There's no one-size-fits-all process, because how your data is structured in Redis is up to you; once you decide on that, it should be easy to write a program that parses the JSON and inserts the data.
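For example, here is a minimal sketch using the redis-py client, assuming the JSON file is a list of objects like {"name": ..., "rating": ...}. The file path, key names, and field names below are placeholders, not anything Redis prescribes:

```python
# pip install redis
import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Assumes movies.json looks like: [{"name": "Inception", "rating": 8.8}, ...]
with open("/home/you/Desktop/movies.json") as f:   # placeholder path
    movies = json.load(f)

for i, movie in enumerate(movies):
    key = f"movie:{i}"
    # Store each movie as a Redis hash
    r.hset(key, mapping={"name": movie["name"], "rating": movie["rating"]})
    # Optional: index by rating in a sorted set for range queries
    r.zadd("movies:by_rating", {key: float(movie["rating"])})

print(f"Loaded {len(movies)} movies")
```

Storing each movie as a hash keeps the fields individually readable, and the optional sorted set lets you do rating-range queries with ZRANGEBYSCORE; if you decide on a different layout (for example the RedisJSON module), the loading loop changes accordingly.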

Related

Is there any best way to transfer bulk data from MySQL to MongoDB?

I am using MongoDB for the first time here. Is there a good way to transfer bulk data from MySQL into MongoDB? I have tried searching in different ways but did not find an answer.
You would have to manually map all your MySQL tables to documents in MongoDB, but you can use an already developed tool (a hand-rolled sketch also follows below).
You can try Mongify (http://mongify.com/).
It's a super simple way to transform your data from MySQL to MongoDB, and it has a ton of support for changing your existing schema into one that works better with MongoDB.
Mongify will read your MySQL database and build a translation file for you; all you have to do is map how you want your data transformed.
It supports:
Updating internal IDs (to BSON ObjectID)
Updating referencing IDs
Typecasting values
Embedding Tables into other documents
Before filters (to change data manually before import)
and much much more...
There is also a short 5 min video on the homepage that shows you how easy it is.
Try it out and tell me what you think.
Please up vote if you find this answer helpful.
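If you prefer to skip the tool, a minimal hand-rolled migration can be sketched in a few lines. The example below is in Python purely for illustration; the host, credentials, and the users table are placeholder assumptions, and it copies rows verbatim without rewriting IDs or embedding related tables the way Mongify can:

```python
# pip install mysql-connector-python pymongo
import mysql.connector
from pymongo import MongoClient

# Placeholder connection details and table name
mysql_conn = mysql.connector.connect(
    host="localhost", user="root", password="secret", database="mydb"
)
cursor = mysql_conn.cursor(dictionary=True)  # rows come back as dicts
cursor.execute("SELECT * FROM users")

mongo = MongoClient("mongodb://localhost:27017")
collection = mongo["mydb"]["users"]

# Copy in batches so large tables don't have to fit in memory
batch = cursor.fetchmany(1000)
while batch:
    collection.insert_many(batch)  # each row dict becomes one document
    batch = cursor.fetchmany(1000)

mysql_conn.close()
mongo.close()
```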

Get tweet JSON into SQL table with nodejs

I'm getting some tweets from the twitter API with nodejs and saving some of the data with mysql. Gradually as I require more and more data from each tweet, it's become obvious I should just save the whole tweet each time. I'm looking for the cleanest way to save a whole tweet straight from a JSON object to a new row in my db.
I was surprised there aren't more node modules or anything pre-written for creating the database table ready for tweets and mapping a tweet's JSON schema directly to it - can anyone help? At the moment I'm using the 'mysql' module for queries and thought about combining it with 'json-sql' for building the query, but it seems like this must be such a common thing that there should be an even simpler way.
Is anyone aware of another process? Thanks!
I eventually moved over to using MongoDB; it works really well with Node.js and I'm really enjoying it.
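If you do want to stay on MySQL, the "save the whole tweet" idea from the question can be done with a single JSON (or TEXT) column instead of mapping every field to its own column. A rough sketch, written in Python only for illustration (the node 'mysql' module version is the same parameterized statement); the table and connection details are assumptions:

```python
import json
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(host="localhost", user="root",
                               password="secret", database="tweets_db")
cur = conn.cursor()

# One column holds the whole tweet payload (JSON type needs MySQL 5.7+)
cur.execute("""
    CREATE TABLE IF NOT EXISTS tweets (
        id BIGINT PRIMARY KEY,
        payload JSON
    )
""")

def save_tweet(tweet: dict) -> None:
    cur.execute(
        "REPLACE INTO tweets (id, payload) VALUES (%s, %s)",
        (tweet["id"], json.dumps(tweet)),
    )
    conn.commit()
```

Individual fields can still be pulled out later with MySQL's JSON_EXTRACT, so you don't lose queryability by storing the whole object.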

Dynamic JSON file vs API

I am designing a system with 30,000 objects or so and can't decide between two approaches: either have a JSON file precomputed for each one and get the data by pointing to the URL of the file (I think Twitter does something similar), or have a PHP/Perl/whatever else script that produces the JSON object on the fly when requested, say from a database, and sends it back. Is one approach more suitable than the other? I guess if it takes a long time to generate the JSON data it is better to have the JSON files already made. What if generating is as quick as accessing a database? Although I suppose one would have a dedicated table in the database specifically for that. The data doesn't change very often, so updating is not a constant thing; in that respect the data is static for all intents and purposes.
Anyway, any thoughts would be much appreciated!
Alex
You might want to try MongoDB, which returns objects as JSON and is highly scalable and easy to set up.
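As a rough illustration of the precomputed-file option weighed in the question, a batch job like the one below can regenerate the static JSON files whenever the data changes. It is sketched in Python for illustration (a PHP/Perl version is the same loop), and the database, table, and output directory are placeholder assumptions:

```python
import json
import os
import sqlite3  # stand-in for whatever database actually holds the objects

OUT_DIR = "public/objects"           # assumption: served as /objects/<id>.json
os.makedirs(OUT_DIR, exist_ok=True)

conn = sqlite3.connect("data.db")    # placeholder database file
conn.row_factory = sqlite3.Row       # rows behave like dicts

for row in conn.execute("SELECT id, name, details FROM objects"):
    path = os.path.join(OUT_DIR, f"{row['id']}.json")
    with open(path, "w") as f:
        json.dump(dict(row), f)      # one static JSON file per object

conn.close()
```

Since the data rarely changes, rerunning this from cron (or after each update) keeps the files fresh while requests stay as cheap as serving static files.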

Where to store info besides mysql

My PHP script pulls about 1000 names from the MySQL db on a certain page. These names are used for a JavaScript autocomplete script.
I think there's a better method to do this. I would like to update the names with a cron job once a day (PHP) and store the names locally in a text file. Where else could I store them? It's not sensitive info.
It should be readable and writable by PHP.
Since you only need the data updated once a day, have a cron script generate a static JSON file in some fixed location. Then read this with AJAX on the client and make sure the client caches it.
Or, potentially, generate the file whenever the database is updated (if this is applicable; I don't know your application).
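A minimal version of that cron-generated export might look like the sketch below. It is written in Python just for illustration (the poster's PHP version would do the same thing), and the table name and output path are assumptions:

```python
# Run once a day from cron, e.g.:  0 3 * * * /usr/bin/python3 export_names.py
import json
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(host="localhost", user="root",
                               password="secret", database="app")
cur = conn.cursor()
cur.execute("SELECT name FROM people ORDER BY name")  # placeholder table
names = [row[0] for row in cur.fetchall()]
conn.close()

# Write to a web-served path so the autocomplete can fetch /names.json once
with open("/var/www/html/names.json", "w") as f:
    json.dump(names, f)
```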
You could try Memcache, but that could be like using a sledgehammer to crack a nut.
Edit: What about storing the data as a simple file and letting users (JavaScript) download it? Clients would not query the server for every keystroke because they could search for matching values themselves. The format could be JSON because it is simple and native to JavaScript.
It's unlikely reading from a text file will be much faster than a database query - MySQL already does a lot of caching that should make your query speedy.
If you need to make this query often and performance is a problem, you could consider using a caching module for PHP.
Related
The best way of PHP Caching

Plan for building xml file containing custom designed blog posts/comments for import into WordPress via Clojure

I'm trying to migrate from a custom designed blog software system to a WordPress.com site. I can access my MySQL database of posts and comments without too much difficulty, thanks in part to this post: How do I connect to a MySQL database from Clojure?.
I think my next step is to generate the pseudo-xml file known as a "WordPress eXtended RSS file", and that's where I'm having difficulty. I can navigate my way through the recordsets returned from my database connection (as of now, I'm doing two SQL searches, one for posts, and another for comments, giving me two recordsets), but I don't know what sort of data structure I should put those results into in order to eventually export them into the xml file.
I've been using a lot of file spit/slurps while working my way through this, but I have a feeling that's neither the most efficient way to do it, nor the most "lispy" way.
I think what I need is help defining a Clojure-based data structure that I can append both new posts and new comments to, iteratively, for eventual export into the XML file format I ultimately need.
Look at clojure.xml/emit and clojure.xml/emit-element; they expect data shaped as described in (doc clojure.xml/parse), i.e. nested maps of the form {:tag ..., :attrs {...}, :content [...]}, and they walk that data to output XML.