My website mostly uses Django + MySQL, and sometimes Redis for frequently accessed data.
My problem is how to sync data from MySQL to Redis automatically when I write data to MySQL through the Django admin page.
Thank you for giving me some advice. It would also be appreciated if someone could tell me how to write data into Redis directly from the Django admin page.
Thank you!
What you want to achieve looks like using Redis as a cache. The pattern is:
Always look for the data in Redis first
If it's not in Redis, get it from MySQL
and store it in Redis with an expiration time, using the EXPIRE command
Doing it this way, you have to modify your app, not the admin page. But there could be a delay between the writing of the information in the admin page and its availability to the clients.
You may delete the data from the Redis cache in the admin when storing it, to ensure the newest version is always delivered. But you will have to modify the Django admin page for this.
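One way to do that deletion without touching the admin templates is a `post_save` signal handler, which fires for admin writes too. A minimal sketch, with a plain dict standing in for Redis so it runs anywhere; the `Article` model and `article:<id>` key scheme are assumptions:

```python
cache = {}  # dict stand-in for Redis so the sketch runs; real code: r = redis.Redis()

def invalidate_article(sender=None, instance=None, **kwargs):
    """post_save-style handler: drop the cached copy after every write,
    including writes made through the Django admin page."""
    cache.pop(f"article:{instance['id']}", None)  # real Redis: r.delete(key)

# Wiring inside a Django app (commented so the sketch stays self-contained):
# from django.db.models.signals import post_save
# post_save.connect(invalidate_article, sender=Article)

cache["article:1"] = '{"id": 1, "title": "old"}'
invalidate_article(instance={"id": 1})  # the stale entry is gone
```

The next read then misses the cache and repopulates it from MySQL, so clients never see the stale copy.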
I'm working on a project that takes data from a Weintek HMI, puts it on a web server, and then sends it to an application I created in Android Studio.
I've found that Firebase can help me with this task.
In EasyBuilder, which works with my HMI, I can create a MySQL database that can store the data.
The problem is how to update the Firebase database from the MySQL database automatically at a set interval, so I can access the data in the Android app.
If there is no solution with MySQL, can someone suggest another method to extract the data and use a web server to sync it with the Android app?
I don't know your specific needs, in terms of data volume or application, but as a workaround, maybe this can help you:
I usually use MQTT, which many Weintek HMIs support, to send telemetry data, and then use Node-RED to process and redirect the data to a database, email, SMS, Telegram, CSV, TXT... depending on the need, which in your case could be Firebase (I have never used it).
It works great for me, as I don't have to worry about HMI limitations.
The drawbacks are the reliability of the data, in terms of confirming that when the HMI sends, the server listens and writes (though there are certainly ways to deal with this), and the fact that you need a server with Node-RED running.
If you have never done so: in Weintek HMIs you can easily send the MQTT payload cyclically using macros.
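If you stay with the original MySQL-to-Firebase idea instead, the core of an interval sync is just "pick up rows changed since the last pass and push them". A language-agnostic sketch in Python, assuming the logged table carries an `updated_at` Unix-timestamp column (an assumption; EasyBuilder's schema may differ), with `fetch_rows` and `push_to_firebase` as placeholders for your MySQL query and Firebase REST call:

```python
def rows_to_sync(rows, last_sync):
    """Return rows whose updated_at is newer than the last successful sync."""
    return [row for row in rows if row["updated_at"] > last_sync]

def sync_once(fetch_rows, push_to_firebase, last_sync):
    """One sync pass: fetch, filter, push; returns the new high-water mark."""
    fresh = rows_to_sync(fetch_rows(), last_sync)
    for row in fresh:
        push_to_firebase(row)  # placeholder: e.g. a PUT to the Firebase REST API
    return max((row["updated_at"] for row in fresh), default=last_sync)

# The loop a timer or cron job would run (interval is arbitrary):
#     import time
#     while True:
#         last_sync = sync_once(fetch_rows, push_to_firebase, last_sync)
#         time.sleep(60)
```

Persisting the returned high-water mark between runs is what keeps the sync incremental instead of re-pushing the whole table.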
Currently I am making a WordPress plugin that will need to store an access token. I understand that it will have to be stored in a database. I've been told I can store it in a Redis database but I am concerned this will be a problem for users that don't use Redis. The alternative solution I've been thinking of is storing it in the WordPress database. Is this a better solution? Any suggestions would be greatly appreciated.
You definitely should write your plugin to store this sort of information in the WordPress MySQL instance. You are correct that most WordPress installations don't have access to Redis; with very few exceptions, WordPress can't use Redis.
You don't say much about the tokens you need to store.
If it's one per registered user, you can use the wp_usermeta table. If it's one only for the whole WordPress installation upon which your plugin is installed, you can use the wp_options table. If some of your WordPress posts/pages have their own token, you can use wp_postmeta.
If you need a whole bunch of these tokens, chosen by something besides users or posts, you may need to have your plugin create a new table in the WordPress database.
I'm building a simple commenting system using Node.js, and I need to set it up in a PHP project running on an Apache server. So, I need to trigger Node.js when changes are made to a MySQL database table on that server. Is it possible to do this on an Apache server? If so, how? Any ideas or suggestions are greatly welcome. Please help...
I guess there are a few options you could take, but I don't think you can get any sort of triggered action from within MySQL or Apache. IMHO, these are the approaches you can take:
you can expose an HTTP API from Node, and every time you need to notify the Node app, simply insert the data into MySQL using PHP and then issue a simple GET request to trigger Node.
you could use some sort of queuing system (RabbitMQ, Redis, etc.) to manage the messages to and from the two applications, hence orchestrating the flow of data between the two apps (and later the db).
you could poll the database from Node and check for new rows. This is fairly inefficient and quite tricky, but it sounds closest to what you want.
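To make the polling option concrete, the usual trick is to remember the highest auto-increment id you've seen and ask only for newer rows. A sketch of that query shape, using Python and an in-memory SQLite database as stand-ins (the real setup would be Node with a MySQL client, but the `id > last_seen` pattern is identical):

```python
import sqlite3

# SQLite stands in for MySQL so the sketch runs anywhere.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (id INTEGER PRIMARY KEY, body TEXT)")

def poll_new_comments(conn, last_seen_id):
    """Return rows inserted since the last poll, plus the new high-water mark,
    keyed on the auto-increment primary key."""
    rows = conn.execute(
        "SELECT id, body FROM comments WHERE id > ? ORDER BY id", (last_seen_id,)
    ).fetchall()
    new_last = rows[-1][0] if rows else last_seen_id
    return rows, new_last

conn.execute("INSERT INTO comments (body) VALUES ('first')")
conn.execute("INSERT INTO comments (body) VALUES ('second')")
rows, last = poll_new_comments(conn, 0)  # picks up both rows
```

Each poll cycle then calls `poll_new_comments` with the saved mark; an empty result just means nothing new arrived.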
I'm working on a bunch of cakephp apps that are all services that you have access to when you login to the main website. Each app will be hosted on a separate server, and there will be a separate login server as well.
Doing the research, I found that CakePHP supports writing sessions to a database, http://blog.jambura.com/2011/08/24/should-i-use-database-for-storing-sessions-in-cakephp/
And also it supports having multiple database connections,
http://bakery.cakephp.org/articles/mithesh/2008/09/02/talking-to-multiple-databases-from-single-cakephp-application
So I was considering the possibility of storing the sessions on the login server, which the other websites all access, and then they all use their own databases for the rest of their data. It seems like it would be simple to implement, but I have concern that there would be too much reading and writing on the login server's database. Is there a way to optimize for this? Or should I do another approach entirely?
Try checking the useDbConfig property of the model. I think it will be simpler if your sessions and users tables live in the same database, so that you have a common database for users and sessions. That way, you will not need to implement SSO.
We are setting up a virtual private server (not hosted by us), on which we will be user testing our Django-based web application. The user-generated content produced in these tests will be very sensitive. We would like to keep this content encrypted, for example in case back-up media goes missing. The content will be stored in a MySQL or SQLite database.
As I understand it, we cannot encrypt the file system of the VPS. If we encrypt the database using something like SQLCipher (http://sqlcipher.net/), is there a way of passing the key to Django without storing it on the server? We will be booting up the server for each test, so that part is not a concern.
Thank you!
It sounds like you would want an admin user to manually enter the key into a form as part of the login process, and have Django use that.
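The shape of that idea: the key lives only in process memory, set once via the login form, and database access fails until it has been provided. A minimal sketch, with the `KeyHolder` class and its methods purely illustrative; in a real Django project the `connect` step would open the SQLCipher database and issue `PRAGMA key = '...'` before any queries:

```python
class KeyHolder:
    """Keeps the database key in RAM only; nothing is ever written to disk."""
    def __init__(self):
        self._key = None

    def set_key(self, key):
        # Called from the login-form view after the admin submits the key.
        self._key = key

    def connect(self):
        """Refuse to open the database until the key has been supplied."""
        if self._key is None:
            raise RuntimeError("database key not yet provided via the login form")
        # real code: open the SQLCipher DB here and run PRAGMA key = '<key>'
        return f"connection opened with key of length {len(self._key)}"

holder = KeyHolder()
```

Note the key still transits the login request and sits in server RAM, so this protects backups and disks, not a live compromise of the running server.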