Sync IndexedDB with MySQL database

I am about to develop an application where employees go out to service and repair machines at customer premises. They need to fill in a service card using a tablet or any other mobile device.
In case there is no Internet connection, I am thinking about using HTML5 offline storage, mainly IndexedDB, to store the service card (web form) data locally and sync it at the office, where an Internet connection exists. The sync target is a MySQL database.
So the question: is it possible to sync IndexedDB with MySQL? I have never worked with IndexedDB; I am only doing research and it looks like a potential fit.
Web SQL is deprecated; otherwise it would have been the more natural solution.
Are there any other alternatives, in case the above is difficult or outside the standard?
Your opinions are highly appreciated.
Thanks.

This is definitely doable. I have only just started learning IndexedDB in the last couple of days, but this is how I would see it working:
The website knows it is in offline mode somehow (for example via navigator.onLine or a failed request).
Clicking submit on the form saves the data into IndexedDB.
Later, when the laptop (or whatever device) is back online or on the intranet and can talk to the main server again, it sends all the IndexedDB rows to the server via an AJAX call, to be stored in MySQL.
IndexedDB is cleared.
Repeat.
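Put into a rough sketch (not from the original answer; the 'cards' object store and the /api/service-cards endpoint are placeholders), the flow could look like this:

// Open (or create) a local IndexedDB database with one object store for queued cards
const openDb = () => new Promise((resolve, reject) => {
  const req = indexedDB.open('serviceApp', 1);
  req.onupgradeneeded = () => req.result.createObjectStore('cards', { autoIncrement: true });
  req.onsuccess = () => resolve(req.result);
  req.onerror = () => reject(req.error);
});

// 1. On submit while offline, save the form data locally
async function saveCardLocally(card) {
  const db = await openDb();
  db.transaction('cards', 'readwrite').objectStore('cards').add(card);
}

// 2. When back online, push every stored card to the server, then clear the store
async function syncCards() {
  const db = await openDb();
  const store = db.transaction('cards', 'readonly').objectStore('cards');
  const cards = await new Promise((resolve) => {
    const req = store.getAll();
    req.onsuccess = () => resolve(req.result);
  });
  if (!cards.length) return;
  await fetch('/api/service-cards', {        // the server inserts these rows into MySQL
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(cards),
  });
  db.transaction('cards', 'readwrite').objectStore('cards').clear();
}

// 3. Detect connectivity coming back and sync automatically
window.addEventListener('online', syncCards);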

A little bit late, but I hope it helps.
This is possible, though I'm not sure it is the best choice. I can tell you that I'm building a web app where I have a MySQL database and the app must work offline and keep track of the data. I tried using IndexedDB directly and it was very confusing for me, so I implemented Dexie.js, a minimalistic and straightforward API that communicates with IndexedDB in an easy way.
Now the app works online; if the Internet connection goes down, it works offline until the connection comes back, and then it uploads the data to the MySQL database. One of the solutions I read about for saving the data was to store the JSON object produced by JSON.stringify() in a TEXT field, and then use JSON.parse() once you need the data back.
This was my motivation to build the app that way, and also the fact that we couldn't change the database:
IndexedDB Tutorial
Sync IndexedDB with MySQL
Connect node to mysql
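A minimal Dexie sketch of the offline queue / upload pattern described in this answer (the table, field and endpoint names are my own placeholders, not from the actual app):

import Dexie from 'dexie';

// Local IndexedDB database with one table of (possibly unsynced) records
const db = new Dexie('offlineApp');
db.version(1).stores({ cards: '++id, synced' }); // ++id = auto-increment key, index on "synced"

// Save a form submission locally (works online or offline)
export async function saveCard(formData) {
  await db.cards.add({ payload: JSON.stringify(formData), synced: 0 });
}

// When the connection returns, push unsynced rows to the server (MySQL behind it)
export async function syncCards() {
  const pending = await db.cards.where('synced').equals(0).toArray();
  for (const row of pending) {
    await fetch('/api/cards', {              // hypothetical endpoint that INSERTs into MySQL
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: row.payload,                     // JSON.stringify'd earlier, JSON.parse'd on the server
    });
    await db.cards.update(row.id, { synced: 1 });
  }
}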

[Update for 2021]
For anyone reading this, I can recommend checking out AceBase.
AceBase is a realtime database that enables easy storage and synchronization between browser and server databases. It uses IndexedDB in the browser, and its own binary db format or SQL Server / SQLite storage on the server side. MySQL storage is also on the roadmap. Offline edits are synced upon reconnecting, and clients are notified of remote database changes in realtime through a websocket (FAST!).
On top of this, AceBase has a unique feature called "live data proxies" that allows all changes to in-memory objects to be persisted and synced to local and server databases, so you can forget about database coding altogether and program as if you're only using local objects, whether you're online or offline.
The following example shows how to create a local IndexedDB database in the browser, how to connect to a remote database server that syncs with the local database, and how to create a live data proxy that eliminates further database coding altogether.
const { AceBaseClient } = require('acebase-client');
const { AceBase } = require('acebase');

// Create local database with IndexedDB storage:
const cacheDb = AceBase.WithIndexedDB('mydb-local');

// Connect to server database, use local db for offline storage:
const db = new AceBaseClient({ dbname: 'mydb', host: 'db.myproject.com', port: 443, https: true, cache: { db: cacheDb } });

// Wait for remote database to be connected, or ready to use when offline:
db.ready(async () => {
    // Create live data proxy for a chat:
    const emptyChat = { title: 'New chat', messages: {} };
    const proxy = await db.ref('chats/chatid1').proxy(emptyChat); // Use emptyChat if chat node doesn't exist

    // Get object reference containing live data:
    const chat = proxy.value;

    // Update chat's properties to save to local database,
    // sync to server AND all other clients monitoring this chat in realtime:
    chat.title = `Changing the title`;
    chat.messages.push({
        from: 'ewout',
        sent: new Date(),
        text: `Sending a message that is stored in the database and synced automatically was never this easy!` +
            `This message might have been sent while we were offline. Who knows!`
    });

    // To monitor realtime changes to the chat:
    chat.onChanged((val, prev, isRemoteChange, context) => {
        if (val.title !== prev.title) {
            console.log(`Chat title changed to ${val.title} by ${isRemoteChange ? 'someone else' : 'us'}`);
        }
    });
});
For more examples and documentation, see AceBase realtime database engine at npmjs.com

Related

What is the right way to use a database with Flutter?

I have an app which interacts with the database directly using the mysql1 library, like the example below:
Future FetchData() async {
  final connection = await MySqlConnection.connect(ConnectionSettings(
    host: 'mysql-hostname.example.com',
    port: 3306,
    user: 'root',
    password: 'root',
    db: 'testDB',
  ));

  var results = await connection.query('SELECT * FROM `testTable` WHERE 1');
  for (var row in results) {
    print('${row[0]}');
  }

  // Finally, close the connection
  await connection.close();
}
I wonder if this is a safe and secure method, because when I build the app I pack all the information (username, password) needed to connect to my database into the app. Is this risky, and should I use a separate back end for these kinds of tasks?
It is generally safer to put a trusted backend environment between your database and app. But even in this case you will have to ensure that only your app has access to this backend resource.
For example, if you use Firebase as the backend, there is an App Check service available. Although it is relatively new, it can attest to your app's authenticity.
If you prefer to do it on your own, you can create a bearer token that your app adds to its requests, preferably in the request's Authorization header, and check it in the backend before accessing protected resources. But then the question remains: where do you store this bearer token safely?
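As a rough illustration of that bearer-token check on the backend side (Node/Express used purely as an example here; the token variable, route and connection settings are placeholders):

const express = require('express');
const mysql = require('mysql2/promise');

const app = express();
const pool = mysql.createPool({
  host: 'localhost',                         // placeholder connection settings
  user: 'app',
  password: process.env.DB_PASS,
  database: 'testDB',
});

// Reject any request that does not carry the expected bearer token
function requireToken(req, res, next) {
  const header = req.get('Authorization') || '';
  if (header !== `Bearer ${process.env.APP_TOKEN}`) {
    return res.status(401).json({ error: 'unauthorized' });
  }
  next();
}

// The Flutter app calls this endpoint instead of opening a MySQL connection itself
app.get('/api/items', requireToken, async (req, res) => {
  const [rows] = await pool.query('SELECT * FROM `testTable`');
  res.json(rows);
});

app.listen(3000);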
If you want to keep it in your code, you should properly obfuscate the code before uploading it to the app stores. Even in this case it is a good idea to check for rooted or jailbroken devices to prevent misuse, for example check out flutter_jailbreak_detection.
There are also secure storage packages, which can store sensitive data in a safer way. Unlike SharedPreferences, these can mitigate the risks of unauthorized access to your secrets. See flutter_secure_storage, for example.
It really depends on the level of security that you are looking for. Are you storing user-generated sensitive information in your database? Then the answer is that you should ideally not store that information in your code, nor should you ship your application with that information bundled inside it.
I highly suggest that you start using Firebase for this. Firebase is a fantastic and free product provided by Google, the same company behind Flutter, and within a few minutes you can build a whole experience that relies on authentication with Firebase, where you can safely store user-generated content.

How can I dynamically choose which MySQL server to point to?

TL;DR: Vertical or Horizontal scaling for this system design?
I have NGINX running as a load balancer for my application. It distributes traffic across 4 EC2 instances (t2.micros, because I'm cheap), and those all currently hit one server for my MySQL database (also a t2.micro, totalling 6 separate EC2 instances for the whole system).
I am thinking about horizontally scaling my database via a Source/Replica distribution, and my thought is that I should route all read queries/GET requests (the highest traffic volume I'll get) to the replicas and all write queries/POST requests to the Source DB.
I know that I'll have to programmatically choose which DB my servers point to based on the request method, but I'm unsure of how best to approach that, or whether I'm better off vertically scaling my DB at that point and investing in a larger EC2 instance.
Currently I'm connecting to the Source DB using an express server and it's handling everything. I haven't implemented the Source/Replica configuration just yet because I want to get my server-side planned out first.
Here's the current static connection setup:
const mysql = require('mysql2');
const Promise = require('bluebird');

const connection = mysql.createConnection({
  host: '****',
  port: 3306,
  user: '****',
  password: '*****',
  database: 'qandapi',
});

const db = Promise.promisifyAll(connection, { multiArgs: true });

db.connectAsync().then(() =>
  console.log(`Connected to QandApi as ID ${db.threadId}`)
);

module.exports = db;
What I want is to either:
set up an Express middleware function that looks at the request method and connects to the appropriate database by creating 2 configuration templates to put into the createConnection function (I'm unsure of how I would make sure it doesn't try to reconnect if a connection already exists, though), or
if possible, just open two connections simultaneously and route which request method goes to which database (I'm hopeful this option will work so that I can make things simpler).
Is this feasible? Am I going to see worse performance doing this than if I just vertically scaled my EC2 to something with more vCPUs?
Please let me know if any additional info is needed.
Simultaneous MySQL Database Connection
I would be hesitant to use any client input to connect to a server, but I understand how this could be something you would need to do in some scenarios. The simplest and quickest way around this issue would be to create a second database connection file. To make this dynamic, you can simply require the module based on conditions in your code, so it will only be called and promised at certain points, after certain conditions. This approach can be risky and requires requiring modules in the middle of your code, so it isn't ideal, but it can get the job done. For example:
const dbConnection = require("../utils/dbConnection");

// conditional {
  const controlledDBConnection = require("../utils/controlledDBConnection");
  const [row] = await controlledDBConnection.execute("SELECT * FROM `foo`;");
// }
Using more files could potentially have an effect on space constraints and could slow the code down while waiting for a new promise, but the overall effect will be minimal. controlledDBConnection.js would just be close to a duplicate of dbConnection.js, with slightly different parameters depending on your needs.
Another path you can take, if you want to avoid using multiple files, is to export a module with a dynamically set variable from your controller file and then import it into a standard connection file. This would allow you to change your connection without writing a duplicate, but you will need diligent error checks and a default.
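For the original question of routing by request method, a rough sketch with two mysql2 pools kept open at once could look like this (hosts and credentials are placeholder environment variables):

const mysql = require('mysql2/promise');

// Two pools stay open for the lifetime of the process: one for writes, one for reads
const sourcePool = mysql.createPool({
  host: process.env.SOURCE_DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASS,
  database: 'qandapi',
});
const replicaPool = mysql.createPool({
  host: process.env.REPLICA_DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASS,
  database: 'qandapi',
});

// Express middleware: GET requests read from the replica, everything else hits the source
function pickDb(req, res, next) {
  req.db = req.method === 'GET' ? replicaPool : sourcePool;
  next();
}

module.exports = { sourcePool, replicaPool, pickDb };

// Usage in a route:
// app.use(pickDb);
// app.get('/questions', async (req, res) => {
//   const [rows] = await req.db.query('SELECT * FROM questions');
//   res.json(rows);
// });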
Info on modules in JS : https://javascript.info/import-export
Some other points
Use environment variables for your database information (host, etc.), since this will allow you to easily change the information for your database all in one place, while also allowing you to include your .env file in .gitignore if you are using GitHub.
Here is another great stack overflow question/answer that might help with setting up a dynamic connection file : How to create dynamically database connection in Node.js?
How to set up .env files : https://nodejs.dev/learn/how-to-read-environment-variables-from-nodejs
How to set up .gitignore : https://stackabuse.com/git-ignore-files-with-gitignore/

Connecting Database with Svelte

I'm new to using Svelte and would like to create an ordering website using Svelte. I know that I will need a database to keep track of the orders, customer names, prices, etc. I have used MySQL before, but I haven't learned how to connect a database to a website.
Is there a specific database that you can use if you are using Svelte?
Or is there a way to connect MySQL to Svelte?
I have searched for this on YouTube and Google, but I'm not sure whether it's different if you are using Svelte, so I wanted to make sure.
Note: I have not started this project yet, so I do not have any code to show; I just want to know how you can connect a database if you're using Svelte.
Svelte is a front-end JavaScript framework that runs in the browser.
Traditionally, in order to use a database like MySQL from a front-end project such as a Svelte app (which contains only HTML, CSS and JS), you would have to do it through a separate backend project. The Svelte app and the backend project then communicate with the help of a REST API. The same applies to other front-end libraries/frameworks like React, Angular, Vue, etc.
There are still many ways to achieve this. Since you are focusing on Svelte, here are a few options (a small sketch of the REST approach follows the list):
1. Sapper
Sapper is an application framework powered by Svelte. You can also write backend code using Express or Polka, so you can connect to the database of your choice (MySQL / MongoDB).
2. Use a serverless database
If you want to keep your app simple and just focus on the Svelte app, you can use a cloud-based database such as Firebase. Svelte can talk to it directly via its JavaScript SDK.
3. Monolithic architecture
To connect to MySQL in the backend, you would need to use a server-side programming language such as Node.js (Express), PHP, Python, or whatever you are familiar with. Then you can embed the Svelte app, or use an API to pass data to the Svelte app.
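As a rough illustration of the REST approach above, a tiny Express + mysql2 backend plus a plain fetch in the Svelte component might look like this (all endpoint, table and credential names are placeholders):

// server.js - a small Express backend that owns the MySQL connection
const express = require('express');
const mysql = require('mysql2/promise');

const app = express();
const pool = mysql.createPool({
  host: 'localhost',
  user: 'shop',
  password: process.env.DB_PASS,
  database: 'orders_db',                    // placeholder schema with an "orders" table
});

app.get('/api/orders', async (req, res) => {
  const [rows] = await pool.query('SELECT * FROM orders');
  res.json(rows);
});

app.listen(3000);

// OrderList.svelte (inside the script block) - the component only talks to the REST API
import { onMount } from 'svelte';

let orders = [];
onMount(async () => {
  const res = await fetch('/api/orders');
  orders = await res.json();
});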
I can make an example with MongoDB.
You have to install the library:
npm install mongodb
or add it in package.json.
Then you have to make a connection file that you have to call every time you need to use the db:
import mongo from "mongodb";

let client = null;
let db = null;

export async function init() {
    if (!client) {
        client = await mongo.MongoClient.connect("mongodb://localhost");
        db = client.db("name-of-your-db");
    }
    return { client, db };
}
For a complete example with an insert, you can see this video:
https://www.youtube.com/watch?v=Mey2KZDog_A
You can use PouchDB, which gives you direct access to IndexedDB in the browser. No backend is needed for this.
The client-side PouchDB can then be replicated/synced with a remote CouchDB. This can all be done inside your Svelte app, from the client side.
It is pretty easy to set up:
var db = new PouchDB('dbname');

db.put({
  _id: 'dave@gmail.com',
  name: 'David',
  age: 69
});

db.changes().on('change', function() {
  console.log('Ch-Ch-Changes');
});

db.replicate.to('http://example.com/mydb');
More on pouchdb.com.
Also, the client can save the data offline first and later connect to a remote database.
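For that offline-first case, a continuous two-way sync (rather than the one-off replicate.to above) could look roughly like this; the remote URL is a placeholder:

// Keep the local and remote databases continuously in sync, retrying after connection loss
var sync = PouchDB.sync('dbname', 'http://example.com/mydb', { live: true, retry: true })
  .on('paused', function () { console.log('up to date, or offline'); })
  .on('active', function () { console.log('replication resumed'); });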
As I read it, the question is mostly about connecting to a backend, not a database. It is a pity, but the Svelte app template has no way to connect to a backend out of the box.
As for me, I'm using Express middleware in front of the Rollup dev server. This way you are able to proxy some requests to the backend server. Check the code below:
const proxy = require('express-http-proxy');
const app = require('express')();

app.use('/data/', proxy(
  'http://backend/data',
  {
    proxyReqPathResolver: req => {
      return '/data' + req.url;
    }
  }
));

app.use('/', proxy('http://127.0.0.1:5000'));

app.listen(5001);
This script opens port 5001, where the /data/ URL is proxied to the backend server, while port 5000 is still served by the Rollup server. So at http://localhost:5001/ you have the Svelte instance connected to the backend via the /data/ URL, and there you can send requests to fetch data from the database.

HTML server-sent events

I have a little understanding of how server-sent events work. Say I have a Linux server (remote server) and I need to monitor its CPU usage continuously from my local machine (via an HTML page on my local machine). Will I be able to stream the CPU usage continuously from the server to the local machine using SSE? If so, I need some clarification on how to do so. Or are there any other alternatives I can go with, without involving any additional software?
You'll need to write some code to run on the server, which will gather whatever data you need. (Common choices include Node.js, PHP, etc.)
That code will need to either directly serve HTTP requests, or connect to a web server.
Your code will send data in this format:
event: someevent
data: {"key": "value"}
Then, your client-side will use EventSource:
const eventSource = new EventSource('https://example.com/your-sse-path');

eventSource.addEventListener('someevent', (e) => {
  console.log(JSON.parse(e.data));
});

Node.js on server: Ajax/HTTP calls not reading new data from MongoDB

I've recently started pushing my locally tested Node, Mongo, AngularJS sites to live environments hosted on DigitalOcean.
I'm having inconsistent behaviour with Ajax/HTTP calls. On my local machine, I am able to do an HTTP request, update an AngularJS variable, and this in turn populates the HTML on the frontend. All works great! Now, testing this on my server with the same environment setup, the only time the variable loads new data is when I refresh the page.
For example (not my actual code):
Nodejs - app.js:
app.get('/getlist', requiredAuthentication, function(req, res) {
  list.find({'username': req.session.user.username}, function(err, list) {
    res.send(list);
  });
});
Angularjs - angular_app.js:
$scope.onClick = function (points, evt) {
  $http.get('/getlist').then(function(response) {
    $rootScope.list = response;
  });
};
Jade - home:
li(ng-repeat="row in list")
So like I said, this works perfectly on my local machine, but on my server I must refresh my page to load new data; it's as though my variable gets cached on the server.
Any idea would help.
Thanks!
------- UPDATE - testing v0.1 --------
So after some intensive testing here is what I've found, but still no fix.
If I add new data via an HTTP POST and then go look in my Mongo DB, I see the new data. But when I click the ng-click handler to retrieve the new data via HTTP, it doesn't return the new data and is stuck on the old.
If I leave the page open for 10 minutes and then click the button, it retrieves the new data. This is such a schlep.
Sounds like caching, but why does it work perfectly on my local machine?
Looking at the console > Network > Status, the code is 304, which means nothing changed?
------- UPDATE - testing v0.2 --------
I've now tested the returned data with a log in the console, and I also did the GET with Ajax/jQuery; I'm getting the same issue/behaviour. It's stuck on the same collection of data, so my conclusion must be that Node.js is causing the issue.
------- UPDATE - testing v0.3 --------
Okay, so I've completely stopped Mongo and switched everything to MySQL using node-mysql. Once again, on my local machine it works like a machine, and on my actual server it is laggy with reading new data.
I used Sequel Pro to access MySQL and started adding new entries to a table.
Opening my web URL in the browser immediately showed the new entries. But after that, adding or deleting entries only took effect after 10 minutes or so.
So my conclusion is that Node.js is caching aggressively. Does anyone know more about this? Am I really the only one ever to experience this?
Try res.json to return data from Node:
app.get('/getlist', requiredAuthentication, function(req, res) {
  list.find({'username': req.session.user.username}, function(err, list) {
    res.json(list);
  });
});
My conclusion to this issue was that port 80 was somehow caching the content of the page and would only load new data with a page refresh.
I upgraded Node and used the latest Express. I'm now running my web app on a custom port, and all is working.
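If the 304 responses were indeed the culprit, another thing worth ruling out (a sketch, not what was actually done here) is Express's own HTTP caching:

// Stop Express from generating ETags, so it never answers the API call with 304 Not Modified
app.set('etag', false);

// Or mark the dynamic responses as non-cacheable for this route
app.use('/getlist', function (req, res, next) {
  res.set('Cache-Control', 'no-store');   // tell the browser never to reuse a cached copy
  next();
});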