We currently have a desktop application, sold to small businesses and used in a client/server model, and we are in the early stages of researching the possibility of adding cloud-based syncing to the program.
Besides the obvious hurdles in transitioning/recoding the networking code of the program itself, there seem to be many additional questions related to the server/database selection, available cloud services, scalability, and more.
For example, the current non-cloud application simply connects to a specified MySQL database and then loads/views/updates data. This database can even be hosted remotely on a server and accessed from multiple machines, for example:
db = New MySQLCommunityServer
db.Host = "12.23.56.57"
db.Port = 3306
db.DatabaseName = "myData"
db.UserName = "userName"
db.Password = "password1"
db.Connect
But for a distributed cloud application, it would need to connect to the same host and SQL database, but with each specific user's login and password, and access only that user's database and tables. Where would that fit into the code above?
A few questions arise:
Would an entirely new database need to be created for each new user account that signs up?
If so, how would schema changes to the tables be applied to all of the user databases? Assuming roughly 500-1000 users sign up, maintaining 500-1000 separate databases doesn't make much sense (see the shared-schema sketch after these questions).
Would this be better accomplished using a service such as Amazon Web Services? Even there, it was a bit unclear how a "program user account" would map onto their services.
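For reference, here is roughly the shared-schema (multi-tenant) alternative we are weighing, written as PHP/PDO purely for illustration (our actual client is a desktop app); the app_user and inventory_item tables, the user_id column, and the service-account credentials are all invented names, not anything from our current code:
<?php
$login = "userName";        // credentials the end user typed into the app
$password = "password1";

// One MySQL account owned by the application itself; end users are rows in a
// table, not MySQL accounts. (All names below are assumptions for illustration.)
$pdo = new PDO(
    "mysql:host=12.23.56.57;port=3306;dbname=myData;charset=utf8mb4",
    "app_service_account",
    "service_account_password",
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

// Authenticate the end user against an application-level users table.
$stmt = $pdo->prepare("SELECT id, password_hash FROM app_user WHERE login = ?");
$stmt->execute([$login]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);
if ($row === false || !password_verify($password, $row["password_hash"])) {
    throw new RuntimeException("Invalid credentials");
}
$userId = (int) $row["id"];

// Every data table carries a user_id column, so one schema serves all users.
$stmt = $pdo->prepare("SELECT * FROM inventory_item WHERE user_id = ?");
$stmt->execute([$userId]);
$items = $stmt->fetchAll(PDO::FETCH_ASSOC);
With that layout, every end user shares the one MySQL login held by the application, individual logins and passwords live in the app_user table, and a schema change only has to be applied once rather than 500-1000 times.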
Thank you for any feedback!
I have two systems, let's call them i and j. Each has its own database.
Each has a registration page, where a new user is inserted into a user table.
What is the best way to synchronize both tables, so that if a user registers at system i they are also registered at system j?
Notes:
The systems cannot read from each other's databases directly.
I can make small changes to the code if needed, provided they do not affect the system's performance or normal behavior.
I can create APIs for both systems if needed.
I can add any tables or fields if needed.
I can create cron jobs, as long as they do not affect the performance of the system or server.
I'm using cPanel.
Technologies:
MySQL
PHP
REST APIs
The fact that you list cPanel as a technology shows you're working with an inflexible budget hosting vendor, so it's unlikely they'll cooperate in setting up background tasks (cron jobs) to merge your user tables behind the scenes. (cPanel isn't a technology: it's a system administration user interface provided by hosting vendors who don't trust their customers' skills.)
So: you should design and implement a REST API in the code of both your apps to perform user registration and authentication tasks. You didn't show us the details of your apps, so it's hard to design it for you. Still, it seems likely you'll have to implement these operations:
PUT user
DELETE user
GET user
POST user to validate a user's password, etc. (Don't use GET to pass secret information: GET request parameters go into server logs.)
PATCH to update details of a user.
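A minimal sketch of how those operations could hang together in PHP is below; the users table, its columns, the email-keyed lookups, and the single-file routing are all illustrative assumptions, not a prescription for your code:
<?php
// users.php -- minimal sketch of the user endpoint for one system.
// The users table, its columns, and the single-file routing are illustrative only.
$pdo = new PDO("mysql:host=localhost;dbname=system_i;charset=utf8mb4", "dbuser", "dbpass");
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$input = json_decode(file_get_contents("php://input"), true) ?: [];
header("Content-Type: application/json");

switch ($_SERVER["REQUEST_METHOD"]) {
    case "PUT":     // create (or replace) a user
        $stmt = $pdo->prepare(
            "INSERT INTO users (email, password_hash) VALUES (?, ?)
             ON DUPLICATE KEY UPDATE password_hash = VALUES(password_hash)"
        );
        $stmt->execute([$input["email"], password_hash($input["password"], PASSWORD_DEFAULT)]);
        http_response_code(201);
        break;
    case "DELETE":
        $pdo->prepare("DELETE FROM users WHERE email = ?")->execute([$input["email"]]);
        http_response_code(204);
        break;
    case "GET":     // look up a user; never return the password hash
        $stmt = $pdo->prepare("SELECT id, email FROM users WHERE email = ?");
        $stmt->execute([$_GET["email"] ?? ""]);
        echo json_encode($stmt->fetch(PDO::FETCH_ASSOC) ?: null);
        break;
    case "POST":    // validate a password; the secret travels in the body, not the URL
        $stmt = $pdo->prepare("SELECT password_hash FROM users WHERE email = ?");
        $stmt->execute([$input["email"]]);
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        echo json_encode(["valid" => $row && password_verify($input["password"], $row["password_hash"])]);
        break;
    case "PATCH":   // update selected details of a user
        $pdo->prepare("UPDATE users SET email = ? WHERE id = ?")
            ->execute([$input["new_email"], $input["id"]]);
        break;
    default:
        http_response_code(405);
}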
If you get the API working, whenever you create/retrieve/update/delete user information in one app, you'll use the API to change it in the other.
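That propagation step can be as small as the sketch below, assuming system j exposes the endpoint above at a URL like https://j.example.com/api/users.php; the URL and the X-Api-Key value are placeholders you'd replace with whatever you actually deploy:
<?php
// After a successful local registration in system i, mirror the user to system j.
// The peer URL and the X-Api-Key value are placeholders.
function mirrorUserToPeer(string $email, string $password): bool
{
    $ch = curl_init("https://j.example.com/api/users.php");
    curl_setopt_array($ch, [
        CURLOPT_CUSTOMREQUEST  => "PUT",
        CURLOPT_POSTFIELDS     => json_encode(["email" => $email, "password" => $password]),
        CURLOPT_HTTPHEADER     => [
            "Content-Type: application/json",
            "X-Api-Key: shared-secret-between-i-and-j",
        ],
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 5,
    ]);
    curl_exec($ch);
    $ok = curl_getinfo($ch, CURLINFO_HTTP_CODE) === 201;
    curl_close($ch);
    // If this returns false, record the user in a "pending_sync" table and retry later.
    return $ok;
}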
Your best bet would be to create a third app just for user management and have both your existing apps use it. That way you're sure to have one coherent source of truth about users. But you can also do it within just the two apps.
I am looking for a solution that lets me interconnect several databases.
But let me explain it with a concrete example:
I have a main domain (the public-facing front page for clients) and four sub-domains (development, management, client, ...) on the client's web hosting.
Each domain has its own database and runs different software (WordPress, Dolibarr, sysPass, our own software), but all databases are hosted on the same MySQL server.
Whenever a CRUD operation happens in one database, I want the other databases to also "do" something with that data.
Basically, automation.
For example: a user on development.subdomain.xyz sets a project task to "finished".
When that UPDATE hits the "development" database, I want an INSERT with parts of that data into the "management" database and an UPDATE to the "client" database.
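Something like the glue code below is what I picture; since all the databases sit on the same MySQL server, one connection can reach each schema by qualifying table names, and every table and column name here is invented just for illustration:
<?php
// Sketch only: the task, activity_log and project_overview tables are invented.
$pdo = new PDO("mysql:host=localhost;charset=utf8mb4", "automation_user", "secret", [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

function onTaskFinished(PDO $pdo, int $taskId): void
{
    // One transaction across the schemas, since they live on the same server.
    $pdo->beginTransaction();
    try {
        // The original UPDATE in the development database.
        $pdo->prepare("UPDATE development.task SET status = ? WHERE id = ?")
            ->execute(["finished", $taskId]);

        // INSERT parts of that data into the management database.
        $pdo->prepare("INSERT INTO management.activity_log (task_id, event, created_at)
                       VALUES (?, ?, NOW())")
            ->execute([$taskId, "task finished"]);

        // UPDATE the client-facing database.
        $pdo->prepare("UPDATE client.project_overview
                       SET finished_tasks = finished_tasks + 1 WHERE task_id = ?")
            ->execute([$taskId]);

        $pdo->commit();
    } catch (Throwable $e) {
        $pdo->rollBack();
        throw $e;
    }
}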
I could write up a script along those lines that connects to the databases and performs the necessary operations.
But that feels a little hard to maintain if multiple users are supposed to have access to this "logic" system.
I could also use the provided APIs and process the data (again as a script rather than implementing a whole UI).
That feels like it adds an unnecessary extra security concern, and again seems hard to maintain.
If I want to add additional functionality, like sending an email as well, that would make it even harder for non-coders to work with.
So I found several of these "Low-Code Business Process Management" tools and now I'm at a loss.
Is that what I'm looking for? Can you throw me some tags, keywords or links to guide my search for possible solutions?
I do not even know what to call such a system or how to search for it, which stops me from progressing.
Thank you for all tips :)
I have been trying to build an inventory management system. I have built the database in Access, but I want the database to run online so that people in remote locations, with different access levels, can modify it in real time.
Is there a way I can store the .accdb file online with access restrictions? Or is there an online service that hosts live MS Access databases?
It depends on your infrastructure. The simplest option (but the worst for performance) is to set up VPN connections for remote users. Even better, if you have the capability, use Remote Web Workplace or a Remote Desktop server. Finally, you could put all the tables on SQL Server or MySQL and distribute the front end. With any of these, as with any Internet-facing service, you have to be very careful with your security precautions, but it is possible to do any of them with adequate security.
If you know only one user will be working on it at a time, you can use something like Dropbox, Google Drive, or SkyDrive, but that will not work if you want more than one user at a time. Access will not be able to "combine" the changes from multiple users accessing it this way.
I created reports in Web Intelligence against an Oracle database. But now other people want the same reports. Each of them has a different database (all Oracle) with the same structure but their own data.
What do I have to do to make the same reports available to everyone? The reports are the same, but the connection or universe changes depending on the user running them.
I don't want to make a copy for each person, because any change to a report has to be available to everybody.
Regards,
Antonio
If this product sits on a server, you might be able to leave out the stored database login and password, so each user has to enter a separate login and password for the database they need to log into. Perhaps your DBA could set this up so that each database user has read-only access to certain tables.
If your documents are built on top of a "classic" UNV universe, this can be done at the universe level by defining connection restrictions. If you are using a new BI4 UNX universe, you will need to create Data Security Profile Connections. Both of these mechanisms allow you to map alternate database connections for different users within the same universe (and therefore share the same documents based on that universe).
This functionality is fairly well described in the Universe Designer documentation (for UNV) and the Information Design Tool User Guide (for BI4).
I'm looking into using CloudBees for some application prototyping. I am using a free account right now; I am not paying for any subscriptions at the moment.
The first step for me is to create a MySQL database to host my application's data. I've done so (and it was pretty easy!). I also use Liquibase to manage the database (I've started this work using local H2 databases for the pre-prototyping), and I've been able to construct everything as expected.
As part of checking whether Liquibase created the tables, I opened the MySQL database in NetBeans, and it worked fine. But I can also see other schemas besides the one I just created. They're all innocently named (test, test_6hob), but I can see their tables and view their data.
My question is about the visibility of the data in the CloudBees database. Is a database created under a free account viewable by other people connecting to the same machine? Does this change if I use a paid account? Or is it more about how the database was created? I can see other schemas (and their data), but I have no idea whether other people can see mine. Is there a permissions aspect I need to make sure I set? I'm fairly ignorant of the inner workings of MySQL.
While this is a prototype, if I were to move to using CloudBees for production applications, I wouldn't want the data to be visible to anyone who happened to connect to the same database server as my application. It's entirely possible that I'm missing something in this new cloud world. :)
Thanks for any info
All CloudBees MySQL databases are secured separately (although they will be on shared instances unless you have a dedicated server); they are not readable by any other account by default.
However, it is possible for the database owner to grant access to users from other accounts on that same database server if you really want to, even though it makes very little sense to do so (and your custom user configuration will be lost during a failover).
So that is what has happened with the test databases you can see: their owners have opened up access to those databases/tables.
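If you want to verify that for your own account, a quick check along these lines will list exactly which schemas and privileges your credentials can see; the host name and credentials below are placeholders for your own connection details:
<?php
// Quick sanity check of what one set of MySQL credentials can actually see.
// Host and credentials are placeholders for your own CloudBees connection details.
$pdo = new PDO("mysql:host=your-instance.example.com;port=3306", "your_user", "your_password");
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Schemas visible to this account.
foreach ($pdo->query("SHOW DATABASES")->fetchAll(PDO::FETCH_COLUMN) as $schema) {
    echo $schema, "\n";
}

// Privileges granted to this account.
foreach ($pdo->query("SHOW GRANTS FOR CURRENT_USER()")->fetchAll(PDO::FETCH_COLUMN) as $grant) {
    echo $grant, "\n";
}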
This question is probably off topic, but I'll bite anyway. The database data is private to your account. The actual hardware/VMs may be shared, but the data/database is not.