Interconnect multiple databases on the same MySQL server

I am looking for a solution that lets me interconnect several databases.
Let me explain with a concrete example:
I have a main domain (the public front page for clients) and four sub-domains (development, management, client, ...) in the client's webhosting.
Each domain has its own database and runs different software (WordPress, Dolibarr, sysPass, our own software), but all databases are stored on the same MySQL server.
When a CRUD operation happens in one database, I want the other databases to also "do" something with that data.
Basically, automation.
For example - a user on development.subdomain.xyz sets a project task to "finished".
When the UPDATE is done to the "development" database, I want an INSERT of parts of that data into the "management" database and a corresponding UPDATE in the "client" database.
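Since everything lives on one MySQL server, I imagine even a cross-database trigger could express this; here is a minimal sketch of what I mean (all table and column names are made up):

DELIMITER //
CREATE TRIGGER development.task_finished
AFTER UPDATE ON development.tasks
FOR EACH ROW
BEGIN
  IF NEW.status = 'finished' AND OLD.status <> 'finished' THEN
    -- Cross-database writes work because all schemas share one server
    INSERT INTO management.activity_log (task_id, title, finished_at)
    VALUES (NEW.id, NEW.title, NOW());
    UPDATE client.projects
      SET open_tasks = open_tasks - 1
      WHERE id = NEW.project_id;
  END IF;
END//
DELIMITER ;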
I could write up some script that connects to all four databases and does the operations necessary.
But that feels a little hard to maintain, especially if multiple users are supposed to have access to this "logic" system.
I could also use the provided APIs and process the data (again in script form rather than implementing a whole UI).
But that feels like it adds an unnecessary, extra security concern, and again seems hard to maintain.
And if I later want additional functionality, like sending an email as well, that would make it even harder for non-coders to interact with the system.
So I found several of these "Low-Code Business Process Management" tools and now I'm at a loss.
Is that what I'm looking for? Can you throw me some tags, keywords or links to guide my search for possible solutions?
I do not even know what to call such a system or how to search for one, which is what stops me from progressing.
Thank you for all tips :)

Related

PHP script for front-end management of MySql tables

I'm looking for a PHP script that will let me easily manage MySQL tables. By managing I mean not creating them, but adding new records and modifying and deleting existing ones.
It must be possible to specify, for each user, which tables they can access and in which ways (insert only, update only, etc.).
For each user I will also need to specify whether they can see all or only some of the columns in a table, and with which permissions.
I'll also need to know who did what: a sort of global change log.
My idea was a back-end in which I specify the users and how they access the various tables/columns, and a front-end for the users.
In the front-end, users can add/modify/delete the records and data they are allowed to, with the ability to filter and/or sort the various records.
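Roughly, the back-end data model I have in mind would look something like this (just a sketch; all table and column names are hypothetical):

CREATE TABLE app_users (
  id INT PRIMARY KEY AUTO_INCREMENT,
  username VARCHAR(64) NOT NULL UNIQUE
);

CREATE TABLE table_permissions (
  user_id INT NOT NULL,
  table_name VARCHAR(64) NOT NULL,
  allowed_columns TEXT,           -- NULL means every column is visible
  can_insert BOOL DEFAULT FALSE,
  can_update BOOL DEFAULT FALSE,
  can_delete BOOL DEFAULT FALSE,
  PRIMARY KEY (user_id, table_name),
  FOREIGN KEY (user_id) REFERENCES app_users(id)
);

CREATE TABLE change_log (
  id BIGINT PRIMARY KEY AUTO_INCREMENT,
  user_id INT NOT NULL,
  table_name VARCHAR(64) NOT NULL,
  action ENUM('insert','update','delete') NOT NULL,
  row_pk VARCHAR(255),
  changed_at DATETIME DEFAULT CURRENT_TIMESTAMP
);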
I know I could use a PHP framework or rely on a CMS, but that means writing a lot of code by hand, and it seems hard to believe that such a product is not already available.
Does anyone know if there is something like this?
I had tried starting with a PHP framework, but implementing everything from scratch stopped me.
I expect something like this already exists.
Thanks.
Davide.

Dynamically changing Report's Shared Data Source at Runtime

I'm looking to use SSRS for multi-tenant reporting, and I'd like the ability to have the Shared Data Source for my reports chosen at runtime. What do I mean by this? I'm flexible, but I think the two most likely possibilities are (I'm also open to others):
The Shared Data Source is dictated by the client's authentication. In my case, the "client" is a .NET application and not the user, so if this is a viable path then I'd like to somehow have the MainDB (that's what I'm calling it) Shared Data Source selected by the Service Account that the client logs in as.
Pass the name of the Shared Data Source as a parameter and let that dictate which one to use. Given that all of my clients are "trusted players", I am comfortable with this approach. While each client will have its own representative Service Account, that's just for good measure and should not be important. So instead of just calling the data source MainDB, we could have Client1DB, Client2DB, etc. It's okay if a new data source means a new deployment, but I need this to scale easily to ~50 different data sources over time.
Why? Because we have multiple/duplicate copies of our production application for multiple customers, but we don't want to duplicate everything, just the web apps and databases. We're fine with some common "back-end" things. And for SSRS, because of how expensive licenses are (and how rarely our users run reports), we really want to have just a single back-end for all of our customers (I actually have a second one on standby for manual disaster recovery situations - we don't need to be too fancy here, as reports are the least important DR concern we have).
I have seen this question, which points to this post, but I was really hoping there was a better way. Because of all of those additional steps/efforts/limitations/etc., I'd rather just use PowerShell to script duplicate deployments of the reports with tweaked hardcoded data sources instead of standardizing on the steps in that post. That solution feels WAY too hacky to me and doesn't seem to scale very well at all.
I've done this a bunch of terrible ways (usually hardcoded in a dynamic script), and then I discovered it's actually quite simple.
Instead of using a Shared Data Source, use an embedded connection and build your connection string from parameters (or any string-manipulation expression).
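For example (a sketch; the server name and parameter name are hypothetical), the embedded data source's connection string can be an expression like:

="Data Source=MyReportServerHost;Initial Catalog=" & Parameters!ClientDB.Value

The report then resolves its database at runtime from the ClientDB parameter, so a single deployed report can serve all of the tenant databases.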

Add cloud-capabilities to existing application

We currently have a desktop application that is sold to small businesses as a client/server application, and we are in the early stages of researching the possibility of adding cloud-based syncing to the program.
Besides the obvious hurdles in transitioning/recoding the networking code of the program itself, there seem to be many additional questions related to server/database selection, available cloud services, scalability, and more.
For example, currently the non-cloud application simply connects to a specified MySQL database and then loads/views/updates data. This database can even be stored remotely on a server and accessed from multiple machines, for example:
// Xojo-style example using the MySQLCommunityServer class
Dim db As New MySQLCommunityServer
db.Host = "12.23.56.57"
db.Port = 3306
db.DatabaseName = "myData"
db.UserName = "userName"
db.Password = "password1"
If Not db.Connect Then
  // handle the connection error here
End If
But for a distributed cloud application, it would need to connect to the same host, but with each specific user's login and password, and access that user's specific database and tables. Where would that translate into the code above?
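On the MySQL side, I assume that would mean provisioning something like the following for every new account, with the client then filling in DatabaseName/UserName/Password from that user's stored values (a sketch; all names and the password are invented):

-- Run once per new user account
CREATE DATABASE user_1042;
CREATE USER 'user_1042'@'%' IDENTIFIED BY 'per-user-password';
GRANT ALL PRIVILEGES ON user_1042.* TO 'user_1042'@'%';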
A few questions arise:
Would an entire new database need to be created for each new user account that signs up?
If so, how would schema changes be applied to all of the user databases? Assuming roughly 500-1000 users sign up, having 500-1000 separate databases doesn't seem to make much sense.
Would this be better accomplished using a service such as Amazon Web Services? Even there, it was a bit unclear how the "program user account" would translate to their services.
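From what I've read, the alternative to one-database-per-user is a single shared schema in which every row is tagged with an account id, something like (table and column names invented):

CREATE TABLE customers (
  account_id INT NOT NULL,
  id INT NOT NULL,
  name VARCHAR(100),
  PRIMARY KEY (account_id, id)
);

-- Every query is scoped to the signed-in account:
SELECT name FROM customers WHERE account_id = 42;

That way a schema change is applied once for everyone, at the cost of the application having to enforce the account_id filter on every query.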
Thank you for any feedback!

To add another Database or not to add another Database, that is the question

One of my sites is a social networking site running on MySQL. I use postal code and country information to geolocate users using a webservice. This webservice also allows you to download all their many tables of information so that you can access it locally. My site has gotten big enough that I wish to do this now.
My question is, should I create a new database on my site for all of this postal code and country information and all its tables, or should I incorporate those tables into my existing database for my social networking site?
What are the pros/cons either way?
When you're talking about scaling and want to know about other databases like NoSQL, you might find this article interesting: http://highscalability.com/blog/2010/12/6/what-the-heck-are-you-actually-using-nosql-for.html
I'd vote in favor of a separate database if you planned to use the data as read-only and put a web service in front of it to access it. Users would search it based on a small handful of parameters (e.g. address info to get lat/lon data).
I'd say put it in the existing database if you planned to JOIN it with other information in your current schema.
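One caveat: on the same MySQL server you can still JOIN across databases by qualifying the table names, so needing JOINs does not by itself force everything into one database (the schema and column names below are invented):

SELECT u.username, z.latitude, z.longitude
FROM socialsite.users AS u
JOIN geodata.postal_codes AS z ON z.postal_code = u.postal_code;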
It will probably live on the same disk, so disk space is not an issue. If you query the tables in a completely separate manner, there is no impact on the existing site; if you query things together, it is easier when everything is in one database. And overall, administering one database is easier than administering two. I think it's a no-brainer: they go in one DB.

Tracking data access

Backstory
I work for a company that has an online site that allows users to text in personal information for collection. We collect the data and make it available online. Users can choose to share the data with other users.
Going Forward
At some point, this may become classified as an FDA-governed medical tool. In anticipation, we'd like to have a logging system in place that records each time someone accesses our users' data, whether it be the user themselves, another authorized user, or a support person.
Current Architecture
We are currently running Ruby/Rails, and using a MySQL database. The personal information is encrypted in the database.
Data Access for Support
Today, support personnel can access data in one of three ways:
Admin site - The admin site is limited to whatever screens we develop. While we don't currently, we could easily add logging there to keep an audit trail of who accessed which data using the admin tool.
SQL client - I use MySQL Workbench to access production. However, when connected this way, all personal information (user name, cell number, etc.) is still encrypted.
Ruby/Rails console - Finally, support can log into one of the production boxes and use the Rails console from the command line. Ruby will decrypt the data, so we can do simple things such as
u = User.find_all_by_state('active')
and it will return the recordset of all users with state = 'active', with their personal information decrypted in the result set.
Holy Grail
logging
easy access for support
I'd love to have a way to allow easy support access (once authenticated) to the data, but that logs everything that is accessed (read or updated). That way, if I'm checking out my buddy's ex-wife's data, for example, it gets logged to a place where I can't get in and clean up the audit trail. (See Google firing a Gmail employee for an example of employees breaching data policies.)
Anyone have ideas, thoughts, experiences, suggestions with this issue?
Hey devguy. This was an issue for me a couple of months back. We ended up centralizing our MySQL queries so that we could start to track all information coming in and out. Unfortunately the class I wrote is in PHP, but the idea behind it could make it very easy to start logging.
https://code.google.com/p/php-centralized-mysql-controller/
Try stored procedures. Make all code use stored procedures for CRUD activities. This defines an API that your developers can use while business rules are globally enforced (don't return entire SSN values, only the last 4 digits, etc.).
This serves as the basis for an external API as well.
If you want logging/auditing, you put it in the procedure.
This protects you from everyone except the DBAs.
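A minimal sketch of that idea in MySQL (table, column, and procedure names are all hypothetical):

DELIMITER //
CREATE PROCEDURE get_user(IN p_user_id INT, IN p_accessed_by VARCHAR(64))
BEGIN
  -- Log the read first so every access leaves an audit trail
  INSERT INTO access_log (accessed_by, user_id, action, accessed_at)
  VALUES (p_accessed_by, p_user_id, 'read', NOW());

  -- Business rule enforced centrally: expose only the last 4 SSN digits
  SELECT id, name, CONCAT('***-**-', RIGHT(ssn, 4)) AS ssn_last4
  FROM users
  WHERE id = p_user_id;
END//
DELIMITER ;

Grant support accounts EXECUTE on the procedures and no direct SELECT on the underlying tables, so the logging cannot be bypassed (except, as noted, by the DBAs).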