MySQL - Different encryption for different clients?

In my scenario, all of the clients' MySQL data resides on the service provider's server. How can I let the clients choose their own encryption, so that the data still resides on the service provider's server but the service provider can't read it?

In the end, your users will have to trust you, simply because the pages, and the code in the pages they run, originate from you as well. You can try to protect them against any mishap except your own. Technically speaking, that is; you can always have yourself audited, of course.
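That said, the closest technical answer to "clients choose their own encryption" is client-side encryption, where each client encrypts values with a key it never hands to the provider, so the server only ever stores ciphertext. The caveat above still applies if the provider also ships the client code. A minimal sketch, assuming the Python cryptography package and a client-held key:

```python
# Hypothetical sketch: client-side encryption before the data ever reaches
# the service provider's MySQL server. The key never leaves the client.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # kept by the client, e.g. in its own key store
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"data the provider should not be able to read")
# ... store `ciphertext` in a BLOB/TEXT column on the provider-hosted server ...

# Only a holder of the key can turn it back into plaintext:
plaintext = cipher.decrypt(ciphertext)
```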

Related

Mysql 8 connect only with ssl-ca (server-ca.pem)

I found an old question on Server Fault on the same topic, but I am not sure what has changed with MySQL 8. Do I need all three, the server-ca, client-cert and client-key, to make successful SSL connections to MySQL 8? I don't really care about client certificates (all my clients are exactly the same), so do I still need all three? I am also looking at Google Cloud MySQL, and it lets me download just the server CA, so the client certs seem optional. However, I can't seem to connect without the client certs.
The bit that's really important is the client key.
In my experience with GCP, depending on the exact setup, you can often just use the client key and client cert, rather than needing to supply the server certificate (which GCP stores on their end anyway).
Cloud SQL allows you to simply download the server CA because, fundamentally, it isn't as sensitive as the client cert/key. Those are private and only available once, when you first generate them. After that, if you didn't securely save them somewhere else, you cannot get them again (although you can revoke them if they ever get leaked).
The good news is that you can generate lots of client key/certificate pairs (whereas generating a new server certificate will likely schedule it to be rotated in at some point in the future), so there's no problem with your going ahead and generating some new ones.
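For illustration, a minimal sketch of both variants using mysql-connector-python; the host, user and file names are placeholders for your own instance:

```python
# Hypothetical sketch: SSL connections to MySQL 8 with and without client certs.
import mysql.connector

# Server-CA-only: verify the server's certificate, present no client certificate.
# This works when the MySQL account does not REQUIRE X509.
conn = mysql.connector.connect(
    host="10.0.0.5", user="app", password="secret",
    ssl_ca="server-ca.pem",
    ssl_verify_cert=True,
)

# Mutual TLS: also present the client cert and key. Needed when the account
# (or the Cloud SQL instance) is configured to require client certificates.
conn = mysql.connector.connect(
    host="10.0.0.5", user="app", password="secret",
    ssl_ca="server-ca.pem",
    ssl_cert="client-cert.pem",
    ssl_key="client-key.pem",
)
```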

Visual Basic: Best way to share data/variables over network?

My Visual Basic project involves two applications (server and client if you will). The "Server" gathers data from a sensor and the Client must somehow get this information and display it.
My question is:
What's the best way to get the data from the server to the client? The first thing that comes to mind is storing the information in an SQL DB and having the "client" pull the data from the DB.
It is worth noting the "Server" and "Client" will eventually be networked through a WAN and NAT...
The data from the sensor is very small, i.e. two separate integers, that's it. So an SQL DB seems like overkill for storing two integers. Plus, the hardware I'm running these on will not be very powerful, i.e. 1 GB of RAM and a 2 GHz CPU.
Thanks :)
If the data is not sensitive and you don't mind it being publicly accessible, the server could run a small web server (IIS or something similar) and write the data to a file on that web server.
The client would then download the file by simply visiting that web address and parsing the file.
If you need a level of authentication, you could store the data in a file which is not publicly accessible and then write an ASP/ASP.NET page which accepts an HTTP POST containing a password, then reads the file and sends it as a response.
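The question is about Visual Basic, but the idea is language-agnostic; here is a minimal Python sketch of the client side, assuming the server already exposes the two integers in a small text file at a placeholder URL:

```python
# Hypothetical sketch: poll a tiny text file served by the "server" machine's
# web server (two integers, whitespace-separated) and parse it.
import urllib.request

URL = "http://sensor-host/readings.txt"   # placeholder address

with urllib.request.urlopen(URL, timeout=5) as resp:
    parts = resp.read().decode("utf-8").split()

value_a, value_b = int(parts[0]), int(parts[1])
print(value_a, value_b)
```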
I have decided to try a P2P connection between the "Server" and "Client". This seems to be working on a LAN, but I'm yet to test it through a NAT. Obviously I will have to do some basic port forwarding to get this to work.

Encrypting database credentials

Firstly, to explain: we have some websites which all connect to a central database. As a rule we don't give clients FTP access to their website, so they can't access any files with the DB credentials in them. 99.9% of the time this is fine.
However, we have a client insisting on full FTP access. They want to add advertising/tracking stuff, and I have set them up with their own database and a locked-down FTP account in another directory, but apparently that's not good enough.
Now, I am sure they don't intend to steal our MySQL credentials and connect and wipe out our DBs, but no doubt you will agree it's a huge security risk.
Is there any way to:
a) connect to the database without them seeing the credentials within the code
b) stop them from adding their own code that connects to the central database, so they can only connect to their own
Pretty sure nothing is going to be 100% secure, since giving them FTP access means they can do the same as I can, but I'm wondering if anyone else has any ideas?
The only way to do this "securely" without writing a RESTful API (which you've indicated is not feasible), is to create for them a special user account in your MySQL, that cannot access records not owned by them. Truthfully you should do this for all your clients if you're doing it "right," although I understand that can be a lot of maintenance.
Regarding encryption, there isn't a way to encrypt your DB credentials for the client and have them decrypted for the DB login without some intermediary code. This shouldn't matter though if the client has their own MySQL account for access.
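As a rough illustration of the "special user account" approach, a sketch of the kind of grants involved (database, account and host names are placeholders), here driven from Python:

```python
# Hypothetical sketch: a per-client MySQL account that can only reach the
# client's own schema, so leaked credentials cannot touch the central DB.
import mysql.connector

admin = mysql.connector.connect(host="db-host", user="root", password="...")
cur = admin.cursor()

cur.execute("CREATE USER 'client_a'@'%' IDENTIFIED BY 'a-strong-password'")
# Privileges only on the client's own schema; nothing on the central database.
cur.execute(
    "GRANT SELECT, INSERT, UPDATE, DELETE ON client_a_db.* TO 'client_a'@'%'"
)
admin.close()
```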
For people who are still looking for solutions, I found a nice tutorial on how to make credentials safer. The author proposes creating an additional layer that encrypts the password and decrypts it within the app.
https://maciejzalwert.medium.com/quick-tip-for-developers-to-protect-against-credentials-leak-b203a4d80b3b
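A minimal sketch of that idea (not the linked tutorial's exact code), assuming the Python cryptography package and a decryption key supplied through an environment variable rather than stored in the FTP-accessible files:

```python
# Hypothetical sketch: the DB password lives in the config only as ciphertext
# and is decrypted in the application at runtime.
import os
from cryptography.fernet import Fernet

# The key comes from the environment (or a secrets manager), not from code
# that the client can read over FTP.
key = os.environ["CONFIG_KEY"].encode()

encrypted_password = b"gAAAAAB..."   # placeholder: ciphertext stored in the config
db_password = Fernet(key).decrypt(encrypted_password).decode()
```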

Should I move client configuration data to the server?

I have a client software program used to launch alarms through a central server. At first it stored configuration data in registry entries, now in a configuration XML file. This configuration information consists of Alarm number, alarm group, hotkey combinations, and such.
This client connects to a server using a TCP socket, which it uses to communicate this configuration to the server. In the next generation of this program, I'm considering moving all configuration information to the server, which stores all of its information in a SQL database.
I envision using some form of web interface to communicate with the server and set up the clients, rather than the current method, which is to either configure the client software on the machine through a control panel, or on install either push out an XML file or pass command-line parameters to the MSI. I'm thinking now the only information I would want to specify at install time would be the path to the server. Each workstation would be identified by computer name and configured through the server.
Are there any problems or potential drawbacks of this approach? The main goal is to centralize configuration and make it easier to make changes later, because our software is usually managed by one or two people at most.
Other than allowing the client to function offline (if such a possibility makes sense for your application), there doesn't appear to be any drawback to moving the configuration to a centralized location. Indeed, even with a centralized location, a feature can be added to the client to cache the last known configuration for use when the client is offline.
If you implement a centralized database design, I suggest considering storing configuration parameters in an Entity-Attribute-Value (EAV) structure, as this schema is particularly well suited to parameters. In particular, it allows easy addition and removal of individual parameters, and it also handles parameters as a list (paving the way for a list-oriented display in the UI, so no UI changes are needed when new types of parameters are introduced).
Another reason why configuration parameter collections and EAV schemas work well together is that, even with very many users and configuration points, the configuration data remains small enough that it doesn't suffer some of the limitations of EAV with "big" tables.
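A minimal sketch of such an EAV-style configuration table, using SQLite here as a stand-in for the real server database:

```python
# Hypothetical sketch: configuration stored as (entity, attribute, value) rows,
# so new parameter types need no schema change.
import sqlite3

db = sqlite3.connect("config.db")
db.execute("""CREATE TABLE IF NOT EXISTS config (
    workstation TEXT NOT NULL,   -- entity: identified by computer name
    attribute   TEXT NOT NULL,   -- parameter name, e.g. 'alarm_number'
    value       TEXT,
    PRIMARY KEY (workstation, attribute))""")

def set_param(workstation, attribute, value):
    db.execute("INSERT OR REPLACE INTO config VALUES (?, ?, ?)",
               (workstation, attribute, value))
    db.commit()

def get_params(workstation):
    rows = db.execute("SELECT attribute, value FROM config WHERE workstation = ?",
                      (workstation,))
    return dict(rows.fetchall())

set_param("WKS-01", "alarm_number", "42")
set_param("WKS-01", "hotkey", "Ctrl+Alt+A")
print(get_params("WKS-01"))   # {'alarm_number': '42', 'hotkey': 'Ctrl+Alt+A'}
```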
The only thing that comes to mind is security of the information, but you probably have that issue in either case. It would probably be easier to interface with a database, though, as everything would be in one spot.

In SQL Server 2008 how can I secure data in a way that it cannot be decrypted unless connected to a network?

We have recently implemented Transparent Data Encryption in SQL Server 2008 for local databases on our developers' laptops, to keep them protected in case a laptop is stolen or lost. This works fine.
Now we are trying to figure out a way to have the certificate expire every day, forcing an automated process (a logon script, maybe) to go out to a network path and grab a new certificate with an expiration a day later. This would ensure that if something unforeseen happened, the data would not be usable the next day.
I also looked into using a cryptographic provider, but there don't appear to be any "providers" out there. Maybe I'm wrong.
I am open to suggestions. If there is a better way please let me know. Thanks!
Short answer: No
Long answer: once a message (piece of data) is encrypted, the same key will decrypt that encrypted message, regardless of when the decryption algorithm is applied. If the key is changed every day, the data must be decrypted with the old key and re-encrypted with the new one. If this process doesn't occur (i.e. someone stops the piece of code that performs the re-encryption from running), the old key will still work. Even if you do create a cryptographic provider that checks the date, someone else can create a new provider that performs the decryption without first checking the date.
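To make the point concrete, a small sketch using a symmetric scheme (Fernet here, purely as an illustration): rotating the key only helps if every ciphertext is actually re-encrypted, because the old key keeps working on old ciphertext regardless of the date:

```python
# Hypothetical sketch: key rotation requires re-encrypting the data; the old
# key decrypts the old ciphertext no matter what day it is.
from cryptography.fernet import Fernet

old_key, new_key = Fernet.generate_key(), Fernet.generate_key()
ciphertext = Fernet(old_key).encrypt(b"sensitive data")

# Re-key step: decrypt with the old key, re-encrypt with the new one.
plaintext = Fernet(old_key).decrypt(ciphertext)
ciphertext = Fernet(new_key).encrypt(plaintext)

# If the re-key step is skipped, Fernet(old_key).decrypt(...) still succeeds.
```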
To address the question rather than the motivation: if you set up a Microsoft CA with a derived certificate template (set to expire after a day) and also allow autoenrollment on that certificate template, you could then place your SQL machine in an OU within the directory that uses autoenrollment (TechNet will give you resources on this; it requires the use of Group Policy). That way, when the certificate expires, the machine will automagically request a new one.
http://windowsitpro.com/article/articleid/40948/windows-server-2003-pki-certificate-autoenrollment.html
Mark
Not true! There are options available for SQL Server 2008 encryption. Check out the database encryption solutions here at TownsendSecurity.com. Townsend's Alliance AES Encryption is a NIST-certified solution that would put you into compliance with the regulations around health care, credit cards, and banking. Also see the white paper on Alliance AES Encryption.
Businesses with sensitive data in database applications want to encrypt the data in order to secure it from loss. Protecting sensitive data increases customer trust and loyalty, reduces legal liability, and helps meet regulatory requirements for data security. Examples of databases that might contain sensitive information are Oracle Database, IBM DB2, Microsoft SQL Server, MySQL, and Microsoft Access. Regardless of the disk or folder encryption technology that might be used, the actual data should be encrypted to prevent loss.
Full disclosure: I'm an intern at Townsend Security.
Without additional detail I fail to understand how your TDE setup will protect data in case it is lost or stolen.
If you are not using full disk encryption (via BitLocker, TrueCrypt, etc.), then I, as an attacker in physical possession of your hardware, can easily reset the local admin password, boot up the laptop and access the SQL Server instance with the local admin credentials. At that point I am a sysadmin on the database server and am able to extract any data I want or turn off TDE.
In addition since all of the encryption keys and certificates are stored locally it is relatively easy for an attacker in physical possession of the device to gain access to them. TDE is only meaningful for data protection when you physically separate the Database Encryption Key protectors (stored in the master database) from the encrypted database.
If you are using full disk encryption, then TDE is not providing any additional deterrent to an attacker and is only adversely affecting system performance on your developers' laptops.
You're right - what you want is a cryptographic provider, and you're right that there's none out there yet.
If you're going to the PASS Summit in November, talk to JC Cannon from Microsoft. He's doing a session on compliance, and he's the head of the SQL Server Compliance group. He's tied into the vendors that are currently working on building cryptographic providers, and he may be able to talk to you about vendor names. Right now they haven't come out publicly to announce who's doing it yet.