Visual Basic: Best way to share data/variables over network? - mysql

My Visual Basic project involves two applications (server and client, if you will). The "Server" gathers data from a sensor and the "Client" must somehow get this information and display it.
My question is:
What's the best way to get the data from the server to the client? The first thing that comes to mind is storing the information in an SQL DB and having the "Client" pull the data from the DB.
It is worth noting the "Server" and "Client" will eventually be networked through a WAN and NAT...
The data from the sensor is very small: two separate integers, that's it. So an SQL DB seems like overkill for storing two integers. Plus, the hardware I'm running these on will not be very powerful, i.e., 1 GB of RAM and a 2 GHz CPU.
Thanks :)

If the data is not sensitive and you don't mind it being publicly accessible, the server could run a small web server (IIS or something similar) and write the data to a file on that web server.
The client would then download the file by simply visiting that web address and parsing the file.
If you need a level of authentication, you could store the data in a file which is not publicly accessible and then write an ASP/ASP.NET page which accepts an HTTP POST containing a password, then reads the file and sends its contents as the response.
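For illustration, the client side of the unauthenticated variant could be as small as this; a minimal VB.NET sketch, assuming a hypothetical URL and that the file holds the two integers separated by a comma:

    Imports System
    Imports System.Net

    Module SensorClient
        Sub Main()
            ' Hypothetical address; replace with your own server's URL.
            Dim url As String = "http://myserver.example.com/sensor.txt"
            Using wc As New WebClient()
                ' Download the file the server wrote and parse the two integers.
                Dim raw As String = wc.DownloadString(url)
                Dim parts() As String = raw.Split(","c)
                Dim first As Integer = Integer.Parse(parts(0))
                Dim second As Integer = Integer.Parse(parts(1))
                Console.WriteLine("Sensor values: {0}, {1}", first, second)
            End Using
        End Sub
    End Module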

I have decided to try a P2P connection between the "Server" and "Client". This seems to be working on a LAN, but I'm yet to test it through NAT. Obviously I will have to do some basic port forwarding to get this to work. A sketch of the approach is below.
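For reference, a minimal VB.NET sketch of the direct TCP approach; the port, the server address, and the sensor values are placeholders:

    Imports System
    Imports System.IO
    Imports System.Net
    Imports System.Net.Sockets

    Module P2PSketch
        Sub Main(args As String())
            ' Run with "server" on the sensor machine, with no argument on the client.
            If args.Length > 0 AndAlso args(0) = "server" Then ServeOnce() Else FetchOnce()
        End Sub

        ' "Server" side: accepts one connection and sends the two integers.
        Sub ServeOnce()
            Dim listener As New TcpListener(IPAddress.Any, 5000) ' port 5000 is an assumption
            listener.Start()
            Using client As TcpClient = listener.AcceptTcpClient()
                Using writer As New BinaryWriter(client.GetStream())
                    writer.Write(42) ' first sensor value (placeholder)
                    writer.Write(7)  ' second sensor value (placeholder)
                End Using
            End Using
            listener.Stop()
        End Sub

        ' "Client" side: connects and reads the two integers back.
        Sub FetchOnce()
            Using client As New TcpClient("192.168.1.10", 5000) ' server address is an assumption
                Using reader As New BinaryReader(client.GetStream())
                    Dim first As Integer = reader.ReadInt32()
                    Dim second As Integer = reader.ReadInt32()
                    Console.WriteLine("Received: {0}, {1}", first, second)
                End Using
            End Using
        End Sub
    End Module

Through NAT, the client would connect to the router's public IP and the forwarded port instead of the LAN address.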

Related

Transmitting Data Between Websites Via The Internet

I've edited this question. I hope this version is a bit more clear.
I am seeking to have a programmer build a process for me. I need to ensure what is recommended is a best practice for the below process.
Here are the steps I need to have built:
Have an HTTPS web form on my server that submits client-inputted data into a database on my server. The data is personally identifiable information and needs to be securely transmitted in the next step.
Once the data is loaded into my database, I need to transfer it in an encrypted JSON format to a third-party server. The third party will decrypt the data, score it, and send it back to my server encrypted.
While the data is being sent and scored by the third-party, the client will see a browser screen indicating processing...
Once the data is scored and sent back to my server, it will be decrypted, and the client's browser will be updated with options based on the score given by the third party.
Based on what I understand, I think an API on both my server and the third-party server might be best.
What is the best practice approach for the above process?
Below are some questions I have which would be very helpful for me to understand in your response.
1) Is the API approach the best?
2) What process is used by the third party to decrypt the data I send, and vice versa? How do I prevent others from decrypting the data if it is intercepted?
3) While the data is being scored by the third party, the client's browser will show a processing screen. From a web development standpoint, how does this work? Also, how exactly is the processing screen triggered to update with results in the client's browser when the data is sent back from the third party?
The file you will be transmitting is, as you mentioned, encrypted, so the exact format depends entirely on the encryption algorithm you use. Encrypted data is generally transported as Base64 or hex, so after encryption the data will be passed along in one of those formats.
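For example, with AES the encrypt-then-Base64 step might look like this in VB.NET; a sketch only, with placeholder key material, since the algorithm and key handling are up to you:

    Imports System
    Imports System.Security.Cryptography
    Imports System.Text

    Module CryptoSketch
        Sub Main()
            Dim key(31) As Byte ' 256-bit key; all zeros here, a real key is an assumption
            Dim iv(15) As Byte  ' 128-bit IV; likewise a placeholder
            Console.WriteLine(EncryptToBase64("sample payload", key, iv))
        End Sub

        ' Encrypts a string with AES and returns the ciphertext as Base64,
        ' the transport format mentioned above.
        Function EncryptToBase64(plainText As String, key As Byte(), iv As Byte()) As String
            Using aes As Aes = Aes.Create()
                aes.Key = key
                aes.IV = iv
                Using encryptor As ICryptoTransform = aes.CreateEncryptor()
                    Dim plainBytes As Byte() = Encoding.UTF8.GetBytes(plainText)
                    Dim cipherBytes As Byte() = encryptor.TransformFinalBlock(plainBytes, 0, plainBytes.Length)
                    Return Convert.ToBase64String(cipherBytes)
                End Using
            End Using
        End Function
    End Module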
To answer your second question, "how will the receiving website receive the file?", there are several ways you can do this:
You can share the backend database your website is using; the data is then just a simple query away (by "shared" I mean both websites use the same database).
Another way of achieving this is to use an API which can store your data and can be used globally from any application that calls it.
Or you can set up a simple PHP server on your machine and send data between websites using HTTP GET or POST requests (a sketch of the sending side follows this list).
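A minimal sketch of the sending side in VB.NET, assuming a hypothetical receiver.example.com/ingest.php endpoint that accepts a POSTed form field:

    Imports System
    Imports System.Collections.Specialized
    Imports System.Net
    Imports System.Text

    Module SendData
        Sub Main()
            Using wc As New WebClient()
                Dim form As New NameValueCollection()
                ' The payload here is assumed to be the Base64-encoded ciphertext.
                form.Add("payload", "U2FtcGxlIGNpcGhlcnRleHQ=")
                Dim response As Byte() = wc.UploadValues("https://receiver.example.com/ingest.php", "POST", form)
                Console.WriteLine(Encoding.UTF8.GetString(response))
            End Using
        End Sub
    End Module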
Also, avoid using unnecessary tags like web-development-server, data-transfer, or transmission; these are unrelated to your question. You should only use tags that relate to your question; a simple web-development tag would be enough.
Also, edit your question so we can understand it properly: what problems are you facing? What have you tried? What do you expect from us in an answer?
Please clarify your question further.
Your mental model of files being sent around is somewhat off: in most cases none of this is ever written to disk, so there is no JSON file with a file name, and the data is not encrypted directly but pushed through an encrypted channel. Most commonly, both sides use HTTPS or WSS as the protocol, which encrypts and decrypts the exchanged data transparently (all by itself). Depending on the protocol in use, this requires either a combination of client and server, server and server, or a P2P network.
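To make that concrete, here is a minimal VB.NET sketch of pushing JSON through an HTTPS channel; the endpoint and field names are hypothetical, and note that the JSON never exists as a file:

    Imports System
    Imports System.Net
    Imports System.Text

    Module JsonOverHttps
        Sub Main()
            ' The JSON is built in memory; TLS encrypts it in transit,
            ' so nothing is written to disk or encrypted by hand.
            Dim json As String = "{""name"":""Jane Doe"",""score"":0}"
            Using wc As New WebClient()
                wc.Headers(HttpRequestHeader.ContentType) = "application/json"
                Dim reply As String = wc.UploadString("https://scoring.example.com/api/score", json)
                Console.WriteLine(reply)
            End Using
        End Sub
    End Module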
Further reading: Internetworking Basics - Computer and Information Science.

Database Security when hosted on client

I have a database along with a REST API for clients to access the data. For performance and other reasons, I need to move the application along with the data to the client's physical server. Is there a way for me to encrypt the data in the database, so that the only way the client can get access to it is through the API that I expose, and not by cracking MySQL open and reading the raw data? I do not want the client to see the data stored in my DB, as I feel they would steal it or share it. What can I do to accomplish that?
One idea:
Is it possible to implement some form of one-way encryption, where it's based on the lookup value provided in the API?
E.g., an API lookup by email: the email is one-way hashed, compared against the DB for a match, and the matching record is returned. This way, if they happen to look at my database, they cannot see a list of emails; all they see is data that looks something like an /etc/passwd file. A sketch of the idea is below.
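For concreteness, the idea described above would look roughly like this in VB.NET; a sketch only, with a hypothetical salt value and table layout:

    Imports System
    Imports System.Security.Cryptography
    Imports System.Text

    Module HashedLookup
        Sub Main()
            ' "pepper123" is a placeholder secret, not a recommendation.
            Console.WriteLine(HashEmail("user@example.com", "pepper123"))
        End Sub

        ' One-way hash of the lookup key; the table would store only this
        ' hash, never the plain email, and the API would query by it.
        Function HashEmail(email As String, salt As String) As String
            Using sha As SHA256 = SHA256.Create()
                Dim bytes As Byte() = Encoding.UTF8.GetBytes(salt & email.ToLowerInvariant())
                Dim hash As Byte() = sha.ComputeHash(bytes)
                Return BitConverter.ToString(hash).Replace("-", "")
            End Using
        End Function
    End Module

Note that this only obscures the lookup column; the rest of each record is still readable on disk.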
No.
From the 10 Immutable Laws of Security
Law #3: If a bad guy has unrestricted physical access to your computer, it's not your computer anymore
What you want is fundamentally impossible, without caveats. Always and everywhere.

Reliability in Android when the connection is off

I'm developing an app where I store my data in an online DB using HTTP POST and GET.
I need to implement some reliability in my software, so that if the user presses the button and there is no connection, the data is stored in something (a file? SQLite?) and then, when the connection is back on, the HTTP request is sent.
Any advice or pieces of code to show me how to do this?
Thanks.
Sounds good and pretty straightforward to me. Just go for it.
You use a local SQLite DB as a "cache". To keep it simple, do not implement any logic about it in your app's normal code; just use the local DB. Then, separately, you code a synchronizer. That one checks for an online connection and synchronizes the local SQLite database with a remote database, perhaps MySQL.
This should be perfectly fine for all applications that do not require immediate exchange of the data with other processes all the time.
There is one catch, though: the low performance of SQLite on bigger data sets. That is an issue with all single-file database solutions. So this approach is probably only valid for small data sets in total, or if you can limit the local database to a part of the total data, perhaps only the time-critical stuff.
Another workaround might be to use joins over two separate databases, the local and the remote one. But such things really boost the complexity of the code, so think thrice about whether that is really required.
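A minimal sketch of the cache-plus-synchronizer idea, shown here in VB.NET for illustration (on Android the same pattern applies with its own SQLite API); it assumes the third-party System.Data.SQLite provider, a hypothetical pending table, and a hypothetical endpoint:

    Imports System
    Imports System.Collections.Generic
    Imports System.Data.SQLite ' third-party System.Data.SQLite package
    Imports System.Net

    Module Synchronizer
        Sub Main()
            SyncPending() ' in a real app this runs from a timer or connectivity callback
        End Sub

        ' Pushes cached rows to the remote endpoint, deleting each one
        ' locally once its upload succeeds; stops at the first failure.
        Sub SyncPending()
            Dim rows As New List(Of KeyValuePair(Of Long, String))()
            Using conn As New SQLiteConnection("Data Source=cache.db")
                conn.Open()
                Using cmd As New SQLiteCommand("SELECT id, payload FROM pending ORDER BY id", conn)
                    Using reader = cmd.ExecuteReader()
                        While reader.Read()
                            rows.Add(New KeyValuePair(Of Long, String)(reader.GetInt64(0), reader.GetString(1)))
                        End While
                    End Using
                End Using
                For Each row In rows
                    Try
                        Using wc As New WebClient()
                            wc.UploadString("https://example.com/api/submit", row.Value)
                        End Using
                    Catch ex As WebException
                        Exit For ' still offline; leave the row cached and retry later
                    End Try
                    Using del As New SQLiteCommand("DELETE FROM pending WHERE id = @id", conn)
                        del.Parameters.AddWithValue("@id", row.Key)
                        del.ExecuteNonQuery()
                    End Using
                Next
            End Using
        End Sub
    End Module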

MySQL - Different encryption for different clients?

In my scenario, all the clients' MySQL data resides on the service provider's server. How can I let the clients choose their own encryption, so that the data still resides on the service provider's server but the service provider can't read it?
In the end, your users will have to trust you, simply because the pages and the code in the pages they run originate from you as well. You can try to protect them against any mishap except your own. Technically speaking, that is; you can always have yourself audited, of course.

Best practice: MySQL remote mobile device sync over a 3G connection

Currently we have one master MySQL server that connects every hour to 100 remote mobile devices [vehicles] over a 3G connection [not very reliable: a few cars get disconnected daily while a sync is in progress]. The sync is done through a .NET Windows service tool. After checking the remote MySQL status, the master starts performing the sync. Sometimes the sync payload is about 6-8 MB. The sync is performed for one table only, using a non-transactional approach.
The MySQL server version in use is 4.1.22.
Questions:
1) Is it useful to make the sync transactional, knowing that only one table is being synced, or is there no value added?
2) The sync data is loaded onto the remote machine using the MySQL statement:
LOAD DATA LOCAL INFILE
The file format is CSV. How can I send the data in a compressed format, without developing a tool that resides on the remote device?
3) Is it good practice or architecture, in the sync domain, to deploy a remote application that performs the sync after the data has been sent, or should it be done directly by the master? I mean that a tool residing on the remote machine would be difficult to update or fix when new requirements appear, but it would save a lot of bandwidth during the sync and would eliminate the errors that can arise from a live master sync when a disconnection occurs while the sync is in progress. So if this is the recommendation, then only compressed data would be sent, and by using some sort of checksum I'd verify that all the data arrived; otherwise the request would be initiated again.
please share your thoughts and experience.
thanks,
Firstly, I would change the approach to a client-initiated sync rather than a server-initiated sync. A many-to-one approach will scale much more easily than your current one-to-many setup. My comments above give a few good examples of the required client-to-server syncing.
Secondly, turn on transactional record entry. There is no reason not to have it. This will guarantee that the information gets entered in a timely fashion and may even provide more 'meta-data' (such as which clients are slow to update, etc.).
Lastly, you can enhance the uploading by taking a different look at it. If you implement a service on the server side that accepts a POST from the client, you can send the data to the server with no issues; it would be just like uploading a file to a server. Once your 6-8 MB file is uploaded, it is then put into the database. The great thing about this is that if your server is Apache (or, in your case, IIS), every client can upload data at the same time without much of an issue. At that point, inserting into the MySQL server takes virtually no time and your process continues without a problem.
This is the way I'd handle your situation...
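For illustration, the client-initiated upload might look like this in VB.NET; a sketch, assuming a hypothetical upload endpoint that decompresses the CSV and runs LOAD DATA LOCAL INFILE on the server side:

    Imports System
    Imports System.IO
    Imports System.IO.Compression
    Imports System.Net
    Imports System.Security.Cryptography
    Imports System.Text

    Module CompressedUpload
        Sub Main()
            ' Gzip the CSV in memory before sending; 6-8 MB of CSV typically
            ' shrinks considerably, which matters on an unreliable 3G link.
            Dim csvBytes As Byte() = File.ReadAllBytes("sync.csv")
            Dim compressed As Byte()
            Using ms As New MemoryStream()
                Using gz As New GZipStream(ms, CompressionMode.Compress)
                    gz.Write(csvBytes, 0, csvBytes.Length)
                End Using
                compressed = ms.ToArray()
            End Using
            Using wc As New WebClient()
                wc.Headers(HttpRequestHeader.ContentEncoding) = "gzip"
                ' An MD5 of the raw CSV lets the server verify the whole
                ' payload arrived, as the checksum idea in the question suggests.
                Using md5 As MD5 = MD5.Create()
                    wc.Headers("X-Payload-MD5") = BitConverter.ToString(md5.ComputeHash(csvBytes)).Replace("-", "")
                End Using
                Dim reply As Byte() = wc.UploadData("https://master.example.com/upload", compressed)
                Console.WriteLine(Encoding.UTF8.GetString(reply))
            End Using
        End Sub
    End Module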