"Buffering" data entry into online form in case of disconnection (Racket) - mysql

My company has an existing framework for online medical data entry. We are now working with some doctors in China who are interested in using this framework; however, they have some concerns.
On the technical side, the online data-entry forms are written in Racket, and entries are saved into a MySQL database on a server in Europe. The doctors' concern is that in some hospitals the Internet connection can be unstable, so data that was just entered into a form might be lost.
So the question is: is there some way to buffer the data offline on the doctor's workstation before attempting a save to the MySQL database, in order to reduce the risk of data loss? My first instinct was to answer no: whatever measures the application takes, it is still a web application, so once the Internet connection breaks, the application has no way to save the data.
Am I right, or might there be a way of accomplishing this? Another idea was to create a completely new application for data entry that sends the data to the database only once entry is complete; that, however, is not feasible within the scope of this project.
Thanks in advance!

You can create a local MySQL database, have all the form data inserted into it, and then, once you are sure you have an Internet connection, export the data to the remote MySQL database, preferably from a separate GUI.
Take a look at this page http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html and check if it helps.
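The buffering itself doesn't have to mean a full local MySQL install; any local store that survives a dropped connection will do. Here is a minimal sketch of the buffer-and-flush pattern in Ruby, using the sqlite3 and mysql2 gems (the same idea maps onto Racket's db library); host, credentials, and table names are invented:

require "sqlite3"   # local buffer on the workstation
require "mysql2"    # remote store in Europe

# Every submission is written to a local SQLite file first, so nothing
# is lost if the connection drops mid-save.
local = SQLite3::Database.new("buffer.db")
local.execute <<~SQL
  CREATE TABLE IF NOT EXISTS pending (
    id      INTEGER PRIMARY KEY AUTOINCREMENT,
    payload TEXT NOT NULL
  )
SQL

def buffer_entry(local, json_payload)
  local.execute("INSERT INTO pending (payload) VALUES (?)", [json_payload])
end

# Flushing is attempted whenever connectivity is available; a buffered
# row is deleted only after the remote INSERT succeeds.
def flush_buffer(local)
  remote = Mysql2::Client.new(host: "db.example.eu",   # invented host
                              username: "entry", password: "secret",
                              database: "medical")
  insert = remote.prepare("INSERT INTO entries (payload) VALUES (?)")
  local.execute("SELECT id, payload FROM pending ORDER BY id").each do |id, payload|
    insert.execute(payload)
    local.execute("DELETE FROM pending WHERE id = ?", [id])
  end
rescue Mysql2::Error
  # Connection dropped: leave the remaining rows buffered and retry later.
end

Note that the caveat from the question still applies: this only helps if some component of the entry application actually runs on the workstation, since a plain web page loses its state along with the connection.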

Related

Ruby on Rails - Database or Excel

I am currently doing a project in Ruby on Rails and I have been presented with a dilemma.
The dilemma is that the users of my system will be uploading an Excel spreadsheet. Should I read straight from this spreadsheet into my front end, or should I load it into my MySQL database and serve the front end from there?
I have asked numerous people about this and researched online, to no avail.
Any help would be much appreciated.
An Excel file is not a database. If you need to accept it as source input, parse it, copy the data into a real database, and connect to that.
The database is more flexible and efficient for querying and processing information.
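A sketch of that parse-and-load step in Ruby, assuming the roo gem for reading the spreadsheet and an invented Expense model with an invented column layout:

require "roo"

# Parse the uploaded spreadsheet and copy each data row into the database.
def import_spreadsheet(path)
  sheet = Roo::Spreadsheet.open(path)
  (2..sheet.last_row).each do |i|        # row 1 is assumed to be the header
    name, amount, date = sheet.row(i)    # invented column layout
    Expense.create!(name: name, amount: amount, incurred_on: date)
  end
end

From there, all querying and processing runs against the database rather than the file.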
I can think of two considerations, one on each side of the question.
1) Having users upload the spreadsheet for back-end processing helps for tracking purposes (who sent what, and what the back end did with it). Also consider that other formats/versions could be introduced later; it may be important to keep the original files so you can identify what went wrong and answer "How can we handle this new format?"
2) Processing in the front end, on the other hand, offloads work from the back end, but the browser app could get fairly complex, and if the spreadsheet has many relationships, sending that data up to the server could be complicated. If it is simply a flat spreadsheet (plain rows, without totals, tax calculations, and the like), then loading it into the browser and sending those rows up to the server might be an advantage, if offloading processing matters at all.
Point 2 is really diluted by point 1, though, which to me is more important for any future migration of this service. So I would personally choose uploading the file and processing it on the back end.
Update
As you clarified in the comments, you are asking about using Excel on the back end as the database. There I would agree with Simone Carletti's answer: a real database gives you much more flexibility, more tools, and more performance. With Excel you are stuck loading a file, parsing it into some structure, and saving it back, whereas a database (MySQL, MongoDB, ...) gives you far more power in structuring and querying the data, with none of that file-handling headache. You might want to write a small sample both ways to evaluate; the DB solution will probably win you over.

Database migrations for MS Access

There are a lot of database migration tools available for Ruby, .NET, SQL Server, etc.
Is there anything good for Access/VBA? I've had to roll my own a few times, but I'd really like to offload that burden onto a well-written tool.
The ideal solution would be something like FluentMigrator or RikMigrations with classes or modules that contain DAO code.
When there are only new columns to add, I personally tend to do this through the user interface. I keep a temporary table in the back-end database that is never locked by any users; when creating a new column, I add it to this table first and double-check that all the properties are correct. Then, when the users are out of the back-end database, I copy and paste the column over, and let the users back in.
This way the back-end database is unavailable for the shortest possible time, and I am not rushed when creating the columns.
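For the roll-your-own route the question mentions, the core of a migration runner is small. Here is a sketch in Ruby against SQLite for brevity (an Access version would issue the same kind of DDL through DAO and track the version in a table the same way); table and column names are invented:

require "sqlite3"

# Ordered, versioned DDL: each statement runs exactly once per database.
MIGRATIONS = {
  1 => "CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT)",
  2 => "ALTER TABLE patients ADD COLUMN email TEXT"
}.freeze

def migrate(db)
  db.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
  current = db.get_first_value("SELECT MAX(version) FROM schema_version") || 0
  MIGRATIONS.select { |v, _| v > current }.sort.each do |version, ddl|
    db.execute(ddl)
    db.execute("INSERT INTO schema_version (version) VALUES (?)", [version])
  end
end

migrate(SQLite3::Database.new("backend.db"))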

Tracking data access

Backstory
I work for a company that has an online site that allows users to text personal information for collection. We collect the data and make it available online. Users can choose to share the data with other users.
Going Forward
At some point, this may become classified as an FDA-governed medical tool. In anticipation, we'd like to have a logging system in place that records each time someone accesses our users' data, whether it is the user themselves, another authorized user, or a support person.
Current Architecture
We are currently running Ruby/Rails, and using a MySQL database. The personal information is encrypted in the database.
Data Access for Support
Today, support personnel can access data one of three ways:
Admin site - The admin site is limited to whatever screens we develop. We don't log access currently, but we could easily add logging to keep an audit trail of who accessed which data through the admin tool.
SQL client - I use MySQL Workbench to access production. However, when connected this way, all personal information (user name, cell number, etc.) is encrypted.
Ruby/Rails console - Finally, support can log into one of the production boxes and use the Ruby/Rails console from the command line. Ruby will decrypt the data, so we can do simple things such as
u = User.find_all_by_state('active')
and it will return the record set of all users with state = 'active', with their personal information decrypted in the result set.
Holy Grail
logging
easy access for support
I'd love to have a way to give support easy access to the data (once authenticated), but log everything that is accessed (read or updated) to a place where I can't get in and clean the audit trail if, say, I'm checking out my buddy's ex-wife's data. (See Google firing a Gmail employee for an example of an employee breaching data policies.)
Anyone have ideas, thoughts, experiences, suggestions with this issue?
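One way to get both goals at once is to hook the auditing into the model layer and make the log table append-only at the database level. A minimal sketch with ActiveRecord; the Audited concern and AccessLog model are invented names, and the tamper-proofing comes from granting support's MySQL account INSERT-only rights on the log table:

# All names invented; sketch only.
module Audited
  extend ActiveSupport::Concern

  included do
    after_find { AccessLog.record(self, :read)  }   # every load is logged
    after_save { AccessLog.record(self, :write) }   # every update is logged
  end
end

class AccessLog < ActiveRecord::Base
  # columns: accessor, record_type, record_id, action, created_at
  def self.record(record, action)
    create!(accessor:    Thread.current[:accessor] || "unknown",
            record_type: record.class.name,
            record_id:   record.id,
            action:      action.to_s)
  end
end

class User < ActiveRecord::Base
  include Audited   # Rails console access now leaves a trail too
end

Because the console goes through the same models, support's ad-hoc lookups are logged as well; direct SQL access bypasses this, but the encryption already blunts that path.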
Hey devguy. This was an issue for me a couple of months back. We ended up centralizing our MySQL queries so that we could track all information coming in and out. Unfortunately the class I wrote is in PHP, but the idea behind it makes it very easy to start logging.
https://code.google.com/p/php-centralized-mysql-controller/
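The class itself is PHP, but the idea is language-neutral: route every statement through one choke point that records who ran what before executing it. A comparable sketch in Ruby with the mysql2 gem (names invented):

require "mysql2"

# A single choke point for all database traffic.
class LoggedClient
  def initialize(accessor, **mysql_opts)
    @accessor = accessor
    @client   = Mysql2::Client.new(**mysql_opts)
  end

  def query(sql, params = [])
    log = @client.prepare("INSERT INTO query_log (accessor, statement, run_at) VALUES (?, ?, NOW())")
    log.execute(@accessor, sql)
    @client.prepare(sql).execute(*params)
  end
end

db = LoggedClient.new("support_jane", host: "localhost", username: "app",
                      password: "secret", database: "prod")
db.query("SELECT * FROM users WHERE state = ?", ["active"])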
Try stored procedures. Make all code use stored procedures for CRUD activities. This defines an API that your developers can use, while business rules are globally enforced (don't return entire SSN values, only the last 4 digits, etc.).
This serves as the basis for an external API as well.
If you want logging/auditing, you put it in the procedure.
This protects you from everyone except the DBAs.

AIR application design

I would like to build an AIR application that would be used for tracking jobs inside a company.
The idea is to have one database that holds all the data; when users on other computers modify data, it is always saved to that same 'server'.
So more than one user can edit the same database, and it would be great if the data were constantly 'refreshed' (when one user edits and saves data, it is instantly updated on the other users' computers). The application would be used only on a local network.
I also have some data in Excel, so I wonder whether AIR can handle that somehow, or whether it is better to restructure the whole DB.
So, which kind of DB should I use? I've read that AIR likes SQLite very much, which would suit me because I already work with MySQL...
Is AIR (in combination with SQLite) able to handle ALL my needs (working over a network, sharing the same DB, refreshing data, creating server/client applications, etc.)?
Thank you very much for any thoughts!
m.
There are no restrictions on what database you can use. My advice would be to create an interface in PHP or ASP or whatever language you prefer (since the database is on a server elsewhere, you'll need some sort of network connectivity anyway), and send all requests and modifications through it.
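A sketch of such an interface in Ruby with Sinatra instead of PHP/ASP; the AIR client would call these endpoints over HTTP, and polling GET /jobs periodically gives you the 'constant refresh'. Table and column names are invented:

require "sinatra"
require "json"
require "mysql2"

# The AIR clients never touch the database directly; everything goes
# through this HTTP interface on the local network.
DB = Mysql2::Client.new(host: "localhost", username: "app",
                        password: "secret", database: "jobs")

post "/jobs" do
  job = JSON.parse(request.body.read)
  DB.prepare("INSERT INTO jobs (title, status) VALUES (?, ?)")
    .execute(job["title"], job["status"])
  { ok: true }.to_json
end

get "/jobs" do
  DB.query("SELECT id, title, status FROM jobs", as: :hash).to_a.to_json
end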

How to deploy a multiuser MS Access 2007 DB

I've created a database in Access 2007 that needs to be used by 3 users. I'm stuck because I don't know whether to place a copy on each user's computer or to put it on their SQL server. Placing it on the server would mean one access point, which is desirable for data consistency, but I don't know whether I need ActiveX, and I don't know how to use it either. If I place it on each computer, how do I work around master-file updates? Can somebody please break it all down? I'm a NEWBIE!!
Your question seems rather confused to me. You mention a SQL Server, but you don't say your application uses SQL Server for its data storage.
Thus, I can only assume that you have a single MDB or ACCDB file with your data tables and forms and reports all in the one file.
The only proper way to distribute this app is:
1) Split it into a front end (forms/reports/etc.) with linked tables that point to the back end (data tables only).
2) Place the back end on your file server and relink your tables to point to the new location of the back end.
3) Give a copy of the front end to each of the 3 users, who will run it from their desktop computers. If you're concerned about distributing changes to the front end, something like Tony Toews's front-end updater is very useful.
Others have jumped in to say that you should put the data in SQL Server, but most 3-user Access apps don't need the power of SQL Server. If you're not given administrative permissions on your SQL Server, it could be quite difficult to continue to alter your application's database.
On the other hand, if your database is going to grow to 1GB or more, or if you have strict security requirements, or if the data in your database is so important as to need completely failproof backups, then SQL Server would be a reasonable data store.
For most homegrown apps, not so much.
The "best" way I've worked this out is using linked tables in Access that point to SQL Server (since you stated you have that...?).
Using Access as a front end in this scenario isn't the best thing you can do, but with 2007 it's a bit better than if you were a few versions back. Check out this article for info on linking Access to SQL Server:
Import or link to SQL Server data
One easy way is to use the EQL Data plugin: http://eqldata.com
That way you can give a copy of the database to each user, and users can sync the database with the other users whenever they want. You can also access your tables and queries on the web.