Some background:
We provide a complex system consisting of a large database and several programs, most written in C#, though some legacy applications still run on MFC.
Most of what we provide runs on a single server (running SQL Server 2005 and SQL Server Management Studio), but several applications can run on any number of client computers. Updating all of this is a real pain: once we update the database, outdated software is likely to break because of the schema changes. Updating the server software manually is one thing; making sure all the client software keeps working is practically impossible, and it will only get worse with time.
I am to write an updating service that can update the whole product: update the database, and reinstall services and applications. (However, only the programs/files/tables/etc. that were actually modified should be updated; downloading the whole product each time an update is available is not an option. Also, some computers may have only a subset of the available programs installed.)
First of all, is there already a good way of doing this? If something similar to ClickOnce that can also update databases is already out there, I'd much rather use that.
If not, what are the best practices when it comes to updating? Any and all material will be greatly appreciated.
Some updates will need to be installed on the server as soon as they have been submitted, without any user input. That includes a Windows service (which runs at all times) and any database changes. After these changes have been made, I will have to prevent any software that is not up to date from accessing the changed parts, or from running at all.
Any advice will be greatly appreciated - If I do have to write a system like that, I'd like to do it right.
Best practice would be to package the app up in an MSI and use Group Policy to push the updates out to each client.
If that's not possible, then you need some way of informing the client app that it is out of date (a simple check against a server holding the current version number would probably suffice) and having it refuse to work until an update patch is downloaded and installed - you could even launch that process from inside the app itself.
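For illustration, a minimal sketch of that startup check in C# (the URL and the plain-text response format are assumptions, not an existing API):

    // Minimal sketch of a startup version gate. The endpoint URL and the
    // plain-text response it returns are hypothetical; adapt to your server.
    using System;
    using System.Net;

    class VersionGate
    {
        static readonly Version LocalVersion = new Version(2, 4, 0);

        static void Main()
        {
            string body;
            using (var client = new WebClient())
            {
                // Hypothetical endpoint that returns e.g. "2.5.1" as plain text.
                body = client.DownloadString("https://updates.example.com/current-version");
            }

            var serverVersion = new Version(body.Trim());
            if (serverVersion > LocalVersion)
            {
                Console.WriteLine("This copy is out of date; launching the updater...");
                // e.g. System.Diagnostics.Process.Start("Updater.exe");
                return; // refuse to run until the update has been applied
            }

            // ... continue normal startup ...
        }
    }

The same check can gate individual features instead of the whole app, which matches your requirement of blocking access only to the parts that changed.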
This answer may help you. I haven't personally used WiX, but it seems to be along the lines of what you're looking for. Make sure to check out Lesson 4 in the linked tutorial, as it provides the details you would require.
I'm not sure where you would find best practices when it comes to updating, but in my personal opinion you shouldn't ever force a user to update unless staying out of date breaks the underlying application (as it does in your case). I would be very interested to hear if someone has a link to a list of best practices on this topic.
Edit
I was interested in possible best practices for updating, so I started another question thread here. The general consensus in the answers is "ask the user/client", but there may be some other details in the answers that help you. I'm afraid I can't find any actual hard rules on the subject anywhere (which I was expecting to).
I'm running Azerothcore-WOLTK inside a Docker container.
I would like to update the server since I read there's an important security fix.
However, I have never updated the server since I first installed it last year (December 2019). Since then, I have customized the server in several ways:
I have customized a few boss scripts to work properly with two players.
I have installed a few modules, including one that also required some extra code to be compiled, and some SQL queries to be run.
I have modified the database myself, adding Quests, NPCs, Vendors, and Items.
As such, I'm extremely concerned I would end up messing everything up, so I would appreciate your assistance on how to update the server to the latest version while keeping all the customization I have done.
I'm especially concerned about the database changes. As for the boss scripts, I figure I could back up the updated ones, do a git pull, and put them back before building (I should fork the repository going forward; I didn't think of that at the time)...
But in any case, I would be extremely thankful if you could guide me step by step along the way, considering I am using a Docker installation.
For anything database-related I use HeidiSQL, so I could use that for any database procedure. I'm not very proficient with SQL queries, but I should be able to import .sql files as needed.
I realize I'm asking a lot, so please don't feel pressured to answer right away. I will be most thankful if you could help me whenever you have the chance.
Thank you for your time :)
I'll try to answer all points you mentioned:
1. The boss scripts.
The worst that can happen is that you get merge conflicts while pulling the latest changes with git, which you would then have to resolve manually. That's not necessarily difficult, especially in your case: boss scripts are by nature quite self-contained, so you can be confident you won't break anything else while messing with them.
2. Modules
The modules should not be a problem at all. Modules exist for exactly this reason: they are isolated and don't cause issues when you update the core.
My only concern here is the module that required a core change. I don't know which module you installed, but normally this shouldn't happen: a proper AzerothCore module should not require any core change.
However, again, the worst you can get is some git merge conflicts, nothing too big I hope (it depends on how large and invasive the changes required by the module were).
3. Custom database changes.
The golden rule is: always store your custom SQL queries somewhere, in a form that can easily be re-applied. For example, always use DELETE before INSERT, prefer UPDATE when possible, etc.
So all you need is a file (or a set of files) containing all the SQL code for the custom changes you made. If you don't have one, you can still extract it from your DB.
Then you can always re-apply those files after you update your core, if needed. It might also turn out that you don't need to re-apply them at all, or you may want to start from a fresh AzerothCore world database and re-apply your changes. This really depends on the specific case, but either way you will be fine, as long as you keep your changes in SQL files.
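To make the idea concrete, a stored custom change might look like this (a sketch only: the entry ID and values are made up, and the real creature_template table has many more columns than shown):

    -- Re-runnable custom change: DELETE first so the script can be
    -- applied any number of times without duplicate-key errors.
    DELETE FROM `creature_template` WHERE `entry` = 900000;
    INSERT INTO `creature_template` (`entry`, `name`, `minlevel`, `maxlevel`)
    VALUES (900000, 'Custom Vendor', 80, 80);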
You can use Keira3 to edit your database, or just to extract your changes when you need to. For example, you can open an entity and copy its "full query".
Backup first
Before starting the upgrade procedure, create a backup of:
your DB
the source files that you have modified (e.g. the boss scripts)
Update frequently!
However, I have never updated the server since I first installed it last year (December 2019).
This is not recommended at all! You are supposed to update your AzerothCore frequently (at least once a week). There are a lot of good reasons to do so; one of them is that updating is much easier when you do it often.
How to update AzerothCore when using Docker
A generic question about updating AC with Docker has been asked already here: How to update azerothcore-wotlk docker container
I am trying to understand the trade-offs between going with MySQL or PostgreSQL on AWS.
Some considerations for me are that I am an amateur database user, so I need to be sure resources are available which allow me to overcome problems quickly. Along these lines, I bought the book 'PostgreSQL on the Cloud' and was all set to go with PostgreSQL since the book laid out a great use case.
One thing held me back, though: it is important for my work to be able to easily use Excel as a front end for importing and exporting data into and out of the database on AWS.
It looks like MySQL has an open extension that is fully integrated with Excel and is also well documented. My research into PostgreSQL uncovered a much more uneven integration with Excel, and a lot of long-standing community frustration that a closer integration has not already happened.
Right now, I am leaning to MySQL, but want to make sure I am not missing something.
Thanks!
Microsoft touts a PostgreSQL plugin as well: https://support.office.com/en-us/article/connect-to-a-postgresql-database-power-query-bf941e52-066f-4911-a41f-2493c39e69e4. Never used it, so can't comment on it.
You mention you are a beginner, so I'll add: be careful about security with either of these options. Both offer options to encrypt the channel between the client and the server, which you indicate is running on AWS. If the connection is not secured, anyone able to monitor it could extract credentials and do whatever they like to your AWS-hosted DB. Generally, cloud-hosted DBs should sit behind an authentication/authorization login process.
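For example, if you also connect programmatically, most client libraries let you require TLS in the connection string. A minimal C# sketch with MySQL Connector/NET (host, credentials, and database name are placeholders):

    // Sketch: refuse unencrypted connections from the client side.
    using MySql.Data.MySqlClient;

    class SecureConnectionDemo
    {
        static void Main()
        {
            var connectionString =
                "Server=mydb.xxxxxx.us-east-1.rds.amazonaws.com;" +
                "Database=mydb;Uid=appuser;Pwd=...;" +
                "SslMode=Required;"; // fail rather than fall back to plain text

            using (var conn = new MySqlConnection(connectionString))
            {
                conn.Open();
                // ... queries ...
            }
        }
    }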
I'm working on an eCommerce website for a small merchant. This merchant uses Opera (which is based on Visual FoxPro) to manage his in-store inventory, and would like the online store inventory to reflect the in-store inventory.
I'm guessing that my first step is to set up a way to regularly transfer the information from the VFP database to a MySQL database on the website's server. Is there an established process for this? Am I even approaching this problem from the right angle? I've heard a lot about ODBC, but am unsure as to how to implement it or if it's what I'm looking for in this situation.
If it wasn't obvious by this point, I'm in over my head here, and would appreciate any and all advice you may have, including links to articles or tutorials that can help improve my general understanding of all the moving parts here.
Thanks much.
A co-worker developed a synchronization process between VFP and MSSQL 2008: a WCF service that took input directly from VFP.
On another project, as far as I remember, when we tried the ODBC .NET data adapter, it had problems with encodings and foreign languages. That's why we used COM+ and serialization to communicate with .NET.
But it seems to me you are using PHP (eCommerce => Drupal => PHP), so you are in a completely different situation.
In your case, I would start by checking whether Opera (I guess it's this Opera) provides a built-in export and your eCommerce platform provides a built-in import, mostly because syncing data manually between two apps coded by someone else can be tedious work. Then I would research whether the export/import can be chained together and automated (something like a scheduled task in a Windows environment). Unfortunately, I can't help much more, because I'm unfamiliar with those tools, products, and technologies.
Anyway, it seems to me like quite a hard and dirty task, and I wish you good luck. :)
It depends on what you are using to implement the website. In general it is pretty easy with ODBC (in Java, I did it using the JDBC-ODBC bridge).
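For a .NET flavor of the same idea, here is a rough sketch that reads a VFP table through ODBC and mirrors it into MySQL (the driver name, paths, and table/column names are all assumptions about your setup):

    // Rough sketch: pull rows from a VFP table over ODBC and upsert them
    // into MySQL. Driver, paths, and table/column names are assumptions.
    using System;
    using System.Data.Odbc;
    using MySql.Data.MySqlClient;

    class InventorySync
    {
        static void Main()
        {
            using (var vfp = new OdbcConnection(
                @"Driver={Microsoft Visual FoxPro Driver};SourceType=DBF;SourceDB=C:\opera\data;"))
            using (var mysql = new MySqlConnection(
                "Server=shop.example.com;Database=store;Uid=sync;Pwd=...;"))
            {
                vfp.Open();
                mysql.Open();

                var read = new OdbcCommand("SELECT sku, qty FROM stock", vfp);
                using (var reader = read.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Upsert so the job can be re-run from a scheduled task.
                        var write = new MySqlCommand(
                            "INSERT INTO inventory (sku, qty) VALUES (@sku, @qty) " +
                            "ON DUPLICATE KEY UPDATE qty = @qty;", mysql);
                        write.Parameters.AddWithValue("@sku", Convert.ToString(reader["sku"]));
                        write.Parameters.AddWithValue("@qty", Convert.ToInt32(reader["qty"]));
                        write.ExecuteNonQuery();
                    }
                }
            }
        }
    }

Run from Windows Task Scheduler, something like this would give you the periodic one-way sync described above.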
I'm wondering if it is possible to have version control for a MySQL database.
I realize this question has been asked before; however, the newest answer is almost a year old, and at the rate things change...
The problem is that each developer has Apache/MySQL/PHP on their own computer, on which they sometimes edit the database. It's rather inconvenient if they have to send an email to all the other developers and then manually edit the test server's database.
How do you deal with this problem?
Thanks
This is not a MySQL-specific solution in itself, but we've had a lot of success with a product called Liquibase (http://www.liquibase.org/).
It's a migration solution which covers many different database vendors, allowing all database changes to be coded in configuration files, all of which are kept in Subversion. Since all configuration is kept in XML files, it's easy to merge other people's changes into the mainline script and it plays well with tags and branches.
The database can be brought up to the current revision level by running the "update database" command. Most changes can also be rolled back, which can be helpful too. I would recommend making sure you are up to date before running the migration, as that is likely to be the easiest path.
Finally, when it comes to a production delivery, you can have all the database changes output as a full SQL script, which allows DBAs to run it and maintain a separation of duties.
So far, it's worked like a charm.
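For a flavor of what those XML files look like, here is a minimal, made-up changeset; the table, column, and author names are illustrative only:

    <databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog">
      <!-- Each changeSet is applied once and tracked by its id/author pair. -->
      <changeSet id="42" author="jsmith">
        <addColumn tableName="customer">
          <column name="loyalty_points" type="int" defaultValueNumeric="0"/>
        </addColumn>
        <!-- An explicit rollback makes the change reversible. -->
        <rollback>
          <dropColumn tableName="customer" columnName="loyalty_points"/>
        </rollback>
      </changeSet>
    </databaseChangeLog>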
Well, we use Rails, which keeps every change in migration files. I know that a couple of PHP frameworks do the same thing - Symfony, for instance. So when all the changes are merged into our repository (we use Mercurial), we can see all the migrations that need to be, or were, applied to the development database. Then the person responsible for production rolls the code out to production after a full backup is made. However, if you don't use a framework that takes care of this, then awied's suggestion sounds very interesting - I hadn't heard of Liquibase before, but I will definitely check it out.
There is a tool called iBATIS, now called MyBatis, that handles database versioning perfectly.
It takes a little work to keep all your changes in scripts instead of making them with a graphical tool, but if you are familiar with coding, it's not a problem.
When you have multiple databases (like dev-test-prod), you just create three environment files, and you can update one environment with a single command-line instruction.
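Assuming this refers to the MyBatis Migrations tool, the environment files are plain .properties files; values below are placeholders:

    # environments/development.properties (placeholder values)
    driver=com.mysql.jdbc.Driver
    url=jdbc:mysql://localhost:3306/myapp
    username=dev
    password=secret

Then running "migrate up --env=development" applies any pending scripts to that environment.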
I have been asked by a friend to help him assess a number of quotes for porting a desktop application based on MS Access and VBA to a web-based app. The application seems to have a relatively large amount of business logic coded into the VBA.
My question is very specific - are there any good tools or resources out there that could assist with porting from Access, rather than doing a complete re-write?
The end technology used for the web app does not matter hugely, but would ideally be as mainstream as possible.
You may explore the possibilities offered by SharePoint. It may help you make the data accessible online, but how well that works also depends on how much VBA code was used in the Access application.
There are some tools around that claim they can convert MS Access to PHP/ASP websites, like DB Forms, but I haven't tried them, and they usually convert only the visible part of the app, not the queries and VBA.
They can be helpful to get started though.
Random thoughts
The VBA tends to be the biggest issue.
Moving to ASP.NET will take time, and you are faced with difficult choices:
transfer all the code to ASP.NET just to get it working, or
rethink the structure and do a proper ASP.NET implementation from scratch.
I'd prefer the first one: just try as hard as possible to get results fast.
Use SSMA to move the data to SQL Server (unless you want to keep Access as the backend).
Make the forms look the same as in your existing application (or at least serve the same function), port the VBA to VB.NET (or C# if you feel like it) form by form, module by module, and test that they work as you go along.
Don't try to refactor or make things better at this stage; the point is to 'slap' the old code onto the new 'system' and make it bark as it used to - not better, not worse.
Only then can you start refactoring and improving using the new tools at your disposal.
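To make the "literal first, refactor later" point concrete, here is a deliberately trivial, made-up VBA helper and its word-for-word C# port:

    // Hypothetical VBA original:
    //   Function LineTotal(qty As Long, price As Currency) As Currency
    //       LineTotal = qty * price
    //   End Function
    // Literal port: same name, same behavior; improve it only after the
    // whole application barks like it used to.
    public static class OrderRules
    {
        public static decimal LineTotal(int qty, decimal price)
        {
            return qty * price;
        }
    }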
I'm saying all this assuming that there was nothing terribly wrong with the old app and that it just needed to be ported for online consumption.
If the old app was defective and wasn't fulfilling its role, then more emphasis should be placed on rethinking which parts should be translated and which should be reworked.
At any rate, you need a detailed action plan, a review of the current code and functionality, and to limit your expectations for the first version of the new system as much as possible: avoid letting everyone add their wishes, or your project will become horrendously difficult.
Concentrate on the minimum needed to achieve a certain level of functionality that will satisfy your users, then build on that.
There may be tools for some of the basic stuff, like upsizing to a different database or perhaps reproducing the look of the forms and their text boxes, but as for converting what sounds like a lot of VBA code, I'm not so sure.
Is this an intranet/local network type of web app or are you putting it out on the internet? Security will become a major difference between this and your Access app.
Make sure they understand Access/VBA so you can preserve the business logic that has been built up over the life of the Access app.
Convince your friend to stop/slow any development on the Access app to prevent the company from aiming at a moving target. This may not be realistic, but really needs to be considered.
Is there a reason why hosting the app on Windows Terminal Server would not suffice? This means zero changes to the app, no reprogramming cost and no danger of losing crucial business logic. If you use the Citrix extensions, you can run it in a web browser (though I guess that only works with IE -- I've never used them). But the RDP client comes in versions for Mac and Linux as well as Windows, so you can basically support anybody as long as they install the RDP client for their OS.
Yes, it's more installation on the client end, but it's a helluva lot cheaper and easier on the development side, and it avoids the problem of losing important things coded into the Access app.
Of course, supporting large user populations on WTS/Citrix can get expensive, and if the Access app is in need of re-engineering anyway, that can change the balance. But it's something you should consider. WTS is really easy to set up, in fact, and provisioning a server for it is basically a matter of adding RAM and internet bandwidth (though RDP is really efficient to begin with).
One key mistake many people make when trying to run an Access app on WTS:
YOU MUST SPLIT THE DATABASE (a front end with forms/reports/etc., and a back end with data tables only), and each user must have their own copy of the front end (stored in their user profile on the WTS, or in a folder on your WTS server's data partition with appropriate permissions assigned to the user groups authorized to use the app). Tony Toews's front-end updater is very useful in this context and is explicitly engineered to work in a Terminal Server environment.