I am currently working on an application that uses a SQL Server 2008 database hosted internally on a LAN. I have two problems related to managing the database:
Currently, I have two databases in SQL Server, one for test and one for production, and I copy tables, views, etc. between them when deploying changes. I'm assuming there is a better way to manage pushing changes from the test database to the production database; can anyone point me in the right direction here?
I do a good portion of my work remotely, so I have installed SQL Server 2008 Express on my laptop and run a third copy of the database locally. Is this the best option for doing remote work? The alternative I've been considering is to expose my test database to the web with a limited user that I could use when developing remotely. Is this feasible/recommended?
I have found that using my own local copy of SQL Server Developer Edition on my notebook is the best way to do dev work overall, with separate test and production databases on servers. I like keeping my local dev server so that I am never at the mercy of a connection to do dev work.
As a principle, I never expose SQL Server publicly, so working through a VPN is the only way I can access my typical test/production servers. If my dev server were there too, I would often be unable to do dev work when, for example, I am at a location where VPN pass-through is not permitted.
As for updating the production/test databases: I always generate change scripts whenever I change the dev server, and keep them organized so they can be applied to the test server and later to production. You can generate those scripts via SQL Server Management Studio or Visual Studio.
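For example, a change script for adding a column might look like this (a minimal sketch; the table and column names are hypothetical). Writing the script to be re-runnable makes it safe to apply to test first and production later:

    -- Add a column to dbo.Customer only if it is missing, so the script
    -- can be applied safely to both the test and production databases.
    IF NOT EXISTS (SELECT 1 FROM sys.columns
                   WHERE object_id = OBJECT_ID(N'dbo.Customer')
                     AND name = N'PreferredContactMethod')
    BEGIN
        ALTER TABLE dbo.Customer
            ADD PreferredContactMethod nvarchar(20) NULL;
    END
    GO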
Probably the cleanest, most repeatable way is to use a real build process for your database code and objects. First, put all your database code and objects in source control. Then use DBGhost to create upgrade scripts to get your production database upgraded. As part of this, DBGhost can also produce output that creates an empty dev database matching any given release. We have been using it for about three years now and wouldn't do it any other way. Check out their site for a full walkthrough. Well worth the money. Did I say it's well worth the money?
http://www.innovartis.co.uk/
I'm struggling to figure out how to properly test things on my local PC and then transfer that over to production.
So here is my situation:
I have a project in Node.js/TypeScript, and I'm using Prisma in it for managing my database. On my server I run a MySQL database, and for testing on my PC I have always just used SQLite.
But now that I want to use Prisma Migrate (because it's highly recommended for production), I can't, because I use different databases on my PC and on my server. So here comes my question: what is the correct way to test with a database during development?
Should I just connect to my server and make a test database there? Use VS Code's remote SSH feature to code directly on the server and connect to the database there? Install MySQL on my PC? What's the correct way to do it?
Always use the same brand and same version of database in development and testing that you will eventually deploy to. There are compatibility differences between brands: an SQL query that works on SQLite does not necessarily work the same on MySQL, and vice versa. Even data types and schema definitions aren't all the same between different SQL products.
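A small illustration of the divergence (hypothetical table; only the syntax matters here):

    -- SQLite accepts this:
    CREATE TABLE user (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT);
    -- MySQL needs this instead:
    CREATE TABLE user (id INT PRIMARY KEY AUTO_INCREMENT, name VARCHAR(255));
    -- String concatenation differs too: SQLite uses ||, MySQL uses CONCAT()
    -- (unless PIPES_AS_CONCAT mode is enabled).
    SELECT 'a' || 'b';        -- SQLite: 'ab'; default MySQL: logical OR, yields 0
    SELECT CONCAT('a', 'b');  -- MySQL: 'ab'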
If you use different SQL databases in development and production, you will waste a lot of time and grow gray hairs debugging problems in production while insisting, "it works on my machine."
This is avoidable!
When I develop on my local computer, I usually have an instance of MySQL Server running in a Docker container on my laptop.
I assume any test data on my laptop is temporary. I can easily recreate schema and data at any time, using scripts that are checked into my source control repo, so I don't worry about losing any data. In fact, I feel no hesitation to drop it and recreate it several times a week.
So if I need to upgrade the local database version to match an upgrade on production, I just delete the Docker container and its data, pull the new Docker image version, initialize a new data directory, and reload my test data.
Every step is scripted, even the Docker pull.
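As a sketch of what those checked-in scripts might contain (the schema and data here are hypothetical):

    -- schema.sql: recreate the throwaway dev schema from scratch.
    DROP DATABASE IF EXISTS app_dev;
    CREATE DATABASE app_dev CHARACTER SET utf8mb4;
    USE app_dev;
    CREATE TABLE customer (
        id   INT PRIMARY KEY AUTO_INCREMENT,
        name VARCHAR(255) NOT NULL
    );
    -- seed.sql: disposable test data, safe to reload at any time.
    INSERT INTO customer (name) VALUES ('Alice'), ('Bob');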
The caveat to my practice is that you can't necessarily duplicate the software if you use cloud databases, for example Amazon Aurora. There's no way to run an Aurora-compatible instance on your laptop (and don't believe the salespeople who say Aurora is fully compatible with MySQL; it's not). So you could run a small Aurora instance in a development VPC and connect to that from your app development environment, at least if your internet connection is reliable enough.
By the way, a similar rule applies to all the other technology you use in development: the versions of Node.js, Prisma, other NPM dependencies, HTTP and cache servers, etc. Even the operating system can be a source of compatibility issues, in which case you may have to develop in a virtual machine to match the OS to production exactly.
At one past job, I helped the developer team create what we called the "golden image": a pre-configured VM with all our software dependencies installed. We used this golden image both for the developer sandbox VMs and as the AMI from which we launched the production Amazon EC2 instances, so all the developers were guaranteed a test environment that matched production exactly. After that, if they had code problems, they could fix them in development with much higher confidence that the fix would work after deploying to production.
So, I basically made a Windows application using Visual Studio 2019, and used MySQL as my database to store records.
Now I want to publish that application and send it to a client, or try to run it on a different machine, but I think that since I have used localhost as my database connection it won't be able to read or write data in my database from another machine.
So basically I want to know how I can host my MySQL database on one machine and access that database from another machine using an application.
I am unable to find any sort of guide for this online; if someone can guide me or give me a reference where I can get information and solve my issue, I would appreciate it.
You can expose your localhost to be accessed remotely; however, this is not ideal or advised. I suggest you host your database on one of the virtual or cloud services and access it in your application.
Examples are services like https://remotemysql.com/ and https://www.db4free.net/, which I advise you use for testing purposes only. You can create your database and they will give you the connection parameters to fill in your connection string.
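If you do end up hosting MySQL on a machine of your own, the database side of letting another machine connect is an account that is not bound to localhost. A minimal sketch (user, password, and schema names are placeholders):

    -- Create an account the application can use from other hosts
    -- ('%' = any host; restrict to a subnet or specific IP where possible).
    CREATE USER 'appuser'@'%' IDENTIFIED BY 'use-a-strong-password';
    GRANT SELECT, INSERT, UPDATE, DELETE ON appdb.* TO 'appuser'@'%';
    FLUSH PRIVILEGES;
    -- Also set bind-address in my.cnf/my.ini so the server listens on the
    -- machine's network interface, and open TCP port 3306 in the firewall.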
Let me know if this is helpful.
I am just starting to use Electron and I have a question about databases. Due to the kind of app I am developing, I need a relational DB.
The idea is that when the user opens the app, if there is a connection it will store the data both locally and remotely. If the app is offline it will store the data locally, and whenever it is online again it will send that data to the remote server. Basically it will be the same database (local and remote), but it can work offline if necessary.
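What I have in mind is something like an "outbox" in the local database, just as a sketch (the table and column names here are made up):

    -- Local outbox: every write is queued with a synced flag; a background
    -- task replays unsynced rows against the remote server when online.
    CREATE TABLE outbox (
        id         INTEGER PRIMARY KEY,
        entity     TEXT NOT NULL,      -- e.g. 'customer'
        payload    TEXT NOT NULL,      -- JSON snapshot of the change
        created_at TEXT NOT NULL,
        synced     INTEGER NOT NULL DEFAULT 0
    );
    -- Pending changes to push once the connection comes back:
    SELECT * FROM outbox WHERE synced = 0 ORDER BY id;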
I am lost on which database would be better for this. As I said, I am using MySQL right now, and I know you can use MySQL with Node.js, so I might give it a try. Also, I am used to using MySQL within a backend language such as PHP; how would you do it in Electron?
Thanks for your time, guys!
I've developed an application using the Microsoft Sync Framework 2.1 SDK and my current deployment method has been:
1. Make a backup of the unprovisioned database from a development machine and restore it on the server.
2. Provision the server, followed by provisioning the client.
3. Sync the databases.
4. Take a backup of the synced database on the development machine and use that for the client installations. It is included in an InstallShield package as a SQL Server backup that I restore on the client machine.
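For reference, the backup and restore in steps 1 and 4 are standard SQL Server commands, sketched here with placeholder names and paths:

    -- On the development machine:
    BACKUP DATABASE AppDb TO DISK = N'C:\deploy\AppDb.bak' WITH INIT;
    -- On the target machine (server or client):
    RESTORE DATABASE AppDb FROM DISK = N'C:\deploy\AppDb.bak' WITH REPLACE;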
That works, but on the client machine I would now also like to create a separate test database using the same SQL Server backup, without doubling the size of the installation. That also works, but of course, because the client test copy is no longer synced with the test database on the server, it attempts to download all records, which takes many hours over slower Internet connections.
Because the integrity of the test database is not critical, I'm wondering if there's a way to essentially mark it as 'up to date' on the client machine without too much network traffic.
After looking at the way the tracking tables work, I'm not sure this is even possible without causing other clients to either upload or download everything. Maybe there is an upload-only option for a client that I've missed? That would suit this purpose fine.
Every time you take a backup of a provisioned database and restore it to initialize another client or replica, make sure you run PerformPostRestoreFixup after the restore and before you sync it for the first time.
After further analysis of the data structures used by Sync Framework, I determined there was no acceptable way to achieve the result I was seeking without sending a significant amount of data between the client and server, approaching what a 'proper' sync would have required.
Instead, I ended up including a separate test database backup with the deployment, so that the usual PerformPostRestoreFixup could be performed, followed by a sync in the normal manner, the same as I was handling the live database.
When I run VS2008 locally and open a package that points to a remote database, I believe that the data, from the input file to the database server, flows through my PC, even if the data file is on the database server.
However, if the SSIS package is stored in SQL Server and I start the job through SQL Agent, my PC is out of the picture, the data does not flow through my PC, and so I should see a significant performance boost.
Is this the case? I just want to confirm. Currently, I do not have permission to save a package on our development server, and I am considering requesting the rights to do so for the above reason, provided that it is a valid reason.
What kind of access does one need to be able to save SSIS packages on a SQL Server? Might there be a reason to deny me those rights, perhaps because granting such access would require an elevated access level that would also allow me to do other things the DBA might not want me to do? As a developer, I think I should be able to shuffle data from UAT, or some other non-production environment, into a DEV database without having to request that a DBA do it when he gets around to it.
Your understanding of where the package executes is correct, and performance will certainly be improved by moving execution to the server. At least, it will be if the server has more system resources than your workstation, especially RAM. And avoiding unnecessary use of the network helps too, of course.
There are specific roles in the msdb database for managing SSIS packages, so your DBA can let you deploy and run them without making you a sysadmin. However, as the documentation says, there is a possible privilege-escalation issue if you run the packages from Agent jobs, so the recommended solution is to create a proxy account.
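For example, granting a developer the ability to import and run their own packages stored in SQL Server comes down to msdb role membership; a sketch, with a placeholder login name:

    -- In msdb: db_ssisltduser lets a user manage and execute the packages
    -- they own, without sysadmin (SQL Server 2008 uses sp_addrolemember).
    USE msdb;
    CREATE USER [DOMAIN\developer] FOR LOGIN [DOMAIN\developer];
    EXEC sp_addrolemember N'db_ssisltduser', N'DOMAIN\developer';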