My site is mostly static. It is based on WordPress at the moment, and I am thinking of using the autoscaling feature.
The problem is that I am not good at writing startup scripts in languages like Python, Java, etc.
I am more comfortable with bash scripts.
Is there a way to create a snapshot of a production Compute Engine instance and use it as a template for the instance group, without the complexity of a startup script?
I have two instances: one standalone instance and one inside an instance group for autoscaling. Whenever there is an update to my site, I apply the change to the standalone instance and then use a snapshot of its disk as the template for the instance group, so everything gets updated.
My question is: is that workable, or do I really have to work on a startup script?
Any friendly advice will be highly appreciated.
Some bash skills should be enough to write a startup script, so you would not need the additional instance and image creation at all. See the documentation for a simple example: just put in all the bash commands you currently run to prepare the instance yourself. This is relatively easy and makes it simple to modify the process later on, for example along the lines of the sketch below.
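A minimal sketch of what such a startup script could look like for a WordPress-style instance; the package names, bucket name, and paths are assumptions to adjust to your setup:

    #!/bin/bash
    # Hypothetical startup script for a Debian-based WordPress instance.
    # Package names, bucket name, and paths are assumptions.
    apt-get update
    apt-get install -y apache2 php php-mysql
    # Pull the current copy of the site from a bucket you maintain (assumed name).
    gsutil -m rsync -r gs://my-wordpress-bucket /var/www/html
    systemctl restart apache2

You would attach this to the instance template (for example via the startup-script metadata key), and every new instance in the group would then prepare itself the same way.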
If you really want to avoid writing the script, what you’ve described should be possible: take an instance that has everything installed the way you like it, then delete it while keeping the disk, and create an image out of that disk.
One minor improvement: you can use an instance from the existing instance group by abandoning it, roughly like this:
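The gcloud commands involved would look roughly like this; the instance, group, zone, and image names are placeholders, and it assumes the boot disk has the same name as the instance (the default):

    # Abandon one instance from the managed instance group (it keeps running).
    gcloud compute instance-groups managed abandon-instances my-group \
        --instances=my-instance --zone=us-central1-a
    # Delete the instance but keep its boot disk.
    gcloud compute instances delete my-instance --zone=us-central1-a --keep-disks=boot
    # Create an image from that disk and build a template on top of it.
    gcloud compute images create my-site-image \
        --source-disk=my-instance --source-disk-zone=us-central1-a
    gcloud compute instance-templates create my-site-template --image=my-site-image

Then point the instance group at the new template and recreate its instances whenever you roll out an update.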
I'm running AzerothCore-WoTLK inside a Docker container.
I would like to update the server since I read there's an important security fix.
However, I have never updated the server since I first installed it last year (December 2019). Since then, I have customized it in several ways:
I have customized a few boss scripts to work properly with two players.
I have installed a few modules, including one that also required some extra code to be compiled, and some SQL queries to be run.
I have modified the database myself, adding Quests, NPCs, Vendors, and Items.
As such, I'm extremely concerned I would end up messing everything up, and I would appreciate your assistance on how to update the server to the latest version while keeping all the customization I have done.
I'm especially concerned about the database changes. I figure I could back up the updated boss scripts, do a git pull, and put them back before building (I should fork the repository afterwards; I hadn't thought of that)...
But in any case, I would be extremely thankful if you could guide me step by step along the way, considering I am using a Docker installation.
For anything database-related I use HeidiSQL, so I could use that for any database procedure. I'm not very proficient with SQL queries, but I should be able to import .sql files as needed.
I realize I'm asking a lot, so please don't feel pressured to answer right away. I will be most thankful if you could help me whenever you have the chance.
Thank you for your time :)
I'll try to answer all points you mentioned:
1. The boss scripts.
The worst thing that can happen is that you get merge conflicts while pulling the latest changes with git, in which case you have to resolve them manually. That's not necessarily difficult, especially in your case: boss scripts are by nature quite self-contained, so you can be fairly sure you won't break anything else when touching them. The flow looks roughly like this:
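A rough sequence, assuming your customizations live directly in your local checkout of the azerothcore-wotlk repository; the file path is a placeholder:

    # Fetch and merge the latest AzerothCore changes.
    git pull origin master
    # If git reports conflicts, list the affected files.
    git status
    # Edit each conflicted file by hand, keeping your custom boss-script changes,
    # then mark it as resolved and finish the merge.
    git add path/to/conflicted_boss_script.cpp
    git commit

After that you rebuild the server as usual (with Docker, that means rebuilding the images, as described in the link at the end of this answer).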
2. Modules
The modules should not be a problem at all. Modules exist exactly for this reason: they are isolated and don't cause issues when you update the core.
My only concern here would be the module that required a core change. I don't know which module you installed, but normally this shouldn't happen: a proper AzerothCore module should not require any core change.
However, again, the worst thing you can run into is some git merge conflicts, hopefully nothing too big (it depends on how large and invasive the changes required by the module were).
3. Custom database changes.
The golden rule is: always store your custom SQL queries somewhere, in a way that they can easily be re-applied. For example, always use DELETE before INSERT, prefer UPDATE where possible, etc.
So all you need is a file (or a bunch of files) containing all the SQL code corresponding to the custom changes you made. If you don't have it, you can still extract it from your DB.
Then you can always re-apply those files after you update your core, if you feel it's needed. It might also be the case that you don't need to re-apply them at all, or maybe you want to start from a fresh AzerothCore world database and re-apply your changes. This really depends on the specific case, but either way you will be fine, as long as you keep your changes in SQL files.
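As an illustration of the DELETE-before-INSERT idea, here is a minimal sketch of how a re-applicable change could look and how to apply it; the database name, credentials, entry value, and the shortened column list are all assumptions:

    # Re-apply a custom change in a way that is safe to run repeatedly.
    mysql -u acore -p acore_world <<'SQL'
    DELETE FROM creature_template WHERE entry = 9000001;
    INSERT INTO creature_template (entry, name) VALUES (9000001, 'My Custom NPC');
    SQL

Because the DELETE always runs first, importing the same file twice does not create duplicates or errors.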
You can use Keira3 to edit your database, or just extract your changes in case you need to. For example, you can open an entity and copy its "full query".
Backup first
Before starting the upgrade procedure, create a backup of:
- your DB
- the source files that you have modified (e.g. bosses, etc.); see the sketch below for one way to do both
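One way to take both backups, assuming the default AzerothCore database names and that your modified sources live in an azerothcore-wotlk checkout; the paths and credentials are assumptions:

    # Dump the three AzerothCore databases (default names assumed).
    mysqldump -u acore -p acore_auth > acore_auth_backup.sql
    mysqldump -u acore -p acore_characters > acore_characters_backup.sql
    mysqldump -u acore -p acore_world > acore_world_backup.sql
    # Archive the source files you have modified (path is a placeholder).
    tar czf modified_sources_backup.tar.gz azerothcore-wotlk/src/server/scripts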
Update frequently!
However I never updated the server since I first installed it last year (December 2019).
This is not recommended at all! You are supposed to update your AzerothCore frequently (at least once a week). There are a lot of good reasons to do so; one of them is that the process is much easier when you do it often.
How to update AzerothCore when using Docker
A generic question about updating AC with Docker has already been asked here: How to update azerothcore-wotlk docker container
I'm creating a JavaFX app using a WAMP server for the MySQL database. How can I make automatic backups to a specific location, even as the database grows large? What's the best practice in this case?
What can I do in Java for this?
I would suggest creating a cron job for this.
You can either invoke the mysqldump tool directly or write a script in any language that exports exactly the data you want, which requires more work but is more flexible. For example:
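A minimal sketch of the direct mysqldump approach; the database name, credentials, and backup path are assumptions:

    #!/bin/bash
    # backup_db.sh: dump the database to a dated, compressed file.
    mysqldump -u backup_user -p'secret' my_app_db \
        | gzip > /var/backups/my_app_db_$(date +%F).sql.gz

A crontab entry such as 30 2 * * * /path/to/backup_db.sh (added with crontab -e) would then run it every night at 02:30.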
Alternatively, you can search the Internet for a ready-made tool (like this one, which I just found but haven't checked whether it works). I'm pretty sure you will find something, since it's a pretty common thing to do.
I am working out how to synchronize WordPress installations where both can be updated simultaneously and both can work offline, then come online to sync.
I think the easiest way to sync posts between sites is to include the site id in the primary key of the posts. That way, any post is identified by an incremental id plus the id of the server it was created on.
Is this possible to achieve with a plugin?
What dangers lie ahead if I pursue this path?
Is there a better, alternative way to achieve what I am trying to achieve?
It is possible in several ways:
- Write a routine in the first WordPress installation's PHP files that inserts the content into the other database whenever something is written to it. This one probably won't work offline.
- Write a function that compares the two databases at a scheduled time using a simple SQL query and creates a diff log, then copies the differences over to the other database (a rough sketch of this follows below).
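A very rough sketch of the diff-log idea, assuming both databases are reachable from one machine; the database names and credentials are placeholders, and the copying step would still need to be done separately:

    # Dump each posts table with one INSERT per row so the diff is line-based.
    mysqldump --skip-extended-insert -u wp -p site_a wp_posts > site_a_posts.sql
    mysqldump --skip-extended-insert -u wp -p site_b wp_posts > site_b_posts.sql
    # Record what differs between the two installations.
    diff site_a_posts.sql site_b_posts.sql > posts_diff.log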
It depends on why you need to do this, but if it fits your case, I would recommend one of these solutions instead:
- Keep one WordPress installation and maintain one database, then connect to it from the other website to load the content. You can create your own SQL connection to it and load whatever content you need.
- Keep one WordPress installation and use its RSS feed to read the content and display it on whichever second website you need.
I can't imagine how a plugin would be of much help, especially for keeping the databases in sync while offline. In my experience, it's usually better to write your own custom PHP scripts rather than use a plugin, so you have more direct control over the functionality.
Hope this helps.
In our Rails app we sometimes have DB entries created by users that we'd like to make part of our dev environment, without exporting the whole table. So we'd like to be able to create a special 'dev and testing' dump.
Any recommended best practices? mysqldump seems pretty cumbersome, and we'd like to pull in Rails associations as well, so maybe a rake task would make more sense.
Ideas?
You could use an ETL tool like Pentaho Kettle. Once you have set up the initial transformation you want, you can easily re-run it with different parameters in the future. This way you can also keep all your associations. I wrote a little blurb about Pentaho for another question here.
If you provide a rough schema I could probably help you get started on what your transformation would look like.
I had a similar need and ended up creating a plugin for it. It was developed for Rails 2.x and worked fine for me, but I haven't had much use for it lately.
The documentation is lacking, but it's pretty simple: you install the plugin and then a to_sql method becomes available on all your models. The options are explained in the README.
You can try it out and let me know if you have any issues, I'll try to help.
I'd go after it using a Rails runner script. That will allow your code to access the same things your Rails app would, including the database initializations. ActiveRecord will be able to take advantage of the model relationships you've defined.
Create some "transfer" tables in your production database and copy the desired data into those using the "runner" script. From there you could serialize the data, or use a dump tool, since you'll be dealing with a reduced amount of records. Reverse the process in the development environment to move the data into the database.
I had a need to populate the database in one of my apps from remote web logs, and I wrote a runner script that fired off periodically via cron, FTPed the data from my site, and inserted it.
For security reasons I'm in an environment where third-party apps can't access my DB. Because of that, I need some service/tool/script (I don't know what yet; I'm open to the best option and still reading up on what to do...)
that enables me to generate, on a regular basis (daily, weekly, monthly), a CSV file with all new/modified records for a certain application.
I should be able to automate this process and also export a new file at any time.
So it should keep track, for each application, of which records it still needs.
Each application will need the data in a different format (CSV/XLS/SQL), and some fields will be needed by one application but not by another. It should be fairly flexible...
What is the best option for me? Creating some custom tables for each application and extracting the modified data based on those?
I think the best thing here, assuming you have access to the server to set this up, is to write a small command-line program that does the relatively simple task you need. Languages like Perl are good for this sort of thing, I believe.
Once you have that tool, you can schedule it through the server's OS to run at a set interval: a scheduled task on a Windows server or a cron job on a Linux server, for example like this:
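A minimal sketch of that idea on a Linux server, exporting recently modified rows to CSV; the table, columns, database, credentials, and paths are all assumptions, and the naive tab-to-comma conversion only works if the values themselves contain no commas or tabs:

    #!/bin/bash
    # export_changes.sh: export rows modified in the last day as a CSV file.
    mysql -u report_user -p'secret' my_db --batch -e \
        "SELECT id, name, updated_at FROM orders WHERE updated_at >= NOW() - INTERVAL 1 DAY" \
        | tr '\t' ',' > /var/exports/orders_$(date +%F).csv
    # Schedule it daily at 01:00 with a crontab line such as:
    # 0 1 * * * /path/to/export_changes.sh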
You can also (without having to set up the scheduled task, if you don't want to or can't) make this small command-line application callable via CGI, which is a way of letting applications on the server be executed on demand by a web user. If you do enable this, though, I suggest you add some sort of locking so that it can only be run every so often and cannot be run five times at once.
EDIT
You might also want to just look into database replication or adding read-only users. This saves a whole lot of arsing around. Try to find a solution that does not split or duplicate data. You can set up users that are only able to access certain parts of the database system in certain ways, such as SELECT-ing data.
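For the read-only user idea, a sketch of the MySQL statements involved; the user name, host, password, and database are placeholders:

    # Create a user that can only read the data the application needs.
    mysql -u root -p -e "
        CREATE USER 'readonly_app'@'%' IDENTIFIED BY 'change_me';
        GRANT SELECT ON my_db.* TO 'readonly_app'@'%';
        FLUSH PRIVILEGES;"

The third-party application then connects with this account and can run SELECT queries but cannot modify anything.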