MySQL: host name universal change

I am making some updates to a PHP site which I did not design. I have a local copy of the site. At the top of each page there are settings for the host name for the DB connection.
Is there some way I can set up a pointer to the remote address? The address is 'mysqlhost', for example, and I want that to point to 'mysql.myhost.com'. I tried creating a HOST record for mysqlhost pointing to the IP address it resolves to, but that doesn't work.
If I put 'mysql.myhost.com' in the connection it works. If I put that IP address it doesn't, so that is probably why the HOST record idea doesn't work.
Other than creating a local copy of the DB, is there a quick way so that I don't have to modify each file in my dev environment and then again when I redeploy?

It will probably be more maintainable if you do a global search and replace with your favorite text editor.
While you're doing that, put the connection info in a .inc file and change the current host info to a line that just includes the one .inc file.
Once you have done that, you can just change the .inc file (or have conditional logic in the .inc file to indicate DEV vs. PROD), so there's just one place to update when moving between environments.
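For illustration, here is a minimal sketch of what that shared include might look like; the file name, credentials, and the hostname check are placeholders, not anything taken from your site:
<?php
// db_config.inc.php - one shared place for the DB connection settings (all names here are illustrative)
if (gethostname() === 'my-dev-machine') {  // any check that identifies your dev box works here
    $db_host = 'localhost';
    $db_user = 'dev_user';
    $db_pass = 'dev_password';
} else {
    $db_host = 'mysql.myhost.com';
    $db_user = 'prod_user';
    $db_pass = 'prod_password';
}
$db_name = 'mydatabase';
Each page then replaces its hard-coded settings with something like:
require_once 'db_config.inc.php';
$link = mysqli_connect($db_host, $db_user, $db_pass, $db_name);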

Related

How NOT to edit SSIS dtsx packages manually to change config filter in SQL Server configuration schema

I have many packages that use package configurations in the following way:
-ALL packages have an XML configuration file with only one property defined: the ConnectionString of the SQL Server connection that holds the configuration table for the rest of the properties.
-A SEPARATE SQL Server package configuration for each connection manager in the package.
-Finally, I have a SQL Server configuration for all the properties that are specific to this package.
I attach a pic of what I mean:
Yellow is the XML config with the ConnectionString, blue the connection managers, and purple the package-specific configuration.
So with this setup I can:
Change the XML file location and point the whole setup at another SQL Server or another database.
Or create different configuration filters in the same config table and go into the package and change the filter.
With all the above, the problem is that if I do anything from within VS, I lose the password in the connection string because I am not using the encrypt property. And I don't want to use it...
What are my options? Just open the .dtsx in Notepad and change what I want BEFORE I open the package or before I deploy?
-I don't want to use EncryptSensitiveWithPassword, so:
When I go to package configurations and try to change the ConfigurationFilter to point to another setting, I get the screen to select the property (ConnectionString), and when I finish, the DATABASE record for that setting has the Password= value I previously entered stripped out.
So in short, what I want is:
-No EncryptSensitiveWithPassword in my packages.
-Being able to change configurations from within VS WITHOUT resetting the connection string.
The recommended way for setting this up would be to store the file location of the dtsconfig file in an environment variable. Then change the dtsconfig to use the environment variable rather than a hardcoded location.
So the nuances of that scenario are this:
The password gets blanked out when you resave the XML file (as you pointed out in your question). This is what it is, and it is one of the reasons I never use them.
A process (devenv.exe) will cache the values of the environment variables at startup. This means you need to restart Visual Studio if you change the value of the environment variable.
The same issue applies to the Integration Services service. It will need to be restarted after you add environment variables, or the values will not be found when you run your packages.
The idea is that your dev machine points to a dev instance. Then, as you migrate the packages to new environments - QA, Prod - each server has its own environment variable pointing to its respective dtsconfig file.
As a side note, a similar pattern which avoids the password obliteration would be to add a SQL connection manager which points to the server that will load the rest of the configurations, and then set the connection string of this connection manager with an environment variable. The advantage is that you don't have to go copying config files around. This works best with integrated security, so you are not storing credentials in an environment config. If you want to be more cryptic about it, you could use a registry entry.

Cakephp is not establishing a database link

I have a cake installation on a webserver and a database on a separate server.
I am able to connect to the database remotely via shell, but my Cake install gives
Error: Mysql requires a database connection
Error: Confirm you have created the file : app/Config/database.php.
Notice: If you want to customize this error message, create app/View/Errors/missing_connection.ctp.
I checked and PDO is set up, mod_rewrite is enabled, and I have a similar setup running properly on my development server. I checked core.php and it echoes the proper base site URL, and database.php echoes the proper database selection.
Any ideas what may be causing it?
Trying to cover all of the possibilities...
With regard to the database.php file, make sure that the database credentials are being set in $default and not $test, unless of course you're trying to run database unit tests.
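For reference, a minimal sketch of the $default/$test blocks in a CakePHP 2.x app/Config/database.php; the host, login, password, and database values below are placeholders:
class DATABASE_CONFIG {
    public $default = array(
        'datasource' => 'Database/Mysql',
        'persistent' => false,
        'host'       => 'db.example.com', // the remote DB server
        'login'      => 'app_user',
        'password'   => 'secret',
        'database'   => 'app_db',
        'prefix'     => '',
    );
    public $test = array(
        'datasource' => 'Database/Mysql',
        'persistent' => false,
        'host'       => 'localhost',
        'login'      => 'test_user',
        'password'   => 'secret',
        'database'   => 'test_db',
        'prefix'     => '',
    );
}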
The file that handles all of the initializing is the webroot's index.php file. You'll want to verify that all the paths defined in that file are correct. If all you did was extract the CakePHP framework without any folder rearrangements, it should all be correct.
You mention that you checked mod_rewrite is enabled - did you do this with a phpinfo() call to a file located on the same (sub)domain just to verify the settings in the same location?
Although not related to the errors you're experiencing, you'll also want to verify that the external DB allows connection from your webserver's IP.
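One quick way to rule CakePHP out: drop a tiny standalone PDO test script on the webserver and hit it directly (the file name pdo_test.php and the credentials below are placeholders). If this fails too, the problem is connectivity or credentials rather than the framework:
<?php
// pdo_test.php - standalone connection check that bypasses CakePHP entirely
try {
    $pdo = new PDO('mysql:host=db.example.com;dbname=app_db', 'app_user', 'secret');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    echo 'Connected OK';
} catch (PDOException $e) {
    echo 'Connection failed: ' . $e->getMessage();
}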

Playframework 2 - Storing your credentials

Where do you store your credentials, like the secret key, mail passwords, and DB passwords?
I made a post on https://security.stackexchange.com/questions/19785/security-concerns-about-my-webapp/19786#19786
And it seems the best way to store the credentials is on an external server.
But play2 uses the application.conf to do this.
So how and where do you store your credentials in play2?
Update 1:
Okay, I am using Heroku.
I set my environment variable like this:
heroku config:add test=ITWORKS
in application.conf I added
sometest=${test}
I'm trying to access it like this:
Logger.info(Play.application().configuration().getString("sometest"));
But I get the following error:
UnresolvedSubstitution: conf\application.conf: 54: Could not resolve substitution to a value: ${test}
So I guess Play 2 doesn't find the variable test because it is on Heroku. But then I also added it to my local Windows environment -> still the same error.
Any idea what is wrong?
Update2:
Okay, it works; I just have to reboot after adding an env variable.
Last question:
It's kinda annoying to add the system variable every time on my local machine. Is there a dev mode?
application.conf supports environment variables, e.g. db.default.user=${DB_USER}. You can pass it as a console parameter (which is not safe since it appears in ps), or more safely set it as an environment variable.
On Heroku, set the environment variable via heroku config, e.g. heroku config:add DB_USER=MyDBAdmin.
Locally you can set them via export DB_USER=MyDBAdmin, or add them to your ~/.bash_profile (if you use bash).
Regarding point 3: in Play, application.conf is not accessible via any route or other kind of path, so it cannot be considered 'placed in the webroot'. Terry's advice is proper for PHP but doesn't fit Play (he did warn that he doesn't know the framework, of course). He gives a sample PHP script, but believe me, the difference between accessing http://somdomain.tld/config.php and Play's conf/application.conf is huge. They can't be compared directly.
Storing credentials in application.conf is the safest (and fastest) way for now; I can't imagine a way to expose the file in the browser, even if the parser were to die (which isn't possible, as it's not PHP). If you decide to store credentials in some distant location, you gain new risks: you will need to additionally check whether the client has permission to fetch the config, the time required for the application's start will rise, etc.
Update 1:
Using environment variables is not a secure way - as Marius pointed out, it will appear in the process list, so you will show your credentials to every admin, and I'm pretty sure you don't want to do that with, e.g., your email account.
On Heroku, of course, that's how their DB connection URL is passed, but other credentials should be stored in the config file. Finally, remember that the Procfile command length is limited to 255 characters, so placing all your credentials in it means that some day your app won't start.
The resolution in this case is to use alternative configuration files; the scenario is quite simple:
In your application.conf, keep the URL to your production database. If it's Heroku, most probably db.default.user and db.default.password should be commented out, as the usual Heroku URL contains the credentials in it.
For your local version, create a file, e.g. conf/local_postgres.conf, include application.conf at the beginning, and override/add all required configuration keys, such as the credentials for your local Postgres DB. Additionally, you can set other things there: change logging levels, enable smtp.mock, etc.
Run your app locally with this conf (note: I had some problems with -Dconfig.resource, so I had to use the -Dconfig.file syntax instead; you have to find which method works on your system), e.g.:
play -Dconfig.resource=local_postgres.conf "~run 9123"
Tip: using a non-default port is the easiest way to make it obvious that you're working with the local config. If you forget that you have an alternative config and start your app with the common play ~run command, your app at http://localhost:9123 will simply be unavailable.
Create a bash script run-local (or a run-local.bat file on Windows) and put the command from the previous point in it. Add it to your .gitignore file.
From now on, you'll run the application for local development with the script from point 4. When pushing to Heroku, it will deploy your app with the values from application.conf, since you don't set an alternative config in the Procfile. What's more, with some other combinations you can run your application locally against Heroku's SQL to perform evolutions without pushing a deployment, or check the newest fix pack. Of course, you always have to make sure that you're developing against the local version of the database; otherwise there's a risk that you accidentally change/destroy your live data.
Update 2:
Finally, using *.conf files is better than storing these values in separate classes when you have to change the configuration for different locations (as mentioned before: a team working on the same code, dev/prod environments, etc.).
Of course it can be shortened to:
import play.Configuration;
import play.Play;
// ...
Configuration cfg = Play.application().configuration();
cfg.getString("smtp.password");
cfg.getBoolean("smtp.mock");
cfg.getInt("smtp.port");

mysql host on internet using hp cloud and xeround

I am new to the 'cloud' concept. I have a Java-based application for data entry which runs well on my LAN.
On my LAN I install:
MySql
Configure Instance ( user name - root, pass - ******)
Dump dummy database entry_db that is in raw format
Then I have a jar executable file which when runs, displays a login screen.
I manage to successfully log in using predefined ID and PASSWORD (user - config pass - ******)
After logging in I configure(d):
Database Type
Database IP
User Name (Root)
Password ****
Database Name ( It auto selects database named entry_db)
In another window I configure(d) Network File Sharing Location:
file shared location
image path
back up data path
config file location in xml
(Note - When I select file shared location, all other files take the same path automatically)
Then I create an Admin account (rather than a Supervisor or Operator account), log in with the Admin account, and I can now upload data and distribute it to all operators.
Here is my problem:
I configure a cloud computer on HP Cloud (they provide me a static IP) and then import the database from xeround.com.
I now have a DNS name and a port number, and also a login form using my PHP client.
How can I package all this into the same executable jar file to be used from anywhere?
How can I use it from the web just like on my LAN?
What is the optimal configuration for this?
I work in Xeround.
I have read your question and I wanted to point out a couple of things: you should use the DNS name in the connection string where you used to put the hostname/IP of the MySQL server machine, and the port number you were given where you used to put the MySQL default port (3306).
 
Other than that, you can connect from anywhere that has access to the instance. I suggest that if your jar runs in the HP cloud, you create your Xeround database instance there as well (this will yield improved performance).
 
If you still need help, we will be more than happy to help you. Just send us a quick email to support#xeround.com and we'll take it from there.
Cheers,
Yuval

Cannot open backup device. Operating System error 5

Below is the query that I am using to backup (create a .bak) my database.
However, whenever I run it, I always get this error message:
Msg 3201, Level 16, State 1, Line 1
Cannot open backup device 'C:\Users\Me\Desktop\Backup\MyDB.Bak'. Operating system error 5(Access is denied.).
Msg 3013, Level 16, State 1, Line 1
BACKUP DATABASE is terminating abnormally.
This is my query:
BACKUP DATABASE AcinsoftDB
TO DISK = 'C:\Users\Me\Desktop\Backup\MyDB.Bak'
WITH FORMAT,
MEDIANAME = 'C_SQLServerBackups',
NAME = 'Full Backup of MyDB';
Yeah I just scored this one.
Look in Windows Services. Start > Administration > Services
Find the service in the list called SQL Server (MSSQLSERVER) and look at the "Log On As" column (you may need to add the column if it isn't shown in the list).
This is the account you need to give permissions to on the directory: right click in Explorer > Properties > Sharing (and Security).
NOTE: Remember to give permissions to the actual directory AND to the share if you are going across the network.
Apply and wait for the permissions to propagate, then try the backup again.
NOTE 2: If you are backing up across the network and your SQL is running as "Local Service" then you are in trouble... you can try assigning permissions, or it may be easier to back up locally and xcopy across outside of SQL Server (an hour later).
NOTE 3: If you're running as Network Service then SOMETIMES the remote machine will not recognize the network service of your SQL Server. If this is the case you need to add permissions for the actual computer itself, e.g. MyServer$.
Go to the SQL Server folder in the Start menu and click Configuration Tools.
Select SQL Server Configuration Manager.
In SQL Server Services, on the desired instance, change "Log On As" to Local System.
In order to find out which user you need to give permission to for the restore process, you can follow these steps:
Go to the server where SQL Server is installed and find SQL Server Configuration Manager.
Next, go to "SQL Server Services".
Under your SQL Server (MSSQLSERVER) instance there will be an account in the "Log On As" column; in my case it is NT Service\MSSQLSERVER.
That is the account you need to add under the Security tab of your source .bak location, giving that user "Read" permissions so that the backup file can be read.
Let's say your backup file is present in the "D:\Shared" folder; then you need to give that account permissions on that folder.
One of the reasons why this happens is that your MSSQLSERVER service is not running as a local system account. To fix this issue, use the following steps.
Open run using Windows + R
Type services.msc and a services dialog will open
Find SQL Server (MSSQLSERVER)
Right click and click on properties.
Go to Log on tab
Select Local System account and click on "Apply" and "OK"
Select "SQL Server (MSSQLSERVER)", click the Stop link on the left panel, and then Start it again once it has completely stopped.
Enjoy your backup.
Hope it helps you well, as it did to me. Cheers!
The SQL Server service account does not have permissions to write to the folder C:\Users\Kimpoy\Desktop\Backup\
I had this issue recently as well; however, I was running the backup job from server A, the database being backed up was on server B, and the backup was written to a file share on server C. When the agent on server A tells server B to run a backup T-SQL command, it's actually the service account that SQL is running under on server B that attempts to write the backup to server C.
Just remember, it's the service account of the SQL Server performing the actual BACKUP DATABASE command that needs privileges on the file system, not the agent's.
I faced the same problem with SQL Express 2014 SP1 on Windows 10.
The solution which worked:
Open the Services console (type "Services" in the Start menu).
Locate and open SQL Server (SQLExpress).
Go to the Log On tab.
Choose Local System account (also check "Allow service to interact with desktop").
Click OK. Stop the service. Restart the service.
Problem solved.
I was just going through this myself. I had ensured that my MSSQLSERVER login user had full access, but it was still causing issues. It only worked once I moved the destination to the root of C:, and more importantly out of a user folder (even though I had a share with full permissions - I even tried "Everyone" as a test).
I don't know if I'd consider my issue "fixed", however it is "working".
Just an FYI for any other users that come across this thread.
I had a similar issue. I added write permissions to the .bak file itself, and my folder that I was writing the backup to for the NETWORK SERVICE user. To add permissions just right-click what file/directory you want to alter, select the security tab, and add the appropriate users/permissions there.
Here is what I did to bypass the issue.
1) Go to backup
2) Remove the destination file-path to disk
3) Click on Add
4) In the "File name:" box, manually type in the backup name after ..\Backup, as below, where Yourdb.bak is the database backup name
C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Backup\Yourdb.bak
5) Click on OK
Hope this helps!
I solved the same problem with the following three steps:
I stored my backup file in another folder path, which worked fine.
I compared the Security tabs of the two folders.
I edited the permissions in the Security tab of the folder that wasn't working.
I know it is not an exact solution but using external drive paths solves this problem.
BACKUP DATABASE AcinsoftDB
TO DISK = 'E:\MyDB.Bak'
WITH FORMAT,
MEDIANAME = 'C_SQLServerBackups',
NAME = 'Full Backup of MyDB';
I had the same error. The following changes helped me fix it.
I checked Server Manager -> Tools -> Services and found the user ("Log On As" column) for the service SQL Server (SQLEXPRESS).
I went to the local folder (C:\Users\Me\Desktop\Backup) and added "NT Service\MSSQL$SQLEXPRESS" as a user with Write permissions.
SQL Server is not able to access (write) the backup into the location specified.
First you need to verify the service account under which SQL Server is running. This can be done using SQL Server Configuration Manager or services.msc,
or
use the query below:
SELECT DSS.servicename,
DSS.startup_type_desc,
DSS.status_desc,
DSS.last_startup_time,
DSS.service_account,
DSS.is_clustered,
DSS.cluster_nodename,
DSS.filename,
DSS.startup_type,
DSS.status,
DSS.process_id FROM sys.dm_server_services AS DSS;
Now look at the column service_account and note it down.
Go to the location where you are trying to take the backup. In your case: C:\Users\Me\Desktop\Backup
Right click --> Properties --> Security -->
Add the service account and provide read/write permissions. This will resolve the issue.
In my case, I forgot to name the backup file and it kept giving me the same permission error :/
TO DISK = N'{path}\WRITE_YOUR_BACKUP_FILENAME_HERE.bak'
I had the same issue and the url below really helped me.
It might help you as well.
http://blog.sqlauthority.com/2011/04/13/sql-server-fix-error-msg-3201-level-16-cannot-open-backup-device-operating-system-error-5access-is-denied/
Msg 3201, Level 16, State 1, Line 1
Cannot open backup device 'C:\Backup\Adventure_20120720_1024AM.trn'. Operating system error 5(Access is denied.).
Msg 3013, Level 16, State 1, Line 1
BACKUP LOG is terminating abnormally.
I verified the backup folder on the C drive to check whether the new service account had Full Control permission, and realized that the "Test\Kiran" service account did not have Full Control security permission.
Please follow the steps below to give Full Control to the service account:
Go to the C drive and right click on the Backup folder.
Select the Security tab.
Click on the Edit button; a new window will open.
Click on the Add button, enter the Test\Kiran user account, and click the Check Names button; this validates whether the user you entered exists. If it exists, it will be shown in the window; select OK.
Select the user name you entered and check the Full Control checkbox under Allow.
Please check access to the drives. First create a folder and open the folder's properties;
you should find the Security tab. Click on it and check whether your user ID has access or not.
If you can't find your ID, please click the Add button and add your user name with full access.
Share the folder and use the UNC path, for example: \\pc\backups\mydb.bak
Then you can stop sharing it.
Not very elegant, but it resolves all permission problems (you need to give permissions on the share as well, as mentioned above).
I experienced this problem when the .BAK file was temporarily stored in a folder encrypted with BitLocker. It retained the encryption after it was moved to a different folder.
The NETWORK SERVICE account was unable to decrypt the file and gave this thoroughly informative error message.
Removing BitLocker encryption (by unchecking "Encrypt contents to secure data" in the file properties) on the .BAK file resolved the issue.
Hi, you need to change the query from:
BACKUP DATABASE AcinsoftDB
TO DISK = 'C:\Users\Me\Desktop\Backup\MyDB.Bak'
to
BACKUP DATABASE AcinsoftDB
TO DISK = N'C:\Users\Me\Desktop\Backup\MyDB.Bak'
You have to add an N in front of the path; that worked for me.
My issue was that the "File Ownership" was set to my company. I changed it to "Personal" and it worked. Right click the file and click the "File Ownership >" option and then change it to "Personal". I believe this happens with all files sent over Microsoft Teams.
If the backup destination path resides on your local machine, change the account of the 'SQL Server' service to 'Local System Account' and everything should be resolved. Keep in mind that the 'SQL Server' instance service is what accesses the backup destination, so the account it runs under must have access to your backup's destination path.
Make sure you are actually saving to a FILE and not a folder.
My problem was that I was simply putting in the folder path and not the file path.
You want this:
'F:\Database Backup\Pharmacy\data.bak';
You don't want this:
'F:\Database Backup\Pharmacy';