Dump MySQL DB and transfer it to FTP - mysql

I have created a batch file:
@echo off
echo Running dump...
CD c:\Program Files\MySQL\MySQL Server 5.5\bin
CALL mysqldump --user=1234 --password=aaaa dba1 --result-file="c:\Users_%DATE%.sql"
echo Done!
but I don't know how to transfer the dump to the FTP server.

How about using the ftp command-line utility?
It is capable of running scripts (lists of commands read from an external file):
ftp -s:ftpcmd.dat your.ftp.server.com
Now you can either create this script file beforehand or build it on the fly from your batch file using echo.
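A minimal sketch of the on-the-fly variant, to be appended to the batch file from the question (the server name and the FTP credentials are placeholders, not values from the question; the -n switch disables auto-login so the user command inside the script can supply the login):
REM Build the FTP command script line by line
echo user myftpuser>ftpcmd.dat
echo myftppassword>>ftpcmd.dat
echo binary>>ftpcmd.dat
echo put "c:\Users_%DATE%.sql">>ftpcmd.dat
echo quit>>ftpcmd.dat
REM -n = no auto-login, -s = read commands from the script file
ftp -n -s:ftpcmd.dat your.ftp.server.com
del ftpcmd.dat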

Related

(OpenGrok) How can I use the '--symlink' option in OpenGrok?

I'm not sure how to use the --symlink option in OpenGrok, so I'm asking here.
OpenGrok's source root folder is '/opengrok/src'.
In this folder, I created a symbolic link file with the following command.
ln -s /home/A/workspace/tmp tmp
And I did indexing with the following command.
java -Djava.util.logging.config.file=/opengrok/etc/logging.properties -jar /opengrok/dist/lib/opengrok.jar -c /usr/local/bin/ctags -s /opengrok/src -d /opengrok/data -P -S -W /opengrok/etc/configuration.xml --symlink /opengrok/src/tmp -U http://localhost:8080/source
When I browse to localhost/source, the tmp link is displayed, but when I click it, the files in tmp are not shown; instead the following error message is displayed.
Error: File not found!
The requested resource is not available.
Resource lacks history info. Was remote SCM side up when indexing occurred? Cleanup history cache dir(or just the .gz for the file or db record) and rerun indexer making sure remote side will respond during indexing.
How can I access and view the files in tmp using OpenGrok?

Silent Installation of MySQL (version 5.7) on Windows with Batch Files

I'm trying to create a setup process for my C# application that uses a local MySQL database (not online; every PC that installs my app will get its own clean database to use). Searching around Google I found that I need to make a "silent" install of the required MySQL components, which are:
MySQL Server 5.7
MySQL Connector/Net
So I've done the following steps:
1) Downloaded mysql-installer-community-5.7.11.0.msi from the MySQL home page.
2) Made a batch file "Setup.bat" that creates a folder C:\MySQL and copies the installer .msi and another batch file "Installer.bat" into it.
3) After copying, Setup.bat starts Installer.bat with the command
Call C:\MySQL\Installer.bat
4) What Installer.bat does:
@echo off
echo.
set password=admin
set installer="C:\Program Files (x86)\MySQL\MySQL Installer for Windows\MySQLInstallerConsole.exe"
set version=5.7.11
echo -----------------------
echo Details
echo -----------------------
echo Server User: root
echo.
echo Server Password: %password%
echo.
echo Configuration Folder: %installer%
echo.
echo Version: %version%
echo -----------------------
echo Start Installation... (please wait a few minutes)
echo.
msiexec /i "%~dp0\mysql-installer-community-%version%.0.msi" /qb
%installer% community install server;%version%;X64:*:port=3307;openfirewall=true;passwd=%password% -silent
pause
It installs the .msi file inside the folder, then downloads and installs MySQL Server 5.7.
And from here my problem starts: the download and installation of MySQL Server 5.7 go well, but once it needs to be configured with my parameters, it starts to initialize the server with several steps:
Stopping Server [If Necessary]
Writing Configuration file
Updating Firewall
... and so on until I get to this one
Starting Server
Once there it never completes, so I checked the Windows Event Viewer and found that two errors occur once the "Starting Server" step begins:
Error1 Failed to create datadir C:\Program Files\MySQL\MySQL Server 5.7\data
Then
Error2 Aborting
If I try to install those components manually with the GUI installer, I don't get any error and everything works fine, but I need them to install and configure automatically. Can someone help me, please? (Sorry about my English.)

OwnCloud: How to synchronize the file system with the DB

I have to "insert" a lot of files into an owncloud server (8.2).
A user give me a USB key with the files and tell me to copy of all them into his owncloud data files repository.
Do you know if is it possible ?
Is it possible to synchronyze the ownCloud data fileSystem with the ownCloud database?
My environment is Linux CentOS7 (Apache 2.4, mySQL 5.6, php 5.6)
Thanks,
ownCloud ships with a command-line utility that allows you to manually trigger some tasks. Among those is the files:scan function, which re-scans a user's file system.
So you can import those files by following these steps:
1. Copy the files into the physical file system of the user(s) inside ownCloud's data folder.
2. Run the command-line utility to re-scan the files. That takes care of updating the database according to the files found.
This is an example for the manual trigger:
sudo -u www-data php occ files:scan <user name>
Here <user name> obviously has to be replaced. Also, the account name the sudo command switches to depends on the Linux distribution and its setup. The command has to be started inside ownCloud's base folder. The command can be called in a loop with different user names; that can be done by means of standard scripting, as sketched below.
Here is a documentation of the utility: https://doc.owncloud.org/server/8.0/admin_manual/configuration_server/occ_command.html
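A minimal sketch of such a loop, assuming the web server account is www-data, the ownCloud base folder is /var/www/owncloud, and the user names are placeholders:
#!/bin/bash
# Re-scan the files of several ownCloud users in one go.
cd /var/www/owncloud || exit 1
for user in alice bob carol; do
    # occ must run as the web server account so file ownership stays correct
    sudo -u www-data php occ files:scan "$user"
done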
I just tried this myself on an ownCloud 8.2 installation and succeeded.
Before I could successfully scan my files again as arkascha explained, I needed to change the owner and the group of the new folder to www-data (for Debian-based systems; for others see the OC docs below) and set the rights of the new directory to 755.
Change owner:
sudo chown -R www-data:www-data <path>
Change rights:
sudo chmod 755 <path>
where <path> is the path to the newly added directory and could, for example, look like this: /media/hdd/owncloud/data/<username>/files/<newFolderName>
OC-Docu:
https://doc.owncloud.org/server/9.0/admin_manual/configuration_server/occ_command.html

unoconv returns an error when running as www-data

When running this from the command line as root, it works:
unoconv -f csv $file
But when running it as www-data, this error is returned:
Traceback (most recent call last):
File "/usr/bin/unoconv", line 1114, in <module>
office_environ(of)
File "/usr/bin/unoconv", line 203, in office_environ
os.environ['PATH'] = realpath(office.basepath, 'program') + os.pathsep + os.environ['PATH']
File "/usr/lib/python3.4/os.py", line 633, in __getitem__
raise KeyError(key) from None
KeyError: 'PATH'
Update: the output of
echo shell_exec('echo $PATH');
is:
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
On CentOS 7.3 with PHP via php-fpm, the environment in PHP is cleaned by php-fpm.
You can use putenv() to set env["PATH"] in PHP code, for example:
putenv("PATH=/sbin:/bin:/usr/sbin:/usr/bin");
var_dump(shell_exec('unoconv -vvvv -f pdf -o 123.pdf 123.doc'));
Or you can set the environment with a one-line shell command:
var_dump(shell_exec('PATH=/sbin:/bin:/usr/sbin:/usr/bin'.' unoconv -vvvv -f pdf -o 123.pdf 123.doc'));
Or you can change /etc/php-fpm.d/www.conf to pass the environment to PHP by adding this line:
clean_env = no
and then restarting php-fpm:
systemctl restart php-fpm.service
The PHP call you used (pasted from chat):
exec("unoconv -f csv $file")
My guess is that exec() is giving you an environment that is too limited. To work around this, you could set up a polled directory. The PHP script would copy files to be converted into the polled directory and wait for the files to be converted.
Then create a bash script (running either as root or as a somewhat more secure user) that runs in an infinite loop and checks the polled directory for any incoming files. See "How to keep polling file in a directory till it arrives in Unix" for what the bash script might look like.
When the bash script sees incoming files, it runs unoconv.
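A rough sketch of such a polling script, assuming hypothetical directory names (/var/spool/unoconv/incoming and /var/spool/unoconv/converted) and .doc input files; adjust the paths, the extension, and the sleep interval to your setup:
#!/bin/bash
# Watch an incoming directory and convert every new file with unoconv.
WATCH_DIR=/var/spool/unoconv/incoming
DONE_DIR=/var/spool/unoconv/converted

while true; do
    for f in "$WATCH_DIR"/*.doc; do
        [ -e "$f" ] || continue      # glob did not match anything yet
        unoconv -f csv "$f"          # writes the .csv next to the source file
        mv "$f" "$DONE_DIR"/         # move the original out of the polled dir
    done
    sleep 5
done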
Found a solution myself by running libreoffice directly
sudo libreoffice --headless --convert-to csv --outdir $tmp_path $file

How to make automatic backups of MySQL DBs on GoDaddy with Apache servers

I'm trying to automate a daily backup of a MySQL database on shared hosting with GoDaddy.com using Apache servers.
For this I researched and found out about bash scripts.
GoDaddy hosting also lets me set up cron jobs, so I did the following.
My bash script looks something like this (I masked only the sensitive data):
<br>
#/bin/sh<p></p>
<p>mysqldump -h myhost-u myuser -pMypassword databasename > dbbackup.sql<br>
gzip dbbackup.sql<br>
mv dbbackup.sql.gz _db_backups/`date +mysql-BACKUP.sql-%y-%m-%d.gz`<br>
</p>
I configured the cron job which points to this file and executes it every 24 hours.
I have the cron job utility configured to send me a log message to my email every time it runs.
And this is the log message:
/var/chroot/home/content/01/3196601/html/_db_backups/backup.sh: line 1: br: No such file or directory
/var/chroot/home/content/01/3196601/html/_db_backups/backup.sh: line 3: p: No such file or directory
/var/chroot/home/content/01/3196601/html/_db_backups/backup.sh: line 4: br: No such file or directory
/var/chroot/home/content/01/3196601/html/_db_backups/backup.sh: line 5: br: No such file or directory
/var/chroot/home/content/01/3196601/html/_db_backups/backup.sh: line 6: /p: No such file or directory
It's like it doesn't understand the language. Should I edit my .htaccess file for this?
Any ideas?
Remove those HTML tags from the bash script; the error messages are all related to them. Your script should look like the following:
#!/bin/sh
mysqldump -h myhost -u myuser -pMypassword databasename > dbbackup.sql
rm -rf dbbackup.sql.gz
gzip dbbackup.sql
mv dbbackup.sql.gz _db_backups/`date +mysql-BACKUP.sql-%y-%m-%d.gz`
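For reference, a crontab entry for a daily run might look like this (the paths are taken from the error log above; the script uses relative paths, so it is run from the html directory, and the 3 a.m. time is just an example):
0 3 * * * cd /var/chroot/home/content/01/3196601/html && /bin/sh _db_backups/backup.sh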