How to access phpMyAdmin on Cloud9 with CakePHP 3? - mysql

Hi people, I have a problem developing in Cloud9. I followed the steps to configure MySQL and phpMyAdmin, and I run the app with the following line: bin/cake server -H 0.0.0.0 -p 8080. The app runs fine, but when I try to access phpMyAdmin (https://james-mand-cortana.c9users.io/phpmyadmin/) it shows an error: Error: PhpmyadminController could not be found.
But when I run the app through index.php instead (without bin/cake server -H 0.0.0.0 -p 8080), accessing phpMyAdmin works fine.
So basically: I want to run my application with bin/cake server -H 0.0.0.0 -p 8080 and still access phpMyAdmin without any problem.
Thanks for the help.
Here is an excerpt from the index.php:
<?php
if (php_sapi_name() === 'cli-server') {
    $_SERVER['PHP_SELF'] = '/' . basename(__FILE__);
    $url = parse_url(urldecode($_SERVER['REQUEST_URI']));
    $file = __DIR__ . $url['path'];
    // Serve existing static files directly instead of routing them through CakePHP.
    if (strpos($url['path'], '..') === false && strpos($url['path'], '.') !== false && is_file($file)) {
        return false;
    }
}
require dirname(__DIR__) . '/vendor/autoload.php';

use App\Application;
use Cake\Http\Server;

$server = new Server(new Application(dirname(__DIR__) . '/config'));
$server->emit($server->run());

Related

Reading Laravel .env values in the Windows terminal and using them as MySQL credentials in a batch file

I created a .bat script to import several .csv files into my DB for a Laravel project.
At first I was using Python, and each time it took an eternity to restore the long files, so I decided to back up those tables and restore them with MySQL.
old .bat file
echo.
echo - Rebuilding database
php artisan migrate:fresh
echo.
echo - Importing animals data
cd py_animalimporter
python importer.py
cd ..
echo.
echo - Importing colors data
cd py_colorimporter
python importer.py
cd ..
echo.
echo - Rebuilding database
php artisan db:seed
echo.
echo - Importing places data
cd py_placeimporter
python importer.py
cd ..
echo.
echo - Starting local server
php artisan serve
New .bat file
echo.
echo - Rebuilding database
php artisan migrate:fresh
echo.
echo - Restoring sql backup
mysql -u username -p test_local < backup.sql
password
echo.
echo - Rebuilding database
php artisan db:seed
echo.
echo - Importing places data
cd py_placeimporter
python importer.py
cd ..
echo.
echo - Starting local server
php artisan serve
My Python scripts read the MySQL credentials from my Laravel .env file (thanks to the dotenv library); unfortunately, I can't figure out how to do anything similar from the Windows terminal.
.env file
DB_HOST=127.0.0.1
DB_PORT=3308
DB_DATABASE=test_local
DB_USERNAME=username
DB_PASSWORD=password
.py files example
from dotenv import load_dotenv
from pathlib import Path
import os
import mysql.connector
from mysql.connector import errorcode
def connectDb():
    # Retrieve db credentials from .env
    env_path = '../.env'
    load_dotenv(dotenv_path=env_path)
    db_host = os.getenv("DB_HOST")
    db_port = os.getenv("DB_PORT")
    db_database = os.getenv("DB_DATABASE")
    db_username = os.getenv("DB_USERNAME")
    db_password = os.getenv("DB_PASSWORD")
    if db_password is None:
        db_password = ''
    return mysql.connector.connect(user=db_username, password=db_password,
                                   host=db_host,
                                   port=db_port,
                                   database=db_database)

def insertPrimaryColour(hex, color):
    try:
        cnx = connectDb()
    except mysql.connector.Error as err:
        if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
            print("] Wrong Credentials")
        elif err.errno == errorcode.ER_BAD_DB_ERROR:
            print("] No Existing Database")
        else:
            print("] " + str(err))  # str() needed: an Error can't be concatenated to a str
    else:
        cursor = cnx.cursor()
        query = f"INSERT INTO dom_colors(`order`,hex,id_translation) VALUES(0,'{hex}','{color}');"
        cursor.execute(query)
        insert_id = cursor.lastrowid
        cnx.commit()
        cnx.close()
        return insert_id
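As an aside, interpolating hex and color into the SQL via an f-string is fragile (a stray quote in either value breaks the statement). A minimal sketch of the same insert using the driver's %s placeholders so values are escaped by the connector (the function name insert_primary_colour_safe and the SQL constant are my own):

```python
# Parameterized rewrite of the insert above: values travel separately
# from the SQL text, so the driver escapes them.
INSERT_COLOR_SQL = (
    "INSERT INTO dom_colors(`order`, hex, id_translation) "
    "VALUES (0, %s, %s);"
)

def insert_primary_colour_safe(cnx, hex_value, colour):
    cursor = cnx.cursor()
    cursor.execute(INSERT_COLOR_SQL, (hex_value, colour))
    cnx.commit()
    return cursor.lastrowid
```

Passing the tuple as the second argument to execute() is the standard DB-API pattern, so the same sketch works with any conforming MySQL driver.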
Alternatively, I could use Python to restore the DB, but nothing I tried worked!
If you want an OS-independent solution, you can try subprocess.
You can use it this way:
mysqlLogin = [...]
process = subprocess.Popen(mysqlLogin, shell=True, stdout=subprocess.PIPE)
process.wait()
These lines run the command contained in mysqlLogin and wait for its termination.
You can also configure standard-output redirection with the stdout parameter.
Here are the docs: https://docs.python.org/3/library/subprocess.html
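Fleshing that out for the restore step, a sketch along these lines could work (the function names are my own; it assumes the mysql client is on PATH, and feeds the dump on stdin instead of relying on shell `<` redirection, which also sidesteps shell quoting of the password):

```python
import subprocess

def build_mysql_cmd(user, password, database):
    # Assemble the mysql client invocation as an argument list;
    # a list with shell=False avoids quoting problems entirely.
    cmd = ["mysql", "-u", user]
    if password:
        cmd.append("--password=" + password)
    cmd.append(database)
    return cmd

def restore_db(user, password, database, dump_path="backup.sql"):
    # Equivalent of `mysql ... < backup.sql`: pass the dump on stdin.
    with open(dump_path, "rb") as dump:
        result = subprocess.run(build_mysql_cmd(user, password, database),
                                stdin=dump)
    return result.returncode
```

A nonzero return code from restore_db signals that the mysql client failed, which the batch file can check.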
I finally found a solution with Python sending the command to the Windows terminal!
Here's the Python script that I now call from the batch file:
from dotenv import load_dotenv
from pathlib import Path
import os

def restoreDB():
    # Retrieve DB credentials from .env
    env_path = '.env'
    load_dotenv(dotenv_path=env_path)
    #db_host = os.getenv("DB_HOST")
    #db_port = os.getenv("DB_PORT")
    db_database = os.getenv("DB_DATABASE")
    db_username = os.getenv("DB_USERNAME")
    db_password = os.getenv("DB_PASSWORD")
    if db_password == "":  # '==' rather than 'is': compare value, not identity
        mysqlLogin = "mysql -u " + db_username + " " + db_database + " < backup.sql"
    else:
        mysqlLogin = "mysql -u " + db_username + " --password='" + db_password + "' " + db_database + " < backup.sql"
    os.system('cmd /c "%s"' % mysqlLogin)

restoreDB()
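If pulling in python-dotenv just for this feels heavy, the .env format used here is simple enough to parse by hand. A sketch (it assumes plain KEY=VALUE lines, skipping blanks and # comments, with no support for quoting or variable expansion):

```python
def parse_env(path=".env"):
    # Minimal .env reader: one KEY=VALUE per line, '#' lines are comments.
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values
```

With the .env file shown above, parse_env would return a dict containing DB_HOST, DB_PORT, DB_DATABASE, DB_USERNAME and DB_PASSWORD.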

How to execute mysql script insertion on terraform user_data?

The last line of the script is not executed.
When I ran the commands manually on the created instance, they succeeded.
#!/bin/bash
#install tools
apt-get update -y
apt-get install mysql-client -y
#Create MySQL config file
echo "[mysql]" >> ~/.my.cnf
echo "user = poc5admin" >> ~/.my.cnf
echo "password = poc5password" >> ~/.my.cnf
#test
echo "endpoint = ${rds_endpoint}" >> ~/variables
hostip=$(hostname -I)
endpoint=${rds_endpoint}
echo "$hostip" >> ~/variables
#I have created a table here but I will remove the code since it is unnecessary...
#Create User
echo "CREATE USER 'poc5user'@'%' IDENTIFIED BY 'poc5pass';" >> ~/mysqlscript.sql
echo "GRANT EVENT ON *.* TO 'poc5user'@'%';" >> ~/mysqlscript.sql
cp mysqlscript.sql /home/ubuntu/mysqlscript.sql
mysql -h $endpoint -u poc5admin < ~/mysqlscript.sql
Expected result: There should be a Database, Table and User created on the RDS instance.
You can create a database like this from a bash script, but it is not a recommended approach for working with RDS; it is better to place your data in S3 and import it from there.
Here is an example that will create the DB from an S3 backup:
resource "aws_db_instance" "db" {
  allocated_storage    = 20
  storage_type         = "gp2"
  engine               = "mysql"
  engine_version       = "5.7"
  instance_class       = "db.t2.micro"
  name                 = "mydb"
  username             = "foo"
  password             = "foobarbaz"
  parameter_group_name = "default.mysql5.7"

  s3_import {
    source_engine         = "mysql"
    source_engine_version = "5.6"
    bucket_name           = "mybucket"
    bucket_prefix         = "backups"
    ingestion_role        = "arn:aws:iam::1234567890:role/role-xtrabackup-rds-restore"
  }
}
Why do you need ~/.my.cnf? It is better to keep these scripts in S3.
Second, if you still want to run it from your local environment, you can insert the data with a local-exec provisioner:
resource "null_resource" "main_db_update_table" {
  provisioner "local-exec" {
    on_failure  = "fail"
    interpreter = ["/bin/bash", "-c"]
    command     = <<EOT
mysql -h ${aws_rds_cluster.db.endpoint} -u your_username -pyour_password your_db < mysql_script.sql
EOT
  }
}
But it is better to go with S3.
If you want to import from a remote host, you can explore remote-exec.
You can do this with user-data, but it seems your MySQL script is not being generated properly; it is better to copy the script to the remote host and then run it there.
There is no such thing as Terraform "user_data". User data is a bootstrap script for EC2 instances which you can use to install software/binaries or to execute your script at boot time.
The script is executed by cloud-init, not by Terraform itself. Terraform's responsibility is only to set the user data for the EC2 instances.
You may check the cloud-init output logs, which should also contain the result of your user-data script.
From your code, I am not able to understand at which step you copied the file below.
cp mysqlscript.sql /home/ubuntu/mysqlscript.sql
mysql -h $endpoint -u poc5admin < ~/mysqlscript.sql
I am assuming that you are creating a new server and it does not have this file.
Thank you for your inputs. I found the answer: move the config file to /etc/mysql/my.cnf and then execute
mysql -h $endpoint -u poc5admin < ~/mysqlscript.sql

I provide the password in the Envoy script but it prompts me to enter the password manually

With Laravel 5.8's envoy command I deploy to a remote server, and I set the password on the command line, like:
envoy run Hostels2Deploy --lardeployer_password=111 --app_version=0.105a
and envoy file:
@setup
    $server_login_user = 'lardeployer';
    $lardeployer_password = isset($lardeployer_password) ? $lardeployer_password : "Not Defined";
@endsetup

@servers(['dev' => $server_login_user.':'.$lardeployer_password.'@NNN.NN.NNN.N'])

@task('clean_old_releases')
    echo "Step # 81";
    echo 'The password is: {{ $lardeployer_password }}';
    echo 'The $server_login_user is: {{ $server_login_user }}';
    echo "Step # 00 app_version ::{{ $app_version }}";
    cd {{ $release_number_dir }}
    # php artisan envoy:delete-old-versions Hostels2Deployed
@endtask

@macro('Hostels2Deploy', ['on' => 'dev'])
    clean_old_releases
@endmacro
With the credentials in the @servers block I expected that I would not have to enter the password manually, but on the command line I still see a prompt for the password. I echoed the $server_login_user and $lardeployer_password vars and they have valid values.
What is the correct approach?
I found a solution with SSH keys: in /home/user/.ssh/config of my OS, add the lines:
Host laravelserver
    IdentityFile ~/.ssh/id_rsa
    HostName NNN.NN.NNN.N
    Port 22
    User lardeployer
and in the Envoy file connect to this server like:
@servers(['dev' => ['laravelserver']])
Also, on the remote side, lardeployer's public key must be added to the authorized_keys file.
Then restart the service:
sudo systemctl restart ssh

Google cloud sql with php

I'm having a problem connecting my Cloud SQL DB with PHP.
I've tried every possible way to connect (PDO, mysqli & mysql)
and none of them worked.
I've assigned an IPv4 address to the Cloud SQL instance and authorized the Compute Engine IP in the allowed-networks section.
When I'm trying to connect from the compute engine with
mysql --host=cloud_sql_IP --user=my_user --password
it's working and I'm able to see the tables.
On the php side I've put this code:
$db = mysql_connect(<CLOUD_SQL_IPV4>, 'root', '');
if (!$db) {
    die('Connect Error (' . mysql_error());
}
and I get "Connect Error (Permission denied)"
When I try it this way:
$conn = mysql_connect(":/cloudsql/<COMPUTE_INSTANCE>:<DB>", "root", "");
if (!$conn) {
    die('Connect Error (' . mysql_error());
}
I'm getting:
"Connect Error (No such file or directory"
What else should I do?
Thanks!
Well, after digging into it for 2 days I managed to connect to the Cloud DB.
First run this to fix the permissions (CentOS):
setsebool httpd_can_network_connect=1
And the PHP should be written this way:
$conn = new mysqli(
    <IP_V4>,      // host
    'root',       // username
    '',           // password
    <DB_NAME>,    // database name
    null,
    '/cloudsql/<GCE_INSTANCE>:<DB>'
);

MySQL via SSH Tunnel Access Denied

I want to use an SSH tunnel to connect to a MySQL DB on a remote host.
I've set up the tunnel with the command:
ssh user@host -L 3307:remote_mysql_hostname:3306
I can successfully connect with HeidiSQL using these settings:
hostname: localhost
user: remote_mysql_user_login
password: remote_mysql_user_password
port: 3307
But when I use PDO in PHP to connect, I get:
Access denied for user 'remote_mysql_user_login'@'localhost' (using password: YES)
My PDO DSN is something like this:
mysql:type=Core_Db_Adapter_Pdo_Mysql;host=localhost;port=3307;dbname=db_name;
Where is the trick?
Solution:
@symcbean thanks!
The problem was (as suggested by @symcbean) the hostname.
Changing it to '127.0.0.1' fixed the problem.
I borrowed from your example and was able to connect to a remote host using the same SSH tunnel setup. Can you share the PHP script you are using, to give some more info about your setup?
You may try a hostname other than 'localhost' for the remote database you are connecting to.
To set up the SSH tunnel, I added an entry in /etc/hosts for my_db_host_name instead of using localhost:
ssh my_user@my_remote_host -L 3307:my_db_host_name:3306
<?php
$dsn = 'mysql:host=my_db_host_name;dbname=my_db;port=3307';
$username = 'my_user';
$password = 'my_pass';
$options = array(
    PDO::MYSQL_ATTR_INIT_COMMAND => 'SET NAMES utf8',
);
$dbh = new PDO($dsn, $username, $password, $options);
try {
    // Run the query inside the try block so PDO errors are actually caught.
    $result = $dbh->query("select * from my_table");
    foreach ($result as $row) {
        echo "column1: " . $row['column1'];
    }
} catch (PDOException $pde) {
    echo "PDO exception: $pde";
}
echo "done \n";
?>
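The localhost-vs-127.0.0.1 distinction matters because MySQL clients commonly treat 'localhost' specially (a local socket rather than TCP), which bypasses the tunnel entirely. A quick way to confirm that the tunnel's local end is actually listening, sketched here in Python (the function name is my own; port 3307 matches the -L 3307:... flag above):

```python
import socket

def tunnel_is_up(port=3307, host="127.0.0.1", timeout=2.0):
    # Plain TCP connect to the tunnel's local end; using 127.0.0.1
    # (not 'localhost') forces TCP, matching the PDO fix above.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False, the problem is the tunnel itself rather than the MySQL credentials or the DSN.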