My CodeIgniter app on Google App Engine can't connect to my database on Google Cloud SQL. I've tried many things.
My site loads when I leave the database username, password, and database name empty, but pages that make database calls show an error saying that no database was selected.
I noticed that my database had not been created, so I created a new database and a user with all privileges. After entering these details in my app, it no longer connects to the database server at all; no pages are served.
When I remove only the username and password fields in database.php, it connects to the database server but not to the database itself.
I checked the mysql database for users, and my user has all privileges. All spellings are correct, and the app works locally. How can I fix this? I just can't get it to connect.
Out of the box, CodeIgniter will not connect to a Google Cloud SQL instance; modifications to the CI database driver files are required. This is because CI expects its choices to be either localhost or a remote TCP/IP host; the developers never anticipated that anybody would want to connect directly to a socket.
I chose to use the MySQLi driver instead of MySQL for performance reasons, and here is how I did it:
Step 1) Edit the codeigniter/system/database/drivers/mysqli/mysqli_driver.php file and replace the db_connect function with the following code:
function db_connect()
{
    // Connect via a UNIX socket when one is configured (required for Cloud SQL on App Engine)
    if (isset($this->socket))
    {
        return mysqli_connect(null, $this->username, null, $this->database, null, $this->socket);
    }
    elseif ($this->port != '')
    {
        return mysqli_connect($this->hostname, $this->username, $this->password, $this->database, $this->port);
    }
    else
    {
        return mysqli_connect($this->hostname, $this->username, $this->password, $this->database);
    }
}
Step 2) Alter your application's config/database.php (or wherever you declare your database settings). Depending on your application, you may add "database" to the autoload array in yourapp/config/autoload.php, or you may call the load->database() function manually. This assumes your application name is "myappname". Replace APPENGINE-ID, DATABASE-INSTANCE-ID, and YOUR_DATABASE_NAME appropriately.
$db['myappname']['hostname'] = 'localhost';
$db['myappname']['username'] = 'root';
$db['myappname']['password'] = null;
$db['myappname']['database'] = 'YOUR_DATABASE_NAME';
$db['myappname']['dbdriver'] = 'mysqli';
$db['myappname']['pconnect'] = FALSE;
$db['myappname']['dbprefix'] = '';
$db['myappname']['swap_pre'] = '';
$db['myappname']['db_debug'] = FALSE;
$db['myappname']['cache_on'] = FALSE;
$db['myappname']['autoinit'] = FALSE;
$db['myappname']['char_set'] = 'utf8';
$db['myappname']['dbcollat'] = 'utf8_general_ci';
$db['myappname']['cachedir'] = '';
$db['myappname']['socket'] = '/cloudsql/APPENGINE-ID:DATABASE-INSTANCE-ID';
Voilà, your CodeIgniter application should now be able to connect and talk to your Google Cloud SQL MySQL database!
Now if you want to get really fancy and enable database caching, either alter the CI code to use Memcache (fastest) or Google Cloud Storage (more guaranteed persistence), but I won't cover that in this blog…
Answer courtesy of http://arlogilbert.com/post/67855755252/how-to-connect-a-codeigniter-project-to-google-cloud
Have you authorized your appengine app for access to the Cloud SQL instance? Go to the access control panel on the console for the instance (at https://cloud.google.com/console#/project/{project name}/sql/instances/{instance name}/access-control). Look for authorized app engine applications.
Otherwise, if you're connecting to the instance successfully, you'll have to choose the database from your code or configuration (depending on the app). For example, from the "running wordpress" guide (https://developers.google.com/appengine/articles/wordpress) you have to define DB_NAME. If you're handling the connections in your own code you'll need to use mysql_select_db.
From skimming the codeigniter docs, it looks like you need something like:
$config['database'] = "mydatabase";
I'm not familiar with this framework though, so check the docs yourself (http://ellislab.com/codeigniter/user-guide/database/configuration.html).
Related
I have an application developed with SailsJS and MySQL. Only a logged-in user is meant to be able to create a fresh user. During the development stage, I made creation of the first user easy with a simple request to the server. That, however, is no longer feasible, as I have written some policy code to prevent it.
module.exports = async function (req, res, proceed) {
    const adminId = req.param('adminId');
    if (!adminId) {
        return res.status(401).json({status: 401, message: 'Unauthorized access, invalid user'});
    }
    // let's check if the user has a role as superadmin
    const superAdmin = await Admin.findOne({id: adminId, superAdmin: true});
    console.log(superAdmin);
    if (superAdmin) {
        return proceed();
    } else {
        return res.status(401).json({status: 401, message: "Unauthorized access. You don't have this privilege"});
    }
}
Also, every new user that is saved has a compulsory createdBy column in the MySQL database.
I currently want to host the project in production. What is the best way to do this? By default, I am supposed to run
sails lift --prod
on the production environment and generate the MySQL tables. However, I won't be able to log in or create an admin user. So what is the best way for me to create the first user?
The "best" way is obviously subjective. Personally I would write a migration to bootstrap the DB with the first production user (you are using migrations, right?).
Some people eschew including DML in their migrations, although in my experience at some point in the long lifetime of an application under development, some type of "data fixup" needs to happen. Doing it as idempotently as possible in a migration has been the easiest and most reliable approach.
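As a concrete sketch of that bootstrap step: the data part of such a migration boils down to a single INSERT for the first super admin. The table and column names below are hypothetical guesses based on the model in the question, and the password placeholder should of course be a real bcrypt hash, never plaintext:

```shell
# Build the bootstrap SQL for the first super admin.
# Hypothetical table/column names -- adjust to your actual Sails model,
# and replace the placeholder with a real bcrypt hash.
BOOTSTRAP_SQL=$(cat <<'SQL'
INSERT INTO admin (name, email, password, superAdmin, createdBy)
VALUES ('bootstrap-admin', 'admin@example.com', '<bcrypt-hash-here>', 1, 0);
SQL
)
printf '%s\n' "$BOOTSTRAP_SQL"
# To apply it on the production host (hypothetical database name):
#   printf '%s\n' "$BOOTSTRAP_SQL" | mysql -u root -p myapp_db
```

Whether this lives in a migration file or a one-off script matters less than it being versioned and repeatable.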
I am a beginner GCP administrator. I have several applications running on one instance. Each application has its own database. I set up automatic instance backup via the GCP GUI.
I would like to prepare for a possible failure of one of the applications, i.e. one database. I would like to prepare a procedure for restoring such a database, but in the GCP GUI there is no option to restore one database, I need to restore the entire instance, which I cannot due to the operation of other applications on this instance.
I also read in the documentation that a backup cannot be exported.
Is there any way to restore only one database from the entire instance backup?
Will I have to write a MySQL script that will backup each database separately and save it to Cloud Storage?
Like Daniel mentioned you can use gcloud sql export/import to do this. You'll also need a Google Storage Bucket.
First export a database to a file
gcloud sql export sql [instance-name] [gs://path-to-export-file.gz] --database=[database-name]
Create an empty database
gcloud sql databases create [new-database-name] --instance=[instance-name]
Use the export file to populate your fresh, empty database.
gcloud sql import sql [instance-name] [gs://path-to-export-file.gz] --database=[database-name]
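Putting the three steps together, a session might look like the sketch below. The instance, bucket, and database names are hypothetical placeholders; the script only echoes each gcloud command so you can review it first, then delete the leading `echo` to run it for real:

```shell
# Hypothetical names -- substitute your own instance, bucket, and database.
INSTANCE="my-instance"
DB="appdb"
NEW_DB="appdb_restore"
EXPORT_URI="gs://my-sql-exports/${DB}-export.sql.gz"

# Dry run: each command is echoed rather than executed.
echo gcloud sql export sql "$INSTANCE" "$EXPORT_URI" --database="$DB"
echo gcloud sql databases create "$NEW_DB" --instance="$INSTANCE"
echo gcloud sql import sql "$INSTANCE" "$EXPORT_URI" --database="$NEW_DB"
```

Note that the export file name ends in .gz, so Cloud SQL compresses it automatically.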
I'm also a beginner here, but as an alternative, I think you could do the following:
Create a new instance with the same configuration
Restore the original backup into the new instance (this is possible)
Create a dump of the one database that you are interested in
Finally, import that dump into the production instance
In this way, you avoid messing around with data exports, limit the dump operation to the unlikely case of a restore, and save money on database instances.
Curious what people think about this approach?
As of now there is no way to restore only one database from the entire instance backup. As you can check in the documentation, the rest of the applications will also experience downtime during a restore (since the target instance will be unavailable for connections and existing connections will be lost).
Since there is no built-in method to restore only one database from the entire instance backup, you are correct: write a MySQL script that backs up each database separately and use import and export operations (here is the relevant documentation regarding import and export operations in the Cloud SQL MySQL context).
But from an implementation point of view, I would recommend using a separate Cloud SQL instance for each application; then you could restore the database when one particular application fails, without causing downtime or issues for the rest of the applications.
I see that the topic has been raised again. Below is a description of how I solved the problem of backing up individual databases from one instance, without using the built-in instance backup mechanism in GCP, and uploading them to Cloud Storage.
To solve the problem, I used Google Cloud Functions written in Node.js 8.
Here is the step-by-step solution:
Create a Cloud Storage Bucket.
Create Cloud Function using Node.js 8.
Edit the code below to match your instance and database parameters:
const {google} = require("googleapis");
const {auth} = require("google-auth-library");
var sqladmin = google.sqladmin("v1beta4");

exports.exportDatabase = (_req, res) => {
  async function doBackup() {
    const authRes = await auth.getApplicationDefault();
    let authClient = authRes.credential;
    var request = {
      // Project ID
      project: "",
      // Cloud SQL instance ID
      instance: "",
      resource: {
        // Contains details about the export operation.
        exportContext: {
          // This is always sql#exportContext.
          kind: "sql#exportContext",
          // The file type for the specified uri (e.g. SQL or CSV)
          fileType: "SQL",
          /**
           * The path to the file in GCS where the export will be stored.
           * The URI is in the form gs://bucketName/fileName.
           * If the file already exists, the operation fails.
           * If fileType is SQL and the filename ends with .gz, the contents are compressed.
           */
          uri: "",
          /**
           * Databases from which the export is made.
           * If fileType is SQL and no database is specified, all databases are exported.
           * If fileType is CSV, you can optionally specify at most one database to export.
           * If csvExportOptions.selectQuery also specifies the database, this field will be ignored.
           */
          databases: [""]
        }
      },
      // Auth client
      auth: authClient
    };
    // Kick off export with requested arguments.
    sqladmin.instances.export(request, function (err, result) {
      if (err) {
        console.log(err);
      } else {
        console.log(result);
      }
      res.status(200).send("Command completed", err, result);
    });
  }
  doBackup();
};
Save and deploy this Cloud Function.
Copy the trigger URL from the configuration page of the Cloud Function.
To run the function automatically at a specified frequency, use Cloud Scheduler: Description: "", Frequency: use unix-cron format, Time zone: choose yours, Target: HTTP, URL: paste the trigger URL you copied before, HTTP method: POST.
That's all; it should work fine.
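For anyone who prefers the command line, the Cloud Scheduler job from the last step can also be created with `gcloud scheduler jobs create http`. The job name, schedule, and trigger URL below are hypothetical placeholders, and the command is only echoed so you can inspect it; remove the leading `echo` to actually create the job:

```shell
# Hypothetical values -- use your own function's trigger URL and schedule.
JOB="nightly-sql-export"
TRIGGER_URL="https://REGION-PROJECT.cloudfunctions.net/exportDatabase"

# Dry run: the command is echoed rather than executed.
echo gcloud scheduler jobs create http "$JOB" \
  --schedule="0 3 * * *" \
  --uri="$TRIGGER_URL" \
  --http-method=POST \
  --time-zone="Europe/Warsaw"
```

The schedule uses the same unix-cron syntax the console asks for (here, every day at 03:00).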
I am currently deploying an application using Apache Tomcat 7
I debugged the errors, and by changing localhost to the IP address, we are able to access the pages, but other users are not able to access the datasets. This may be due to many reasons, e.g. MySQL not allowing them access.
Backend #1 : My own MySQL Server at Local
ERROR : dataset is not defined
This could be because the MySQL server is not allowing access to other users, as privileges have not been granted yet.
Backend #2 : Then, I tried my company's MySQL server.
This database can be accessed by all, using a particular set of credentials and also the particular "Host".
ERROR : XMLHttpRequest cannot load http://localhost:8080/composite2/Tablev1/datejson. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://172.21.24.24:8080' is therefore not allowed access.
dataset is not defined
In the second case, my "Datasource.groovy"
environments {
    development {
        dataSource {
            //dbCreate = "update" // one of 'create', 'create-drop', 'update', 'validate', ''
            url = "jdbc:mysql://10.1.70.13/ewacs?useUnicode=yes&characterEncoding=UTF-8"
            username = "w"
            password = "w#123"
        }
    }
    test {
        dataSource {
            //dbCreate = "update"
            url = "jdbc:mysql://10.1.70.13/ewacs?useUnicode=yes&characterEncoding=UTF-8"
            username = "w"
            password = "w#123"
        }
    }
    production {
        dataSource {
            //dbCreate = "update"
            url = "jdbc:mysql://10.1.70.13/ewacs?useUnicode=yes&characterEncoding=UTF-8"
            username = "w"
            password = "w#123"
        }
    }
}
In both cases, the datasets are not loading. What am I doing wrong in case #2?
Other users should be able to reach the MySQL server.
Why am I getting the error? I tried looking it up, but I have not found any relevant link yet.
I went through this link : Question
But, I'm a beginner here, not able to understand this at all.
UPDATE:
Ok, so I noticed that in my javascript files, I was still using localhost, big mistake!
I changed localhost -> 10.1.70.13
So, new error now...
GET http://10.1.70.13:8080/composite2/Tablev1/datejson net::ERR_CONNECTION_REFUSED
OR
Failed to load resource: net::ERR_CONNECTION_REFUSED
What is the issue here?
All approaches/suggestions are most welcome.
I have seen a number of posts relating to this issue, however, have still not found an answer that works for me. I'm trying to connect to an external MySQL database on Bluehost, from a Google Apps Script, using Jdbc.getConnection()
I've tried configuring a table with both MyISAM and InnoDB. In both cases I get the "Failed to connect to the database..." error. In one of the posts, I saw that someone had set their storage engine version to 5.5.25a. I looked for how to do that but couldn't find it in the phpMyAdmin interface that Bluehost provides. They also allow you to write SQL scripts but I couldn't find an SQL syntax example other than "ALTER TABLE [tablename] ENGINE=InnoDB", with no way to specify a version number.
In the code sample below, I don't provide a table name since the getConnection() function is failing anyway. If I can get the connection to work, I'll be good to go.
Here's my apps script code:
function myFunction() {
  var address = '69.195.124.100:3306';
  var user = 'nathany7_usr2';
  var userPwd = 'vom4usr2';
  var db = 'nathany7_test2db';
  var dbUrl = 'jdbc:mysql://' + address + '/' + db;
  try {
    // Write one row of data to a table.
    var conn = Jdbc.getConnection(dbUrl, user, userPwd);
    ...
    // close database
    conn.close();
  } catch (e) {
    return e.message;
  }
}
This is an old question but I just happened upon it now and think I might have the answer. Maybe it will be useful for someone else, if not for OP.
If I understand correctly, you are trying to connect to a MySQL database running on your Bluehost server from another server. By design, however, Bluehost blocks all incoming database connections to its server (a reasonable security measure, methinks).
So you first have to follow the steps here ( https://my.bluehost.com/cgi/help/89 ) to allow the server you're executing the script on to pass the connection call through Bluehost's firewall.
I have a MySQL database hosted on my web site, with a table named UsrLic.
Anyone who wants to buy my software must register and enter his/her generated machine key (plus username, email, etc.).
So my question is:
I want to automate this process from my software. How should this process work?
Should I connect to and update my database directly from my software? This would mean I must save all my database connection parameters in it (my database username, password, and server) and then use ADO or MyDAC to connect to the database. If yes, how secure is this process?
Or do you have any other suggestions?
I recommend creating an API on your web site in PHP and calling the API from Delphi.
That way, the database is only available to your web server and not to the client application, ever. In fact, you should run your database on localhost or with a private IP so that only machines on the same physical network can reach it.
I have implemented this and am implementing it again as we speak.
PHP
Create a new file named register_config.php. In this file, setup your MySQL connection information.
Create a file named register.php. In this file, put your registration functions. From this file, include 'register_config.php'. You will pass parameters to the functions you create here, and they will do the reading and writing to your database.
Create a file named register_api.php. From this file, include 'register.php'. Here, you will process POST or GET variables that are sent from your client application, call functions in register.php, and return results back to the client, all via HTTP.
You will have to research connecting to and querying a MySQL database. The W3Schools tutorials will have you doing this very quickly.
For example:
Your Delphi program calls https://mysite/register_api.php with Post() and sends the following values:
name=Marcus
email=marcus@gmail.com
Here's how the beginning of register_api.php might look:
// Our actual database and registration functions are in this library
include 'register.php';

// These are the name value pairs sent via POST from the client
$name = $_POST['name'];
$email = $_POST['email'];

// Sanitize and validate the input here...

// Register them in the DB by calling my function in register.php
if (registerBuyer($name, $email)) {
    // Let them know we succeeded
    echo "OK";
} else {
    // Let them know we failed
    echo "ERROR";
}
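Before wiring up the Delphi side, the API can be smoke-tested from a command line. Here is a sketch with curl against the hypothetical URL from the example above; the command is only echoed, since the endpoint does not actually exist, so remove the leading `echo` to send a real request:

```shell
# Hypothetical endpoint -- replace with your real site (and use HTTPS).
API_URL="https://mysite/register_api.php"

# Dry run: echo the request; drop "echo" to send it for real.
echo curl --silent --show-error \
  --data-urlencode "name=Marcus" \
  --data-urlencode "email=marcus@gmail.com" \
  "$API_URL"
```

If the response body is "OK" or "ERROR" as in the snippet above, the API is behaving as designed.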
Delphi
Use Indy's TIdHTTP component and its Post() or Get() method to post data to register_api.php on the website.
You will get the response back in text from your API.
Keep it simple.
Security
All validation should be done on the server (API). The server must be the gatekeeper.
Sanitize all input to the API from the user (the client) before you call any functions, especially queries.
If you are using shared web hosting, make sure that register.php and register_config.php are not world readable.
If you are passing sensitive information, and it sounds like you are, you should call the registration API function from Delphi over HTTPS. HTTPS provides end to end protection so that nobody can sniff the data being sent off the wire.
Simply hookup a TIdSSLIOHandlerSocketOpenSSL component to your TIdHTTP component, and you're good to go, minus any certificate verification.
Use the SSL component's OnVerifyPeer event to write your own certificate verification method. This is important. If you don't verify the server side certificate, other sites can impersonate you with DNS poisoning and collect the data from your users instead of you. Though this is important, don't let this hold you up since it requires a bit more understanding. Add this in a future version.
Why don't you use e.g. share*it? They also handle the buying process (I don't see how you would do this yourself) and let you create a reg key through a Delphi app.