I am looking for a way to load BLOB data from the table faq_attachment into a directory of my choice, using an SQL command in MySQL Workbench.
If necessary, the name can be composed of file_id + filename, for example: 61_image.png.
I am unfortunately not a MySQL expert…
Thanks a lot.
This is a very crude example but it should work. There is no graceful error handling or safety checks. It will overwrite existing files so make sure $file_path is writable and empty.
<?php
// Report all PHP errors
error_reporting(E_ALL);
$db_name = '';
$db_host = 'localhost';
$dsn = "mysql:dbname=$db_name;host=$db_host";
$user = '';
$password = '';
// the file path must be writable by the current user
$file_path = 'D:\\tmp\\';
// Creates a PDO instance representing a connection to a database
$PDO = new PDO($dsn, $user, $password);
$PDO->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// Prepares and executes an SQL statement and returns a PDOStatement object or false on failure.
$stmt = $PDO->query('SELECT `id`, `filename`, `content_type`, `content` FROM `faq_attachment`', PDO::FETCH_OBJ);
// iterates through the returned result set
foreach ($stmt as $file) {
    // prepare filename with full path to write to
    $filename = $file_path . str_replace(' ', '-', "{$file->id}_{$file->filename}");
    // writes the content from the BLOB to the given file
    file_put_contents($filename, $file->content);
}
I am trying to connect to 2 databases on the same instance of MySQL from 1 Perl script.
I am using this in a migration script where I am grabbing data from the original database and inserting it into the new one.
Connecting to 1 database and then trying to initiate a second connection with the same user just changes the current database to the new one.
#!/usr/bin/perl
use DBI;
use strict;
my $driver = "mysql";
my $database1 = "db1";
my $dsn1 = "DBI:$driver:database=$database1";
my $userid = "userhead";
my $password = "pwdhead";
my $database2 = "db2";
my $dsn2 = "DBI:$driver:database=$database2";
my $dbh1 = DBI->connect($dsn1, $userid, $password ) or die $DBI::errstr;
my $dbh2 = DBI->connect($dsn2, $userid, $password ) or die $DBI::errstr;
my $sth = $dbh2->prepare("INSERT INTO Persons") $dbh1->prepare("SELECT *FROM Persons");
$sth->execute() or die $DBI::errstr;
print "Number of rows found :" + $sth->rows;
In the above example I am trying to copy from one database table to another database table, but I am getting an error while running the script. Please help me out.
At a guess, you're trying to use the same database handle to connect to both databases. If you need to operate two separate connections then you need two separate handles.
This program uses the data_sources class method to discover all of the available MySQL databases and creates a connection to each of them, putting the handles in the array @dbh. You can use each element of that array as normal, for instance
my $stmt = $dbh[0]->prepare('SELECT * FROM table');
It may be that you prefer to set up the @databases array manually, or the username and password may be different for the two data sources, so some variation on this may be necessary
use strict;
use warnings 'all';
use DBI;
my $user = 'username';
my $pass = 'password';
my @databases = DBI->data_sources('mysql');
my @dbh = map { DBI->connect($_, $user, $pass) } @databases;
Update
You need to select data from the source table, fetch it one row at a time, and insert each row into the destination table.
Here's an idea of how that might work, but you need to adjust the number of question marks in the VALUES clause of the INSERT statement to match the number of columns.
Note that, if you're just intending to copy the whole dataset, there are better ways to go about this. In particular, if you have any foreign key constraints then you won't be able to add data until the table it depends on is populated.
#!/usr/bin/perl
use strict;
use warnings 'all';
use DBI;
my $userid = "userhead";
my $password = "pwdhead";
my ($dbase1, $dbase2) = qw/ db1 db2 /;
my $dsn1 = "DBI:mysql:database=$dbase1";
my $dsn2 = "DBI:mysql:database=$dbase2";
my $dbh1 = DBI->connect($dsn1, $userid, $password ) or die $DBI::errstr;
my $dbh2 = DBI->connect($dsn2, $userid, $password ) or die $DBI::errstr;
my $select = $dbh1->prepare("SELECT * FROM Persons");
my $insert = $dbh2->prepare("INSERT INTO Persons VALUES (?, ?, ?, ?, ?)");
$select->execute;
while ( my @row = $select->fetchrow_array ) {
    $insert->execute(@row);
}
If you need to handle the columns from the source data separately then you can use named scalars instead of the array @row. Like this
while ( my ($id, $name) = $select->fetchrow_array ) {
    my $lastname = '';
    $insert->execute($id, $name, $lastname);
}
Plan A (especially if one-time task):
Run mysqldump on the source machine; feed the output to mysql on the target machine. This will be much faster and simpler. If you are on a Unix machine, do it with an exec() from Perl (if you like).
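For example, here is a rough sketch of Plan A driven from Perl, reusing the database names and credentials from the question (both databases are on the same server, so no host options are needed). It uses system() rather than exec() so the script keeps running afterwards; treat it as an illustration, not a polished script:
#!/usr/bin/perl
use strict;
use warnings;
# Dump the Persons table from db1 and feed it straight into db2.
# The -p option takes the password with no space after it.
my $cmd = q{mysqldump -u userhead -ppwdhead db1 Persons }
        . q{| mysql -u userhead -ppwdhead db2};
system($cmd) == 0
    or die "mysqldump | mysql pipeline failed (exit status $?)\n";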
Plan B (especially if repeated task):
If the table is not "too big", do one SELECT to fetch all the rows into an array in Perl. Then INSERT the rows into the target machine. This can be sped up (with some effort) if you build a multi-row INSERT or create a CSV file and use LOAD DATA instead of INSERT.
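A minimal sketch of the multi-row INSERT variant of Plan B, reusing the $dbh1 and $dbh2 handles from the earlier answer and assuming Persons has five columns (adjust the placeholder group to match your table):
# Fetch every row from the source table into memory
my $rows = $dbh1->selectall_arrayref('SELECT * FROM Persons');
if (@$rows) {
    # Build one "(?, ?, ?, ?, ?)" group per row and insert them all at once
    my $placeholders = join ', ', ('(?, ?, ?, ?, ?)') x @$rows;
    my @values       = map { @$_ } @$rows;
    $dbh2->do("INSERT INTO Persons VALUES $placeholders", undef, @values);
}
For very large tables you would insert in batches, or write the rows out to a CSV file and use LOAD DATA as suggested above.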
I just started using Perl. I am able to connect to my MySQL database, create tables and get query results using my Perl Script. I came across a task that involves "You MUST use the provided DB.pm for all database interaction, and you must use it as it is (DB.pm cannot be modified except for the connection settings)."
What does that mean? Can anyone guide me in the right direction?
DB.pm file contains the following code
package GUI::DB;
use strict;
use DBI;
use vars qw(@ISA @EXPORT);
use Exporter;
@ISA = qw(Exporter);
@EXPORT = qw(dbConnect query);
#
# dbConnect - connect to the database, get the database handle
#
sub dbConnect {
    # Read database settings from config file:
    my $dsn = "DBI:mysql:database=test";
    my $dbh = DBI->connect( $dsn,
                            '',
                            '',
                            { RaiseError => 1 }
                          );
    return $dbh;
}
#
# query - execute a query with parameters
# query($dbh, $sql, @bindValues)
#
sub query {
    my $dbh = shift;
    my $sql = shift;
    my @bindValues = @_; # 0 or several parameters
    my @returnData = ();
    # issue query
    my $sth = $dbh->prepare($sql);
    if ( @bindValues ) {
        $sth->execute(@bindValues);
    } else {
        $sth->execute();
    }
    if ( $sql =~ m/^select/i ) {
        while ( my $row = $sth->fetchrow_hashref ) {
            push @returnData, $row;
        }
    }
    # finish the sql statement
    $sth->finish();
    return @returnData;
}
__END__
Probably it means that in your code you must use something like this:
use GUI::DB;
my $dbh = dbConnect();
my $sql = qq{SELECT * FROM my_table};
my @data = query($dbh, $sql);
You interact with the database through the provided module.
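Since query() also accepts bind values (its signature is query($dbh, $sql, @bindValues)) and returns one hashref per row for SELECT statements, you can pass parameters after the SQL and iterate over the results. A small sketch, with purely illustrative table and column names:
use GUI::DB;
my $dbh = dbConnect();
# Parameterised SELECT; my_table, id and name are only examples
my @rows = query($dbh, 'SELECT id, name FROM my_table WHERE id = ?', 42);
# For SELECTs, query() returns a list of hashrefs built with fetchrow_hashref
for my $row (@rows) {
    print "$row->{id}: $row->{name}\n";
}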
Recently I was making an upload component for the Joomla 3.1 back-end,
based on How to Save Uploaded File's Name on Database.
I was successful in moving the file to the hard drive,
however I just can't get the update query to work based on the post linked above.
I don't get any SQL errors and saving works, but somehow the database part is ignored.
I really hope I missed something obvious. (BTW, I don't know the Joomla way of queries very well.)
In phpmyadmin the following query works:
UPDATE hmdq7_mysites_projects
SET project_file = 'test'
WHERE id IN (3);
I have tried the following queries:
$id = JRequest::getVar('id');
$db =& JFactory::getDBO();
$sql = "UPDATE hmdq7_mysites_projects
SET project_file =' " . $filename. "'
WHERE id IN (".$id.");";
$db->setQuery($sql);
$db->query();
$colum = "project_file";
$id = JRequest::getVar('id');
$data = JRequest::getVar( 'jform', null, 'post', 'array' );
$data['project_file'] = strtolower( $file['name']['project_file'] );
$db =& JFactory::getDBO();
$query = $db->getQuery(true);
$query->update('#__mysites_projects');
$query->set($column.' = '.$db->quote($data));
$query->where('id'.'='.$db->quote($id));
$db->setQuery($query);
$db->query();
Here is the current code:
class MysitesControllerProject extends JControllerForm
{
    function __construct() {
        $this->view_list = 'projects';
        parent::__construct();
    }

    function save() {
        // ---------------------------- Uploading the file ---------------------
        // Necessary libraries and variables
        jimport('joomla.filesystem.folder');
        jimport('joomla.filesystem.file');
        $path = JPATH_SITE . DS . "images";
        // Create the gonewsleter folder if it does not exist in the images folder
        if ( !JFolder::exists(JPATH_SITE . "/images" ) ) {
            JFactory::getApplication()->enqueueMessage( $path , 'blue');
        }
        // Get the file data array from the request.
        $file = JRequest::getVar( 'jform', null, 'files', 'array' );
        // Make the file name safe.
        $filename = JFile::makeSafe($file['name']['project_file']);
        // Move the uploaded file into a permanent location.
        if ( $filename != '' ) {
            // Make sure that the full file path is safe.
            $filepath = JPath::clean( JPATH_SITE . "/images/" . $filename );
            // Move the uploaded file.
            JFile::upload( $file['tmp_name']['project_file'], $filepath );
            $colum = "project_file";
            $id = JRequest::getVar('id');
            $data = JRequest::getVar( 'jform', null, 'post', 'array' );
            $data['project_file'] = strtolower( $file['name']['project_file'] );
            $db =& JFactory::getDBO();
            $query = $db->getQuery(true);
            $query->update('#__mysites_projects');
            $query->set($column.' = '.$db->quote($data));
            $query->where('id'.'='.$db->quote($id));
            $db->setQuery($query);
            $db->query();
        }
        // ---------------------------- File Upload Ends ------------------------
        JRequest::setVar('jform', $data );
        return parent::save();
    }
}
(Answered by the OP in comments. Converted to a community wiki answer. See Question with no answers, but issue solved in the comments (or extended in chat) )
The OP wrote:
Solved: after reviewing the post Update record in database using JDatabase, I made up some fixed test values. It turns out the query is correct, but the $data variable in the query had no data. In $data['project_file'] = strtolower( $file['name']['project_file'] ); I removed the array from the first part, and the variable worked.
Connect with MySQL and retrieve data from the table.
my $db ="JJusers";
my $user ="root";
my $password ="abcdef";
my $host ="localhost";
my $dbh =DBI->connect("DBI:mysql:$db:$host",$user,$password);
my $uDt = $dbh->prepare("select Username,Password from Users");
my $rv = $uDt->execute;
print "<script>alert($rv)</script>";
When I execute this code I am getting the result as 1. In the database the data is stored as:
1, jj, pp (SNO, USERNAME, PASSWORD)
Why isn't it getting the right data?
You are printing the result of execute, not the actual database results. You want to do something like this...
# fetch rows from the statement handle ($uDt), not from the return value of execute
while (my @data = $uDt->fetchrow_array()) {
    my $username = $data[0];
    my $password = $data[1];
    # ...
}
->execute returns just the query status (0, 1, or 0E0), not a result set.
In my opinion, the best way is:
my $res = $dbh->selectall_arrayref('select Username,Password from Users', {Slice=>{}});
# now, you can iterate result.
# for example:
foreach my $row (@$res) {
    print $row->{Username};
}
If you need bind variables, you can also use selectall_arrayref:
my $res = $dbh->selectall_arrayref('select Username,Password from Users where id = ?',
{Slice=>{}}, 1
);
I am trying to import several .csv files into a MySQL database. The script below works, except that it only imports the first row of my CSV data into the database. Both my tables are populated with exactly one data entry.
Any help would be appreciated.
Thank you
#!/usr/bin/perl
use DBI;
use DBD::mysql;
use strict;
use warnings;
# MySQL CONFIG VARIABLES
my $host = "localhost";
my $user = "someuser";
my $pw = "somepassword";
my $database = "test";
my $dsn = "DBI:mysql:database=" . $database . ";host=" . $host;
my $dbh = DBI->connect($dsn, $user, $pw)
or die "Can't connect to the DB: $DBI::errstr\n";
print "Connected to DB!\n";
# enter the file name that you want import
my $filename = "/home/jonathan/dep/csv/linux_datetime_test_4.26.13_.csv";
open FILE, "<", $filename or die $!;
$_ = <FILE>;
$_ = <FILE>;
while (<FILE>) {
    my @f = split(/,/, $_);
    if (length($f[4]) < 10) {
        print "No Weight\n";
    }
    else {
        #insert the data into the db
        print "insert into datetime_stamp\n";
    }
    my $sql = "INSERT INTO datetime_stamp (subject, date, time, weight)
               VALUES('$f[1]', '$f[2]', '$f[3]', '$f[4]')";
    print "$sql\n";
    my $query = $dbh->do($sql);
    my $sql = "INSERT INTO subj_weight (subject, weight) VALUES('$f[1]', '$f[2]')";
    my $query = $dbh->do($sql);
    close(FILE);
}
As has been commented, you close the input file after reading the first data entry, and so only populate your database with a single record.
However there are a few problems with your code you may want to consider:
You should set autoflush on the STDOUT file handle if you are printing diagnostics as the program runs. Otherwise perl won't print the output until either it has a buffer full of text to print or the file handle is closed when the program exits. That means you may not see the messages you have coded until long after the event
You should use Text::CSV to parse CSV data instead of relying on split
You can interpolate variables into a double-quoted string. That avoids the use of several concatenation operators and makes the intention clearer
Your open is near-perfect - an unusual thing - because you correctly use the three-parameter form of open as well as testing whether it succeeded and putting $! in the die string. However you should always use a lexical file handle instead of the old-fashioned global ones
You don't chomp the lines you read from the input, so the last field will have a trailing newline. Using Text::CSV avoids the need for this
You use indices 1 through 4 of the data split from the input record. Perl indices start at zero, so that means you are dropping the first field. Is that correct?
Similarly you are inserting fields 1 and 2, which appear to be subject and date, into fields called subject and weight. It seems unlikely that this can be right
You should prepare your SQL statements, use placeholders, and provide the actual data in an execute call
You seem to diagnose the data read from the file ("No Weight") but insert the data into the database anyway. This may be correct but it seems unlikely
Here is a version of your program that includes these amendments. I hope it is of use to you.
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Text::CSV;
use IO::Handle;
STDOUT->autoflush;
# MySQL config variables
my $host = "localhost";
my $user = "someuser";
my $pw = "somepassword";
my $database = "test";
my $dsn = "DBI:mysql:database=$database;host=$host";
my $dbh = DBI->connect($dsn, $user, $pw)
or die "Can't connect to the DB: $DBI::errstr\n";
print "Connected to DB!\n";
my $filename = "/home/jonathan/dep/csv/linux_datetime_test_4.26.13_.csv";
open my $fh, '<', $filename
or die qq{Unable to open "$filename" for input: $!};
my $csv = Text::CSV->new;
$csv->getline($fh) for 1, 2; # Drop header lines
my $insert_datetime_stamp = $dbh->prepare( 'INSERT INTO datetime_stamp (subject, date, time, weight) VALUES(?, ?, ?, ?)' );
my $insert_subj_weight = $dbh->prepare( 'INSERT INTO subj_weight (subject, weight) VALUES(?, ?)' );
while (my $row = $csv->getline($fh)) {
    if (length($row->[4]) < 10) {
        print qq{Invalid weight: "$row->[4]"\n};
    }
    else {
        #insert the data into the db
        print "insert into datetime_stamp\n";
        $insert_datetime_stamp->execute(@$row[1..4]);
        $insert_subj_weight->execute(@$row[1,4]);
    }
}