Recently I was making an upload component for the Joomla 3.1 back-end,
based on How to Save Uploaded File's Name on Database.
I was successful in moving the file to the hard drive,
however I just can't get the update query to work based on the post linked above.
I don't get any SQL errors and saving works, but somehow the database part is ignored.
I really hope I missed something obvious. (By the way, I don't know the Joomla way of writing queries very well.)
In phpMyAdmin the following query works:
UPDATE hmdq7_mysites_projects
SET project_file = 'test'
WHERE id IN (3);
I have tried the following queries:
$id = JRequest::getVar('id');
$db =& JFactory::getDBO();
$sql = "UPDATE hmdq7_mysites_projects
SET project_file =' " . $filename. "'
WHERE id IN (".$id.");";
$db->setQuery($sql);
$db->query();
$colum = "project_file";
$id = JRequest::getVar('id');
$data = JRequest::getVar( 'jform', null, 'post', 'array' );
$data['project_file'] = strtolower( $file['name']['project_file'] );
$db =& JFactory::getDBO();
$query = $db->getQuery(true);
$query->update('#__mysites_projects');
$query->set($column.' = '.$db->quote($data));
$query->where('id'.'='.$db->quote($id));
$db->setQuery($query);
$db->query();
Here is the current code:
class MysitesControllerProject extends JControllerForm
{
function __construct() {
$this->view_list = 'projects';
parent::__construct();
}
function save(){
// ---------------------------- Uploading the file ---------------------
// Necessary libraries and variables
jimport( 'joomla.filesystem.folder' );
jimport('joomla.filesystem.file');
$path= JPATH_SITE . DS . "images";
// Warn if the images folder does not exist
if ( !JFolder::exists(JPATH_SITE . "/images" ) ) {
JFactory::getApplication()->enqueueMessage( $path , 'blue');
}
// Get the file data array from the request.
$file = JRequest::getVar( 'jform', null, 'files', 'array' );
// Make the file name safe.
$filename = JFile::makeSafe($file['name']['project_file']);
// Move the uploaded file into a permanent location.
if ( $filename != '' ) {
// Make sure that the full file path is safe.
$filepath = JPath::clean( JPATH_SITE . "/images/" . $filename );
// Move the uploaded file.
JFile::upload( $file['tmp_name']['project_file'], $filepath );
$colum = "project_file";
$id = JRequest::getVar('id');
$data = JRequest::getVar( 'jform', null, 'post', 'array' );
$data['project_file'] = strtolower( $file['name']['project_file'] );
$db =& JFactory::getDBO();
$query = $db->getQuery(true);
$query->update('#__mysites_projects');
$query->set($column.' = '.$db->quote($data));
$query->where('id'.'='.$db->quote($id));
$db->setQuery($query);
$db->query();
}
// ---------------------------- File Upload Ends ------------------------
JRequest::setVar('jform', $data );
return parent::save();
}
}
(Answered by the OP in comments. Converted to a community wiki answer. See Question with no answers, but issue solved in the comments (or extended in chat) )
The OP wrote:
Solved: after reviewing the post Update record in database using JDatabase, I made up some fixed test values. It turns out the query is correct, but the $data variable in the query had no data. In $data['project_file'] = strtolower( $file['name']['project_file'] ); I removed the array from the first part, so a plain variable was quoted, and it worked.
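For reference, a minimal sketch of what the corrected block could look like, following the OP's explanation above: quote the single sanitized file name rather than the whole $data array. The table, column and request handling are taken from the question; treat this as illustrative rather than tested code.
// Sketch only: quote the plain file name, not the array
$id       = JRequest::getVar('id');
$filename = strtolower(JFile::makeSafe($file['name']['project_file']));

$db    = JFactory::getDBO();
$query = $db->getQuery(true);
$query->update($db->quoteName('#__mysites_projects'))
      ->set($db->quoteName('project_file') . ' = ' . $db->quote($filename))
      ->where($db->quoteName('id') . ' = ' . (int) $id);
$db->setQuery($query);
$db->query();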
I am looking for a way to export BLOB data from the table faq_attachment into a directory of my choice using an SQL command in MySQL Workbench.
If necessary, the file name can be composed of
file_id + filename, for example: 61_image.png
Unfortunately, I am not a MySQL expert…
Thanks a lot
This is a very crude example, but it should work. There is no graceful error handling and there are no safety checks. It will overwrite existing files, so make sure $file_path is writable and empty.
<?php
// Report all PHP errors
error_reporting(E_ALL);
$db_name = '';
$db_host = 'localhost';
$dsn = "mysql:dbname=$db_name;host=$db_host";
$user = '';
$password = '';
// the file path must be writable by the current user
$file_path = 'D:\\tmp\\';
// Creates a PDO instance representing a connection to a database
$PDO = new PDO($dsn, $user, $password);
$PDO->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// Prepares and executes an SQL statement and returns a PDOStatement object or false on failure.
$stmt = $PDO->query('SELECT `id`, `filename`, `content_type`, `content` FROM `faq_attachment`', PDO::FETCH_OBJ);
// iterates through the returned result set
foreach ($stmt as $file) {
// prepare filename with full path to write to
$filename = $file_path . str_replace(' ', '-', "{$file->id}_{$file->filename}");
// writes the content from the BLOB to the given file
file_put_contents($filename, $file->content);
}
I need to add query logging in Symfony, writing to a CSV file. Sometimes the database can be busy or unavailable, but I still want to log all queries.
The log should be in CSV format with the following columns:
- URL
- datasource name
- SQL content
- parameters
- username
- start time of query execution (accuracy in ms)
- end time of query execution (accuracy in ms)
Is there any guidance on how I can do this, or what can be done for it?
Maybe a custom function in which I can create the CSV file with the currently logged-in user's info, the URL, and the time taken to execute the query for that particular URL?
In Symfony, if you have a common SQL function that runs whenever a query executes, then the following is helpful.
This is what I implemented in my case. I hope it helps you.
public function execute($parameters = null)
{
    if ($this->executed) {
        return $this;
    }
    $executionStartTime = microtime(true);

    // Run the underlying prepared statement. (The original snippet called
    // $this->execute($parameters) here, which would recurse forever; it is
    // assumed the class wraps the real statement in $this->stmt.)
    $this->stmt->execute($parameters);
    $this->result = $this->stmt->fetchAll();

    // Close cursor to allow query caching.
    $this->stmt->closeCursor();
    $this->executed = true;

    $executionEndTime = microtime(true);

    // The code below logs the SQL query execution to a CSV file on the web server.
    $url      = $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
    $username = $this->getUser()->getUsername();
    $params   = json_encode($parameters);
    $logFile  = __DIR__ . '/../../app/logs/querylog.csv';

    // Write the header row the first time the log file is created.
    if (!file_exists($logFile)) {
        $fp = fopen($logFile, 'a+');
        fputcsv($fp, array('DATE & TIME', 'URL', 'USERNAME', 'PARAMETERS', 'START TIME', 'END TIME'));
        fclose($fp);
    }

    $fp = fopen($logFile, 'a+');
    fputcsv($fp, array(
        date('Y-m-d H:i:s'),
        $url,
        $username,
        $params,
        $executionStartTime,
        $executionEndTime,
    ));
    fclose($fp);

    return $this;
}
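As an alternative (not part of the original answer): Doctrine DBAL exposes an SQLLogger interface that is invoked for every query, which avoids wrapping execute() by hand. A minimal sketch, assuming Doctrine DBAL 2.x; the CsvSqlLogger class name, the log path and the way the username is passed in are placeholders, not an existing API beyond the SQLLogger interface itself.
<?php

use Doctrine\DBAL\Logging\SQLLogger;

class CsvSqlLogger implements SQLLogger
{
    private $logFile;
    private $username;
    private $sql;
    private $params;
    private $startTime;

    public function __construct($logFile, $username)
    {
        $this->logFile  = $logFile;
        $this->username = $username;
    }

    // Called by DBAL right before a query is executed.
    public function startQuery($sql, array $params = null, array $types = null)
    {
        $this->sql       = $sql;
        $this->params    = $params;
        $this->startTime = microtime(true);
    }

    // Called by DBAL right after the query has finished.
    public function stopQuery()
    {
        $endTime = microtime(true);
        $url     = isset($_SERVER['HTTP_HOST'])
            ? $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']
            : 'cli';

        $fp = fopen($this->logFile, 'a+');
        fputcsv($fp, array(
            date('Y-m-d H:i:s'),
            $url,
            $this->username,
            $this->sql,
            json_encode($this->params),
            $this->startTime,
            $endTime,
        ));
        fclose($fp);
    }
}

// Register it wherever you have access to the entity manager, e.g.:
// $em->getConnection()->getConfiguration()->setSQLLogger(
//     new CsvSqlLogger(__DIR__ . '/../../app/logs/querylog.csv', $username)
// );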
I'm trying to insert multiple JSON files into a database (about 20-30 in total; I'm using 2 for now to test). All the files have the same format. I previously inserted the files into an HTML table, so I reused the same loop so that my script inserts any JSON file found in the directory into the database. However, I am getting some errors: 1) "Undefined index: Comments" and 2) Table 'serverd.serverd' doesn't exist. Any guidance would be appreciated. I have moved my brackets around, but no luck.
<?php
$connect =mysqli_connect("reservation","-----","-----","serverD") or
die("could not connect");
$dir = "/Users/-----/Desktop/reserve/sql";
if (is_dir($dir)) {
if ($dh = opendir($dir)) {
foreach(glob("*.json") as $filename) {
$jsondata = file_get_contents($filename);
$data = json_decode($jsondata, true);
$Manufacturer = $data['Comments']['Manufacturer'];
$Model = $data['Comments']['Model'];
$BIOSFamily = $data['Comments']['BIOSFamily'];
$BIOSDATE = $data['Comments']['BIOSDate'];
$SerialNumber = $data['Comments']['SerialNumber'];
$sql= " INSERT INTO serverD(Manufacturer, Model, BIOSFamily, BIOSDate, SerialNumber)
VALUES('$Manufacturer' , '$Model' , '$BIOSFamily' , '$BIOSDate' , '$SerialNumber')";
$query=mysqli_query($connect, $sql) or die (mysqli_error($connect));
}
}
}
?>
I would like to ask for some help with inserting data from a text file into a database table. I created a PHP script to insert the file's contents into the table, but I have had no luck importing it. Does anyone know how to do this? Please help me.
Current PHP code:
<?php
$host= "localhost";
$user= "root";
$pass= "";
$db="klayton";
$connect= mysql_connect($host,$user,$pass);
if (!$connect)die ("Cannot connect!");
mysql_select_db($db, $connect);
$file = fopen("tblApplicants.txt","r");
while( $applicants = fgets($file) )
{
$sql = "INSERT INTO tb_applicants( aic,name ) VALUES ('$applicants')";
mysql_query($sql);
}
?>
Read each line from the file and skip every second line (when $i % 2 == 0, move on to the next line).
Explode the line at " | ", take the second field ( $row[1] ) and the next one ( $row[2] ), then trim them and build the SQL insert from those values.
Try it this way; a rough sketch of that approach follows below.
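A minimal sketch of the approach described above, using mysqli rather than the old mysql_* functions. It assumes tblApplicants.txt uses " | " as the field separator, that every second line is a separator line to skip, and that the second and third fields hold the aic and name values; adjust the indexes to the real file layout.
<?php
// Sketch only: the file layout and field positions are assumptions.
$connect = mysqli_connect("localhost", "root", "", "klayton")
    or die("Cannot connect!");

$lines = file("tblApplicants.txt", FILE_IGNORE_NEW_LINES);

foreach ($lines as $i => $line) {
    // Skip every second line.
    if ($i % 2 == 0) {
        continue;
    }

    $row  = explode(" | ", $line);
    $aic  = trim($row[1]);
    $name = trim($row[2]);

    // A prepared statement avoids quoting problems in the values.
    $stmt = mysqli_prepare($connect, "INSERT INTO tb_applicants (aic, name) VALUES (?, ?)");
    mysqli_stmt_bind_param($stmt, "ss", $aic, $name);
    mysqli_stmt_execute($stmt);
    mysqli_stmt_close($stmt);
}
?>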
I am trying to export my MySQL tables from my database to a JSON file, so I can list them in an array.
I can create files with this code no problem:
$sql=mysql_query("select * from food_breakfast");
while($row=mysql_fetch_assoc($sql))
{
$ID=$row['ID'];
$Consumption=$row['Consumption'];
$Subline=$row['Subline'];
$Price=$row['Price'];
$visible=$row['visible'];
$posts[] = array('ID'=> $ID, 'Consumption'=> $Consumption, 'Subline'=> $Subline, 'Price'=> $Price, 'visible'=> $visible);
}
$response['posts'] = $posts;
$fp = fopen('results.json', 'w');
fwrite($fp, json_encode($response));
fclose($fp);
Now this reads a table and draws its info from the fields inside it.
I would like to know if it is possible to make a JSON file with the names of the tables, so one level higher in the hierarchy.
I have part of the code:
$showtablequery = "
SHOW TABLES
FROM
[database]
LIKE
'%food_%'
";
$sql=mysql_query($showtablequery);
while($row=mysql_fetch_array($sql))
{
$tablename = $row[0];
$posts[] = array('tablename'=> $tablename);
}
$response['posts'] = $posts;
But now I am stuck at the line that says $ID=$row['ID'];. That relates to the info inside a table, and I do not know what to put there when I am only listing table names.
Also, as you can see, I need to filter the tables so that only those starting with food_ and drinks_ are listed.
Any help is greatly appreciated:-)
There is no 'table id' in MySQL, and therefore the result set from SHOW TABLES has no index named id. The only index in the result set is named 'Tables_in_DATABASENAME'.
Also, you should use the mysqli library, as the good old mysql library is deprecated. Here is a prepared example:
<?php
$mysqli = new mysqli(
'yourserver',
'yourusername',
'yourpassword',
'yourdatabasename'
);
if ($mysqli->connect_errno) {
echo "Failed to connect to MySQL: (" . $mysqli->connect_errno . ") "
. $mysqli->connect_error;
}
$result = $mysqli->query('SHOW TABLES FROM `yourdatabasename` LIKE \'%food_%\'');
if(!$result) {
die('Database error: ' . $mysqli->error);
}
$posts = array();
// use fetch_array instead of fetch_assoc, as the column name ('Tables_in_DATABASENAME') depends on the database name
while($row = $result->fetch_array()) {
$tablename = $row[0];
$posts []= array (
'tablename' => $tablename
);
}
var_dump($posts);
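If you also want each table's rows in the JSON file (as in the first snippet of the question), here is a rough sketch of how that could be combined, reusing the $mysqli connection from above. The database name and the food_/drinks_ patterns are taken from the question, and fetch_assoc() is used so the column names do not have to be hard-coded.
// Sketch only: builds one entry per matching table, with all of its rows,
// and writes the result to results.json as in the question's first snippet.
$posts = array();

foreach (array('%food_%', '%drinks_%') as $pattern) {
    $tables = $mysqli->query("SHOW TABLES FROM `yourdatabasename` LIKE '$pattern'");
    while ($table = $tables->fetch_array()) {
        $tablename = $table[0];

        $rows   = array();
        $result = $mysqli->query('SELECT * FROM `' . $tablename . '`');
        while ($row = $result->fetch_assoc()) {
            $rows[] = $row;
        }

        $posts[] = array('tablename' => $tablename, 'rows' => $rows);
    }
}

file_put_contents('results.json', json_encode(array('posts' => $posts)));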