I have a certain table in MySQL which has a field called "image" with a datatype of BLOB. I was wondering if it is possible to upload an image into that field directly from the command-line client rather than doing it through PHP. If it is possible, where exactly should I place my image files?
Try using the LOAD_FILE() function.
UPDATE `certain_table`
SET image = LOAD_FILE('/full/path/to/new/image.jpg')
WHERE id = 1234;
See the manual for the requirements on the file path, privileges, and so on.
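In particular, on modern MySQL installations LOAD_FILE() only reads files from the directory named by the secure_file_priv variable (and it requires the FILE privilege), so it's worth checking where that points before copying files around:

SHOW VARIABLES LIKE 'secure_file_priv';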
LOAD_FILE works only with certain privileges and only if the file is on the server host. I've found a way to make it work entirely client-side:
mysql -e "update mytable set image=FROM_BASE64('`base64 -i image.png`')" DBNAME
The idea is to encode the image to Base64 on the fly and then let MySQL decode it (FROM_BASE64 requires MySQL 5.6.1 or later).
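To sanity-check that the upload worked, one option is to compare checksums of the column and the local file. A sketch, assuming the table has an id column like in the earlier answer (-N suppresses the column-name header):

mysql -N -e "SELECT MD5(image) FROM mytable WHERE id = 1234" DBNAME
md5sum image.png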
This is a variation on Teudimundo's answer that works with older MySQL versions, where Base64 functions are not available:
mysql -e "update mytable set col = x'$(xxd -p image.png | tr -d \\n)' where ..."
The trick is to use xxd -p to convert a binary file to a plain hexdump:
$ xxd -p /usr/share/font-manager/data/blank.png
89504e470d0a1a0a0000000d4948445200000040000000400806000000aa
6971de000000274944415478daedc1010d000000c220fba77e0e37600000
00000000000000000000000000c0bd0040400001ee1dbb2f000000004945
4e44ae426082
then using tr -d \\n to remove the newlines, and finally embedding the result into a MySQL hexadecimal string literal: x'...'
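If you ever need to go the other way (pulling the blob back out of MySQL into a file), the same tools work in reverse. A sketch, with the WHERE clause left as in the original:

mysql -N -B -e "SELECT HEX(col) FROM mytable WHERE ..." DBNAME | xxd -r -p > image.png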
I recommend never uploading images directly into a database; it's quite inefficient. It's better to store just the location and name of the image, and keep the images themselves in a folder somewhere.
And if you want to "upload" via the command line, you can just update that path column:

update mytable set image_loc = '/images/random/cool.jpg' where id = 1;
Depending on your environment, you can use shell access to move the images around. I'm not quite sure what you are trying to do with these images or how your system is set up, so you'll probably need to clarify that.
It is preferable to build a small application and insert the values through it. For instance, this method could be used to insert a BLOB value into the database...
[WebMethod]
public string sendDataToMySql(string get_name, byte[] buffer)
{
    string MyConString = "SERVER=localhost;" +
                         "DATABASE=test;" +
                         "UID=root;" +
                         "PASSWORD=admin;";
    MySqlConnection connection = new MySqlConnection(MyConString);
    connection.Open();
    MySqlCommand command = new MySqlCommand("", connection);
    // Use parameters so the byte[] is sent as a proper BLOB (Connector/NET parameters use the @ prefix)
    command.CommandText = "insert into testImage(name, image) values(@name, @image);";
    MySqlParameter oParam1 = command.Parameters.Add("@name", MySqlDbType.VarChar, 50);
    oParam1.Value = get_name;
    MySqlParameter oParam2 = command.Parameters.Add("@image", MySqlDbType.Blob);
    oParam2.Value = buffer;
    command.ExecuteNonQuery();
    connection.Close();
    return "Data was inserted successfully!";
}
Sometimes we try to load a file with LOAD_FILE() and either nothing is stored or the file path itself ends up as text in the BLOB field. This is usually an access issue. If you run into this, instead of loading the file from an arbitrary location, load it from MySQL's permitted upload directory (the one named by secure_file_priv), for example:

INSERT INTO `srms`.`images` (`ID`, `Image`) VALUES ('5', LOAD_FILE('C:/ProgramData/MySQL/MySQL Server 5.7/Uploads/test.jpg'));

(Note the forward slashes: inside a MySQL string literal, backslashes are escape characters, so a path like ...Uploads\test.jpg would have its \t silently turned into a tab.)
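A quick way to diagnose this kind of failure is to call LOAD_FILE() on its own: it returns NULL (rather than raising an error) whenever the path, file permissions, or the secure_file_priv setting prevent the read:

SELECT LOAD_FILE('C:/ProgramData/MySQL/MySQL Server 5.7/Uploads/test.jpg') IS NULL;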
I'm using a shell script to launch SQL queries, and that works fine.
I would like to replace, inside a MySQL database, a specific code that is different for each website.
My query is the following:
mysql -D DATABASE_NAME -e "UPDATE TABLE SET params = REPLACE(params, '%OLD_CODE%', 'NEW_CODE') WHERE element = 'EXTENSION'"
The problem: the %OLD_CODE% placeholder does not work; I have to enter the exact string.
Below is the content of the PARAMS field from the MySQL database:
{"com_difficulty":"3","clubcode":"OLD_CODE","shownews":"1","com_calViewName":"flat","darktemplate":"0"}
I need to replace OLD_CODE with a NEW_CODE without losing the other settings inside the PARAMS field.
The problem is that the OLD_CODE string is different for each site, so how can I replace each OLD_CODE string?
I have also tried with the * character to match everything, but that does not work.
Example:
How to replace this:
{"com_difficulty":"3","clubcode":"546465-595gfd-gfgfds65-5654gfd","shownews":"1","com_calViewName":"flat","darktemplate":"0"}
with this:
{"com_difficulty":"3","clubcode":"fg5gfdgfq-grefdg-gredfgfd","shownews":"1","com_calViewName":"flat","darktemplate":"0"}
Thanks
L.
This query works:
mysql -D DATABASE_NAME -e "UPDATE TABLE SET params = REGEXP_REPLACE(params, '\"clubcode\":\".+?\"', '\"clubcode\":\"TOTO\"') WHERE element = 'EXTENSION'"
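Note that REGEXP_REPLACE() only exists in MySQL 8.0+ and MariaDB 10.0.5+. Since the column holds JSON, another option on MySQL 5.7+ is JSON_SET(), which avoids the regex entirely. A sketch with the same placeholder table and column names (be aware it may normalize the JSON's whitespace):

mysql -D DATABASE_NAME -e "UPDATE TABLE SET params = JSON_SET(params, '$.clubcode', 'NEW_CODE') WHERE element = 'EXTENSION'"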
I am working on a legacy project and need to export some files from my Lotus Notes database to a MySQL DB using an ODBC connection.
I have ~94,000 documents in the Lotus database, each with small attachments (30-40 KB).
As always for tasks like this, I start with something like:
Dim mysqlConnection As New ODBCConnection
Dim sqlQuery As New ODBCQuery
Dim result As New ODBCResultSet
Dim notesSession As New NotesSession
Dim ntsDatabase As NotesDatabase

Set ntsDatabase = notesSession.CurrentDatabase
Call mysqlConnection.ConnectTo("DSN_NAME","NAME","PASS")
And I have had no problems sending/parsing data with queries like this:
Set sqlQuery.Connection = mysqlConnection
Set result.Query = sqlQuery
sqlQuery.SQL = some query e.t.c.
Everything works fine. But now I am trying to find a way to send files to the MySQL database, and I'm having real trouble finding a solution.
Can you please give a small example of sending a small blob file to MySQL, or some advice on how to solve this?
Thanks!
I don't think an example like that could be considered "small".
You're going to have to extract the attachment to a file, read the file into a NotesStream, convert the bytes in the NotesStream into a Base64 string, and send that string value in a SQL command.
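On the MySQL side, that Base64 string would then be decoded back into the blob column with FROM_BASE64() (MySQL 5.6.1+). A sketch with hypothetical table and column names:

UPDATE attachments
SET file_body = FROM_BASE64('...Base64 text built from the NotesStream...')
WHERE doc_unid = 'ABC123';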
I've got 200k CSV files and I need to import them all into a single PostgreSQL table. It's a list of parameters from various devices; each CSV's file name contains the device's serial number, and I need that serial number in one of the columns for each row.
So to simplify, I've got a few columns of data (no headers). Let's say the columns in each CSV file are Date, Variable, Value, and the file name is SERIALNUMBER_and_someOtherStuffIDontNeed.csv.
I'm trying to use Cygwin to write a bash script that iterates over the files and does this for me, but for some reason it won't work, showing: syntax error at or near "as"
Here's my code:
#!/bin/bash
FILELIST=/cygdrive/c/devices/files/*
for INPUT_FILE in $FILELIST
do
psql -U postgres -d devices -c "copy devicelist
(
Date,
Variable,
Value,
SN as CURRENT_LOAD_SOURCE(),
)
from '$INPUT_FILE'
delimiter ',' ;"
done
I'm learning SQL, so it might be an obvious mistake, but I can't see it.
Also, I know that in this form I will get the full file name rather than just the serial-number part I want, but I can probably handle that later.
Please advise.
Thanks.
I don't think there is a CURRENT_LOAD_SOURCE() function in Postgres. A workaround is to leave the name column NULL during the COPY and patch it to the desired value right afterwards. I prefer a shell here-document because it makes quoting inside the SQL body easier. (BTW: with tens of thousands of files, the globbing needed to build FILELIST might exceed the shell's ARG_MAX...)
#!/bin/bash
FILELIST="`ls /tmp/*.c`"
for INPUT_FILE in $FILELIST
do
echo "File:" $INPUT_FILE
psql -U postgres -d devices <<OMG
-- I have a schema "tmp" for testing purposes
CREATE TABLE IF NOT EXISTS tmp.filelist(name text, content text);
COPY tmp.filelist ( content)
from '/$INPUT_FILE' delimiter ',' ;
UPDATE tmp.filelist SET name = '$INPUT_FILE'
WHERE name IS NULL;
OMG
done
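One caveat with this sketch: COPY ... FROM 'file' is executed by the server process and reads the path with the server's permissions, so it only works when the files are visible to the server. If psql runs on another machine (or under another user), the client-side \copy meta-command reads the file locally instead; reusing the same loop variable:

psql -U postgres -d devices -c "\copy tmp.filelist(content) from '$INPUT_FILE' delimiter ','"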
For anyone interested in an answer: I used a Python script to rename the files, and then another script using psycopg2 to connect to the database and do everything over one connection. It took 10 minutes instead of 10 hours.
Here's the code:
Renaming the files (it also turns out that to import from CSV you need all the rows to be filled, and the information I needed was in the first 4 columns anyway, so I generate whole new CSVs instead of just renaming them):
import os
import csv

path = 'C:/devices/files'
os.chdir(path)

i = 0
for file in os.listdir(path):
    try:
        i += 1
        if i % 10000 == 0:
            # just to see the progress
            print(i)
        serial_number = file[:8]
        # newline='' stops the csv module from inserting blank rows on Windows
        with open(file, newline='') as src, open('processed_' + file, 'w', newline='') as dst:
            creader = csv.reader(src)
            cwriter = csv.writer(dst)
            for cline in creader:
                # drop columns 4-7, which are not needed
                new_line = [val for col, val in enumerate(cline) if col not in (4, 5, 6, 7)]
                # prepend the serial number taken from the file name
                new_line.insert(0, serial_number)
                cwriter.writerow(new_line)
    except Exception:
        print('problem with file: ' + file)
Updating database:
import os
import psycopg2

path = "C:\\devices\\files"
directory_listing = os.listdir(path)

conn = psycopg2.connect("dbname='devices' user='postgres' host='localhost'")
cursor = conn.cursor()
print(len(directory_listing))

i = 100001
while i < 218792:
    current_file = directory_listing[i]
    i += 1
    full_path = "C:/devices/files/" + current_file
    with open(full_path) as f:
        cursor.copy_from(file=f, table='devicelistlive', sep=",")

conn.commit()
conn.close()
Don't mind the while loop and the odd-looking numbers; I was processing the files in batches for testing purposes. It can easily be replaced with a for loop.
I want to make a sitemap for Google. The fastest way to do it (I think) is to export the query result directly to a text file with a Linux command. I do it like this:
$ mysql -pMyPassword -e "SELECT concat('http:/mysitecom/',id,'/',urlPart1,'/',urlPart2,'/') as url FROM products LIMIT 50000" > /home/file.xml
The problem is that Google says there must be nothing but URLs in the text file, and my first line is "url", the resulting column name, as is typical for CSV output. How can I skip it?
Piping it through tail should work:
mysql -pMyPassword -e "SELECT concat('http:/mysitecom/',id,'/',urlPart1,'/',urlPart2,'/') as url FROM products LIMIT 50000" | tail -n +2 > /home/file.xml
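Alternatively, the mysql client's -N (--skip-column-names) option suppresses the header row at the source, so nothing needs to be stripped afterwards:

mysql -N -pMyPassword -e "SELECT concat('http:/mysitecom/',id,'/',urlPart1,'/',urlPart2,'/') FROM products LIMIT 50000" > /home/file.xml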
Do remember that you'll have to re-run this each time the table is updated. Depending on how your site is implemented, I'd imagine it wouldn't be too complex to run the query and output the list at an endpoint such as domain.com/sitemap. What language are you using?
Using PHP
Pretty simple:
// do your regular database connection where $db = mysqli_connect(...);
$resource = mysqli_query($db, "SELECT concat('http:/mysitecom/',id,'/',urlPart1,'/',urlPart2,'/') as url FROM products");
$urls = [];
while ($arr = mysqli_fetch_row($resource)) {
    $urls[] = $arr[0];
}
header('Content-type: text/plain');
foreach ($urls as $url) {
    echo $url . PHP_EOL;
}
It should work, but I haven't tested it! (Especially the header; that's off the top of my head.)
I have this slot:
void Managment::dbExportTriggered()
{
save = QFileDialog::getSaveFileName(this, trUtf8("Export db"),
QDir::currentPath() + "Backup/",
trUtf8("Dumped database (*.sql)"));
sqlQuery = "SELECT * INTO OUTFILE '" + save + ".sql' FROM Users, Data";
//QMessageBox::critical(0, trUtf8("query dump"), QString::number(query.exec(sqlQuery)));
query.exec(sqlQuery);
}
And I have this query:
sqlQuery = "SELECT * INTO OUTFILE " + save + " FROM Users, Data";
It executes normally, but no dump file appears, even though the backup directory has the right permissions. The dumped database needs to end up on the client.
UPDATE:
After some searching I found that the INTO OUTFILE query dumps the database on the server, not on the client as I thought. So my question now is: how can I dump a database from a remote MySQL server? Are there any quick methods without external tools like the mysqldump client?
SELECT ... INTO OUTFILE creates a file on the MySQL server machine, with permissions matching whoever the MySQL server runs as. Unless you have root access on the MySQL server to retrieve the file that you're exporting, SELECT ... INTO OUTFILE is unlikely to do what you want.
In fact, I think I'd go so far as to say that if you're trying to use SELECT ... INTO OUTFILE from a GUI client, you're probably taking the wrong approach to your problem.
Just an idea: another approach is to call mysqldump with QProcess. With some google-fu, this seems to be an example:
// ...
if (allDatabases->isChecked()) {
    arguments << "--all-databases";
} else {
    arguments << "--databases";
    foreach (QListWidgetItem *item, databasesList->selectedItems())
        arguments << item->text();
}

proc->setReadChannel(QProcess::StandardOutput);
QApplication::setOverrideCursor(Qt::WaitCursor);
proc->start("mysqldump", arguments);
// ...
Thus, you can also add parameters to dump only a specific table.
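For reference, the equivalent command-line call is short; a single-table dump is just (hypothetical host, database, and table names):

mysqldump -h remote.host -u user -p dbname tablename > tablename.sql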
Edit:
Just a note from the MySQL docs on the SELECT ... INTO OUTFILE statement:
If you want to create the resulting file on some other host than the server host, you normally cannot use SELECT ... INTO OUTFILE since there is no way to write a path to the file relative to the server host's file system.
Thus you must roll your own, or you can use mysql -e as suggested by the above documentation.
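That documented workaround is to run the query from the client host and redirect the output there, along these lines (a sketch; -B produces tab-separated batch output):

mysql -h server_host -u user -p -B -e "SELECT * FROM Users" dbname > users.tsv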
Did you dump/print save to check that it's valid? Does currentPath() return a trailing "/"?
Could there be a difference between the path as seen by your client program and as (to be) seen by the server?
Does the user have the necessary privileges (the FILE privilege for sure, maybe more)?
Can't you get an error message from the log?
Are you getting any errors running the sql statement?
I notice that you're concatenating the filename into the SQL query without surrounding it with quotation marks. Your code will yield something like
SELECT * INTO OUTFILE /path/to/somewhere FROM Users, Data
but the MySQL documentation says it wants something like
SELECT * INTO OUTFILE '/path/to/somewhere' FROM Users, Data
Also keep the following in mind:
The file is created on the server host, so you must have the FILE privilege to use this syntax. file_name cannot be an existing file, which among other things prevents files such as /etc/passwd and database tables from being destroyed.
If you're looking on your client, you won't see the file there, even if the operation succeeds.