Insert a BLOB value into a MySQL database

I have a MySQL database with a column named img of BLOB type.
When I insert a value into that column like this:
LOAD_FILE('C:/Documents and Settings/All Users/Documents/My Pictures/Sample Pictures/Sunset.jpg')
it works!
But like this:
LOAD_FILE('C:/Documents and Settings/Administrator/My Documents/My Pictures/picture.jpg')
it doesn't work, and it tells me that the column img cannot be null!
And in both cases the file exists, and I'm connecting to the database as the root user (all privileges), so I don't understand why I'm getting this error.
Thanks in advance

Maybe the problem is with max_allowed_packet.
In the demo below, 1.jpg is a small picture and 2.jpg is a big one:
mysql> DESCRIBE blob_files;
+-------+---------+------+-----+---------+----------------+
| Field | Type    | Null | Key | Default | Extra          |
+-------+---------+------+-----+---------+----------------+
| id    | int(11) | NO   | PRI | NULL    | auto_increment |
| file  | blob    | YES  |     | NULL    |                |
+-------+---------+------+-----+---------+----------------+
2 rows in set (0.01 sec)
mysql> INSERT INTO blob_files(file) VALUE(LOAD_FILE('D:/2.jpg'));
Query OK, 1 row affected, 1 warning (0.00 sec)
mysql> SHOW WARNINGS;
+---------+------+----------------------------------------------------------------------------------+
| Level   | Code | Message                                                                          |
+---------+------+----------------------------------------------------------------------------------+
| Warning | 1301 | Result of load_file() was larger than max_allowed_packet (1048576) - truncated  |
+---------+------+----------------------------------------------------------------------------------+
1 row in set (0.00 sec)
mysql> INSERT INTO blob_files(file) VALUE(LOAD_FILE('D:/1.jpg'));
Query OK, 1 row affected (0.05 sec)
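If large files get truncated like this, the usual fix is to raise max_allowed_packet on the server. A minimal sketch (the 64M value is only an assumption; pick something larger than your biggest file, and note that existing client connections must reconnect to see the new value):
-- Check the current limit (1048576 = 1 MB is the old default)
SHOW VARIABLES LIKE 'max_allowed_packet';
-- Raise it for new connections; to make it permanent, set max_allowed_packet=64M
-- under [mysqld] in my.ini / my.cnf and restart the server
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;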

I'm going to post this as an answer, and I'll modify it as needed. In the comments I mentioned that mysqld needs to run as administrator. Upon consideration, I realized this is actually not a good idea, since Windows UAC is in place for a reason. A better option is to add the necessary permissions to the folder.
Go to your My Documents folder under the Administrator account, right-click My Pictures, go to the Security tab, and add "LOCAL SERVICE" to the permissions with read access. Then your MySQL server should be able to read from that folder.
If you want to verify that LOCAL SERVICE is the proper account, go to Start -> Run, type services.msc, and press Enter. Find MySQL, right-click it and choose Properties, go to the Log On tab, and see which account it runs as. That is the account you should add to the folder's Security tab with read permissions.
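Once the permission is in place, a quick way to confirm that the server can now read the file is to check whether LOAD_FILE() still returns NULL (path taken from the question; this is only a sanity check, assuming nothing else such as max_allowed_packet blocks the read):
SELECT LOAD_FILE('C:/Documents and Settings/Administrator/My Documents/My Pictures/picture.jpg') IS NOT NULL AS can_read;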

This may be a little late, but here is how I insert BLOB files when I just need them as sample data:
Using MySQL Workbench, I enter a record into my BLOB table without applying the insert.
I right-click the BLOB field, open the value editor, and upload an image there.
Before applying the insert, I copy the row and paste it repeatedly (a stored-procedure loop can be used as well; see the sketch below).
I do the same thing when updating BLOB fields, except that in the WHERE clause I use WHERE primaryKey >= someValue, which updates all of the sample rows.
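For reference, the "copy the row repeatedly" step can also be done in plain SQL once the first image is in place. A rough sketch, reusing the blob_files table from the earlier answer (table and column names are only illustrative):
-- Duplicate the manually uploaded row to create more sample data;
-- run it several times, or wrap it in a stored-procedure loop.
INSERT INTO blob_files (file)
SELECT file FROM blob_files WHERE id = 1;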

Related

How can I restore a backup in Opencart from command line? (no admin panel)

I'm stuck with an OpenCart problem. After restoring an old backup, I now cannot log in to the admin panel anymore. The problem started with the installation of a language extension, after which I deleted the default English language. The problem persisted, so I tried to restore a backup, but that backup has the admin panel language set to English, and now I only see Español in the database. I guess that is the reason for the login failure.
If I restore back to the previous state I will be able to log in again, but I need a way to restore a backup without the admin panel, using the database. Or maybe a way to add the English language back to the oc_language table in the db...
MariaDB [opencartdb]> select * from oc_language;
+-------------+----------+------+-------------+-----------+------------+--------+
| language_id | name     | code | locale      | extension | sort_order | status |
+-------------+----------+------+-------------+-----------+------------+--------+
|           4 | Español  | es   | es_ES.UTF-8 |           |          1 |      1 |
+-------------+----------+------+-------------+-----------+------------+--------+
1 row in set (0.002 sec)
Hmm... First check the language_id values in the oc_product_description table, and make sure which id English used (1 or some other number).
You need to insert the English language:
insert into `oc_language` set `name` = 'English', `code` = 'en-gb', `locale`= 'en.GB', `extension`='', `sort_order` = '2', `status`= '1';
After that command you must update language_id:
update `oc_language` set `language_id`= 1 where `code` = 'en-gb';
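A quick check afterwards (just a verification query, assuming the statements above ran without errors): both languages should now be listed, with English on language_id 1 so the admin login can use it again.
SELECT language_id, name, code, status FROM oc_language ORDER BY language_id;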

MySQL load_file() for .zip folders

I have a couple of files that I want to store together in a BLOB-type column of a MySQL table, so I just put them into a folder and then zipped it. I've never had any trouble storing images, text, and PDF files using the load_file() function, but when I try with the .zip folder I get back a NULL value.
What am I missing? Thanks!
I have noted the same phenomenon.
It does seem a bit strange indeed, and OS-related. Here is the result of my investigation (using MariaDB 10.4 on Windows 10 Pro 20H2):
In a given folder, C:\zipfolder for example, I created a text file zipdoc.txt with some text content, plus a zip file containing that text file.
This gives the following load_file output:
select load_file('C:\\zipfolder\\zipdoc.txt');
+----------------------------------------+
| load_file('C:\\zipfolder\\zipdoc.txt') |
+----------------------------------------+
| zipcontent text                        |
+----------------------------------------+
select load_file('C:\\zipfolder\\zipdoc.zip');
+----------------------------------------+
| load_file('C:\\zipfolder\\zipdoc.zip') |
+----------------------------------------+
| NULL                                   |
+----------------------------------------+
Changing the file extension from .zip to .zip_, for example, fixes the issue:
select load_file('C:\\zipfolder\\zipdoc.zip_');
+---------------------------------------------------------------------------------------------------------------------------------------+
| load_file('C:\\zipfolder\\zipdoc.zip_') |
+---------------------------------------------------------------------------------------------------------------------------------------+
| PK♥♦¶ FÄLR├SAÏ☼ ☼
zipdoc.txtzipcontent textPK☺☻¶ ¶ FÄLR├SAÏ☼ ☼
☺ zipdoc.txtPK♣♠ ☺ ☺ 8 7 |
+---------------------------------------------------------------------------------------------------------------------------------------+
So it looks like Windows 10 blocks access to .zip files in a more restrictive way than other files.
Giving EVERYONE access to the zip file allows the load_file function to read the original zip file. After granting the access with the following PowerShell script (adapted from here; note that "Jeder" is the German account name for the Everyone group):
$acl = Get-Acl C:\zipfolder\zipdoc.zip
$AccessRule = New-Object System.Security.AccessControl.FileSystemAccessRule("Jeder","Read","Allow")
$acl.SetAccessRule($AccessRule)
$acl | Set-Acl C:\zipfolder\zipdoc.zip
load_file is able to access the zip file:
select load_file('C:\\zipfolder\\zipdoc.zip');
+---------------------------------------------------------------------------------------------------------------------------------------+
| load_file('C:\\zipfolder\\zipdoc.zip') |
+---------------------------------------------------------------------------------------------------------------------------------------+
| PK♥♦¶ FÄLR├SAÏ☼ ☼
zipdoc.txtzipcontent textPK☺☻¶ ¶ FÄLR├SAÏ☼ ☼
☺ zipdoc.txtPK♣♠ ☺ ☺ 8 7 |
+---------------------------------------------------------------------------------------------------------------------------------------+
So, the solution is to grant EVERYONE access to the zip files, or simply to change the files' extension (it remains a task for admins to find a more restrictive access level that still works).
Complement: as mentioned by @Álvaro González, using an archiving program that sets the appropriate rights is also a solution.
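For completeness: besides file-system permissions, LOAD_FILE() also silently returns NULL when the file lies outside the directory named by secure_file_priv or is larger than max_allowed_packet, so it may be worth ruling those out first (a quick check, assuming a standard MySQL/MariaDB setup):
SHOW VARIABLES LIKE 'secure_file_priv';
SHOW VARIABLES LIKE 'max_allowed_packet';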
I cannot reproduce the problem. See console output:
mysql> CREATE TABLE test (val BLOB);
Query OK, 0 rows affected (0.29 sec)
mysql> INSERT INTO test SELECT LOAD_FILE('C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\test.sql');
Query OK, 1 row affected (0.05 sec)
Records: 1 Duplicates: 0 Warnings: 0
mysql> INSERT INTO test SELECT LOAD_FILE('C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\test.zip');
Query OK, 1 row affected (0.04 sec)
Records: 1 Duplicates: 0 Warnings: 0
mysql> SELECT LENGTH(val) FROM test;
+-------------+
| LENGTH(val) |
+-------------+
|        5603 |
|       17725 |
+-------------+
2 rows in set (0.00 sec)

Process TEXT BLOB fields in MySQL line by line

I have a MEDIUMTEXT blob in a table which contains paths, separated by newline characters. I'd like to add a "/" to the beginning of each line if it is not already there. Is there a way to write a query that does this with built-in procedures?
I suppose an alternative would be to write a Python script to get the field, convert it to a list, process each line, and update the record. There aren't that many records in the DB (about 8K+ rows), so I can take the processing delay, as long as it doesn't lock the entire DB or table.
Either way would be fine. If the second option is recommended, do I need to know about any specific locking semantics before getting into this? It would run against a live prod DB (of course, I'd take a DB snapshot first), but in-place updates would be best to avoid downtime.
Demo:
mysql> create table mytable (id int primary key, t text );
mysql> insert into mytable values (1, 'path1\npath2\npath3');
mysql> select * from mytable;
+----+-------------------+
| id | t                 |
+----+-------------------+
|  1 | path1
path2
path3 |
+----+-------------------+
1 row in set (0.00 sec)
mysql> update mytable set t = concat('/', replace(t, '\n', '\n/'));
mysql> select * from mytable;
+----+----------------------+
| id | t                    |
+----+----------------------+
|  1 | /path1
/path2
/path3 |
+----+----------------------+
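The one-liner above adds the slash to every line unconditionally; since the question asks to add it only where it is missing, a variant along these lines may help (an untested sketch, assuming plain '\n' line endings and at most one existing leading slash per line):
-- Temporarily prepend '\n', strip one existing '/' after each newline,
-- re-add exactly one '/', then drop the helper '\n' again.
update mytable
set t = substring(replace(replace(concat('\n', t), '\n/', '\n'), '\n', '\n/'), 2);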
However, I would strongly recommend storing each path on its own row, so you don't have to think about this at all. In SQL, each column should store one value per row, not a set of values.

UTF-8 charset txt file loaded into MySQL does not display correctly on macOS

I created a table in MySQL on the macOS command line using the 'utf-8' charset:
mysql> CREATE TABLE tb_stu (id VARCHAR(20), name VARCHAR(20), sex CHAR(1), birthday DATE) default charset=utf8;
Query OK, 0 rows affected (0.02 sec)
mysql> SHOW TABLES;
+----------------+
| Tables_in_test |
+----------------+
| pet            |
| tb_stu         |
+----------------+
2 rows in set (0.00 sec)
mysql> show create table tb_stu \G
*************************** 1. row ***************************
Table: tb_stu
Create Table: CREATE TABLE `tb_stu` (
`id` varchar(20) DEFAULT NULL,
`name` varchar(20) DEFAULT NULL,
`sex` char(1) DEFAULT NULL,
`birthday` date DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8
1 row in set (0.00 sec)
I want to add some values to the 'tb_stu' table. I have a txt file containing Chinese strings:
1 小明 男 2015-11-02
2 小红 女 2015-09-01
3 张三 男 2010-02-12
4 李四 女 2009-09-10
and the txt file is in the 'utf-8' charset too!
➜ ~ file /Users/lee/Desktop/JAVA/Java从入门到精通/第18章--使用JDBC操作数据库/Example_18_02/tb_stu.txt
/Users/lee/Desktop/JAVA/Java从入门到精通/第18章--使用JDBC操作数据库/Example_18_02/tb_stu.txt: UTF-8 Unicode text
so I execute the mysql command line:
mysql> LOAD DATA LOCAL INFILE '/Users/lee/Desktop/JAVA/Java从入门到精通/第18章--使用JDBC操作数据库/Example_18_02/tb_stu.txt' INTO TABLE tb_stu;
Query OK, 4 rows affected, 4 warnings (0.01 sec)
Records: 4 Deleted: 0 Skipped: 0 Warnings: 4
but I get garbled text (mojibake) in MySQL:
mysql> select * from tb_stu;
+------+----------------+------+------------+
| id | name | sex | birthday |
+------+----------------+------+------------+
| 1 | å°æ˜Ž | ç | 2015-11-02 |
| 2 | å°çº¢ | å | 2015-09-01 |
| 3 | 张三 | ç | 2010-02-12 |
| 4 | æŽå›› | å | 2009-09-10 |
+------+----------------+------+------------+
4 rows in set (0.00 sec)
This confuses me: the table in MySQL and the txt file are both in the 'utf-8' charset, so why do I get the garbled text? Thanks a lot!
You will need to investigate some more to understand your problem. One possibility is that your data was written into the DB correctly, but your command line displays it incorrectly due to a wrong encoding setting in your operating-system environment. Another possibility is that the data was garbled (corrupted) when it was written, meaning it is stored incorrectly in the DB.
So I would suggest taking your original file with the properly displayed Chinese characters and converting it to a unicode sequence, then taking the data in the DB and converting it to a unicode sequence as well, and comparing the two to see whether your DB data is merely displayed incorrectly or is itself corrupted. This will help you understand the problem and then find a way to fix it. Here is a tool that can help:
There is an open-source Java library, MgntUtils (written by me), that has a utility which converts Strings to unicode sequences and vice versa:
result = "Hello World";
result = StringUnicodeEncoderDecoder.encodeStringToUnicodeSequence(result);
System.out.println(result);
result = StringUnicodeEncoderDecoder.decodeUnicodeSequenceToString(result);
System.out.println(result);
The output of this code is:
\u0048\u0065\u006c\u006c\u006f\u0020\u0057\u006f\u0072\u006c\u0064
Hello World
The library can be found on Maven Central or on GitHub. It comes as a Maven artifact, with sources and Javadoc.
Here is the Javadoc for the class StringUnicodeEncoderDecoder.
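Separately from the round-trip check above, the specific symptoms here (小明 showing up as å°æ˜Ž) look like the classic case of UTF-8 bytes being reinterpreted as latin1 on the connection. A hedged thing to try is to force the character set explicitly on the session and on the load (sketch only; it truncates the garbled rows and reloads the file):
SET NAMES utf8;
TRUNCATE TABLE tb_stu;
LOAD DATA LOCAL INFILE '/Users/lee/Desktop/JAVA/Java从入门到精通/第18章--使用JDBC操作数据库/Example_18_02/tb_stu.txt'
INTO TABLE tb_stu CHARACTER SET utf8;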

MySQL: Appending records: find then append or append only

I'm writing a program, in C++, to access tables in MySQL via the MySQL C++ Connector.
I retrieve a record from the user (via the GUI or an XML file).
Here are my questions:
Should I search the table first for the given record, and append it only if it doesn't exist?
Or should I just append the record, and let MySQL accept it only if it is unique?
Here is my example table:
mysql> describe ing_titles;
+----------+----------+------+-----+---------+-------+
| Field    | Type     | Null | Key | Default | Extra |
+----------+----------+------+-----+---------+-------+
| ID_Title | int(11)  | NO   | PRI | NULL    |       |
| Title    | char(32) | NO   |     | NULL    |       |
+----------+----------+------+-----+---------+-------+
Weighing the two, I am looking for a solution that will let my program respond quickly to the user.
During development I have small tables (fewer than 5 records), but I expect them to grow when I formally release the application.
FYI: I am using Visual Studio 2008, C++, wxWidgets, and the MySQL C++ Connector on Windows XP and Vista.
Mark the field in question with a UNIQUE constraint and use INSERT ... ON DUPLICATE KEY UPDATE or INSERT IGNORE.
The former will update the record if it already exists; the latter will simply do nothing.
Searching the table first is not efficient, since it requires two round trips to the server: the first one to search, the second one to insert (or update).
The syntaxes above do the same thing in a single statement.
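A short sketch of both options against the ing_titles table above (the title values and the index name are only illustrative; the point is that Title must carry a UNIQUE constraint for either statement to detect duplicates):
ALTER TABLE ing_titles ADD UNIQUE KEY uq_title (Title);
-- Insert if new, otherwise update the existing row in place:
INSERT INTO ing_titles (ID_Title, Title) VALUES (1, 'Flour')
ON DUPLICATE KEY UPDATE Title = VALUES(Title);
-- Or insert if new and silently skip duplicates:
INSERT IGNORE INTO ing_titles (ID_Title, Title) VALUES (2, 'Sugar');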