Load NULL values INTO MySQL

FYI:
I'm working with a CSV file from Census - FactFinder
Using MySQL 5.7
OS is Windows 10 Pro
So, I created this table:
+----------+------------+------+-----+---------+-------+
| Field    | Type       | Null | Key | Default | Extra |
+----------+------------+------+-----+---------+-------+
| SERIALNO | bigint(13) | NO   | PRI | NULL    |       |
| DIVISION | int(9)     | YES  |     | NULL    |       |
| PUMA     | int(4)     | YES  |     | NULL    |       |
| REGION   | int(1)     | YES  |     | NULL    |       |
| ST       | int(1)     | YES  |     | NULL    |       |
| ADJHSG   | int(7)     | YES  |     | NULL    |       |
| ADJINC   | int(7)     | YES  |     | NULL    |       |
| FINCP    | int(6)     | YES  |     | NULL    |       |
| HINCP    | int(6)     | YES  |     | NULL    |       |
| R60      | int(1)     | YES  |     | NULL    |       |
| R65      | int(1)     | YES  |     | NULL    |       |
+----------+------------+------+-----+---------+-------+
And tried to load data using:
LOAD DATA INFILE "C:/ProgramData/MySQL/MySQL Server 5.7/Uploads/Housing_Illinois.csv"
INTO TABLE housing
CHARACTER SET latin1
COLUMNS TERMINATED BY ','
LINES TERMINATED BY '\n';
It didn't work; this message appeared:
ERROR 1366 (HY000): Incorrect integer value: '' for column 'FINCP' at
row 2
The row the error message is referring to is:
2012000000051,3,104,2,17,1045360,1056030,,8200,1,1
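One quick way to confirm which field is empty is to name the blank columns with a short Python check (a diagnostic sketch; the sample row is the one quoted above, and in practice you would read the real CSV file):

```python
import csv
import io

# Column names in the order they appear in the table definition.
columns = ["SERIALNO", "DIVISION", "PUMA", "REGION", "ST",
           "ADJHSG", "ADJINC", "FINCP", "HINCP", "R60", "R65"]

# Sample row copied from the error message context.
sample = "2012000000051,3,104,2,17,1045360,1056030,,8200,1,1"

row = next(csv.reader(io.StringIO(sample)))
empty_fields = [name for name, value in zip(columns, row) if value == ""]
print(empty_fields)  # -> ['FINCP']
```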
I believe FINCP, which holds the blank value (the ,, right before 8200), is the problem. So I followed the instructions in this thread: MySQL load NULL values from CSV data
And updated my code to:
LOAD DATA INFILE "C:/ProgramData/MySQL/MySQL Server 5.7/Uploads/Housing_Illinois.csv"
INTO TABLE housing
CHARACTER SET latin1
COLUMNS TERMINATED BY ','
LINES TERMINATED BY '\n'
(@SERIALNO, @DIVISION, @PUMA, @REGION, @ST, @ADJHSG, @ADJINC, @FINCP, @HINCP, @R60, @R65)
SET
SERIALNO = nullif(@SERIALNO,''),
DIVISION = nullif(@DIVISION,''),
PUMA = nullif(@PUMA,''),
REGION = nullif(@REGION,''),
ST = nullif(@ST,''),
ADJHSG = nullif(@ADJHSG,''),
ADJINC = nullif(@ADJINC,''),
FINCP = nullif(@FINCP,''),
HINCP = nullif(@HINCP,''),
R60 = nullif(@R60,''),
R65 = nullif(@R65,'');
The first error is now gone, but this garbled message appears:
' for column 'R65' at row 12t integer value: '
The row this message refers to is:
2012000000318,3,1602,2,17,1045360,1056030,,,,
The error message is mangled (it looks like a stray carriage return is being echoed back over the text), so I don't know what exactly the problem is. I can only assume it is that there are four consecutive blank values.
One more data point: if I edit the CSV and change every blank to 0, the load goes smoothly, but I'm not a fan of editing raw data, so I would like to know other options.
Bottom line, I have two questions:
Shouldn't the data have loaded with the first statement, with MySQL treating ,, as NULL and 0 as a plain 0?
What's the problem now that I'm using SERIALNO = nullif(@SERIALNO,'')?
I want to be able to differentiate between 0 and NULL/blank values.
Thank you.

MySQL's LOAD DATA tool interprets \N as being a NULL value. So, if your troubled row looked like this:
2012000000318,3,1602,2,17,1045360,1056030,\N,\N,\N,\N
then you might not have this problem. If you have access to a regex replacement tool, you may try searching for the following pattern:
(?<=^)(?=,)|(?<=,)(?=,)|(?<=,)(?=$)
Then, replace with \N. This fills every empty slot with \N, which MySQL interprets as NULL. Note that if you were to write a table out from MySQL, NULLs would be written as \N. The underlying issue is that your data source and MySQL don't agree on a representation for NULL.
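If a regex tool isn't handy, the same substitution is a short preprocessing step in Python (a sketch; the row is the troubled one from the question):

```python
def blank_to_null_marker(line):
    """Replace empty CSV fields with \\N, which LOAD DATA reads as NULL.

    Caveat: a plain split breaks on quoted fields that contain commas;
    this Census extract has none, so the simple approach works here.
    """
    return ",".join(field if field else r"\N" for field in line.split(","))

row = "2012000000318,3,1602,2,17,1045360,1056030,,,,"
print(blank_to_null_marker(row))
# -> 2012000000318,3,1602,2,17,1045360,1056030,\N,\N,\N,\N
```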

Related

Error while writing a Python dataframe to a real-time MySQL server

I am working with a Python dataframe that I need to write to a real-time server database (MySQL). The code works fine on my local machine but fails on the server.
Below is the code I tried:
import pandas as pd
from sqlalchemy import create_engine

def db_write(db_config, contact_df):
    IP_ADDR = db_config["ip_addr"]
    PORT_NUMBER = db_config["port_num"]
    USER_NAME = db_config["user_name"]
    PASSWORD = db_config["password"]
    engine = create_engine("mysql+pymysql://" + USER_NAME + ":" + PASSWORD + "@" + IP_ADDR + "/db_replica")
    con = engine.connect()
    contact_df.to_sql(con=con, name='users', if_exists='append', index=False)
    con.close()

# call the db_write() function
db_write(json_data['mysql_db'], processed_db_df)
I want to write the processed_db_df dataframe into the database (MySQL), but while running the code on the server I get the errors below.
sqlalchemy.exc.DataError: (pymysql.err.DataError) "Incorrect string
value: '\xE0\xB8\xAAibh...' for column 'first_name'
sqlalchemy.exc.IntegrityError: (pymysql.err.IntegrityError) "Column
'last_name' cannot be null")
I tried setting the charset at the end of the connection string, like below:
engine = create_engine("mysql+pymysql://" + USER_NAME + ":" + PASSWORD + "@" + IP_ADDR + "/db_replica?charset=utf8")
But the issue is still not resolved.
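The two errors have different causes: the "Incorrect string value" comes from non-latin1 characters (the '\xE0\xB8\xAA' bytes are a Thai letter, which requires a utf8mb4 column and connection), while the IntegrityError is a plain NOT NULL violation on last_name. A small sketch for spotting offending rows before the insert (the records here are hypothetical stand-ins for rows of contact_df):

```python
# Hypothetical records standing in for rows of contact_df.
rows = [
    {"first_name": "\u0e2aibh", "last_name": None},  # Thai letter + missing last_name
    {"first_name": "Alice", "last_name": "Smith"},
]

# Rows whose first_name cannot be stored in a latin1 column:
bad_encoding = [r for r in rows
                if any(ord(c) > 255 for c in (r["first_name"] or ""))]

# Rows that would violate the NOT NULL constraint on last_name:
null_last_name = [r for r in rows if not r["last_name"]]

print(len(bad_encoding), len(null_last_name))  # -> 1 1
```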
I checked the database table schema and it looks like this:
+------------------+------------+------+-----+---------+-------+
| Field            | Type       | Null | Key | Default | Extra |
+------------------+------------+------+-----+---------+-------+
| Unnamed: 0       | bigint(20) | YES  |     | NULL    |       |
| ext_lead_id      | text       | YES  |     | NULL    |       |
| activity         | text       | YES  |     | NULL    |       |
| update_date_time | text       | YES  |     | NULL    |       |
| first_name       | text       | NO   |     | NULL    |       |
| last_name        | text       | YES  |     | NULL    |       |
+------------------+------------+------+-----+---------+-------+
Instead of text I want varchar as the datatype; alternatively, please help me define a custom schema in SQLAlchemy.
Thanks in advance

MariaDB SELECT * returns empty set

When I run SELECT * FROM urlcheck, it returns 'Empty set (0.00 sec)'.
According to SHOW TABLE STATUS LIKE 'urlcheck', the table has 3 rows.
Table structure is:
+-------------+---------------+------+-----+---------+----------------+
| Field       | Type          | Null | Key | Default | Extra          |
+-------------+---------------+------+-----+---------+----------------+
| id          | int(11)       | NO   | PRI | NULL    | auto_increment |
| coursegroup | varchar(20)   | YES  |     | NULL    |                |
| url         | varchar(2588) | YES  |     | NULL    |                |
+-------------+---------------+------+-----+---------+----------------+
I start by selecting the database with USE db.
Any ideas why this happened? I know this is similar to Mysql select always returns empty set, but that was apparently a corrupted database. I have truncated this table and added new rows, and I still get the same problem. The code that adds records, FWIW, is:
cur.execute('insert into urlcheck (coursegroup, url) values("'+coursegroup+'","'+url+'");')
db.commit
cur.close
The problem was a syntax error in my code. It should have been:
db.commit()
cur.close()
The lack of parentheses meant the commit never ran, so nothing was written. I leave this here, even though it redounds to my own humiliation, in the hope that it helps someone else.
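The failure mode above can be shown in a few lines: in Python, `db.commit` without parentheses merely looks up the method object and discards it, which is why no error was raised (a minimal illustration with a stand-in connection class, not the real driver):

```python
class FakeConnection:
    """Stand-in for a DB-API connection, just enough to show the bug."""
    def __init__(self):
        self.committed = False

    def commit(self):
        self.committed = True

db = FakeConnection()

db.commit            # attribute lookup only -- a silent no-op
print(db.committed)  # -> False

db.commit()          # actually invokes commit
print(db.committed)  # -> True
```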

Does CodeName One support MYSQL BLOBs using RESTful Database Interface?

Using the Codename One web database extension, I can get basic SQL fields to work for strings and numbers, but not for binary large objects (BLOBs). I'm following the instructions here: https://www.codenameone.com/blog/connecting-to-a-mysql-database-part-2.html
Are BLOBs supported by Codename One? If so, how do you use them? I can't find any examples that use BLOB types.
I've tried using long strings, and with MariaDB I can get up to a 512K string size, but I need to store images, which can be larger.
MariaDB [(none)]> use tsg; desc photos;
Database changed
+------------+------------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+------------+------------------+------+-----+---------+----------------+
| id | int(10) unsigned | NO | PRI | NULL | auto_increment |
| player_id | int(11) | NO | | NULL | |
| tree_id | int(11) | NO | | NULL | |
| photo_type | longtext | NO | | NULL | |
| image | blob | YES | | NULL | |
+------------+------------------+------+-----+---------+----------------+
5 rows in set (0.001 sec)
When I add the record without the blob it works:
m.put("playerId", "1");
m.put("treeId", "2");
m.put("photoType", "front");
m.put("image", null);
client.create(m, res -> {
    System.out.println(m);
    System.out.println("create result = " + res);
});
outputs:
{treeId=2, image=null, photoType=front, playerId=1}
create result = true
But when I try to add the blob, it does not:
m.put("playerId", "1");
m.put("treeId", "2");
m.put("photoType", "front");
byte bytes[] = new byte[100];
m.put("image", bytes);
client.create(m, res -> {
    System.out.println(m);
    System.out.println("create result = " + res);
});
outputs:
{treeId=2, image=[B@5968c8cb, photoType=front, playerId=1}
create result = false
Help! Am I using BLOBs the wrong way, or does CN1 not support BLOBs?
The only error indication is that create returns false.
It doesn't have built-in support for that at this time. You can use MultipartRequest to submit binary data to the server.
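Another workaround, if the binary data must go through the existing text-only interface, is to base64-encode the bytes into a string column and decode on read. This is not a CN1 API, just a general technique; note that base64 inflates data by about 33%, so a 512K string limit holds roughly 384K of binary. A sketch of the round trip:

```python
import base64

image_bytes = bytes(100)  # stand-in for real image data

# Encode to ASCII text, safe to store in a TEXT/longtext column.
encoded = base64.b64encode(image_bytes).decode("ascii")
print(len(encoded))  # -> 136 (100 bytes become 136 base64 characters)

# Decode on the way back out; the round trip is lossless.
decoded = base64.b64decode(encoded)
print(decoded == image_bytes)  # -> True
```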

How should I format my .txt enum values?

The table test in my database has a single ENUM column. How should I format my .txt file in order to load data from it into that column?
This is how I'm doing it right now:
text.txt:
0
1
2
2
1
MySQL Script:
LOAD DATA LOCAL INFILE 'Data/test.txt' INTO TABLE test;
DESCRIBE test;
+-------+-------------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+-------+-------------------+------+-----+---------+-------+
| enum | enum('0','1','2') | YES | | NULL | |
+-------+-------------------+------+-----+---------+-------+
The output:
+------+
| enum |
+------+
| |
| |
| |
| |
| 1 |
+------+
The first (possible) bug is the line terminator, which is '\n' by default on Unix systems. Check your file; there is a high probability it uses '\r\n'. If so, add a LINES TERMINATED BY clause:
LINES TERMINATED BY '\r\n'
The second bug is the file name: you wrote 'text.txt', but in the LOAD DATA command you used 'test.txt'.
LOAD DATA INFILE Syntax
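The line-terminator check above is easy to automate: read the file in binary mode and look for '\r\n'. A small sketch (the throwaway temp file stands in for the real test.txt):

```python
import os
import tempfile

def detect_line_ending(path):
    """Report whether a file's first 4 KB uses \\r\\n or \\n line endings."""
    with open(path, "rb") as f:
        chunk = f.read(4096)
    return r"\r\n" if b"\r\n" in chunk else r"\n"

# Demo on a throwaway file with Windows-style endings.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"0\r\n1\r\n2\r\n")
    path = f.name

ending = detect_line_ending(path)
os.remove(path)
print(ending)  # -> \r\n
```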

Viewing a MySQL BLOB with PuTTY

I am saving a serialized object to a MySQL database BLOB column.
After inserting some test objects and then trying to view the table, I am presented with lots of garbage and "PuTTYPuTTY" several times.
I believe this has something to do with character encoding and the BLOB containing strange characters.
I just want to check whether this is going to cause problems with my database, or if it is only a problem with PuTTY displaying the data.
Description of the QuizTable:
+-------------+-------------+-------------------+------+-----+---------+----------------+---------------------------------+-------------------------------------------------------------------------------------------------------------------+
| Field | Type | Collation | Null | Key | Default | Extra | Privileges | Comment |
+-------------+-------------+-------------------+------+-----+---------+----------------+---------------------------------+-------------------------------------------------------------------------------------------------------------------+
| classId | varchar(20) | latin1_swedish_ci | NO | | NULL | | select,insert,update,references | FK related to the ClassTable. This way each Class in the ClassTable is associated with its quiz in the QuizTable. |
| quizId | int(11) | NULL | NO | PRI | NULL | auto_increment | select,insert,update,references | This is the quiz number associated with the quiz. |
| quizObject | blob | NULL | NO | | NULL | | select,insert,update,references | This is the actual quiz object. |
| quizEnabled | tinyint(1) | NULL | NO | | NULL | | select,insert,update,references | |
+-------------+-------------+-------------------+------+-----+---------+----------------+---------------------------------+-------------------------------------------------------------------------------------------------------------------+
What I see when I try to view the table contents:
select * from QuizTable;
questionTextq ~ xp sq ~ w
t q1a1t q1a2xt 1t q1sq ~ sq ~ w
t q2a1t q2a2t q2a3xt 2t q2xt test3 | 1 |
+-------------+--------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-------------+
3 rows in set (0.00 sec)
I believe you can use the HEX() function on BLOBs as well as strings. You can run a query like this:
SELECT HEX(quizObject) FROM QuizTable WHERE ...;
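For comparison, the same hex view is easy to produce client-side. Java serialization streams start with the magic number 0xACED followed by version 0x0005, so the beginning of HEX(quizObject) for a serialized Java object should look like this (a sketch using a four-byte stand-in for the real BLOB):

```python
# First bytes of any Java serialization stream: magic 0xACED, version 0x0005.
blob = bytes([0xAC, 0xED, 0x00, 0x05])

# MySQL's HEX() produces uppercase hex; bytes.hex().upper() matches it.
print(blob.hex().upper())  # -> ACED0005
```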
PuTTY is reacting to what it thinks are terminal control sequences in your output stream. These sequences allow the remote host to change something about the local terminal without redrawing the entire screen, such as setting the title, positioning the cursor, or clearing the screen.
It just so happens that when you try to 'display' binary data like this, a lot of it ends up sending those sequences.
You'll get the same reaction from catting binary files.
A BLOB column completely ignores any character-encoding settings you have; it's really intended for storing binary objects like images or zip files.
If this field will only contain text, I'd suggest using a TEXT column instead.