MySQL LOAD DATA - avoid converting a string to zero in an integer column

I am trying to trigger an error when I load a string into an integer column with LOAD DATA.
The string value in the file (aaa) becomes "0" in the table.
My table:
CREATE TABLE test1 (
a INT(11) DEFAULT NULL,
b INT(11) DEFAULT NULL,
c VARCHAR(45) DEFAULT NULL,
d VARCHAR(45) DEFAULT NULL
);
My loader:
LOAD DATA LOCAL INFILE 'file.txt'
INTO TABLE `test1`
FIELDS TERMINATED BY ';'
IGNORE 1 LINES (a,b,c,d)
My data file:
a;b;c;d
aaa;11;aa;z
2;bbb;bb;x
3;33;cc;w
4;44;dd;y
And the result in the table:
a b c d
-------------
0 11 aa z
2 0 bb x
3 33 cc w
4 44 dd y
You can see that "aaa" become "0" and "bbb" too.
I would like the file records to be rejected.
I tried to set sql mode to STRICT_ALL_TABLES but no effect :
set sql_mode = STRICT_ALL_TABLES;
Thank you !
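One workaround sketch, assuming a staging table is acceptable (the test1_stage name is made up): load the file into an all-VARCHAR table first, then convert into test1 under strict mode, where an invalid integer value raises an error instead of being coerced to 0.
CREATE TABLE test1_stage (
a VARCHAR(45), b VARCHAR(45), c VARCHAR(45), d VARCHAR(45)
);
LOAD DATA LOCAL INFILE 'file.txt'
INTO TABLE test1_stage
FIELDS TERMINATED BY ';'
IGNORE 1 LINES (a,b,c,d);
-- With strict mode on, 'aaa' now aborts the statement with ERROR 1366
-- (Incorrect integer value) instead of being inserted as 0.
SET SESSION sql_mode = 'STRICT_ALL_TABLES';
INSERT INTO test1 (a,b,c,d)
SELECT a, b, c, d FROM test1_stage;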

Converting a MySQL dict column using JSON_TABLE

I have a MySQL table "prop" with a column "details" that contains a dict field.
fnum details
55 '{"a":"3"},{"b":"2"},{"d":"1"}'
I have tried to convert this to a table using this:
SELECT p.fnum, deets.*
FROM prop p
JOIN JSON_TABLE( p.details,
'$[*]'
COLUMNS (
idx FOR ORDINALITY,
a varChar(10) PATH '$.a',
b varchar(20) PATH '$.b'
d varchar(45) PATH '$.d',
)
) deets
I have tried various paths including $.*. I am expecting the following:
fnum a b d
55 3 2 1
Also, if I have 2 rows such as
fnum details
55 '{"a":"3"},{"b":"2"},{"d":"1"}'
56 '{"c":"car"}'
it should generate the following:
fnum a b d c
55 3 2 1 null
56 null null null car
Your details data is not valid JSON, because it doesn't have [ ] delimiting the array.
Demo:
mysql> create table prop (fnum int, details json);
mysql> insert into prop select 55, '{"a":"3"},{"b":"2"},{"d":"1"}';
ERROR 3140 (22032): Invalid JSON text: "The document root must
not be followed by other values." at position 9 in value for column
'prop.details'.
mysql> insert into prop select 55, '[{"a":"3"},{"b":"2"},{"d":"1"}]';
Query OK, 1 row affected (0.00 sec)
Records: 1 Duplicates: 0 Warnings: 0
It's worth using the JSON data type instead of storing JSON in a text column, because the JSON data type ensures that the document is valid JSON. The value must be valid JSON before you can use the JSON_TABLE() function or any other JSON function.
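For example, if prop.details is currently a text column, something like this converts it (a sketch; the ALTER will fail unless every existing value is already valid JSON):
-- find rows that would block the conversion
SELECT fnum FROM prop WHERE JSON_VALID(details) = 0;
-- then switch the column to the JSON type
ALTER TABLE prop MODIFY details JSON;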
Also your query has some syntax mistakes with respect to commas:
SELECT p.fnum, deets.*
FROM prop p
JOIN JSON_TABLE( p.details,
'$[*]'
COLUMNS (
idx FOR ORDINALITY,
a varChar(10) PATH '$.a',
b varchar(20) PATH '$.b' <-- missing comma
d varchar(45) PATH '$.d', <-- extra comma
)
) deets
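Putting those comma fixes together (and assuming the details values are now valid JSON arrays, as in the demo above), the corrected query would read:
SELECT p.fnum, deets.*
FROM prop p
JOIN JSON_TABLE( p.details,
'$[*]'
COLUMNS (
idx FOR ORDINALITY,
a VARCHAR(10) PATH '$.a',
b VARCHAR(20) PATH '$.b',
d VARCHAR(45) PATH '$.d'
)
) deets;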

MySQL "LOAD DATA INFILE" is importing unquoted "NULL" string as `NULL`

I'm using MySQL 5.7.35. If I use the LOAD DATA INFILE command on a CSV file with NULL as an unquoted string value in the CSV file, the value is imported as NULL in MySQL.
For example, if I import a CSV file with the following content:
record_number,a,b,c,d,e,f
1,1,2,3,4,5,6
2,NULL,null,Null,nUlL,,"NULL"
The imported table will have the following values:
+---------------+------+--------+--------+--------+--------+--------+
| record_number | a | b | c | d | e | f |
+---------------+------+--------+--------+--------+--------+--------+
| 1 | 1 | 2 | 3 | 4 | 5 | 6 |
| 2 | NULL | "null" | "Null" | "nUlL" | "" | "NULL" |
+---------------+------+--------+--------+--------+--------+--------+
Is there any way to force column a, record 2, to be imported as a string without modifying the CSV file?
Update
@Barmar pointed out that there's a paragraph in the MySQL documentation on this behavior here:
If FIELDS ENCLOSED BY is not empty, a field containing the literal
word NULL as its value is read as a NULL value. This differs from the
word NULL enclosed within FIELDS ENCLOSED BY characters, which is read
as the string 'NULL'.
This is documented here:
If FIELDS ENCLOSED BY is not empty, a field containing the literal word NULL as its value is read as a NULL value. This differs from the word NULL enclosed within FIELDS ENCLOSED BY characters, which is read as the string 'NULL'.
So you need to specify the quoting character with something like FIELDS ENCLOSED BY '"' and then write "NULL" in the CSV file.
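For example, with the t1 table and the file layout from above (a sketch of mine, not from the original answer), adding the quoting character looks like this. With FIELDS ENCLOSED BY '"', the quoted "NULL" in column f is read as the string 'NULL', while the unquoted NULL in column a is still read as SQL NULL:
LOAD DATA INFILE 'file.txt'
INTO TABLE t1
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
IGNORE 1 LINES
(record_number, a, b, c, d, e, f);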
You could check for a NULL value in your code and convert it to a string.
LOAD DATA INFILE 'file.txt'
INTO TABLE t1
(record_number, @a, @b, @c, @d, @e, @f)
SET a = IFNULL(@a, 'NULL'),
b = IFNULL(@b, 'NULL'),
c = IFNULL(@c, 'NULL'),
d = IFNULL(@d, 'NULL'),
e = IFNULL(@e, 'NULL'),
f = IFNULL(@f, 'NULL')
However, this can't distinguish between an intentional NULL in the file (written as \N) and the unquoted word NULL being read as NULL.

Trying to make a pingpong stat tracking database with a stored procedure

I'm using a stored procedure to (try to) write to 3 different tables in MySQL to track ping-pong data and show cool statistics.
So I'm a complete noob to MySQL (and StackOverflow) and haven't really done any sort of database language before, so all of this is pretty new to me. I'm trying to make a stored procedure that writes ping-pong stats that come from Ignition (I'm fairly certain that Ignition isn't the problem; it's telling me the writes failed, so I think it's a problem with my stored procedure).
I currently have one stored procedure that writes to the players table and can add wins, losses, and total games played when a button is pressed. My problem now is that I want to add statistics where I can track the score and who played against who so I could make graphs and stuff.
This stored procedure is supposed to search through the pingpong table to find whether the names passed have played against each other before, so I can find the corresponding MatchID. If the players haven't played before, it should create a new row with a new MatchID (this is the key, so it should be unique every time). Once I have the MatchID, I can then figure out how many games the players have played against each other before, what the score was, who beat who, and stuff like that.
Here's what I've written, and MySQL says it's fine, but obviously it's not working. I know it's not completely finished, but I really need some guidance, since this is my second time doing anything with MySQL or any database language for that matter, and I don't think this should be failing when I test any sort of write.
CREATE DEFINER=`root`@`localhost` PROCEDURE `Matchups`(
#these are passed from Ignition and should be working
IN L1Name VARCHAR(255), #Player 1 name on the left side
IN L2Name VARCHAR(255), #Player 2 name on the left side
IN R1Name VARCHAR(255), #Player 3 name on the right side
IN R2Name VARCHAR(255), #Player 4 name on the right side
IN TWOvTWO int, #If this is 1, then L1,L2,R1,R2 are playing instead of L1,R1
IN LeftScore int,
IN RightScore int)
BEGIN
DECLARE x int DEFAULT 0;
IF((
SELECT MatchupID
FROM pingpong
WHERE (PlayerL1 = L1Name AND PlayerR1 = R1Name) OR (PlayerL1 = R1Name AND PlayerR1 = L1Name)
)
IS NULL) THEN
INSERT INTO pingpong (PlayerL1, PlayerL2, PlayerR1, PlayerR2) VALUES (L1Name, L2Name, R1Name, R2Name);
INSERT INTO pingponggames (MatchupID, Lscore, Rscore) VALUES ((SELECT MatchupID
FROM pingpong
WHERE (PlayerL1 = L1Name AND PlayerR1 = R1Name) OR (PlayerL1 = R1Name AND PlayerR1 = L1Name)), LeftScore, RightScore);
END IF;
END
Here is what my tables currently look like:
pingpong
PlayerL1 | PlayerL2 | PlayerR1 | PlayerR2 | MatchupID
-----------------------------------------------------
L1 | NULL | R1 | NULL | 1
L1 | NULL | L2 | NULL | 3
L1 | NULL | R2 | NULL | 4
L1 | NULL | test2 | NULL | 5
pingponggames
GameID | MatchupID | LScore | RScore
------------------------------------------
1 | 1 | NULL | NULL
pingpongplayers
Name | TotalWins | TotalLosses | GamesPlayed
-----------------------------------------------------
L1 | 8 | 5 | NULL
L2 | 1 | 1 | NULL
R1 | 1 | 6 | 7
R2 | 1 | 1 | NULL
test2 | 1 | 0 | 1
test1 | 0 | 0 | 0
I've explained some features in the comments below; if you need more, I'll need more info.
CREATE DEFINER=`root`@`localhost` PROCEDURE `Matchups`(
#these are passed from Ignition and should be working
IN L1Name VARCHAR(255), #Player 1 name on the left side
IN L2Name VARCHAR(255), #Player 2 name on the left side
IN R1Name VARCHAR(255), #Player 3 name on the right side
IN R2Name VARCHAR(255), #Player 4 name on the right side
-- What will the input be other than 1? It's to indicate doubles or singles, right? So I'm taking 0 as singles and 1 as doubles
IN TWOvTWO INT, #If this is 1, then L1,L2,R1,R2 are playing instead of L1,R1
IN LeftScore INT,
IN RightScore INT)
BEGIN
DECLARE x INT DEFAULT 0; # I guess you are using this somewhere in the SP
DECLARE v_matchupid INT; # used INT -- if the data type is different, match the MatchupID column's data type
DECLARE inserted_matchupid INT; -- use a data type based on the MatchupID column of the pingpong table
IF(TWOvTWO=0) THEN -- for singles
# What is this query for? To check singles or doubles? From what you have written it searches only for singles, so I have changed the code accordingly
SELECT MatchupID INTO v_matchupid
FROM pingpong
WHERE L1Name IN (PlayerL1, PlayerR1) AND R1Name IN (PlayerL1, PlayerR1); # avoid comparing names (strings) directly; keep a master table of player names and reference its id in other tables
# this IF part checks whether the matchup is new and, if so, inserts into both tables
IF(v_matchupid IS NULL) THEN
INSERT INTO pingpong (PlayerL1, PlayerR1) VALUES (L1Name, R1Name);
SET inserted_matchupid=LAST_INSERT_ID();
INSERT INTO pingponggames (MatchupID, Lscore, Rscore) VALUES (inserted_matchupid, LeftScore, RightScore);
/*
Once I have the MatchID, I can then figure out how many games the players have played against each other before
A: this will not work for a new matchup, since the MatchupID is only created now
*/
# so, assuming a match is found, update the pingponggames table with the matched MatchupID; I leave the exact UPDATE up to you
ELSE
UPDATE pingponggames SET Lscore=LeftScore, Rscore=RightScore WHERE MatchupID=v_matchupid;-- you can write your own
END IF;
-- for doubles
ELSE # assuming TWOvTWO is either 0 or 1; if there are more values, use ELSEIF(TWOvTWO=1) for this doubles block
SELECT MatchupID INTO v_matchupid
FROM pingpong
# Note: if player names can repeat, matching gets difficult, so it's better to use a unique id as the reference
WHERE L1Name IN (PlayerL1, PlayerL2, PlayerR1, PlayerR2) AND
L2Name IN (PlayerL1, PlayerL2, PlayerR1, PlayerR2) AND
R1Name IN (PlayerL1, PlayerL2, PlayerR1, PlayerR2) AND
R2Name IN (PlayerL1, PlayerL2, PlayerR1, PlayerR2);
IF(v_matchupid IS NULL) THEN
INSERT INTO pingpong (PlayerL1, PlayerL2, PlayerR1, PlayerR2) VALUES (L1Name, L2Name, R1Name, R2Name);
SET inserted_matchupid=LAST_INSERT_ID();
INSERT INTO pingponggames (MatchupID, Lscore, Rscore) VALUES (inserted_matchupid, LeftScore, RightScore);
ELSE
UPDATE pingponggames SET Lscore=LeftScore, Rscore=RightScore WHERE MatchupID=v_matchupid;-- you can write your own
END IF;
END IF;
END
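For completeness, here is a hypothetical way to exercise the procedure from the MySQL client (the argument order follows the signature above; the names and scores are made up):
-- singles: L1 vs R1, left side won 21-15
CALL Matchups('L1', NULL, 'R1', NULL, 0, 21, 15);
-- doubles: L1/L2 vs R1/R2, right side won 18-21
CALL Matchups('L1', 'L2', 'R1', 'R2', 1, 18, 21);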

GORM UUID too long

Currently I am using GO-GORM for all of my database queries (mostly CRUD) and I am having some issues inserting a generated UUID into a MySQL database column.
The column is a BINARY(16) as suggested in multiple blogs, the UUID is generated using github.com/satori/go.uuid package for Golang.
I am using GORM's BeforeCreate hook to generate the UUID if one does not already exist on the user, the code that I am using is as follows:
func (u *User) BeforeCreate(scope *gorm.Scope) (err error) {
if u.UserID == uuid.Nil {
uuid, err := uuid.NewV4().MarshalBinary()
scope.SetColumn("user_id", uuid)
}
}
I have also used len to check the length of what MarshalBinary outputs, and it returns 16.
The error I get from GORM when trying to insert the UUID into MySQL is as follows:
(Error 1406: Data too long for column 'user_id' at row 1)
I have also used fmt.Println(uuid) to see the result, which is as follows (obviously it changes, as the UUID is generated on every insert):
[93 132 59 55 102 96 72 35 137 185 34 21 195 88 213 127]
My MySQL schema is as follows:
CREATE TABLE users
(
id INT(10) unsigned PRIMARY KEY NOT NULL AUTO_INCREMENT,
created_at TIMESTAMP,
updated_at TIMESTAMP,
deleted_at TIMESTAMP,
user_id BINARY(16) NOT NULL,
username VARCHAR(255) NOT NULL,
password VARCHAR(255),
firstname VARCHAR(255),
lastname VARCHAR(255),
email VARCHAR(255),
address_id VARCHAR(255)
);
CREATE INDEX idx_users_deleted_at ON users (deleted_at);
CREATE UNIQUE INDEX username ON users (username);
CREATE UNIQUE INDEX user_id ON users (user_id);
I have tried different methods and libraries to generate UUIDs and convert them to binary to insert with similar results.
I think the problem is in the definition of the User model. To save the UUID as 16-byte binary, you need to define the UserID column as []byte, not uuid.UUID.
type User struct {
//other fields ..
UserID []byte
//other fields ...
}
func (u *User) BeforeCreate(scope *gorm.Scope) (err error) {
	if u.UserID == nil {
		// generate the UUID and store its 16-byte binary form
		bin, err := uuid.NewV4().MarshalBinary()
		if err != nil {
			return err
		}
		return scope.SetColumn("user_id", bin)
	}
	return nil
}
If you define the field as uuid.UUID, gorm "misinterprets" the field as a string and then inserts that string into the database as binary. For example, the following UUID,
uuid: 16ac369b-e57f-471b-96f6-1068ead0bf98
//16-bytes equivalent
bytes: [22 172 54 155 229 127 71 27 150 246 16 104 234 208 191 152]
will be inserted into the database as the ASCII codes of the UUID characters, which are
0x31 0x36 0x61 0x63 0x33 0x36 0x39 0x62 0x2D 0x65 ...
('1' '6' 'a' 'c' '3' '6' '9' 'b' '-' 'e' ...)
which are 36 bytes in length, which is why you're getting Error 1406: ...
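You can see the 36-vs-16 byte difference directly in MySQL (an illustrative snippet of mine, not part of the original answer):
SELECT
LENGTH('16ac369b-e57f-471b-96f6-1068ead0bf98') AS as_text,   -- 36, too long for BINARY(16)
LENGTH(UNHEX(REPLACE('16ac369b-e57f-471b-96f6-1068ead0bf98', '-', ''))) AS as_binary;  -- 16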

for loop statement to create rows in database

I am trying to use for loop statement as follows:
for(int i=1; i <= 48; i++) { insertdiary("", ""); }
in my MyDB file:
package com.cookbook.data;
import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteException;
import android.util.Log;
public class MyDB {
private SQLiteDatabase db;
private final Context context;
private final MyDBhelper dbhelper;
// Initializes MyDBHelper instance
public MyDB(Context c){
context = c;
dbhelper = new MyDBhelper(context, Constants.DATABASE_NAME, null,
Constants.DATABASE_VERSION);
}
// Closes the database connection
public void close()
{
db.close();
}
// Initializes a SQLiteDatabase instance using MyDBhelper
public void open() throws SQLiteException
{
try {
db = dbhelper.getWritableDatabase();
} catch(SQLiteException ex) {
Log.v("Open database exception caught", ex.getMessage());
db = dbhelper.getReadableDatabase();
}
}
// Saves a diary entry to the database as name-value pairs in ContentValues instance
// then passes the data to the SQLitedatabase instance to do an insert
public long insertdiary(String title, String content)
{
try{
ContentValues newTaskValue = new ContentValues();
newTaskValue.put(Constants.TITLE_NAME, title);
newTaskValue.put(Constants.CONTENT_NAME, content);
newTaskValue.put(Constants.DATE_NAME, java.lang.System.currentTimeMillis());
return db.insert(Constants.TABLE_NAME, null, newTaskValue);
} catch(SQLiteException ex) {
Log.v("Insert into database exception caught",
ex.getMessage());
return -1;
}
}
// updates a diary entry (existing row)
public boolean updateDiaryEntry(String title, long rowId)
{
ContentValues newValue = new ContentValues();
newValue.put(Constants.TITLE_NAME, title);
return db.update(Constants.TABLE_NAME, newValue, Constants.KEY_ID + "=" + rowId, null)>0;
}
// Reads the diary entries from database, saves them in a Cursor class and returns it from the method
public Cursor getdiaries()
{
Cursor c = db.query(Constants.TABLE_NAME, null, null,
null, null, null, null);
return c;
}
}
My aim is to create 48 empty rows when the database or table is first created, so I can later update these rows instead of creating new entries. Unfortunately, my attempts to use this code have failed, either giving me errors or creating many more rows than 48.
Is there anyone who could help me use this code to create 48 rows when the database or table is first created, please?
I appreciate all help.
Paddy
Unless there is some strict rule requiring you to create 48 empty rows, pre-creating them is the wrong way to go about it. Create them as needed, when you need to plug data into them.
I did this in MySQL originally. I had trouble creating an SQLFiddle, so I created an SQLite version as well.
There is an SQLFiddle. Squeezing all the stuff that follows into 8K, the SQLFiddle limit, was 'interesting' ;-/
The SQLite version, which is exactly the same apart from the 'create table' statements, I will make available if required. It would be a download of the database file which, you understand, is the same across all machines. I can also provide the creation scripts if required.
Purpose:
The idea, I understand, is to display 'appointments' where the day is split into 48 30-minute periods.
The requirement is to record only the actual appointments.
I pictured it as a small number of departments recording appointments for events that will happen during the day; in my example data, people visiting.
Here is the query to show appointments:
SELECT *
FROM department_appointments_view dav
WHERE dav.the_date = '2014-04-11'
AND dav.department_id = 1
AND dav.time_slot_id BETWEEN 12 AND 20;
Here is the sample output:
appointment_id department_id department_code the_date time_slot_id start_time attendee reason duration
-------------- ------------- --------------- ------------------- ------------ ---------- ----------------- --------------------- ----------
0 1 dept_01 2014-04-11 00:00:00 12 05:30:00 30
0 1 dept_01 2014-04-11 00:00:00 13 06:00:00 30
1 1 dept_01 2014-04-11 00:00:00 14 06:30:00 Catherine Tramell to see you 30
0 1 dept_01 2014-04-11 00:00:00 15 07:00:00 30
2 1 dept_01 2014-04-11 00:00:00 16 07:30:00 Buddy Ackerman to see them 30
0 1 dept_01 2014-04-11 00:00:00 17 08:00:00 30
0 1 dept_01 2014-04-11 00:00:00 18 08:30:00 30
3 1 dept_01 2014-04-11 00:00:00 19 09:00:00 Ivan Drago to visit someone else 30
0 1 dept_01 2014-04-11 00:00:00 20 09:30:00 30
So, the main table, where appointments are entered, is:
CREATE TABLE `department_appointments` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`department_id` int(11) NOT NULL,
`the_date` date NOT NULL,
`time_slot_id` int(11) NOT NULL,
`attendee` varchar(128) COLLATE utf8_unicode_ci NOT NULL,
`reason` varchar(128) COLLATE utf8_unicode_ci NOT NULL,
`duration` int(11) NOT NULL,
PRIMARY KEY (`id`),
KEY `dept_fk` (`department_id`),
CONSTRAINT `dept_fk` FOREIGN KEY (`department_id`) REFERENCES `departments` (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=7 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci
This is the only table where appointment information is entered.
Sample data:
id department_id the_date time_slot_id attendee reason duration
------ ------------- ---------- ------------ ----------------------- --------------------- ----------
1 1 2014-04-11 14 Catherine Tramell to see you 30
2 1 2014-04-11 16 Buddy Ackerman to see them 30
3 1 2014-04-11 19 Ivan Drago to visit someone else 30
We need some supporting tables:
The departments table:
CREATE TABLE `departments` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`department_code` varchar(64) COLLATE utf8_unicode_ci NOT NULL,
`title` varchar(128) COLLATE utf8_unicode_ci NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=3 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci
Sample data:
id department_code title
------ --------------- -----------------------------
1 dept_01 Dept 01 - The Widget Makers
2 dept_02 Dept 02 - For Bar Workers
The calendar: This is just a table with dates in it. My test data was for April.
CREATE TABLE `the_calendar` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`the_date` datetime NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=31 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci
Sample data:
id the_date
------ ---------------------
1 2014-04-01 00:00:00
2 2014-04-02 00:00:00
3 2014-04-03 00:00:00
4 2014-04-04 00:00:00
The read_only_time_slots table. This has 48 rows in it with start times. This table is read only and never updated or copied or anything.
CREATE TABLE `read_only_time_slots` (
`time_slot_id` int(11) NOT NULL,
`start_time` time NOT NULL,
`duration` int(11) NOT NULL,
PRIMARY KEY (`time_slot_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci
Sample data:
time_slot_id start_time duration
------------ ---------- ----------
1 00:00:00 30
2 00:30:00 30
3 01:00:00 30
----------------------
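The answer doesn't show how the 48 slot rows were populated; one way to generate them (a sketch of mine, assuming MySQL 8.0+ for the recursive CTE) is:
INSERT INTO read_only_time_slots (time_slot_id, start_time, duration)
WITH RECURSIVE slots (n) AS (
    SELECT 1
    UNION ALL
    SELECT n + 1 FROM slots WHERE n < 48
)
SELECT n,
       SEC_TO_TIME((n - 1) * 30 * 60),  -- 00:00:00, 00:30:00, ..., 23:30:00
       30
FROM slots;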
You now need some queries to run this lot. Please be aware that we take advantage of the database engine to do cartesian products wherever it can; it will generate all the needed rows for us from just the above tables.
Now, to simplify the use of the information, I have used 'views'. Less confusion for me that way.
The views
The first is: time_slot_view
CREATE VIEW `time_slot_view` AS (
SELECT ts.time_slot_id AS time_slot_id,
ts.start_time AS start_time,
ts.duration AS duration
FROM read_only_time_slots ts
ORDER BY ts.time_slot_id ASC)
The next is: department_calendar_view
This returns empty timeslots for each department, for each day.
CREATE VIEW `department_calendar_view` AS (
SELECT
`d`.`id` AS `department_id`,
`d`.`department_code` AS `department_code`,
`c`.`the_date` AS `the_date`,
`tsv`.`time_slot_id` AS `time_slot_id`,
`tsv`.`start_time` AS `start_time`,
`tsv`.`duration` AS `duration`
FROM ((`the_calendar` `c`
JOIN `time_slot_view` `tsv`)
JOIN `departments` `d`)
ORDER BY `d`.`department_code`,`c`.`the_date`,`tsv`.`time_slot_id`)
Finally, there is the view that uses all of the above:
The department_appointments_view.
This could probably be done as an outer join; I just used two queries and a union.
CREATE VIEW `department_appointments_view` AS
SELECT da.id AS appointment_id,
dcv.`department_id` AS department_id,
dcv.`department_code` AS department_code,
da.`the_date` AS the_date,
da.`time_slot_id` AS time_slot_id,
dcv.start_time AS start_time,
da.`attendee` AS attendee,
da.`reason` AS reason,
da.`duration` AS duration
FROM
`department_appointments` AS da
INNER JOIN department_calendar_view AS dcv
ON da.department_id = dcv.department_id
AND da.the_date = dcv.the_date
AND da.time_slot_id = dcv.time_slot_id
UNION
SELECT 0,
dcv.department_id,
dcv.`department_code` ,
dcv.the_date,
dcv.time_slot_id,
dcv.start_time,
'' AS attendee,
'' AS reason,
dcv.`duration`
FROM department_calendar_view AS dcv
WHERE NOT EXISTS (SELECT 1
FROM `department_appointments` AS da
WHERE da.department_id = dcv.department_id
AND da.the_date = dcv.the_date
AND da.time_slot_id = dcv.time_slot_id)
ORDER BY department_code, the_date, time_slot_id;
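To tie it together, here is a hypothetical usage example (the attendee and reason values are made up): book a new appointment in slot 21 (10:00) for department 1, then re-run the display query over a slightly wider window.
INSERT INTO department_appointments
    (department_id, the_date, time_slot_id, attendee, reason, duration)
VALUES
    (1, '2014-04-11', 21, 'Jane Doe', 'to see you', 30);

SELECT *
FROM department_appointments_view dav
WHERE dav.the_date = '2014-04-11'
  AND dav.department_id = 1
  AND dav.time_slot_id BETWEEN 12 AND 22;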