Storing Percentages in MySQL

I'm trying to store a percentage value in a MySQL database, but whenever I try to set the value of the percentage column to 100%, I get an "Out of range value" error message.
I am currently using a DECIMAL(5,2) type and I need to be able to store values from 0% up to 100% (to 2 decimal places when the value isn't an integer); the values are calculated in a PHP script.
All values are fine apart from 100%, which triggers the error.
Am I misunderstanding something, or is there something else I am missing?
EDIT: The table was created using the following SQL:
CREATE TABLE overviewtemplate
(
id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
name VARCHAR(32),
numberOfTests INT NOT NULL DEFAULT 0,
description VARCHAR(255) NOT NULL DEFAULT "Please add a Description",
percentageComplete DECIMAL(5,2),
numberPassed INT NOT NULL DEFAULT 0,
numberFailed INT NOT NULL DEFAULT 0
) ENGINE=MYISAM;
EDIT 2: This is the code that builds the SQL query:
$numberOfPasses = 5;
$numberOfFails = 5;
$percentageComplete = 100.00;
$sqlquery = "UPDATE `overviewtemplate`
SET numberPassed = {$numberOfPasses},
numberFailed = {$numberOfFails},
percentageComplete = {$percentageComplete}
WHERE description = '{$description}'";
EDIT 3: FIXED - I had a syntax error in my table name, which meant the query was trying to update the wrong table.

With your declaration you should be able to store even 999.99 without trouble. Check whether you have set any rule that prevents it from being bigger than 100; if so, change it to allow values up to and including 100.00.
It could be in a trigger.
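For reference, a quick sketch (table and column names invented here) that shows the declared type itself is not the limit:
-- DECIMAL(5,2) stores values from -999.99 to 999.99, so 100.00 fits easily.
CREATE TABLE decimal_check (pct DECIMAL(5,2));

INSERT INTO decimal_check (pct) VALUES (0.00), (99.99), (100.00);  -- all accepted
INSERT INTO decimal_check (pct) VALUES (1000.00);                  -- "Out of range value" in strict mode

SELECT pct FROM decimal_check;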

Related

Incorrect datetime value output (without fractional part) using MATLAB from a MySQL table

I have a datetime(6) column in a MySQL database with fractional seconds. For example, one entry in my database has the time value 2022-12-22 12:29:04.602000.
CREATE TABLE `test_data` (
`sampletime_utc` datetime(6) DEFAULT NULL,
`project_uuid` int(11) DEFAULT NULL,
`item_id` int(11) DEFAULT NULL
);
When we read this time in MATLAB, the default type for the time variable is char, which loses the fractional-seconds part. I found the following solution to the problem:
https://www.mathworks.com/matlabcentral/answers/602656-sql-datetime-query-is-there-a-faster-way?s_tid=prof_contriblnk
The source code in the above answer is given below:
datasourceName = "mysqlJdbc"; % The name of JDBC datasource for MySQL
username = "USERNAME";
password = "PASSWORD";
conn = database(datasourceName, username, password);
tablename = "YOUR_TABLENAME"; % the table to import
opts = databaseImportOptions(conn, tablename);
columnNames = {'col1', 'col2'}; % the columns whose import options you want to change
opts = setoptions(opts, columnNames, 'Type', 'datetime'); % Change from char to datetime
sqlquery = "SELECT * FROM YOUR_TABLENAME";
data = fetch(conn, sqlquery, opts, 'MaxRows', 10);
The above solution works only for a simple SELECT statement. If I use a more complicated statement, e.g. one with WHERE and AND clauses, the following error is thrown:
Error using database.jdbc.connection/fetch (line 197)
Unable to use database import options with
'SELECT data_value, sampletime_utc FROM db1.tb1 dd WHERE (dd.project_uuid = 'b7d0-cf70b35e32cb' AND dd.item_id = 131 AND dd.sampletime_utc >= '2022-04-22 20:45:52.000' and dd.sampletime_utc <= '2022-04-22 21:45:52.000' ) ORDER BY dd.sampletime_utc ASC'.
Can anyone please advise how to get the time values from a MySQL table with fractional seconds, or as a datetime variable, in MATLAB?
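One workaround worth trying (a sketch only; the view name filtered_samples is made up here) is to move the complicated filtering into a MySQL view, so the MATLAB side can keep using a plain SELECT * with databaseImportOptions:
-- Hypothetical view encapsulating the WHERE/ORDER BY logic from the failing query.
CREATE OR REPLACE VIEW db1.filtered_samples AS
SELECT dd.data_value, dd.sampletime_utc
FROM db1.tb1 dd
WHERE dd.project_uuid = 'b7d0-cf70b35e32cb'
  AND dd.item_id = 131
  AND dd.sampletime_utc >= '2022-04-22 20:45:52.000'
  AND dd.sampletime_utc <= '2022-04-22 21:45:52.000'
ORDER BY dd.sampletime_utc ASC;
On the MATLAB side, SELECT * FROM db1.filtered_samples should then be simple enough for the import options to accept; the downside is that the filter values are baked into the view and would have to be changed with another CREATE OR REPLACE VIEW.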

How do I preserve utf-8 JSON values and write them correctly to a utf-8 txt file in Python 3.8.2?

I recently wrote a Python script to extract some data from a JSON file and use it to generate SQL INSERT values for the following statement:
INSERT INTO `card`(`artist`,`class_pid`,`collectible`,`cost`, `dbfid`, `api_db_id`, `name`, `rarity`, `cardset_pid`, `cardtype`, `attack`, `health`, `race`, `durability`, `armor`,`multiclassgroup`, `text`) VALUES ("generated entry goes here")
The names of some of the attributes are different in my SQL table, but the same values are used (for example, cardClass in the JSON file/Python script is referred to as class_pid in the SQL table). The values generated from the script are valid SQL and can be inserted into the database successfully, but I noticed that in the resulting export.txt file some of the values changed from what they originally were. For example, the following JSON entries from a utf-8 encoded JSON file:
[{"artist":"Arthur Bozonnet","attack":3,"cardClass":8,"collectible":1,"cost":2,"dbfId":2545,"flavor":"And he can't get up.","health":2,"id":"AT_003","mechanics":["HEROPOWER_DAMAGE"],"name":"Fallen Hero","rarity":"RARE","set":1,"text":"Your Hero Power deals 1 extra damage.","type":"MINION"},{"artist":"Ivan Fomin","attack":2,"cardClass":11,"collectible":1,"cost":2,"dbfId":54256,"flavor":"Were you expectorating another bad pun?","health":4,"id":"ULD_182","mechanics":["TRIGGER_VISUAL"],"name":"Spitting Camel","race":"BEAST","rarity":"COMMON","set":22,"text":"[x]At the end of your turn,\n  deal 1 damage to another  \nrandom friendly minion.","type":"MINION"}]
produce this output:
('Arthur Bozonnet',8,1,2,'2545','AT_003','Fallen Hero','RARE',1,'MINION',3,2,'NULL',0,0,'NULL','Your Hero Power deals 1\xa0extra damage.'),('Ivan Fomin',11,1,2,'54256','ULD_182','Spitting Camel','COMMON',22,'MINION',2,4,'BEAST',0,0,'NULL','[x]At the end of your turn,\n\xa0\xa0deal 1 damage to another\xa0\xa0\nrandom friendly minion.')
As you can see, some of the values from the JSON entries have been altered, as if the text encoding was changed somewhere, even though in my script I made sure that the JSON file was opened with utf-8 encoding and the resulting text file was also opened and written in utf-8 to match. My aim is to preserve the values exactly as they appear in the JSON file and transfer them to the generated SQL value entries unchanged. As an example, in the generated SQL I want the "text" value of the second entry to be:
"[x]At the end of your turn,\n deal 1 damage to another \nrandom friendly minion."
instead of:
"[x]At the end of your turn,\n\xa0\xa0deal 1 damage to another\xa0\xa0\nrandom friendly minion."
I tried using functions such as unicodedata.normalize() but unfortunately it did not seem to change the output in any way.
This is the script that I wrote to generate the SQL values:
import json
import io
chosen_keys = ['artist','cardClass','collectible','cost',
'dbfId','id','name','rarity','set','type','attack','health',
'race','durability','armor',
'multiClassGroup','text']
defaults = ['NULL','0','0','0',
'NULL','NULL','NULL','NULL','0','NULL','0','0',
'NULL','0','0',
'NULL','NULL']
def saveChangesString(dataList, filename):
    with io.open(filename, 'w', encoding='utf-8') as f:
        f.write(dataList)
        f.close()

def generateSQL(json_dict):
    count = 0
    endCount = 1
    records = ""
    finalState = ""
    print('\n'+str(len(json_dict))+' records will be processed\n')
    for i in json_dict:
        entry = "("
        jcount = 0
        for j in chosen_keys:
            if j in i.keys():
                if str(i.get(j)).isdigit() and j != 'dbfId':
                    entry = entry + str(i.get(j))
                else:
                    entry = entry + repr(str(i.get(j)))
            else:
                if str(defaults[jcount]).isdigit() and j != 'dbfId':
                    entry = entry + str(defaults[jcount])
                else:
                    entry = entry + repr(str(defaults[jcount]))
            if jcount != len(chosen_keys)-1:
                entry = entry+","
            jcount = jcount + 1
        entry = entry + ")"
        if count != len(json_dict)-1:
            entry = entry+","
        count = count + 1
        if endCount % 100 == 0 and endCount >= 100 and endCount < len(json_dict):
            print('processed records '+str(endCount - 99)+' - '+str(endCount))
            if endCount + 100 > len(json_dict):
                finalState = 'processed records '+str(endCount+1)+' - '+str(len(json_dict))
        if endCount == len(json_dict):
            print(finalState)
        records = records + entry
        endCount = endCount + 1
    saveChangesString(records,'export.txt')
    print('done')

with io.open('cards.collectible.sample.example.json', 'r', encoding='utf-8') as f:
    json_to_dict = json.load(f)
    f.close()

generateSQL(json_to_dict)
Any help would be greatly appreciated as the JSON file I am actually using contains over 2000 entries so I would prefer to avoid having to edit things manually. Thank you.
Also the SQL table structure code is:
-- phpMyAdmin SQL Dump
CREATE TABLE `card` (
`pid` int(10) NOT NULL,
`api_db_id` varchar(50) NOT NULL,
`dbfid` varchar(50) NOT NULL,
`name` varchar(50) NOT NULL,
`cardset_pid` int(10) NOT NULL,
`cardtype` varchar(50) NOT NULL,
`rarity` varchar(20) NOT NULL,
`cost` int(3) NOT NULL,
`attack` int(10) NOT NULL,
`health` int(10) NOT NULL,
`artist` varchar(50) NOT NULL,
`collectible` tinyint(1) NOT NULL,
`class_pid` int(10) NOT NULL,
`race` varchar(50) NOT NULL,
`durability` int(10) NOT NULL,
`armor` int(10) NOT NULL,
`multiclassgroup` varchar(50) NOT NULL,
`text` text NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
ALTER TABLE `card`
ADD PRIMARY KEY (`pid`);
ALTER TABLE `card`
MODIFY `pid` int(10) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=1;
COMMIT;
\xa0 is a variant of a space -- specifically a non-breaking space (U+00A0). Is it coming from Word?
But, more relevant, it is not utf8; it is latin1 or other non-utf8 encoding. You need to go back to where it came from and change that to utf8.
Or, if your next step is just to put it into a MySQL table, then you need to tell the truth about the client -- namely that it is encoded in latin1 (not utf8). Once you have done that, MySQL will take care of the conversion for you during the INSERT.
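A minimal SQL sketch of both suggestions, assuming the data is headed for the card table shown above; the UPDATE only applies if non-breaking spaces have already made it into the table:
-- Option 1: declare the client's real encoding before running the generated INSERTs;
-- MySQL then converts the latin1 bytes to utf8mb4 on the way in.
SET NAMES latin1;

-- Option 2: normalise rows that already contain non-breaking spaces (U+00A0) into ordinary spaces.
UPDATE `card`
SET `text` = REPLACE(`text`, CONVERT(UNHEX('C2A0') USING utf8mb4), ' ');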

MySQL doesn't accept BOOLEAN as its column datatype

I have a Java program which stores data in a MySQL database in two states. For this purpose I wanted to use BOOLEAN, but whenever I enter BOOLEAN it gets changed into TINYINT.
Is there any other way to store data in two states?
MySQL uses TINYINT(1) to mimic the behaviour of Boolean type, so make sure you use TINYINT(1) as the data type of your column, not TINYINT.
Alternatively, you can use BOOL or BOOLEAN which are both synonyms for TINYINT(1).
Similarly, the values TRUE and FALSE are merely aliases for 1 and 0, respectively in MySQL.
MySQL doesn't have a native boolean type.
https://dev.mysql.com/doc/refman/5.7/en/numeric-type-overview.html says:
BOOL, BOOLEAN
These types are synonyms for TINYINT(1). A value of zero is considered false. Nonzero values are considered true:
This means the "boolean" type is still an 8-bit signed integer, and the (1) syntax is not a size limit. It doesn't prevent the column from storing integer values from -128 to 127. It's up to you to refrain from storing those values.
MySQL also supports a BIT data type: https://dev.mysql.com/doc/refman/5.7/en/bit-type.html
The MySQL JDBC driver translates BIT(1) into java.lang.Boolean. See https://dev.mysql.com/doc/connector-j/5.1/en/connector-j-reference-type-conversions.html for JDBC data type mappings.
But the BIT data type has had some bugs in its history. I avoid using it.
BIT probably won't save any space anyway, if that's what you are hoping for. If you had a bunch of BIT columns defined consecutively in your table, they would be stored compactly, up to 8 columns per byte. But the minimum storage is still 1 byte, so if you had one BIT column, it would still take a whole byte.
Re your questions:
It doesn't take much code to test this out.
I created a test table and put one row of values in it:
CREATE TABLE `foo` (
`b` bool DEFAULT '1', /* this is a synonym for TINYINT(1) */
`ti` tinyint DEFAULT '1',
`tiu` tinyint unsigned DEFAULT '1',
`si` smallint DEFAULT '1',
`siu` smallint unsigned DEFAULT '1',
`i` int DEFAULT '1',
`iu` int unsigned DEFAULT '1',
`bi` bigint DEFAULT '1',
`biu` bigint unsigned DEFAULT '1'
);
INSERT INTO foo () VALUES ();
Then I called some JDBC code and used getObject() for each column, and asked it to tell me what data type it returned:
stmt = conn.createStatement();
rs = stmt.executeQuery("SELECT * FROM foo");
while (rs.next()) {
Object b = rs.getObject("b");
System.out.println("b ("+b.getClass().getSimpleName()+"):\t" + b);
Object ti = rs.getObject("ti");
System.out.println("ti ("+ti.getClass().getSimpleName()+"):\t" + ti);
Object tiu = rs.getObject("tiu");
System.out.println("tiu ("+tiu.getClass().getSimpleName()+"):\t" + tiu);
Object si = rs.getObject("si");
System.out.println("si ("+si.getClass().getSimpleName()+"):\t" + si);
Object siu = rs.getObject("siu");
System.out.println("siu ("+siu.getClass().getSimpleName()+"):\t" + siu);
Object i = rs.getObject("i");
System.out.println("i ("+i.getClass().getSimpleName()+"):\t" + i);
Object iu = rs.getObject("iu");
System.out.println("iu ("+iu.getClass().getSimpleName()+"):\t" + iu);
Object bi = rs.getObject("bi");
System.out.println("bi ("+bi.getClass().getSimpleName()+"):\t" + bi);
Object biu = rs.getObject("biu");
System.out.println("biu ("+biu.getClass().getSimpleName()+"):\t" + biu);
}
Output:
b (Boolean): true
ti (Integer): 1
tiu (Integer): 1
si (Integer): 1
siu (Integer): 1
i (Integer): 1
iu (Long): 1
bi (Long): 1
biu (BigInteger): 1
I'm testing with MySQL Connector/J 5.1.44.
So it seems that TINYINT(1) is handled specially by the JDBC driver. It automatically converts it to a java.lang.Boolean.
Again, TINYINT(1) has no real effect on the range of possible values in MySQL. It's an 8-bit signed integer type. But the JDBC driver has special code to look for the (1) length option and it uses that as an advisory to make it cast to a java.lang.Boolean.
java.lang.Integer is okay up to the signed 32-bit INT; the driver then has to use java.lang.Long for an unsigned INT, and a BigInteger for an unsigned BIGINT. Java integer types are not unsigned, so to hold the larger values of an unsigned INT or BIGINT, Java has to step up to the next larger type.
BOOLEAN and TINYINT(1) are synonyms in MySQL, meaning that you can use them interchangeably without any problem.
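A quick sketch that makes the synonym visible (the table name demo is arbitrary):
CREATE TABLE demo (ok BOOLEAN NOT NULL DEFAULT FALSE);

-- SHOW CREATE TABLE reports the column as `ok` tinyint(1) NOT NULL DEFAULT '0',
-- because BOOLEAN and FALSE are rewritten to TINYINT(1) and 0 when the table is created.
SHOW CREATE TABLE demo;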

CakePHP saving default 0 values

Using CakePHP 1.3, I would like to save a 0 value for a field if nothing is written in the form field.
The best way would be to do it in MySQL alone, but I've had no success. I have tried setting:
Null=no and Default = 0; or Null=yes and Default = 0;
I also tried both combinations combined with a CakePHP behavior which, in beforeSave or beforeValidate, sets:
$model->data[$name][$field] = 0;
or
unset($model->data[$name][$field]);
also with:
Null=yes and Default = 0; or Null=yes and Default = NULL;
The query is always:
INSERT INTO `table` (`zero_field`, `other_fields`) VALUES (NULL, 'other_data')
or
INSERT INTO `table` (`zero_field`, `other_fields`) VALUES ('', 'other_data')
And if Null=no, I get an error: Column 'zero_field' cannot be null
Even if I unset the field, the generated query still includes it.
How should I save a 0 value in the database if the form field is not set or is empty?
The zero_field is an int(11).
Set null = no, default = 0 for your field in the database (see the SQL sketch below).
Clear the model cache files in app/tmp/cache/models/*.
Be sure to use Model->create() before saving a new record so that the new record is populated with the default values.
Save and win.
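For the first step, a sketch of the column change; the table name your_table stands in for whatever table the model uses:
-- Disallow NULL and fall back to 0 whenever the column is omitted from an INSERT.
ALTER TABLE your_table
  MODIFY zero_field INT(11) NOT NULL DEFAULT 0;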

SQL Server 2008 Merge Soft Delete Error

I'm trying to perform a soft delete on a row in my target table using the SQL Server 2008 MERGE command.
I think this should fall under the "when not matched by source" section, since the source is missing the row and the target still has it. All I want to do is set the IsActive bit to false, but I'm getting an error.
"Attempting to set a non-NULL-able column's value to NULL."
What am I missing?
The Users table is:
[ID] [nvarchar](50) NOT NULL,
[FirstName] [nvarchar](200) NULL,
[LastName] [nvarchar](200) NULL,
[EmailAddress] [nvarchar](200) NULL,
[IsActive] [bit] NOT NULL
The Merge statement is:
merge into Users
using TempUserTable lu
on Users.ID = lu.ID
when matched then
update set
ID = lu.ID,
FirstName = lu.FirstName,
LastName = lu.LastName,
EMailAddress = lu.EmailAddress,
IsActive = lu.Status
when not matched then
insert (ID, FirstName, LastName, EmailAddress, IsActive)
values (lu.ID, lu.FirstName, lu.LastName, lu.EmailAddress, lu.Status)
when not matched by source then
update set IsActive = 0;
You can get this to work exactly as you want, but for me I needed to add a condition to the WHEN NOT MATCHED BY SOURCE line.
So try something like...
WHEN NOT MATCHED BY SOURCE
AND TARGET.[IsActive] = 1
AND TARGET.[DeletedOn] IS NULL
THEN UPDATE
SET
TARGET.[IsActive] = 0,
TARGET.[DeletedOn] = SYSDATETIMEOFFSET()
It appears that your temp table TempUserTable has a NULL in either the Status column (which feeds IsActive) or the ID column.
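If that is the case, a defensive variant of the MERGE is sketched below; it assumes a NULL Status should count as inactive and that rows with a NULL ID should simply be skipped:
merge into Users
using (select * from TempUserTable where ID is not null) as lu  -- skip rows that cannot match a key
on Users.ID = lu.ID
when matched then
    update set
        FirstName = lu.FirstName,
        LastName = lu.LastName,
        EMailAddress = lu.EmailAddress,
        IsActive = ISNULL(lu.Status, 0)  -- treat a NULL status as inactive
when not matched then
    insert (ID, FirstName, LastName, EmailAddress, IsActive)
    values (lu.ID, lu.FirstName, lu.LastName, lu.EmailAddress, ISNULL(lu.Status, 0))
when not matched by source then
    update set IsActive = 0;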