I need to insert an empty value into a date field.
I've read "How to insert an empty value in MySQL date type field?", which states to set the field to allow NULL. I've done that.
It also states to insert NULL rather than an empty value. Unfortunately, that's not possible with the current setup - is there a way to allow it to accept an empty string?
Say, your table is:
CREATE TABLE MyTable (
    A INT,
    B INT,
    C INT
);
The statement:
INSERT INTO MyTable (B,C) VALUES (3,4) ;
will leave A null.
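The same idea applies to a nullable DATE column, which is the situation in the question. A minimal sketch (the table and column names here are made up for illustration):
CREATE TABLE Orders (id INT, shipped DATE NULL);
INSERT INTO Orders (id) VALUES (1);                  -- shipped is left NULL
INSERT INTO Orders (id, shipped) VALUES (2, NULL);   -- or insert NULL explicitly
Note that under strict SQL mode MySQL will reject an empty string '' for a DATE column; NULL (or simply omitting the column) is the way to represent "no date".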
I'm trying to insert data into a table using this query:
INSERT INTO table (
    url,
    v_count,
    v_date)
SELECT
    url,
    v_count,
    v_date
FROM json_populate_recordset(null::record,
    '[{"url_site":"test.com","visit_count":1,"visit_date":"2022-08-31"},
      {"url_site":"dev.com","visit_count":2,"visit_date":"2022-08-31"}]'::json)
    AS ("url" varchar(700), "v_count" integer, "v_date" date)
And I'm getting this error:
null value in column "v_date" of relation table violates not null constraint
Since my JSON could contain hundreds of entries at times, how should I send the date in my JSON?
Is there another (more efficient) way to insert this data into the table?
Edit: in Postico 1.5.20 my example above works as long as the JSON keys are named the same as the table columns. How can I reference different names in my JSON keys?
Since v_date can resolve to null, you'll need to either skip those rows or provide a value when null appears.
To skip the null values, you can add a WHERE v_date IS NOT NULL clause to your SELECT statement.
Otherwise, you can use COALESCE() to supply a value when v_date is null. For example: ... SELECT url, v_count, COALESCE(v_date, now()) FROM json_populate_recordset...
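Putting that together, here is a minimal sketch of the COALESCE variant of the original statement, assuming (per the questioner's edit) that the JSON keys have been renamed to match the column definition list; the table name from the question is double-quoted because table is a reserved word, and CURRENT_DATE is used as the fallback so the result is already a date:
INSERT INTO "table" (url, v_count, v_date)
SELECT
    url,
    v_count,
    COALESCE(v_date, CURRENT_DATE)   -- fall back to today's date when v_date is null
FROM json_populate_recordset(null::record,
    '[{"url":"test.com","v_count":1,"v_date":null},
      {"url":"dev.com","v_count":2,"v_date":"2022-08-31"}]'::json)
    AS ("url" varchar(700), "v_count" integer, "v_date" date);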
I need to change a column type from TINYINT (used as a boolean) to VARCHAR, without losing data.
I have found many answers on Stack Overflow, but all of them are written for Postgres and I have no idea how to rewrite them for MySQL.
Answers to this problem on Stack Overflow look like this:
ALTER TABLE mytable ALTER mycolumn TYPE VARCHAR(10) USING CASE WHEN mycolumn=0 THEN 'Something' ELSE 'TEST' END;
How would similar logic look in MySQL?
The syntax you show has no equivalent in MySQL. There's no way to modify values during an ALTER TABLE. An ALTER TABLE in MySQL will only translate values using built-in type casting; that is, an integer will be converted to the string form of that integer value, just as it would be in a string expression.
For MySQL, here's what you have to do:
Add a new column:
ALTER TABLE mytable ADD COLUMN type2 VARCHAR(10);
Backfill that column:
UPDATE mytable SET type2 = CASE `type` WHEN 0 THEN 'Something' ELSE 'TEST' END;
If the table has millions of rows, you may have to do this in batches.
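For instance, a batched backfill might loop over ranges of the primary key (a sketch, assuming a hypothetical integer primary key named id; pick a batch size that suits your workload):
-- run repeatedly, advancing the id range each time, until all rows are covered
UPDATE mytable
SET type2 = CASE `type` WHEN 0 THEN 'Something' ELSE 'TEST' END
WHERE id BETWEEN 1 AND 10000;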
Drop the old column and optionally rename the new column to the name of the old one:
ALTER TABLE mytable DROP COLUMN `type`, RENAME COLUMN type2 to `type`;
Another approach would be to change the column, allowing integers to convert to the string format of the integer values. Then update the strings as you want.
ALTER TABLE mytable MODIFY COLUMN `type` VARCHAR(10);
UPDATE mytable SET `type` = CASE `type` WHEN '0' THEN 'Something' ELSE 'TEST' END;
Either way, be sure to test this first on another table before trying it on your real table.
I need to copy the data from an old table with millions of rows to a newer table with a slightly different definition. Most importantly, there is one new field with a NULL default, and a varchar field became an enum (whose values map directly).
Old table:
id : integer
type : varchar
New table:
id : integer
type : enum
number : integer, default null
All of the possible string values of type are within the new enumeration.
I tried the following:
insert into new.table select * from old.table
But I obviously get:
Insert value list does not match column list: 1136 Column count doesn't match value count at row 1
You can copy the table data and structure from the phpMyAdmin window, and then modify the new table and add the new column.
Using the INSERT ... SELECT syntax:
INSERT INTO new.table (`id`, `type`) SELECT `id`, `type` FROM old.table
Apparently the varchar to enum remapping isn't a problem.
I'm experimenting with temporary tables and running into a problem.
Here's some super-simplified code of what I'm trying to accomplish:
IF(Object_ID('tempdb..#TempTroubleTable') IS NOT NULL) DROP TABLE #TempTroubleTable
select 'Hello' as Greeting,
NULL as Name
into #TempTroubleTable
update #TempTroubleTable
set Name = 'Monkey'
WHERE Greeting = 'Hello'
select * from #TempTroubleTable
Upon attempting the update statement, I get the error:
Conversion failed when converting the varchar value 'Monkey' to data type int.
I can understand why the temp table might not expect me to fill that column with varchars, but why does it assume int? Is there a way I can prime the column to expect varchar(max) but still initialize it with NULLs?
You need to cast NULL to the datatype you want, because by default SQL Server infers INT for a bare NULL:
SELECT 'hello' AS greeting,
       CAST(NULL AS VARCHAR(32)) AS name
INTO #temp
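The question asks about VARCHAR(MAX) specifically; the same pattern applies. Here is the question's own snippet with the cast added:
IF OBJECT_ID('tempdb..#TempTroubleTable') IS NOT NULL DROP TABLE #TempTroubleTable
SELECT 'Hello' AS Greeting,
       CAST(NULL AS VARCHAR(MAX)) AS Name   -- column is created as nullable VARCHAR(MAX)
INTO #TempTroubleTable
UPDATE #TempTroubleTable
SET Name = 'Monkey'
WHERE Greeting = 'Hello'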
I'm trying to insert rows from one table into another. In the first table, the datatype of one column is char(5), but the same column has a tinyint(4) datatype in the second table. When I run the insert query, it says:
Incorrect integer value: '' for column 'x' at row 258
I cannot alter or modify the datatype now, as that would violate some constraints. Is there a way to use CAST or CONVERT to turn the char into tinyint?
Thanks.
You probably want something like this:
INSERT INTO newtable
SELECT CASE WHEN x = '' THEN 0 ELSE x END
FROM oldtable
I'm assuming that you want blanks to turn into zeros? If not, then provide the integer value you want blanks to have.
If there are other exceptions, use more alternatives in the CASE expression.
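For instance, a sketch with extra branches; the specific string values here are hypothetical placeholders for whatever non-numeric values actually occur in your data:
INSERT INTO newtable
SELECT CASE
         WHEN x = ''    THEN 0   -- blanks become zero
         WHEN x = 'N/A' THEN 0   -- hypothetical placeholder value
         ELSE x                  -- remaining values convert to TINYINT on insert
       END
FROM oldtable;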