Laravel allowing null dates on Model::create() but not Model->update() - mysql

I have set up my Laravel migration to allow nullable dates for my StartTime and EndTime entries, like so:
$table->dateTime( 'StartTime' )->nullable();
$table->dateTime( 'EndTime' )->nullable();
When I create a new entry through eloquent, it allows me to insert null values into my database successfully:
try {
    // Create the new Campaign record
    $campaign = Campaign::create( $request->all() );
} catch ( \Exception $e ) {
    // ... error handling omitted
}
+----+-------+--------+-----------+---------+---------------------+
| Id | Name  | Active | StartTime | EndTime | created_at          |
+----+-------+--------+-----------+---------+---------------------+
|  1 | Test2 |      0 | NULL      | NULL    | 2020-07-02 22:01:22 |
+----+-------+--------+-----------+---------+---------------------+
1 row in set (0.00 sec)
However, when I later try and update my record using eloquent and still passing a null value for StartTime, it throws an error:
try {
    // Get a reference to the campaign
    $campaign = Campaign::find( $id );
    // Update the campaign
    $campaign->update( $request->all() );
} catch ( \Exception $e ) {
    // ... error handling omitted
}
(22007) SQLSTATE[22007]: Invalid datetime format: 1292 Incorrect datetime value: 'null' for column 'StartTime' at row 1
In the case of the create method, I am not passing in a StartTime value at all, but in the case of the update method, I am simply passing back the null value that Laravel returns as part of the model. So in other words, I haven't altered the value of StartTime at all, I've simply just passed $campaign back to Laravel for the update.
So it seems that Laravel honours the nullable() setting when inserting a new record into the database, but will not let me pass a null value back for the update.
Am I missing something here? I can't seem to find a solution to this anywhere.
UPDATE
Okay, so on further investigation it seems my problem stems from the AngularJS $http POST request. For troubleshooting purposes, I added code to my Laravel controller to convert the StartTime to null:
if( $request->StartTime === 'null' ) {
    $request['StartTime'] = null;
}
And that worked. So it looks like Angular is passing the null value back in the request as the string 'null'.
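If several fields can arrive this way, a more general version of that workaround might look like the following sketch (inside the same controller method; the field list is illustrative and not from the original post):
$input = $request->all();

// Normalise the string 'null' (as sent by the AngularJS $http request)
// back to a real PHP null before updating.
foreach ( ['StartTime', 'EndTime'] as $field ) {
    if ( isset( $input[$field] ) && $input[$field] === 'null' ) {
        $input[$field] = null;
    }
}

$campaign->update( $input );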

Laravel 5.3 was updated to use MySQL "strict" mode by default, which includes the NO_ZERO_DATE mode.
The issue is that your existing data was allowed to have '0000-00-00 00:00:00' as a datetime value. But now your connection is using an SQL mode that does not allow that value (NO_ZERO_DATE).
The quick option is to just disable "strict" mode on your database connection. Open your config/database.php file and make sure your database connection has 'strict' => false.
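For reference, that setting lives in the mysql connection block of config/database.php; a minimal sketch of the relevant excerpt (other options omitted) looks like this:
'connections' => [
    'mysql' => [
        'driver' => 'mysql',
        // ...
        'strict' => false, // disable MySQL strict mode for this connection
    ],
],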
Or, create a migration like this:
$table->datetime('StartTime')->nullable(true);
Or:
$table->datetime('StartTime')->default(null);

Related

Manual Insert into MySQLx Collection

I have a large amount of JSON data that needs to be inserted into a MySQLx Collection table. The current Node implementation keeps crashing when I attempt to load my JSON data in, and I suspect it's because I'm inserting too much at once through the collection API. I'd like to manually insert the data into the database using a traditional SQL statement (in the hope that this will get me past the Node.js crash).
The problem is that I have this table def:
+--------------+---------------+------+-----+---------+-------------------+
| Field        | Type          | Null | Key | Default | Extra             |
+--------------+---------------+------+-----+---------+-------------------+
| doc          | json          | YES  |     | NULL    |                   |
| _id          | varbinary(32) | NO   | PRI | NULL    | STORED GENERATED  |
| _json_schema | json          | YES  |     | NULL    | VIRTUAL GENERATED |
+--------------+---------------+------+-----+---------+-------------------+
But when running
insert into documents values ('{}', DEFAULT, DEFAULT)
I get:
ERROR 3105 (HY000): The value specified for generated column '_id' in table 'documents' is not allowed.
I've tried not providing the DEFAULTs, using NULL (but _id doesn't allow NULL even though that's the listed default), using 0 for _id, using numbers, and using uuid_to_bin(uuid()), but I still get the same error.
How can I insert this data into the table directly? (I'm using session.sql('INSERT...').bind(JSON.stringify(data)).execute() with the @mysql/xdevapi library.)
The _id column is auto-generated from the field of the same name in the JSON document. When you use the CRUD interface to insert documents, the X Plugin is capable of generating a unique value for this field. However, by executing a plain SQL statement you are bypassing that logic. So, you are able to insert documents if you generate the _ids yourself; otherwise you will bump into that error.
As an example (using crypto.randomInt()):
const { randomInt } = require('crypto')

session.sql('insert into documents (doc) values (?)')
    .bind(JSON.stringify({ _id: randomInt(Math.pow(2, 48) - 1) }))
    .execute()
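Applied to your use case, the same idea might look like the sketch below, where data stands in for one parsed JSON document from your data set (an assumption on my part):
const { randomInt } = require('crypto')

// Give the document an _id (if it doesn't already have one) before
// inserting, since the plain SQL path won't generate one for you.
const doc = Object.assign({ _id: randomInt(Math.pow(2, 48) - 1) }, data)

session.sql('insert into documents (doc) values (?)')
    .bind(JSON.stringify(doc))
    .execute()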
Though I'm curious about the issue with the CRUD API, and I wanted to see if I could reproduce it. How are you inserting those documents in that case, and what kind of feedback (if any) is provided when it "crashes"?
Disclaimer: I'm the lead developer of the MySQL X DevAPI connector for Node.js

Node.js with Knex + MySQL: migration error when renaming a column, caused by a default value

I am new to Node.js. I am trying to migrate my database in Node.js (Express) using Knex, with MySQL as the database. I want to rename one field in a table, and when I try to run the migration I get an error about a default value.
Here is what I'm trying to do.
My migration:
exports.up = function(knex) {
    return knex.schema.table('tbl_skills', function(table) {
        table.renameColumn('preminum_price', 'premium_price')
    })
};
Here is my database structure:
| Name           | Datatype  | Length | Default             |
|----------------|-----------|--------|---------------------|
| id             | INT       | 20     | No default          |
| preminum_price | DOUBLE    | 5,2    | No default          |
| insertdate     | TIMESTAMP |        | 0000-00-00 00:00:00 |
| updatedate     | TIMESTAMP |        | 0000-00-00 00:00:00 |
And this is what I get when I run knex migrate:latest:
migration file "20191125105226_alter_tbl_skills.js" failed
migration failed with error: alter table `tbl_skills` change `preminum_price` `premium_price` double(5,2) NOT NULL - ER_INVALID_DEFAULT: Invalid default value for 'insertdate'
Error: ER_INVALID_DEFAULT: Invalid default value for 'insertdate'
I don't know how to set a default value for insertdate. Please help.
That is most probably because of server SQL Mode - NO_ZERO_DATE.
In strict mode, '0000-00-00' is not allowed as a valid date. You can still insert zero dates with the IGNORE option. When not in strict mode, the date is accepted but a warning is generated. If you have access to my.ini (the MySQL config file), remove NO_ZERO_DATE from sql-mode and restart the server.
You can check it with SHOW VARIABLES LIKE 'sql_mode'.
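If you cannot edit the server config, changing the mode on the running server is also possible; the statements below are a sketch (the GLOBAL variant needs the appropriate privilege and does not survive a server restart):
-- Inspect the current mode
SHOW VARIABLES LIKE 'sql_mode';

-- Drop NO_ZERO_DATE from the running server's mode (not persisted)
SET GLOBAL sql_mode = REPLACE(@@GLOBAL.sql_mode, 'NO_ZERO_DATE', '');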

Laravel - timezone saved in DB off by 5 hours

So, in Laravel's app.php I have the following timezone set:
'timezone' => 'America/Denver',
In MySQL settings I've got the same timezone. When I run select now() I get the current Denver time.
However, when I create a record in any table in the database, the created_at field (with default value set to CURRENT_TIMESTAMP) somehow ends up 5 hours ahead of Denver.
I believe it's somehow defaulting to UTC time, but I am not sure. All online resources I've found related to this issue claim that setting the timezone in Laravel should do the trick.
What else can I do to make sure I get the correct timezone saved in CURRENT_TIMESTAMP?
I don't think server-wide PHP settings should take precedence over what's set in MySQL or in Laravel in this matter, but I have still gone ahead and tried setting the timezone in php.ini to America/Denver, with no luck. It was previously commented out (not set to UTC).
Use
SET SESSION time_zone = 'America/Denver';
In a raw query (DB::select(DB::raw("SET SESSION time_zone = 'America/Denver'"))) before inserting and updating.
Test case
CREATE TABLE test (
id INT
, created_at DATETIME DEFAULT CURRENT_TIMESTAMP
);
INSERT INTO test (id) VALUES(1);
SET SESSION time_zone = 'America/Denver';
INSERT INTO test (id) VALUES(2);
Possible results
| id | created_at |
| --- | ------------------- |
| 1 | 2019-03-04 13:57:31 |
| 2 | 2019-03-04 06:57:31 |
Eloquent creates a new Carbon object when it sets the created_at timestamp; it doesn't use MySQL's default. This should use date_default_timezone_set, which Laravel is setting.
Rather obvious answer is: have you tried clearing your config cache?
php artisan config:clear
As an aside it is generally advisable to always use UTC across everything, and only convert it to a local timezone at the last possible moment.
From Carbon:
// PS: we recommend you to work with UTC as default timezone and only use
// other timezones (such as the user timezone) on display
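As an illustration of that advice, converting only at display time might look like this sketch (the timezone name is just an example):
use Carbon\Carbon;

// Store timestamps in UTC...
$createdAt = Carbon::now('UTC');

// ...and convert to the user's timezone only when displaying.
echo $createdAt->copy()->setTimezone('America/Denver')->toDateTimeString();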

How to receive any changes in MySQL database in real time from client code in a timed function?

I have a database with a log record table which looks like this:
+-----------+------------------------+--------+
| Timestamp | Symbol_name            | Status |
+-----------+------------------------+--------+
|         1 | Group2                 |      1 |
|         2 | Group1-Device3-Signal1 |      1 |
|         3 | Group2-Device1-Signa13 |      0 |
+-----------+------------------------+--------+
Where Timestamp is a double, Symbol_name is a varchar, and Status is an int.
Log records which contain the above data are going to be inserted into the table in real time, and my client code is supposed to query those records and analyze them. The problem I am having right now is reading a unique record each query. Currently, I have this function:
/* Called every 1000 ms (1 second). */
gboolean app_models_timed_query(gpointer data) {
    FwApp *app = data;
    char query[APP_MAXIMUM_STRING_CHARS];

    strncpy(query, "SELECT * FROM ", APP_MAXIMUM_STRING_CHARS);
    strncat(query, app->query_db_table_name, APP_MAXIMUM_STRING_CHARS);
    strncat(query, " WHERE Timestamp <> @lastSeenTimestamp AND Symbol_name <> @lastSeenSymbolName AND Status <> @lastSeenStatus;", APP_MAXIMUM_STRING_CHARS);

    if (mysql_query(app->query_db_con, query))
    {
        printf("Unable to retrieve data from query table.\n");
        return TRUE;
    }

    MYSQL_RES *result = mysql_store_result(app->query_db_con);
    if (result == NULL) return TRUE;

    /* Analyze the resulting row(s) here. */
    /* How to set @lastSeenTimestamp, @lastSeenSymbolName and @lastSeenStatus here? */

    return TRUE;
}
The function gets called every second, and in it, I query the database using the following statement:
SELECT * FROM table1 WHERE Timestamp <> @lastSeenTimestamp AND Symbol_name <> @lastSeenSymbolName AND Status <> @lastSeenStatus;
No two records will ever be exactly the same, but they can have the same timestamp, status, or symbol name.
Note that before I enable app_models_timed_query to be called each second, I set the user-defined variables like so:
SET @lastSeenTimestamp = -1, @lastSeenSymbolName = '', @lastSeenStatus = 0;
And since timestamps will never be negative, the first time app_models_timed_query is called, the first row will be in the result of the query.
However, my question is how to set the user-defined variables to the last row of the result of my query. Also, I want to know if there is a better way of reading only newly inserted rows each time app_models_timed_query is called.
Many thanks,
Vikas
You should be using a message queue like RabbitMQ for this sort of application. Message queues have an API for fetching from the top of a queue. Even if you store the main data in MySQL, you can use the message queue for the primary key. With infrastructure suited to the task, your application doesn't need to preserve state.
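If you do stay with plain MySQL polling instead, one common pattern (my own sketch, not something the answer above proposes) is to add an AUTO_INCREMENT key and have the client remember the highest id it has processed, so each poll only returns rows inserted since the previous call:
-- Add a monotonically increasing key to the log table
ALTER TABLE table1
    ADD COLUMN id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY;

-- On every poll, substitute the largest id the client has already seen
-- (42 here is just an illustrative value):
SELECT id, Timestamp, Symbol_name, Status
FROM table1
WHERE id > 42
ORDER BY id;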

Seeking coding example for TSQLTimeStamp

Delphi XE2 and MySql.
My previous question led to the recommendation that I should be using MySql's native TIMESTAMP datatype to store date/time.
Unfortunately, I can't seem to find any coding examples, and I am getting weird results.
Given this table:
mysql> describe test_runs;
+------------------+-------------+------+-----+---------------------+-------+
| Field            | Type        | Null | Key | Default             | Extra |
+------------------+-------------+------+-----+---------------------+-------+
| start_time_stamp | timestamp   | NO   | PRI | 0000-00-00 00:00:00 |       |
| end_time_stamp   | timestamp   | NO   |     | 0000-00-00 00:00:00 |       |
| description      | varchar(64) | NO   |     | NULL                |       |
+------------------+-------------+------+-----+---------------------+-------+
3 rows in set (0.02 sec)
I would like to:
declare a variable into which I can store the result of SELECT CURRENT_TIMESTAMP - what type should it be? TSQLTimeStamp?
insert a row at test start which has start_time_stamp = the variable above and end_time_stamp = some "NULL" value ... "0000-00-00 00:00:00"? Can I use that directly, or do I need to declare a TSQLTimeStamp and set each field to zero? (There doesn't seem to be a TSQLTimeStamp.Clear; it's a structure, not a class.)
update the end_time_stamp when the test completes
calculate the test duration
Can someone please point me at a URL with some Delphi code which I can study to see how to do this sort of thing? GINMF.
I don't know why you want to hassle around with that TIMESTAMP and why you want to retrieve the CURRENT_TIMESTAMP just to put it back.
And as already stated, it is not a good idea to use a TIMESTAMP field as PRIMARY KEY.
So my suggestion is to use this TABLE SCHEMA
CREATE TABLE `test_runs` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`start_time_stamp` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`end_time_stamp` timestamp NULL DEFAULT NULL,
`description` varchar(64) NOT NULL,
PRIMARY KEY (`id`)
);
Starting a test run is handled by
INSERT INTO test_runs ( description ) VALUES ( :description );
SELECT LAST_INSERT_ID() AS id;
and to finalize the record you simply call
UPDATE test_runs SET end_time_stamp = CURRENT_TIMESTAMP WHERE id = :id
Just declare a TSQLQuery (or the correct component for the data access layer of your choice), attach it to a valid connection and populate its SQL property with:
select * from test_runs;
Double-click on the query to launch its Fields editor and select Add all fields from that editor's context menu.
It will create the correct field type, according to the data access layer and driver you're using to access your data.
Once that's done, if you need to use the value in code, usually you do it by using the AsDateTime property of the field, so you just use a plain TDateTime Delphi type and let the database access layer deal with the specific database details to store that field.
For example, if your query object is named qTest and the table field is named start_time_stamp, your Delphi variable associated with that persistent field will be named qTeststart_time_stamp, so you can do something like this:
var
  StartTS: TDateTime;
begin
  qTest.Open;
  StartTS := qTeststart_time_stamp.AsDateTime;
  ShowMessage('start date is ' + DateTimeToStr(StartTS));
end;
If you use dbExpress and are new to it, read A Guide to Using dbExpress in Delphi database applications
I don't know about MySQL, but if the TField subclass generated is a TSQLTimeStampField, you will need to use the type and functions in the SqlTimSt unit (Data.SqlTimSt for XE2+).
You want to declare the local variables as TSQLTimeStamp
uses Data.SqlTimSt, ...;
....
var
  StartTS: TSQLTimeStamp;
  EndTS: TSQLTimeStamp;
begin
  StartTS := qTeststart_time_stamp.AsSQLTimeStamp;
The SqlTimSt unit also includes functions to convert to and from TSQLTimeStamp, e.g. SQLTimeStampToDateTime and DateTimeToSQLTimeStamp.
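To cover the last step in the question (calculating the test duration), a minimal sketch might look like this; it assumes persistent fields named qTeststart_time_stamp and qTestend_time_stamp and uses SecondsBetween from System.DateUtils:
uses
  Data.SqlTimSt, System.DateUtils, System.SysUtils, Vcl.Dialogs;

var
  StartDT, EndDT: TDateTime;
begin
  // Convert the TSQLTimeStamp fields to TDateTime for arithmetic.
  StartDT := SQLTimeStampToDateTime(qTeststart_time_stamp.AsSQLTimeStamp);
  EndDT := SQLTimeStampToDateTime(qTestend_time_stamp.AsSQLTimeStamp);

  // Duration in whole seconds between start and end.
  ShowMessage(Format('Test duration: %d seconds', [SecondsBetween(EndDT, StartDT)]));
end;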
P.S. I tend to agree that using a timestamp as a primary key is likely to cause problems. I would tend to use a auto incrementing surrogate key as Sir Rufo suggests.