Create a data field with a GMT-timezone timestamp in MySQL?

I'm trying to convert a PostgreSQL SQL query to MySQL using a translator.
This is the query in Postgres:
comment_date_gmt timestamp without time zone DEFAULT timezone('gmt'::text, now()) NOT NULL,
It gets converted to:
comment_date_gmt timestamp DEFAULT timezone('gmt',
The unclosed parenthesis is a sign that something isn't right. I'm trying to figure out what this query should look like. Any suggestions?

The only reliable SQL query dialect converter is the human brain.
Tools can be useful for the basics, like data type renaming, but lots of that sort of thing can be avoided by just writing the queries using standard types in the first place.
You'll have a very hard time converting a MySQL query that uses query variables to a PostgreSQL query, or converting a PostgreSQL (well, SQL-standard) recursive common table expression to something MySQL understands. The two have totally different stored procedure languages, different built-in functions, and all sorts of things. array_agg, unnest, etc ... most of that stuff would require translation to queries using MySQL variables where it's possible to do it at all. Then you've got window functions like row_number, lead, lag, and aggregates used as running windows like sum(blah) OVER (...). A generic converter would need to "understand" the query to actually do the job.
A specific answer for the named problem isn't really possible since you haven't identified the converter tool.
At a guess, try changing the PostgreSQL query to:
comment_date_gmt timestamp without time zone DEFAULT (current_timestamp AT TIME ZONE 'utc') NOT NULL,
This is the standard phrasing understood by PostgreSQL and other compliant databases, so the converter stands a better chance of handling it.
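On the MySQL side, a rough sketch of what the column could end up as (my own assumption, not the converter's output). MySQL's TIMESTAMP type already stores values internally as UTC, so no timezone() call is needed:
comment_date_gmt timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
The semantics aren't identical, though: MySQL converts TIMESTAMP values to the session time zone on retrieval, whereas the PostgreSQL column stores a plain GMT timestamp.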

Related

How to export data from Cloud SQL to BigQuery on a daily basis?

I have created a connection to Cloud SQL and used EXTERNAL_QUERY() to export the data to BigQuery. My problem is that I do not know a computationally efficient way to export a new day's data, since the Cloud SQL table is not partitioned; it does have a date column, date_field, but it is of the datatype CHAR.
I have tried running the following query, with a view to scheduling a similar one so that it inserts the results: SELECT * FROM EXTERNAL_QUERY("connection", "SELECT period FROM table where date_field = cast(current_date() as char);") but it takes very long to run, whereas: SELECT * FROM EXTERNAL_QUERY("connection", "SELECT period FROM table where date_field = '2020-03-20';") is almost instant.
Firstly, it’s highly recommended to convert the ‘date_field’ column to the datatype DATE. This would improve simplicity and performance in the future.
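As a sketch of that first suggestion (the table name is a placeholder, and it assumes every existing CHAR value is already in 'YYYY-MM-DD' form so the implicit conversion succeeds):
ALTER TABLE your_table MODIFY date_field DATE;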
When comparing two strings, MySQL will make use of indexes to speed up the query. This works when the string is defined literally, as '2020-03-20' for example. When casting the current date to a string, it's possible that the character sets used in the comparison aren't the same, so indexes can't be used.
You may want to check the character set of current_date() once it has been cast, compared to the values in the date_field column. You could then use this command instead of CAST:
CONVERT(current_date() USING enter_char_sets_here)
Here is the documentation for the different casting functions.
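Putting that together, the scheduled query could look something like this sketch (connection and table names copied from the question; utf8mb4 is only a placeholder for whatever character set date_field actually uses):
SELECT * FROM EXTERNAL_QUERY("connection",
  "SELECT period FROM table WHERE date_field = CONVERT(current_date() USING utf8mb4);");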

Proper way to use datetime in Codeigniter queries

I wonder if there is some common method that would help me write queries with date/time fields. For example: I am developing a very small project using a MySQL database, but my client is considering switching to his existing SQL Server.
Example (datetime column):
SELECT DATE_FORMAT(contract_date, '%d.%m.%Y') FROM `employees`
Question: Can the query above remain usable if I replace the database driver from mysqli (currently) to sqlsrv?
I understand I can use some type of config variable for the date format... Would that be the best way? Is there something that CodeIgniter 3 has in place?
Feel free to use your own query sample.
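As an illustration of why the query isn't portable as-is (my own sketch, not CodeIgniter-specific; FORMAT() assumes SQL Server 2012 or later), the same dd.mm.yyyy output is spelled differently on the two engines:
-- MySQL
SELECT DATE_FORMAT(contract_date, '%d.%m.%Y') FROM employees;
-- SQL Server
SELECT FORMAT(contract_date, 'dd.MM.yyyy') FROM employees;
A common alternative is to select the raw datetime and format it in PHP, which keeps the SQL identical under both drivers.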

Dealing with different datetime formats in the DB?

I'm writing a Ruby program using Sequel that runs against a legacy database, and there is an issue dealing with different datetime formats.
The DB has a table with a start_date column. In Sequel's migration script I set it to DateTime, which is a timestamp type in SQLite; however, the legacy data uses more than one time format:
Some values use ISO 8601, like 2013-09-01T08:28:00+10:00.
Others use a different format, which I don't know the name of, like 2013-09-01 08:28:00.000000+1000.
The problem is that when I run a query against the table and filter by start_date, the difference between the two datetime formats causes incorrect results.
The query I'm using is:
current = Time.now
MyModel.where { start_date < current }
Sequel will convert it into SQL like this:
SELECT * FROM `my_model` WHERE `start_date` < '2013-09-01 08:28:00.000000'
From my local testing, it looks like the dates are being compared as strings, so 2013-09-01 08:28:00.000000+1000 is less than 2013-09-01T01:28:00+10:00, because a space sorts before T. This is not what I want.
I could use an ISO 8601 time like:
current_iso8601 = Time.now.iso8601
MyModel.where { start_date < current_iso8601 }
But it won't solve the problem because the database has two different datetime formats.
My questions are:
Does Ruby/Sequel support querying the database by Date/Time, rather than as a string?
Does it work with different datetime formats?
SQLite is just for local testing; in production it will be MySQL. So the solution should use general Sequel methods as an adapter and should not rely on any database-specific methods.
NOTE: the program is not a Rails application.
Thank you very much!
SQLite does not have date/time types (see http://sqlite.org/datatype3.html). It stores datetime values as strings.
The best solution is to use the same database in development/testing that you use in production. If you don't want to do that, you need to convert all the SQLite datetime values so that they all use the same ISO8601 format. That way the comparison operators will work correctly (as they do in MySQL).
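As a rough sketch of that normalisation (not from the answer itself; it reuses the table and column names from the question and assumes every legacy value follows exactly the space-separated pattern shown, with fractional seconds and a +HHMM offset without a colon):
-- rewrite '2013-09-01 08:28:00.000000+1000' as '2013-09-01T08:28:00+10:00'
UPDATE my_model
SET start_date = substr(start_date, 1, 10) || 'T' ||
                 substr(start_date, 12, 8) ||
                 substr(start_date, -5, 3) || ':' || substr(start_date, -2)
WHERE start_date LIKE '% %';
After that, every row compares consistently as a string; using MySQL in development and testing too, as suggested above, remains the more robust fix.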

Is there a MySQL equivalent to Oracle's TIMESTAMP WITH TIME ZONE?

Is there a MySQL equivalent to Oracle's TIMESTAMP WITH TIME ZONE?
I need to map an Oracle table, which has some columns of that datatype, to a MySQL table, but I can't seem to find an easy way to do this without resorting to some MySQL functions.
Thanks and best regards.
No, you'll need to split the data into two columns: one a datetime, and the other holding the timezone information. But what you put in the latter field depends on what you've got stored in Oracle - the TIMESTAMP WITH TIME ZONE datatype can contain the TZ offset and (optionally) the time zone region. Obviously the latter is a requirement for the date time to be semantically correct, but IIRC Oracle does not enforce that this data is populated.
without resorting to some MySQL functions
Since MySQL doesn't have the datatype, it'll be very difficult to write a MySQL function to process it - it's a lot simpler to create a MySQL-compatible representation in Oracle, where the datatype is supported. You just need to work out what data you've actually got and decide how you want to represent it in MySQL. By convention that means storing it in UTC along with the TZ in a separate column, then converting it on selection with the convert_tz function (always from UTC).
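A minimal sketch of that convention (the table and column names are invented for illustration); numeric offsets such as '+10:00' work with convert_tz even when the named time zone tables aren't loaded:
CREATE TABLE events (
  id        INT PRIMARY KEY,
  ts_utc    DATETIME NOT NULL,      -- the instant, normalised to UTC
  tz_offset VARCHAR(6) NOT NULL     -- original offset from Oracle, e.g. '+10:00'
);
-- reconstruct the original local time on selection, always converting from UTC
SELECT id, CONVERT_TZ(ts_utc, '+00:00', tz_offset) AS ts_local, tz_offset
FROM events;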
MySQL always stores TIMESTAMP values as UTC. DATETIME and DATE values are always stored without timezone information.
You can configure MySQL to return values from now() in different timezones.
To store the original offset you need to add it to a separate column yourself.
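For example (a sketch of the session-level setting, not tied to any particular schema):
SET time_zone = '+00:00';
SELECT NOW();   -- current time shown in UTC
SET time_zone = '+10:00';
SELECT NOW();   -- the same instant, shown with a +10:00 offset applied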

mysql to oracle

I've googled this but can't get a straight answer. I have a MySQL database that I want to import into Oracle. Can I just use the MySQL dump?
Nope. You need to use some ETL (Extract, Transform, Load) tool.
Oracle SQL Developer has a built-in feature for migrating a MySQL DB to Oracle.
Try this link - http://forums.oracle.com/forums/thread.jspa?threadID=875987&tstart=0 - it is for migrating MySQL to Oracle.
If the dump is a SQL script, you will need to do a lot of copy & replace to make that script work on Oracle.
Things that come to my mind (a small before/after sketch follows this list):
remove the dreaded backticks
remove all ENGINE=.... options
remove all DEFAULT CHARSET=xxx options
remove all UNSIGNED options
convert all DATETIME types to DATE
replace BOOLEAN columns with e.g. integer or a CHAR(1) (Oracle does not support boolean)
convert all int(x), smallint, tinyint data types to simply integer
convert all mediumtext, longtext data types to CLOB
convert all VARCHAR columns that are defined with more than 4000 bytes to CLOB
remove all SET ... commands
remove all USE commands
remove all ON UPDATE options for columns
rewrite all triggers
rewrite all procedures
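To make the list concrete, here is the before/after sketch (a hypothetical table covering only a few of the points above):
-- MySQL original
CREATE TABLE `posts` (
  `id` int(11) UNSIGNED NOT NULL,
  `title` varchar(200) DEFAULT NULL,
  `body` longtext,
  `created_at` datetime,
  `is_draft` boolean
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Oracle rewrite following the list above
CREATE TABLE posts (
  id         INTEGER NOT NULL,
  title      VARCHAR2(200) DEFAULT NULL,
  body       CLOB,
  created_at DATE,
  is_draft   CHAR(1)
);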
The answer depends on which MySQL features you use. If you don't use stored procedures, triggers, views, etc., chances are you will be able to use the MySQL export without major problems.
Take a look at:
mysqldump --compatible=oracle
If you do use these features, you might want to try an automatic converter (Google offers some).
In every case, some knowledge of both syntaxes is required to be able to debug problems (there almost certainly will be some). Also remember to test everything thoroughly.