MySQL 5.1.73: Change timestamp format for general_log file

I'm trying to change the format of the general log file generated by MySQL 5.1.73.
The current timestamp format for event_time is:
"210909 10:32:12 12 Connect user@localhost on database"
Is there a way to change its timestamp format?
I would like it to follow a format such as 2015-04-14 22:52:11, or even include the timezone offset following RFC 3339, e.g. "1937-01-01T12:00:27.87+00:20".

I found a solution: add the mysql-general.log file as an input file in rsyslog. Applying a template then lets me rewrite the log with a correct timestamp format.
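For reference, a rough sketch of what that rsyslog setup could look like, using the newer RainerScript syntax (the file paths, tag, and template name here are assumptions; older rsyslog releases use the legacy $InputFile... directives instead):

# Sketch only: tail the general log and re-emit each line with an RFC 3339 timestamp
module(load="imfile")
input(type="imfile" File="/var/log/mysql/mysql-general.log" Tag="mysql-general:" Severity="info" Facility="local6")
template(name="tpl_mysql_rfc3339" type="string" string="%timegenerated:::date-rfc3339% %msg%\n")
if $programname == 'mysql-general' then {
    action(type="omfile" file="/var/log/mysql/mysql-general-rfc3339.log" template="tpl_mysql_rfc3339")
    stop
}

Note that this stamps each line with the time rsyslog read it, not the time MySQL wrote it, which is usually close enough for a tailed log.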
Thanks for your replies / comments

Related

MySQL to GeoMesa through .csv

I have a MySQL table whose data I have to export to .csv and then ingest that .csv into GeoMesa.
My MySQL table has a the_geom attribute with data type POINT, which is stored in the database as a blob.
Now I have two problems:
1. When I export the MySQL data into a .csv file, the file shows (...) for the the_geom attribute instead of a binary or text representation that GeoMesa could ingest. How do I overcome this?
2. The .csv file also shows # for any attribute with a DATETIME data type, although the full date and time appear when the column is widened in the spreadsheet. Will this cause a problem in GeoMesa?
For #1, MySQL's export does not automatically convert the POINT data type into text for you. You might need to call a conversion function such as AsWKT to output the geometry as Well-Known Text (WKT), a format GeoMesa can read Point data from.
For #2, I think you'll need to do the same for the date field; check out MySQL's date and time functions, e.g. DATE_FORMAT.
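A minimal sketch of what that export could look like, assuming a hypothetical table named points_table with columns id, the_geom, and created_at (AsWKT is the MySQL 5.x name; newer versions use ST_AsWKT):

-- Export the geometry as WKT and the datetime as plain text (names are assumptions)
SELECT id,
       AsWKT(the_geom),
       DATE_FORMAT(created_at, '%Y-%m-%dT%H:%i:%s')
FROM points_table
INTO OUTFILE '/tmp/points.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';

The resulting .csv then contains text like POINT(30.5 50.4) and 2015-04-14T22:52:11, which a GeoMesa converter can be configured to parse.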

Convert a Unix time in a CSV spreadsheet so I can import to a MySQL date/time stamp

I'm stuck on this. I have a guestbook I'm converting from an old website into Wordpress comments. The guestbook has all the dates written as Unix timestamps. I already have the file transposed so I can import it into Wordpress comments and everything works fine, except the date field, which defaults to 0000-00-00 00:00:00.
I also have fields in standard Excel time formats, but I need to find a way to convert those to the standard YYYY-MM-DD HH:MM:SS format.
Any ideas?
I actually figured this out after posting. Here was the issue. I had a CSV file from a previous guestbook that contained comments. In that CSV file the date/timestamp was a Unix timestamp (i.e. seconds since 1/1/1970).
In Excel, I found a formula to convert that to a date Excel can read. However, that date, when opened as a CSV, doesn't show in the correct format for MySQL. It uses the Excel date format, which is a serial number counting from a certain epoch date, similar in spirit to Unix time.
What I did was use the Excel formula TEXT(value, "yyyy-mm-dd hh:mm:ss") to convert the Excel date/time to a string in the correct format. Then the CSV file worked.
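For reference, if the rows end up being inserted with SQL anyway, MySQL can do the Unix-to-datetime conversion itself with FROM_UNIXTIME(), which skips the Excel step entirely. A minimal sketch, assuming a hypothetical staging table with the raw seconds value in comment_unix_ts:

-- Hypothetical staging table; comment_date is a DATETIME column
UPDATE guestbook_import
SET comment_date = FROM_UNIXTIME(comment_unix_ts);
-- e.g. FROM_UNIXTIME(1234567890) -> '2009-02-13 23:31:30' when the session time zone is UTC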

Syntax for MySQL LOAD XML "SET" parameter?

I am trying to load an XML file containing timestamp entries in the "20120924 22:12" format into MySQL.
I am using the LOAD XML feature. Of course it's not working, because MySQL is expecting "2012-09-24 22:12". If I were using LOAD DATA INFILE I would add
SET tmstmp = str_to_date(@var3, '%Y%m%d %h:i%')
to my command, where the tmstmp data is in column 3 of the file. So for XML I'd like to use
SET tmstmp = str_to_date(@tmstmp, '%Y%m%d %h:i%')
where tmstmp is the tag containing my timestamp data. But this doesn't work: @tmstmp is empty. How do I access tags in SET statements under LOAD XML? The MySQL documentation just defers to the LOAD DATA INFILE documentation, but it's not there, of course. Thanks for any help.
The format string is wrong (i% must be %i). So, change '%Y%m%d %h:i%' to '%Y%m%d %h:%i' and try loading the data again.
Also, %h should be %H because you are using a 24-hour format, giving '%Y%m%d %H:%i'.
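Putting the two corrections together, a rough sketch of the full statement (the file, table, row tag, and other column names here are assumptions, and it relies on your MySQL version populating the user variable from the matching <tmstmp> tag, as the question does):

LOAD XML LOCAL INFILE 'data.xml'
INTO TABLE mytable
ROWS IDENTIFIED BY '<row>'
(col1, col2, @tmstmp)
SET tmstmp = STR_TO_DATE(@tmstmp, '%Y%m%d %H:%i');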

How do I import mm/dd/yyyy values in a file to a database field of format yyyy-mm-dd?

I have a CSV file with a date column in the format mm/dd/yyyy (4/20/2012), and I need to load this column into a database table that has a datetime column in yyyy-mm-dd (2012-04-20).
I have used a Derived Column transformation for this purpose and written an expression like
(DT_WSTR)(SUBSTRING(ReceivedDateTime,1,4) + "-" +
SUBSTRING(ReceivedDateTime,5,2) + "-" + SUBSTRING(ReceivedDateTime,7,2))
Upon running my package, it's throwing the error Unable to perform type cast.
If the incoming values from the CSV file are always formatted like 04/20/2012 (MM/DD/YYYY), then you don't have to perform any type casting. You just have to configure the flat file connection manager to treat the values in the file as a date data type.
Let's assume that your CSV file has a single column containing dates in MM/DD/YYYY format.
In the SSIS package, create a flat file connection manager to read the CSV file. I stored the CSV at the path C:\temp\Source.csv.
On the Advanced page, you will notice that the flat file connection manager named the first column Column 0 and that its DataType property is set to string [DT_STR]. However, the values in the file are actually dates. Either you can configure the data types manually or click the Suggest Types... button.
In the Suggest Column Types dialog, leave the default values and click OK. This will read the first 100 rows of the file and determine each column's type according to the data available in the file.
Once you click OK on the Suggest Column Types dialog, you will notice that the data type on the flat file connection manager for Column 0 has been changed to date [DT_DATE]. Click OK to finish configuring the flat file connection manager. You can also rename the column according to your requirements (say InvoiceDate or OrderDate).
Now that you have the flat file connection manager configured, you can use it in a Flat File Source within a data flow task to read the data and populate your database, so there is no need to manipulate the values using a derived column transformation.
However, if your incoming file values are strings like 120420 (YYMMDD), they cannot be configured as a date data type. In that scenario, you need to use a Derived Column transformation as suggested in this answer.
Hope that helps.
If your incoming value is a string like 20140106 (YYYYMMDD), then typecasting the expression with (DT_DBDATE) will throw an error. I encountered this issue, so I took out the typecast,
and the mapping looked like this:
(LEN(TRIM(SUBSTRING([Screening Date],1,8))) > 0 ? (SUBSTRING([Screening Date],1,4) + "/" + SUBSTRING([Screening Date],5,2) + "/" + SUBSTRING([Screening Date],7,2)) : NULL(DT_WSTR,5))
Later I added a Data Conversion transformation and changed the column to database timestamp, and it worked for me. You can also change it to the date (DT_DATE) type.
Thanks!
Try the following (I was successful in doing this):
(LEN(TRIM(SUBSTRING(ReceivedDateTime,1,8))) > 0 ? (DT_DBDATE)(SUBSTRING(ReceivedDateTime,1,4) + "-" + SUBSTRING(ReceivedDateTime,5,2) + "-" + SUBSTRING(ReceivedDateTime,7,2)) : (DT_DBDATE)NULL(DT_WSTR,5))
Just typecast the expression with (DT_DBDATE). This also handles the case where there is no date in that position, i.e. if positions 1 to 8 of the received string are empty, it will insert NULL.

Date conversion in CSV to MySQL DB format

I am importing a CSV file into a MySQL database using the LOAD DATA INFILE syntax.
The date in the CSV is in 2009/10/31 7:8:57.0 format. Is there any way to convert this while loading to something like 2009-10-31 07:08:57?
Execute STR_TO_DATE(datefromcsv, '%Y/%m/%d %H:%i:%s.%f') when you are doing the INSERT into the db (TO_CHAR and TO_DATE are Oracle functions; MySQL's equivalents are STR_TO_DATE and DATE_FORMAT). A DATETIME column then stores the value in the 2009-10-31 07:08:57 form automatically.
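A minimal sketch of doing the same conversion directly in the LOAD DATA INFILE statement, reading the raw value into a user variable first (the table, column, and file names here are assumptions):

LOAD DATA INFILE '/tmp/import.csv'
INTO TABLE events
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(col1, col2, @raw_date)
SET event_date = STR_TO_DATE(@raw_date, '%Y/%m/%d %H:%i:%s.%f');

STR_TO_DATE accepts the single-digit hour and minute, so 2009/10/31 7:8:57.0 comes out as 2009-10-31 07:08:57.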
(Usual caveats apply here.) A regular expression might be what you need: substitute / with -, zero-pad the single-digit fields, and remove the trailing .0.
I am surprised, though, that MySQL can't handle dates like the one you provided. See, for example, the MySQL manual. Have you tried feeding it to MySQL and seeing what happens?