Database connection using dplyr with date field in database - mysql

Is there some magic to using dplyr to access a database when it has a date field?
A dplyr tbl_df converts MySQL datetime fields to chr. That wouldn't be so bad if I could as.Date() them back. But if I do that before collect()'ing the table, I get an error that as.Date() is an invalid SQL function. I can sort of work around this by calling collect(), but then I'm copying all of the data out of the database, which is what I was trying to avoid. In addition, once I've collect()'ed, it's a data.frame, so if I want to join it with another tbl I have to set copy=TRUE and copy that one into memory as well.
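One way to avoid the round trip is to push the date conversion into the SQL that defines the tbl, so MySQL performs the cast and the table stays lazy. A minimal sketch, assuming a src_mysql() connection; the database, table, and column names here are invented:

library(dplyr)

# Hypothetical connection; substitute real credentials
my_db <- src_mysql(dbname = "mydb", user = "me", password = "secret")

# DATE() is evaluated by MySQL, so the tbl stays lazy and no rows move yet
events <- tbl(my_db, sql("SELECT id, DATE(created_at) AS created_date FROM events"))

# Still a remote tbl: it can be joined to other tbls from the same source
# without copy = TRUE; call collect() only on the final, small result
# inner_join(events, tbl(my_db, "other_table"), by = "id") %>% collect()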

Related

Is it possible to use sqlalchemy to reflect table and change data type of column from string to datetime?

I have a web application where users can upload CSVs. I use Python Pandas to actually do the upload. I have to give the users the ability to change the types of the table's columns, such as from string to datetime. Is there a way to do this in SQLAlchemy? I'm working with reflected tables, so I have Table objects with all their columns, but I have a feeling that SQLAlchemy does not have this capability and that I will have to execute raw SQL to do it.
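That suspicion is right: a reflected Table describes the schema, but changing a column's type is DDL, and mutating the Table object does not emit it (Alembic's op.alter_column is the usual migration route). A minimal sketch issuing the DDL directly; the connection string, table, and column names are placeholders, and the MODIFY spelling shown is MySQL-specific:

from sqlalchemy import create_engine, text

# Hypothetical connection string
engine = create_engine("mysql+pymysql://user:password@localhost/mydb")

with engine.begin() as conn:
    # MySQL spelling of the DDL; other dialects differ (e.g. ALTER COLUMN ... TYPE)
    conn.execute(text("ALTER TABLE uploads MODIFY COLUMN uploaded_at DATETIME"))

After the ALTER, re-reflect the table so the Python-side metadata matches the new column type.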

How to migrate CLOB column to (json) BLOB in DB2 without truncating data?

I have a DB2 11 database with a large table that has JSON data stored in a CLOB column. Given that I'd like to perform queries on it using the JSON_VAL function, I always need to use JSON2BSON to convert it first, which I assume is a significant overhead. I would like to move the data into another table that has exactly the same structure, except for the CLOB column, which I'd like to replace with a BLOB one storing the already-converted BSON, hoping that this will speed up my queries.
My approach to this was writing an INSERT ... SELECT statement:
insert into newtable (ID, BLOBDATA) select ID, SYSTOOLS.JSON2BSON(CLOBDATA) from oldtable;
After doing this I realized that long JSON objects got truncated. I googled this and learned that SELECT can truncate large objects.
I am reaching out here to see if there is any simple way for me to do this exercise without having to write a program to read out and write back all the data. (I got burnt by similar truncation when I used the DB2 CSV export features.)
Thanks.
Starting with Db2 11.1.4.4 there are new JSON functions based on the ISO technical paper. I would advise using them; they are the strategic functionality going forward.
You could use JSON_VALUE to perform the equivalent of what you planned to do with JSON_VAL.
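For illustration, a sketch of such a query against the original table; the JSON path and result name are invented and assume the documents have that shape:

-- JSON_VALUE reads the CLOB directly, so no JSON2BSON conversion is needed
SELECT ID,
       JSON_VALUE(CLOBDATA, '$.customer.name') AS customer_name
FROM oldtable
WHERE JSON_EXISTS(CLOBDATA, '$.customer');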

Finding specific value in MySQL database

This may sound strange, but is it possible to construct a SQL statement that searches all the tables in a database for a specific value? I'm testing another person's Drupal (v7) code, and that code uses the taxonomy_term_save function to import data in CSV format. I'd like to find the table where these data are stored. I don't know the field name either. Is it possible? I use MySQL.
SELECT * FROM tableNameHere WHERE columnNameHere = 'keyYouAreSearchingForHere';
That is for MySQL.
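That only probes one table and column, though. To sweep every table when the column is unknown, one common trick is to let information_schema generate a probe query per column; a sketch, with the schema name and search value as placeholders (run the statements it prints):

SELECT CONCAT('SELECT * FROM `', table_name,
              '` WHERE `', column_name, '` = ''yourValue'';') AS probe
FROM information_schema.columns
WHERE table_schema = 'yourDatabaseName';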

Windows Collation in Oracle

I am new to SSIS, and the package that I am building involves the use of a Merge Join. The join is performed between a RAW file and an Oracle table. The NLS_SORT and NLS_COMP options for the Oracle database are "BINARY". The RAW file, by default, picks up Windows collation. What is the equivalent of Windows collation in Oracle? Will the above settings work, or is some workaround required? I am not getting the desired results from the Merge Join. I even used SELECT ... ORDER BY NLSSORT(EmployeeName, 'BINARY_CI'), but am still getting wrong results. Does anyone have an idea?
Use a Data Conversion element on both sources, before the Sort element. Choose the same data type for both join columns; then you can join on the new columns with the matching data types.
There are YouTube examples of the Data Conversion transformation if you want a walkthrough.

Ruby convert Mysql timestamp to Mysql datetime

I am trying to import data from one DB to another. The source DB has a TIMESTAMP column (MySQL); the destination has DATETIME (MySQL). I am trying something like this:
start_at = DateTime.at(row['QA_CONF_START_STAMP']) #start_at
But it is not working.
I'm not sure if conversion is expressly required in this case, as the two values should be equivalent.
Since you're not retrieving the original data using a model, it's coming through as a raw string. The easiest way to interpret that is:
DateTime.parse(row['QA_CONF_START_STAMP'])
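A short usage sketch (only the column name comes from the question; the sample value and output formatting are assumptions):

require 'date'

# The value comes through as a raw string, e.g. "2014-05-01 09:30:00"
start_at = DateTime.parse(row['QA_CONF_START_STAMP'])

# MySQL DATETIME accepts the 'YYYY-MM-DD HH:MM:SS' form
start_at.strftime('%Y-%m-%d %H:%M:%S')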