How to insert a MATLAB data table into a MySQL database

I have a MATLAB table T with more than 40,000 rows that I want to insert into a MySQL database. The table T has columns of different data types (char, date, integer). I tried the following:
fastinsert(conn,'tablename',colnames2,T)
I also tried insert and datainsert. I converted the table to a cell array, but that still didn't work. Then I tried to convert the cell array into a matrix, but I couldn't: it says all the contents must be of the same data type to be converted into a matrix.
Is there any way I can insert the table I have in MATLAB into a MySQL database?

Instead of converting your data from a cell array to a matrix, have you tried converting it to a table using cell2table() and then using insert()? There is an example in the MATLAB documentation that can be found here.
The linked example stores multiple data types in a cell array and then converts it to a table (instead of a matrix), which can then be written to the database with the mixed data types intact.

Related

MySQL - Invalid GIS data provided to function st_polygonfromtext

I have a table in MySQL with geometry data in one of the columns. The data type is text, and I need to save it as polygon geometry.
I have tried a few solutions, but keep running into an "Invalid GIS data provided to function st_polygonfromtext" error.
Here's some data to work with and an example:
https://dbfiddle.uk/?rdbms=mysql_8.0&fiddle=78ac63e16ccb5b1e4012c21809cba5ff
The table has 25k rows, and there are likely some bad geometries in there. When I attempt the update on a subset of rows it seems to work, as it did in the fiddle example; it fails when I attempt to update all 25k rows.
Someone suggested wrapping the statements in TRY/CATCH blocks (Detecting faulty geometry WKT and returning the faulty record), but I am not too familiar with using those in MySQL or in stored procedures.
I need a spatial index on the table to be able to use spatial functions and filter queries by location.
Plan A: Create a new table and try to convert as you INSERT IGNORE INTO it from your existing table. I don't know whether IGNORE applies to conversion failures. Also, you would end up with only the "good" values; what do you want to do about the "bad" ones?
Plan B: Write a loop in application code -- read one row, convert the varchar value, check for errors.
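If you go the Plan B route, a minimal sketch in Python might look like the following. It uses the mysql-connector-python driver, and the table and column names (geoms, id, wkt, geom) plus the SRID 4326 are placeholders for whatever your real schema uses -- the point is only that each row's conversion runs as its own statement, so one bad polygon cannot abort the whole batch.

import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(host="localhost", user="user",
                               password="secret", database="mydb")
read_cur = conn.cursor()
write_cur = conn.cursor()

# Read every row's id and WKT text up front.
read_cur.execute("SELECT id, wkt FROM geoms")
rows = read_cur.fetchall()

bad_ids = []
for row_id, wkt in rows:
    try:
        # Let MySQL itself validate the WKT, one row at a time.
        write_cur.execute(
            "UPDATE geoms SET geom = ST_PolygonFromText(%s, 4326) WHERE id = %s",
            (wkt, row_id))
        conn.commit()
    except mysql.connector.Error:
        conn.rollback()
        bad_ids.append(row_id)  # keep the offending ids for later inspection

print("rows with invalid geometries:", bad_ids)
conn.close()

The ids collected in bad_ids are exactly the rows Plan A would silently drop, so you can decide separately how to repair or discard them.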

Is it possible to use SQLAlchemy to reflect a table and change the data type of a column from string to datetime?

I have a web application where users can upload CSVs; I use Python pandas to do the actual upload. I need to give users the ability to change the type of a database table's columns, for example from string to datetime. Is there a way to do this in SQLAlchemy? I'm working with reflected tables, so I have Table objects with all their columns, but I suspect that SQLAlchemy does not have this capability and that I will have to execute raw SQL to do it.
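For what it's worth, reflection in SQLAlchemy only describes the schema that already exists; changing a column's type is DDL that you normally issue yourself (or via a migration tool such as Alembic). A minimal sketch of the raw-SQL route, assuming MySQL with the PyMySQL driver and a hypothetical table uploads with a string column created_at_text:

from sqlalchemy import create_engine, MetaData, text

engine = create_engine("mysql+pymysql://user:secret@localhost/mydb")

metadata = MetaData()
metadata.reflect(bind=engine)  # Table objects for the current schema

with engine.begin() as conn:
    # Reflection won't do this for you; the type change is plain DDL.
    conn.execute(text("ALTER TABLE uploads MODIFY COLUMN created_at_text DATETIME"))

# Re-reflect so the Table object picks up the new column type.
metadata.clear()
metadata.reflect(bind=engine)

After the ALTER, re-reflecting gives you Table objects that already carry the new DATETIME type.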

How to migrate CLOB column to (json) BLOB in DB2 without truncating data?

I have a DB2 11 database with a large table that has JSON data stored in a CLOB column. Since I'd like to query it with the JSON_VAL function, I always need to convert it with JSON2BSON first, which I assume adds significant overhead. I would like to move the data into another table with exactly the same structure, except that the CLOB column is replaced by a BLOB one that stores the JSON directly as BSON, hoping that this will speed up my queries.
My approach to this was to write:
insert into newtable (ID, BLOBDATA) select ID, SYSTOOLS.JSON2BSON(CLOBDATA) from oldtable;
After doing this I realized that long JSON objects got truncated. I googled this and learned that SELECTs tend to truncate large objects.
I am reaching out here to see whether there is any simple way to do this exercise without having to write a program that reads out and writes back all the data. (I got burnt by similar truncation when I used the DB2 CSV export features.)
Thanks.
Starting with Db2 11.1.4.4 there are new JSON functions based on the ISO technical paper. I would advise using them; they are the strategic functionality going forward.
You could use JSON_VALUE to do the equivalent of what you planned to do with JSON_VAL.
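For example, sticking with the question's OLDTABLE/CLOBDATA names, JSON_VALUE can read the JSON straight out of the existing CLOB column, with no JSON2BSON step. Here is a rough sketch that runs such a query from Python with the ibm_db driver; the connection settings and the '$.customer.name' path are placeholders, not anything from the original post.

import ibm_db  # pip install ibm_db

conn = ibm_db.connect(
    "DATABASE=mydb;HOSTNAME=localhost;PORT=50000;"
    "PROTOCOL=TCPIP;UID=db2inst1;PWD=secret;", "", "")

# JSON_VALUE (Db2 11.1.4.4+) accepts the JSON text directly, so the
# CLOB column can be queried as-is.
sql = ("SELECT ID, JSON_VALUE(CLOBDATA, '$.customer.name') AS CUSTNAME "
       "FROM OLDTABLE")
stmt = ibm_db.exec_immediate(conn, sql)

row = ibm_db.fetch_assoc(stmt)
while row:
    print(row["ID"], row["CUSTNAME"])
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)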

Update column from Base64 to string (natively in SQL)

I've been trying to figure this out for two days now and would really appreciate some help. I've imported data from a CSV where one field contained HTML data encoded in Base64.
The idea is to loop over every row and run FROM_BASE64 on it.
How do I structure a query that:
Loops over all rows
Calls FROM_BASE64 for each row
Runs UPDATE (or similar functionality) on that same row and column
Context: I'm running MariaDB (MySQL equivalent).
Thanks for any help!
Typically, Base64 is used for binary data. You probably shouldn't store the decoded data in the same column as the Base64-encoded string; if necessary, ALTER TABLE to add a new VARBINARY or BLOB column to hold the binary data.
ALTER TABLE MyTable ADD COLUMN BinaryField BLOB;
You can then fill that column with a single UPDATE statement; it processes every row, so no explicit loop is needed:
UPDATE MyTable SET BinaryField = FROM_BASE64(EncodedField);

Infer table structure from a file in MySQL

Another posting said there is a way to infer the table columns from a data file using phpMyAdmin. I haven't found documentation on this; can you point me to it? Does it use only the header row, or does it also sample the data to infer the column data types?
I'm trying to create several tables in MySQL from data files with roughly 100 columns each, so I don't want to write the SQL DDL to create the tables by hand.
Thanks!