Inserting into user-defined data types in SQL Server - sql-server-2008

Hi, I am using SQL Server 8.0 for my database. I don't know how to insert user-defined data.
This is my table:

column name          data type                  length   allow_nulls
study_type_id        int                        4
study_type_name      UD_NAME (varchar)          150
study_type_abbrev    UD_NAME_SHORT (varchar)    50
order                int
UD_NAME and UD_NAME_SHORT are user-defined data types created in SQL Enterprise Manager; the base type is varchar.
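(For context: alias types like these are typically created with something along the lines of the following sketch; the lengths are assumptions taken from the table above.)

EXEC sp_addtype 'UD_NAME', 'varchar(150)'        -- SQL Server 2000 / Enterprise Manager era
EXEC sp_addtype 'UD_NAME_SHORT', 'varchar(50)'
CREATE TYPE UD_NAME FROM varchar(150);           -- SQL Server 2005+ equivalent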
When I use the INSERT command below,
INSERT into study_type VALUES (15, 'test', 'TT',100)
it gives me an "implicit conversion" error, and I can no longer view the ASP web page linked to that table.
And when I try:
INSERT into study_type (study_type_id, study_type_name, study_type_abbrev, order)
VALUES (15,CAST('test' as UD_NAME),CAST('TT' as UD_NAME_SHORT),100)
it fails with "Type UD_NAME is not a defined system type."

You cannot use user-defined (alias) data types in the CAST and CONVERT functions; they accept system types only. Since the base type of both aliases is varchar, just pass plain varchar literals and let SQL Server convert them implicitly. Note also that order is a reserved word, so the column name must be bracketed as [order] in the column list.
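A minimal sketch of the corrected insert, passing plain varchar literals and bracketing the reserved column name:

INSERT INTO study_type (study_type_id, study_type_name, study_type_abbrev, [order])
VALUES (15, 'test', 'TT', 100)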


MySQL ERROR 1265: Data truncated for column

I can't figure out why I'm getting this message. I'm using MySQL Workbench and am editing the values in an ENUM field that connects to a dropdown choice in my app.
Everything seems to be fine. I've searched on this error and everything I find refers to datatype mismatches, but in this instance that shouldn't be possible, since the ENUM is being fed an array of string values.
Here's the SQL
Executing:
ALTER TABLE `mydbase`.`average_monthly_expenses`
CHANGE COLUMN `expense_category` `expense_category` ENUM('Home', 'Healthcare', 'Child care', 'Groceries and supplies', 'Eating out', 'Utilities', 'Telecomms', 'Laundry and cleaning', 'Clothes', 'Education', 'Entertainment gifts vacation', 'Auto and transportation', 'Insurance', 'Savings and investments', 'Charitable contributions', 'Itemized monthly payments') NULL DEFAULT NULL ;
Operation failed: There was an error while applying the SQL script to the database.
ERROR 1265: Data truncated for column 'expense_category' at row 1
SQL Statement:
ALTER TABLE `mydbase`.`average_monthly_expenses`
CHANGE COLUMN `expense_category` `expense_category` ENUM('Home', 'Healthcare', 'Child care', 'Groceries and supplies', 'Eating out', 'Utilities', 'Telecomms', 'Laundry and cleaning', 'Clothes', 'Education', 'Entertainment gifts vacation', 'Auto and transportation', 'Insurance', 'Savings and investments', 'Charitable contributions', 'Itemized monthly payments') NULL DEFAULT NULL
Any suggestions are very welcome
The query itself is correct.
Execute
SELECT DISTINCT expense_category, HEX(expense_category)
FROM mydbase.average_monthly_expenses
and check the output for the values which are not listed in the column definition.
There may be typos, leading/trailing spaces or other non-printing characters, double spaces in the middle of a value, or collation problems.
UPDATE
The current field definition, which I'm trying to change to the above, is ENUM('Home', 'Living', 'Telecommunications', 'Transportation', 'Other'). When I run your suggested SQL, I just get Housing and Other listed.
Those values are absent from the new column definition, so the server cannot convert them and truncates the data.
Recommended sequence: (1) alter the column definition, adding the new values to the ENUM list but not removing the old ones; (2) update the table, replacing the old values with the new ones; (3) alter the column definition again, removing the old values from the list. A sketch of this follows.
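A sketch of that sequence, using a deliberately shortened value list for readability (here the hypothetical old value 'Living' is replaced by the new value 'Home'):

-- 1. Add the new values while keeping the old ones:
ALTER TABLE `mydbase`.`average_monthly_expenses`
CHANGE COLUMN `expense_category` `expense_category` ENUM('Living', 'Home', 'Healthcare') NULL DEFAULT NULL;
-- 2. Rewrite the rows that still hold an old value:
UPDATE `mydbase`.`average_monthly_expenses`
SET `expense_category` = 'Home'
WHERE `expense_category` = 'Living';
-- 3. Drop the old values from the list:
ALTER TABLE `mydbase`.`average_monthly_expenses`
CHANGE COLUMN `expense_category` `expense_category` ENUM('Home', 'Healthcare') NULL DEFAULT NULL;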
@Akina
So I figured out why I was blocked from editing the values. It would not let me make any edits that changed either the value "Housing" or "Other". I did as you suggested, adding my new values to the existing ones; no problem, that worked fine. However, I couldn't delete "Housing" or "Other", though the other prior values deleted fine. For the moment, I kept both, using "Housing" instead of "Home" and leaving "Other" at the end.
But I wanted to know why those two values were protected, and then it dawned on me: there were existing records using those values. I manually changed all instances using "Other" to "Telecomms", and then I could remove "Other" from the ENUM values. All good now.

S3 to MySQL AWS Data Pipeline Insert table error

It's my first time asking a question on here, so please bear with me.
I am trying to create a data pipeline to load a CSV file from an S3 bucket into a MySQL database table (Production1) using the template provided by AWS, but it fails when executing RdsMySqlTableCreateActivity.
The SQL statement I'm using (all column names match the CSV file) in the myRDSTableInsertSql parameter:
INSERT INTO `Production1` (`API`, `Normalized Month`, `DATE`, `Monthly Liquid`, `Cum Oil`, `BOPD`, `Monthly Gas Mcf/Month`, `Cum Gas`, `MCFPD`) VALUES(?,?,?,?,?,?,?,?,?);
The RdsMySqlTableCreateActivity error:
errorId
ActivityFailed:SQLException
errorMessage
No value specified for parameter 1
errorStackTrace
amazonaws.datapipeline.taskrunner.TaskExecutionException:
private.com.amazonaws.services.datapipeline.redshift.QueryStatementException: Exception No value specified for
parameter 1 while executing INSERT INTO `Production1` (`API`, `Normalized Month`, `DATE`, `Monthly Liquid`, `Cum Oil`, `BOPD`, `Monthly Gas Mcf/Month`, `Cum Gas`, `MCFPD`) VALUES(?,?,?,?,?,?,?,?,?);...
I ran the insert command in MySQL Workbench, replacing (?,?,?,?,?,?,?,?,?) with (1,2,3,4,5,6,7,8,9), and it worked. The CSV file I'm using has only two rows: the column names, and the values 1-9 for each column respectively. I'm really not sure what "No value specified for parameter 1" means; any help/guidance would be appreciated!
For anyone who runs into the same issue using the "Load S3 data into RDS MySQL table" template:
My values for each parameter were the following.
myRDSTableInsertSql:
INSERT INTO tableName(`col_name1`, `col_name2`, `col_name3`, `col_name4`, `col_name5`, `col_name6`, `col_name7`, `col_name8`, `col_name9`) VALUES(?,?,?,?,?,?,?,?,?);
myRDSTableName: tableName
myRDSCreateTableSql:
CREATE TABLE tableName(`col_name1` type, `col_name2` type, `col_name3` type, `col_name4` type, `col_name5` type, `col_name6` type, `col_name7` type, `col_name8` type, `col_name9` type);
The main issue was with the actual CSV file format: you have to make sure there is no header row, and that the types match the table definition exactly. Also make sure that your separators are "," and that values are not quoted in your CSV file.
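For example, with the nine-column table above, the file body should be nothing but unquoted comma-separated values, one row per line and no header (a hypothetical two-row file):

1,2,3,4,5,6,7,8,9
10,11,12,13,14,15,16,17,18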
This template is a good starting point, but for more detailed/complex CSV files, building your own data pipeline is a must!

Access Report - Calculating Hours between 2 times

I have a SQL time-off database with an Access front end. I currently have BeginTimeOff and EndTimeOff fields on the report. In my SQL database, these are time(7) fields. I want a new field to show the time difference. I've tried setting the Control Source equal to:
=DateDiff("n",CDate([BeginTimeOff]),CDate([EndTimeOff]))
AND
=DateDiff("n",[BeginTimeOff],[EndTimeOff])
AND
= [EndTimeOff] - [BeginTimeOff]
I can't get anything to work. I can subtract dates fine, just not times. Help!
Access does not have a time-only field type (Access Date/Time fields have both a date and time component), and any unknown field types in an ODBC linked table are usually mapped to Text. So if you have a SQL Server table with time(7) columns ...
CREATE TABLE [dbo].[TimeTest](
[Id] [int] NOT NULL,
[BeginTimeOff] [time](7) NULL,
[EndTimeOff] [time](7) NULL
...
then the corresponding ODBC linked table in Access will have Text(255) columns instead.
If you want to use the columns in the linked table directly, you will have to convert the values into a form that Access will accept before you can use functions like DateDiff() to do calculations with them. Specifically, Access Date/Time values do not support fractional seconds, so you will have to remove them. That is,
CDate("07:59:00.0000000")
will fail with a "Type mismatch" error (run-time error 13), while
CDate("07:59:00")
works fine. You can use string manipulation functions like InStr(), Left(), Mid(), etc. to get rid of the fractional part of the string.
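Putting that together with the original expressions, a control source along these lines should work (a sketch, assuming the linked text values always arrive as hh:mm:ss.nnnnnnn, so that the first eight characters are the whole-second time):

=DateDiff("n", CDate(Left([BeginTimeOff], 8)), CDate(Left([EndTimeOff], 8)))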
Another approach would be to create a SQL Server view that converts the time(7) columns to DATETIME:
CREATE VIEW [dbo].[TimeView]
AS
SELECT
Id,
DATEADD(day, -2, CONVERT(DATETIME, BeginTimeOff)) AS BeginTimeOff,
DATEADD(day, -2, CONVERT(DATETIME, EndTimeOff)) AS EndTimeOff
FROM dbo.TimeTest
and then if you link to that view, the columns will appear as Date/Time values in Access. (The DATEADD(day, -2, ...) shifts the implicit base date from 1900-01-01, SQL Server's day zero, to 1899-12-30, which is day zero for Access/VBA Date/Time values.)

How to get database SQL values from an ActiveRecord object?

My original problem is that I need to insert a lot of records into the DB, so to speed things up I want to use mysqlimport, which takes a file of row values and loads them into a specified table. So suppose I have a model Book. I can't simply use book.attributes.values, as one of the fields is a hash that is serialized to the DB (using serialize), so I need to know the format in which this hash will be stored in the DB. The same goes for time and date fields. Any help?
How about using SQL insert statements instead of serialization?
book = Book.new(title: 'Much Ado About Nothing', author: 'William Shakespeare')
sql = book.class.arel_table.create_insert
          .tap { |im| im.insert(book.send(
            :arel_attributes_with_values_for_create,
            book.attribute_names)) }
          .to_sql
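Note that arel_attributes_with_values_for_create is a private Rails internal (hence the send), so this is version-dependent. Assuming a plain books table with title and author columns, the generated string comes out roughly as:

INSERT INTO `books` (`title`, `author`) VALUES ('Much Ado About Nothing', 'William Shakespeare')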

MySQL insert to bit(1) column via ODBC 5.2

I've searched and can't seem to find quite what I'm looking for.
I'm running a PL/SQL script in Oracle, attempting to insert records into a table in MySQL over a database link using the MySQL ODBC 5.2 Unicode Driver.
The link works fine, I can do complex queries in Oracle using it, and do various inserts and updates on records there.
Where it fails is in trying to insert a record into a MySQL table that has a column of type bit(1).
It is basically a cursor for loop, with the insert statement looking something like:
INSERT INTO "app_user"#mobileapi (USERNAME, VERSION, ACCOUNT_EXPIRED, ACCOUNT_LOCKED, PASSWD, PASSWORD_EXPIRED)
VALUES (CU_rec.USERNAME, CU_rec.VERSION, CU_rec.ACCOUNT_EXPIRED, CU_rec.ACCOUNT_LOCKED, CU_rec.PASSWD, CU_rec.PASSWORD_EXPIRED)
Some of the target columns, like ACCOUNT_EXPIRED, ACCOUNT_LOCKED, etc. are the bit(1) columns in MySQL. Given that I can convert the data types in the cursor CU_rec to pretty much anything I want in Oracle, how can I get them inserted into the target? I've tried everything I can think of, and I just keep getting:
Error report:
ORA-28500: connection from ORACLE to a non-Oracle system returned this message:
[MySQL][ODBC 5.2(w) Driver][mysqld-5.6.10]Data too long for column 'ACCOUNT_EXPIRED' at row 1 {HY000,NativeErr = 1406}
ORA-02063: preceding 2 lines from MOBILEAPI
ORA-06512: at line 44
28500. 00000 - "connection from ORACLE to a non-Oracle system returned this message:"
*Cause: The cause is explained in the forwarded message.
*Action: See the non-Oracle system's documentation of the forwarded
message.
Any help at all would be greatly appreciated.
Your problem is Oracle's default datatype conversion over ODBC; according to their own documentation they convert SQL_BINARY to a raw. Although not directly related, Oracle's comparison of MySQL and Oracle within SQL Developer also alludes to the fact that the automatic conversion from a MySQL bit is to an Oracle raw.
Extremely confusingly, MySQL's documentation indicates that a bit is converted to a SQL_BIT or a SQL_CHAR, which implies that it may work in the other direction.¹
According to Microsoft's ODBC docs you should, theoretically, be able to use the CONVERT() function to transform this into a character, which should, theoretically, be translatable by MySQL.
insert into some_table@some_db (bit_col)
values( {fn convert(some_col, SQL_CHAR)} );
Failing that, there are a couple of other options, depending on what you're attempting to insert into the MySQL database from Oracle and what the datatype is in Oracle. You could use the Oracle CAST() function to convert between datatypes; for instance, the following would convert an integer to a binary double:
select cast(1 as binary_double) from dual
Unfortunately, you can't cast an integer to a raw, only a character or a rowid, so in order to convert to a raw you'd have to do the following:
select cast(to_char(1) as raw(1)) from dual
I've no idea whether MySQL will accept this but with some testing you should be able to work it out.
¹ For clarity, I've never tried it in either direction.
Hah! I found a solution. Dropping it here in case it helps someone else. It's not pretty, but it works.
I used the old EXECUTE IMMEDIATE trick.
Basically, I created a variable sql_stmt varchar2(4000) and wrote code like:
sql_stmt := 'insert into "app_user"@mobileapi (USERNAME, VERSION, ACCOUNT_EXPIRED, ACCOUNT_LOCKED, CIPHER_PASSPHRASE, ENABLED, PASSWD, PASSWORD_EXPIRED)
values ('''||CU_rec.USERNAME||''', '||CU_rec.VERSION||', '||CU_rec.ACCOUNT_EXPIRED||', '||CU_rec.ACCOUNT_LOCKED||', '''||CU_rec.CIPHER_PASSPHRASE||''', '||
CU_rec.ENABLED||', '''||CU_rec.PASSWD||''', '||CU_rec.PASSWORD_EXPIRED||')';
EXECUTE IMMEDIATE sql_stmt;
Something like that anyway (the quotes might not line up, as I hacked this a bit from the actual code). Looking at the contents of sql_stmt, I get:
insert into "app_user"#mobileapi (USERNAME, VERSION, ACCOUNT_EXPIRED, ACCOUNT_LOCKED, CIPHER_PASSPHRASE, ENABLED, PASSWD,PASSWORD_EXPIRED)
values ('user#email.com', 0, 0, 0, 'asdfastrwaebawavgansdhnsgjsjsh', 1, 'awercbcakwjerhcawuerawieubkahbewvkruh', 0)
The EXECUTE IMMEDIATE completes, and checking the target table, the values are there.
Possibly a crappy solution, but better than nothing.