I am getting the following error message.
Cannot resolve the collation conflict between "Latin1_General_CI_AI" and "SQL_Latin1_General_CP1_CI_AS" in the equal to operation.
I only get it when I place this code below in my WHERE clause.
WHERE Region IN (SELECT Token FROM dbo.getParmsFromString(@Region))
@Region contains all the values from my multi-select parameter in SSRS.
Below is the code for the function that is used.
CREATE FUNCTION [dbo].[getParmsFromString]
(@String VARCHAR(MAX))
RETURNS @Parms TABLE
(
    Token VARCHAR(MAX)
)
AS
BEGIN
    IF CHARINDEX(',', @String) != 0
    BEGIN
        ;WITH cte0(Token, List) AS
        (
            SELECT SUBSTRING(@String, 1, CHARINDEX(',', @String, 1) - 1)
                  ,SUBSTRING(@String, CHARINDEX(',', @String, 1) + 1, LEN(@String)) + ','
            UNION ALL
            SELECT SUBSTRING(List, 1, ISNULL(CHARINDEX(',', List, 1) - 1, 1))
                  ,SUBSTRING(List, CHARINDEX(',', List, 1) + 1, LEN(List))
            FROM cte0
            WHERE LEN(cte0.List) > 0
        )
        INSERT INTO @Parms (Token)
        SELECT Token
        FROM cte0
        OPTION (MAXRECURSION 0)
        RETURN;
    END
    ELSE
        INSERT INTO @Parms
        SELECT @String
    RETURN;
END
Try changing
RETURNS @Parms TABLE
(
    Token VARCHAR(MAX)
)
to
RETURNS @Parms TABLE
(
    Token VARCHAR(MAX) COLLATE DATABASE_DEFAULT
)
and
WHERE Region IN (SELECT Token FROM dbo.getParmsFromString(@Region))
to
WHERE Region COLLATE DATABASE_DEFAULT IN (SELECT Token FROM dbo.getParmsFromString(@Region))
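If you want to confirm which side actually carries the unexpected collation before changing anything, you can check the column's collation in the catalog views (the table name below is a placeholder, since the question doesn't show it):
SELECT c.name, c.collation_name
FROM sys.columns c
WHERE c.object_id = OBJECT_ID('dbo.YourRegionTable')  -- placeholder table name
  AND c.name = 'Region';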
Generally this type of error occurs when you compare data from different regional settings, or when you compare data stored under one encoding/collation with data stored under another.
The most probable reason is that your tempdb is using the collation "SQL_Latin1_General_CP1_CI_AS" while the database is using "Latin1_General_CI_AI". As a result, temp objects are created under the collation "SQL_Latin1_General_CP1_CI_AS" and then fail to compare with objects of the database, which use the collation "Latin1_General_CI_AI".
The easiest fix, and also the one I would recommend, is to run the database on a server that was installed with the same collation as the database.
FYI: SQL collations ("SQL_Latin1_General_CP1_CI_AS") are present in SQL Server for backward compatibility. When dealing with international data, or with databases that mix Unicode and non-Unicode data, it is recommended to use Windows collations (e.g. "Latin1_General_CI_AS").
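To see which collations are actually in play on your system, you can compare the server, current database, and tempdb collations with a quick diagnostic query (nothing here is specific to one setup):
SELECT SERVERPROPERTY('Collation')                AS ServerCollation,
       DATABASEPROPERTYEX(DB_NAME(), 'Collation') AS CurrentDbCollation,
       DATABASEPROPERTYEX('tempdb', 'Collation')  AS TempdbCollation;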
You can change your database collation by:
USE master;
ALTER DATABASE [YourDatabase]
    COLLATE Latin1_General_CI_AS;

SELECT name, collation_name
FROM sys.databases;
If needed, you can also change the collation of the master database (i.e. rebuild the system databases); for that, go through these links:
http://msdn.microsoft.com/en-us/library/dd207003(v=sql.100).aspx
http://sqlbuzz.wordpress.com/2011/08/20/how-to-rebuild-master-database-aka-rebuilding-sql-server-2008r2/
Make sure you back up all your databases before doing this.
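For reference, the rebuild those links describe boils down to running SQL Server setup from the command line along these lines (the instance name and admin account are placeholders; check the linked documentation for the exact options for your version):
Setup.exe /QUIET /ACTION=REBUILDDATABASE /INSTANCENAME=MSSQLSERVER
    /SQLSYSADMINACCOUNTS="DOMAIN\YourAdmin" /SQLCOLLATION=Latin1_General_CI_AS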
I'm trying to import tables from a MySQL database via a linked server. The MySQL database has all the date fields set to a default of '0000-00-00'. I can't even list the contents via a stored procedure, as I get the following error.
An unexpected NULL value was returned for column "[MYSQL_progmgt]...[dbo.project].date_completed" from OLE DB provider "MSDASQL" for linked server "MYSQL_progmgt". This column cannot be NULL.
If I use any other table that doesn't have dates everything works fine.
I also need to import tables (that contain dates) from another MySQL database via a linked server, and there I have no problems because the default date fields are left as NULL values.
My stored procedure is
USE [TEST_COPY_PETER]
GO
/****** Object: StoredProcedure [dbo].[RAP_weekly] Script Date: 2/25/2022 10:57:03 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[RAP_weekly]
AS
Select * from [MYSQL_progmgr]...[dbo.project]
I've also tried to retrieve a single field for clarity with
Select NULLIF(date_completed,'1901-01-01') as date from [MYSQL_progmgr]...[dbo.project]
Select COALESCE(date_completed,'1901-01-01') as date from [MYSQL_progmgr]...[dbo.project]
Can't figure it out.
My SQL Server is version 2019; we are using the MySQL 5.3 ODBC driver.
The table on the MySQL side has the date fields set as
`deployment_date` date NOT NULL DEFAULT '0000-00-00'
The table on the SQL Server side is not created yet, as I can't even read the MySQL table.
Pete
Here is the modified working code. Note the quadruple single quotes.
DECLARE @OPENQUERY nvarchar(4000), @TSQL nvarchar(4000), @TSQL_SELECT nvarchar(4000), @LinkedServer nvarchar(4000)
SET @LinkedServer = 'MYSQL_ECHEANCIER'
SET @OPENQUERY = 'Select nullif( project.deployment_date, ''''0000-00-00'''') as deployment_date FROM OPENQUERY(' + @LinkedServer + ','''
SET @TSQL = 'SELECT * from dbo.project'')'
EXEC (@OPENQUERY + @TSQL)
If a column does not accept NULL values, the appropriate solution is to handle the NULL value every time. In your case, use the minimum value of DATETIME, which is 1753-01-01.
ISNULL(date_completed, '1753-01-01')
OR
ISNULL(date_completed, Cast('1753-01-01' as DateTime))
I'm running a game server and want to offload some processing, currently poorly programmed as potentially thousands of separate UPDATE statements, into a single UPDATE with a join, wrapped in a stored procedure. I'm passing a JSON array of unique identifiers as the parameter and want to use the new MariaDB 10.6 JSON_TABLE function to quickly map it to an in-memory table, which can then be joined onto other tables to get the job done. I've tested the code below with just the @json string, and the SELECT * FROM json_table call works fine; I get a single-column, two-row result set. When I do the join onto the users table, though, I'm getting a collation mixing error.
SET @json='[ "license:4b5ef761bfcc3be1e86c3bf16e33f5432417ca6b", "license:4faca2e6f3fe74881e5fb91c9a02bbcfe1d3bb83" ]';
SELECT accounts
FROM users u
INNER JOIN
(
SELECT * FROM json_table(@json, '$[*]'
COLUMNS(
identifier varchar(100) PATH '$'
)
) AS t1
) AS players
ON u.identifier = players.identifier;
Illegal mix of collations (utf8mb4_unicode_520_ci,IMPLICIT) and (utf8mb4_general_ci,IMPLICIT) for operation '='
My database has the utf8mb4 charset with a collation of utf8mb4_unicode_520_ci, and my.ini has those set up as defaults. I would have thought that the system would default to that even in expressions like this, but alas. I can't find anything about setting this within the json_table COLUMNS definition, and I have tried a few combinations as you would in CREATE TABLE DDL, but it's not making any difference.
Does anyone know how to get this to work?
You need to set the same character set and collation for identifier:
CREATE TABLE users(id int, accounts varchar(100), identifier varchar(199)) CHARACTER SET 'utf8mb4'
COLLATE 'utf8mb4_unicode_520_ci'
SET @json='[ "license:4b5ef761bfcc3be1e86c3bf16e33f5432417ca6b", "license:4faca2e6f3fe74881e5fb91c9a02bbcfe1d3bb83" ]';
SELECT accounts
FROM users u
INNER JOIN
(
SELECT * FROM json_table(@json, '$[*]'
COLUMNS(
identifier varchar(100) CHARACTER SET 'utf8mb4'
COLLATE 'utf8mb4_unicode_520_ci' PATH '$'
)
) AS t1
) AS players
ON u.identifier = players.identifier;
| accounts |
| :------- |
db<>fiddle here
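Alternatively, if you'd rather not repeat the character set and collation inside the COLUMNS definition, a standard MySQL/MariaDB way to resolve an "illegal mix of collations" is to apply a COLLATE clause at the point of comparison (same table and variable names assumed as above; this variant is not in the fiddle):
SELECT accounts
FROM users u
INNER JOIN
(
    SELECT * FROM json_table(@json, '$[*]'
        COLUMNS(
            identifier varchar(100) PATH '$'
        )
    ) AS t1
) AS players
ON u.identifier = players.identifier COLLATE utf8mb4_unicode_520_ci;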
I'm using MySQL 8.0.4 (rc4). I need MySQL 8 because it's the only version of MySQL that supports CTEs.
My database is created thus:
CREATE DATABASE IF NOT EXISTS TestDB
DEFAULT CHARACTER SET utf8mb4
DEFAULT COLLATE utf8mb4_general_ci;
USE TestDB;
SET sql_mode = 'STRICT_TRANS_TABLES';
CREATE TABLE IF NOT EXISTS MyTable (
(...)
Body LONGBLOB NOT NULL,
(...)
);
When I try to insert raw byte data into this Body column, I receive this error:
Error 1366: Incorrect string value: '\x8B\x08\x00\x00\x00\x00...' for column 'Body' at row 1.
This is the insert statement I'm using.
REPLACE INTO MyTable
SELECT Candidate.* FROM
(SELECT :Id AS Id,
(...)
:Body AS Body,
(...)
) AS Candidate
LEFT JOIN MyTable ON Candidate.Id = MyTable.Id
WHERE (
(...)
);
How could there be an incorrect string value for BLOB? Doesn't BLOB mean I can insert quite literally anything?
What's the : stuff? Why have the nested query? May we see the actual SQL? What language are you using? It sounds like the "binding" tried to apply character-set rules when it should not. May we see the code that did the substitution of the : stuff?
BLOBs have no character set. As long as you can get the bytes past the parser, there should be no problem.
However, I find this to be a better way to do it...
In the app language, generate a hex string, then use that in
INSERT INTO ... VALUES (..., UNHEX(the-hex-string), ...)
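A minimal sketch of what that looks like on the SQL side (the id, hex literal, and column list are made up for illustration, and the other NOT NULL columns are omitted; the hex string would come from the application layer, e.g. Python's data.hex()):
-- Hypothetical example: hex string generated by the app
SET @body_hex = '1F8B080000000000';
REPLACE INTO MyTable (Id, Body)
VALUES (42, UNHEX(@body_hex));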
I am trying to import data from SQL Server into MySQL using OPENQUERY with a linked server.
I have imported couple of tables but I am having an issue with a table that has a long varchar field.
Every time I run this query I get the following:
Msg 7344, Level 16, State 1, Line 2
The OLE DB provider "MSDASQL" for linked server "Serv1" could not INSERT INTO table "[MSDASQL]" because of column "Notes". Could not convert the data value due to reasons other than sign mismatch or overflow.
The column Notes is of type varchar(8000) in SQL Server and also varchar(8000) in MySQL.
What is the issue? Why is it giving me this error? Note: I have tried to cast Notes to varchar(8000) first, but that did not work.
INSERT OPENQUERY (Serv1, 'SELECT
id,
i3_identity,
mid,
Notes,
Comments,
result_code,
Disposition,
completed_on,
calltype
FROM finaltesting.activities')
SELECT
CAST(ID AS int) AS id,
CAST(identity AS int) AS identity,
CAST(merchantAccount AS varchar(255)) AS mid,
Notes,
CAST(Comments AS varchar(8000)) AS Comments,
CAST(FinishCode AS varchar(255)) AS result_code,
CAST(Disposition AS varchar(255)) AS Disposition,
CAST(callDate AS datetime) AS completed_on,
CAST(CallType AS varchar(255)) AS calltype
FROM activities
Could not convert the data value due to reasons other than sign mismatch or overflow.
I was able to solve this problem by changing the column type in MySQL from varchar(8000) to TEXT. I am using MySQL 5.6.12. It is probably a column type issue when converting from SQL Server to MySQL.
Thanks
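In other words, the fix on the MySQL side is just an ALTER TABLE along these lines (table and column names taken from the question; adjust to whichever long varchar columns are affected):
ALTER TABLE finaltesting.activities
    MODIFY COLUMN Notes TEXT,
    MODIFY COLUMN Comments TEXT;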
Question: is there an effective way to switch the collation of 1 database in SQL Server?
I've set the collation of the database through Properties -> Options -> Collation. Although this collation now applies to new fields, none of the existing text-based fields in the database are changed, and changing those existing fields is what I need to get done.
The database on which I need to switch the collation is huge (50+GB, 750+ tables) so manually changing all fields in the database is not an option.
What about the following:
- create scripts for the structure of the database
- export all data
- drop the database
- create an empty database with the correct collation
- create the database structure - all text-based fields should now be set to the database default
- import the data
Bada bing, bada boom?
Other strategies?
Could I query the master database and change the collations there?
Thanks for your input!
I faced that problem before, and the only way I found was to save, drop, recreate, and repopulate the database.
Also, I guess you've already browsed this page.
You could conceivably create a set of queries that produce rows like "ALTER TABLE [schema].[TableName] ALTER COLUMN [ColName] [type] COLLATE [NewCollation] [nullability]" as output. This can be done using the built-in sys.schemas, sys.tables and sys.columns system views.
Sample:
select 'alter table [' + s.name + '].[' + t.name + '] alter column [' + c.name
+ '] nvarchar(' + CAST(c.max_length/2 as nvarchar) +') '
+ ' collate Finnish_Swedish_CI_AI '
+ (case when c.is_nullable = 1 then ' null ' else ' not null ' end)
from sys.tables t
join sys.schemas s on t.schema_id = s.schema_id
join sys.columns c on t.object_id = c.object_id
where t.type='U' and c.system_type_id = 231
The sample will find columns of type nvarchar([length]) and produce alter statements to change the collation to Finnish_Swedish_CI_AI.
Note that this sample is meant as a proof of concept only. If you want to explore this possible solution you must put some time into studying the mentioned system views.
You would likely have to make separate queries for handling nvarchar, nchar, ntext and varchar, char, text fields. Nvarchar/Varchar(max) fields might need special consideration, as well as any potential computed columns of character type.
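As an illustration of that last point, a variant for plain varchar columns might look like the sketch below (system_type_id 167 is varchar; note that it handles the -1 length used by varchar(max), and like the original sample it is a proof of concept only):
select 'alter table [' + s.name + '].[' + t.name + '] alter column [' + c.name
     + '] varchar(' + case when c.max_length = -1 then 'max' else cast(c.max_length as varchar(10)) end + ')'
     + ' collate Finnish_Swedish_CI_AI '
     + (case when c.is_nullable = 1 then ' null ' else ' not null ' end)
from sys.tables t
join sys.schemas s on t.schema_id = s.schema_id
join sys.columns c on t.object_id = c.object_id
where t.type = 'U' and c.system_type_id = 167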