SQL INSERT statement conflicted with the CHECK constraint - sql-server-2008

I am getting this error
InsertRecord(InsertRow): HResult of -2147217873 (80040e2f)
Error Source: Microsoft OLE DB Provider for SQL Server Error
Description : The INSERT statement conflicted with the CHECK constraint CK_ContactName. The conflict occurred in database EPIC.7.5_GR_SM_ETLTRAINING; table dbo.ContactName.
The statements:
CREATE TABLE [dbo].[ContactName](
[UniqContactName] [dbo].[DM_UNIQID] IDENTITY(65536,1) NOT NULL,
[UniqFixedContactName] [dbo].[DM_UNIQID] NOT NULL,
[UniqEntity] [dbo].[DM_UNIQID] NOT NULL,
[LkPrefix] [dbo].[DM_PREFIX] NOT NULL,
[FullName] [dbo].[DM_NAME] NOT NULL,
[FirstName] [varchar](30) NOT NULL,
[MiddleName] [varchar](16) NOT NULL,
[LastName] [varchar](30) NOT NULL,
[LkSuffix] [dbo].[DM_SUFFIX] NOT NULL,
[DescriptionOf] [dbo].[DM_DESC_050] NOT NULL,
[Title] [dbo].[DM_TITLE] NOT NULL,
[Department] [dbo].[DM_DEPARTMENT] NOT NULL,
[UniqContactAddressMain] [dbo].[DM_UNIQID] NOT NULL,
[UniqContactAddressEmployer] [dbo].[DM_UNIQID] NOT NULL,
[UniqContactNumberMain] [dbo].[DM_UNIQID] NOT NULL,
[UniqContactNumberEmailMain] [dbo].[DM_UNIQID] NOT NULL,
[ContactMethodCode] [char](1) NOT NULL,
[InformalHeading] [varchar](50) NOT NULL,
[FormalHeading] [varchar](50) NOT NULL,
[BirthDate] [dbo].[DM_DATE] NULL,
[GenderCode] [char](1) NOT NULL,
[SSN] [char](9) NOT NULL,
[MaritalStatusCode] [char](1) NOT NULL,
[RelationToInsuredCode] [char](2) NOT NULL,
[LkLanguage] [dbo].[DM_LANGUAGE] NOT NULL,
[Comments] [dbo].[DM_COMMENT] NOT NULL,
[BillingDeliveryCode] [char](1) NOT NULL,
[ServicingDeliveryCode] [char](1) NOT NULL,
[MarketingDeliveryCode] [char](1) NOT NULL,
[CategoryCode] [char](1) NOT NULL,
[EmployerName] [dbo].[DM_NAME] NOT NULL,
[LkOccupation] [dbo].[DM_OCCUPATION] NOT NULL,
[HiredDate] [dbo].[DM_DATE] NULL,
[YearsEmployed] [smallint] NULL,
[YearsPriorEmployer] [smallint] NULL,
[FEIN] [char](9) NOT NULL,
[DUNSNumber] [char](9) NOT NULL,
[CdNAICSCode] [char](6) NOT NULL,
[CdSICCode] [char](8) NOT NULL,
[BusinessTypeCode] [char](2) NOT NULL,
[BusinessTypeOtherDesc] [varchar](50) NOT NULL,
[NumberEmployees] [smallint] NULL,
[NumberMembersManagers] [int] NULL,
[BusinessStartedDate] [datetime] NULL,
[NatureOfBusinessCode] [char](2) NOT NULL,
[NatureOfBusinessOtherDesc] [varchar](50) NOT NULL,
[CreditBureauNameCode] [char](5) NOT NULL,
[CreditBureauNameOtherDesc] [varchar](50) NOT NULL,
[CreditBureauIDNumber] [varchar](30) NOT NULL,
[DriverLicenseNumber] [varchar](25) NOT NULL,
[LicensedState] [char](4) NOT NULL,
[LicensedDate] [datetime] NULL,
[LicensedMADate] [dbo].[DM_DATE] NULL,
[DriverTypeCode] [char](1) NOT NULL,
[GoodStudentCode] [char](1) NOT NULL,
[DriverTrainingCode] [char](1) NOT NULL,
[AccidentPreventionCourseDate] [dbo].[DM_DATE] NULL,
[CommercialExperienceBeganDate] [dbo].[DM_DATE] NULL,
[MatchClientNameOf] [smallint] NOT NULL,
[InsertedByCode] [dbo].[DM_INSERTUPDATEBYCODE] NOT NULL,
[InsertedDate] [dbo].[DM_DATETIME] NOT NULL,
[UpdatedByCode] [dbo].[DM_INSERTUPDATEBYCODE] NOT NULL,
[UpdatedDate] [dbo].[DM_DATETIME] NULL,
[Flags] [dbo].[DM_FLAGS] NOT NULL,
[ts] [datetime] NOT NULL,
[SIN] [char](9) NOT NULL,
[BusinessNumber] [varchar](30) NOT NULL,
[BusinessIDNumber] [varchar](30) NOT NULL,
[IBCCode] [char](6) NOT NULL,
CONSTRAINT [PK_ContactName] PRIMARY KEY NONCLUSTERED ([UniqContactName]) -- key column assumed; the pasted statement is truncated here
)
GO
ALTER TABLE [dbo].[ContactName]
WITH CHECK ADD CONSTRAINT [CK_ContactName]
CHECK ((
[UniqContactName]=(-1)
AND [UniqEntity]=(-1)
OR [UniqContactName]>(-1)
AND [UniqEntity]>(-1)
))
GO
I have checked the ContactName table, and every row's UniqContactName value is above 65000; the UniqEntity values are all above 65000 as well.
Does anyone have an idea why this would fail?
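For reference, here is how the check evaluates for a few hypothetical value pairs (the values below are made up, not taken from the failing insert). Because AND binds tighter than OR, the constraint only passes when both columns are -1 or both are greater than -1, so a mixed pair such as an identity-generated UniqContactName with a UniqEntity of -1 violates it:
SELECT c.UniqContactName, c.UniqEntity,
       CASE WHEN (c.UniqContactName = -1 AND c.UniqEntity = -1)
              OR (c.UniqContactName > -1 AND c.UniqEntity > -1)
            THEN 'passes CK_ContactName'
            ELSE 'violates CK_ContactName'
       END AS check_result
FROM (VALUES (65536, 65536),   -- both above -1: passes
             (-1, -1),         -- both exactly -1: passes
             (65536, -1)       -- mixed: this combination fails
     ) AS c (UniqContactName, UniqEntity);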

Related

prevent django test runner from creating stale table

I have a MariaDB database that used to have CHARSET utf8 COLLATE utf8_general_ci, but now has CHARSET utf8mb4 COLLATE utf8mb4_unicode_ci. All tables have the same CHARSET and COLLATE as the database.
When I run ./manage.py test, the stack trace looks like this:
....
django.db.utils.OperationalError: (1118, 'Row size too large (> 8126). Changing some columns to TEXT or BLOB may help. In current row format, BLOB prefix of 0 bytes is stored inline.')
I managed to find out which table is the troubling one, and its SQL looks like the following. Note that I changed the table and field names for security:
CREATE TABLE `troubling_table`
(
`id` INTEGER auto_increment NOT NULL PRIMARY KEY,
`no_tax` VARCHAR(20) NOT NULL,
`cd_pc` VARCHAR(7) NOT NULL,
`cd_wdept` VARCHAR(12) NOT NULL,
`id_write` VARCHAR(20) NULL,
`cd_docu` VARCHAR(10) NULL,
`dt_acct` VARCHAR(8) NULL,
`st_docu` VARCHAR(3) NULL,
`tp_drcr` VARCHAR(3) NULL,
`cd_acct` VARCHAR(20) NULL,
`amt` NUMERIC(19, 4) NULL,
`cd_partner` VARCHAR(20) NULL,
`nm_partner` VARCHAR(50) NULL,
`tp_job` VARCHAR(40) NULL,
`cls_job` VARCHAR(40) NULL,
`ads_hd` VARCHAR(400) NULL,
`nm_ceo` VARCHAR(40) NULL,
`dt_start` VARCHAR(8) NULL,
`dt_end` VARCHAR(8) NULL,
`am_taxstd` NUMERIC(19, 4) NULL,
`am_addtax` NUMERIC(19, 4) NULL,
`tp_tax` VARCHAR(10) NULL,
`no_company` VARCHAR(20) NULL,
`dts_insert` VARCHAR(20) NULL,
`id_insert` VARCHAR(20) NULL,
`dts_update` VARCHAR(20) NULL,
`id_update` VARCHAR(20) NULL,
`nm_note` VARCHAR(100) NULL,
`cd_bizarea` VARCHAR(12) NULL,
`cd_dept` VARCHAR(12) NULL,
`cd_cc` VARCHAR(12) NULL,
`cd_pjt` VARCHAR(20) NULL,
`cd_fund` VARCHAR(20) NULL,
`cd_budget` VARCHAR(20) NULL,
`no_cash` VARCHAR(20) NULL,
`st_mutual` VARCHAR(3) NULL,
`cd_card` VARCHAR(20) NULL,
`no_deposit` VARCHAR(20) NULL,
`cd_bank` VARCHAR(20) NULL,
`ucd_mng1` VARCHAR(20) NULL,
`ucd_mng2` VARCHAR(20) NULL,
`ucd_mng3` VARCHAR(20) NULL,
`ucd_mng4` VARCHAR(20) NULL,
`ucd_mng5` VARCHAR(20) NULL,
`cd_employ` VARCHAR(20) NULL,
`cd_mng` VARCHAR(20) NULL,
`no_bdocu` VARCHAR(20) NULL,
`no_bdoline` NUMERIC(4, 0) NULL,
`tp_docu` VARCHAR(3) NULL,
`no_acct` NUMERIC(5, 0) NULL,
`tp_trade` VARCHAR(10) NULL,
`no_check` VARCHAR(20) NULL,
`no_check1` VARCHAR(20) NULL,
`cd_exch` VARCHAR(10) NULL,
`rt_exch` NUMERIC(10, 4) NULL,
`cd_trade` VARCHAR(10) NULL,
`no_check2` VARCHAR(50) NULL,
`no_check3` VARCHAR(50) NULL,
`no_check4` VARCHAR(100) NULL,
`tp_cross` VARCHAR(1) NULL,
`erp_cd` VARCHAR(50) NULL,
`am_ex` NUMERIC(19, 4) NULL,
`tp_export` VARCHAR(1) NULL,
`no_to` VARCHAR(20) NULL,
`dt_shipping` VARCHAR(8) NULL,
`tp_gubun` VARCHAR(3) NULL,
`no_invoice` VARCHAR(20) NULL,
`no_item` VARCHAR(20) NULL,
`md_tax1` VARCHAR(4) NULL,
`nm_item1` VARCHAR(50) NULL,
`nm_size1` VARCHAR(20) NULL,
`qt_tax1` NUMERIC(17, 4) NULL,
`am_prc1` NUMERIC(19, 4) NULL,
`am_supply1` NUMERIC(19, 4) NULL,
`am_tax1` NUMERIC(19, 4) NULL,
`nm_note1` VARCHAR(20) NULL,
`cd_bizplan` VARCHAR(20) NULL,
`cd_bgacct` VARCHAR(10) NULL,
`cd_mngd1` VARCHAR(20) NULL,
`nm_mngd1` VARCHAR(100) NULL,
`cd_mngd2` VARCHAR(20) NULL,
`nm_mngd2` VARCHAR(100) NULL,
`cd_mngd3` VARCHAR(20) NULL,
`nm_mngd3` VARCHAR(100) NULL,
`cd_mngd4` VARCHAR(20) NULL,
`nm_mngd4` VARCHAR(100) NULL,
`cd_mngd5` VARCHAR(20) NULL,
`nm_mngd5` VARCHAR(100) NULL,
`cd_mngd6` VARCHAR(20) NULL,
`nm_mngd6` VARCHAR(100) NULL,
`cd_mngd7` VARCHAR(20) NULL,
`nm_mngd7` VARCHAR(100) NULL,
`cd_mngd8` VARCHAR(20) NULL,
`nm_mngd8` VARCHAR(100) NULL,
`yn_iss` VARCHAR(1) NULL,
`final_status` VARCHAR(2) NULL,
`no_bill` VARCHAR(24) NULL,
`tp_bill` VARCHAR(1) NULL,
`tp_record` VARCHAR(1) NULL,
`tp_etcacct` VARCHAR(1) NULL,
`st_gware` VARCHAR(3) NULL,
`sell_dam_nm` VARCHAR(30) NULL,
`sell_dam_email` VARCHAR(50) NULL,
`sell_dam_mobil` VARCHAR(20) NULL,
`nm_pumm` VARCHAR(100) NULL,
`jeonjasend15_yn` VARCHAR(1) NULL,
`dt_write` VARCHAR(8) NULL,
`st_tax` VARCHAR(1) NULL,
`md_tax2` VARCHAR(4) NULL,
`nm_item2` VARCHAR(50) NULL,
`nm_size2` VARCHAR(20) NULL,
`qt_tax2` NUMERIC(17, 4) NULL,
`am_prc2` NUMERIC(19, 4) NULL,
`am_supply2` NUMERIC(19, 4) NULL,
`am_tax2` NUMERIC(19, 4) NULL,
`nm_note2` VARCHAR(20) NULL,
`md_tax3` VARCHAR(4) NULL,
`nm_item3` VARCHAR(50) NULL,
`nm_size3` VARCHAR(20) NULL,
`qt_tax3` NUMERIC(17, 4) NULL,
`am_prc3` NUMERIC(19, 4) NULL,
`am_supply3` NUMERIC(19, 4) NULL,
`am_tax3` NUMERIC(19, 4) NULL,
`nm_note3` VARCHAR(20) NULL,
`md_tax4` VARCHAR(4) NULL,
`nm_item4` VARCHAR(50) NULL,
`nm_size4` VARCHAR(20) NULL,
`qt_tax4` NUMERIC(17, 4) NULL,
`am_prc4` NUMERIC(19, 4) NULL,
`am_supply4` NUMERIC(19, 4) NULL,
`am_tax4` NUMERIC(19, 4) NULL,
`nm_note4` VARCHAR(20) NULL,
`no_asset` VARCHAR(20) NULL,
`nm_bigo` VARCHAR(100) NULL,
`nm_ptr` VARCHAR(20) NULL,
`ex_hp` VARCHAR(15) NULL,
`ex_emil` VARCHAR(100) NULL,
`no_biztax` VARCHAR(8) NULL,
`yn_import` VARCHAR(1) NULL,
`ref_no_docu` VARCHAR(20) NULL,
`cd_fx` VARCHAR(2) NULL,
`fx_bill` VARCHAR(20) NULL,
`no_iss` VARCHAR(24) NULL,
`file_attach` VARCHAR(100) NULL,
`tp_evidence` VARCHAR(4) NULL,
`st_bizbox` VARCHAR(1) NULL,
`tp_input` VARCHAR(30) NULL,
`sell_dam_tel` VARCHAR(20) NULL,
`no_car` VARCHAR(20) NULL,
`no_carbody` VARCHAR(17) NULL,
`dec_lease` VARCHAR(100) NULL,
`no_tdocu` VARCHAR(20) NULL,
`no_tdoline` NUMERIC(4, 0) NULL,
`cd_bizcar` VARCHAR(20) NULL,
`cd_taxacct` VARCHAR(10) NULL,
`yn_fixasset` VARCHAR(1) NULL
)
If I run this query in a SQL editor, the error looks the same as Django's. The error didn't happen when I created the database with the old charset and collation pair, but now the test raises it. The different charset may be one of the reasons.
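For what it's worth, my rough understanding of the arithmetic, and what the hint in the error would translate to (a sketch only; the exact InnoDB row-size accounting is more involved, and the column names are just some of the widest ones from the table above):
-- Under utf8 a VARCHAR(n) is counted as up to 3*n bytes toward the row-size limit,
-- under utf8mb4 as up to 4*n bytes, so the same ~150 VARCHAR columns grew by roughly
-- a third when the charset changed and now exceed the 8126-byte limit.
-- "Changing some columns to TEXT or BLOB may help" would look like this:
ALTER TABLE `troubling_table`
    MODIFY `ads_hd`   TEXT NULL,
    MODIFY `nm_mngd1` TEXT NULL,
    MODIFY `nm_mngd2` TEXT NULL;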
So I deleted the model and applied the migrations, since that model is no longer in use. So, it is stale.
But even after the model is erased, the Django test runner still seems to create that stale table.
Does the Django test runner go through every migration file from the start? Is that why I can't run tests against the model that used to be created with the old charset and collation?
How can I prevent the Django test runner from creating this stale, no-longer-existing table, whose old charset and collation conflict with the new ones, without changing the database charset and collation?

Updating a stored JSON in Mysql Database

I am trying to update a stored JSON column in my database and I am unable to run the following update query. I have copied below a SELECT statement and an UPDATE statement.
SELECT
`core_animal_event`. `animal_id` AS `MilkingEvent_animalID`,
JSON_UNQUOTE(JSON_EXTRACT(`core_animal_event`.`additional_attributes`, '$."62"')) AS `MilkingEvent_milkCompositeLitres`,
coalesce(JSON_UNQUOTE(JSON_EXTRACT(`core_animal_event`.`additional_attributes`, '$."68"')) +
JSON_UNQUOTE(JSON_EXTRACT(`core_animal_event`.`additional_attributes`, '$."61"')) +
JSON_UNQUOTE(JSON_EXTRACT(`core_animal_event`.`additional_attributes`, '$."59"')))
FROM `core_animal_event` WHERE (`core_animal_event`. `event_type` = 2) AND (`core_animal_event`. `country_id` = '10');
UPDATE `core_animal`
SET
JSON_UNQUOTE(JSON_EXTRACT(`core_animal_event`.`additional_attributes`, '$."62"')) =
coalesce(JSON_UNQUOTE(JSON_EXTRACT(`core_animal_event`.`additional_attributes`, '$."68"')) +
JSON_UNQUOTE(JSON_EXTRACT(`core_animal_event`.`additional_attributes`, '$."61"')) +
JSON_UNQUOTE(JSON_EXTRACT(`core_animal_event`.`additional_attributes`, '$."59"')))
WHERE (`core_animal_event`. `event_type` = 2) AND (`core_animal_event`. `country_id` = '10')
The following is the sample data; additional_attributes is the stored JSON column and animal_id is unique:
# animal_id, additional_attributes
'2576', '{\"59\": null, \"61\": null, \"62\": null, \"63\": null, \"64\": null, \"65\": null, \"66\": null, \"67\": null, \"68\": null, \"69\": \"1\", \"70\": \"2\", \"71\": \"1\", \"72\": null, \"73\": \"2\", \"74\": \"1\", \"75\": null, \"76\": null, \"77\": [\"1\"], \"78\": \"32\", \"79\": \"70\", \"80\": \"4\", \"81\": null, \"82\": null, \"83\": null, \"84\": \"Mkiwa\", \"85\": \"19280\", \"86\": \"2405\", \"87\": \"TNZ000192802405\", \"88\": \"Brownwhite\", \"89\": \"1565789020239.jpg\", \"90\": \"1565789049469.jpg\", \"96\": null, \"97\": null, \"98\": null, \"99\": \"1\", \"100\": null, \"101\": null, \"102\": null, \"103\": null, \"104\": null, \"105\": null, \"106\": null, \"107\": null, \"108\": null, \"109\": null, \"110\": null, \"111\": null, \"112\": null, \"113\": null, \"114\": null, \"115\": null, \"116\": null, \"117\": null, \"118\": null, \"119\": null, \"120\": null, \"121\": null, \"122\": null, \"123\": null, \"124\": null, \"125\": null, \"126\": null, \"127\": null, \"128\": null, \"129\": null, \"130\": null, \"131\": null, \"132\": null, \"133\": null, \"134\": null, \"135\": null, \"136\": null, \"137\": null, \"138\": null, \"139\": null, \"141\": null, \"142\": null, \"143\": null, \"144\": null, \"145\": null}'
The following is an example of a create statement
CREATE TABLE `core_animal_event` (
`id` int NOT NULL AUTO_INCREMENT,
`animal_id` int NOT NULL,
`event_type` int NOT NULL,
`additional_attributes` json DEFAULT NULL,
`country_id` int NOT NULL, -- referenced by the KEY and WHERE clauses below; type assumed
PRIMARY KEY (`id`),
KEY `org_id` (`country_id`),
KEY `animal_id` (`animal_id`),
KEY `event_type` (`event_type`),
CONSTRAINT `core_animal_event_ibfk_1` FOREIGN KEY (`animal_id`) REFERENCES `core_animal` (`id`) ON DELETE CASCADE ON UPDATE CASCADE
) ENGINE=InnoDB AUTO_INCREMENT=941817 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci ROW_FORMAT=COMPACT;
The escape characters are not necessary here. I even think they may be the source of the problem.
Here's my code for the tests:
CREATE TABLE `core_animal_event` (
`animal_id` int(11) NOT NULL AUTO_INCREMENT,
`additional_attributes` json DEFAULT NULL,
`event_type` int(11) NOT NULL,
`country_id` int(11) NOT NULL,
PRIMARY KEY (`animal_id`)
) ENGINE=InnoDB AUTO_INCREMENT=3 DEFAULT CHARSET=latin1;
INSERT INTO core_animal_event
(animal_id, additional_attributes, event_type, country_id)
VALUES(1, '{ "59": null, "61": null, "62": null, "63": null, "64": null, "65": null, "66": null, "67": null, "68": null, "69": "1", "70": "2", "71": "1", "72": null, "73": "2", "74": "1", "75": null, "76": null, "77": ["1"], "78": "32", "79": "70", "80": "4", "81": null, "82": null, "83": null, "84": "Mkiwa", "85": "19280", "86": "2405", "87": "TNZ000192802405", "88": "Brownwhite", "89": "1565789020239.jpg", "90": "1565789049469.jpg", "96": null, "97": null, "98": null, "99": "1", "100": null, "101": null, "102": null, "103": null, "104": null, "105": null, "106": null, "107": null, "108": null, "109": null, "110": null, "111": null, "112": null, "113": null, "114": null, "115": null, "116": null, "117": null, "118": null, "119": null, "120": null, "121": null, "122": null, "123": null, "124": null, "125": null, "126": null, "127": null, "128": null, "129": null, "130": null, "131": null, "132": null, "133": null, "134": null, "135": null, "136": null, "137": null, "138": null, "139": null, "141": null, "142": null, "143": null, "144": null, "145": null}', 2, 10);
Without the escape characters (\), your SELECT query works.
For your update, here's an example:
UPDATE core_animal_event
SET additional_attributes = json_set(additional_attributes, '$."62"',
COALESCE(JSON_UNQUOTE(JSON_EXTRACT(additional_attributes, '$."68"')) +
JSON_UNQUOTE(JSON_EXTRACT(additional_attributes, '$."61"')) +
JSON_UNQUOTE(JSON_EXTRACT(additional_attributes, '$."59"')))
)
WHERE (event_type = 2) AND (country_id = '10');
//-------
EDIT:
Be careful when using the JSON_EXTRACT and COALESCE functions: if all the values are JSON null, the returned value is 0, not NULL.
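A minimal illustration of that gotcha (hard-coded JSON literals, just to show the behaviour): JSON_UNQUOTE of a JSON null yields the string 'null', the numeric addition of those strings evaluates to 0, and COALESCE therefore returns 0 rather than NULL.
SELECT COALESCE(
    JSON_UNQUOTE(JSON_EXTRACT('{"68": null}', '$."68"')) +
    JSON_UNQUOTE(JSON_EXTRACT('{"61": null}', '$."61"')) +
    JSON_UNQUOTE(JSON_EXTRACT('{"59": null}', '$."59"'))
) AS result;   -- returns 0, not NULL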
EDIT 2:
Akina is right, your COALESCE is not doing what you think... As written, it's an addition (but maybe that's what you want...).
EDIT 3:
If you want to use COALESCE, here's an example:
SELECT animal_id AS colID, JSON_UNQUOTE(JSON_EXTRACT(additional_attributes, '$."62"')) AS col62,
COALESCE(
IF(JSON_TYPE(JSON_EXTRACT(additional_attributes, '$."68"'))='NULL', null, JSON_EXTRACT(additional_attributes, '$."68"')),
IF(JSON_TYPE(JSON_EXTRACT(additional_attributes, '$."59"'))='NULL', null, JSON_EXTRACT(additional_attributes, '$."59"')),
IF(JSON_TYPE(JSON_EXTRACT(additional_attributes, '$."61"'))='NULL', null, JSON_EXTRACT(additional_attributes, '$."61"'))
) AS result_of_coalesce
FROM core_animal_event WHERE (event_type = 2) AND (country_id = '10');

Convert MSSQL to mysql table

What is the easiest way to convert a Microsoft SQL Server query to MySQL?
Sample Microsoft SQL Server query:
CREATE TABLE [dbo].[T_NEWBIZ](
[newbID] [int] IDENTITY(1,1) NOT NULL,
[planID] [int] NULL,
[applID] [int] NULL,
[taxqID] [int] NULL,
[doptID] [int] NULL,
[stcoID] [int] NULL,
[crsnID] [int] NULL,
[newbEPBR] [bit] NULL,
[newbPolicyNo] [varchar](50) NULL,
[newbEffectiveDate] [varchar](20) NULL,
[newbIssueDate] [varchar](20) NULL,
[newbDistributorOrderNumber] [varchar](30) NULL,
[newbJointAnnuitantNA] [bit] NULL,
[newbOwnerIsAnnuitant] [bit] NULL,
[newbDeliveryEMail] [varchar](100) NULL,
[newbDeliveryConsent] [bit] NULL,
[newbAnnuityOptionSpecialRequest] [varchar](500) NULL,
[newbCSRID] [varchar](20) NULL,
[newbBrokerID] [varchar](10) NULL,
[newbUGMA] [bit] NULL,
[newbUTMA] [bit] NULL,
[newbRecapAvailable] [bit] NULL,
[newbOriginatorCode] [varchar](8) NULL,
[newbFileNetStatus] [char](1) NULL,
[newbJointOwnerNA] [bit] NULL,
[newbNameChange] [bit] NULL,
[newbDateCompleted] [datetime] NULL,
[newbSuitabilityStatus] [varchar](25) NULL,
[newbClientNumber] [varchar](15) NULL,
[newbCheckPayable] [bit] NULL,
[newbMSANumber] [varchar](11) NULL,
[newbPrincipalReviewer] [varchar](10) NULL,
[newbCreationDate] [datetime] NULL,
[newbAnyExistingPolicies] [varchar](1) NULL,
[newbCreatedBy] [varchar](50) NULL,
[newbReplaceExistingPolicy] [varchar](1) NULL,
[newbFormsVerified] [bit] NULL,
[newbTransfer1035] [varchar](1) NULL,
[newbFormsComment] [varchar](500) NULL,
[newbADRequired] [bit] NULL,
[newbAssignedCSRID] [varchar](10) NULL,
[newbBackupWithholding] [varchar](1) NULL,
[newbSuitabilityApprovedDate] [datetime] NULL,
[newbExchangeInternal] [bit] NULL,
[newbAnnuityDate] [varchar](20) NULL,
[newbAnnuityDateOverride] [bit] NULL,
[newbAnnuityOption] [varchar](1) NULL,
[newbSpecialRequest] [varchar](500) NULL,
[newbSpecialRequestNA] [bit] NULL,
[newbFraudAccepted] [bit] NULL,
[newbHireDate] [varchar](20) NULL,
[newbAppless] [bit] NULL,
[newbMailToClient] [bit] NULL,
[newbPGR] [bit] NULL,
[newbDataLock] [varchar](8) NULL,
[amlsCode] [varchar](1) NULL,
[newbFEPlanCodeIndicator] [bit] NULL,
[newbFEPlanCodeDate] [varchar](10) NULL,
[newbBestPlusClientID] [varchar](10) NULL,
[newbDetachedOfficeIndicator] [char](1) NULL,
[newbDetachedOfficeCode] [varchar](5) NULL,
[newbPreviousWFM] [bit] NULL,
[newbARDForm1] [bit] NULL,
[newbARDForm2] [bit] NULL,
[newbARDForm3] [bit] NULL,
[newbARDForm4] [bit] NULL,
[newbARDForm5] [bit] NULL,
[newbPSO] [bit] NULL,
CONSTRAINT [PKC_NewBiz_cnewbID] PRIMARY KEY NONCLUSTERED
(
[newbID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 95) ON [PRIMARY]
) ON [PRIMARY]
You can use MySQL Workbench to do so; I converted an MS SQL database to a MySQL database with it.
Here is the link to the tutorial; it has all the details along with screenshots.
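If you prefer to translate by hand instead, the main changes are IDENTITY(1,1) becoming AUTO_INCREMENT, [bit] becoming TINYINT(1), and dropping the index options (NONCLUSTERED, FILLFACTOR, ON [PRIMARY]) that MySQL does not have. A partial sketch using a few of the columns above (type choices are assumptions, not a complete mapping):
CREATE TABLE `T_NEWBIZ` (
  `newbID` INT NOT NULL AUTO_INCREMENT,      -- was [int] IDENTITY(1,1)
  `planID` INT NULL,
  `newbEPBR` TINYINT(1) NULL,                -- was [bit]
  `newbPolicyNo` VARCHAR(50) NULL,
  `newbEffectiveDate` VARCHAR(20) NULL,
  `newbDateCompleted` DATETIME NULL,
  `newbFileNetStatus` CHAR(1) NULL,
  PRIMARY KEY (`newbID`)
) ENGINE=InnoDB;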

How can I move one schema into another with different fields?

I created an application to replace a legacy one that we had for a while, and I need to move the old database records into the new system, but the schemas are not the same. I'm wondering if there is any way to move the old schema into the new, ignoring the fields that don't exist in the new table, and moving fields that have changed to their updated version.
My old schema is Microsoft SQL Server and has the following fields:
[req_id] [int] IDENTITY(1,1) NOT NULL,
[req_user_id] [nvarchar](50) NOT NULL,
[req_subject] [nvarchar](200) NOT NULL,
[req_details] [nvarchar](4000) NOT NULL,
[req_request_date] [date] NOT NULL,
[req_year] [smallint] NOT NULL,
[req_expect_date] [date] NOT NULL,
[leader] [nvarchar](10) NULL,
[member1] [nvarchar](10) NULL,
[member2] [nvarchar](10) NULL,
[member3] [nvarchar](10) NULL,
[member4] [nvarchar](10) NULL,
[member5] [nvarchar](10) NULL,
[member6] [nvarchar](10) NULL,
[status_code] [nvarchar](10) NULL,
[hours_used] [int] NULL,
[completed_date] [date] NULL,
[category_code] [nvarchar](10) NULL,
[staff_comments] [nvarchar](2000) NULL,
[response_Email] [bit] NOT NULL,
[response_Phone] [bit] NOT NULL,
[response_Fax] [bit] NOT NULL,
[response_online_upload] [bit] NOT NULL,
[response_post_mail] [bit] NOT NULL,
[response_file] [bit] NOT NULL,
[response_pickup] [bit] NOT NULL,
My new schema is MySQL and looks like this:
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`userid` varchar(100) COLLATE utf8_unicode_ci NOT NULL,
`subject` varchar(200) COLLATE utf8_unicode_ci NOT NULL,
`details` text COLLATE utf8_unicode_ci NOT NULL,
`eta` date NOT NULL,
`leader` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
`member1` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
`member2` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
`member3` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
`member4` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
`member5` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
`member6` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
`status` varchar(100) COLLATE utf8_unicode_ci NOT NULL,
`time_spent` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
`date_completed` date DEFAULT NULL,
`category` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
`comments` text COLLATE utf8_unicode_ci,
`response_method` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT '0000-00-00 00:00:00',
`updated_at` timestamp NOT NULL DEFAULT '0000-00-00 00:00:00',
Is there any way to do this and salvage our old records in the new system?
I used the Data Export tool to generate an xlsx file, then used Navicat to import it as a new table in the database, and then used simple INSERT statements to move the data to the other table. I was appalled by the response this question got on SO, but I'm glad I found a solution on my own.
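For anyone doing the same, the final copy step could look roughly like this, assuming the exported legacy data was imported into a MySQL staging table named legacy_requests and the new table is named requests (both names are hypothetical); fields that no longer exist are simply ignored, and the column mapping below is a guess based on the two schemas:
INSERT INTO `requests`
    (`userid`, `subject`, `details`, `eta`, `leader`,
     `member1`, `member2`, `member3`, `member4`, `member5`, `member6`,
     `status`, `time_spent`, `date_completed`, `category`, `comments`,
     `created_at`, `updated_at`)
SELECT
    `req_user_id`, `req_subject`, `req_details`, `req_expect_date`, `leader`,
    `member1`, `member2`, `member3`, `member4`, `member5`, `member6`,
    COALESCE(`status_code`, ''),            -- status is NOT NULL in the new schema
    `hours_used`, `completed_date`, `category_code`, `staff_comments`,
    `req_request_date`, `req_request_date`  -- no timestamps in the old schema, so reuse the request date
FROM `legacy_requests`;
-- the response_* bit columns have no one-to-one target (response_method is a single
-- varchar), so they would need a separate CASE expression or be left NULL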

SQL Error : "Incorrect syntax near"

I have a database connected to SQL Server 2008, and I get an error when printing from it: Printing aborted! Error 37000 (Microsoft OLE DB Provider for ODBC Drivers) - [Microsoft] [ODBC SQL Server Driver] [SQL Server] Incorrect syntax near 'TEMP_TAB_SHEET_U17'.
Details script as below :
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [data].[TEMP_TAB_SHEET_U17](
[TAL] [varchar](30) NULL,
[Phase] [smallint] NOT NULL,
[SHT] [varchar](30) NULL,
[PreferedSYS] [varchar](30) NULL,
[sysdesc] [varchar](100) NULL,
[EquipmentNumber] [varchar](30) NULL,
[TaskNumber] [varchar](30) NULL,
[TaskId] [int] NULL,
[OTdescription] [varchar](50) NULL,
[OTremark] [varchar](50) NULL,
[Estimated_Man_Hours] [float] NULL,
[EquipmentId] [int] NULL,
[EquipmentDescription] [varchar](50) NULL,
[TAA_ID] [int] NULL,
[UserGroup] [varchar](30) NULL,
[MODULE] [varchar](30) NULL,
[ModuleDesc] [varchar](50) NULL,
[SHTTitle1] [varchar](50) NULL,
[SHTTitle2] [varchar](50) NULL,
[SHTTitle3] [varchar](50) NULL,
[SHTTitle4] [varchar](50) NULL,
[SHTDescription] [varchar](50) NULL,
[SHTNbSections] [int] NULL,
[NbMaxTasksPerSheet] [int] NULL,
[S1_PH] [varchar](30) NULL,
[S1_M] [varchar](30) NULL,
[S1_PF] [varchar](30) NULL,
[S2_PH] [varchar](30) NULL,
[S2_M] [varchar](30) NULL,
[S2_PF] [varchar](30) NULL,
[S3_PH] [varchar](30) NULL,
[S3_M] [varchar](30) NULL,
[S3_PF] [varchar](30) NULL,
[SHTFooterTitle1] [varchar](50) NULL,
[SHTFooterTitle2] [varchar](50) NULL,
[SHTFooterTitle3] [varchar](50) NULL,
[SHTFooterTitle4] [varchar](50) NULL,
[SHTFooterTitle5] [varchar](50) NULL,
[SHTFooterTitle6] [varchar](50) NULL,
[SHTFooterTitle7] [varchar](50) NULL,
[SHTFooterTitle8] [varchar](50) NULL,
[SHTFooterTitle9] [varchar](50) NULL,
[SHTFooterTitle10] [varchar](50) NULL,
[SHTFooterTitle11] [varchar](50) NULL,
[SHTFooterTitle12] [varchar](50) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
ALTER TABLE [data].[TEMP_TAB_SHEET_U17] ADD DEFAULT ((0)) FOR [Phase]
GO
The query completed with the following errors:
Msg 2714, Level 16, State 6, Line 2
There is already an object named 'TEMP_TAB_SHEET_U17' in the database.
Msg 1781, Level 16, State 1, Line 2
Column already has a DEFAULT bound to it.
Msg 1750, Level 16, State 0, Line 2
Could not create constraint. See previous errors.
How can I solve this problem?
Thank you in advance.
Awan
Have you tried setting the constraint at creation time:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [data].[TEMP_TAB_SHEET_U17](
[TAL] [varchar](30) NULL,
[Phase] [smallint] NOT NULL DEFAULT 0,
...
....
GO
SET ANSI_PADDING OFF
GO
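Separately, the Msg 2714 error ("There is already an object named 'TEMP_TAB_SHEET_U17' in the database") means a previous run already created the table, which is also why the DEFAULT is already bound. A guarded drop-and-create (a sketch, assuming the temp table can safely be rebuilt) avoids both messages on re-runs:
IF OBJECT_ID(N'[data].[TEMP_TAB_SHEET_U17]', N'U') IS NOT NULL
    DROP TABLE [data].[TEMP_TAB_SHEET_U17];
GO
CREATE TABLE [data].[TEMP_TAB_SHEET_U17](
    [TAL] [varchar](30) NULL,
    [Phase] [smallint] NOT NULL DEFAULT ((0))   -- default bound at creation, so no separate ALTER is needed
    -- ... remaining columns as in the original script ...
) ON [PRIMARY]
GO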