Copy Scalar Functions from one Database to another - sql-server-2008

How do I copy just the Scalar Functions from one Database to another? I'm not worried about copying any tables or data. I tried performing an Export Task but that seemed to only let me move tables.

These steps were done on SQL Server 2008 R2 in SSMS.
In short, I used Tasks -> Generate Scripts... instead of Script Database as -> Create To. The latter only returns a SQL script that creates the database itself (CREATE DATABASE, ALTER DATABASE, and filegroup settings) without creating any of the other objects in the database (tables, views, or functions).
Here are the exact steps:
Right-click the database that contains the functions you want and go to Tasks -> Generate Scripts...
Click through the first screen of the wizard.
Choose User-Defined Functions when asked which object types to script.
Finish the wizard.
Also, this answer, while not an exact match for my situation, is what prompted me to look for the Generate Scripts option.
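
If you prefer a pure T-SQL route instead of the wizard, here is a minimal sketch (my addition, not part of the original answer) that pulls the definitions of every scalar function in a source database so you can review them and run them against the target; 'Northwind' is just a placeholder source database name:

-- Minimal sketch: list the CREATE statements of all scalar functions in the source database.
-- Review the output, then run it in the target database.
-- 'Northwind' is a placeholder; substitute your source database name.
SELECT m.definition
FROM Northwind.sys.sql_modules AS m
JOIN Northwind.sys.objects AS o
    ON o.object_id = m.object_id
WHERE o.type = 'FN';   -- FN = SQL scalar function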

-- This script copies (CREATE OR ALTER) a single scalar function from one database to another
-- *** Note that all objects referenced by the function must already exist in the target database ***
declare @SourceDatabase nvarchar(50);
declare @SourceSchemaName nvarchar(50);
declare @TargetDatabase nvarchar(50);
declare @FunctionName nvarchar(50);

set @SourceDatabase = N'Northwind'        -- The name of the source database
set @SourceSchemaName = N'dbo'            -- The name of the function's SCHEMA
set @FunctionName = N'WriteToTextFile'    -- The name of the function
set @TargetDatabase = N'AdventureWorks'   -- The name of the target database

declare @sql nvarchar(max)

-- If the schema does not exist in the target database, create it
set @sql = ' use [' + @TargetDatabase + '] ' +
    ' IF NOT EXISTS (SELECT * FROM sys.schemas WHERE lower(name) = lower(''' + @SourceSchemaName + ''')) ' +
    ' BEGIN ' +
    '     EXEC('' CREATE SCHEMA ' + @SourceSchemaName + ''') ' +
    ' END'
exec (@sql);

-- CREATE OR ALTER the function in the target database
set @sql = ''
set @sql = @sql + ' use [' + @TargetDatabase + '] ;' +
    ' declare @sql2 nvarchar(max) ;' +
    ' SELECT @sql2 = coalesce(@sql2, '';'') + [ROUTINE_DEFINITION] + '' ; '' ' +
    ' FROM [' + @SourceDatabase + '].[INFORMATION_SCHEMA].[ROUTINES] ' +
    ' where ROUTINE_TYPE = ''FUNCTION'' and ROUTINE_SCHEMA = ''' + @SourceSchemaName + ''' and lower(ROUTINE_NAME) = lower(N''' + @FunctionName + ''') ; ' +
    ' set @sql2 = replace(@sql2, ''CREATE FUNCTION'', ''CREATE OR ALTER FUNCTION'') ' +
    ' exec (@sql2)'
exec (@sql)
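
To confirm the copy worked, a quick check against the target database's metadata (AdventureWorks and WriteToTextFile here, matching the example values above) might look like this:

-- Sanity check: does the function now exist in the target database?
SELECT ROUTINE_SCHEMA, ROUTINE_NAME
FROM [AdventureWorks].INFORMATION_SCHEMA.ROUTINES
WHERE ROUTINE_TYPE = 'FUNCTION'
  AND ROUTINE_NAME = N'WriteToTextFile';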

Related

OPENJSON in compatibility level 100 SQL SERVER 2016

I need to use the functionality of OPENJSON() in an old database with compatibility level 100. The server runs SQL Server 2016. So I came up with this idea: create another DB "GeneralUTILS" (level 130) on the same server and call this function from the level 100 DB:
CREATE FUNCTION [dbo].[OPENJSON_](@json NVARCHAR(MAX))
RETURNS @Results TABLE ([Key] NVARCHAR(4000), [Value] NVARCHAR(MAX), [Type] INT)
AS
BEGIN
    INSERT INTO @Results
    SELECT * FROM OPENJSON(@json);
    RETURN;
END
But I can't use the WITH clause to shape the output table from the level 100 database.
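
For context, the wrapper above lives in the level 130 database and is meant to be called cross-database from the level 100 one, along these lines (GeneralUTILS is the database name from the question; the sample JSON is my own placeholder):

-- Hypothetical call from the level 100 database to the wrapper in GeneralUTILS.
DECLARE @json NVARCHAR(MAX) = N'{"a": 1, "b": "two"}';
SELECT [Key], [Value], [Type]
FROM GeneralUTILS.dbo.OPENJSON_(@json);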
The most important question might be why you need this at all...
I hope I understood correctly what you need:
(Hint: this needs at least SQL Server 2016.)
--create two mock-up databases
CREATE DATABASE dbOld;
GO
ALTER DATABASE dbOld SET COMPATIBILITY_LEVEL = 100; --v2008
GO
CREATE DATABASE dbForJsonIssues;
GO
ALTER DATABASE dbForJsonIssues SET COMPATIBILITY_LEVEL = 130; --v2016
GO
--Now we will create a stored procedure in the "higher" database
USE dbForJsonIssues;
GO
--Attention: replacing FROM is a very hacky way... Read the hints at the end...
--You might use parameters for the JSON string and the JSON path, but then you must use sp_executesql
CREATE PROCEDURE EXEC_Json_Command @Statement NVARCHAR(MAX), @TargetTable NVARCHAR(MAX)
AS
BEGIN
    DECLARE @statementWithTarget NVARCHAR(MAX) = REPLACE(@Statement, 'FROM', CONCAT(' INTO ', @TargetTable, ' FROM'));
    PRINT @statementWithTarget; --you can comment this line out...
    EXEC(@statementWithTarget);
END
GO
--Now we go into the "lower" database
USE dbOld;
GO
--A synonym is not necessary, but allows for easier code
CREATE SYNONYM dbo.ExecJson FOR dbForJsonIssues.dbo.EXEC_Json_Command;
GO
--This is how to use it
DECLARE @json NVARCHAR(MAX) = N'[{"someObject":[{"attr1":"11", "attr2":"12"},{"attr1":"21", "attr2":"22"}]}]';
DECLARE @Statement NVARCHAR(MAX) = CONCAT(N'SELECT * FROM OPENJSON(N''', @json, N''',''$[0].someObject'') WITH(attr1 INT, attr2 INT)');
--the target table will be created "on the fly"
--You could use ##SomeTarget too, but be careful with concurrency in both approaches...
EXEC ExecJson @Statement = @Statement, @TargetTable = 'dbOld.dbo.SomeTarget';
SELECT * FROM SomeTarget;
--We can drop this table after dealing with the result
DROP TABLE SomeTarget;
GO
--Clean-up (careful with real data!)
USE master;
GO
DROP DATABASE dbOld;
DROP DATABASE dbForJsonIssues;
The most important concepts:
We cannot use the JSON functions directly within the low-compatibility database, but we can build the statement as a string, pass it to the stored procedure, and use EXEC() to execute it there.
Using SELECT * INTO SomeDb.SomeSchema.SomeTargetTable FROM ... creates a table with the fitting structure. Make sure to use a table name that does not already exist in your database.
It is not strictly necessary to pass the target table as a parameter; you could place it in the statement yourself. Replacing the FROM inside the stored procedure is a rather crude trick and could cause trouble if FROM appears anywhere else in the statement (see the sketch just after this list).
You might use similar procedures for various needs...
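
One way to avoid the FROM-replacement hazard (my own sketch, not part of the original answer) is to have the caller put an explicit placeholder token into the statement and replace that token instead of the first FROM:

--Sketch: a variant of EXEC_Json_Command that replaces an explicit /*INTO*/ token
--rather than the keyword FROM, so real FROM clauses are never touched.
CREATE PROCEDURE EXEC_Json_Command_Token @Statement NVARCHAR(MAX), @TargetTable NVARCHAR(MAX)
AS
BEGIN
    DECLARE @statementWithTarget NVARCHAR(MAX) =
        REPLACE(@Statement, '/*INTO*/', CONCAT(' INTO ', @TargetTable, ' '));
    EXEC(@statementWithTarget);
END
GO
--The caller builds the statement with the token where INTO should go, e.g.:
--SELECT * /*INTO*/ FROM OPENJSON(...) WITH(...)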
Yeah. No way this would pass a smoke test at our office. Anyway, someone asked me to do something similar, but the use case was for parsing JSON arrays only. Since JSON_QUERY and JSON_VALUE are available, I hacked this together just to give them something to work with. My colleague liked the results. Turns out he's much cooler than I am after he modified it.
Declare @Fields NVarchar(2000) = 'Name,Coolness';
Declare @Delimiter As Varchar(10) = ',';
Declare @Xml As Xml = Cast(('<V>' + Replace(@Fields, @Delimiter, '</V><V>') + '</V>') As Xml);
Declare @Json Nvarchar(4000) = N'{"Examples":[{"Name": "Chris","Coolness": "10"},{"Name": "Jay","Coolness": "1"}]}';

Exec ('Begin Try Drop Table #JsonTemp End Try Begin Catch End Catch');
Create Table #JsonTemp (JsonNode Nvarchar(1000));

Declare @Max Integer = 100;
Declare @Index Integer = 0;

While @Index < @Max
Begin
    Declare @Affected Integer = 0;
    Declare @Select Nvarchar(200) = '''' + 'lax$.Examples[' + Convert(Nvarchar, @Index) + ']' + '''';
    Declare @Statement Nvarchar(2000) = 'Select Json_Query(' + '''' + @Json + '''' + ', ' + @Select + ') Where Json_Query(' + '''' + @Json + '''' + ', ' + @Select + ') Is Not Null';
    Insert Into #JsonTemp (JsonNode) Exec sp_executesql @Statement;
    Set @Affected = @@RowCount;
    If (@Affected = 0) Begin Break End
    Set @Index = @Index + 1;
End

Declare @Table Table(Field NVarchar(200));
Declare @Selector NVarchar(500) = 'Json_Value(' + '''' + '{"Node":' + '''' + ' + ' + 'JsonNode' + ' + ' + '''' + '}' + '''' + ', ' + '''' + '$.Node.@Field' + '''' + ')';

Insert Into @Table(Field)
Select N.value('.', 'Varchar(10)') As Field
From @Xml.nodes('V') As A(N);

Declare @Selectors Varchar(8000);
Select @Selectors = Coalesce(@Selectors + ', ', '') + Replace(@Selector, '@Field', Field) + ' As ' + Field
From @Table;

Exec ('Select ' + @Selectors + ' From #JsonTemp');

Drop All constraints in a Table

I am trying to write a script to remove constraints.
I have the query below to select the foreign key constraints in my database:
SELECT name
FROM sys.foreign_keys
And I have written ALTER scripts based on that query:
SELECT
'ALTER TABLE ' + OBJECT_NAME(parent_object_id) +
' DROP CONSTRAINT ' + name
FROM sys.foreign_keys
Using the above query, how can I execute the generated statements?
I could use DROP DATABASE DBName, but I am just trying to drop the tables by dropping their constraints first.
Is it possible without going for an SP? Or are there any easier ways I can proceed?
Well you can always copy the output from the bottom pane, paste it into the top pane, and hit F5. Or you can build a string to execute directly:
DECLARE @sql NVARCHAR(MAX) = N'';

SELECT @sql += N'
ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id))
    + '.' + QUOTENAME(OBJECT_NAME(parent_object_id)) +
' DROP CONSTRAINT ' + QUOTENAME(name) + ';'
FROM sys.foreign_keys;

PRINT @sql;
-- EXEC sp_executesql @sql;
(When you are happy with the PRINT output, comment out the PRINT and uncomment the EXEC. Note that the PRINT output will be truncated to 8K in Management Studio, but the variable really does hold the entire command.)
Also I don't know how this really relates to whether you are using a stored procedure or not, or why you are trying to do it "w/o going for SP"... this query can be run as a stored procedure or not, it all depends on how often you're going to call it, where the procedure lives, etc.
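
If you do want to inspect the whole command rather than the truncated PRINT output, one workaround (just a sketch, assuming @sql has been built as above) is to PRINT it in chunks:

-- Sketch: PRINT the command in 4000-character chunks so nothing is lost to the 8K limit.
-- Chunk boundaries may fall mid-line, which is fine for visual inspection.
DECLARE @pos INT = 1, @len INT = LEN(@sql);
WHILE @pos <= @len
BEGIN
    PRINT SUBSTRING(@sql, @pos, 4000);
    SET @pos += 4000;
END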
This worked for me in SQL Server 2008:
DECLARE @SQL NVARCHAR(MAX) = N'';

SELECT @SQL += N'
ALTER TABLE ' + OBJECT_NAME(PARENT_OBJECT_ID) + ' DROP CONSTRAINT ' + OBJECT_NAME(OBJECT_ID) + ';'
FROM SYS.OBJECTS
WHERE TYPE_DESC LIKE '%CONSTRAINT' AND OBJECT_NAME(PARENT_OBJECT_ID) = 'YOUR_TABLE';

PRINT @SQL
--EXECUTE(@SQL)
Of course, uncomment the EXECUTE(@SQL) when you are ready to run it.
The accepted answer did not work for me, but this did in SQL Server 2017:
DECLARE @sql NVARCHAR(MAX) = N'';

SELECT @sql += N'
ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id))
    + '.' + QUOTENAME(OBJECT_NAME(parent_object_id)) +
' DROP CONSTRAINT ' + QUOTENAME(name) + ';'
FROM sys.objects
WHERE type_desc LIKE '%CONSTRAINT'
AND OBJECT_NAME(parent_object_id) LIKE 'your_table_name';

EXEC sp_executesql @sql;
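
One caveat not covered above (my own addition): if other tables have foreign keys pointing at your_table_name, its primary or unique key cannot be dropped until those inbound foreign keys are removed. A sketch that generates those drops first:

-- Sketch: drop foreign keys in OTHER tables that reference your_table_name,
-- so that the table's own primary/unique constraints can be dropped afterwards.
DECLARE @fkSql NVARCHAR(MAX) = N'';

SELECT @fkSql += N'
ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id))
    + '.' + QUOTENAME(OBJECT_NAME(parent_object_id)) +
' DROP CONSTRAINT ' + QUOTENAME(name) + ';'
FROM sys.foreign_keys
WHERE referenced_object_id = OBJECT_ID(N'dbo.your_table_name');

PRINT @fkSql;
-- EXEC sp_executesql @fkSql;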

Creating 'util' - stored procedure section, as with .net helper classes

A few minutes ago I was searching for a simple SQL Server query that copies a table row.
This comes up from time to time when working on an ASP.NET project and testing data with queries
inside SQL Server Management Studio. One of the routine actions is copying a row, altering the required columns so they differ from each other, and then testing the data with queries.
So I came across this stored procedure, given as an answer by Dan Atkinson.
But adding it to where all my non-testing procedures are stored led me to wonder:
is it possible to store these in some sorted order, so I could distinguish
'utils' or 'testing purpose' ones from those used in the actual project?
(The default folder in the Management Studio tree view is Programmability.) Could they go into another folder,
or is that not an option?
If not, I thought of a Utils. prefix, like this (if no other way exists):
dbo.Utils.CopyTableRow
dbo.Utils.OtherRoutineActions ...
Or is there a designated way to achieve what I have in mind?
This is the first "util" stored procedure I've made; the only solution I found was
prefixing it with Util_:
ALTER PROCEDURE [dbo].[Utils_TableRowCopy](
    @TableName VARCHAR(50),
    @RowNumberToCopy INT
)
AS
BEGIN
    declare @RowIdentity sysname =
        (SELECT name FROM sys.identity_columns WHERE object_id = object_id(@TableName));

    DECLARE @columns VARCHAR(5000), @query VARCHAR(8000);
    SET @query = '';

    SELECT @columns =
        CASE
            WHEN @columns IS NULL THEN column_name
            ELSE @columns + ',' + column_name
        END
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE (
        TABLE_NAME = LTRIM(RTRIM(@TableName))
        AND
        column_name <> LTRIM(RTRIM(@RowIdentity))
    );

    SET @query = 'INSERT INTO ' + @TableName + ' (' + @columns + ') SELECT ' + @columns + ' FROM ' + @TableName + ' WHERE ' + @RowIdentity + ' = ' + CAST(@RowNumberToCopy AS VARCHAR);
    --SELECT SCOPE_IDENTITY();
    declare @query2 VARCHAR(100) = ' Select Top 1 * FROM ' + @TableName + ' Order BY ' + @RowIdentity + ' desc';

    EXEC (@query);
    EXEC (@query2);
END
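
For the record, a call would look something like this (the table name and identity value here are hypothetical):

-- Hypothetical usage: copy the row with identity value 42 from dbo.Customers
-- and return the newly inserted row (the procedure selects TOP 1 ... DESC).
EXEC [dbo].[Utils_TableRowCopy] @TableName = 'Customers', @RowNumberToCopy = 42;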

sql update with dynamic column names

EDIT: Database names have been modified for simplicity
I'm trying to get some dynamic SQL in place to update static copies of some key production tables in another database (SQL Server 2008 R2). The aim here is to allow consistent dissemination of data (from the 'static' database) for a certain period of time, as our production databases are updated almost daily.
I am using a CURSOR to loop through a table that contains the objects that are to be copied into the 'static' database.
The prod tables don't change that frequently, but I'd like to make this somewhat "future proof" (if possible!) and extract the column names from INFORMATION_SCHEMA.COLUMNS for each object (instead of using SELECT * FROM ...).
1) From what I have read in other posts, EXEC() seems limiting, so I believe I'll need to use EXEC sp_executesql, but I'm having a little trouble getting my head around it all.
2) As an added extra, if at all possible, I'd also like to exclude some columns for particular tables (structures vary slightly in the 'static' database).
Here's what I have so far.
When executed, @colnames returns NULL and therefore @sql returns NULL...
Could someone guide me to where I might find a solution?
Any advice or help with this code is much appreciated.
CREATE PROCEDURE sp_UpdateRefTables
    @debug bit = 0
AS
declare @proddbname varchar(50),
        @schemaname varchar(50),
        @objname varchar(150),
        @wherecond varchar(150),
        @colnames varchar(max),
        @sql varchar(max),
        @CRLF varchar(2)

set @wherecond = NULL;
set @CRLF = CHAR(10) + CHAR(13);

declare ObjectCursor cursor for
    select databasename, schemaname, objectname
    from Prod.dbo.ObjectsToUpdate

OPEN ObjectCursor;

FETCH NEXT FROM ObjectCursor
INTO @proddbname, @schemaname, @objname;

while @@FETCH_STATUS = 0
begin
    if @objname = 'TableXx'
        set @wherecond = ' AND COLUMN_NAME != ''ExcludeCol1'''
    if @objname = 'TableYy'
        set @wherecond = ' AND COLUMN_NAME != ''ExcludeCol2'''

    --extract column names for current object
    select @colnames = coalesce(@colnames + ',', '') + QUOTENAME(column_name)
    from Prod.INFORMATION_SCHEMA.COLUMNS
    where TABLE_NAME = + QUOTENAME(@objname,'') + isnull(@wherecond,'')

    if @debug=1 PRINT '@colnames= ' + isnull(@colnames,'null')

    --replace all data for @objname
    --@proddbname is used as schema name in Static database
    SELECT @sql = 'TRUNCATE TABLE ' + @proddbname + '.' + @objname + '; ' + @CRLF
    SELECT @sql = @sql + 'INSERT INTO ' + @proddbname + '.' + @objname + ' ' + @CRLF
    SELECT @sql = @sql + 'SELECT ' + @colnames + ' FROM ' + @proddbname + '.' + @schemaname + '.' + @objname + '; '

    if @debug=1 PRINT '@sql= ' + isnull(@sql,'null')

    EXEC sp_executesql @sql

    FETCH NEXT FROM ObjectCursor
    INTO @proddbname, @schemaname, @objname;
end

CLOSE ObjectCursor;
DEALLOCATE ObjectCursor;
P.S. I have read about SQL injection, but as this is an internal admin task, I'm guessing I'm safe here!? Any advice on this is also appreciated.
Many thanks in advance.
You have a mix of SQL and dynamic SQL in your query against information_schema. Also QUOTENAME isn't necessary in the where clause and will actually prevent a match at all, since SQL Server stores column_name, not [column_name], in the metadata. Finally, I'm going to change it to sys.columns since this is the way we should be deriving metadata in SQL Server. Try:
SELECT @colnames += ',' + name
FROM Prod.sys.columns
WHERE OBJECT_NAME([object_id]) = @objname
AND name <> CASE WHEN @objname = 'TableXx' THEN 'ExcludeCol1' ELSE '' END
AND name <> CASE WHEN @objname = 'TableYy' THEN 'ExcludeCol2' ELSE '' END;

SET @colnames = STUFF(@colnames, 1, 1, '');
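
One additional detail worth calling out (my addition, not part of the original answer): for the += pattern above to work inside the cursor loop, @colnames has to start as an empty string rather than NULL, and it needs to be reset on every iteration or the column list will carry over from the previous table. Something along these lines at the top of the loop:

-- At the start of each cursor iteration: reset the accumulators so nothing
-- carries over from the previous table (assumes the loop from the question).
SET @colnames = N'';
SET @wherecond = NULL;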

Drop database in SQL Server using wildcard

I have an application that creates a separate database (SQL Server 2008) for each new customer, during testing we end up with a lot of databases called PREFIX.whatever ...
I would love a script that would look for all databases starting with PREFIX. and drop them so we can start a clean test cycle. Any help greatly appreciated.
SELECT ' DROP DATABASE [' + NAME + ']' FROM sys.sysdatabases where name like 'PREFIX%'
Copy the output and execute it to drop the databases matching your criteria. You can also schedule this on a daily basis with a little tweaking.
Update:
We ended up expanding the answer from Baaju, so I thought I would share it. We call the following script from MSBuild and it cleans out all of the existing DBs created during testing:
use master
DECLARE @Name nvarchar(1000);

DECLARE testdb_cursor CURSOR FOR
SELECT 'ALTER DATABASE [' + NAME + '] SET SINGLE_USER WITH ROLLBACK IMMEDIATE; DROP DATABASE [' + NAME + ']'
FROM sys.sysdatabases
WHERE name LIKE 'TCM.%'

OPEN testdb_cursor;

-- Perform the first fetch and store the value in a variable.
FETCH NEXT FROM testdb_cursor
INTO @Name;

-- Check @@FETCH_STATUS to see if there are any more rows to fetch.
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Execute the ALTER/DROP statements built for the current database.
    exec sp_executesql @Name;

    -- This is executed as long as the previous fetch succeeds.
    FETCH NEXT FROM testdb_cursor
    INTO @Name;
END

CLOSE testdb_cursor;
DEALLOCATE testdb_cursor;
Just ran into this and came up with a slight variation that allows immediate execution without cursors:
DECLARE @SQL NVARCHAR(MAX) = ''

SELECT @SQL = @SQL
    + 'ALTER DATABASE [' + [name] + '] SET SINGLE_USER WITH ROLLBACK IMMEDIATE; '
    + 'DROP DATABASE [' + [name] + ']; '
FROM sys.databases
WHERE [name] like 'temp_%' AND create_date < DATEADD(day, -7, GETDATE())

-- display statements
SELECT @SQL

-- execute (uncomment)
--EXEC sp_executesql @SQL
The above deletes any databases whose names start with "temp_" and that are older than 7 days, but obviously it can be adapted to any situation.
DANGER: Mess up your query and you can delete some or all of your databases. I left the EXEC statement commented out just to try to avoid someone doing this through copy/paste.
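
As an extra belt-and-braces measure (my own suggestion, not part of the original answers), you can exclude the system databases explicitly so a botched pattern can never touch them:

-- Defensive sketch: never generate DROP statements for system databases,
-- even if the LIKE pattern is wrong. database_id 1-4 = master, tempdb, model, msdb.
DECLARE @SQL NVARCHAR(MAX) = ''

SELECT @SQL = @SQL
    + 'ALTER DATABASE [' + [name] + '] SET SINGLE_USER WITH ROLLBACK IMMEDIATE; '
    + 'DROP DATABASE [' + [name] + ']; '
FROM sys.databases
WHERE [name] LIKE 'temp_%'
  AND database_id > 4
  AND [name] NOT IN ('master', 'tempdb', 'model', 'msdb')

-- review before uncommenting the EXEC
SELECT @SQL
--EXEC sp_executesql @SQL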