Assigning temporary table values to variables - sql-server-2008

I'm working on a task where I need to get the data from all the tables in the database without knowing the table names in advance. My plan is to write a query that collects all the table names and column names and stores them in a temporary table.
Once I have a table name and a column name, I need to store the values in variables such as @Table_Name and @Column_Name, then loop over the table names and build a query like this:
'select * from ' + @Table_Name + ' where ' + @Column_Name + ' = 1'
Earlier I used a cursor to fetch the values into variables and process them in a loop, but the cursor seems to consume more time and memory, so I thought of switching to a temporary table.
My question is: is it possible to assign temporary table values to a variable for later use? If so, how?

Your idea may not work as stated, because you are trying to apply the concept of arrays to SQL Server. SQL Server does not have arrays; it has tables. You can get the data you need by creating a temporary table and populating it with a query such as
select * from sys.tables
Then do your operation by loading each row from this table in turn. After your work is done, by all means drop the table.
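A minimal T-SQL sketch of this cursor-free, row-by-row approach might look like the following. The work list, variable names, and the `= 1` filter are illustrative; in practice you would restrict the column list to columns where comparing against 1 makes sense.

```sql
-- Build a numbered work list of table/column pairs (illustrative filter).
SELECT t.name AS Table_Name, c.name AS Column_Name,
       ROW_NUMBER() OVER (ORDER BY t.name, c.name) AS RowNum
INTO #WorkList
FROM sys.tables t
JOIN sys.columns c ON c.object_id = t.object_id;

DECLARE @i INT = 1, @max INT,
        @Table_Name SYSNAME, @Column_Name SYSNAME,
        @sql NVARCHAR(MAX);
SELECT @max = MAX(RowNum) FROM #WorkList;

WHILE @i <= @max
BEGIN
    SELECT @Table_Name = Table_Name, @Column_Name = Column_Name
    FROM #WorkList WHERE RowNum = @i;

    -- QUOTENAME guards against unusual identifiers in the dynamic SQL.
    SET @sql = N'SELECT * FROM ' + QUOTENAME(@Table_Name)
             + N' WHERE ' + QUOTENAME(@Column_Name) + N' = 1';
    EXEC sp_executesql @sql;

    SET @i = @i + 1;
END

DROP TABLE #WorkList;
```

Reading one row at a time by `ROW_NUMBER` position avoids the cursor entirely, which is the substitution the question is asking about.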

Related

Create a table with one column from a list of values as input to stored procedure

I am receiving a list as input to a stored procedure in MySQL 5.6 and need to create a temporary table with a single column (listOfUsers). Each item in the list needs to be its own row in this column.
All the answers I've seen so far, show the list being used in a WHERE clause to filter a query. I am not trying to filter anything, just create a table with one column from a list.
Is this possible?
I am going to assume some specifics in the example; I hope my assumptions are relevant to your problem. Suppose we have a comma-delimited list on one line in a file. Then we can do:
create table t1 (s varchar(50));
load data local infile '/tmp/file.txt' into table t1
lines terminated by ',';
update t1 set s = replace(s, "\n", "");
The final UPDATE is needed to remove the spurious newline from the last value. Something based on this idea will hopefully solve your problem.
If the data is not coming from a file, a simple solution that requires minimal application coding is to put the data in a temporary file and then apply the method above. Alternatively, you can parse the input in the application and build a multi-row insert.
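The multi-row insert alternative can be sketched like this (the values are illustrative; the application splits the input string and emits one VALUES tuple per item):

```sql
CREATE TEMPORARY TABLE t1 (listOfUsers VARCHAR(50));

-- The application splits 'alice,bob,carol' and builds a single statement:
INSERT INTO t1 (listOfUsers) VALUES ('alice'), ('bob'), ('carol');
```

One round trip, no temporary file, and each list item lands in its own row.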

Oracle Trigger: How to put OLD and NEW values as JSON for multiple tables

I'm wondering whether it is possible in Oracle 12c to write one procedure that is called on update by triggers on multiple tables with different columns.
From my understanding I have two pseudo-records, :OLD and :NEW, and I'm creating triggers that fire AFTER UPDATE and FOR EACH ROW. Is it possible to pass a whole row (:OLD or :NEW) to some function such as JSON_OBJECT, which would serialize the row and produce output that can be stored in an audit table?
The main reason is to avoid keeping a separate list of column names in every trigger: the lists differ per table, and every change to a table's structure would require changing its trigger.
Or maybe I'm wrong, and you can suggest how to solve this properly?
For test cases I have something like this:
AUDIT_TBL (ID, TABLE_NAME,OLD_JSON,NEW_JSON,DATE);
TABLE1 (ID,KEY,VALUE);
TABLE2 (ID,NAME,SURNAME,KEY);
TABLE3 (KEY,COLUMN1,COLUMN2,COLUMN3);
I was planning to have triggers on TABLE1, TABLE2, and TABLE3 that run a procedure taking the row (:OLD or :NEW) as a parameter, get a result, and put it into AUDIT_TBL.
Any suggestions or ideas on how to do this properly and painlessly?

Get hold of the columns names of a temporary table in mysql

I need to automatically pivot the results of a query execution in mysql.
The problem is that the results are dynamic, depending on the parameters of the query I'm executing.
So to create a dynamic pivoting procedure, I need to get hold of the columns returned by the query I'm executing.
I'm saving the results into an in-memory temporary table.
If I do "DESCRIBE table" or "SHOW COLUMNS FROM table", I can see the columns, but I cannot get hold of them programmatically to perform actions with the column names.
As it's a temporary table, the information_schema.columns does not contain any information either.
So the questions are:
1 - Can I get hold (in MySQL) of the results of the DESCRIBE command?
2 - Is there another way to get hold of the column names of a temporary table?
Thanks in advance!

MySQL - Automate a view update

I'd like to know if it is possible to create an SQL routine that will automatically update my view and add new tables in my DB to it.
My DB consists of multiple tables (with the same data structure) named like "MM_DD". I would like to create a VIEW that unions all this data (pretty simple; see the query below), but I wish to automate the process so that every time a new table is added, the view is updated.
CREATE OR REPLACE VIEW `viewTable` AS
select *,
md5(CONCAT(`columnA`,`columnB`)) AS myPK_id
from `05_06`
union all
select *,
md5(CONCAT(`columnA`,`columnB`)) AS myPK_id
from `05_08`
etc...
What I am doing at the moment is using PHP: every time a table is added, it loops through the tables and creates/updates the view.
select * from information_schema.tables WHERE table_name LIKE '%05%'
Now that I have an array of table names -> create my Query string -> replace view...
Is it possible to do this in SQL?
If so, what is the correct approach?
Write a stored procedure to rebuild your view. You'll need to use what MySQL internally calls "prepared statements" so you can use string manipulation.
You'll still use your SELECT ... FROM information_schema.tables query to drive this stored procedure.
Once you get this working, you can embed it in a MySQL event and arrange to run it automatically, for example at a fixed time late at night.
Or, you can invoke the stored procedure immediately after you create one of these tables.
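A sketch of such a stored procedure, assuming the table-name pattern and view definition from the question (the procedure name is illustrative, and GROUP_CONCAT's result is subject to the group_concat_max_len limit, which you may need to raise for many tables):

```sql
DELIMITER $$
CREATE PROCEDURE rebuild_view()
BEGIN
    DECLARE body TEXT;

    -- Build one SELECT branch per matching table, joined by UNION ALL.
    SELECT GROUP_CONCAT(
             CONCAT('SELECT *, md5(CONCAT(`columnA`,`columnB`)) AS myPK_id FROM `',
                    table_name, '`')
             SEPARATOR ' UNION ALL ')
      INTO body
      FROM information_schema.tables
     WHERE table_schema = DATABASE()
       AND table_name LIKE '%05%';

    -- Prepared statements allow executing DDL built from a string.
    SET @ddl = CONCAT('CREATE OR REPLACE VIEW `viewTable` AS ', body);
    PREPARE stmt FROM @ddl;
    EXECUTE stmt;
    DEALLOCATE PREPARE stmt;
END$$
DELIMITER ;
```

Calling `CALL rebuild_view();` after creating a new table (or from a scheduled event) replaces the view with one that covers every matching table.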

Attempting to create variable variables in mysql ONLY. (Not using PHP)

Context
I have a dynamically generated .sql file that contains a series of insert statements which are going to be used for inserting data into many different tables that are dependent on other tables.
After each insert statement is run, if the target table has an auto-incremented id column, the text "SET @autoIncrementColumnName = LAST_INSERT_ID();" is generated, which stores the last insert id of that statement in a MySQL variable. If there is another INSERT statement for that table, the process repeats. The problem is that each "SET @autoIncrementColumnName = LAST_INSERT_ID();" overwrites the previous value before the script is able to use the variable later on in the .sql file.
So then later on in the .sql script where you see two lines like these:
INSERT INTO relatedTable (col1,col2,specialColumn,col3,col4) VALUES ('','',@autoIncrementColumnName,'','');
INSERT INTO relatedTable (col1,col2,specialColumn,col3,col4) VALUES ('','',@autoIncrementColumnName,'','');
It needs to insert the value stored earlier, but every variable except the last one has already been overwritten.
Two Questions
Is it possible to create variable variables using only MySQL? Like this:
SET @dynamicVarName = CONCAT('guestCreditCardId', LAST_INSERT_ID());
SET @@dynamicVarName = LAST_INSERT_ID();
If variable variables are not possible, what solution could I use?
While looking into the problem, I found that I could avoid headaches by creating multiple functions/methods, each responsible for a specific part of the SQL generation. That way, later down the road, if you need to create another dynamic SQL statement, you can place it where you need it by calling another function/method from wherever you need it.
For instance
If you are generating "INSERT INTO" statements for 5 tables, and 3 of those tables have records that are not used in the creation of other dynamic SQL statements, then one function/method can handle all the tables that don't require data from other tables.
For any special cases
Then create separate functions/methods for special cases.
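Since MySQL user variables cannot be addressed indirectly, a common workaround (distinct from restructuring the generator) is to store each saved id in a key-value scratch table and look it up by name later in the script. The table and the parentTable insert below are illustrative:

```sql
-- A scratch table stands in for "variable variables".
CREATE TEMPORARY TABLE saved_ids (
    var_name VARCHAR(64) PRIMARY KEY,
    saved_id BIGINT
);

INSERT INTO parentTable (colA) VALUES ('x');
INSERT INTO saved_ids VALUES ('guestCreditCardId1', LAST_INSERT_ID());

-- Later in the .sql file, read the value back by name:
INSERT INTO relatedTable (col1, col2, specialColumn, col3, col4)
SELECT '', '', saved_id, '', '' FROM saved_ids
WHERE var_name = 'guestCreditCardId1';
```

Because each saved id lives in its own row keyed by a generated name, nothing is overwritten, no matter how many inserts the generator emits.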