How do I execute multiple SQL Statements in Access' Query Editor? - ms-access

I have a text file with a few SQL statements in it that I want to run
on an Access database. I thought that should be possible with Access'
Query Editor. So, I go into this editor and paste the statements:
insert into aFewYears (yr) values ('2000')
insert into aFewYears (yr) values ('2001')
insert into aFewYears (yr) values ('2002')
insert into aFewYears (yr) values ('2003')
Trying to run them (by hitting the red exclamation mark) I receive a
Missing semicolon (;) at end of SQL statement.
This could be taken as an indication that the editor allows executing
multiple statements. So, I change the statements and append such a
semicolon to the end of each:
insert into aFewYears (yr) values ('2000');
insert into aFewYears (yr) values ('2001');
insert into aFewYears (yr) values ('2002');
insert into aFewYears (yr) values ('2003');
Then I get a
Characters found after end of SQL statement.
which probably could be taken as an indication that it is not possible
to execute multiple statements.
OK, so the question: is it possible to execute multiple statements in the
query editor, or is it possible to somehow batch-execute SQL statements in a
file in/on/against Access?
Thanks / Rene
edit The insert statements were used as an example, and I realize that they are less than perfect, because they all go to the same table and such a thing could obviously be solved by one statement that uses a UNION or something. In the actual case that I am trying to solve, the file contains not only insert statements but also create table statements and insert statements with different underlying tables. So I hoped (and still hope) that there is something like my beloved SQL*Plus for Oracle that can execute a file with all kinds of SQL statements.

You can easily write a bit of code that will read in a file. You can either assume one SQL statement per line, or use the ; as the statement separator.
So, assuming you have a text file such as:
insert into tblTest (t1) values ('2000');
update tbltest set t1 = '2222'
where id = 5;
insert into tblTest (t1,t2,t3)
values ('2001','2002','2003');
Note that in the above text file we are free to have SQL statements span more than one line.
The code you can use to read and run the above script is:
Sub SqlScripts()
    Dim vSql As Variant
    Dim vSqls As Variant
    Dim strSql As String
    Dim intF As Integer

    ' Read the whole script file into one string
    intF = FreeFile()
    Open "c:\sql.txt" For Input As #intF
    strSql = Input(LOF(intF), #intF)
    Close #intF

    ' Split on ";" and execute each statement
    vSql = Split(strSql, ";")
    On Error Resume Next
    For Each vSqls In vSql
        CurrentDb.Execute vSqls
    Next
End Sub
You could expand on this by placing some error message if one statement doesn't work, such as:
If Err.Number <> 0 Then
    Debug.Print "sql err " & Err.Description & " --> " & vSqls
End If
Regardless, the above Split() and string read does allow your SQL to span more than one line...

Unfortunately, AFAIK you cannot run multiple SQL statements under one named query in Access in the traditional sense.
You can make several queries, then string them together with VBA (DoCmd.OpenQuery if memory serves).
You can also string a bunch of things together with UNION if you wish.
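For example, a minimal VBA sketch of the DoCmd.OpenQuery approach, assuming the statements have already been saved as named action queries (the query names here are made up):
' Run several saved action queries one after another.
Sub RunBatch()
    DoCmd.SetWarnings False          ' suppress the per-query confirmation prompts
    DoCmd.OpenQuery "qryCreateYears"
    DoCmd.OpenQuery "qryInsert2000"
    DoCmd.OpenQuery "qryInsert2001"
    DoCmd.SetWarnings True
End Sub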

Better to just create an XLSX file with the field names in the top row.
Create it manually or using Mockaroo.
Export it to Excel (or CSV) and then import it into Access using New Data Source -> From File.
IMHO it's the best and most performant way to do it in Access.

"I hoped (and still hope) that there is something like my beloved SQL*Plus for Oracle that can execute a file with all kinds of SQL Statements."
If you're looking for a simple program that can import a file and execute the SQL statements in it, take a look at DBWConsole (freeware). I have used it to process DDL scripts (table schema) as well as action queries. It does not return data sets so it's not useful for SELECT queries. It supports single line comments prefixed by -- but not multi-line comments wrapped in /* */. It supports command line parameters.
If you want an interactive UI like Oracle SQL Developer or SSMS for Access then Matthew Lock's reference to WinSQL is what you should try.

You might find it better to use a 3rd-party program to enter the queries into Access, such as WinSQL. I think from memory WinSQL supports multiple queries via its batch feature.
I ultimately found it easier to just write a program in Perl to do bulk INSERTs into Access via ODBC. You could use VBScript or any language that supports ODBC though.
You can then do anything you like and have your own complicated logic to handle the importing.
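For illustration, a minimal VBScript sketch of that idea (the answer used Perl; any ODBC-capable language works). The file path, connection string, and table are assumptions, not from the original answer:
' Open the Access database through the Access ODBC driver and run some inserts.
Dim conn
Set conn = CreateObject("ADODB.Connection")
conn.Open "Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=C:\data\mydb.accdb;"

Dim yr
For yr = 2000 To 2003
    conn.Execute "INSERT INTO aFewYears (yr) VALUES ('" & yr & "')"
Next

conn.Close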

Create a VBA procedure like this:
Option Compare Database

Sub a()
    DoCmd.RunSQL "DELETE * FROM TABLENAME WHERE CONDITIONS"
    DoCmd.RunSQL "DELETE * FROM TABLENAME WHERE CONDITIONS"
End Sub

Related

SSIS Execute SQL task based on parameter

Can I do something like the below? Let me know.
IF @parameter=1 BEGIN ...query... END IF @parameter=2
I need the correct syntax, if it is possible.
It's an OLE DB connection.
Not a stored proc, just a SQL query.
DECLARE @param AS INT = ?;

IF @param = 1
BEGIN
    SELECT 1 AS Y;
END
ELSE IF @param = 2
BEGIN
    SELECT 2 AS Y;
END
There are two question marks in your query and probably you were passing only one variable. I have seen code where developers pass the same value twice (or multiple times). This is inefficient. A better way is to receive the passed parameters in SSIS variables. Advantages:
1. You need to pass each value only once.
2. More importantly, if you change the order in which the passed parameters are used in the SQL, you do not need to change their order on the user interface of the Execute SQL Task Editor / Parameters. This is what Andy Leonard suggested later in his response.
You can. Assuming you are referring to an Execute SQL Task, the parameters in an Execute SQL Task using an OLE DB connection utilize question marks (?) as parameter placeholders. You map the placeholders to SSIS variables on the Parameter Mapping page of the Execute SQL Task. In the SQLStatement property you would enter:
If (?=1)
begin
... {some T-SQL here} ...
end
If (?=2)
begin
... {some T-SQL here} ...
end
That's one way to accomplish what I think you are asking.
Another way is to create an Execute SQL Task to read the value of @parameter from the database into an SSIS variable. Then you can build two Execute SQL Tasks - one with each option for T-SQL as the SQLStatement property - and use expressions on precedence constraints to determine which Execute SQL Task to execute.
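For example, the expression on the precedence constraint leading to the first task might be the following (a sketch, assuming the SSIS variable is named User::Parameter and the constraint's evaluation operation is set to Expression):
@[User::Parameter] == 1
with @[User::Parameter] == 2 on the constraint leading to the second task.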
Hope this helps,
:{>
You cannot use the Execute SQL Task to run Transact-SQL statements this way.
To set a conditional SQL statement based on what you are trying to achieve, in the Execute SQL Task editor:
On the General tab, leave the SQLStatement blank.
On the Parameter Mapping tab, add a parameter and map the User::Parameter variable to Parameter Name 0.
On the Expressions tab, set the SQLStatementSource to
(DT_NUMERIC, 18, 0) @[User::Parameter] == 1 ? ...query 1... : ...query 2...

Run complex SQL scripts from memo (multi lines)

I'm executing SQL scripts using ADO and MSSQL Server.
Below you will find a first example of a multi-line SQL statement like:
use master;
go;
EXEC sp_detach_db
@dbname=N'DATABASENAME';
go;
I copy these lines from a TMemo to my TADOQuery.SQL.Text, but this fails because the GO statement is not recognized and I get a keyword error from the MSSQL server.
Can I run the whole script as one query, or do I have to split my query into several pieces, separated by the semicolon, and iterate through the whole text?
First of all, your code is not valid (no ; after GO) and has to be like this:
USE master;
GO
EXEC sp_detach_db
@dbname=N'DATABASENAME';
GO
In fact, GO is a delimiter used by SSMS to separate SQL statements.
If you want to use the same scripts as SSMS does, you have to process them the way SSMS does:
Split the script into single parts at the GO delimiter.
Send every part to the database.
You have to send every statement without sending the GO itself.
SQL Server does not interpret GO; this is done by SSMS. A sketch of this approach follows below.
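For illustration, here is a minimal Delphi sketch of that splitting approach, assuming an already-connected TADOConnection (the procedure name and the line-by-line GO detection are my own, not from the original answer):
// Run a script batch by batch, treating lines consisting only of GO
// as batch separators (the way SSMS does).
procedure ExecuteScript(Conn: TADOConnection; Script: TStrings);
var
  Batch: TStringList;
  I: Integer;

  procedure RunBatch;
  begin
    if Trim(Batch.Text) <> '' then
      Conn.Execute(Batch.Text);  // send one batch, without the GO line
    Batch.Clear;
  end;

begin
  Batch := TStringList.Create;
  try
    for I := 0 to Script.Count - 1 do
      if SameText(Trim(Script[I]), 'GO') then
        RunBatch
      else
        Batch.Add(Script[I]);
    RunBatch;  // run whatever follows the last GO
  finally
    Batch.Free;
  end;
end;
Usage would then be something like ExecuteScript(ADOConnection1, Memo1.Lines);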

Bulk Insert not working in SQL Server 2008

Here's my SQL to bulk load a CSV file into SQL Server 2008, but it's returning:
0 row(s) affected.
Code:
USE energyDB
GO
BULK INSERT energydata
FROM 'c:\temp\24544_MSSQL_out.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
The CSV file looks like this (the top line is ignored)
24544,"1970-01-01 10:00:00","8056060 kWh"
24544,"2012-12-04 00:15:00",0.176
24544,"2012-12-04 00:30:00",0.163
24544,"2012-12-04 00:45:00",0.016
Bulk insert doesn't remove quotes from the data. You'll either need to change the file being imported, or import into a staging table where every column is a character field and then strip the quotes and convert the datatypes in a query (see the sketch at the end of this answer). I've just found the following MSDN article:
http://msdn.microsoft.com/en-us/library/ms188609.aspx
It states:
Comma-separated value (CSV) files are not supported by SQL Server bulk-import operations.
And then continues on with a few examples of the situations it will work under.
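A minimal sketch of that staging-table approach, assuming the three-column layout shown in the question (the staging table and column names are illustrative, and rows whose third field is text such as "8056060 kWh" would still need extra cleaning):
-- 1. Stage everything as text so the quoted values load without errors
CREATE TABLE energydata_staging
(
    meter_id  VARCHAR(50),
    read_time VARCHAR(50),
    reading   VARCHAR(50)
);

BULK INSERT energydata_staging
FROM 'c:\temp\24544_MSSQL_out.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 2. Strip the quotes and convert to the real datatypes
INSERT INTO energydata (meter_id, read_time, reading)
SELECT CAST(meter_id AS INT),
       CAST(REPLACE(read_time, '"', '') AS DATETIME),
       CAST(REPLACE(reading, '"', '') AS DECIMAL(18, 3))
FROM energydata_staging;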
This question is a little old, but it showed up when I searched for the problem, so I thought I would provide my solution.
In my case it was a simple mistake of not inserting into the right table, so be sure to check the table that you are inserting into. In trying to figure out what was going on though, I found that you can have the bulk insert process create an error file that will hopefully guide you in the right direction. To do this, you can use something like ERRORFILE ='E:\Error.txt'. This should output the error you are receiving to a file called Error.txt. I've provided a full example below:
BEGIN TRANSACTION
BEGIN TRY
BULK INSERT [Table Name]
FROM 'E:\FileName.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', ERRORFILE ='E:\Error.txt')
COMMIT TRANSACTION
END TRY
BEGIN CATCH
ROLLBACK TRANSACTION
END CATCH
You'll notice that I've wrapped the bulk insert process in a Transaction. This is so that if any issue occurs during the bulk insert it will rollback everything and I won't get a partial data import.

Save SQL statement in Memo/Text Field

I am building a Batch table in an Access database to save operations from a form to be processed after the user clicks the submit button (on the form).
My only concern is that the SQL statements themselves will have text qualifiers in them. When I submit the SQL statement to be stored in the database I have to wrap the SQL string in a text qualifier, and I want to make sure that the statement's qualifiers will not be escaped when performing an INSERT statement into the batch table.
Example:
SQL Statement (operational statement)
INSERT INTO tblGrpLoc (gid, txt) VALUES (2, 'Select * From tblInformation')
SQL Statement (batch storage)
INSERT INTO tblBatch(act, sql) VALUES (0, 'INSERT INTO tblGrpLoc (gid, txt) VALUES (2, 'Select * From tblInformation')')
Eventually I would iterate through the Batch table and only execute the sql field and update another field to denote its execution, but I want to make sure that the sql field itself will match the SQL statement to be executed, with no loss of string qualifiers.
Edited (2012-08-13 @ 13:42 CST)
To give you an idea of how this nesting is being incorporated, here is the method:
Public Sub BatchAdd(ByRef db As Database, action As BatchAction, sql As String)
    Dim bsql As String
    Dim bact As Integer: bact = CInt(action)
    bsql = SQLInsert("tblBatchTransaction", _
                     "action, txt", _
                     (CStr(bact) & ",'" & sql & "'"))
    db.Execute bsql
End Sub
SQLInsert simply builds a SQL INSERT statement. Now you can see how I might have a string-qualifier issue arise.
If I understand your question correctly, you want to store your SQL statement in such a way that it can be run as-is. However, the internal single quotes are getting in the way. Try replacing the outer single quotes with double quotes:
INSERT INTO tblBatch(act, sql) VALUES (0, "INSERT INTO tblGrpLoc (gid, txt) VALUES (2, 'Select * From tblInformation')")
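As a complement, a minimal sketch of the escaping route in VBA, assuming the BatchAdd code from the question (the helper name is made up): in Access SQL a single quote inside a '...'-delimited literal is escaped by doubling it, and the doubled quotes are stored as single quotes again, so the saved statement can later be passed straight to db.Execute.
' Hypothetical helper: double any single quotes in the statement text
' so it can safely sit inside a '...'-delimited literal.
Public Function EscapeSqlLiteral(ByVal sql As String) As String
    EscapeSqlLiteral = Replace(sql, "'", "''")
End Function

' Inside BatchAdd, escape the statement before building the INSERT:
' bsql = SQLInsert("tblBatchTransaction", _
'                  "action, txt", _
'                  CStr(bact) & ",'" & EscapeSqlLiteral(sql) & "'")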

Does SQL Server Management Studio (or SQL Server) evaluate *all* expressions?

Here's my configuration:
I have a re-runnable batch script that I use to update my database.
Inside of that batch script, I have code that says the following:
If Table 'A' doesn't exist, then create Table 'A' and insert rows into it.
Later on in that batch script, I create an schemabound indexed view on that table.
And if you didn't already know, indexed views require specific client settings.
Sometimes, when I re-run the script, that is after the table has been created, SQL Server Management Studio evaluates the "insert rows" code, which is protected by the 'If this table doesn't exist' code, and yields the following error:
Msg 1934, Level 16, State 1, Line 15
INSERT failed because the following SET options have incorrect settings: 'CONCAT_NULL_YIELDS_NULL, ANSI_WARNINGS, ANSI_PADDING, ARITHABORT'. Verify that SET options are correct for use with indexed views and/or indexes on computed columns and/or filtered indexes and/or query notifications and/or XML data type methods and/or spatial index operations.
Please note: If someone were to try this INSERT statement in a vacuum, I would fully expect SSMS to generate this error.
But not when it's protected by a conditional block.
My Question:
Does the SSMS compiler evaluate all expressions, regardless of whether they will actually be executed?
Yes, it evaluates all of them; take a look at this:
declare @i int
select @i = 1

if @i = 1
begin
    declare @i2 int
    set @i2 = 5
end
else
begin
    declare @i2 int
    set @i2 = 5
end
Msg 134, Level 15, State 1, Line 12
The variable name '@i2' has already been declared. Variable names must be unique within a query batch or stored procedure.
Another example with temp tables is here: What is deferred name resolution and why do you need to care?
Your only way out would be to wrap it inside dynamic SQL (see the sketch below).
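A minimal sketch of that dynamic-SQL workaround, applied to the kind of guarded INSERT described in the question (the table and column names are illustrative):
IF OBJECT_ID('dbo.A', 'U') IS NULL
BEGIN
    CREATE TABLE dbo.A (Id INT NOT NULL);

    -- Because the INSERT lives inside a string, it is only compiled
    -- when this branch actually runs.
    EXEC sp_executesql N'INSERT INTO dbo.A (Id) VALUES (1);';
END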
Note that most of the settings you mention are connection-level, i.e. in case you set/change them they stay in effect unless you close the connection or explicitly change their value.
Returning to your question. The error you mention looks like a runtime error, i.e. the INSERT is actually being executed. It would be better if you could show your script (omitting details, but keeping batches).
Edit: it is not the SSMS compiler that evaluates the SQL you try to execute - it is SQL Server. What do you mean by 'evaluate'? Is it 'execute'? When you run a batch (which is what is actually executed by the server), SQL Server first does syntactic analysis and throws an error if it finds any syntax error; nothing is executed at that point. If the syntax is OK, the server starts executing your batch.
Again, the error you show seems to be a runtime error - so I guess you'd carefully watch the conditions and track what happens (or provide us more details about 'sometimes').