Select library in SAS from SSIS

I am using SSIS to extract some data out of a SAS server,
using this connection setup (SAS IOM Data Provider 9.3).
I can get the connection to read the default Library/Shared data folder.
What do I need to change or set to get it to read a different library?
These are the properties of the two libraries:
the one on the left is the one I can read, and the one on the right is the one I am trying to access.

If your data folder contains *.sas7bdat files, then you could use this:
http://microsoft-ssis.blogspot.com/2016/09/using-sas-as-source-in-ssis.html

Simply write your SAS libname statement inside the SAS Workspace Init Script box, e.g. as follows:
libname YOURLIB "/your/path/to/sas/datasets" access=readonly;
More info: http://support.sas.com/kb/33/037.html
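Once the library is assigned, the SSIS source query can reference its members by their two-level names. A minimal sketch, assuming a hypothetical dataset named YOUR_DATASET exists in that library:
SELECT * FROM YOURLIB.YOUR_DATASET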

Related

Azure Data Factory - Process SSIS Output

I'm working to lift an SSIS package into Azure Data Factory V2, and I have successfully set up an IR and executed the package.
Now I'm attempting to work with the results in ADF. This package was originally designed to return a recordset to the calling client. Now that I'm in ADF, I'd like to take the recordset produced by the package and copy it to table storage. However, I see no way to access this recordset from within the ADF pipeline.
Is it possible to access and process this recordset from the host ADF pipeline, or will the package itself have to be modified to no longer return a recordset and perform the copy instead?
In the SSIS package, create a text file as output and copy it to a location/folder in blob storage, or even to an on-premises folder.
If you run SSIS on premises, store the output in an on-premises folder and use the AzCopy tool to move it to an Azure blob:
https://blogs.technet.microsoft.com/canitpro/2015/12/28/step-by-step-using-azcopy-to-transfer-files-to-azure/
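A minimal sketch of that upload, assuming AzCopy v10 and hypothetical account, container, and file names (fill in your own SAS token):
azcopy copy "C:\ssis\output\results.txt" "https://youraccount.blob.core.windows.net/ssis-output/results.txt?<SAS-token>"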
Otherwise, if you run SSIS on Azure as you mentioned, copy your output rowset to a flat file using the Flat File connection manager, then create another data flow task in which you upload the file to an Azure blob:
https://www.powerobjects.com/blog/2018/11/20/uploading-azure-blob-ssis/
Now your Azure pipeline can access that blob as a source in the Copy activity and dump it to table storage as a sink.
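A rough sketch of that Copy activity in ADF V2 pipeline JSON, assuming hypothetical dataset names BlobOutputFile and TargetTable:
{
  "name": "CopyBlobToTableStorage",
  "type": "Copy",
  "inputs": [ { "referenceName": "BlobOutputFile", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "TargetTable", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": { "type": "AzureTableSink" }
  }
}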
Let me know if you need more details on the implementation.

How to set the path of a CSV file that is in account storage in an Azure Data Factory pipeline

I have created an SSIS package that reads from a CSV file (using the Flat File connection manager) and loads records into a database. I have deployed it on an Azure Data Factory pipeline, and I need to give the path of the CSV file as a parameter. I have created an Azure storage account and uploaded the source file there as shown below.
Can I just give the URL of the source file for the Import file in the SSIS package settings as shown below? I tried it, but it is currently throwing a 2906 error. I am new to Azure - appreciate any help here.
First, you said Excel and then you said CSV. Those are two different formats. But since you mention the flat file connection manager, I'm going to assume you meant CSV. If not, let me know and I'll update my answer.
I think you will need to install the SSIS Feature Pack for Azure and use the Azure Storage Connection Manager. You can then use the Azure Blob Source in your data flow task (it supports CSV files). When you add the blob source, the GUI should help you create the new connection manager. There is a tutorial on MS SQL Tips that shows each step. It's a couple years old, but I don't think much has changed.
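For reference, the Azure Storage Connection Manager takes a standard storage connection string; a minimal sketch with placeholder values:
DefaultEndpointsProtocol=https;AccountName=<youraccount>;AccountKey=<your-account-key>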
As a side thought, is there a reason you chose to use SSIS over native ADF V2? It does a nice job of copying data from blob storage to a database.

The destination component does not have any available inputs for use in creating a path

I'm working with legacy T-SQL code that outputs to txt files.
For security reasons, I'm replacing these outputs with SSIS packages.
I've gotten most of them to work, but one particular one gives me the following error:
TITLE: Microsoft Visual Studio
Cannot create connector.
The destination component does not have any available inputs for use in creating a path.
The data flow itself is very simple: an OLE DB Source runs a SQL command, then outputs to a Flat File Source that points to an existing txt file that was created by the T-SQL.
Anyone know what the error means in regards to the available inputs?
Your SSIS toolbox is divided into 3 general groupings (pre 2012/2014)
Sources
Transformations
Destinations
A source has 1 to N output paths. Nothing can feed into a source. Things can only consume what a Source emits.
A Transformation does not generate rows; it accepts rows from an upstream provider (either a Source or another Transformation). A Transformation has 1 to N output paths.
A Destination is the terminus for data. I'm not aware of any destinations that accept more than one input. It has one optional output path, Error.
Your problem, therefore, is that you are trying to route data into a Source. Change that to a Flat File Destination.
Replace the Flat File Source with a Flat File Destination task. Click on the RowCount task and drag the green arrow to the new destination.

Importing thousands of files in a flat file manager SSIS data flow task (SQL Server 2008)

I am trying to import multiple tab-delimited files into a SQL Server table using an SSIS package. I set the flat file source and created a flat file connection manager, but I was told I will need to create multiple flat file sources for this. This cannot be true, right?
Is there not some way I can use a loop and the source folder directory location?
So long as the files all have the same structure, you'd use a Foreach Loop container of type file. Point it at the folder containing the files, and assign the file name and path to a variable. Then use that variable in an expression on the flat file connection manager, as sketched below.
Here is a link that shows how to do this: http://www.sqlis.com/sqlis/post/Looping-over-files-with-the-Foreach-Loop.aspx
I like the layout and the graphical indicators used on that page.
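A minimal sketch of the wiring, assuming a hypothetical string variable User::CurrentFile: on the loop's Variable Mappings page, map the fully qualified file name to the variable, then set this property expression on the connection manager's ConnectionString property:
@[User::CurrentFile]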

iSeries Export to CSV

Is there an iSeries command to export the data in a table to CSV format?
I know about the Windows utilities, but since this needs to be run automatically I need to run this from a CL program.
You can use CPYTOIMPF and specify the TOSTMF option to place a CSV file on the IFS.
Example:
CPYTOIMPF FROMFILE(DBFILE) TOSTMF('/outputfile.csv') STMFCODPAG(*PCASCII) RCDDLM(*CRLF)
If you want the data to be downloaded directly to a PC, you can use the "Data Transfer from iSeries" function of IBM iSeries Client Access to create a .CSV file. In the file output details dialog, set the file type to Comma Separated Variable (CSV).
You can save the transfer description to be reused later.
You could use a trigger. The iSeries Client Access software won't do, since that is a Windows application; what I understand is that you need the data to be exported each time the file is written. Check this link to learn more about triggers.
You are going to need FTP to perform that action.
If your iSeries shop uses ZMOD/FTP, your shortest solution is a few lines of code away (3 lines, to be exact): Start FTP, Put DBF, and finally End FTP.
If you don't use ZMOD/FTP:
- You could use native FTP/400 to accomplish what you need to do, but it is quite involved!
- You will probably need an RPGLE program to parse, format, and move data into a flat file, then use native FTP/400 to send the file out.
- And yes, a CL program will be needed as a wrapper!
You can do it all in one very simple CL program:
- CPYTOIMPF the file TOSTMF: the CSV file will be in the IFS
- FTP the file elsewhere (to a server or a PC)
It works like a charm.
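A minimal sketch of such a CL program, with hypothetical library, file, and host names; the FTPCMDS member would hold the scripted FTP subcommands (user, password, PUT, QUIT):
PGM
  /* Copy the database file to a CSV stream file on the IFS */
  CPYTOIMPF  FROMFILE(MYLIB/DBFILE) TOSTMF('/export/dbfile.csv') +
             STMFCODPAG(*PCASCII) RCDDLM(*CRLF)
  /* Redirect the FTP client's input and output to scripted members */
  OVRDBF     FILE(INPUT) TOFILE(MYLIB/FTPCMDS) MBR(PUTCSV)
  OVRDBF     FILE(OUTPUT) TOFILE(MYLIB/FTPLOG) MBR(PUTCSV)
  /* Run the FTP client against the remote host */
  FTP        RMTSYS('ftp.example.com')
  DLTOVR     FILE(INPUT OUTPUT)
ENDPGM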