Securing a Script Task in SSIS

I have a Script Task with C# code written inside. The code is supposed to make several REST calls to get some data. The credentials (username, password) are hard-coded within the script. What should I do to make sure that my package is secured, and what is the best practice in similar scenarios, keeping in mind that there is no possibility to use third-party API connectors and the Script Task is my only option?

The best approach is to move the login and password out of the Script Task and into package parameters, declaring the password parameter as sensitive. The login and password can then be specified at package start or stored in SSIS environment variables. Marking the password parameter as sensitive means it is stored encrypted and cannot be dumped to a file, for example.
The following code sample shows how to read the sensitive password inside your Script Task:
Dts.Variables["$Package::YourPassword"].GetSensitiveValue().ToString()
If you need to distribute your package without disclosing the login and password, switch to another authentication method, perhaps one based on certificates. Script Task source code cannot be obfuscated, so everyone who can download the package from the server can inspect your Script Task.

Related

Is there any way to create an Excel file and save it or email it?

Is there any way, using SSIS (or any MSSQL Server features), to automatically run a stored procedure, have the output saved as an Excel file (or even a flat file), and then have the newly created file sent to people via email?
Sorry, I'm a complete newbie to SSIS.
In broad strokes, you'll have an SSIS package with two tasks and three connection managers.
The first task is a Data Flow Task. Much as the name implies, the data is going to flow here - in your case, from SQL Server to Excel.
In the Data Flow Task, add an OLE DB Source to the data flow. It will ask what Connection Manager to use, and you'll create a new one pointed at your source system. Change the source from the Table Selector to a Query and then reference your stored procedure: EXECUTE dbo.ExportDaily
Hopefully, the procedure is nothing more than select col1, col2, colN from table where colDate = cast(getdate() as date). Otherwise, you might run into challenges with the component determining the source metadata, and metadata is the name of the game in an SSIS data flow. If you have trouble, the resolution is version dependent: pre-2012 you'd use a null-operation select as your starting point; on 2012+ you use WITH RESULT SETS to describe the output shape, as sketched below.
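As a minimal illustrative sketch (not from the original answer), assuming a hypothetical dbo.ExportDaily procedure whose result set has the three columns named above, with made-up data types, the 2012+ approach looks like this:
-- Declare the shape of the result set explicitly so the SSIS data flow
-- can derive metadata even when the procedure body is hard to introspect.
EXECUTE dbo.ExportDaily
WITH RESULT SETS
(
    (
        col1    int
        , col2    varchar(50)
        , colN    varchar(50)
    )
);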
With our source settled, we need to land that data somewhere, and you've indicated Excel. Drag an Excel Destination onto the canvas; again, this is going to need a connection manager, so let it create one after you define where the data should land. Where you land the data is important. On your machine, C:\user\pmaj\Documents is a valid path, but when this runs on a server as ServerUser1... not so much. I follow a pattern of C:\ssisdata\SubjectArea with Input, Output and Archive folders.
Click into the Columns tab; there's usually nothing to do here because the source columns are auto-mapped to the destination. Sort the target column names by clicking on the header. A good practice is to scroll through the listing and look for anything that is unmapped.
Run the package and confirm that a new file was generated and that it has data. Close Excel and run the package again; it should have clobbered the file we made. If it errors (and you don't have your "finger" on the file by having it open in Excel), you need to find the setting in the Excel Destination that overwrites the existing file.
You've now solved the exporting data to Excel task. Now you want to share your newfound wisdom with someone else and you want to use email to do so.
There are two ways of sending email. The most common is the Send Mail Task. You'll need to establish a connection to your SMTP server, and I find this tends to be more difficult in the cloud-based world, especially with authentication and this thing running as an unattended job.
At this point, I'm assuming you've got a valid SMTP connection manager established. The Send Mail Task is straightforward: define who is receiving the email, the subject and body, and then add your attachment.
An alternative to the Send Mail Task is to use an Execute SQL Task. The DBAs likely already have sp_send_dbmail configured on your server, as they want the server to alert them when bad things happen. Sending your files through that process is easier because someone else has already solved the hard problems of SMTP connections, permissions, etc.
EXECUTE msdb.dbo.sp_send_dbmail
    @profile_name = 'TheDbasToldMeWhatThisIs'
    , @recipients = 'pmaj@a.com;billinkc@b.com'
    , @subject = 'Daily excel'
    , @body = 'Read this and do something'
    , @file_attachments = 'C:\ssisdata\daily\daily.xlsx';
Besides using an existing and maintained mechanism for mailing the files, the Execute SQL Task is easily parameterized with the ? placeholder. If you need to change the profile as the package is deployed through dev/UAT/prod, you can create SSIS variables and parameters, map their values into the procedure's parameters, and configure those values post-deployment; see the sketch below.
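As an illustrative sketch only (the variable names are assumptions, not taken from the original answer), with an OLE DB connection manager the statement in the Execute SQL Task might look like the following, with SSIS variables bound to the ? markers by ordinal (0 through 4) on the Parameter Mapping page:
-- Each ? is bound on the Execute SQL Task's Parameter Mapping page,
-- e.g. ordinal 0 -> $Project::MailProfile, ordinal 4 -> User::OutputFilePath (hypothetical names)
EXECUTE msdb.dbo.sp_send_dbmail
    @profile_name = ?
    , @recipients = ?
    , @subject = ?
    , @body = ?
    , @file_attachments = ?;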

FileMaker - how/where to use a custom function?

I have downloaded the BaseElements plugin for FileMaker and managed to get it installed. I downloaded it specifically to make use of "BE_ExportFieldContents" (https://baseelementsplugin.zendesk.com/hc/en-us/articles/204700538-BE-ExportFieldContents), which basically allows me to export from a Container field in a server-side script. I have looked through the documentation and cannot seem to find help.
Now that I have the function, I'm completely at a loss as to how to actually call it. I want to export something from the container field to the FileMaker documents path - so my question is, where and how do I actually use this function in FileMaker? Apologies in advance for the noob question.
You make a script where you call this function on the record in question. This script can be run in the client, via a schedule on FileMaker Server, or via the Perform Script on Server script step.
The syntax is like this:
BE_ExportFieldContents ( field ; outputPath )
Where the ‘field’ parameter is the container field and the ‘outputPath’ is where you want the file to end up.
Usually you call such functions via the Set Variable script step. After execution, the variable contains the result or any error from the call.
Note that the plugin needs to be installed and enabled on the server for it to work there.

Executing one subroutine without opening Access

I would like to know if there is an easy workaround for the following question. I have an Access database that has different modules in VBA (and of course each module has different subroutines). How can I create an icon or an executable file that, by clicking on it, runs one of the subroutines of one of the modules without opening Access?
The reason for this is that when I am away, people need to run some of these subroutines, and these users don't have any experience with Access.
You can start Access with a command line option to run a named Access macro. (That means an Access macro object. Some people also call VBA procedures macros, but an Access macro object is different.)
An Access macro has a RunCode action which you can use to run a VBA function. Since the code you want to run is a subroutine, create a new function which calls that subroutine and shuts down Access afterward, and use that function with the macro's RunCode action.
After you have the macro working correctly, test it from a Windows Command Prompt session following this pattern:
"<path to MSACCESS.EXE>" "<path to db file>" -X <macro name>
After working out those details, you can create a desktop shortcut to do the same thing.
However, if your Access operation must be run by you or another user on a regularly scheduled basis (daily, weekdays only, etc.), you could create a Windows Scheduled Task to do it and forget about other users and desktop shortcuts.
Note this suggestion isn't exactly what you requested because it does open Access. But it could close Access after the operation is finished, so perhaps it will be acceptable.

MS Access automation

I need to run a MS Access job as an automated task. I know Access isn't really built for this type of task, but I have MOST of it working except for one, critical part. In short, this is what it's supposed to do:
Generate a PDF report for a user
Generate an email for the user
Attach the PDF to the email
Send the email via SMTP
It works if a user is logged into a desktop session. The process needs to run as an automated process, without requiring a user to be logged in. Using PowerShell and the built-in Task Scheduler (Windows 7 Ultimate, 64-bit), I'm able to get it running on schedule, but the Access code fails when it tries to save the PDF. Through experimentation, I learned that I need to save to the "My Documents" folder, and I have the process running as "me", but I keep getting the same error message:
8/18/2014 4:00:17 PM Report Error in <method name>
2302
-1
0
<project name> can't save the output data to the file you've selected.
MTS
So I suspect that if I select the correct location to save the PDF, it will work. Is there a special location that the system and/or Task Scheduler (TS) can save to? Is there a special way to share a folder so that TS can write to it (without requiring a user to be logged in)?
I usually save all of these types of files/reports into the %TEMP% folder, which seems appropriate for this application since the file only needs to be stored until it is emailed.
I haven't had any permission issues saving into this folder yet.
If you're unfamiliar with %TEMP%, you can search for environment variables; there are several useful paths to common folders used by the system, e.g. %APPDATA%, %USERPROFILE%, etc.
Thanks for all the input. After exhaustive testing, based on the response from ashareef above, I've demonstrated that it can't be done. I tried saving to the following environment variable locations:
TEMP
APPDATA
LOCALAPPDATA
PUBLIC
USERPROFILE
I also tried:
C:\Users
C:\Temp
C:\Users\<my user name>\Documents
And none of those worked if I set the task to run whether I was logged in or not.
One location does work, but only if I'm logged in and I set the task to run only when I'm logged in:
C:\Users\<my user name>\Documents
To sum up:
Saving a file from Access
Running as a Scheduled Task
Whether you're logged in or not
Is not possible! So here at work, we're going with Plan B. Thanks for your help!

How to get a response from a script back to Hudson and to fail/success emails?

I'm starting a Python script from a Hudson job. The script is started through 'Execute Windows batch command' in the build section as 'python my_script.py'.
Now I need to get some data created by the script back to Hudson and add it to the fail/success emails. My current approach is that the Python script writes data to stderr, which is read into a temp file by the batch command and then taken into an environment variable. I can see the environment variable correctly right after the script execution (using the set command), but in the post-build actions it's not visible any more. The email sending is probably done in a different process, so the variables are no longer visible there. I'm accessing the env vars in the email as ${ENV, varname} (or actually, in debug mode, as $ENV to print them all).
Is there a way to make the environment variable global inside Hudson?
Or can someone provide a better solution for getting data back from Python script to Hudson.
All the related parts (Hudson, batch and Python script) are under my control and can be modified as needed.
Thanks.
Every build step gets its own shell. This implies that your environment variables are only valid within that build step.
You can just write the data in a nice format to standard output (use a header that is easy to identify), and if the job fails, the output gets included in the email as part of the build log.
If you insist on including only the data itself, you can use the following token with the Editable Email Notification post-build action (Email-ext plugin):
${BUILD_LOG_REGEX, regex, linesBefore, linesAfter, maxMatches, showTruncatedLines, substText}