SQL Server - Run a job containing a dtsx that creates files in a remote folder on another server - SSIS

I am building an SSIS dtsx package in which I access data in Dynamics CRM. This is the procedure:
* Someone copies files into a folder (.txt files from OCR; that's an external process).
* The SSIS package grabs those files and copies them to another folder, "Processed". Then it looks for data in those .txt files and creates records for entities in Dynamics CRM.
* When a record can't be created in CRM, it is an error and the package writes it to a temporary table in SQL Server. Then the package creates some .txt files which list all the errors, with the row number and the name of the original file.
I can run it from Visual Studio, but I am not able to run it from a job. I've followed these steps: https://www.codeproject.com/Articles/14401/How-to-Schedule-and-Run-a-SSIS-package-DTS-Job-in , but it still isn't running.
What can I do to handle this?
The dtsx used by the job is on a SQL Server machine, and the files I want to work with are on a CRM server.
Thanks very much for your help!

Finally it worked!
I followed the steps again and I could run the job successfully.
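For anyone setting this up later, the job from that article boils down to something like the following T-SQL; a minimal sketch, assuming the package runs from the file system (the job name, package path and schedule are placeholders). Because the package writes to a folder on the CRM server, the connections inside the package should use UNC paths, and the account the step runs under needs rights on that share.

USE msdb;
GO
-- Create the job (placeholder name).
EXEC dbo.sp_add_job @job_name = N'Load CRM OCR files';

-- One step that runs the package via the SSIS subsystem (dtexec-style arguments).
EXEC dbo.sp_add_jobstep
    @job_name  = N'Load CRM OCR files',
    @step_name = N'Run dtsx',
    @subsystem = N'SSIS',
    @command   = N'/FILE "D:\Packages\LoadCrmFiles.dtsx" /REPORTING E';

-- Nightly schedule, then register the job on the local server.
EXEC dbo.sp_add_jobschedule
    @job_name = N'Load CRM OCR files',
    @name = N'Nightly',
    @freq_type = 4,           -- daily
    @freq_interval = 1,
    @active_start_time = 0;   -- 00:00:00
EXEC dbo.sp_add_jobserver @job_name = N'Load CRM OCR files';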

Related

SSIS package only executes the last part of package in SQL Agent job but executes all of the package in Visual Studio

I created an SSIS package which has 3 script tasks, each checking whether a different file exists; if any of the script tasks finds its file, the flow goes to a foreach loop which imports data into a SQL Server staging table, then copies and renames the file.
It works successfully within Visual Studio; however, when I set up a SQL Agent job, the package only processes the last file and ignores the first two.
Any ideas why this may be occurring?
Probably either permissions of the SQL Agent, or invalid file paths on the SQL Server.
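If it is permissions, one common route is to run the step under a proxy account that does have rights on the file paths. A minimal sketch, with placeholder login, credential, proxy and job names:

-- A Windows account that can reach the files (placeholder name/password).
USE master;
CREATE CREDENTIAL SsisFileAccess
    WITH IDENTITY = N'DOMAIN\SsisRunner',
         SECRET   = N'<password>';

-- Expose it as a SQL Agent proxy for the SSIS subsystem.
USE msdb;
EXEC dbo.sp_add_proxy
    @proxy_name      = N'SsisFileProxy',
    @credential_name = N'SsisFileAccess',
    @enabled         = 1;
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name     = N'SsisFileProxy',
    @subsystem_name = N'SSIS';

-- Point the existing job step at the proxy (placeholder job name / step id).
EXEC dbo.sp_update_jobstep
    @job_name   = N'<your job>',
    @step_id    = 1,
    @proxy_name = N'SsisFileProxy';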

SSIS ETL Package Execution Issues

I have an SSIS package where Excel files are uploaded to the target SQL Server database.
The DFT executes inside a foreach loop container because the files arrive at random. Successfully processed files are moved to a Success folder and unprocessed files are moved to a Failed folder. But in some cases files are processed and moved to the Success folder even though the data has not actually been uploaded into the target table. How can I resolve this?
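One way to stop files reaching the Success folder when nothing was loaded is to count the rows after the data flow and only move the file if rows actually arrived. A hedged sketch of the query an Execute SQL Task could run after the DFT; the staging table, the column that records the source file, and the parameter mapping are all assumptions:

-- Returns the number of rows loaded for the file currently in the loop.
-- dbo.StagingTable and SourceFileName are placeholders; map ? to the
-- foreach loop's file-name variable in the Execute SQL Task.
SELECT COUNT(*) AS RowsLoaded
FROM dbo.StagingTable
WHERE SourceFileName = ?;

Store the result in an SSIS variable (for example User::RowsLoaded) and put an expression such as @[User::RowsLoaded] > 0 on the precedence constraint leading to the File System Task that moves the file to the Success folder, routing everything else to the Failed folder.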

BIDS - OLEDB Connection errors in SSIS while writing to Excel 2007 (xlsx) files

For the past few days I've been running into an error which is well known, but I can't work out what I need to do, even after reading so many different solutions. Let me start with the task.
A predecessor created a business-critical SSIS package using SQL Server Data Tools (2005) 4-5 years ago that basically reads a large table in the database, segregates the data categorically, and pumps it into separate tables in the same database. At the end it reads the data from these segregated/categorised tables and exports it into respective Excel files on a network drive, all in the same folder. All these tables have different data dictionaries. All these Excel files are in 97-2003 format (.xls).
The production server is SQL 2005 on Windows 2003, and a new development environment has been created with SQL Server 2012 and Windows 2012, to which I need to migrate all the databases, SQL jobs and SSIS packages. The majority of them are done and running without issue. I left the complex SSIS packages to the last, so I can deliver something to the business to test.
Now my task is to upgrade the package to write into Excel 2007 xlsx files, with no changes at the database level. So I created OLE DB connections for all the Excel files, and the connections appear to work fine when I click Test Connection in the connection dialog. All these Excel files sit on the dev SQL Server in the same folder (\\DevServer\p$\SSIS_Jobs\Process_Data) as the SSIS package. I set Extended Properties = Excel 12.0 XML in the connection manager. But when I run the package in BIDS, I'm getting
"Failed to acquire connection "Excel07_Con1". Connection may not be configured or you may not have the right permissions on this connection."
The package is set to 32-bit mode, the MS Office installed is 32-bit, and I installed the Microsoft Access Database Engine 2010 (32-bit) drivers. The dev network drive gives Everyone full read/write rights.
Since this is the last step in the process, the whole job is failing because of it. I have gone through a lot of responses to similar questions. Any help would be highly appreciated.
Thanks - Madhu
Have you checked the project properties? It might be the case that the project in BIDS has the Run64BitRuntime property set to TRUE.
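Another thing to try, since the files sit on the server itself: you can smoke-test the ACE provider straight from SQL Server with an ad hoc query. Note this exercises the provider registered for the database engine's bitness, which may differ from the 32-bit one BIDS uses; the configuration change, file name and sheet name below are assumptions/placeholders:

-- Allow ad hoc OPENROWSET queries (server-wide setting).
EXEC sp_configure 'show advanced options', 1;  RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;  RECONFIGURE;

-- Read a few rows from one of the workbooks through the ACE OLE DB provider.
SELECT TOP (10) *
FROM OPENROWSET(
        'Microsoft.ACE.OLEDB.12.0',
        'Excel 12.0 Xml;Database=\\DevServer\p$\SSIS_Jobs\Process_Data\Report1.xlsx;HDR=YES',
        'SELECT * FROM [Sheet1$]');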
Thanks for your responses, I've solved the issue by recreating the package from scratch in SSDT2012. Now the package is working. I suspect it could be the Excel drivers.
Thanks for your time again. - Madhu

Transfer output file of T-SQL statement in SSIS package to another server

I have an existing SSIS package on my DB server, and another server, the application server, which uses this DB server for its operation. The SSIS package is used by a job running on the DB server. I would like to add an additional step: just a basic select count(*) query to get the count from one table. I implemented that on my test server, set the output path in the Advanced tab, and got the output in a simple text file.
My question now is: how do I send this output file to my application server instead? The output path of a T-SQL step seems to only accept local drives. I tried giving the path to my application server (which was already being used by the SSIS package), but the output doesn't appear in the text file even though the job reports success. I don't see the need to create another SSIS package just to get a simple count.
Use the UNC path to your application server file system.
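For example, the T-SQL step's output file can point straight at a share on the application server; a minimal sketch with placeholder job, step and share names (the account the SQL Agent service runs under needs write access to that share):

-- Redirect the step's output to a UNC path on the application server.
EXEC msdb.dbo.sp_update_jobstep
    @job_name         = N'<your job>',
    @step_id          = 2,               -- the id of the count(*) step
    @output_file_name = N'\\AppServer\Exports\row_count.txt';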

Create a task in SQL or Windows Server to clean a directory nightly

I have a temp directory on my website where users export data in .csv files.
The newer intranet apps delete the file after it's sent to the client but the legacy apps just leave the files in this directory.
I'd like to create a task to clean this directory nightly. There can be .csv files and directories with files in them.
Basically I want to run:
del *.* /s
rd /s
...every night at midnight.
Would love to do it with a SQL maintenance task but that only runs on the actual SQL server and doesn't work with mapped drives (unless I'm missing something).
How does one go about performing this task?
Can it be done through SQL server somehow?
You have the option of creating "Jobs" that run stored procedures or bits of code. These jobs can be scheduled to run daily, weekly, etc. Check out this thread: How can i create SQL Agent job in SQL Server 2008 standard?
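For example, such a job could use a single CmdExec step that clears the directory over a UNC path; a minimal sketch, assuming the temp folder is reachable as a share (the path, job name and schedule are placeholders, and the SQL Agent service account, or a CmdExec proxy, needs delete rights there). The simplest approach is to remove the folder together with its subfolders and recreate it:

USE msdb;
-- Placeholder job name.
EXEC dbo.sp_add_job @job_name = N'Clean temp exports';

-- Remove the directory (files and subfolders) and recreate it empty.
EXEC dbo.sp_add_jobstep
    @job_name  = N'Clean temp exports',
    @step_name = N'Delete files and subfolders',
    @subsystem = N'CmdExec',
    @command   = N'cmd /c "rd /s /q \\webserver\exports\temp & md \\webserver\exports\temp"';

-- Run every night at midnight.
EXEC dbo.sp_add_jobschedule
    @job_name = N'Clean temp exports',
    @name = N'Midnight',
    @freq_type = 4,           -- daily
    @freq_interval = 1,
    @active_start_time = 0;   -- 00:00:00
EXEC dbo.sp_add_jobserver @job_name = N'Clean temp exports';

If you end up on the Windows Scheduler route instead, the same rd/md pair can go into a batch file run by a nightly scheduled task.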
Sounds like you may be using SQL Server Express Edition, from which Microsoft removed SQL Agent in 2008 and above. In cases like this your best choice is to use the Windows Scheduler to run your commands via a batch file.