I have created a job in Talend which I want to schedule to run every day through the Windows Task Scheduler on my dedicated machine. I exported the job and gave the path of the executable batch file to Task Scheduler. The issue is that when the job runs through Task Scheduler, it does not insert any data from the source MySQL database tables (on a remote client machine) into the destination MySQL database tables (on my local dedicated machine). However, whenever I execute the same batch file by double-clicking it, it works perfectly.
Please help!
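A common difference between the two runs is the working directory and user context: Task Scheduler starts batch files in a default directory (often C:\Windows\System32), so relative paths inside the exported job script can break. A minimal diagnostic wrapper, with hypothetical paths and job name, that pins the working directory and logs output might look like this:

rem run_talend_job.bat -- wrapper for the exported Talend job (all paths/names hypothetical)
cd /d "C:\Talend\jobs\myjob" || exit /b 1
rem capture stdout/stderr so scheduled runs can be inspected afterwards
call myjob_run.bat >> C:\Talend\logs\myjob.log 2>&1

Also make sure the task runs under an account that is allowed to reach the remote MySQL host.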
Related
I have created a project for SSIS and deployed it on SQL Server 2014. When I run the packages from SQL Server Data Tools they run fine and perform all operations, but when I run them from the catalog procedures ([SSISDB].[catalog].[create_execution]) the run shows as successful, yet I can't see any data in my staging tables. I have used configuration tables to configure the connections and file paths.
Any ideas?
Please check the user privileges (file system, etc.). If you call the procedures with a different user than the one you use to execute the package from within Data Tools, this might be the reason.
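For reference, a typical catalog invocation looks like the sketch below (folder, project, and package names are hypothetical); the execution runs under the caller's security context, which is where a privilege difference shows up:

DECLARE @execution_id BIGINT;
EXEC [SSISDB].[catalog].[create_execution]
    @folder_name   = N'MyFolder',     -- hypothetical folder
    @project_name  = N'MyProject',    -- hypothetical project
    @package_name  = N'Package.dtsx', -- hypothetical package
    @use32bitruntime = 0,
    @execution_id  = @execution_id OUTPUT;
EXEC [SSISDB].[catalog].[start_execution] @execution_id;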
In a MySQL database, how can I populate a specific table with the data from an Excel file every day?
I want this job to run every morning at 8:00 AM.
How can I do this?
You cannot do that with MySQL Workbench alone. You need an external timer service (a cron job on Linux/macOS, Task Scheduler on Windows) to run a batch file that performs the import.
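As a sketch of that approach (file names, credentials, and table names are all hypothetical): save the Excel sheet as CSV, import it with a batch file, and register the batch file with Task Scheduler:

rem daily_import.bat -- import a CSV exported from Excel (names hypothetical)
mysql -u importuser -p"yourpassword" mydb -e "LOAD DATA LOCAL INFILE 'C:/data/sheet.csv' INTO TABLE my_table FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;"

rem register the task to run daily at 8:00 AM
schtasks /create /tn "DailyMySQLImport" /tr "C:\scripts\daily_import.bat" /sc daily /st 08:00

Note that LOAD DATA LOCAL has to be enabled (the local_infile option) on some MySQL setups.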
I have a temp directory on my website where users export data in .csv files.
The newer intranet apps delete the file after it's sent to the client but the legacy apps just leave the files in this directory.
I'd like to create a task to clean this directory nightly. There can be .csv files and directories with files in them.
Basically I want to run:
del /s /q *.*
for /d %%d in (*) do rd /s /q "%%d"
...every night at midnight.
Would love to do it with a SQL maintenance task but that only runs on the actual SQL server and doesn't work with mapped drives (unless I'm missing something).
How does one go about performing this task?
Can it be done through SQL server somehow?
You have the option of creating "Jobs" that run stored procedures or bits of code. These jobs can be scheduled to run daily, weekly, etc. Check out this thread: How can i create SQL Agent job in SQL Server 2008 standard?
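If you do have SQL Agent available, such a job can also be created in T-SQL; a minimal sketch (job name and directory path are hypothetical) that runs the cleanup command daily at midnight:

USE msdb;
EXEC dbo.sp_add_job @job_name = N'CleanTempDir';
EXEC dbo.sp_add_jobstep @job_name = N'CleanTempDir', @step_name = N'Delete files',
    @subsystem = N'CmdExec',
    @command = N'del /s /q "D:\www\temp\*.*"';  -- hypothetical path
EXEC dbo.sp_add_jobschedule @job_name = N'CleanTempDir', @name = N'Midnight',
    @freq_type = 4, @freq_interval = 1, @active_start_time = 000000;  -- daily at 00:00:00
EXEC dbo.sp_add_jobserver @job_name = N'CleanTempDir';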
Sounds like you may be using SQL Server Express Edition, for which Microsoft removed SQL Agent in 2008 and above. In that case your best choice is using the Windows Scheduler to run your commands via a batch file.
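A sketch of that batch-file route, with a hypothetical directory, mirroring the del/rd commands from the question:

rem clean_temp.bat -- wipe files and subdirectories (path is hypothetical)
cd /d "D:\www\temp" || exit /b 1
del /s /q *.*
for /d %%d in (*) do rd /s /q "%%d"

rem schedule it nightly at midnight
schtasks /create /tn "CleanTempDir" /tr "C:\scripts\clean_temp.bat" /sc daily /st 00:00

The cd /d ... || exit /b 1 guard stops the script from deleting files in the wrong place if the directory is missing or the mapped path is unavailable.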
I have about 3 or so weekly generated .CSV files that are automatically placed in a particular directory on my MySQL box.
What I would like to do is run some sort of automated script that opens the MySQL command line, deletes all records in the corresponding table, and runs LOAD DATA INFILE on the CSVs.
Being on Windows I cannot use a cron job, though I was thinking I could write some sort of batch script and run it as a Scheduled Task.
Any idea how I would go about doing this?
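The batch-script idea would look roughly like this sketch (file, table, and credential names are hypothetical; server-side LOAD DATA INFILE requires the FILE privilege):

weekly_load.sql:

DELETE FROM weekly_data;
LOAD DATA INFILE 'C:/csv/report1.csv' INTO TABLE weekly_data
  FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;

weekly_load.bat (register with Task Scheduler to run weekly):

mysql -u loader -p"yourpassword" mydb < C:\scripts\weekly_load.sql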
Depending on your MySQL version, you could use CREATE EVENT to schedule database statements; see:
http://dev.mysql.com/doc/refman/5.1/en/create-event.html
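One caveat: LOAD DATA INFILE is not permitted inside stored programs, so an event can cover only the plain SQL part; the file load itself still needs an external script. A minimal event sketch, with a hypothetical table name:

SET GLOBAL event_scheduler = ON;  -- the scheduler must be running (requires SUPER)

CREATE EVENT weekly_table_reset
  ON SCHEDULE EVERY 1 WEEK
  STARTS '2014-01-06 02:00:00'
  DO
    DELETE FROM weekly_data;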
I have two DB servers, i.e., Server1 and Server2.
I want to transfer data from Server1 to Server2 every morning (at 9:00 AM, let's say).
How can we achieve this?
Can this transfer of data be done automatically?
My choice on a Windows machine is to create a batch file that runs mysqldump with the parameters that suit you best.
This batch file can be tied to the Windows scheduler for automated execution at any point in time.
You can consult this page for some guidelines from the MySQL community.
Now that you have a dump of your DB, your script should send it to the destination server and deploy it locally (this can also be done automatically).
I use mysqldump parameters that allow me to incrementally add data to the new server.
I transfer the dump using DeltaCopy, which is a community Windows wrapper around the rsync program (if interested, you should check out Syncrify on that page as well).
Both of those last points allow for a much faster process than copying the entire DB every time.
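Put together, the pipeline could look like this sketch (host names, credentials, and paths are hypothetical; the plain copy is where DeltaCopy/rsync would slot in):

rem daily_transfer.bat -- runs on Server1 (all names hypothetical)
mysqldump -u backupuser -p"yourpassword" --single-transaction mydb > C:\dumps\mydb.sql
rem ship the dump (DeltaCopy/rsync could replace this plain copy)
copy /y C:\dumps\mydb.sql \\Server2\dumps\mydb.sql

rem a second scheduled task on Server2 then deploys the dump locally:
mysql -u loaduser -p"yourpassword" mydb < C:\dumps\mydb.sql

rem schedule the Server1 half for 9:00 AM daily
schtasks /create /tn "DailyDBTransfer" /tr "C:\scripts\daily_transfer.bat" /sc daily /st 09:00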