I use WinSCP 5.17 to retrieve files from an FTP server. When I retrieve files from the root folder of the server, everything works, but as soon as I try to retrieve files from a sub-folder, it doesn't work.
Here is the command I use:
get /Clients/Folder2/Folder3/*.* F:\folder1\folder2\
and this is the error message:
Error listing directory '/Clients/Folder2/Folder3'.
Bad message (badly formatted packet or protocol incompatibility).
Error code: 5
Error message from server: Bad message
Thanks for your help.
Finally I solved the problem: I generated the script from the WinSCP GUI and it works.
open ftps://username:password@ftp-adresse.azure.com/ -certificate="ee:5f:af:7c:26:6b:bb:6f:cd:86:6a:2c:03:1e:8f:ab:e7:63:fd:43" -rawsettings FollowDirectorySymlinks=1
cd /Clients/folder2/folder3
lcd "F:\folder1\folder2"
get "*.xlsx"
exit
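For completeness, the generated script can be run unattended like this (a minimal sketch, assuming the script above is saved as getfiles.txt):
winscp.com /ini=nul /script=getfiles.txt /log=getfiles.log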
I have a User_file.txt file placed in the USERDATA folder at the WinSCP root, which I am trying to download to the given location C:\User\Local\.
Executable- C:\Program Files (x86)\WinSCP\WinSCP.exe
Arguments - /log=c:\path\to\log\winscp.log /command "open ftp://username:password@example.com/" "get /USERDATA/User_file.txt C:\User\Local\" "exit"
I am getting the error below:
No connection could be made because the target machine actively refused it
Not sure what I am missing. I referred to a couple of related posts on Stack Overflow.
You want to use SFTP, but you are using the FTP protocol (ftp://) in the session URL.
Use sftp:// instead:
... "open sftp://username:password#example.com/" ...
WinSCP GUI can generate a correct script/command-line template for you.
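For example, the complete Arguments value could look something like this (a sketch; the -hostkey fingerprint shown is just a placeholder, replace it with your server's real SSH host key fingerprint, which the GUI will show you):
/log=c:\path\to\log\winscp.log /command "open sftp://username:password@example.com/ -hostkey=""ssh-rsa 2048 xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx""" "get /USERDATA/User_file.txt C:\User\Local\" "exit"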
I'm having a problem in production when accessing the JSON file (let's call it mydata.json) where I store the data of my Ruby web app, deployed on Heroku. The way I download this file is by putting the following into the browser:
my-heroku-page.herokuapp.com/mydata.json
but I get the HTTP 500 error page.
Still using the browser, when I try to create it I get the standard Rails error page:
We're sorry, but something went wrong.
If you are the application owner check the logs for more information.
First I must say I'm using the same source code I used for another web app that is actually working; I just modified the database.yml according to my new host, database, username and password. The previous web app was working perfectly, i.e. I could create the table and access the data. Secondly, the error doesn't occur on localhost.
I tried to create a Dataclip in Heroku:
Select * from mydata order by created_at desc;
but I get the error:
"Your query couldn't be created."
ERROR: relation "mydata" does not exist
LINE 3: FROM mydata
Plus, when I check heroku pg:info I get 0 tables:
=== DATABASE_URL
Plan: Hobby-dev
Status: Available
Connections: 0/20
PG Version: 9.5.5
Created: 2016-11-15 08:30 UTC
Data Size: 7.4 MB
Tables: 0
Rows: 0/10000 (In compliance)
Fork/Follow: Unsupported
Rollback: Unsupported
Add-on: postgresql-convex-54172
It seems like mydata.json is not created in production, but on localhost it works fine and I can create/download a blank one. I'm sure I'm missing something easy here, maybe in the database.yml.
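For what it's worth, this is how I would check the migration status of the production database from my machine (assuming the Heroku toolbelt is installed; my-heroku-page is the app name as above):
heroku run rake db:migrate:status --app my-heroku-page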
I will edit the question if additional info is required. Any help is appreciated.
Thanks,
Simone
I installed, following the instructions, the latest GT.M release version 6.0-000_linux_i686_pro.tar.gz from http://sourceforge.net/projects/sanchez-gtm.
Before installation I set $gtm_log, then went through the installation. Afterwards I set $gtm_dist and added it to the PATH, and I set $gtmroutines.
I started GT.M from the parent directory and tried to create the default Global Directory.
GTM>d ^GDE
%GDE-I-GDUSEDEFS, Using defaults for Global Directory
/usr/lib/mumps.gld
When using the exit command I get this error:
GDE> exit
%GDE-I-VERIFY, Verification OK
%GDE-I-GDCREATE, Creating Global Directory file
/usr/lib/mumps.gld
%GDE-I-WRITEERROR, Cannot exit because of write failure. Reason for failure:
13,GDEPUT+48^GDEPUT,%SYSTEM-E-ENO13, Permission denied
Can anyone help me solve this problem?
You should set $gtmgbldir before creating the default Global Directory.
For example, export gtmgbldir="~/mumps.gld" will result in the Global Directory (and by default, the database file as well) being created in your user's home directory, which is probably a better idea than creating them in /usr/lib/.
You should also have a look at the file $gtm_dist/gtmprofile. It can be sourced to create a default environment that is perfect for learning, and it is also a good source of information about which environment variables can or need to be defined, and to what effect.
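For example, a minimal setup could look like this (a sketch; the install path below is only an assumption, adjust it to wherever you installed GT.M):
export gtm_dist=/usr/lib/fis-gtm/V6.0-000_i686   # assumed install directory
export gtmgbldir="$HOME/mumps.gld"               # Global Directory (and database) created in your home directory
# or simply source the supplied profile, which sets these and more for you:
. $gtm_dist/gtmprofile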
Is there a way to connect to a network drive that requires a different username/password than the username/password of the user running the package?
I need to copy files from a remote server. Right now I map the network drive in Windows Explorer and then run a File System task. However, eventually this package will be run automatically, from a different machine, and will need to map the network drive on its own. Is this possible?
You can use the Execute Process task with the "net use" command to create the mapped drive. Here's how the properties of the task should be set:
Executable: net
Arguments: use \\Server\SomeShare YourPassword /user:Domain\YourUser
Any File System tasks following the Execute Process will be able to access the files.
Alternative Method
This SQL Server Select article covers the steps in detail, but the basics are:
1) Create a "Execute Process Task" to map the network drive (this maps to the z:)
Executable: cmd.exe
Arguments: /c "NET USE Z: "\\servername\shareddrivename" /user:mydomain\myusername mypassword"
2) Then run a "File System Task" to perform the copy. Remember that the destination "Flat File Connection" must have "DelayValidation" set to True as z:\suchandsuch.csv won't exist at design time.
3) Finally, unmap the drive when you're done with another "Execute Process Task"
Executable: cmd.exe
Arguments: /c "NET USE Z: /delete"
Why not use an FTP task to GET the files over to the local machine? Run SSIS on the local machine. When transferring using FTP in binary, it's really fast. Just remember that the row delimiter for SSIS should be LF, not CRLF, as binary FTP does not convert LF (Unix) to CRLF (Windows).
You have to map the network drive; here's an example that I'm using now:
profile = "false"
landingPadDir = Dts.Variables("strLandingPadDir").Value.ToString
resultsDir = Dts.Variables("strResultsDir").Value.ToString
user = Dts.Variables("strUserName").Value.ToString
pass = Dts.Variables("strPassword").Value.ToString
driveLetter = Dts.Variables("strDriveLetter").Value.ToString
objNetwork = CreateObject("WScript.Network")
CheckDrive = objNetwork.EnumNetworkDrives()
If CheckDrive.Count > 0 Then
For intcount = 0 To CheckDrive.Count - 1 Step 2 'if drive is already mapped, then disconnect it
If CheckDrive.Item(intcount) = driveLetter Then
objNetwork.RemoveNetworkDrive(driveLetter)
End If
Next
End If
objNetwork.MapNetworkDrive(driveLetter, landingPadDir, profile, user, pass)
From There just use that driveLetter and access the file via the mapped drive.
I'm having one issue (which led me here) with a new script that accesses two share drives and performs some copy/move operations between the drives, and I get an error from SSIS that says:
This network connection has files open or requests pending.
at Microsoft.VisualBasic.CompilerServices.LateBinding.InternalLateCall(Object o, Type objType, String name, Object[] args, String[] paramnames, Boolean[] CopyBack, Boolean IgnoreReturn)
at Microsoft.VisualBasic.CompilerServices.NewLateBinding.LateCall(Object Instance, Type Type, String MemberName, Object[] Arguments, String[] ArgumentNames, Type[] TypeArguments, Boolean[] CopyBack, Boolean IgnoreReturn)
at ScriptTask_3c0c366598174ec2b6a217c43470f581.ScriptMain.Main()
This only happens on the "2nd run" of the process, and if I run it a 3rd time it all works fine, so I'm guessing the connection isn't being properly closed, or it is not waiting for the copy/move to complete before moving forward, or some such, but I'm unable to find a "close" or "flush" command that prevents this error. If you have any solution, please let me know, but the above code should work for getting the drive mapped using your alternate credentials and allowing you to access that share.
Zach
To make the package more robust, you can do the following:
In the first Execute Process Task, set FailTaskIfReturnCodeNotSuccessValue = False.
This will let the package run even if the last disconnect did not work.
This is an older question, but more recent versions of SQL Server with SSIS databases allow you to use a proxy to execute SQL Server Agent jobs (a rough T-SQL sketch of the equivalent setup follows the steps below).
In SSMS, under Security > Credentials, set up a credential in the database mapped to the AD account you want to use.
Under SQL Server Agent, create a new proxy, giving it the credential from step 1 and permission to execute SSIS packages.
Under SQL Server Agent jobs, create a new job that executes your package.
Select the step that executes the package and click Edit. In the Run As dropdown, select the proxy you created in step 2.
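A rough T-SQL sketch of the same setup (all names and the password below are placeholders; DOMAIN\SSISUser stands for the AD account from step 1):
USE master;
CREATE CREDENTIAL SSISProxyCredential
    WITH IDENTITY = 'DOMAIN\SSISUser', SECRET = 'StrongPasswordHere';
GO
USE msdb;
-- Create the proxy and bind it to the credential
EXEC dbo.sp_add_proxy
    @proxy_name = N'SSISProxy',
    @credential_name = N'SSISProxyCredential',
    @enabled = 1;
-- Allow the proxy to run SSIS package steps (subsystem 11 = SSIS)
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SSISProxy',
    @subsystem_id = 11;
GO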