Not able to fetch data from a server across multiple RDP hops - PowerShell remoting

We are running a batch file from Server1 which calls a PowerShell script on Server2. The PowerShell script gets the date and time of Server3. When we trigger the execution from Server1 it does not work: the PowerShell script is not able to get Server3's time. When we run the PowerShell script manually from Server2 it works.

The problem here is the double hop, which is not allowed by default in PowerShell remoting.
To make it work, you have to delegate the credential you use to run the script from Server1 to Server2.
Run Get-Help Enable-WSManCredSSP -Full
You will get more info there.
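For example, a minimal sketch (the server names and the inner script block here are illustrative, not taken from your setup):

# On Server1 (the machine you run from), allow delegating fresh credentials to Server2:
Enable-WSManCredSSP -Role Client -DelegateComputer 'Server2' -Force
# On Server2 (the middle hop), accept delegated credentials:
Enable-WSManCredSSP -Role Server -Force
# Then connect with CredSSP so the session on Server2 holds real credentials:
$cred = Get-Credential
Invoke-Command -ComputerName 'Server2' -Authentication Credssp -Credential $cred -ScriptBlock {
    # The second hop to Server3 now works because the credential was delegated
    Invoke-Command -ComputerName 'Server3' -ScriptBlock { Get-Date }
}

Note that CredSSP passes your credential to the remote machine, so enable it only between hosts you trust.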
Regards,
Kvprasoon

Related

Zabbix get the result of running a command on the ESXi server

On the ESXi server (I connect via SSH), I execute this command to view the disk's SMART data:
esxcli storage core device smart get -d=t10.ATA_INTEL_SSDSC2BB080G4__PHSL4
How can I pass the received data to Zabbix?
I'm trying to create an item with type "Script". I create this item in a template. The template already connects to ESXi using macros, and Zabbix receives data. But you cannot simply paste this command into the item's script field; you have to write a script there.
In Zabbix, a Script is a command you can execute from a host's context menu, or a command you can execute as an Action when a Problem is triggered.
An item that runs a command via SSH should use the "SSH agent" item type.
You cannot perform an esxcli command through the vSphere HTTPS SDK with Zabbix.
See
https://www.zabbix.com/documentation/6.0/en/manual/web_interface/frontend_sections/administration/scripts
https://www.zabbix.com/documentation/6.0/en/manual/config/items/itemtypes/ssh_checks
Example:
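A sketch of such an "SSH agent" item, based on the linked docs (the key description and macro are assumptions):
Item type: SSH agent
Key: ssh.run[esxi.smart,{HOST.CONN},22]
Executed script: esxcli storage core device smart get -d=t10.ATA_INTEL_SSDSC2BB080G4__PHSL4
Authentication method: password (or public key), with the user name and password set on the item.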

Internal 500 error on Google Compute Engine, installing littlest jupyter

"Internal 500 server error" after VM runs for a day or two.
This is the second time it has happened, I start the instance, install littlest Jupyterhub
(see details below). I can login to the external ip, for a day, but then it stops
with internal 500 error. I cannot ssh or get into the instance, only alternate is to
create a new instance and re-do. What is the problem?
I have installed littlest jupyterhub using on this instance, using
#!/bin/bash
curl https://raw.githubusercontent.com/jupyterhub/the-littlest-jupyterhub/master/bootstrap/bootstrap.py | sudo python3 - --admin master
I would recommend you enable access to the serial console on your instance [1].
You will also need to set up a password for your user by following this documentation [2].
With these two steps done, you should be able to reconnect to your instance the next time you are locked out, as you mentioned, by following this guide [3].
You should then be able to investigate what is going on in the instance:
verify whether your application is still running, whether the SSH server is still running, and so on. (Example commands follow the links below.)
Frederic
[1] https://cloud.google.com/compute/docs/instances/interacting-with-serial-console#enable_instance_access
[2] https://cloud.google.com/compute/docs/instances/interacting-with-serial-console#setting_up_a_local_password
[3] https://cloud.google.com/compute/docs/instances/interacting-with-serial-console#connectserialconsole
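For example, enabling and connecting with the gcloud CLI (INSTANCE_NAME and ZONE are placeholders):
# Enable interactive serial console access on the instance:
gcloud compute instances add-metadata INSTANCE_NAME --zone=ZONE --metadata serial-port-enable=TRUE
# Connect to the serial console once locked out of SSH:
gcloud compute connect-to-serial-port INSTANCE_NAME --zone=ZONE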

Flat File Source Failing When Deployed and Executed

I am trying to create a simple SSIS package that can be deployed to a VM on another domain (same computer) that has SSIS on it. The package I created transfers data fine if I simply run the data flow from SSDT on the source computer, but when I deploy and execute it, it fails with two errors: first, that it cannot open the data file "C:\SSIS\Product Data.txt"; second, that Flat File Source 1 failed the pre-execute phase. I think the second error is caused by the first. Does anyone know how to create a package that can transfer data to another computer without the flat file source failing?
Thanks in advance!
C:\SSIS\... will refer to the local C drive of the computer that is executing the package. So if the file is on the VM, it would be at \\VM\C$. Typically this error means 1 of 2 things:
1) The file path is not correct, and you are not referencing where the file actually is.
2) The user you are executing the package as does not have permission to that file.
So where is the file? Is it on the C drive of the machine you are executing the SSIS package on? The answer to that wasn't clear in your question.
Second, how is the package being executed? Manual execution, such as DTExec? SQL Agent? T-SQL? If manual, the logged-in user executing the task must have permission; if SQL Agent, the SQL Agent service account (and possibly the SQL Server service account) must have permission to the file; for T-SQL, the SQL Server service account needs permission.
When it is a local file path, a local or domain user will work for the service account. If it is a network path, a local user won't work as the service account, and you will need to set up a way to run under a different credential (see the quick check below).
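For example, a quick PowerShell check (a sketch; the UNC path is hypothetical), run on the executing machine, ideally under the same account the package runs as:
# Returns True only if the path resolves and the current account can read it
Test-Path -Path '\\VM\C$\SSIS\Product Data.txt'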

How to execute a script on the Zabbix agent?

I have made a script on the Zabbix agent host and I want to execute it using a Zabbix item. I will be thankful to you in advance; please help if you can.
You would create a user parameter that would look like any other item from the server side. On the agent side, you would have to edit the agent configuration file and restart the agent.
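For example, a sketch of a user parameter (the key name and script path are assumptions):
# In zabbix_agentd.conf on the agent host:
UserParameter=magic.script,/usr/local/bin/magic.sh
On the server side, create an item with key magic.script, then restart the agent to pick up the change.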
In addition to user parameters, Zabbix supports remote commands. These are appropriate for (and indeed invoked from) actions triggered as a result of some condition. So you would use the user parameter from Richv's answer for polling for data, e.g. to find out whether a process is running. Continuing that example, if the process is not running and you want to start it, you can ask the agent to execute a command that starts the process. To do that you must enable remote commands in the agent config file and configure the command as an action (that in turn is linked to whatever condition you were concerned about).
First, in zabbix_agentd.conf on the remote host, add EnableRemoteCommands=1.
Then see the video at https://youtu.be/G6jfahBZwlk
In the video, a bat file is created on the remote host, and then an item is created for the remote host in the Zabbix server interface, selecting the system.run option in the key options.
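For example, once remote commands are enabled, an item key along these lines would run the script (the path is an assumption):
system.run[C:\scripts\magic.bat]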

Running an SSIS Package as a SQL Server Agent Job doesn't copy files?

I have an SSIS package I can import into Integration Services on my server and run with no problems. All it does is copy files from a directory on the network to the server it is running on.
When I execute the SQL Agent Job it says the job ran successfully but no files are copied. I verify there are files in the source location and the destination path exists. I am also using absolute paths (no mapped drives).
Why doesn't it copy any files when I run it as a SQL Agent Job?
FYI - the source directory is actually on a UNIX box and to map a drive to that location you have to enter a user/password combination.
I have a feeling that the SQL Agent Job runs as NT SERVICE\SQLSERVERAGENT, which is not the user that has permission to the UNIX box. Is there a way to run the SQL job as a specific user?
Thanks in advance.
You need to create a Credential, a SQL Agent Proxy, and then assign the proxy account to the SQL Agent job step. Proxy accounts are specific to each subsystem (e.g. PowerShell, CmdExec, SSIS, etc.):
-- creating credential
USE [master]
GO
CREATE CREDENTIAL [Superuser] WITH IDENTITY = N'DOMAIN\account', SECRET = N'mypassword'
GO

-- creating proxy for CmdExec subsystem, adding principal
USE [msdb]
GO
EXEC msdb.dbo.sp_add_proxy @proxy_name = N'My custom proxy', @credential_name = N'Superuser', @enabled = 1
GO
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name = N'My custom proxy', @subsystem_id = 3 -- 3 = CmdExec
GO
EXEC msdb.dbo.sp_grant_login_to_proxy @proxy_name = N'My custom proxy', @fixed_server_role = N'sysadmin'
GO

-- assigning the job step to run as the proxy user
USE [msdb]
GO
EXEC msdb.dbo.sp_update_jobstep @job_id = N'0df2dac2-4754-46cd-b0bf-05ef65e1f87e', @step_id = 1, @subsystem = N'CmdExec', @proxy_name = N'My custom proxy'
GO