Executing a script from Zabbix Server with parameters - zabbix

I have made some scripts that a Zabbix user can run from the Zabbix server.
I would like to know if it is possible to run a script (from Administration -> Scripts) on the Zabbix server with parameters filled in.
For example, I have the script "doSomething" on the Zabbix server that launches a script locally; this script needs one parameter that changes every time. How can I do that?
Thank you

From the documentation, the scripts defined in Administration -> scripts
become available for execution by clicking on the host in various
frontend locations (Dashboard, Problems, Latest data, Maps)
You can supply parameters with macros:
The following macros are supported in the commands: {HOST.CONN},
{HOST.IP}, {HOST.DNS}, {HOST.HOST}, {HOST.NAME}. If a macro may
resolve to a value with spaces (for example, host name), don't forget
to quote as needed. Since Zabbix 2.2, user macros are supported in
script commands.
You can define a user macro (e.g. {$SOMEPARAM}) and use it as a parameter, but you need an external tool to redefine its value: it depends on what you are trying to achieve.
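For illustration, here is a minimal sketch of what the command in Administration -> Scripts could look like, assuming a hypothetical wrapper script /usr/local/bin/doSomething.sh on the server and a user macro {$SOMEPARAM} defined on the host:

/usr/local/bin/doSomething.sh "{HOST.NAME}" "{$SOMEPARAM}"

To change the parameter between runs, the macro value has to be updated first, e.g. through the frontend or with a Zabbix API call such as usermacro.update (a sketch; the server URL, auth token, and hostmacroid are placeholders):

curl -s -X POST http://zabbix.example.com/api_jsonrpc.php \
  -H 'Content-Type: application/json-rpc' \
  -d '{"jsonrpc":"2.0","method":"usermacro.update","params":{"hostmacroid":"42","value":"new-value"},"auth":"<token>","id":1}'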

Zabbix 5.4: Report manager is disabled

I just installed Zabbix 5.4 and I want to test 'scheduled reports', but I keep getting:
Report manager is disabled.
Since the feature is new, I can't find a solution. So if someone is familiar with it, please help me fix it and enable the report manager.
You need to enable the report writer in the server config.
In /etc/zabbix/zabbix_server.conf (assuming you are running Zabbix on a Linux machine), change the value of StartReportWriters to at least 1, then restart the server.
Edit
You also need to set the WebServiceURL parameter in zabbix_server.conf. The URL should be in the format <host>:<port>/report. The default port is 10053.
WebServiceURL=http://localhost:10053/report
The /report path is mandatory and hardcoded, so it cannot be changed.
Next, the Frontend URL parameter must be set. Go to Administration → General → Other parameters and specify the full URL of the Zabbix web interface in the Frontend URL parameter.
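Putting the server-side settings together, a minimal sketch of the relevant zabbix_server.conf lines (assuming the Zabbix web service runs locally on its default port):

# /etc/zabbix/zabbix_server.conf
StartReportWriters=1
WebServiceURL=http://localhost:10053/report

Restart the server afterwards, e.g. with systemctl restart zabbix-server (the service name may differ per distribution).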
More detailed instructions can be found in official documentation.

File share delivery of SSRS without domain

I am trying to schedule the delivery of a report to a shared folder in a workgroup (without a domain), but I keep getting the error message:
Failure writing file: A logon error occurred when attempting to access the file share. The user account or password is not valid.
I have tried several combinations of accounts, with or without the computer name:
ShareAccount
Share\ShareAccount
Server\ServerAdmin
Server\ShareAccount
I have also created an identical account with the same password on both sides.
I have also tried setting and unsetting the unattended execution account with the server administrator account.
I am sure the shared folder can be accessed with the same UNC path and account in Windows Explorer. I'm not sure what else I can try.
Has anybody successfully done file share delivery without a domain? Or is there any other way I can schedule a report export?
This feature works fine in SSRS, so it is your settings that are wrong.
You will also want to have the subscription run as a specified user.
Create a local user on the computer to where you wish to save your report. Call it ReportUser.
For the purposes of this answer, we will call the computer where you wish to save the report FileServer.
ReportUser needs write access to the share you are trying to use.
Try your report - if it still doesn't work then:
Launch Windows Explorer but Run As your new ReportUser - you will need to enter the password you have just created.
Navigate to the share by typing \\computername\fileshare - this proves your share is setup correctly.
Right-click in the folder and create a new text document. This proves you have write permission to the folder.
Successfully completing those steps will mean that SSRS will be able to write to the share.
Within SSRS you need to be writing to:
\\computername\fileshare
The username will be FileServer\ReportUser, with the password that you have just created.
One more thing - run the schedule straight after your test - to prove something isn't happening to the network, e.g. overnight maintenance etc.
Environment: All machines are Windows Server. SSRS (SQL Server 2016 version) runs on one machine (the SSRS service is the sole process running there). The SSRS catalog is on another machine that hosts SQL Server 2016. File delivery goes to a third machine.
On the SSRS machine (the one hosting the Reporting Services service), create a local account.
On the receiving machine (the one where the file will be delivered), create a local account with the same name and password as above. Also on the receiving machine, share a directory and grant read/write permissions to the local account just created.
On the Subscription tab of the Report Manager interface (or whatever is used to create a subscription), for the "Credentials used to access the file share" setting, select "Use the following Windows user credentials". Enter the name of the account created above, but do not prefix it with anything ("FILESERVER\ShareDeliveryUser" bad; "ShareDeliveryUser" good). Enter the password.
I tried numerous combinations, including attempting to use the "file share account," but this was the only way that worked.
Strangely, on the Report Manager interface, the "Result" of the last run always shows "Failure writing file...", although the file is indeed delivered.
Attributing the original answer to the post by user ExoStatic here: https://social.msdn.microsoft.com/Forums/en-US/bdc5b51c-444b-442d-9657-3cf5495e79d0/file-share-delivery-failing#7725882e-d7c6-4b3d-88f6-2620409c3d48. Edited for clarity.

Customizing a GCE Ubuntu VM image

I have a Google Cloud Platform account that I access from a VirtualBox VM. I am using the Google Compute Engine for a project that I am currently working on, and I had to create a custom image based on the Ubuntu 14.04 image that's available there.
I made changes to the Ubuntu image by ssh'ing into an Ubuntu 14.04 instance, (from my Vbox VM terminal) installing the Matlab compiler runtime, and downloading some other files that I needed. I created the custom image by following the steps according to the documentation.
However, now the changes I made are only available to me when I SSH from my VBox VM terminal. I need to be able to run a certain MATLAB program via startup scripts. How can I make it so that all users using this image have access to the customizations I made? Is there a way I can do this without having to make the edits by ssh'ing from the Developers Console and redoing all the changes?
EDIT: I don't think I was very clear, so I'll give some examples. Say my Google account is alexanderlang. When I ssh into an instance created from my custom image from the Developers Console, the bash prompt looks like:
alexanderlang@myinstance $
My VBox username is alex, and when I ssh into the same instance from my VBox terminal, the bash prompt looks like:
alex@myinstance $
alex@myinstance can run MATLAB programs, but alexanderlang@myinstance cannot. I'm talking about the same instance, created from the same image. I think this might have something to do with the SSH keys for my custom image, but I don't know how to change or remove those keys.
When you connect to your VM instance via ssh by using either Developers Console or gcloud, the user account is dynamically created (if it doesn't already exist) by setting metadata on the VM. The question is: how does each tool choose your username?
When you use Google Developers Console, the only information it knows about you is your Google Account name, so it uses that, e.g., <first-name>_<last-name> or similar.
When you connect to your instance via gcloud, it knows the value of $USER so it uses that instead.
Note that in either case, your account has passwordless sudo access, so if you want to switch from one account to the other, you can run:
sudo su alex
while logged in as alexanderlang and then you have access to all the programs that alex does.
Similarly, you can run:
sudo su alexanderlang
while logged in as alex to do the reverse.
Startup scripts run as root. To run commands as another user, you need to do two things:
change to that username
run commands as that user
sudo su alex will create a new shell and hence ignore the rest of the script (until you manually exit the user shell, which is not what you want).
You can use sudo su alex -c 'command to run' but since what you want to run is a complex script, you need to first save the script to a file, and then run it.
Your options are:
pre-create the shell script to run
dynamically generate it from the startup script
Doing (1) is easy if the script never changes. For frequently-changing scripts (and, it sounds like, many dynamically created VMs), you want to use option (2).
Here's how to do this in a startup script:
# Quote EOF so that variables in the helper are not expanded by the outer script.
cat > /tmp/startup-script-helper.sh <<'EOF'
#!/bin/bash
# ... put the script contents here ...
EOF
# The helper must be executable before su runs it.
chmod +x /tmp/startup-script-helper.sh
sudo su alex -c '/tmp/startup-script-helper.sh'
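If you create instances from the command line, such a startup script can be attached at creation time, e.g. (a sketch; the instance name, image, and zone are placeholders):

gcloud compute instances create my-instance \
    --image my-custom-image \
    --zone us-central1-a \
    --metadata-from-file startup-script=startup.sh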
You can use Packer to create a derived image from a stock GCE VM image. Packer will let you do the following very easily:
boot a GCE VM using an image you specify
run some customization step, e.g., shell script, or Chef/Puppet/etc.
save the resulting image in your Google Cloud Platform project
Then, you can boot any number of new VMs using your newly-created image.
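For illustration, a minimal Packer template for this workflow might look like the following (JSON format; the project ID, zone, image names, and provisioning script are placeholders, and your builder authentication setup may differ):

{
  "builders": [{
    "type": "googlecompute",
    "project_id": "my-project",
    "source_image_family": "ubuntu-1404-lts",
    "zone": "us-central1-a",
    "image_name": "ubuntu-1404-customized",
    "ssh_username": "packer"
  }],
  "provisioners": [{
    "type": "shell",
    "script": "customize.sh"
  }]
}

Running packer build on this template boots a temporary VM, runs the script, and saves the result as a new image in your project.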
Note that since your VM image will be stored on Google Cloud Storage, you will be charged for the space it uses. Current pricing for Google Cloud Storage standard class is USD $0.026 / GB / month. A typical VM image should be less than 1GB.
You can see a complete example of how I used Packer to build VMs and pre-install Ambari on them in my GitHub repo.

Gnome 3 automatic execution of a script that needs network

My old father is using Ubuntu GNOME. He has no static IP address. In order to perform remote administration, I need to know his IP. I was using a free DynDNS account (configured in the ADSL modem), but this will stop working in a couple of days.
I would like to run a script each time he logs in to publish his IP on my website. I have tried running a script at boot, but the network is not available yet. It seems that it is GNOME 3 that starts the network, but I do not know much about GNOME 3.
What should I do to have my script run automatically as soon as the network is available?
One possible non-elegant solution for this is to put your script in his cron to run every X minutes :)
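For example, a crontab entry along these lines (the script path is a placeholder) would publish the IP every five minutes:

*/5 * * * * /usr/local/bin/publish-ip.sh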
Looking at my /etc/NetworkManager/, it looks like there is a dispatcher.d folder that will do what you want. Just experiment with a bash/perl/python (whatever) script in there and set its permissions appropriately. You can find the UUID in the system-connections/ folder. More information is available in man NetworkManager.
EDIT: Look what I found: https://askubuntu.com/questions/13963/call-script-after-connecting-to-a-wireless-network. Seems like this is exactly what you want.
The easiest way is to use another dynamic DNS service. I used to run my own. You could also put a curl or wget command in cron, or create a systemd service that calls that command periodically. As a target, you would have to use a machine with a web server where you can see the IP in your logs.
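A minimal sketch of such a service/timer pair (unit names and the target URL are placeholders):

# /etc/systemd/system/publish-ip.service
[Unit]
Description=Publish the current IP to my web server
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
ExecStart=/usr/bin/curl -s http://yourserver.example.com/myip

# /etc/systemd/system/publish-ip.timer
[Unit]
Description=Run publish-ip.service periodically

[Timer]
OnBootSec=2min
OnUnitActiveSec=15min

[Install]
WantedBy=timers.target

Enable it with systemctl enable --now publish-ip.timer; the web server's access log will then record the client IP on every request.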
It is not GNOME that connects the network; it is a system service called NetworkManager. It tries to connect at boot if possible. In some cases it waits for a wireless signal, in other cases it waits for a user password. I recently verified that in Fedora, NetworkManager properly implements systemd's network-online.target, but it may have yet to be fixed in other distributions; see the upstream bug report:
https://bugzilla.gnome.org/show_bug.cgi?id=728965
If you want to run a system service just after boot, you need to use:
[Unit]
...
Wants=network-online.target
After=network-online.target
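Note that on NetworkManager-based systems, network-online.target only involves a real wait if the wait-online service is enabled; if it is not already, a likely-needed step (verify for your distribution) is:

systemctl enable NetworkManager-wait-online.service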
You could also just run a script that calls nm-online at the beginning to wait for network connectivity, if you can expect the connectivity to come up in reasonable time; otherwise it times out. Such a script can be run from any environment, including a user session.
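For example (a sketch; the script path is a placeholder):

nm-online -q -t 60 && /usr/local/bin/publish-ip.sh

nm-online exits successfully as soon as the network is up, or with a failure after the timeout.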
And, as noted already, you can put a script into /etc/NetworkManager/dispatcher.d that will be called on any network configuration change; such a script can then filter for connection-up events and start the notification script.
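A minimal dispatcher script might look like this (the file name and target URL are placeholders; the file must be owned by root and be executable):

#!/bin/bash
# /etc/NetworkManager/dispatcher.d/99-publish-ip
# NetworkManager calls dispatcher scripts with two arguments:
# the interface name and the action (up, down, vpn-up, ...).
interface="$1"
action="$2"
if [ "$action" = "up" ]; then
    # Publish the new address; ignore failures so NetworkManager is not blocked.
    curl -s "http://yourserver.example.com/myip" >/dev/null || true
fi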

The user data source credentials do not meet the requirements to run this report or shared dataset error when running reports

I get the following error when trying to run reports:
The current action cannot be completed. The user data source credentials do not meet the requirements to run this report or shared dataset. Either the user data source credentials are not stored in the report server database, or the user data source is configured not to require credentials but the unattended execution account is not specified. (rsInvalidDataSourceCredentialSetting)
By the way, I am running it from VS2010 with SQL Server 2008 Reporting Services.
How do I solve this issue?
Yes, I've seen this. You can set the Credential and Connection Information so that a report runs impersonating the unattended execution account. This article explains how to set up this kind of report execution. This setup is especially useful if you want to use the credentials inside a dynamic connection string (for example, when you need to insert the credentials through a parameter).
If you don't want to run using the unattended user account, you should review your DataSource and connection string as defined in the report. Perhaps play around with the settings and different configurations for the datasource to create a different setup. The above links should be a start for some documentation.
In my case, it was because of some deployment parameters.
Go to the project properties by right-clicking the project name in Solution Explorer and selecting Properties.
In Configuration Properties > General, set both OverwriteDatasets and OverwriteDataSources in the Deployment section to True.
Click OK.
In my case, replacing linked server connections with local data connections (data fetched from remote locations and stored in local tables) helped. We also checked this for ALL subreports/linked reports and it worked fine.
This happened to me today; it was because I was using the wrong data source in my report. So I changed the data source manually in Report Manager and it worked. I guess another option is to redeploy your report with the correct data source.
This happened to me today. I am using Visual Studio 2019 for creating the reports for SQL Server 2014.
One of my reports had an embedded data source, but it was unconfigured/not configured properly. (You can see embedded data sources in the "Report data" pane under "data sources").
However, the embedded data source wasn't actually being used. I created the embedded data source earlier for debugging and forgot about it.
After deleting the unused embedded data source, the error went away.
This is from Microsoft's documentation:
User Action
Change the settings for the current report so that it can run unattended, and then try to create the subscription or other scheduled operation again. Use the following steps to configure a report to run unattended:
1) Go to the Data Sources properties page of the report that you want to automate.
2) For the Connect Using option, select Credentials stored securely in the report server.
3) In User Name and Password, type credentials that can be used to access the database. If you are using SQL Server as the data source, the user name must be valid for both logging on to the server and for accessing the database that contains the data for the report.
4) If the user name and password are credentials for a Windows account, select Use as Windows Credentials. If the credentials are for a SQL Server user login, do not select this check box.
Do not select the check box Impersonate the authenticated user after a connection has been made to the data source, regardless of authentication type. This option cannot be used for reports that run unattended.