Zabbix: monitoring file size when the path depends on the application name

Hi, I want to monitor the growth of a log file's size with Zabbix, but the path differs for every application.
The log file is in: d:\data\[foo]\data\log\server.log, where [foo] is an application name.
How can I monitor the growth of the file size? And when the growth is too fast, I want to trigger an alert.
Can someone help me?

Create an LLD (low-level discovery) rule that finds all the files using a dedicated user parameter, and create an item prototype with vfs.file.size[{#FULL_PATH}].
The corresponding trigger prototype can be a threshold or a forecast. The first one is simpler, like "if file size is bigger than 10 MB".
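A minimal sketch of the agent side (the discovery key, the script path, and the 10M-per-hour threshold are assumptions, not standard Zabbix names):
UserParameter=custom.logfile.discovery,powershell -NoProfile -File "C:\scripts\discover_logs.ps1"
The script must print LLD JSON listing every matching log file, for example:
{"data":[{"{#FULL_PATH}":"d:\\data\\foo\\data\\log\\server.log"},{"{#FULL_PATH}":"d:\\data\\bar\\data\\log\\server.log"}]}
The item prototype key is then vfs.file.size[{#FULL_PATH}], and a trigger prototype for "growing too fast" could use the delta() function (Zabbix 3.x syntax), which returns the difference between the maximum and minimum values over the period:
{myhost:vfs.file.size[{#FULL_PATH}].delta(1h)}>10M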

Related

CloudConnect CSV buffer size

When I try to load a big CSV from a zip file, the execution log gives me the following error:
----------------------------------------- Error details ------------------------------------------
Component [Clientes:CLIENTES1] finished with status ERROR.
The size of data buffer is only 100663296. Set appropriate parameter in defaultProperties file.
--------------------------------------------------------------------------------------------------
How can I set the appropriate parameter in the defaultProperties file?
I tried this link, but my CloudConnect run configurations page is different from the one in the link:
I've created the parameters file and filled in the additional parameters with the right values as the tutorial said (code below), and the same error appears on the screen.
Name: -config; Value: new_buffer_size.txt
The new_buffer_size.txt content has just this line: DEFAULT_INTERNAL_IO_BUFFER_SIZE = 200000000
How can I solve this problem? I need to solve this before the world explodes.
CloudConnect is designed for developing ETLs that run on GoodData cloud workers, and therefore some lower-level settings are overridden, as in this case. The only legitimate way is to modify the ETL so that it can process the data with the current settings. Regarding the docs, the referenced article is outdated. The GoodData docs team is aware of it and is preparing a docs refactoring.
Note: As you have probably noticed, CloudConnect is powered by Javlin's CloverETL, so feel free to check their forums; you will find there how to overcome the issue at a lower level (no UI), but it would work only for data processing on the local machine.

File-Monitoring via Lua Script

Good evening,
I am currently developing a way to import machine-created data from a CSV sheet into a database.
The question I have is: is there a way to react to a change in a CSV file with Lua?
The file gets a line in this format:
17162H,"801234500001",9/23/2016 12:33:30 PM,"INV"
Every time a scanner finishes a scan, a line is appended below the old lines, but there is no direct connection to the database to trigger the script.
It doesn't matter whether the change is detected via a different file size, the folder size (of the folder that contains the file), or a change in the file information (like the date of last opening), but I can't keep the file open and read it permanently for performance reasons.
Also, this is the first time I've asked here, so sorry for my clunky way; I'll try to improve over time.
Take a look at linotify; it has Lua bindings for inotify and looks like it should do the trick, using the "modify" event to trigger your script.
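A minimal sketch, assuming the linotify rock is installed and using a placeholder path for the CSV file:
local inotify = require("inotify")
local handle = inotify.init()
-- IN_MODIFY fires every time the scanner appends a line to the file
handle:addwatch("/var/data/scan.csv", inotify.IN_MODIFY)
for event in handle:events() do
    -- read only the new tail of the file here, then insert it into the database
    print("file modified")
end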
I use a LibUV-based variant in my spylog application.
Usage:
file_monitor(path_to_file, {eol = '\r?\n'}, function(line)
    ...
end)
If you need to run this on Windows, you can use the winapi library, which supports file watchers. Here is an example of how it's used in one of my projects; you'll need to call winapi.sleep() to allow time for the check to trigger.

Zabbix - track config files

I would like to track changes to one config file. The reason for this is that multiple users access it to solve different issues, but every now and then those fixes break something else. The diff function in Zabbix shows that a file was changed, but I would like Zabbix to also track what changed. Is there a combination of triggers that would let me do that? Any help is greatly appreciated.
Do you store the file checksum or the file contents in the item? In any case, there is no built-in way to do that, but you can implement it with a script.
If the checksum, you will need a way to store the previous version and the new version, and to run the diff command. The easiest would be a userparameter that does a diff between a temporary copy of the file and the current file, then copies the current file over the temporary copy. In this case, you would store the diff results directly in an item, and your trigger would check that the last value is not an empty string. See https://www.zabbix.com/documentation/3.0/manual/config/items/userparameters for more information on userparameters.
If you are storing file contents already, presumably you want to reuse them. This would be a bit more complicated, as you would have to kick off the script whenever a new value arrives - maybe a special trigger could kick off an action that would compare the last two values (probably using the API), then push the result in another item that has another trigger. Unless you have a good reason to do it this way, I'd opt for the first approach.
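A minimal sketch of that userparameter approach (the key name and all paths are assumptions):
UserParameter=custom.config.diff,/opt/script/config_diff.sh
And the script itself:
#!/bin/sh
# /opt/script/config_diff.sh - print the diff since the last run, then refresh the copy
FILE=/etc/myapp/app.conf        # the config file to track (placeholder)
COPY=/var/tmp/app.conf.prev     # previous version, writable by the zabbix user
[ -f "$COPY" ] || cp "$FILE" "$COPY"
diff "$COPY" "$FILE"            # the output becomes the item value
cp "$FILE" "$COPY"
The trigger can then be {myhost:custom.config.diff.strlen()}>0, i.e. "the last value is not an empty string".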
Make a copy of your file, e.g. file.txt.copy, and make this copy writable only by the zabbix user.
Create an item and a trigger in Zabbix to check when the file was changed (using diff or a checksum).
Create an action in Zabbix to execute a script that will:
1 - diff file.txt against file.txt.copy and send the difference to your email;
2 - copy file.txt to file.txt.copy so you can do the diff the next time the file changes.
To create an action with a script:
Create an action in Zabbix, go to the "Operations" tab, and select "Remote command" as the operation type.
Choose custom script.
Put in the script with its full path and arguments.
Sample
/opt/script/my_script.sh
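The script itself could be as simple as this sketch (the paths and the mail address are placeholders):
#!/bin/sh
# /opt/script/my_script.sh - mail the diff, then refresh the reference copy
diff /etc/file.txt /etc/file.txt.copy | mail -s "file.txt changed" admin@example.com
cp /etc/file.txt /etc/file.txt.copy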
The zabbix user must have permission to run the script.
Zabbix docs

Image synchronisation in a single Region

I would like to know something about the fiware-glancesync component. I would like to synchronise only one image; I mean, I want to synchronise one single image in a region without modifying the current configuration file. How can I define new configuration parameters (if it is possible) to do it with GlanceSync?

The algorithm used to select the images can be defined by the user. The easiest and best way to synchronise only one image, or a set of images, is to modify the glancesync.conf configuration file inside the ./conf directory. I recommend creating a new section [test] so that you do not modify the current [master] section. Just write the following lines:
[test]
metadata_condition = image.name == 'GIS_GE'
credential= admin,<your secret>,http://130.206.112.3:5000/v2.0,admin
Keep in mind that '130.206.112.3' is the IP of the keystone service inside the FIWARE Lab, and the first and second 'admin' are the OS_USERNAME and OS_TENANT_NAME. Last but not least, 'your secret' is the password in base64 format.
Then just execute the command:
./sync.py test:<name of the node, e.g. Lannion2>
See the documentation in GlanceSync - Glance Synchronization Component for more details about image synchronisation.
If you want more information about the configuration of GlanceSync, take a look at GlanceSync Configuration.

How can I get a Windows batch or Perl script to run when a file is added to a directory?

I am trying to write a script that will parse a local file and upload its contents to a MySQL database. Right now, I am thinking that a batch script that runs a Perl script would work, but I am not sure if this is the best method of accomplishing this.
In addition, I would like this script to run immediately when the data file is added to a certain directory. Is this possible in Windows?
Thoughts? Feedback? I'm fairly new to Perl and Windows batch scripts, so any guidance would be appreciated.
You can use Win32::ChangeNotify. Your script will be notified when a file is added to the target directory.
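A minimal sketch (the directory and the handler script are placeholders; see the module's docs for the full filter list):
use strict;
use warnings;
use Win32::ChangeNotify;

# Watch the drop directory, non-recursively, for file-name changes
# (files added, removed, or renamed).
my $notify = Win32::ChangeNotify->new('C:\data\incoming', 0, 'FILE_NAME');

while (1) {
    $notify->wait or next;    # blocks until something changes
    $notify->reset;           # re-arm before handling, to avoid missing events
    system('perl', 'process_upload.pl');    # hypothetical handler script
}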
Checking a folder for newly created files can be implemented using the WMI functionality. Namely, you can create a Perl script that subscribes to the __InstanceCreationEvent WMI event that traces the creation of the CIM_DirectoryContainsFile class instances. Once that kind of event is fired, you know a new file has been added to the folder and can process it as you need.
These articles provide more information on the subject and contain VBScript code samples (hope it won't be hard for you to convert them to Perl):
How Can I Automatically Run a Script Any Time a File is Added to a Folder?
WMI and File System Monitoring
The function you want is ReadDirectoryChangesW. A quick search for a Perl wrapper yields this Win32::ReadDirectoryChanges module.
Your script would look something like this:
use strict;
use warnings;
use Win32::ReadDirectoryChanges;

my $path   = 'C:\data\incoming';    # directory to watch (placeholder)
my $filter = 'FILE_NAME';           # react to files being added or renamed

my $rdc = Win32::ReadDirectoryChanges->new(
    path    => $path,
    subtree => 1,
    filter  => $filter,
);

while (1) {
    my @results = $rdc->read_changes;
    while (scalar @results) {
        my ($action, $filename) = splice(@results, 0, 2);
        # ... run script ...
    }
}
You can easily achieve this in Perl using File::ChangeNotify. This module is to be found on CPAN: http://search.cpan.org/dist/File-ChangeNotify/lib/File/ChangeNotify.pm
You can run the code as a daemon or as a service, make it watch one or more directories and then automatically execute some code (or start up a script) if some condition matches.
Best of all, it's cross-platform, so should you want to switch to a Linux machine or a Mac, it would still work.
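A minimal sketch with File::ChangeNotify (the directory and the handler script are placeholders):
use strict;
use warnings;
use File::ChangeNotify;

my $watcher = File::ChangeNotify->instantiate_watcher(
    directories => ['C:/data/incoming'],    # placeholder path
    filter      => qr/\.csv$/,              # only react to CSV files
);

# wait_for_events blocks until something changes in the watched directories
while ( my @events = $watcher->wait_for_events ) {
    for my $event (@events) {
        next unless $event->type eq 'create';
        # hypothetical handler: parse the new file and load it into MySQL
        system( 'perl', 'upload_to_mysql.pl', $event->path );
    }
}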
It wouldn't be too hard to put together a small C# application that uses the FileSystemWatcher class to detect files being added to a folder and then spawn the required script. It would certainly use less CPU / system resources / hard disk bandwidth than polling the folder at regular intervals.
You need to consider what is a sufficient heuristic for determining "modified".
In increasing order of cost and accuracy:
file size (the file content can still change as long as the size is maintained)
file timestamp (if you aren't running ntpd, the timestamp is not monotonic)
file sha1sum (bulletproof but expensive)
I would run ntpd, and then loop over the timestamps, and then compare the checksum if the timestamp changes. This can cover a lot of ground in little time.
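A minimal sketch of that timestamp-then-checksum loop (the path and the poll interval are placeholders):
use strict;
use warnings;
use Digest::SHA;

my $file = 'C:/data/incoming/data.csv';    # placeholder path
my ( $last_mtime, $last_digest ) = ( 0, '' );

while (1) {
    my $mtime = ( stat $file )[9] // 0;    # cheap check first
    if ( $mtime != $last_mtime ) {
        $last_mtime = $mtime;
        my $digest = Digest::SHA->new('sha1')->addfile($file)->hexdigest;
        if ( $digest ne $last_digest ) {    # expensive check only on change
            $last_digest = $digest;
            print "file really changed\n";  # run the handler here
        }
    }
    sleep 2;
}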
These methods are not appropriate for a computer security application; they are for file management on a sane system.