Zabbix internal and external housekeeping

The Zabbix GUI configuration has an "Internal housekeeping" option for some types of data. Can anyone tell me what the difference is between internal and external housekeeping? What happens if I uncheck "Enable internal housekeeping"?
"It is possible to override the history/trend storage period even if
internal housekeeping is disabled. Thus, when using an external
housekeeper, the history storage period could be set using the history
Data storage period field."

When internal housekeeping is enabled, Zabbix automatically removes old data, as per the sections in your screenshot. "External housekeeping" is something you would implement yourself and Zabbix would not have knowledge or control over it.
The history/trend retention period override controls how the frontend behaves (it disables those fields in the frontend and, in some cases, how graphs display data), but it does not affect the actual data retention.
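If you do disable internal housekeeping, the "external housekeeper" is typically just a scheduled job you run against the database yourself. A minimal sketch of that idea in Python (assuming a PostgreSQL backend, the psycopg2 driver and the standard Zabbix history/trends tables; the retention values are only illustrative):

```python
# Minimal external-housekeeper sketch: delete history/trends rows older
# than a chosen retention period. Assumes a PostgreSQL backend and the
# psycopg2 driver; the retention values below are illustrative only.
import time
import psycopg2

RETENTION = {
    "history": 14 * 86400,        # keep 14 days of raw float history
    "history_uint": 14 * 86400,   # keep 14 days of unsigned int history
    "trends": 365 * 86400,        # keep 1 year of trends
    "trends_uint": 365 * 86400,
}

def housekeep(dsn):
    now = int(time.time())
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            for table, keep_seconds in RETENTION.items():
                cutoff = now - keep_seconds
                # Zabbix stores timestamps in the integer 'clock' column.
                cur.execute(f"DELETE FROM {table} WHERE clock < %s", (cutoff,))
                print(f"{table}: removed {cur.rowcount} rows older than {cutoff}")

if __name__ == "__main__":
    housekeep("dbname=zabbix user=zabbix password=secret host=127.0.0.1")
```

In practice most external housekeepers partition these tables and drop whole partitions instead of running large DELETEs, but the principle is the same: the cleanup happens outside Zabbix, which neither knows nor controls it.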

Related

Huge data transfer usage on RDS w/ MySQL

We started using RDS last month for our database needs, but we're seeing "data transfer in" usage of about 3–6 GB each day. Our database is only about 4 GB in size. How is that possible? Is this some misconfiguration on my part?
We're also seeing 8–14 GB of "data transfer out" each day and I really can't say why.
It's my first time using AWS (we're also using S3, but I've checked the reports and everything is accurate there), so I'm kind of lost.
For context, our application is built in JSF2 and we use Hibernate. We also have a web service running on PHP for a mobile application. We expect anywhere between 20 and 200 users daily, 24/7.
I've set up the security groups to only allow inbound traffic from our servers (and I removed all outbound rules; is that fine?).
Our instance: Single-AZ class db.t2.micro

PostgreSQL monitoring with Zabbix

In Zabbix, is it possible to monitor many PostgreSQL databases? Only one database server can be defined in odbc.ini.
Thanks.
If you look at the Zabbix share site here, you will see several templates which provide for monitoring via the Zabbix agent. By using the Zabbix agent instead of the server-side ODBC option, you push the actual connectivity out to the agent (which could, of course, be on the same server as Zabbix). This allows you to do separate discovery and even separate credentials for each server (e.g. in user macros).
I have not experimented with the templates there; I merely offer them as examples. Determining what exactly to monitor on a database is the more difficult subject, of course, since it can be highly variable by site -- what is "normal" at that site, and so what is worthy of an alert. But with LLD and the ability to run arbitrary queries, you can build any items and triggers you may need. Effectively you can have your DBA (which might also be you, of course) craft the criteria and put it wholesale into the template.
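To make the "arbitrary queries per database" point concrete, the command such an agent-side check ends up running can be as small as the following Python sketch; the connection details, script name and example query are placeholders of mine, not something taken from those templates:

```python
#!/usr/bin/env python3
# Sketch of an agent-side check: run an arbitrary query against a named
# PostgreSQL database and print a single value for Zabbix to collect.
# The database name, credentials and query are placeholders; a real
# setup would typically take credentials from a config file or macros.
import sys
import psycopg2

def query_one(dbname, sql, user="zabbix_monitor", host="127.0.0.1"):
    with psycopg2.connect(dbname=dbname, user=user, host=host) as conn:
        with conn.cursor() as cur:
            cur.execute(sql)
            row = cur.fetchone()
            return row[0] if row else None

if __name__ == "__main__":
    # e.g.: pg_query.py mydb "SELECT count(*) FROM pg_stat_activity"
    db, sql = sys.argv[1], sys.argv[2]
    print(query_one(db, sql))
```

Hooked up as a flexible UserParameter (or invoked from a template's low-level discovery rule), each discovered database can then get its own items, triggers and credentials without ever touching odbc.ini on the Zabbix server.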

FIWARE SpagoBI cockpit graphics not updating

None of the graphics in my cockpit are updating, even though the data source dataset is scheduled to refresh every minute, and checking in the database shows that the dataset is indeed updated correctly every minute...
my dataset config:
How can I see the updated graphics? Do I maybe need to change something in the SpagoBI server or configuration?
The cockpit uses a cache mechanism that makes it possible to query datasets coming from different data sources and join them, but this cache has nothing to do with the dataset's persistence.
At the moment, there are two ways to get updated data while using the cockpit:
by cleaning the cache using the button inside the cockpit itself;
by using the cache cleaning scheduling setting.
In the latter case, enter Configuration Management as admin and change the value of the
SPAGOBI.CACHE.SCHEDULING_FULL_CLEAN
variable to HOURLY. This setting creates a job that periodically (every hour, which is the minimum) cleans the cache used by cockpits.

Global variables and sessions in ASP.NET

I'm new to web development, and coming from the world of Java and Android I have a few questions. (I'm using ASP.NET.)
Let's assume I have a simple webpage with a label showing a number and a button. When any user presses the button, the number gets incremented automatically for all the users viewing the site, even if they do not refresh the page. Would I use sessions to achieve this, or is there another concept I should look into?
I have two types of counters, which I store in a MySQL table with the following schema.
Counter_ID Increment_Value
Each counter is active for a set amount of time and only one instance of a counter can be active at one point in time. After this time, the counter is reset to 0 and a new instance of the counter is created. I store all the instances which are active as well as past instances in a table with this schema.
Instance_ID Counter_ID Counter_Value Status(Active/Complete) Time_Remaining
When a user opens a page dedicated to one of the two counter types, the information about the currently running instance of that counter needs to be loaded. Would I just execute a SQL query and read the information for active counters every time the counter page is loaded, or is there a way to store this information on the site so that it "knows" which instance is currently active and does not require a SQL query for each request (some kind of global variable concept)? Obviously, the situations described above are just simplified examples which I use to explain my issue.
You can use ApplicationState to cache global values that are not user-specific. In your first example, since the number is incremented for all users you can transactionally store it in the database whenever it is incremented, and also cache it in ApplicationState so that it can be read quickly when rendering pages on the server. You will have to be careful to ensure you are handling concurrency properly so that each time the number is incremented the Database AND the cache are updated atomically.
It's a little unclear from your question, but if your requirement is to also publish changes to the number in real-time to all users who are currently using your website you will need to look at real-time techniques. Websockets are good for this (if available on the server and client browser). Specifically, on the .NET platform SignalR is a great way to implement real-time communication from server to client and with graceful fall-back in case WebSockets are not supported.
Just to be clear, you would not use Session storage for this scenario (unless I have misinterpreted your question). Session is per-user and should typically not affect other users in the system. Your example is all about global values so Session is not the correct choice in this case.
For your second example, using ApplicationState and transactional DB commits you should be able to cache which counter is currently active and switch them around at will, provided you lock all your resources while you perform the switch between them.
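In case it helps, here is the shape of that "update the database and the cache under one lock" pattern as a language-agnostic sketch (written in Python purely for brevity; in ASP.NET the cached value would live in ApplicationState or a static field, and the lock would be a normal .NET lock):

```python
# Language-agnostic sketch of the pattern described above: every increment
# updates the database transactionally and refreshes the in-process cache
# under one lock, so readers always see a value consistent with the DB.
# (In ASP.NET the cached value would live in ApplicationState / a static.)
import sqlite3
import threading

class GlobalCounter:
    def __init__(self, db_path="counters.db"):
        self._lock = threading.Lock()
        # One shared connection, guarded by the lock below.
        self._db = sqlite3.connect(db_path, check_same_thread=False)
        self._db.execute(
            "CREATE TABLE IF NOT EXISTS counter (id INTEGER PRIMARY KEY, value INTEGER)")
        self._db.execute("INSERT OR IGNORE INTO counter (id, value) VALUES (1, 0)")
        self._db.commit()
        self._cached = self._db.execute(
            "SELECT value FROM counter WHERE id = 1").fetchone()[0]

    def increment(self):
        with self._lock:                      # serialize writers
            self._db.execute("UPDATE counter SET value = value + 1 WHERE id = 1")
            self._db.commit()                 # transactional DB update...
            self._cached = self._db.execute(  # ...and cache refresh under the same lock
                "SELECT value FROM counter WHERE id = 1").fetchone()[0]
            return self._cached

    def read(self):
        return self._cached                   # cheap read when rendering a page
```

Pushing the new value out to pages that are already open is then the job of SignalR or another real-time channel; the cache only covers the server-side rendering path.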
Hopefully that's enough information to get you heading in the right direction.

Should I move client configuration data to the server?

I have a client software program used to launch alarms through a central server. At first it stored configuration data in registry entries, now in a configuration XML file. This configuration information consists of Alarm number, alarm group, hotkey combinations, and such.
This client connects to a server using a TCP socket, which it uses to communicate this configuration to the server. In the next generation of this program, I'm considering moving all configuration information to the server, which stores all of its information in a SQL database.
I envision using some form of web interface to communicate with the server and set up the clients, rather than the current method, which is either to configure the client software on the machine through a control panel, or at install time to either push out an XML file or pass command-line parameters to the MSI. I'm thinking the only information I would want to specify at install time would be the path to the server. Each workstation would be identified by computer name and configured through the server.
Are there any problems or potential drawbacks of this approach? The main goal is to centralize configuration and make it easier to make changes later, because our software is usually managed by one or two people at most.
Other than allowing the client to function offline (if such a possibility makes sense for your application), there doesn't appear to be any drawback to moving the configuration to a centralized location. Indeed, even with a centralized location, a feature can be added to the client to cache the last known configuration for use when the client is offline.
If you implement a centralized database design, I suggest considering storing configuration parameters in an Entity-Attribute-Value (EAV) structure, as this schema is particularly well suited for parameters. In particular, it allows easy addition and removal of individual parameters and also handles parameters as a list (paving the way for a list-oriented display in the UI, so no UI changes are needed when new types of parameters are introduced).
Another reason why configuration parameter collections and EAV schemas work well together is that even with very many users and configuration points, the configuration data remains small enough that it doesn't suffer from some of the limitations of EAV with "big" tables.
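To make the EAV suggestion concrete, here is a minimal sketch (Python with SQLite purely for illustration; the table and helper names are my own invention): each parameter is a single row, so adding a new kind of parameter requires no schema change, and a client's whole configuration reads back naturally as a list.

```python
# Minimal EAV-style configuration store: each parameter is one row
# (entity = workstation, attribute = parameter name, value = its setting),
# so new parameter types need no schema change. Table and helper names
# are illustrative only; any SQL backend would do.
import sqlite3

db = sqlite3.connect("config.db")
db.execute("""
    CREATE TABLE IF NOT EXISTS client_config (
        entity    TEXT NOT NULL,   -- e.g. workstation name
        attribute TEXT NOT NULL,   -- parameter name, e.g. 'alarm_group'
        value     TEXT NOT NULL,   -- parameter value, stored as text
        PRIMARY KEY (entity, attribute)
    )
""")

def set_param(entity, attribute, value):
    db.execute(
        "INSERT OR REPLACE INTO client_config (entity, attribute, value) VALUES (?, ?, ?)",
        (entity, attribute, str(value)),
    )
    db.commit()

def get_params(entity):
    # Returns the configuration as a list of (attribute, value) pairs,
    # which maps directly onto a list-style editing page in the UI.
    return db.execute(
        "SELECT attribute, value FROM client_config WHERE entity = ?", (entity,)
    ).fetchall()

set_param("WORKSTATION-01", "alarm_group", "north-wing")
set_param("WORKSTATION-01", "hotkey", "Ctrl+Alt+F9")
print(get_params("WORKSTATION-01"))
```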
The only thing that comes to mind is security of the information, though you probably have that issue in either case. It would probably be easier to interface with a database, as everything would be in one spot.