As we mentioned earlier, the latest Zabbix version – Zabbix 3.4 – brings new possibilities for mass data collection. Let's look at this functionality in more detail, and to make it clearer, we will walk through two examples:

  • one-time collection of data received in JSON from a console utility: the Mercury 236 electric power meter
  • collection of S.M.A.R.T. attributes of hard drives and SSDs, obtained in tabular form with smartmontools.

 

But what was the problem exactly?

Collecting data with console utilities or API calls was possible before, but there were difficulties:

  • the utility had to be started separately for each desired item of data
  • the resource (disk, port, meter, API application) was accessed once per data item
  • results had to be parsed with external scripts/utilities
  • and if the parsing later had to be fixed, you had to update the UserParameters or scripts yet again
  • on top of that, simultaneous requests from several Zabbix pollers could result in an access error, for example on a serial port.

In general, it was like this:

And with the emergence of dependent items, this became possible:

How does it work?

  • In Zabbix 3.4, the data source for an item can be another item, called the parent or master item. Such an item can, for example, contain an array of data in JSON, XML or free-form text.
  • When new data arrives at the master item, the remaining items, called dependent items, access it, and preprocessing functions such as JSON Path, XPath or regex extract the desired metric from the text.

Preprocessing itself is also new in version 3.4; it is implemented by the new preprocessing_manager and preprocessing_worker processes on the Zabbix server. So if you are upgrading from 3.2, do not forget to update the server template to monitor their work.

Let us turn to examples.

Mercury 236

Let's imagine that our project, besides containers, virtual machines, applications, network devices, databases, business indicators and everything else requiring control, also needs to monitor electricity and other engineering systems, such as climate equipment. We use a device standard for our region: the three-phase electricity meter Mercury 236 ART-01 PQRS with an RS-485 interface, over which communication runs via the manufacturer's proprietary protocol.

This is quite a challenge: collecting readings of voltage, power, current, consumption and frequency all at once. Connecting such a device to a server with a Zabbix agent is feasible – a serial port with RS-485, for example in the form of a USB adapter, is enough. But how should the data be read? If not for GitHub and the good people who shared their smart-home solution, we would have spent time writing a Zabbix module and teaching it to speak the meter's protocol and poll the readings.

The utility is simple and convenient (which earns its author a huge karma bonus): it connects to the meter via the specified port, reads the data and returns it to us as text, CSV or JSON.

Let’s try to install and run:
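The original screenshot is not reproduced here; schematically the build comes down to something like this (the repository URL is omitted – take it from the project page mentioned above):

```
git clone <mercury236 repository URL>
cd mercury236
make
./mercury236 --help
```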

It runs! Excellent – now let's connect the power meter, query it and receive the JSON:
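The screenshot of the output is not reproduced here; schematically the JSON looks something like this (field names and values are illustrative – only the U.p1 key, used below, is taken from the article):

```json
{
  "U": { "p1": 229.5, "p2": 230.1, "p3": 228.9 },
  "I": { "p1": 1.2, "p2": 0.9, "p3": 1.1 },
  "freq": 50.01
}
```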

As a result, the utility has done all the hard work for us: it implements the communication protocol, pulls out the data and even offers us the results as a convenient JSON object. But in previous Zabbix versions we could not use it that simply – we would have had to write a wrapper script and, most importantly, implement a mechanism for controlling access to the serial port. If two Zabbix pollers queried it simultaneously – one for the current of phase 3, the other for the current of phase 2 – we would receive nothing.

With 3.4 everything becomes much simpler: now we can quickly and easily get data from third-party console utilities into Zabbix without resorting to wrapper scripts, and without running the same utility 10 times, once for each data item. So,

let’s configure the launch of the mercury236 utility from Zabbix

To run the utility, let's create a new UserParameter in the Zabbix agent config:
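A minimal UserParameter could look like this (the binary path, serial port and flag are assumptions – check the utility's help for the exact syntax; only the mercury-get key follows the article):

```
UserParameter=mercury-get,/usr/local/bin/mercury236 /dev/ttyUSB0 --json
```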

Save the file, not forgetting to restart the Zabbix agent.

Now let’s create a parent item in the new template:

As you can see, there is nothing special about the master item – just a check through a Zabbix agent UserParameter. This means there are no restrictions on the type of check acting as a master item – the data could just as well come from a Zabbix trapper or an external check. Note, though, that we chose 'Type of information: Text' and a 'History storage period' of 1 day (you can set 0 days if you do not want to store the original message at all) – longer storage is meant for the individual metrics in the dependent items. Note! Preprocessing in this item is left empty.

Let's set up the collection of our power meter metrics

To start creating dependent items, you can use the new wizard. Or just click “Create item”:

Let’s create an item for the voltage of the first phase, select:

  • Type: Dependent item
  • Master item: mercury-get

Then, in the Preprocessing tab, add a JSON Path expression:

JSON path: $.U.p1

By the way, a small tip: in order not to spend a lot of time on debugging and error correction, before filling in the JSON Path it is convenient to check the expression online, for example at http://jsonpath.com/, by pasting in the JSON received from the utility.
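The effect of the $.U.p1 preprocessing step can be sketched in a few lines of Python (the sample JSON is illustrative; only the U.p1 key follows the article):

```python
import json

# Illustrative mercury236-style output; the values are made up
sample = '{"U": {"p1": 229.5, "p2": 230.1, "p3": 228.9}}'

def json_path_u_p1(text):
    """Emulate the JSON Path preprocessing step $.U.p1."""
    return json.loads(text)["U"]["p1"]

print(json_path_u_p1(sample))
```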

Other metrics of interest are created in a similar way, including the accumulated energy for the day rate.

To do this, create a new data item and select:

  • Type: Dependent item
  • Master item: mercury-get

But in the “Preprocessing” tab, pay attention to two nuances:

  • we will use the notation with square brackets, since the JSON path contains a hyphen
  • preprocessing can be multi-step, for example, here we multiply the result of the first step by 1000 to get W*h from kW*h
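The two steps can be emulated in a few lines of Python (the PR-day key name and value are made up for illustration; the article only tells us the real key contains a hyphen):

```python
import json

# Illustrative payload with a hyphenated key; the real key name depends
# on the mercury236 output
sample = '{"PR-day": 1234.5}'

def day_energy_wh(text):
    # Step 1: JSON Path with bracket notation, e.g. $['PR-day']
    kwh = json.loads(text)["PR-day"]
    # Step 2: custom multiplier of 1000 to turn kW*h into W*h
    return kwh * 1000

print(day_energy_wh(sample))
```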
 
Let's do the same for the other key metrics of the meter; in the end we get the following list:

Let’s perfect our template

To make the template complete, let’s add triggers with macros, making it as flexible as possible. Do not forget about trigger dependencies.

What comes out

The template is ready, the data is coming in; let's look at what we get:

All the latest data collected in one request:

Note that the last check timestamp of all the metrics is absolutely identical.

The final template for the power meter is available in the solution repository at share.zabbix.com here.

Summing up:

  • we reused a good piece of software and spent no time writing our own implementation of data collection over the Mercury protocol
  • the UserParameter remains, but collapsed to a simple command. In fact, you could even use system.run[]
  • we also did not have to write any wrapper scripts – everything is parsed with JSON Path in the template
  • the meter did not suffer much: one request, and all the data we needed arrived at once.

Smartctl and smartmontools

 

There are existing Zabbix solutions for monitoring the S.M.A.R.T. attributes of hard drives and SSDs with UserParameters. This approach works, but it is not free of shortcomings:

  • redundant launches of the smartctl utility, each of which had to access the drive controller
  • separate parsing had to be done for Linux and Windows. It is especially painful in Windows: (for /F… so… we have to escape double quotes with more double quotes… Aaarrrgh!!!!).

We will try to solve both shortcomings with 3.4.

The case with smartmontools has two differences from the example with the power meter above:

  • smartctl does not return JSON
  • a server can have a different number of drives, so we need to use low-level discovery (LLD).

But that's okay! First, dependent items also work with LLD; second, PCRE regular expressions are among the preprocessing options, and we will use them to pull the required attributes from the utility's loosely structured output. Approximately like this:

Let’s get started.

Simplifying UserParameters

Before:

After:
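The screenshots are not reproduced here, but the idea can be sketched as follows (the item keys and the parsing one-liner are illustrative, not the exact originals; -A prints the attribute table):

```
# Before: one UserParameter per attribute, with parsing in a shell one-liner
UserParameter=uHDD.5[*],sudo smartctl -A $1 | grep Reallocated_Sector_Ct | awk '{print $10}'

# After: a single UserParameter that returns the whole attribute table
UserParameter=uHDD[*],sudo smartctl -A $1
```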

We do the same for Windows, incidentally getting rid of the CMD magic with for /F and find. Look here.

Creating new parent items

To collect all the attributes of S.M.A.R.T., create a prototype of the master item:

As in the previous example, nothing special needs to be configured – only 'Type of information: Text' and a 'History storage period' of 1 day.

To collect test results and inventory data, smartctl is run with other flags, so we create two more items in a similar way:

  • uHDD.i[“{#DISKNAME}”]
  • uHDD.health[“{#DISKNAME}”]
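These keys are presumably backed by equally trivial UserParameters; -i prints identity/inventory data and -H the overall health status (both are standard smartctl flags, though the exact commands in the original may differ):

```
UserParameter=uHDD.i[*],sudo smartctl -i $1
UserParameter=uHDD.health[*],sudo smartctl -H $1
```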

Setting up collection of S.M.A.R.T. attributes

Create a dependent item for attribute 5, Reallocated:

And in the Preprocessing tab, use a regular expression:

Just as with JSON Path, in order not to spend a lot of time on debugging and error correction, it is convenient to check the regex online first, for example at https://regex101.com/, by pasting in our smartctl output.
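The preprocessing step can be emulated in Python. The smartctl line below is a realistic sample with made-up values, and the pattern (used with output template \1 in Zabbix) is one possible way to capture the raw value of attribute 5:

```python
import re

# A sample line from `smartctl -A` output (values are illustrative)
sample = ("  5 Reallocated_Sector_Ct   0x0033   100   100   010"
          "    Pre-fail  Always       -       17")

def reallocated(text):
    # Emulate the regex preprocessing step: pattern plus output \1
    m = re.search(r"Reallocated_Sector_Ct.*\s(\d+)", text)
    return m.group(1) if m else None

print(reallocated(sample))
```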

In the end, we get the following list of prototypes:

Let's test and see what happened

For two HDDs:

For SSD with Windows:

Summing up the example of smartmontools:

  • all parsing is removed from the UserParameters
  • no external scripts (except for LLD) and no external dependencies – all parsing happens on the Zabbix server, where it is easy to inspect and fix if necessary
  • if a utility or API does not return XML/JSON, it is no big deal – you can always try regular expressions
  • hard drives are no longer tortured: first we get the entire list of S.M.A.R.T. parameters in one go, and then split it into metrics on the Zabbix server.
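The one external script left is the LLD discovery itself. As a rough Python sketch (the drive names are examples), it only needs to print the drive list in Zabbix's discovery JSON format:

```python
import json

def lld_json(drives):
    """Build Zabbix low-level discovery JSON for a list of drive names."""
    return json.dumps({"data": [{"{#DISKNAME}": d} for d in drives]})

print(lld_json(["/dev/sda", "/dev/sdb"]))
```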

The updated template (with updated triggers and items added for SSDs) is available in the solution repository at share.zabbix.com here.

In conclusion

Mass collection of metrics with dependent items is a simple and easy way to reduce the load on the network and resources of the monitored systems, as well as reduce the need for external scripts. We are sure that many users of Zabbix will like it!

We’ll keep on exploring new features of Zabbix 3.4 in our future blogs. To be continued!

Check out our blog post on new possibilities of monitoring Java applications with Zabbix 3.4.

Thorsten.Kramm
6 years ago

This is a really awesome feature. I just tested it. It works great.
But I noticed something you should consider: if the JSON path contains a blank space, it does not work. If your returned JSON data contains keys with a blank, you must convert them on the sender side.
{"physical read IO requests": "6075", "physical read bytes": "73523200", "physical read requests optimized": "0"} is a valid JSON object,
but you cannot create a preprocessing step getting an object like
$. physical read IO requests.
Using $. "physical read IO requests" does not work either.
I ended up replacing blanks with underscores before sending the data to the Zabbix server.

But nevertheless, it's great. Now it's super easy to query the Oracle sysstat view with only a few lines of code.
#!/usr/bin/python

import sys
import cx_Oracle
import json

try:
    db = cx_Oracle.connect("hase/[email protected]")
    cursor = db.cursor()
    cursor.execute("select name,value from v$sysstat")
    ret = {}
    for row in cursor:
        key = row[0].replace(" ", "_")
        ret[key] = str(row[1])
    print json.dumps(ret)

except cx_Oracle.DatabaseError, e:
    error, = e.args
    print >> sys.stderr, "Oracle-Error-Code:", error.code
    print >> sys.stderr, "Oracle-Error-Message:", error.message

finally:
    cursor.close()
    db.close()

Don't forget to put ORACLE_HOME in the UserParameter:
UserParameter=oracle.sysstat,ORACLE_HOME=/u01/app/oracle/product/11.2.0/xe /var/lib/zabbix/oracleSysStat.py
