*Data Processing operations per Hour,* or DPH for short, is the number of operations performed by the Servitly back end each hour to process incoming data.

A specific number of *Data Processing operations* is consumed each time data needs to be processed, which includes:

- processing messages published by connected products to save the raw metric values in the database;
- processing computed metrics with *CONTINUOUS* evaluation;
- processing work sessions;
- processing events (Failures, Anomalies, Operations) with *CONTINUOUS* evaluation.

Note that only *CONTINUOUS* evaluation consumes DPH; *SAMPLED* evaluation is free of charge.

- *CONTINUOUS*: periodically, all the data points received since the last evaluation are considered in the computation, and multiple data points can be generated.
- *SAMPLED*: periodically, only the last data point is considered in the computation, and at most one data point is generated.
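The difference between the two evaluation modes can be sketched as follows (a minimal illustration; the function names, the doubling expression, and the integer timestamps are hypothetical, not part of the Servitly API):

```python
# Hypothetical buffer of (timestamp, value) data points received since the last evaluation.
points = [(1000, 1.0), (1030, 2.0), (1060, 3.0)]

def evaluate_continuous(points):
    """CONTINUOUS: every buffered data point is processed, so up to
    len(points) output data points can be generated."""
    return [(ts, value * 2) for ts, value in points]  # toy expression

def evaluate_sampled(points):
    """SAMPLED: only the most recent data point is processed, so at
    most one output data point is generated."""
    if not points:
        return []
    ts, value = points[-1]
    return [(ts, value * 2)]

print(len(evaluate_continuous(points)))  # 3 output data points
print(len(evaluate_sampled(points)))     # 1 output data point
```

This is why only *CONTINUOUS* evaluation consumes DPH: it may process and generate many data points per evaluation, while *SAMPLED* handles at most one.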

The frequency of data publishing is up to the connected product, whereas periodic computations run at a fixed evaluation interval.

Day by day, Servitly's billing engine calculates the average DPH consumed by data-point processing operations and adds the DPH consumed by periodic computations. The sections below describe in more detail how DPH are computed.

# Data Points Processing

Each time the IoT Connector receives a message, the values of the fields mapped to metrics are extracted, and for each value saved in the database, one Data Processing operation is consumed. The data-point count includes every type of metric except the default ones (Connection Status, Cloud Status).

The number of data points is calculated by summing:

- *Incoming Data Points*: the data points saved as a result of receiving an IoT message sent by the connected product;
- *Computed Data Points*: the data points saved as a result of computed metrics.

The number of DPH consumed by data points is obtained by dividing the number of data points saved in the month by the number of hours in the month.

```
// SCENARIO
One Thing, ONLINE from 08:00 to 18:00 every day,
sends 1 message with 2 values every 10 seconds while ONLINE.

// CALCULATIONS
Messages per hour: 3600 sec / 10 sec = 360
Data points saved daily: 360 messages x 2 values x 10 hours = 7200 data points
DPH to save 1 data point: 1 DPH
Daily average DPH: 7200 / 24 x 1 = 300 DPH
```
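The arithmetic above can be reproduced with a short script (a sketch of the billing math; the variable names are illustrative and not part of the Servitly API):

```python
SECONDS_PER_HOUR = 3600

message_interval_sec = 10   # one message every 10 seconds
values_per_message = 2      # metric values extracted per message
online_hours_per_day = 10   # ONLINE from 08:00 to 18:00

# Each saved data point costs 1 Data Processing operation.
messages_per_hour = SECONDS_PER_HOUR // message_interval_sec
daily_data_points = messages_per_hour * values_per_message * online_hours_per_day
daily_average_dph = daily_data_points / 24  # averaged over the whole day

print(messages_per_hour)   # 360
print(daily_data_points)   # 7200
print(daily_average_dph)   # 300.0
```

Note that the average is taken over all 24 hours, even though the Thing is ONLINE for only 10 of them.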

The DPH consumption is strictly related to the message publishing rate and the number of metrics.

Note that the rate of incoming data points is limited; when this limit is reached, the excess data points are discarded. For more details, refer to the Publishing Rate Limit article.

# Computed Metrics Processing

The Servitly data computation engine, for each computed metric with *CONTINUOUS* evaluation, periodically verifies the presence of new data points for the input metric-based variables.

The retrieved set of data points is then processed by the computation engine, which generates a data point for each distinct timestamp and stores it in the database as a new metric value.

```
// SCENARIO
A computed metric with 3 input metrics (expression: M1 x M2 x M3)
Evaluation interval = 120 sec

// CALCULATIONS
Standard DPH (60-second interval): 8 DPH + (3 inputs - 1) x 4 DPH = 16 DPH
Real DPH (120-second interval): 16 x (60 / 120) = 8 DPH
```
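The formula used in this example can be written as a small helper (a sketch only; the function name and default constants are taken from the worked example above, not from a documented Servitly API):

```python
def computed_metric_dph(input_metrics, interval_sec, base=8, per_extra_input=4):
    """DPH for a computed metric with CONTINUOUS evaluation:
    base + (inputs - 1) * per_extra_input at the 60-second reference
    interval, scaled by 60 / interval_sec."""
    standard = base + (input_metrics - 1) * per_extra_input
    return standard * 60 / interval_sec

print(computed_metric_dph(3, 120))  # 8.0
print(computed_metric_dph(3, 60))   # 16.0
```

Doubling the evaluation interval halves the DPH, while each additional metric-based input adds a fixed increment.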

The DPH consumption is strictly related to the evaluation interval and the number of metric-based input variables.

# Work Sessions Processing

The Servitly work-session computation engine, for each work-session definition, checks the start and stop conditions to determine which work sessions must be created, or stopped and historicized. Moreover, while a work session is active, the monitored metrics are periodically computed and updated on the work session itself.

```
// SCENARIO
A work-session definition with 4 monitored metrics
Evaluation interval = 120 sec

// CALCULATIONS
Standard DPH (60-second interval): 30 DPH + (4 monitored metrics - 1) x 8 DPH = 54 DPH
Real DPH (120-second interval): 54 x (60 / 120) = 27 DPH
```

For each monitored metric, the initial, minimum, maximum, and final values are calculated and stored in the work session.

The DPH consumption is strictly related to the evaluation interval and the number of monitored metrics to be computed.

# Events Processing

The Servitly event computation engine, for each event definition with *CONTINUOUS* evaluation, checks the Active and Clear condition variables to determine which events must be activated, or cleared and historicized.

```
// SCENARIO
A FAILURE event definition with 3 metrics used in the Active/Clear conditions
Evaluation interval = 120 sec

// CALCULATIONS
Standard DPH (60-second interval): 20 DPH + (3 condition metrics - 1) x 8 DPH = 36 DPH
Real DPH (120-second interval): 36 x (60 / 120) = 18 DPH
```
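As with computed metrics, the event formula follows the same base-plus-increment pattern and can be sketched as a helper (the function name and default constants come from the worked example above; they are illustrative, not a documented Servitly API):

```python
def event_dph(condition_metrics, interval_sec, base=20, per_extra_metric=8):
    """DPH for an event definition with CONTINUOUS evaluation:
    base + (metrics - 1) * per_extra_metric at the 60-second reference
    interval, scaled by 60 / interval_sec."""
    standard = base + (condition_metrics - 1) * per_extra_metric
    return standard * 60 / interval_sec

print(event_dph(3, 120))  # 18.0
print(event_dph(3, 60))   # 36.0
```

Lengthening the evaluation interval is the main lever for reducing event-processing DPH, since the number of condition metrics is usually fixed by the event's logic.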

The DPH consumption is strictly related to the evaluation interval and the number of metrics involved in the Active and Clear conditions.
