Welcome to the Linux Foundation Forum!

The ongoing debate about increase vs rate: can you help?

I have been seeing lots of discussion online about the usage of increase vs rate in Prometheus alerts, and to this day I am still not clear on it. For example:

Official documentation states:
"rate(v range-vector) : calculates the per-second average rate of increase of the time series in the range vector."

In simple terms, this means we get a per-second value, and that value is the average increment per second over the given range, if I got it right.

Conversely, regarding increase:

"increase(v range-vector) : calculates the increase in the time series in the range vector."

To me this means it won't average the increment over the seconds, but will instead show the single total increment for the given range (with extrapolation). Is that correct?
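To check my own understanding, I put together a small sketch. This is not Prometheus' actual implementation (the real functions also extrapolate to the edges of the window and handle counter resets); the sample values and scrape interval are made up. It just shows the core relationship I think holds: increase is the counter delta over the window, and rate is that same delta divided by the elapsed seconds.

```python
def simple_increase(samples):
    """samples: list of (timestamp_seconds, counter_value), oldest first.

    Naive version of increase(): last value minus first value.
    (Real Prometheus also extrapolates and handles counter resets.)
    """
    return samples[-1][1] - samples[0][1]

def simple_rate(samples):
    """Naive version of rate(): the same delta, divided by elapsed seconds."""
    elapsed = samples[-1][0] - samples[0][0]
    return simple_increase(samples) / elapsed

# A hypothetical counter scraped every 60s over a 5-minute window:
samples = [(0, 100), (60, 110), (120, 130), (180, 130), (240, 160), (300, 190)]

print(simple_increase(samples))  # 90   -> the counter grew by 90 in 5 minutes
print(simple_rate(samples))      # 0.3  -> 90 / 300 seconds
```

So if I have this right, the two functions report the same underlying delta, just in different units: a total per window vs an average per second.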

Can you help in clarifying in simple terms what is the real difference? Here is an example alert rule:

sum(increase(http_server_duration_count{service_name="aservice", status=~"5[0-9]{2}", path="/a/path"}[5m])) by (path) / sum(increase(http_server_duration_count{service_name="aservice", path="/a/path"}[5m])) by (path) > 0.05

Could this simply use rate instead?
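My own back-of-the-envelope reasoning says the window length should cancel in a ratio like the alert above, so the increase and rate forms ought to give the same result, but I may be missing something. The numbers below are made up; the point is only the arithmetic:

```python
# rate() is just increase() divided by the range in seconds, so dividing
# one rate by another should give the same result as dividing one
# increase by another -- the window length cancels.
window = 300  # 5m in seconds

errors_increase = 12.0   # hypothetical increase(...{status=~"5[0-9]{2}"}[5m])
total_increase = 400.0   # hypothetical increase(...[5m]) for all statuses

errors_rate = errors_increase / window
total_rate = total_increase / window

print(errors_increase / total_increase)  # 0.03
print(errors_rate / total_rate)          # same ratio (up to float rounding)
```

If that reasoning holds, the alert threshold of 0.05 would fire (or not) identically with either function, since both sides of the division use the same range.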

Thank you
