Proposal: Define warning and alert levels automatically.

6 years 7 months ago #3536 by heliosh
Hi,

Since radmon.org is logging all values, it would be straightforward to set warning and alert levels automatically, based on the recorded range of values for a given station.
During a calibration phase, the average value and standard deviation could be calculated.
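A minimal sketch of that calibration step (just an illustration; it assumes the logged per-minute readings for a station are available as a list, and all names here are hypothetical):

import statistics

def calibrate(readings, warn_sigma=4.0, alert_sigma=5.0):
    # Baseline estimate from the calibration window.
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    # k-sigma thresholds above the station's baseline.
    warning_level = mean + warn_sigma * stdev
    alert_level = mean + alert_sigma * stdev
    return warning_level, alert_level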

The warning level could then be set, for example, at 4 sigma (0.00317% of all values, equivalent to one false warning every 22 days when reporting once per minute).
The alert level could be, for example, 4.5 sigma (0.00034% of all values, statistically exceeded about once every 204 days), or maybe even 5 sigma, which would equal one false alert every 6.6 years.
With 132 active stations at the 5-sigma level, the entire network would statistically produce one false alert roughly every 18 days.
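For reference, those intervals follow from the one-sided tail probability of the normal distribution. A quick check of the numbers (assuming one reading per minute and that scipy is available):

from scipy.stats import norm

MINUTES_PER_DAY = 60 * 24

for k in (4.0, 4.5, 5.0):
    p = norm.sf(k)  # one-sided tail probability P(value > mean + k*stdev)
    days = 1 / (p * MINUTES_PER_DAY)
    print(f"{k}-sigma: {p:.5%} of readings, one false trigger every {days:.0f} days")

# Prints roughly: 4-sigma -> 22 days, 4.5-sigma -> 204 days,
# 5-sigma -> 2423 days (about 6.6 years per station).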

I think that would make alerts more meaningful, and the levels easier to compare and monitor across stations.
What do you think of this idea?
