

Proposal: Define warning and alert-rates automatically. 5 months 3 weeks ago #3536

  • heliosh
  • New Member
  • Posts: 5
  • Thank you received: 1
  • Karma: 0

Since radmon.org is logging all values anyway, it would be straightforward to set warning and alert levels automatically, based on the recorded range of values from a given station.
During a calibration phase, the average value and the standard deviation of that station's readings could be calculated.

The warning level could then be set, for example, at 4 sigma (a tail probability of 0.00317% of all values, equivalent to one false warning every 22 days when reporting once per minute).
The alert level could be set at, say, 4.5 sigma (0.000345% of all values, or statistically one false alert every 201 days), or perhaps even 5 sigma, which would correspond to one false alert every 6.3 years per station.
With 132 active stations, the entire network would then statistically produce one false alert every 17 days.
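To make the arithmetic concrete, here is a minimal sketch in Python of the calibration idea. The function names and the calibration-sample interface are hypothetical (nothing here is part of radmon.org); it assumes the readings are approximately normally distributed, estimates the mean and standard deviation from a calibration sample, derives the two levels, and computes the expected interval between false alarms for a given sigma threshold:

```python
import math
import statistics

def derive_levels(calibration_samples, warn_sigma=4.0, alert_sigma=4.5):
    """Derive warning/alert levels from a station's calibration readings.

    Returns (warning_level, alert_level) as mean + n*sigma.
    """
    mean = statistics.fmean(calibration_samples)
    sigma = statistics.stdev(calibration_samples)  # sample standard deviation
    return mean + warn_sigma * sigma, mean + alert_sigma * sigma

def false_alarm_interval_days(n_sigma, reports_per_day=1440):
    """Expected days between false alarms for a one-sided n-sigma threshold,
    assuming independent, normally distributed readings (one per minute
    by default)."""
    # One-sided Gaussian tail probability: Q(n) = 0.5 * erfc(n / sqrt(2))
    tail_prob = 0.5 * math.erfc(n_sigma / math.sqrt(2))
    return 1.0 / (tail_prob * reports_per_day)

# Example: a 4-sigma warning level triggers falsely about every 22 days,
# matching the figure quoted above.
print(round(false_alarm_interval_days(4.0)))  # → 22
```

Note that radiation counts are Poisson-distributed, so for stations with low count rates the normal approximation (and hence these tail probabilities) is only rough; a longer calibration phase or a Poisson tail bound would be more accurate there.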

I think this would make alerts more meaningful and make the levels easier to compare between stations and to monitor.
What do you think of this idea?