CPM to μSv/h

3 months 3 weeks ago - 3 months 2 weeks ago #6237 by bwalter
CPM to μSv/h was created by bwalter
It doesn't look like there is any indication of dose units displayed on the map or in graphs.
In my station description I put the conversion ratio from CPM to μSv/h for my SBM-19 tube, which is 650 CPM = 1 μSv/h (calibrated). For the SBM-20 it is roughly 174.43 CPM = 1 μSv/h.
The CPM averaging for normal background gives a good baseline, if that is all that is wanted.
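For anyone wanting to apply these ratios, here is a minimal sketch in Python. The ratios are the ones quoted above for the SBM-19 and SBM-20; the function name is my own, not part of any Radmon software:

```python
# Tube-specific calibration ratios quoted in this thread (CPM per 1 μSv/h).
TUBE_CPM_PER_USVH = {
    "SBM-19": 650.0,    # 650 CPM ≈ 1 μSv/h (calibrated, per the post)
    "SBM-20": 174.43,   # ~174.43 CPM ≈ 1 μSv/h
}

def cpm_to_usvh(cpm: float, tube: str) -> float:
    """Approximate dose rate in μSv/h from a raw CPM reading."""
    return cpm / TUBE_CPM_PER_USVH[tube]

# A typical SBM-19 background reading of 80 CPM:
print(round(cpm_to_usvh(80, "SBM-19"), 3))  # ≈ 0.123 μSv/h
```

Note this is only as good as the calibration ratio itself, for the reasons discussed further down the thread.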

bwalter
Last edit: 3 months 2 weeks ago by bwalter. Reason: line cr correction


3 months 2 weeks ago #6239 by DonZalmrol
Replied by DonZalmrol on topic CPM to μSv/h
The "dose units" value is in CPM, not μSv.

For normal background radiation with your SBM-19 you should get between 75 and 85 CPM in your uploads, depending on your location.

https://www.pocketmagic.net/tube-sbm-19-%D1%81%D0%B1%D0%BC-19/


3 months 2 weeks ago - 3 months 2 weeks ago #6240 by bwalter
Replied by bwalter on topic CPM to μSv/h
There is a great misunderstanding about counts per minute (events per minute) versus dose rate.
The CPM value does not tell you the amount or strength of the radiation.
Everyone who uses a Geiger counter should read the articles on the sites below.

What Is CPM in Radiation? | SOEKS USA (soeks-usa.com)

Counts per minute - Wikipedia
Last edit: 3 months 2 weeks ago by bwalter. Reason: CR for last line


2 months 3 weeks ago #6263 by Simomax
Replied by Simomax on topic CPM to μSv/h
bwalter is correct with regards to CPM and dose. CPM is simply an indication of the number of events the tube has detected within a set period. In the case of CPM the set period is one minute. So CPM would only tell us how many events have happened over that period, not the radiation dose received.

The dose depends on various factors, and it is those factors that require us to calibrate any device that displays a dose value. Different GM tubes/scintillators will give widely varying CPM values, but the effective dose will be the same. Therefore two calibrated devices with completely different tubes should show the same dose value, within a margin of error, while their CPM values will be very different. From what I have seen and learned, the margin of error is usually around ±10-20%: 10% in a good counter, up to 20% in a cheaper one. This margin of error is what allows a single device to read a dose value from many different sources.

Dose rate meters, as opposed to counters, are usually calibrated for the middle of their usable range, typically against a known source such as Cobalt-60. Many GM tube data sheets state a CPM-to-dose-rate value. The LND712 data sheet gives a Cobalt-60 gamma sensitivity of 18 CPS per mR/h (milliroentgen per hour), i.e. 1080 CPM per mR/h. The SBM-20 data sheet gives 22 CPS per mR/h, or 1320 CPM per mR/h. Both tubes show a different CPM value for exactly the same dose, as physics dictates they have different sensitivities.

Things start to get a little funky when you change the source to, say, Caesium-137 or Strontium-90. Different sources give different CPM values on different tubes, but the dose also varies with the type of source. If a dose rate meter were calibrated to Strontium-90 it would no longer be valid for measuring Cobalt-60 dose, even though the CPM when measuring Cobalt-60 would not have changed.
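The CPS-to-CPM arithmetic from the data sheet figures above is simple enough to sketch (the function name is mine; the sensitivities are the Co-60 figures quoted in this post):

```python
# Gamma sensitivity for Co-60, in counts/second per mR/h, from the data sheets.
CPS_PER_MRH = {"LND712": 18, "SBM-20": 22}

def cpm_per_mrh(tube: str) -> int:
    """Counts per minute corresponding to a 1 mR/h Co-60 field."""
    return CPS_PER_MRH[tube] * 60  # 60 seconds per minute

print(cpm_per_mrh("LND712"))  # 1080
print(cpm_per_mrh("SBM-20"))  # 1320
```

Same field, same dose, yet 1080 CPM on one tube and 1320 CPM on the other: that difference is exactly why CPM alone is not a dose.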

It is important not to confuse counters with dose rate meters. A counter is simply that: it counts events. A dose rate meter should be calibrated against a specific source for the effective dose rate.

In Radlog/Radmon we are able to set a conversion factor for our tubes. Most data sheets show the factor in mR/h, so it should be converted to μSv/h if that is the dose rate unit you are using. Take the following with a pinch of salt, as different information sources state different conversion factors; these are examples only. An LND712 tube has a conversion factor (to μSv/h) of 0.00812 for Co-60 and 0.0090 for Cs-137. Both samples deliver the same effective dose, but the CPM will be different. For an SBM-20 tube the factors are 0.00664 for Co-60, 0.00584 for Cs-137 and 0.00504 for Ra-226 (Radium-226). When using a dose rate meter for general use, we calibrate it to approximately the centre of the range of sources we expect to use it with: typically around 0.0057 for an SBM-20 and 0.0081 for an LND712. This puts the dose rate reading somewhere in the middle of the ±10% margin of error and gives results that are reasonably within range.
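Putting those example factors into a small table makes the point concrete: two tubes with very different CPM readings land on roughly the same dose. A sketch (factors are the examples quoted above, to be taken with the same pinch of salt):

```python
# Example conversion factors from this thread, in μSv/h per CPM.
# Different sources quote different values; these are illustrative only.
FACTORS_USVH_PER_CPM = {
    ("LND712", "Co-60"): 0.00812,
    ("LND712", "Cs-137"): 0.0090,
    ("SBM-20", "Co-60"): 0.00664,
    ("SBM-20", "Cs-137"): 0.00584,
    ("SBM-20", "Ra-226"): 0.00504,
}

def dose_rate_usvh(cpm: float, tube: str, source: str) -> float:
    """Approximate dose rate from a CPM reading, given tube and source."""
    return cpm * FACTORS_USVH_PER_CPM[(tube, source)]

# The same ~1 mR/h Co-60 field gives different CPM on each tube
# (1080 vs 1320, per the data sheets) but nearly the same dose:
print(round(dose_rate_usvh(1080, "LND712", "Co-60"), 2))  # ≈ 8.77 μSv/h
print(round(dose_rate_usvh(1320, "SBM-20", "Co-60"), 2))  # ≈ 8.76 μSv/h
```

The two dose figures agree to well within the ±10% margin of error, despite a 240 CPM difference in the raw counts.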

The way most of us use our counters and meters is as an indication only: something to let us know whether this or that source is active, and they work pretty well in that role. To get really accurate numbers we would need a lab environment with some quite expensive equipment, calibrated to the actual source being sampled, usually inside a heavily lead-shielded enclosure to reduce background counts to an absolute minimum. Now I mention it, you also have to take into account the background count of the device you are using. So if your background is 20 CPM and your source shows 100 CPM, the actual count from your source is 80 CPM; then do the maths to convert to μSv/h or mR/h or whatever.

Whilst I don't think this has ever been stated here explicitly, I think it is fair to say most users will agree that the reason we mainly use CPM with Radmon is that most stations here are for background monitoring only, as are my two active counters. We set them up, gather data over time, and that gives us a baseline for what we expect to see from our devices. Should the background count go up, we know from that baseline that more radiation is present than normal background. Of course we can't tell what radiation it is with our devices (unless we have a gamma spectrometer, which is getting into expensive lab equipment), only that something isn't the same as it was. It will also give an indication of whether the higher reading is acceptable or a SHTF situation. If a device ever starts to read higher, one would check other devices in as close proximity as possible to build a picture of what is actually happening in the environment. If you want to actually measure the exact dose from a sample, you would need a lab environment with the associated equipment.
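The baseline idea above can be sketched in code. This is a minimal example assuming a simple mean-plus-k-standard-deviations threshold; the threshold choice and the sample numbers are my own illustration, not a Radmon convention:

```python
from statistics import mean, stdev

def is_anomalous(reading: float, baseline: list[float], k: float = 3.0) -> bool:
    """Flag a reading that sits more than k standard deviations above
    the mean of the historical baseline readings."""
    return reading > mean(baseline) + k * stdev(baseline)

baseline = [78, 82, 80, 79, 81, 77, 83, 80]  # hypothetical background CPM log

print(is_anomalous(84, baseline))   # within normal spread
print(is_anomalous(120, baseline))  # well above baseline
```

A real station would of course build its baseline from weeks or months of data rather than eight samples.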

So in a nutshell: the dose we perceive from our devices is a rough calculation based on the middle ground of the radioactive sources the tube is designed to detect, whereas the CPM value varies greatly depending on the source itself and the tube used. An SBM-20 tube will show far fewer CPM than an SBM-19, even though the dose rate is exactly the same for the same source. A device with an SBM-20 calibrated for Co-60 will show a slightly incorrect dose rate for Cs-137, and the same device calibrated for Cs-137 will show a slightly incorrect dose rate for Co-60. That is why we have our margin of error, and why CPM in Radmon is generally used as an indication only, not as a measurement of actual dose rate.
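As a rough illustration of that margin, here is the arithmetic using the example SBM-20 factors quoted earlier in this thread (example numbers only, not a calibration procedure):

```python
# SBM-20 example conversion factors from this thread, μSv/h per CPM.
co60_factor = 0.00664
cs137_factor = 0.00584

# If the device is calibrated for Co-60 but the source is actually Cs-137,
# the displayed dose over-reads by roughly this percentage:
error_pct = (co60_factor - cs137_factor) / cs137_factor * 100
print(round(error_pct, 1))  # ≈ 13.7 % over-read
```

That ~13.7% sits just outside the ±10% a good counter would aim for, which is why a middle-of-the-range factor is the usual compromise for general use.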

