
Daily Dose: Radiation Trouble

Update: See the update section at the end of the article for the latest.

I recently described in general terms a new method I’m developing that uses Yttrium to estimate carbon export from the ocean’s photic zone, so today I saw fit to rant about an issue I’m currently facing. If you have read that previous article, or are familiar with the Thorium disequilibrium method, it should be no surprise that the “signal” we are attempting to measure from the radionuclide is the decay of Y-90 into Zirconium-90. This decay, which emits a beta particle (a high-energy electron), is measured with a device called a beta counter.

Since in both the Thorium and Yttrium disequilibrium methods the parent isotopes that produce the Th-234 and Y-90 are stripped from the sample, the number of Th-234 or Y-90 atoms in the sample decreases over time according to its half-life. Therefore, the signal that we measure with the beta counter drops off over time. To give an example, Y-90 has a half-life of roughly 3 days. So if we start off with a signal of 1 dpm (decays per minute, a measure of radioactivity), then after 3 days we would see a signal of 0.5 dpm. In another 3 days, the signal would drop by another half to just 0.25 dpm, and so on.
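
For reference, the numbers in that example are just the standard radioactive decay law applied with a 3 day half-life:

$$A(t) = A_0 \cdot 2^{-t/t_{1/2}}, \qquad A(3\ \text{days}) = 1\ \text{dpm} \cdot 2^{-3/3} = 0.5\ \text{dpm}.$$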

With this paradigm in mind, every signal will be attenuated over time at a rate set by its half-life. For the first few days my Y-90 sample was showing just this sort of behaviour: at first the signal was 0.71 dpm, then a couple of days later it was down to 0.52 dpm. Since my method doesn’t isolate Y-90 from other beta-emitting isotopes, such as Th-234, we would expect the total signal to drop over one Y-90 half-life, but by less than half, because those longer-lived radionuclides keep contributing a roughly constant background.

So my sample was behaving ideally, indicating a Y-90 signal of around $2\cdot (0.71-0.52)=0.38\ \text{dpm}$, but then shit hit the fan. Yesterday I placed the sample back in the beta counter and I’m now getting a reading of 1.30 dpm! I’ve since checked the calibration, and the machine is definitely working properly.
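
To spell out where that back-of-the-envelope number comes from: if the total signal is the Y-90 activity $A_Y$ plus a background $B$ from the longer-lived beta emitters, and we assume roughly one Y-90 half-life elapsed between the two countings while $B$ stayed essentially constant (reasonable given Th-234’s 24 day half-life), then

$$0.71 \approx A_Y + B, \qquad 0.52 \approx \tfrac{1}{2}A_Y + B \quad\Rightarrow\quad A_Y \approx 2\,(0.71-0.52) = 0.38\ \text{dpm}.$$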

The radiation signal can’t increase under our current paradigm, so such a result tells us that there must be another, unforeseen process at play for which we now have to account. Since the signal is increasing, it suggests that there might be another isotope which was originally depleted within the sample, yet whose parent was not. That parent has since been decaying into the new isotope (a decay not itself captured by the beta counter), and what we are now seeing is the daughter’s activity growing in before it eventually decays away. If this is true, then the parent’s half-life is longer than both its daughter’s and Y-90’s, and the daughter’s half-life is likely quite short (a high signal even with very little material).
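
For the curious, the shape of that hypothesis is captured by the standard two-member decay-chain (Bateman) result, which is textbook physics rather than anything specific to my sample: for a parent P with decay constant $\lambda_P$ feeding a daughter D (decay constant $\lambda_D$) that is initially absent,

$$A_D(t) = A_P(0)\,\frac{\lambda_D}{\lambda_D - \lambda_P}\left(e^{-\lambda_P t} - e^{-\lambda_D t}\right), \qquad \lambda = \frac{\ln 2}{t_{1/2}}.$$

The daughter’s activity starts at zero, grows in, and only turns over and decays away on the parent’s timescale when the parent is much longer lived, which is exactly the kind of curve that could make a beta signal rise between two countings.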

This is just one possibility, but it seems the most plausible given the parallel methodology employed in my new method and in the established Thorium method. I will be quite interested to see how the signal changes in the next couple of days, and it goes to show how science can, sometimes, be an exciting adventure.

Update

After running the sample again, we’re getting values in line with what we were expecting to see (0.43 dpm, not 1.30 dpm as before). Since the abnormally high counts were gone, I decided to go back and parse the raw count data myself, and the high counts were nowhere to be found there either. Apparently the software controlling the beta counter, which collates the counts and calculates the statistics, wasn’t working properly and was yielding false values. While not a particularly exciting answer to the mystery, it is a welcome solution. Call me old fashioned, but I like my methods to work the way I designed them to.
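For what it’s worth, the sanity check itself was nothing fancy. Below is a minimal sketch in Python of the kind of calculation I mean; the interval data, background rate, and detection efficiency are placeholder values for illustration, not the actual instrument output or calibration.

```python
# Minimal sketch: recompute the count rate from raw counting intervals,
# bypassing the counter software's collation and statistics.
# All numbers below are placeholders, not real instrument output.

intervals = [(212, 500), (205, 500), (198, 500)]  # (gross counts, counting time in minutes)

background_cpm = 0.20   # assumed detector background, counts per minute
efficiency = 0.50       # assumed beta detection efficiency, counts per decay

total_counts = sum(counts for counts, _ in intervals)
total_minutes = sum(minutes for _, minutes in intervals)

gross_cpm = total_counts / total_minutes   # gross count rate
net_cpm = gross_cpm - background_cpm       # subtract detector background
dpm = net_cpm / efficiency                 # convert count rate to decay rate

print(f"gross: {gross_cpm:.3f} cpm, net: {net_cpm:.3f} cpm, activity: {dpm:.3f} dpm")
```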
