Where does counts/s normalization make sense?
Following the discussion in #1524 regarding the ct display, the question was raised in the chat (@matias.guijarro, @valentin.valls, @claustre, @sebastien.petitdemange): for which types of counters does it make sense to have a normalized value (counts/s) in addition to their real value?
I propose (other opinions welcome) that a normalization per second (counts/s in spec) makes sense for all channels whose published value is proportional to the count time. (@denolf, @pguillou, @alejandro.homs, @sole, feel invited to comment on this statement)
As far as I can see, this concerns:
- SamplingCounters in INTEGRATE mode
- certain Integrating counters
- ... what else?
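The proposed rule can be written down as a one-liner (a sketch only; these names are not bliss API): the counts/s figure shown by ct is just the raw value divided by the count time, and it is only meaningful for channels whose raw value scales linearly with that time.

```python
def counts_per_second(value, count_time):
    """Normalized figure shown next to the raw value in the ct output.

    Only meaningful for channels whose value is proportional to the
    count time (e.g. a SamplingCounter in INTEGRATE mode).
    """
    return value / count_time

# For such a channel the normalized figure is, up to noise,
# independent of the count time:
print(counts_per_second(99.98, 1.0))   # ~100 counts/s for a 1 s count
print(counts_per_second(10.01, 0.1))   # ~100 counts/s for a 0.1 s count
```

For a channel that is not proportional to the count time (a position, a temperature, ...), the same division produces a number with no physical meaning, which is exactly the problem raised below for the Lima BPM.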
SamplingCounter INTEGRATE
I had a look at the INTEGRATE mode of the bliss SamplingCounter again.
For this issue I will use the following device simulating a diode that monitors an almost constant beam current:
from bliss.common.counter import SoftCounter
import numpy

class ConstantCurrentSimDiode:
    def value(self, mean_value=100, rel_err=0.1):
        err = (numpy.random.rand() - 0.5) * mean_value * rel_err * 2
        return mean_value + err

ccsd = ConstantCurrentSimDiode()
mydiode = SoftCounter(ccsd)
mydiode.mode = 'INTEGRATE'
Without looking at the implementation of the sampling counter modes, and pretending to be a scientist who is new to bliss, I do the following:
TEST_SESSION [35]: ct(1,mydiode)
Tue Mar 17 15:21:16 2020
value = 99.98172861184565 ( 99.98172861184565/s)
Out [35]: Scan(number=12, name=ct, path=)
TEST_SESSION [36]: mydiode.statistics
Out [36]: SamplingCounterStatistics(mean=99.98172861184491, N=8792, std=5.6901471905312695, var=32.377775049910895, min=90.00366901077004, max=109.99806421627078, p2v=19.994395205500737, count_time=1, timestamp='2020-03-17 15:21:16.654606')
Now I repeat the same with a 0.1 s count time:
TEST_SESSION [37]: ct(.1,mydiode)
Tue Mar 17 15:21:58 2020
value = 10.013567217458515 ( 100.13567217458515/s)
Out [37]: Scan(number=13, name=ct, path=)
TEST_SESSION [38]: mydiode.statistics
Out [38]: SamplingCounterStatistics(mean=100.13567217458511, N=998, std=5.604182283749987, var=31.406859069497216, min=90.01304306836516, max=109.99351925504344, p2v=19.980476186678274, count_time=0.1, timestamp='2020-03-17 15:21:58.806351')
So in INTEGRATE mode the reading is proportional to the counting time, while the reading in MEAN mode is independent of the counting time. From my point of view this is logical. (sorry @matias.guijarro, I think in the chat I was wrong for a moment)
For completeness, here is the same example in MEAN mode, where the returned value is roughly 100, independent of the counting time.
TEST_SESSION [42]: mydiode.mode='MEAN'
TEST_SESSION [43]: ct(.1,mydiode)
Tue Mar 17 15:26:21 2020
value = 100.14168033516445 ( 1001.4168033516445/s)
Out [43]: Scan(number=15, name=ct, path=)
TEST_SESSION [44]: mydiode.statistics
Out [44]: SamplingCounterStatistics(mean=100.14168033516442, N=996, std=5.848186452365233, var=34.20128478162825, min=90.04117356925855, max=109.98889962991959, p2v=19.94772606066104, count_time=0.1, timestamp='2020-03-17 15:26:21.531121')
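The behaviour observed above (the INTEGRATE value is, up to noise, the mean of the samples times the count time: compare value = 10.01 at 0.1 s with mean = 100.1) can be reproduced with a minimal standalone model. This is only my reading of the numbers, not the actual bliss implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(count_time, sample_rate=10000, mean_value=100, rel_err=0.1):
    """Toy model of the two sampling modes, inferred from the transcripts
    above: sample the diode at a fixed rate during count_time, then
    aggregate. The sample_rate is arbitrary; bliss samples as fast as
    the device allows (hence the differing N in the statistics)."""
    n = int(count_time * sample_rate)
    samples = mean_value + (rng.random(n) - 0.5) * mean_value * rel_err * 2
    return {"MEAN": samples.mean(),
            "INTEGRATE": samples.mean() * count_time}

for t in (1.0, 0.1):
    r = simulate(t)
    print(t, round(r["MEAN"], 1), round(r["INTEGRATE"], 1))
# MEAN stays ~100 for both count times,
# while INTEGRATE scales with t (~100 at 1 s, ~10 at 0.1 s)
```

With this model, dividing the INTEGRATE value by the count time recovers ~100 counts/s in both cases, whereas dividing the MEAN value by 0.1 s would produce the meaningless ~1001/s figure seen in the transcript.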
Integrating Counters
For me it is not clear which integrating counters fulfill the condition that their published value is proportional to the counting time. Taking the first example that came to mind (Lima BPM), I cannot give a clear answer.
On the one hand we have
TEST_SESSION [11]: isinstance(lima_simulator.bpm.x,IntegratingCounter)
Out [11]: True
where x is a coordinate in pixels or microns, so a normalization per second does not make sense, while the intensity counter on the same device is exactly one where a normalized value would be wanted.
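Since the decision cannot be made at the class level (the Lima BPM shows that two IntegratingCounters on the same device can differ), one conceivable solution is a per-counter flag set by the controller author, which the ct display consults. This is purely a design sketch, not existing bliss API:

```python
# Hypothetical sketch: bliss counters have no such flag today.
class CounterSketch:
    def __init__(self, name, proportional_to_count_time):
        self.name = name
        # Set by whoever implements the controller, e.g. True for
        # bpm.intensity, False for bpm.x / bpm.y which are positions.
        self.proportional_to_count_time = proportional_to_count_time

def ct_line(counter, value, count_time):
    """Build the ct display line, adding a /s figure only where
    the normalization is physically meaningful."""
    line = f"{counter.name} = {value}"
    if counter.proportional_to_count_time:
        line += f" ({value / count_time}/s)"
    return line

print(ct_line(CounterSketch("intensity", True), 10.0, 0.1))  # has a /s figure
print(ct_line(CounterSketch("x", False), 512.3, 0.1))        # raw value only
```

The advantage of an explicit flag over type-based rules (e.g. "all SamplingCounters in INTEGRATE mode") is that it handles mixed devices like the BPM without special-casing.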