There has been some discussion on LENR Forum of data resolution in the Fabiani spreadsheet. From Jed Rothwell:

*2) If you look at the T_out data from this file*

*http://coldfusioncommunity.net…01/0194.16_Exhibit_16.pdf*

*It appears that it wasn’t to the nearest 0.1 deg C. Here we are working with a discrete set of possible temperature values: 103.9, 104.5, 105.1.*

*P. 7 shows 4 digit precision.*

*So more accurate would be to say the temperature data was reported to the nearest 0.5 or 0.6 deg C.*

*I have never heard of an electronic thermometer that registers to the nearest 0.5 deg C. It is always some decimal value: 1, 0.1, 0.01 . . . This one clearly registers to 4 digits, although I doubt the last 3 are significant.*

It is clear that this was not an “electronic thermometer,” but a temperature sensor that generates a signal, often a voltage, that varies with temperature. As an example, the TI LM34 sensor generates 10 mV per degree F. This voltage may be sensed and recorded by a computer using an ADC, which has a certain finite resolution; the voltage reading is quantized by the ADC, and that quantization may be what we are seeing in the data.
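A minimal sketch of that quantization effect. The LM34 scale factor and the ADC parameters (5 V reference, 12 bits) are illustrative assumptions only; the actual hardware used here is unknown:

```python
# Sketch: how an ADC quantizes an analog temperature signal.
# The LM34 (10 mV per deg F) and the ADC parameters are illustrative
# assumptions, not the actual Penon/Fabiani hardware.

def lm34_voltage(temp_f):
    """LM34 output: 10 mV per degree Fahrenheit."""
    return 0.010 * temp_f

def adc_read(volts, v_ref=5.0, bits=12):
    """Quantize a voltage to the nearest ADC code, then convert back to volts."""
    levels = 2 ** bits
    code = round(volts / v_ref * (levels - 1))
    return code / (levels - 1) * v_ref

# Nearby temperatures collapse onto the same ADC code,
# so the recorded values form a discrete set:
for t in (104.50, 104.51, 104.52):
    print(t, "->", adc_read(lm34_voltage(t)))
```

Any temperature whose sensor voltage falls inside one ADC step is recorded identically, which is why the spreadsheet shows only a handful of distinct values.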

Looking at the data on page 7, we can see that the only Tout values are 105.0728, 104.5046, and 103.9364. The jump from the first to the second is 0.5682, and the jump from the second to the third is also 0.5682. That step corresponds to 1.02276 F, so the resolution is close to 1 degree F.
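The arithmetic can be checked directly from the three reported values:

```python
# Quantization step in the reported Tout values (deg C), from p. 7.
touts = [105.0728, 104.5046, 103.9364]

# Consecutive differences: both are the same step.
steps = [round(a - b, 4) for a, b in zip(touts, touts[1:])]
print(steps)               # [0.5682, 0.5682]

# The same step expressed in deg F: 0.5682 * 9/5 = 1.02276,
# close to a 1 deg F resolution.
print(round(steps[0] * 9 / 5, 5))
```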

I suspect an 8-bit ADC, with full scale being 256 F. Whatever it was, the resolution sucks. Maybe someone can find the magic approach that explains the exact decimals. (The device provides a voltage, which is digitized with the increment being one bit. The temperature is then calculated using an offset and a ratio, which creates the 4-place decimals.)
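A rough consistency check of the 8-bit guess, not an identification of the actual hardware: if one LSB equals the observed step, an 8-bit converter would span 2^8 such steps.

```python
# Consistency check of the 8-bit ADC guess. If one LSB = 0.5682 deg C
# (the observed step), what span would 256 codes cover? These figures
# only test the guess in the text; the real hardware is unknown.
step_c = 0.5682
step_f = step_c * 9 / 5            # 1.02276 deg F per step
span_c = step_c * 2 ** 8           # full-scale span in deg C
span_f = step_f * 2 ** 8           # full-scale span in deg F
print(round(step_f, 5), round(span_c, 4), round(span_f, 3))
```

A span of about 262 F is in the neighborhood of the guessed 256 F full scale, but does not match it exactly, so the "magic approach" explaining the exact decimals remains open.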

The Tin temperatures also show quantization. The increment is the same, 0.5682 C; the values are 63.4544, 64.0226, 64.5908, 65.1590, 65.7272, 66.2954, 66.8636, 67.4318, 68.0000, 68.5682, 69.1364.

That exact value of 68.0000 C pokes me in the eye… a coincidence, perhaps.

There is no sign of calculation roundoff error there; these numbers are likely exact multiples of 0.5682 C plus some offset. The recorded data may have been volts, recorded to a certain precision; for the spreadsheet, this was multiplied by a constant, so the quantized voltage shows up as quantized temperature. This was not recorded with high precision.
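The claim that the Tin values sit on a single quantization grid can be verified from the listed numbers:

```python
# Verify that the recorded Tin values (deg C) all lie on one grid:
# consecutive differences are a constant 0.5682, and every value is
# an integer number of steps above the first one.
tins = [63.4544, 64.0226, 64.5908, 65.1590, 65.7272,
        66.2954, 66.8636, 67.4318, 68.0000, 68.5682, 69.1364]

diffs = {round(b - a, 4) for a, b in zip(tins, tins[1:])}
print(diffs)               # a single step size

offsets = [round((t - tins[0]) / 0.5682, 6) for t in tins]
print(offsets)             # integer multiples of the step
```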

The pressure is also apparently quantized. Now, this is wild: the pressure is close to 1 bar. Absolute pressure, not gauge. The only values shown are 0.9810 and 1.0028, and the value oscillates between them, so the increment is 0.0218 bar. What gauge was this? Penon had said he was going to use the PX3098-100A5V, an Omega gauge. This is a 6.9 bar (100 psia) full-scale absolute pressure gauge. The specified accuracy is +/- 0.25% of full scale, so about +/- 0.02 bar. Add possible digitization error of one step, and the total error could be about 0.04 bar.
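Putting those pressure numbers together (the 100 psia full scale and +/- 0.25% FS accuracy are the datasheet figures cited above for the PX3098-100A5V):

```python
# Pressure quantization vs. the gauge spec cited in the text:
# Omega PX3098-100A5V, 100 psia full scale, +/- 0.25% FS accuracy.
p_values = [0.9810, 1.0028]                      # the only recorded values
step_bar = round(p_values[1] - p_values[0], 4)   # observed increment, 0.0218 bar

full_scale_bar = 100 * 0.0689476                 # 100 psia ~ 6.9 bar
accuracy_bar = 0.0025 * full_scale_bar           # +/- 0.25% FS, ~0.017 bar

print(step_bar, round(full_scale_bar, 2), round(accuracy_bar, 3))
# Worst case: spec accuracy plus one quantization step, ~0.04 bar
print(round(accuracy_bar + step_bar, 3))
```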

The digitization error was unnecessary at this level. Besides the fact that the selected pressure gauge was too insensitive if the pressure was going to stay close to 1 bar, the quantization indicates that a low-resolution ADC was used. Who chose the ADC hardware? Fabiani?