I’m trying to learn for my Statistics class and I’m stuck. Can you help?
1. Two MV imaging panels, each consisting of a 100×100 array of 1×1 mm² detectors, were used to image the beam of a Co-60 unit at the same location. Each imaging panel was exposed to the source for 1 second, and the averaged intensity was the same (1000, arbitrary unit) for both panels. However, the standard deviation was 50 for the first panel and 35 for the second panel. Assuming all detectors are identical, can you explain what might cause such a discrepancy? (10 points)
2. The measurements in Question 1 were repeated, but this time a 15×15 cm² slab of solid water with a thickness of 0.5 cm was placed on top of the detector panel. The averaged intensity was again the same (1600, arbitrary unit) for both panels, and the standard deviation was 59 for the first panel and 41 for the second panel. Is the signal-to-noise ratio improved or deteriorated by this additional layer of solid water? What are the reasons for such improvement or deterioration? (10 points)
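In case it helps to see the numbers side by side, here is a quick sketch of the ratios involved, assuming the conventional definition SNR = mean / standard deviation (the function name `snr` and the variable names are mine, not from the assignment):

```python
# Compare signal-to-noise ratios before and after the solid-water slab,
# using SNR = mean / standard deviation (assumed conventional definition).
def snr(mean, std):
    return mean / std

# Question 1: open field, no buildup
snr1_panel1 = snr(1000, 50)   # panel 1, no slab
snr1_panel2 = snr(1000, 35)   # panel 2, no slab

# Question 2: with 0.5 cm solid-water buildup
snr2_panel1 = snr(1600, 59)   # panel 1, with slab
snr2_panel2 = snr(1600, 41)   # panel 2, with slab

print(snr1_panel1, snr1_panel2)  # 20.0 and ~28.6
print(snr2_panel1, snr2_panel2)  # ~27.1 and ~39.0
```

Under this definition, both panels show a higher SNR with the slab in place, which is the comparison Question 2 is asking you to explain.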