Poisson Statistics
Background
The Poisson distribution characterizes random events that occur at a well-defined average rate. It is widely used in atomic and sub-atomic physics, as well as in many other statistical applications, most commonly to compute event probabilities. Three assumptions must hold: i) the rate at which random events occur does not change over the duration of the measurement; ii) the occurrence of one event does not change the likelihood of another event; iii) events occur at a slow enough rate that they can be individually distinguished.
In this experiment, Poisson statistics are used to analyze random radioactive decay events that occur within a defined time interval. The number of decays observed in the interval is a non-negative integer k, possibly zero (k = 0). The average number of events expected in the interval is λ, known as the event rate. Given λ, the probability of observing k events in the time interval is:
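P(k; λ) = λ^k e^(−λ) / k!

Both the mean and the variance of this distribution are equal to λ.

As a rough illustration of the analysis, the sketch below compares the observed fraction of intervals containing k counts with the Poisson prediction. The simulated counts, the rate of 4.0, and the 1000 intervals are placeholder assumptions standing in for real spectrometer data, not values from the experiment.

```python
import numpy as np
from math import exp, factorial

# Placeholder data: in the real experiment these would be the number of
# decays recorded in each fixed-length counting interval.
counts = np.random.poisson(lam=4.0, size=1000)

lam = counts.mean()  # estimate the event rate λ from the data

def poisson_pmf(k, lam):
    """Probability of observing k events when the mean rate is lam."""
    return lam**k * exp(-lam) / factorial(k)

# Compare the observed fraction of intervals with k counts to the prediction.
for k in range(int(counts.max()) + 1):
    observed = np.mean(counts == k)
    predicted = poisson_pmf(k, lam)
    print(f"k={k:2d}  observed={observed:.3f}  predicted={predicted:.3f}")
```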
Background Reading
History
Jan 2020 - Experiment set up and verified by Martin Hoeferkamp
Feb 2020 - Software migrated to Windows 10
Feb 2020 - Weak signal observed by group performing lab
Feb 2020 - Major failure of high-voltage source on UCS30 spectrometer, serial number 505; should be sent for repair (see Equipment)
Feb 25, 2020 - Experiment performed with alternate UCS30 with no issues
March 2021 - Experiment migrated to new computer with UNM Colleges login, no issues found
Notes
This experiment can be done by one student in one session. Taking sets of background data without a source is the most time-consuming part and requires around an hour.