Book: Practical Common Lisp
Inverse Chi Square
The implementation of inverse-chi-square in this section is a fairly straightforward translation of a version written in Python by Robinson. The exact mathematical meaning of this function is beyond the scope of this book, but you can get an intuitive sense of what it does by thinking about how the values you pass to fisher will affect the result: the more low probabilities you pass to fisher, the smaller the product of the probabilities will be. The log of a small product is a negative number with a large absolute value, which is then multiplied by -2, making it an even larger positive number. Thus, the more low probabilities were passed to fisher, the larger the value it'll pass to inverse-chi-square. Of course, the number of probabilities involved also affects the value passed to inverse-chi-square: since probabilities are, by definition, less than or equal to 1, the more probabilities that go into a product, the smaller it'll be, and the larger the value passed to inverse-chi-square. Thus, inverse-chi-square should return a low probability when the Fisher combined value is abnormally large for the number of probabilities that went into it. The following function does exactly that:
(defun inverse-chi-square (value degrees-of-freedom)
  (assert (evenp degrees-of-freedom))
  (min
   (loop with m = (/ value 2)
         for i below (/ degrees-of-freedom 2)
         for prob = (exp (- m)) then (* prob (/ m i))
         summing prob)
   1.0))
Recall from Chapter 10 that EXP raises e to the argument given. Thus, the larger value is, the smaller the initial value of prob will be. But that initial value will then be adjusted upward slightly for each degree of freedom as long as m is greater than the number of degrees of freedom. Since the value returned by inverse-chi-square is supposed to be another probability, it's important to clamp the value returned with MIN, since rounding errors in the multiplication and exponentiation may cause the LOOP to return a sum just a shade over 1.
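Since the text notes that this function is a translation of a Python version by Robinson, it may help to see the same computation rendered back in Python. The following is a minimal sketch (not Robinson's original code), assuming a hypothetical fisher helper that performs the -2 × log(product) combination described above and feeds the result, with twice the number of probabilities as degrees of freedom, into the inverse chi-square:

```python
import math

def inverse_chi_square(value, degrees_of_freedom):
    """Probability that a chi-square statistic at least this large
    would arise by chance; mirrors the Lisp inverse-chi-square."""
    assert degrees_of_freedom % 2 == 0
    m = value / 2.0
    prob = math.exp(-m)                       # term for i = 0
    total = prob
    for i in range(1, degrees_of_freedom // 2):
        prob *= m / i                         # next term of the series
        total += prob
    return min(total, 1.0)                    # clamp rounding error above 1

def fisher(probs):
    """Hypothetical combiner: Fisher's method over a list of probabilities."""
    chi = -2 * math.log(math.prod(probs))
    return inverse_chi_square(chi, 2 * len(probs))
```

As the prose predicts, feeding fisher many low probabilities yields a combined value near 0, while many high probabilities yield a value near 1; the min clamp keeps floating-point drift from pushing the series sum past 1.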