23 The Law of Multiplication
Probability trees are one of the most useful
tools in probability and applied mathematics. We
will use them throughout the rest of this text. The
graph in Figure 1 is a probability tree.
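Since Figure 1 is not reproduced here, the multiplication along the branches of a probability tree can be sketched with a small, made-up tree (the rain/late events below are our own illustration, not the text's):

```python
# The probability of a path through a probability tree is the product
# of the branch probabilities along it (the law of multiplication).
p_rain = 0.3                 # first branch: P(rain)
p_late_given_rain = 0.5      # second branch: P(late | rain)
p_late_given_dry = 0.1       # second branch: P(late | no rain)

# Multiply along a root-to-leaf path:
p_rain_and_late = p_rain * p_late_given_rain
p_dry_and_late = (1 - p_rain) * p_late_given_dry

# The four leaf probabilities of the tree sum to 1:
total = (p_rain * p_late_given_rain
         + p_rain * (1 - p_late_given_rain)
         + (1 - p_rain) * p_late_given_dry
         + (1 - p_rain) * (1 - p_late_given_dry))
print(round(p_rain_and_late, 2), round(p_dry_and_late, 2), round(total, 2))
```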
24 The Bonferroni Inequality
By far its most useful application is to joint confidence intervals. The inequality gives you a confidence region without assuming independence of the various parameters. At around 95% confidence, it usually turns out that the confidence region isn't much larger than it would be under the assumption of independence.
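A quick sketch of that comparison, with an illustrative k of our own choosing: under Bonferroni, each of k intervals is built at level 1 − α/k, which guarantees joint confidence at least 1 − α with no independence assumption; under independence, level (1 − α)^(1/k) per interval gives joint confidence exactly 1 − α.

```python
k = 5        # number of parameters (illustrative)
alpha = 0.05 # joint miss probability: 95% joint confidence

# Bonferroni: P(all k intervals cover) >= 1 - k*(alpha/k) = 1 - alpha,
# regardless of any dependence between the intervals.
bonferroni_level = 1 - alpha / k

# Independence: each interval at (1 - alpha)**(1/k) gives joint level
# exactly 1 - alpha.
independence_level = (1 - alpha) ** (1 / k)

# The two per-interval levels are close, so the Bonferroni intervals are
# only slightly wider.
print(round(bonferroni_level, 5), round(independence_level, 5))
```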
25 Sigma Notation
The sign Σ, called sigma, is a glorified symbol for addition that can be quite useful. It is utilized throughout mathematics, statistics, computer science and all other mathematical disciplines. With the Σ there is usually an index, typically an i or j.
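Sigma notation maps directly onto code: the index becomes a loop variable. A minimal sketch, using the sum of the first n squares as the example:

```python
# Sigma_{i=1}^{n} i**2, written as a sum over the index i.
n = 10
total = sum(i ** 2 for i in range(1, n + 1))

# The closed form n(n+1)(2n+1)/6 agrees with the term-by-term sum.
closed = n * (n + 1) * (2 * n + 1) // 6
print(total, closed)  # 385 385
```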
27 Binary Arithmetic
In the previous section we looked at the binomial distribution. The binomial distribution
is essentially the mathematics of repeatedly flipping a coin (and there is no requirement that the
coin be unbiased). These coin flips are known as Bernoulli trials. Anytime you repeat an
experiment with two possible outcomes and with your experiments independent of each other,
you are performing Bernoulli trials. Frequently we desire to simulate such experiments on the
computer.1
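One way such a simulation might look (the function name here is our own, and the success probability 0.3 is illustrative):

```python
import random

def bernoulli_trials(n, p, rng=random):
    """Return a list of n independent 0/1 outcomes, each 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

# Simulate 10,000 flips of a biased coin with P(heads) = 0.3; the observed
# proportion of heads should land close to 0.3.
random.seed(0)
flips = bernoulli_trials(10_000, 0.3)
print(sum(flips) / len(flips))
```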
28 Bayes' Rule
Bayes' rule is a probability benchmark: once you understand it, you have moved on to a new level. Suppose that we have partitioned events into mutually exclusive and exhaustive cases E1, E2, ..., En. That is, exactly one of E1, E2, ..., En will occur. Ei might describe the weather (such as the event that the temperature will be between 10 and 30 degrees). Suppose also that for any Ei we know the conditional probability of X (which might be the event that the Raiders win). That is, we know P(X | Ei) for each i.
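The computation can be sketched in a few lines. The priors and likelihoods below are made-up numbers standing in for the weather/Raiders setup:

```python
# Priors P(Ei) over a partition E1, E2, E3, and likelihoods P(X | Ei).
priors = [0.2, 0.5, 0.3]        # must sum to 1: the Ei are exhaustive
likelihoods = [0.9, 0.6, 0.1]   # P(X | Ei) for each case

# Law of total probability: P(X) = sum over i of P(X | Ei) * P(Ei).
p_x = sum(l * p for l, p in zip(likelihoods, priors))

# Bayes' rule: P(Ei | X) = P(X | Ei) * P(Ei) / P(X).
posteriors = [l * p / p_x for l, p in zip(likelihoods, priors)]
print(round(p_x, 3), [round(q, 3) for q in posteriors])
```

The posteriors always sum to 1, since exactly one of the Ei occurs.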
29 Markov Chains
Markov chains are one of the most fun tools of probability; they give a lot of power for
very little effort. We will restrict ourselves to finite Markov chains. This is not much of a restriction: we keep most of the interesting cases and avoid much of the theory.1 This
chapter defines Markov chains and gives you one of the most basic tools for dealing with them:
matrices.
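As a sketch of the matrix tool: store the chain as a transition matrix whose row i gives the probabilities of moving from state i to each state, and get n-step behaviour by repeated matrix multiplication. The 2-state chain below is our own example:

```python
# A 2-state Markov chain: each row of P sums to 1.
P = [[0.9, 0.1],
     [0.4, 0.6]]

def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Entry (i, j) of P*P is the probability of going from i to j in 2 steps.
P2 = mat_mul(P, P)
print([round(x, 2) for x in P2[0]])  # 2-step probabilities from state 0
```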
30 Classification of States
In a Markov chain, each state can be placed in one of the three classifications.1 Since
each state falls into one and only one category, these categories partition the states. The secret
of categorizing the states is to find the communicating classes. The states of a Markov chain can
be partitioned into these communicating classes. Two states communicate if and only if it is
possible to go from each to the other. That is, states A and B communicate if and only if it is
possible to go from A to B and from B to A.
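That definition can be turned directly into code: i and j communicate exactly when each is reachable from the other, so the communicating classes are the strongly connected components of the transition graph. A minimal sketch on a made-up 3-state chain:

```python
def reachable(adj, start):
    """Set of states reachable from start (including start itself)."""
    seen, stack = {start}, [start]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def communicating_classes(adj):
    """Partition the states into communicating classes."""
    reach = {s: reachable(adj, s) for s in adj}
    classes = []
    for s in adj:
        # s communicates with t when each can reach the other.
        cls = frozenset(t for t in reach[s] if s in reach[t])
        if cls not in classes:
            classes.append(cls)
    return classes

# States 0 and 1 can each reach the other; state 2 only loops on itself.
adj = {0: [1], 1: [0, 2], 2: [2]}
print(communicating_classes(adj))
```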
31 Geometric Series
Geometric series are a basic artifact of algebra that everyone should know.1 I am
teaching them here because they come up remarkably often with Markov chains. The finite
geometric series formula is at the heart of many of the fundamental formulas of financial
mathematics. All students of the mathematical sciences should be intimately familiar with this
topic and have all the formulas memorized.
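A quick numerical check of the finite geometric series formula, with illustrative values of a, r and n:

```python
# a + a*r + ... + a*r**(n-1) = a * (1 - r**n) / (1 - r), for r != 1.
a, r, n = 100.0, 0.5, 10

direct = sum(a * r ** k for k in range(n))   # add the n terms one by one
formula = a * (1 - r ** n) / (1 - r)         # closed form
print(round(direct, 7), round(formula, 7))
```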
32 Averages
You should remember the arithmetic average. Given n data points, their arithmetic
average is their sum divided by n. Now suppose that we have the average of n numbers, A_n. We are given a new data point x and we would like to compute the new average of all n + 1 numbers, A_{n+1}. Many people simply add up all n + 1 numbers and then divide by n + 1. However, A_{n+1} can be computed directly from A_n without re-summing: A_{n+1} = (n·A_n + x) / (n + 1), or equivalently A_{n+1} = A_n + (x − A_n) / (n + 1).
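A sketch of the running-average update A_{n+1} = A_n + (x − A_n)/(n + 1), checked against the add-everything-up approach (the data values are illustrative):

```python
def update_average(avg_n, n, x):
    """New average of n+1 numbers, given the average avg_n of the first n."""
    return avg_n + (x - avg_n) / (n + 1)

data = [4, 8, 15, 16, 23, 42]
avg, count = 0.0, 0
for x in data:
    avg = update_average(avg, count, x)
    count += 1
print(avg, sum(data) / len(data))  # both 18.0
```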