The mean and variance dominate statistical measurements in both the time and frequency domains. They are also reflected by so-called amplitude domain measurements. The most basic of these is called a histogram. To measure a histogram, break a signal’s potential amplitude range into a contiguous series of N amplitude categories (e.g., x lies between a and b) and associate a counter with each category.
Initialize the measurement process by zeroing all of the counters. Take a sample from the time-series in question, find the category its amplitude falls within, and increment the associated counter by one. Repeat this action thousands of times. Plot the retained counts (vertically) against their category amplitudes (horizontally). You have just measured a histogram.
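The counting procedure just described can be sketched in a few lines of Python. The bin range, the number of categories, and the use of random.uniform as a stand-in signal are illustrative assumptions, not part of the text:

```python
import random

def measure_histogram(samples, lo, hi, n_bins):
    """Tally samples into n_bins equal-width amplitude categories over [lo, hi)."""
    counts = [0] * n_bins                 # zero all of the counters
    width = (hi - lo) / n_bins
    for x in samples:
        if lo <= x < hi:
            k = int((x - lo) / width)     # find the category the amplitude falls within
            counts[k] += 1                # increment the associated counter
    return counts

# A stand-in "signal": 10,000 samples drawn uniformly from [-1, 1)
signal = [random.uniform(-1.0, 1.0) for _ in range(10_000)]
hist = measure_histogram(signal, -1.0, 1.0, 20)
```

Plotting `hist` against the bin centers gives the measured histogram; with a uniform stand-in signal the bars come out roughly flat.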
A histogram may also be used to graphically present tabular measurements from an experiment or even gaming odds. For example, consider tossing dice. If you toss a single (honestly constructed) die, any one of its six numbered faces may face up and the odds are 1 in 6 that any specific number will be rolled. As a histogram, this amounts to 1 count for each of the possible tossed numbers, 1 through 6, a rectangular distribution.
Now consider rolling two dice (or one die twice) and recording their sum. There are now 36 possible combinations that might be rolled, with sums spanning 2 to 12. However, the 11 different possible sums are not equally probable. There are six combinations totaling 7, five totaling either 6 or 8, four totaling 5 or 9, three totaling 4 or 10, two totaling 3 or 11, and only one way to roll either a 2 or a 12. The histogram now takes on a triangular shape. Finally, consider what happens when you roll three dice. The number of combinations increases to 216 with 16 different possible sums. Rolling a 10 or an 11 is 27 times as likely as rolling a 3 or an 18. With three dice in the game, the histogram takes on a bell-shaped curve. These three histograms are plotted below for comparison.
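The counts quoted above come from direct enumeration, which is easy to reproduce. A sketch in Python (the helper name sum_histogram and the use of itertools.product are implementation choices, not anything from the text):

```python
from itertools import product

def sum_histogram(n_dice):
    """Tally every equally likely face combination of n_dice by its sum."""
    counts = {}
    for faces in product(range(1, 7), repeat=n_dice):
        s = sum(faces)
        counts[s] = counts.get(s, 0) + 1
    return counts

two = sum_histogram(2)    # 36 combinations, sums 2..12, triangular
three = sum_histogram(3)  # 216 combinations, sums 3..18, bell-shaped
```

For example, `two[7]` comes out to 6 and `three[10]` to 27, matching the combination counts in the paragraph above.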
The three histograms have different independent-variable spans. If we choose to plot the averaged (mean) number tossed instead of the sums, we can align the three histograms horizontally, making comparisons between the three plots simpler. If we now change the vertical scale of each trace so that each curve bounds a (dimensionless) unit area, we have converted the three histograms to Probability Density Functions (PDFs). Note that if the independent variable (horizontal axis) carries an engineering unit, the vertical (probability density) axis must bear the reciprocal of that unit to render the bounded area dimensionless.
Several important things happen when a histogram is scaled as a probability density function. Since the area under this p(x) curve is 1.0, and the curve spans all known possibilities of the independent variable, x, it may be used to evaluate the probability of x falling between two known bounds, say Xa and Xb: that probability is simply the area under p(x) between Xa and Xb.
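For a binned density estimate, that area can be accumulated bin by bin. A sketch, where the helper name probability_between and the assumption that the bins uniformly cover [lo, hi) are both illustrative:

```python
def probability_between(densities, lo, hi, xa, xb):
    """Approximate P(xa <= x <= xb) as the area under binned densities."""
    n = len(densities)
    width = (hi - lo) / n
    p = 0.0
    for k, d in enumerate(densities):
        left = lo + k * width
        right = left + width
        # count only the part of this bin that overlaps [xa, xb]
        overlap = max(0.0, min(right, xb) - max(left, xa))
        p += d * overlap
    return p

# A flat (rectangular) density of 0.5 over [0, 2]: half the span holds half the probability
p = probability_between([0.5, 0.5, 0.5, 0.5], 0.0, 2.0, 0.5, 1.5)
```

Passing the full span as the bounds returns 1.0, confirming the unit-area property stated above.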
Before we continue, reflect for a moment on the three PDFs plotted from our dice-throwing model. Note the rapid convergence from a rectangular PDF towards a bell-shaped one as independent variables are added or averaged together. This clear progression is observed in all manner of natural phenomena. Since most occurrences involve the summation or integration of many independent component happenings, many things in nature tend to have bell-shaped PDFs. Pass a signal of almost any PDF shape through a filter or averaging process (be it electrical or mechanical) and the output will tend strongly to the naturally occurring mean-centered symmetric bell curve.
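The convergence described above is easy to demonstrate numerically: averaging n independent samples drawn from a rectangular (uniform) PDF yields values whose distribution grows increasingly bell-shaped and concentrated about the mean. The helper name averaged_samples and the sample counts below are illustrative assumptions:

```python
import random
import statistics

random.seed(1)  # repeatable runs for this sketch

def averaged_samples(n_avg, count=50_000):
    """Draw `count` values, each the mean of n_avg independent uniform samples."""
    return [statistics.mean(random.uniform(0.0, 1.0) for _ in range(n_avg))
            for _ in range(count)]

for n in (1, 2, 8):
    xs = averaged_samples(n)
    # the variance of the mean of n uniforms shrinks as (1/12)/n,
    # and a histogram of xs tightens toward a bell curve about 0.5
    print(n, round(statistics.pvariance(xs), 4))
```

A single uniform sample has variance 1/12 (about 0.083); averaging eight of them cuts the variance eightfold, mirroring the filtering behavior described above.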