Normalizing constant

In probability theory, a normalizing constant or normalizing factor is used to reduce any probability function to a probability density function with total probability of one. For example, a Gaussian function can be normalized into a probability density function, which gives the standard normal distribution. In Bayes' theorem, a normalizing constant is used to ensure that the sum of the probabilities of all possible hypotheses equals 1. Other uses of normalizing constants include making the value of a Legendre polynomial at 1 equal to 1, and enforcing the orthogonality of orthonormal functions. A similar concept has been used in areas other than probability, such as for polynomials.

Definition

In probability theory, a normalizing constant is a constant by which an everywhere non-negative function must be multiplied so the area under its graph is 1, e.g., to make it a probability density function or a probability mass function (Continuous Distributions, University of Alabama; Feller 1968, p. 22).
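
As a minimal numerical sketch of this definition (Python with SciPy assumed; the function f is an arbitrary choice for illustration), one can compute the area under a non-negative function and multiply by its reciprocal:

    import numpy as np
    from scipy.integrate import quad

    def f(x):
        # An everywhere non-negative function whose area is finite but not 1.
        return np.exp(-x**4)

    area, _ = quad(f, -np.inf, np.inf)  # integral of f over the real line
    c = 1.0 / area                      # the normalizing constant

    def pdf(x):
        # c * f(x) integrates to 1, so it is a probability density function.
        return c * f(x)

    check, _ = quad(pdf, -np.inf, np.inf)
    print(check)  # ≈ 1.0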

Examples

If we start from the simple Gaussian function
p(x) = e^{-x^2/2}, \quad x \in (-\infty, \infty),
we have the corresponding Gaussian integral
\int_{-\infty}^\infty p(x)\,dx = \int_{-\infty}^\infty e^{-x^2/2}\,dx = \sqrt{2\pi}.
Now if we use the latter's reciprocal value as a normalizing constant for the former, defining a function \varphi(x) as
\varphi(x) = \frac{1}{\sqrt{2\pi}}\, p(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2},
so that its integral is unit,
\int_{-\infty}^\infty \varphi(x)\,dx = \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\,dx = 1,
then the function \varphi(x) is a probability density function (Feller 1968, p. 174). This is the density of the standard normal distribution. (Standard, in this case, means the expected value is 0 and the variance is 1.) The constant \frac{1}{\sqrt{2\pi}} is the normalizing constant of the function p(x).

Similarly,
\sum_{n=0}^\infty \frac{\lambda^n}{n!} = e^{\lambda},
and consequently
f(n) = \frac{\lambda^n e^{-\lambda}}{n!}
is a probability mass function on the set of all nonnegative integers (Feller 1968, p. 156). This is the probability mass function of the Poisson distribution with expected value λ.

Note that if the probability density function is a function of various parameters, so too will be its normalizing constant. The parametrised normalizing constant for the Boltzmann distribution plays a central role in statistical mechanics. In that context, the normalizing constant is called the partition function.
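
Both constants derived above can be checked numerically; the following sketch (Python, SciPy assumed) verifies that the Gaussian integral is √(2π) and that e^(−λ) normalizes the Poisson terms:

    import math
    from scipy.integrate import quad

    # Gaussian: the integral of e^{-x^2/2} over the real line is sqrt(2*pi),
    # so 1/sqrt(2*pi) is the normalizing constant of p(x).
    area, _ = quad(lambda x: math.exp(-x**2 / 2), -math.inf, math.inf)
    print(area, math.sqrt(2 * math.pi))  # both ≈ 2.5066

    # Poisson: sum_n lambda^n / n! equals e^lambda, so multiplying each term
    # by e^{-lambda} yields a probability mass function summing to 1.
    lam = 3.0
    total = sum(lam**n * math.exp(-lam) / math.factorial(n) for n in range(100))
    print(total)  # ≈ 1.0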

Bayes' theorem

Bayes' theorem says that the posterior probability measure is proportional to the product of the prior probability measure and the likelihood function. "Proportional to" implies that one must multiply or divide by a normalizing constant to assign measure 1 to the whole space, i.e., to get a probability measure. In a simple discrete case we have
P(H_0|D) = \frac{P(D|H_0)P(H_0)}{P(D)}
where P(H_0) is the prior probability that the hypothesis is true; P(D|H_0) is the conditional probability of the data given that the hypothesis is true, but given that the data are known it is the likelihood of the hypothesis (or its parameters) given the data; and P(H_0|D) is the posterior probability that the hypothesis is true given the data. P(D) should be the probability of producing the data, but on its own it is difficult to calculate, so an alternative way to describe this relationship is as one of proportionality:
P(H_0|D) \propto P(D|H_0)P(H_0).
Since P(H|D) is a probability, the sum over all possible (mutually exclusive) hypotheses should be 1, leading to the conclusion that
P(H_0|D) = \frac{P(D|H_0)P(H_0)}{\displaystyle\sum_i P(D|H_i)P(H_i)}.
In this case, the reciprocal of the value
P(D) = \sum_i P(D|H_i)P(H_i)
is the normalizing constant (Feller 1968, p. 124). It can be extended from countably many hypotheses to uncountably many by replacing the sum by an integral.

For concreteness, there are many methods of estimating the normalizing constant for practical purposes; they include the bridge sampling technique, the naive Monte Carlo estimator, the generalized harmonic mean estimator, and importance sampling (Gronau 2020).
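
As an illustration of the discrete case, the following sketch (Python; the prior and likelihood values are invented for illustration) divides the prior-times-likelihood products by P(D) so that the posterior sums to 1:

    # Hypothetical priors P(H_i) and likelihoods P(D|H_i) for three hypotheses.
    priors      = [0.5, 0.3, 0.2]
    likelihoods = [0.10, 0.40, 0.25]

    # Unnormalized posterior: P(D|H_i) * P(H_i).
    unnormalized = [l * p for l, p in zip(likelihoods, priors)]

    # P(D) = sum_i P(D|H_i) P(H_i); its reciprocal is the normalizing constant.
    p_data = sum(unnormalized)
    posterior = [u / p_data for u in unnormalized]

    print(posterior, sum(posterior))  # the posterior sums to 1

And here is a sketch of the naive Monte Carlo estimator mentioned above, for a toy model in which the result can be checked against a closed form (the model and all names are assumptions for illustration): drawing parameters from the prior and averaging the likelihood estimates P(D).

    import math, random

    # Toy model: theta ~ Normal(0, 1) prior; one observation d ~ Normal(theta, 1).
    d, N = 1.5, 100_000

    # Naive Monte Carlo: P(D) ≈ (1/N) * sum of P(D|theta_i), theta_i ~ prior.
    estimate = sum(
        math.exp(-(d - random.gauss(0.0, 1.0))**2 / 2) / math.sqrt(2 * math.pi)
        for _ in range(N)
    ) / N

    # Exact marginal likelihood for this model: d ~ Normal(0, sqrt(2)).
    exact = math.exp(-d**2 / 4) / math.sqrt(4 * math.pi)
    print(estimate, exact)  # the estimate converges to the exact value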

Non-probabilistic uses

The Legendre polynomials are characterized by orthogonality with respect to the uniform measure on the interval [−1, 1] and the fact that they are normalized so that their value at 1 is 1. The constant by which one multiplies a polynomial so its value at 1 is 1 is a normalizing constant.

Orthonormal functions are normalized such that
\langle f_i, f_j \rangle = \delta_{i,j}
with respect to some inner product ⟨f, g⟩.

The constant 1/√2 is used to establish the hyperbolic functions cosh and sinh from the lengths of the adjacent and opposite sides of a hyperbolic triangle.
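
A short sketch of the polynomial case (Python with NumPy; the unnormalized polynomial q is an arbitrary example):

    from numpy.polynomial import Polynomial, legendre

    # An arbitrary polynomial q(x) = 2 + 3x^2, coefficients in increasing degree.
    q = Polynomial([2.0, 0.0, 3.0])

    # Normalizing constant: the reciprocal of q(1), so that the value at 1 is 1.
    p = (1.0 / q(1.0)) * q
    print(p(1.0))  # 1.0

    # The Legendre polynomials are already normalized this way: P_n(1) = 1.
    for n in range(5):
        coeffs = [0.0] * n + [1.0]           # coefficient vector selecting P_n
        print(legendre.legval(1.0, coeffs))  # each prints 1.0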


References

  • Continuous Distributions at the Department of Mathematical Sciences, University of Alabama in Huntsville.
  • Feller, William (1968). An Introduction to Probability Theory and Its Applications, Volume I. John Wiley & Sons. ISBN 0-471-25708-7.
  • Gronau, Quentin (2020). "bridgesampling: An R Package for Estimating Normalizing Constants". The Comprehensive R Archive Network. Retrieved September 11, 2021.

