Physics 402 - Problem Set 4 - Solutions

Solutions prepared by Tobin Fricke <tobin@pas>, 2006-03-04

10. Some exercises with Characteristic Functions (10 pts)

10.1 If η=aζ+b where a and b are constants, what is the relation between the characteristic function of ζ and that of η? (10 points)

Writing f_ζ(t) = ⟨Exp[i t ζ]⟩ for the characteristic function of ζ, we have

	f_η(t) = ⟨Exp[i t (aζ + b)]⟩ = Exp[i t b] ⟨Exp[i a t ζ]⟩ = Exp[i t b] f_ζ(a t)

The characteristic function of ζ is scaled in its argument by a, and multiplied by a phase factor Exp[i t b].
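This relation can be checked numerically. In the following NumPy sketch (the values a = 2, b = 1, t = 0.7 are arbitrary test choices), ζ is taken to be standard normal, whose characteristic function f_ζ(t) = exp(-t²/2) is known, and a Monte Carlo estimate of f_η(t) is compared with exp(itb) f_ζ(at):

```python
import numpy as np

# Monte Carlo check of f_eta(t) = exp(i t b) * f_zeta(a t),
# with zeta standard normal so that f_zeta(t) = exp(-t^2 / 2).
rng = np.random.default_rng(0)
zeta = rng.standard_normal(1_000_000)

a, b, t = 2.0, 1.0, 0.7   # arbitrary test values
eta = a * zeta + b

# Monte Carlo estimate of the characteristic function of eta
f_eta = np.mean(np.exp(1j * t * eta))

# Prediction from the scaling relation
predicted = np.exp(1j * t * b) * np.exp(-(a * t) ** 2 / 2)

print(abs(f_eta - predicted))  # small (Monte Carlo error)
```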

10.2(a) Find the characteristic function of the Gaussian distribution (5 points)

The probability density function of the Gaussian distribution is:

	p(η) = 1/(σ √(2π)) Exp[-(η - μ)²/(2σ²)]

The characteristic function is the expectation value of Exp[i t η]:

	f(t) = ∫ p(η) Exp[i t η] dη = Exp[i μ t - σ² t²/2]
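The closed form f(t) = Exp[i μ t - σ² t²/2] can also be checked numerically; the following NumPy sketch evaluates the defining integral by a simple quadrature for one arbitrary choice of μ, σ, and t:

```python
import numpy as np

# Numerical check of f(t) = exp(i mu t - sigma^2 t^2 / 2)
# for one arbitrary choice of mu, sigma, and t.
mu, sigma, t = 1.0, 2.0, 0.5

x = np.linspace(-30.0, 30.0, 200_001)   # density is negligible beyond +-30
dx = x[1] - x[0]
p = np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

f_numeric = np.sum(p * np.exp(1j * t * x)) * dx   # quadrature of the integral
f_closed = np.exp(1j * mu * t - sigma**2 * t**2 / 2)
print(abs(f_numeric - f_closed))  # tiny quadrature error
```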

10.2(b) Find the first four moments explicitly as functions of μ and σ (5 points)

The moments may be calculated from the characteristic function:

	⟨ηⁿ⟩ = (-i)ⁿ dⁿf/dtⁿ |_{t=0}

We can ask Mathematica to calculate this for the first four values of n using the code given in the solution to problem set 1. The result:

	⟨η⟩  = μ
	⟨η²⟩ = μ² + σ²
	⟨η³⟩ = μ³ + 3μσ²
	⟨η⁴⟩ = μ⁴ + 6μ²σ² + 3σ⁴

Alternatively and equivalently, we can just expand the characteristic function in a Taylor series about t = 0, and then read off the moments, up to the factors of iⁿ and 1/n!:

	f(t) = 1 + iμt - (μ² + σ²) t²/2 - i(μ³ + 3μσ²) t³/6 + (μ⁴ + 6μ²σ² + 3σ⁴) t⁴/24 + O(t⁵)

The coefficient of tⁿ is iⁿ ⟨ηⁿ⟩ / n!.
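The same moments can be reproduced with a short SymPy sketch that differentiates the characteristic function Exp[i μ t - σ² t²/2] and applies ⟨ηⁿ⟩ = (-i)ⁿ dⁿf/dtⁿ at t = 0:

```python
import sympy as sp

t, mu = sp.symbols('t mu', real=True)
sigma = sp.symbols('sigma', positive=True)

# Characteristic function of the Gaussian (from part (a))
f = sp.exp(sp.I * mu * t - sigma**2 * t**2 / 2)

# n-th moment = (-i)^n * d^n f / dt^n evaluated at t = 0
moments = [sp.simplify((-sp.I)**n * sp.diff(f, t, n).subs(t, 0))
           for n in range(1, 5)]
for n, m in enumerate(moments, start=1):
    print(n, sp.expand(m))
```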

Note

The Wikipedia page describing this distribution may be interesting: http://en.wikipedia.org/wiki/Gaussian_distribution , as may be the page describing Characteristic functions: http://en.wikipedia.org/wiki/Characteristic_function_(probability_theory)

11. Cauchy (Lorentz) distribution (10 points)

11.1 Find the characteristic function (5 points)

The probability density function of the (centered) Cauchy distribution with width γ is:

	p(x) = (γ/π) 1/(x² + γ²)

The integral to get the characteristic function is the Fourier transform of the probability density function:

	f(t) = (γ/π) ∫ Exp[i t x]/(x² + γ²) dx

Evaluate this as a contour integral using Jordan's lemma. There are simple poles at x = ±iγ, so we have to split the integral up into two cases. For t > 0, close the contour in the upper half-plane and pick up the residue at x = +iγ:

	f(t) = Exp[-γ t]    (t > 0)

For t < 0, close the contour in the lower half-plane and pick up the residue at x = -iγ:

	f(t) = Exp[γ t]    (t < 0)

We find:

	f(t) = Exp[-γ |t|]
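The result f(t) = Exp[-γ|t|] can be sanity-checked by Monte Carlo. The NumPy sketch below uses γ = 1 (NumPy's standard Cauchy sampler) and a few arbitrary values of t:

```python
import numpy as np

# Monte Carlo check of f(t) = exp(-gamma |t|), with gamma = 1
# (NumPy's standard_cauchy draws from p(x) = 1/(pi (1 + x^2))).
rng = np.random.default_rng(1)
x = rng.standard_cauchy(1_000_000)

for t in (-2.0, 0.5, 1.5):
    estimate = np.mean(np.exp(1j * t * x))     # Monte Carlo <exp(i t x)>
    print(t, abs(estimate - np.exp(-abs(t))))  # small Monte Carlo error
```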

11.2 Show that all the odd moments are zero and that all the even moments (except μ0) are infinite. (5 points)

The nth moment is the expectation value of xⁿ, which may be found from the following integral:

	⟨xⁿ⟩ = (γ/π) ∫ xⁿ/(x² + γ²) dx

For n = 0, the integral converges: ∫ dx/(x² + γ²) = π/γ, so μ₀ = (γ/π)(π/γ) = 1, as required by normalization.

For any odd n, the integrand is odd and the integration is over a symmetric interval, so the integral vanishes (as a principal value; for n ≥ 1 it does not converge absolutely). Odd moments are zero.

For any even n ≥ 2, the integrand grows like x^(n-2) for large |x|, so the integral diverges. Even moments are infinite.
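Both statements can be made concrete by cutting the moment integrals off at ±a and letting a → ∞, as in this SymPy sketch (γ = 1 for simplicity):

```python
import sympy as sp

x = sp.symbols('x', real=True)
a = sp.symbols('a', positive=True)   # half-width of a symmetric cutoff

# Standard Cauchy density (gamma = 1 for simplicity)
p = 1 / (sp.pi * (1 + x**2))

odd = sp.integrate(x * p, (x, -a, a))        # first moment, cut off at +-a
even = sp.integrate(x**2 * p, (x, -a, a))    # second moment, cut off at +-a

print(odd)                       # 0 for every a: the odd moment vanishes
print(sp.limit(even, a, sp.oo))  # oo: the even moment diverges
```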

What does this mean in terms of the characteristic function? Exp[-γ|t|] has a kink at t = 0: it is not differentiable there, so it has no Taylor expansion about t = 0 from which moments could be read off. This is consistent with the divergence of all the even moments beyond μ₀.

Note

The Wikipedia page describing this distribution may be interesting: http://en.wikipedia.org/wiki/Cauchy_distribution

12. Angular probability distribution (10 points)

Assume that the angle φ can take any value between 0 and 2π with equal probability.

What are the probability distributions of x=cos φ and y = sin φ?

This is similar to problem 8 (on the previous problem set), except that this time we are given the transformation function and asked to find the resulting distribution, whereas before we were given the desired distribution and asked to find the transformation by which it could be obtained. We work from the conservation of probability formula:

	p_x(x) |dx| = p_φ(φ) |dφ|,   i.e.   p_x(x) = p_φ(φ) |dφ/dx|

We're given that φ is uniform for φ ∈ [0, 2π], which means p_φ(φ) = 1/(2π). We solve for p_x(x):

Solving for φ involves inverting the relationship x(φ) = cos φ. However, this relation is not a bijection: the inverse is multi-valued, and for any integer n

	φ = ± arccos(x) + 2πn

It turns out not to matter too much, as the 2πn term drops out upon taking the derivative:

	dφ/dx = ∓ 1/√(1 - x²)

The multivaluedness reappears when we require that this probability density be normalized: we must multiply the density by two, to account for the two branches of the inverse that lie within [0, 2π]. Indeed, with only one branch,

	∫_{-1}^{1} (1/(2π)) 1/√(1 - x²) dx = 1/2

We also see that we should have taken the minus sign in the ±, since a probability density must be non-negative. Putting this together with the factor of two, we have:

	p_x(x) = 1/(π √(1 - x²)),   -1 < x < 1
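This density can be checked with a quick histogram experiment: sample φ uniformly, form x = cos φ, and compare the empirical density near x = 0 with p_x(0) = 1/π (the bin width of 0.1 is an arbitrary choice):

```python
import numpy as np

# Monte Carlo check of p_x(x) = 1 / (pi sqrt(1 - x^2)):
# draw phi uniform on [0, 2pi), set x = cos(phi), and estimate the
# density of x near x = 0, where p_x(0) = 1/pi.
rng = np.random.default_rng(2)
phi = rng.uniform(0, 2 * np.pi, 1_000_000)
x = np.cos(phi)

width = 0.1   # bin width (arbitrary choice)
density_at_0 = np.mean(np.abs(x) < width / 2) / width
print(abs(density_at_0 - 1 / np.pi))  # small (Monte Carlo + binning error)
```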

The distribution of y turns out to be the same, since y = sin φ = cos(φ - π/2) and a uniform φ is still uniform after a shift.

What is the joint distribution of x and y?

Any combination of x and y that produces a point on the unit circle is equally likely; any combination that is not on the unit circle has probability zero. The joint probability distribution is therefore:

	p(x, y) = (1/(2π)) δ(√(x² + y²) - 1)

where δ is the Dirac delta function.

Are they statistically independent?

The variables x and y are clearly not independent, as knowing the value of one of the random variables reduces the possible values of the second variable from a continuum to at most two distinct values:

	y = ± √(1 - x²)

Statistical independence requires

	p(x, y) = p_x(x) p_y(y)

which clearly does not hold here: the product of the marginal densities is nonzero everywhere on the open square (-1, 1) × (-1, 1), while the joint density is supported only on the unit circle.
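The failure of independence can also be exhibited with a single expectation value: if x and y were independent, we would have ⟨x²y²⟩ = ⟨x²⟩⟨y²⟩. Averaging over uniform φ with SymPy gives 1/8 on the left and 1/4 on the right:

```python
import sympy as sp

phi = sp.symbols('phi', real=True)

# For phi uniform on [0, 2pi]: if x = cos(phi) and y = sin(phi) were
# independent, we would have <x^2 y^2> = <x^2> <y^2>.  Compare the two.
def avg(expr):
    return sp.integrate(expr / (2 * sp.pi), (phi, 0, 2 * sp.pi))

lhs = avg(sp.cos(phi)**2 * sp.sin(phi)**2)       # <x^2 y^2> = 1/8
rhs = avg(sp.cos(phi)**2) * avg(sp.sin(phi)**2)  # <x^2> <y^2> = 1/4
print(lhs, rhs)  # 1/8 vs 1/4: not equal, so x and y are dependent
```

Note that x and y are nonetheless uncorrelated, since ⟨xy⟩ = ⟨cos φ sin φ⟩ = 0; independence is a much stronger condition than vanishing correlation.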

Created by Mathematica (March 6, 2006)