Monte Carlo Integration with Mathematica 3.0
Author
S. Saarinen
Title
Monte Carlo Integration with Mathematica 3.0
Description
A new integration method is included with Mathematica 3.0.
Category
Academic Articles & Supplements
Keywords
URL
http://www.notebookarchive.org/2018-10-10naxkd/
DOI
https://notebookarchive.org/2018-10-10naxkd
Date Added
2018-10-02
Date Last Modified
2018-10-02
File Size
18.03 kilobytes
Supplements
Rights
Redistribution rights reserved




Monte Carlo Integration with Mathematica 3.0
A new integration method is included with Mathematica 3.0.
by Sirpa Saarinen
The Monte Carlo integration method is used to numerically approximate
an integral by a sum that is the average of the integrand evaluated at a set of points:
	∫_a^b f(x) dx ≈ (b - a) · (1/N) ∑_{i=1}^N f(x_i)
The formula is similar to a quadrature formula, except that in a Monte Carlo
method the x_i's are independent, often random, sample points.
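The estimator can be sketched in ordinary code. The following is a minimal illustration in Python rather than Mathematica; mc_integrate is a hypothetical helper written for this sketch, not a built-in of either system:

```python
import random

def mc_integrate(f, a, b, n, seed=0):
    # Hypothetical helper: estimate the integral of f over [a, b] as
    # (b - a) times the average of f at n independent uniform sample points.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += f(rng.uniform(a, b))
    return (b - a) * total / n

# The average of f(x_i) converges to the mean value of f on [a, b],
# so the estimate converges to the integral (here, 0.5 for x on [0, 1]).
est = mc_integrate(lambda x: x, 0.0, 1.0, 10000, seed=1)
```

With 10000 points the statistical error for this integrand is on the order of a few thousandths.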
The Monte Carlo method is used when traditional
quadrature methods either fail to converge or are computationally too
inefficient. Such cases arise when one is integrating certain special functions
or when doing multidimensional integration, both of which might cause a
prohibitive number of quadrature points to be used when high precision is
needed. The Monte Carlo method solves this problem by choosing independent
sample points in the integration region and forming an average.
Traditionally two sets of independent sample points have been used: pseudo-random and
quasi-random sequences. Pseudo-random sequences are what we usually call
random numbers, and quasi-random sequences are sequences that, when used in
the estimates, give low approximation errors (at least theoretically;
exceptions always exist).
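To make the quasi-random idea concrete, here is a Python sketch of the van der Corput sequence, the one-dimensional building block of Halton-type quasi-random sequences. This is only an illustration of the concept, not the sequence Mathematica itself uses:

```python
def van_der_corput(n, base=2):
    # n-th quasi-random point: reflect the base-`base` digits of n
    # about the radix point. Halton sequences pair these with a
    # different prime base per dimension.
    q, denom = 0.0, 1.0
    while n > 0:
        n, rem = divmod(n, base)
        denom *= base
        q += rem / denom
    return q

# Successive points fill [0, 1) evenly instead of clumping at random:
points = [van_der_corput(i) for i in range(1, 9)]
# -> [0.5, 0.25, 0.75, 0.125, 0.625, 0.375, 0.875, 0.0625]
```

Each new point lands in the largest gap left by the previous ones, which is why such sequences tend to give low approximation errors.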
In Version 3.0 of Mathematica you can now experiment with these two
types of Monte Carlo integration methods in a convenient way. The
function NIntegrate has two new methods: MonteCarlo and QuasiMonteCarlo.
Let us first review them briefly. The method MonteCarlo uses pseudo-random
points in calculating an integral:
In[]:=
NIntegrate[x, {x, 0, 1}, Method -> MonteCarlo]
This uses a sequence generated by Random. However, sometimes you want to see
the same answer turn up each time, and you can specify your favorite
seed by giving it as an argument to the method:
In[]:=
NIntegrate[x,{x,0,1},Method->MonteCarlo[90]]
Here we used the seed 90. If we do this calculation again, we will
get the same answer. However, this is the only way to set the seed for
your Monte Carlo integration---using SeedRandom before a call like In[1]
will not guarantee you any specific random number sequence.
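The effect of an explicit seed can be mimicked outside Mathematica. This Python sketch, with a hypothetical seeded_mc helper, shows that re-running with the same seed reproduces the same estimate:

```python
import random

def seeded_mc(seed, n=1000):
    # Hypothetical helper: Monte Carlo estimate of the integral of x
    # over [0, 1] using an explicitly seeded pseudo-random sequence.
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n)) / n

# The same seed reproduces the same sample points, hence the same
# answer, much like MonteCarlo[90] above.
est1 = seeded_mc(90)
est2 = seeded_mc(90)
```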
The method QuasiMonteCarlo uses a quasi-random sequence, similar to the one
found in [Wozniakowski 1991]. Here we have only one choice for the sequence,
In[]:=
NIntegrate[x E^(y^2),{x,0,1},{y,x,1},
Method -> QuasiMonteCarlo]
but we are guaranteed to always get the same result. In our experience the
QuasiMonteCarlo method will give a more accurate answer than the
MonteCarlo method.
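A rough illustration of why quasi-random points tend to do better, sketched in Python with a base-2 van der Corput sequence standing in for the quasi-random sequence (this is not Mathematica's implementation): for the integral of x over [0, 1], the quasi-random average is essentially exact, while the pseudo-random one carries the usual statistical error.

```python
import random

def van_der_corput(n, base=2):
    # 1-D low-discrepancy sequence; a stand-in for Mathematica's
    # quasi-random sequence, used here only for illustration.
    q, denom = 0.0, 1.0
    while n > 0:
        n, rem = divmod(n, base)
        denom *= base
        q += rem / denom
    return q

N = 1023
# Both averages estimate the integral of x over [0, 1], which is 0.5.
quasi = sum(van_der_corput(i) for i in range(1, N + 1)) / N
rng = random.Random(0)
pseudo = sum(rng.random() for _ in range(N)) / N
quasi_err = abs(quasi - 0.5)    # essentially zero for N = 2^10 - 1
pseudo_err = abs(pseudo - 0.5)  # the usual ~1/Sqrt[N]-sized error
```

For N = 2^10 - 1 the first N van der Corput points are exactly the fractions m/1024, so their average hits 0.5 exactly; the pseudo-random average misses by an amount on the order of 1/Sqrt[N].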
Remember that in this type of method the error is proportional to 1/Sqrt[N],
where N is the number of points used. You can specify the number of points
used in a MonteCarlo calculation by changing MaxPoints; otherwise, a default
value of 50000 will be used.
If you specify MaxPoints only but no method, then the QuasiMonteCarlo
method is used. The Monte Carlo methods in Mathematica are non-adaptive,
so when you specify a certain MaxPoints, the integrand will be evaluated at all of these
points, uniformly throughout the integration region, and this might be
time consuming if the integrand does not converge easily. Therefore, the time
it takes to evaluate an integral is proportional to MaxPoints and depends,
in some cases, on the form of your function. This is related to the Compiled
option (more about this below). Here we try to use 900 points to
evaluate the integrand:
In[]:=
NIntegrate[x^3, {x,0,9}, Method->MonteCarlo,
MaxPoints->900]
In[]:=
InputForm[%4]
and we get an inaccurate answer (the correct one is 1640.25). Note that here
the output is truncated because the AccuracyGoal and PrecisionGoal
are automatically set to 2 for these two methods in NIntegrate (the default for
the other methods is 6). You can change them just as with any other method,
but the calculation times might become very long.
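The 1/Sqrt[N] scaling can be checked directly. This Python sketch (not Mathematica's algorithm) estimates the same integral of x^3 over [0, 9], whose exact value is 1640.25, with 900 and then 90000 pseudo-random points:

```python
import random

def mc_cube(n, seed=90):
    # Hypothetical sketch: pseudo-random estimate of the integral of
    # x^3 over [0, 9] as 9 times the average of x^3 at n sample points.
    rng = random.Random(seed)
    return 9.0 * sum(rng.uniform(0.0, 9.0) ** 3 for _ in range(n)) / n

# The error shrinks roughly like 1/Sqrt[N], so 100x the points buys
# about one extra decimal digit of accuracy.
err_900 = abs(mc_cube(900) - 1640.25)
err_90000 = abs(mc_cube(90000) - 1640.25)
```

With 900 points the standard error of the estimate is around 60, consistent with the inaccurate answer seen above; with 90000 points it drops to about 6.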
For one-dimensional integrands, the MonteCarlo and QuasiMonteCarlo
methods can take much longer than just using NIntegrate's default methods
or the other methods for one-dimensional integration. However, for
multi-dimensional integration, the Monte Carlo-type methods are often
faster:
In[]:=
NIntegrate[
0.1*Cos[0.1*x[1]*x[2]*x[3]*x[4]] -
0.07000000000000001*
Sin[0.1*x[1]*x[2]*x[3]*x[4]]*
x[1]*x[2]*x[3]*x[4] -
0.006000000000000002*
Cos[0.1*x[1]*x[2]*x[3]*x[4]]*
x[1]^2*x[2]^2*x[3]^2*
x[4]^2 + 0.0001000000000000001*
Sin[0.1*x[1]*x[2]*x[3]*x[4]]*
x[1]^3*x[2]^3*x[3]^3*x[4]^3,
{x[1], 0, 1},
{x[2], 0, 1},
{x[3], 0, 1},
{x[4], 0, 1},
Method->MonteCarlo[35]]
vs.
In[]:=
NIntegrate[
0.1*Cos[0.1*x[1]*x[2]*x[3]*x[4]] -
0.07000000000000001*
Sin[0.1*x[1]*x[2]*x[3]*x[4]]*
x[1]*x[2]*x[3]*x[4] -
0.006000000000000002*
Cos[0.1*x[1]*x[2]*x[3]*x[4]]*
x[1]^2*x[2]^2*x[3]^2*
x[4]^2 + 0.0001000000000000001*
Sin[0.1*x[1]*x[2]*x[3]*x[4]]*
x[1]^3*x[2]^3*x[3]^3*x[4]^3,
{x[1], 0, 1},
{x[2], 0, 1},
{x[3], 0, 1},
{x[4], 0, 1}
]//Timing
However, the comparison is not really fair since the MonteCarlo method uses
PrecisionGoal and AccuracyGoal equal to 2.
In[]:=
NIntegrate[
0.1*Cos[0.1*x[1]*x[2]*x[3]*x[4]] -
0.07000000000000001*
Sin[0.1*x[1]*x[2]*x[3]*x[4]]*
x[1]*x[2]*x[3]*x[4] -
0.006000000000000002*
Cos[0.1*x[1]*x[2]*x[3]*x[4]]*
x[1]^2*x[2]^2*x[3]^2*x[4]^2 +
0.0001000000000000001*
Sin[0.1*x[1]*x[2]*x[3]*x[4]]*
x[1]^3*x[2]^3*x[3]^3*x[4]^3,
{x[1], 0, 1}, {x[2], 0, 1},
{x[3], 0, 1}, {x[4], 0, 1},
AccuracyGoal->2,PrecisionGoal->2]//Timing
In other cases, the Monte Carlo approach is somewhat more accurate:
In[]:=
k = 0.1;
In[]:=
w = k x1 x2 x3 x4;
In[]:=
e[x1_, x2_, x3_, x4_]:=
(w=k x1 x2 x3 x4;
k Cos[w]-7 k w Sin[w]-
6 k w^2 Cos[w]+k w^3 Sin[w]);
In[]:=
NIntegrate[
e[x1,x2, x3, x4],
{x1,0,1},{x2,0,1},
{x3,0,1},{x4,0,1},
Method->QuasiMonteCarlo]//Timing
vs. the ordinary NIntegrate with lower accuracy:
In[]:=
NIntegrate[
e[x1,x2, x3, x4],
{x1,0,1},{x2,0,1},
{x3,0,1},{x4,0,1},
AccuracyGoal->2,
PrecisionGoal->2]//Timing
(the full calculation using NIntegrate, Method->MultiDimensional takes over
280 seconds).
One thing to look out for, especially with the Monte Carlo methods, is whether
the function is defined outside of NIntegrate. NIntegrate uses HoldAll, so a
definition like the one below will take longer than expected:
In[]:=
k = PolyLog[2,x];
In[]:=
NIntegrate[k,{x,0,.97},
Method->MonteCarlo[78]]//Timing
versus the case where a compiler was used and the function was given in
the first argument explicitly:
In[]:=
NIntegrate[
PolyLog[2,x], {x,0,.97},
Method->MonteCarlo[78]]//Timing
This example is interesting because it is one of the rare cases where
using the compiler makes the timing worse (because a special function was used
that does not compile):
In[]:=
NIntegrate[PolyLog[2, x], {x ,0, .97},
Method -> MonteCarlo[78],
Compiled -> False] //Timing
In most cases, however, the default, Compiled -> True, gives the fastest
timing:
In[]:=
NIntegrate[Sin[x]^2 Cos[x y],
{x, 0, 1}, {y, 2, 2 I},
Method -> QuasiMonteCarlo] //Timing
In[]:=
NIntegrate[Sin[x]^2 Cos[x y],
{x, 0, 1}, {y, 2, 2 I},
Method -> QuasiMonteCarlo,
Compiled -> False] //Timing
In some cases, the compiler cannot handle the small numbers that appear
in the calculation:
In[]:=
NIntegrate[x^2 BesselK[2,x],
{x, 0, Infinity},
Method -> MonteCarlo[90]] //Timing
and then using non-compiled evaluation
In[]:=
NIntegrate[x^2 BesselK[2,x],
{x, 0, Infinity},
Method -> MonteCarlo[90],
Compiled -> False] //Timing
gets rid of annoying messages. As with the other methods in NIntegrate,
you can integrate over complicated domains:
In[]:=
NIntegrate[x^2 y z^2,
{x,0,2}, {y,0,Sqrt[1-x^2]}, {z,0,1/(1+y)},
Method->QuasiMonteCarlo]
Basically, you can use the methods MonteCarlo and QuasiMonteCarlo to
experiment with Monte Carlo integration in Mathematica. However, in order
to get accurate answers, it is best to try the default NIntegrate methods
first and resort to the less accurate MonteCarlo-type methods only
when necessary.
References
Wozniakowski, H. ``Average case complexity of multivariate integration,''
Bulletin of the American Mathematical Society, Vol. 24, No. 1, p. 185 (1991).
About the Author
Sirpa Saarinen is a kernel developer with Wolfram Research, Inc. She received her Ph.D. from the University of Illinois at Urbana-Champaign in Computer Science. Her research areas in Mathematica include ODEs, optimization, numerical integration and linear algebra.
Sirpa Saarinen
Wolfram Research, Inc.
100 Trade Center Dr
Champaign, IL 61820
saarinen@wolfram.com


Cite this as: S. Saarinen, "Monte Carlo Integration with Mathematica 3.0" from the Notebook Archive (2002), https://notebookarchive.org/2018-10-10naxkd


