Multiple-try Metropolis is a [[sampling method]] that is a modified form of the [[Metropolis-Hastings]] method, first presented by Liu, Liang, and Wong in 2000.
It is designed to help the sampling trajectory converge faster
by increasing both the step size and the acceptance rate.


==Background==
===Problems with Metropolis-Hastings===
In [[Markov chain Monte Carlo]], the [[Metropolis–Hastings algorithm]] (MH) can be used to sample from a [[probability distribution]] which is difficult to sample from directly. However, the MH algorithm requires the user to supply a proposal distribution, which can be relatively arbitrary. In many cases, one uses a Gaussian distribution centered on the current point in the probability space, of the form <math>Q(x'; x^t)=\mathcal{N}(x^t;\sigma^2 I) \,</math>. This proposal distribution is convenient to sample from and may be the best choice if one has little knowledge about the target distribution, <math>\pi(x) \,</math>. If desired, one can use the more general [[multivariate normal distribution]], <math>Q(x'; x^t)=\mathcal{N}(x^t;\mathbf{\Sigma})</math>, where <math>\mathbf{\Sigma}</math> is the covariance matrix which the user believes is similar to the target distribution.
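As a concrete illustration (not part of the original article), a random-walk Metropolis–Hastings sampler with this Gaussian proposal can be sketched in Python; the target density and the value of <math>\sigma \,</math> below are placeholder choices:

```python
import numpy as np

def metropolis_hastings(target_pdf, x0, sigma, n_steps, seed=None):
    """Random-walk MH with proposal Q(x'; x_t) = N(x_t, sigma^2 I)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = [x.copy()]
    for _ in range(n_steps):
        proposal = x + sigma * rng.standard_normal(x.shape)
        # The proposal is symmetric, so the acceptance ratio
        # reduces to pi(x') / pi(x_t).
        if rng.random() < target_pdf(proposal) / target_pdf(x):
            x = proposal
        samples.append(x.copy())
    return np.array(samples)

# Example: sample a 1-D standard normal target (unnormalized density).
normal_pdf = lambda x: np.exp(-0.5 * np.sum(x**2))
chain = metropolis_hastings(normal_pdf, x0=[0.0], sigma=1.0,
                            n_steps=5000, seed=0)
```

With <math>\sigma \,</math> of order the target's own scale, this sketch accepts a reasonable fraction of proposals; the tuning problem described below appears when <math>\sigma \,</math> is far from that scale.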


Although this method must converge to the stationary distribution in the limit of infinite sample size, in practice progress can be exceedingly slow. If <math>\sigma^2 \,</math> is too large, almost all steps under the MH algorithm will be rejected. On the other hand, if <math>\sigma^2 \,</math> is too small, almost all steps will be accepted, and the Markov chain will resemble a random walk through the probability space. In the simpler case of <math>Q(x'; x^t)=\mathcal{N}(x^t;I) \,</math>, <math>N \,</math> steps of such a random walk only cover an expected distance of order <math>\sqrt{N} \,</math>. In this event, the Markov chain will not fully explore the probability space in any reasonable amount of time. Thus the MH algorithm requires reasonable tuning of the scale parameter (<math>\sigma^2 \,</math> or <math>\mathbf{\Sigma}</math>).
 
===Problems with high dimensionality===
Even if the scale parameter is well-tuned, progress can still be exceedingly slow as the dimensionality of the problem increases. To see this, again consider <math>Q(x'; x^t)=\mathcal{N}(x^t;I) \,</math>. In one dimension, this corresponds to a Gaussian distribution with mean 0 and variance 1, so the mean step is zero; the mean squared step size, however, is given by
 
:<math>\langle x^2 \rangle =\int_{-\infty}^{\infty}x^2\frac{1}{\sqrt{2 \pi}}e^{-\frac{x^2}{2}}\,dx=1</math>
 
As the number of dimensions increases, the expected step size becomes larger and larger. In <math>N \,</math> dimensions, the probability density <math>P_N(r) \,</math> of moving a radial distance <math>r \,</math> is related to the [[chi distribution]] and is given by
 
:<math>P_N(r) \propto r^{N-1}e^{-r^2/2}</math>
 
This distribution is peaked at <math>r=\sqrt{N-1} \,</math>, which is <math>\approx\sqrt{N} \,</math> for large <math>N \,</math>. This means that the typical step size grows roughly as the square root of the number of dimensions. For the MH algorithm, such large steps will almost always land in regions of low probability, and will therefore be rejected.
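This scaling is easy to verify numerically. The sketch below (an illustration, not from the article) draws unit-variance Gaussian steps in increasing dimension and compares the mean step length to <math>\sqrt{N} \,</math>:

```python
import numpy as np

rng = np.random.default_rng(0)
for n in (1, 10, 100, 1000):
    # 10,000 independent steps drawn from N(0, I_n).
    steps = rng.standard_normal((10000, n))
    # The step length ||step|| follows a chi distribution with n
    # degrees of freedom, whose mean approaches sqrt(n) for large n.
    mean_length = np.linalg.norm(steps, axis=1).mean()
    print(n, round(mean_length, 2), round(np.sqrt(n), 2))
```

For large <math>N \,</math> the printed mean length and <math>\sqrt{N} \,</math> agree to within a fraction of a percent, confirming the chi-distribution argument above.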
 
If we now add the scale parameter <math>\sigma^2 \,</math> back in, we find that to retain a reasonable acceptance rate we must make the transformation <math>\sigma^2 \rightarrow \sigma^2/N</math>. The acceptance rate can now be made reasonable, but the exploration of the probability space becomes increasingly slow. To see this, consider a slice along any one dimension of the problem. After the scale transformation above, the expected step size in any one dimension is not <math>\sigma \,</math> but <math>\sigma/\sqrt{N}</math>. As this step size is much smaller than the "true" scale of the probability distribution (assuming that <math>\sigma \,</math> is somehow known a priori, which is the best possible case), the algorithm executes a random walk along every parameter.
 
==The Multiple-try Metropolis algorithm==
 
Suppose <math>Q(\mathbf{x},\mathbf{y})</math> is an arbitrary [[proposal function]]. We require that <math>Q(\mathbf{x},\mathbf{y})>0</math> if and only if <math>Q(\mathbf{y},\mathbf{x})>0</math>. Additionally, let <math>\pi(\mathbf{x})</math> denote the target probability density.
 
Define <math>w(\mathbf{x},\mathbf{y})=\pi(\mathbf{x})Q(\mathbf{x},\mathbf{y})\lambda(\mathbf{x},\mathbf{y})</math> where <math>\lambda(\mathbf{x},\mathbf{y})</math> is a non-negative symmetric function in <math>\mathbf{x}</math> and <math>\mathbf{y}</math> that can be chosen by the user.  
 
Now suppose the current state is <math>\mathbf{x}</math>. The MTM algorithm is as follows:
 
1) Draw ''k'' independent trial proposals <math>\mathbf{y}_1,\ldots,\mathbf{y}_k</math> from <math>Q(\mathbf{x},.)</math>. Compute the weights <math>w(\mathbf{y}_j,\mathbf{x})</math> for each of these.
 
2) Select <math>\mathbf{y}</math> from the trial set <math>\{\mathbf{y}_j\}</math> with probability proportional to the weights <math>w(\mathbf{y}_j,\mathbf{x})</math>.  
 
3) Now produce a reference set by drawing <math>\mathbf{x}_1,\ldots,\mathbf{x}_{k-1}</math> from the distribution <math>Q(\mathbf{y},.)</math>. Set <math>\mathbf{x}_k=\mathbf{x}</math> (the current point).
 
4) Accept <math>\mathbf{y}</math> with probability
:<math>r=\min \left(1, \frac{ w(\mathbf{y}_1,\mathbf{x} )+ \ldots+ w(\mathbf{y}_k,\mathbf{x}) }{ w(\mathbf{x}_1,\mathbf{y})+ \ldots+ w(\mathbf{x}_k,\mathbf{y}) } \right)</math>
 
It can be shown that this method satisfies the [[detailed balance]] property and therefore produces a reversible Markov chain with <math>\pi(\mathbf{x})</math> as the stationary distribution.
 
If <math>Q(\mathbf{x},\mathbf{y})</math> is symmetric (as is the case for the [[multivariate normal distribution]]), then one can choose <math>\lambda(\mathbf{x},\mathbf{y})=\frac{1}{Q(\mathbf{x},\mathbf{y})}</math>, which gives <math>w(\mathbf{x},\mathbf{y})=\pi(\mathbf{x})</math>.
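As a concrete sketch (not from the original article), the four steps above can be written in Python for a symmetric Gaussian proposal with <math>\lambda(\mathbf{x},\mathbf{y})=1/Q(\mathbf{x},\mathbf{y})</math>, so the weights reduce to <math>w(\mathbf{x},\mathbf{y})=\pi(\mathbf{x})</math>; the target density and all parameter values are illustrative assumptions:

```python
import numpy as np

def mtm_step(pi, x, sigma, k, rng):
    """One Multiple-try Metropolis step with a symmetric Gaussian
    proposal and lambda = 1/Q, so that w(x, y) = pi(x)."""
    # 1) Draw k trial proposals y_1..y_k from Q(x, .) and weight them.
    ys = x + sigma * rng.standard_normal((k, x.size))
    wy = np.array([pi(y) for y in ys])
    if wy.sum() == 0:
        return x  # all trials landed where pi vanishes; stay put
    # 2) Select y with probability proportional to the weights.
    y = ys[rng.choice(k, p=wy / wy.sum())]
    # 3) Reference set: x_1..x_{k-1} ~ Q(y, .), plus x_k = x.
    xs = y + sigma * rng.standard_normal((k - 1, x.size))
    wx = np.array([pi(xj) for xj in xs]).sum() + pi(x)
    # 4) Accept y with probability min(1, sum w(y_j,x) / sum w(x_j,y)).
    if rng.random() < min(1.0, wy.sum() / wx):
        return y
    return x

# Example: sample a 2-D standard normal target with k = 5 trials per step.
pi = lambda x: np.exp(-0.5 * np.dot(x, x))
rng = np.random.default_rng(0)
x = np.zeros(2)
samples = []
for _ in range(2000):
    x = mtm_step(pi, x, sigma=1.0, k=5, rng=rng)
    samples.append(x)
samples = np.array(samples)
```

Note that compared with plain MH, each step here evaluates <math>\pi</math> at <math>2k-1</math> extra points, which is exactly the cost discussed in the Disadvantages section below.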
 
===Disadvantages===
Multiple-try Metropolis needs to evaluate the energy (or target density) of <math>2k-1</math> additional states at every step.
If evaluating the energy is the slow part of the process, then this method can be slower than standard MH.
The same holds if the slow part is finding neighbors of a given point or generating random numbers.
It can be argued that this method only appears faster because it packs much more computation into a "single step" than Metropolis-Hastings does.
 
==See also==
* [[Markov chain Monte Carlo]]
* [[Metropolis–Hastings algorithm]]
* [[Detailed balance]]
 
==References==
* Liu, J. S., Liang, F. and Wong, W. H. (2000). "The multiple-try method and local optimization in Metropolis sampling". ''Journal of the American Statistical Association'' '''95'''(449): 121–134. [http://www.jstor.org/stable/2669532 JSTOR]
 
[[Category:Monte Carlo methods]]
[[Category:Markov chain Monte Carlo]]
