Berndt–Hall–Hall–Hausman algorithm

BHHH is an algorithm in numerical optimization similar to the Gauss–Newton algorithm. It is an acronym of its four originators: Ernst Berndt, Bronwyn Hall, Robert Hall, and Jerry Hausman.

Usage

If a nonlinear model is fitted to the data, one often needs to estimate its coefficients through optimization. A number of optimization algorithms have the following general structure. Suppose that the function to be optimized is $Q(\beta)$. The algorithms are iterative, defining a sequence of approximations $\beta_k$ given by

$$\beta_{k+1} = \beta_k - \lambda_k A_k \frac{\partial Q}{\partial \beta}(\beta_k),$$

where $\beta_k$ is the parameter estimate at step $k$, and $\lambda_k$ is a parameter (called the step size) which partly determines the particular algorithm. For the BHHH algorithm, $\lambda_k$ is determined by calculations within a given iterative step, involving a line search until a point $\beta_{k+1}$ is found satisfying certain criteria. In addition, for the BHHH algorithm, $Q$ has the form

$$Q = \sum_{i=1}^{N} Q_i$$

and $A_k$ is calculated using

$$A_k = \left[ \sum_{i=1}^{N} \frac{\partial \ln Q_i}{\partial \beta}(\beta_k) \, \frac{\partial \ln Q_i}{\partial \beta}(\beta_k)' \right]^{-1}.$$
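In matrix form: if $S$ is the $N \times p$ matrix whose $i$-th row is the score $\frac{\partial \ln Q_i}{\partial \beta}(\beta_k)$, the bracketed sum is simply $S'S$. A minimal sketch of this computation (hypothetical NumPy code, not from the original paper):

    import numpy as np

    def bhhh_matrix(S):
        """Compute A_k = [sum_i s_i s_i']^{-1} from the (N, p) score
        matrix S, whose i-th row is d ln Q_i / d beta at beta_k."""
        return np.linalg.inv(S.T @ S)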

In other cases, e.g. Newton–Raphson, $A_k$ can have other forms. The BHHH algorithm has the advantage that, if certain conditions apply, convergence of the iterative procedure is guaranteed.
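As a concrete illustration, below is a minimal sketch of the full iteration, assuming the common maximum-likelihood setting in which $Q$ is a negative log-likelihood, $Q(\beta) = -\sum_i \ln f_i(\beta)$. This is hypothetical NumPy code, not from the original paper; the names `neg_loglik` and `scores` are assumptions, with `scores` returning the $N \times p$ matrix of per-observation scores $\partial \ln f_i / \partial \beta$.

    import numpy as np

    def bhhh_minimize(neg_loglik, scores, beta0, lam0=1.0,
                      shrink=0.5, tol=1e-8, max_iter=200):
        """Minimize Q(beta) = -sum_i ln f_i(beta) by BHHH iterations."""
        beta = np.asarray(beta0, dtype=float)
        for _ in range(max_iter):
            S = scores(beta)                 # (N, p) per-observation scores
            grad = -S.sum(axis=0)            # dQ/dbeta = -(sum of scores)
            # Step direction A_k * grad; solving the system S'S x = grad
            # avoids forming the inverse [sum_i s_i s_i']^{-1} explicitly.
            step = np.linalg.solve(S.T @ S, grad)
            # Backtracking line search for the step size lambda_k.
            lam, q0 = lam0, neg_loglik(beta)
            while neg_loglik(beta - lam * step) >= q0 and lam > 1e-12:
                lam *= shrink
            beta_next = beta - lam * step
            if np.linalg.norm(beta_next - beta) < tol:
                return beta_next
            beta = beta_next
        return beta

For example, for i.i.d. exponential data with density $f(x) = \theta e^{-\theta x}$, the per-observation score is $1/\theta - x_i$ and the iteration converges to the maximum-likelihood estimate $\hat\theta = 1/\bar{x}$:

    x = np.array([0.5, 1.2, 0.3, 2.0, 0.8])
    nll = lambda b: -(np.log(b[0]) - b[0] * x).sum()
    sc = lambda b: (1.0 / b[0] - x)[:, None]        # (N, 1) score matrix
    print(bhhh_minimize(nll, sc, np.array([1.0])))  # about 1 / x.mean()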

Literature

  • Berndt, E., B. Hall, R. Hall, and J. Hausman (1974), "Estimation and Inference in Nonlinear Structural Models", Annals of Economic and Social Measurement, 3, 653–665.
  • Luenberger, D. (1972), Introduction to Linear and Nonlinear Programming, Addison-Wesley, Reading, Massachusetts.
  • Gill, P., W. Murray, and M. Wright (1981), Practical Optimization, Harcourt Brace and Company, London.
  • Sokolov, S.N., and I.N. Silin (1962), "Determination of the coordinates of the minima of functionals by the linearization method", Joint Institute for Nuclear Research preprint D-810, Dubna.