
AI-GEOSTATS: Regression Slope and kriging - the answers

  • Joe Geo
    Message 1 of 1, May 11, 2003
      Dear ai-geostats readers,

      Please find below the replies I received to my questions regarding
      regression slope and kriging. The replies follow the original questions,
      repeated below.

      Thanks to Donald Myers and Isobel Clark.

      I have some questions regarding kriging.

      1. Kriging is often cited as a least squares regression method - this I
      understand for linear regression, but how does this actually occur in the
      kriging matrix? Are the covariances the squared values that are being
      minimised?

      2. I read in several papers that it is possible to calculate the slope of
      regression from parameters of the kriging system. Specifically:

      (block variance - kriging variance + abs(lagrangian)) / (block variance -
      kriging variance + 2 x abs(lagrangian))

      I can follow the derivation of the kriging variance and I can see the
      purpose of the Lagrangian, but where does the block variance come from, and
      how (in conceptual terms) does this equation give the regression slope of
      true versus estimated block grades?
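
      For concreteness, the quoted expression evaluates as follows once a kriging
      run has produced the three inputs; the values below are purely illustrative,
      not from any real data.

# Quoted slope-of-regression expression:
# slope = (block_var - kriging_var + |mu|) / (block_var - kriging_var + 2*|mu|)
# The inputs are assumed to come from an ordinary kriging run; the numbers
# below are made up for illustration only.
block_var = 1.20     # variance of true block grades (from the variogram model)
kriging_var = 0.35   # minimized estimation variance reported by kriging
lagrangian = -0.05   # Lagrange multiplier from the ordinary kriging system

slope = (block_var - kriging_var + abs(lagrangian)) / \
        (block_var - kriging_var + 2 * abs(lagrangian))
print(f"slope of regression (true vs estimated): {slope:.3f}")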


      Replies



      Donald E. Myers

      A couple of observations about kriging and regression

      You might want to look at a paper by A. S. Goldberger in the J. American
      Statistical Assn., 1962, where he derives what turns out to be the kriging
      estimator (albeit without any acknowledgement of the geostatistical
      literature, which was pretty sparse at that time). He does it entirely in
      terms of regression.

      Several important distinctions or contrasts between kriging and regression
      (partly theoretical and partly practical):
      1. For regression the response variable does not have to be the same as the
      regressor/control variable(s).
      2. The regression model includes an error term (there are different
      possibilities for the assumptions on this error term, e.g., not
      intercorrelated, constant variance). Usually the regression approach focuses
      on "removing" the error term.
      3. At least sometimes in regression, the regressor/control variable(s) are
      deterministic. To that degree, universal kriging is the analogue of
      generalized regression (see a paper by M. David et al. in Math Geology a few
      years ago comparing universal kriging with a nugget effect model vs trend
      surface analysis).
      4. This one is probably the most important: in the statistical form of
      regression, the covariance values are estimates, not computed values (i.e.,
      not computed from a model for the covariance function or variogram, as they
      are in kriging).
      5. From a least squares perspective, one can fit a "regression" model to
      data without any statistical assumptions at all (of course then strictly
      speaking you can't do any statistical inference).
      6. Finally, and I am sure that this goes back much further (presumably it is
      part of the motivation for Krige's and Matheron's work), if you have
      jointly distributed random variables Z0, Z1, ..., Zn, each with finite
      variances, then the "optimal" estimator of Z0 given the data Z1,..., Zn is
      the conditional expectation of Z0 given Z1,...,Zn ("optimal" meaning
      unbiased and minimal variance of the error of estimation). Moreover, if in
      addition the random variables are multivariate Gaussian, then the conditional
      expectation is linear in Z1,...,Zn. That is,

      E[Z0 | Z1,...,Zn] = mu0 + Sum (i=1,...,n) ai [Zi - mui]

      where mu0 is the expected value of Z0 and the mui's are the expected values
      of the Zi's. This of course looks like the Simple Kriging estimator except that
      usually we would assume that all the mu's are the same. This connection is
      exploited by Journel in his 1980 paper in Math Geology discussing the bias
      correction for lognormal kriging.
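
      As a small illustration of that linear form, here is a simple kriging
      sketch; the exponential covariance model, the sample locations and values,
      and the known mean are all assumed purely for illustration.

import numpy as np

# Simple kriging: Z* = mu + sum_i a_i (Z_i - mu), with weights a from C a = c0.
# The covariance model, data and mean below are illustrative assumptions only.

def cov(h, sill=1.0, rng=10.0):
    """Exponential covariance model C(h) = sill * exp(-h / rng)."""
    return sill * np.exp(-np.asarray(h, dtype=float) / rng)

mu = 2.0                                  # known (assumed) constant mean
x = np.array([1.0, 4.0, 9.0])             # 1-D sample locations
z = np.array([2.3, 1.8, 2.6])             # sample values
x0 = 5.0                                  # location to be estimated

C = cov(np.abs(x[:, None] - x[None, :]))  # sample-to-sample covariances
c0 = cov(np.abs(x - x0))                  # sample-to-target covariances

a = np.linalg.solve(C, c0)                # simple kriging weights a_i
z_sk = mu + a @ (z - mu)                  # the linear conditional-expectation form
var_sk = cov(0.0) - a @ c0                # simple kriging variance
print(z_sk, var_sk)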

      7. In kriging the form of the estimator is assumed, i.e., Simple kriging vs
      Ordinary/Universal kriging. IN PARTICULAR IT IS ASSUMED TO BE A LINEAR
      FUNCTION OF THE DATA. There is no distributional assumption (although there
      are authors who, from time to time, keep saying there is a multivariate
      Gaussian assumption). There is some form of stationarity assumption, although
      this is primarily used to justify estimating and modeling the
      covariance/variogram from the data. Two conditions are imposed on the
      coefficients in the estimator: unbiasedness and minimal estimation variance.
      These, together with the linearity assumption, are sufficient to derive the
      kriging equations; they are analogous to the regression equations but not
      exactly the same. The kriging variance is the minimized estimation variance
      (obtained from the specified covariance/variogram model).
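
      To make those two conditions concrete, here is a sketch of the ordinary
      kriging system in covariance form: the unbiasedness (sum-to-one) constraint
      borders the covariance matrix with ones and adds a Lagrange multiplier, and
      the kriging variance falls out of the solved weights. The covariance model,
      locations and values are assumed purely for illustration.

import numpy as np

# Ordinary kriging in covariance form: the sum-to-one (unbiasedness) constraint
# borders the covariance matrix with ones and adds a Lagrange multiplier.
# Covariance model, sample locations and values are illustrative only.

def cov(h, sill=1.0, rng=10.0):
    return sill * np.exp(-np.asarray(h, dtype=float) / rng)

x = np.array([1.0, 4.0, 9.0])               # sample locations
z = np.array([2.3, 1.8, 2.6])               # sample values
x0 = 5.0                                    # location to be estimated
n = len(x)

# Augmented system: [C 1; 1' 0] [w; mu] = [c0; 1]
A = np.ones((n + 1, n + 1))
A[:n, :n] = cov(np.abs(x[:, None] - x[None, :]))
A[n, n] = 0.0
b = np.append(cov(np.abs(x - x0)), 1.0)

sol = np.linalg.solve(A, b)
w, mu_lagrange = sol[:n], sol[n]            # weights (sum to one) and multiplier
z_ok = w @ z                                # ordinary kriging estimate
var_ok = cov(0.0) - w @ b[:n] - mu_lagrange # minimized estimation (kriging) variance
print(z_ok, var_ok, w.sum())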

      Now to your questions: "slope" is only going to really make sense if there
      is only one regressor variable, i.e., the regression equation is not only
      linear but has only one variable. While the kriging equations will work with
      only one data point, one would usually not restrict it to that.

      Kriging can be used in two general forms, point estimation and "block"
      estimation. The form of the estimator is the same, but for "block" estimation
      there is a modification that accounts for the change in spatial correlation
      resulting from a change in support. There really isn't an analogue of this
      for regression (as an example, however, of an attempt to do this see
      DeVerle Harris and D. E. Myers, 1984, World Oil Resources: A Statistical
      Perspective, in Advances in Energy Systems and Technology, Vol. 4, Academic
      Press).


      It is also possible to use "non-point" grades; to that extent you might
      think of the "slope" as relating "true" to "estimated" grades, but unless
      you use a unique search neighborhood the "slope" will change from estimate
      to estimate.

      All of the above is probably a bit long-winded, but my point is that the
      connection between kriging and regression is not a simple one; it has
      multiple facets.

      Donald E. Myers
      http://www.u.arizona.edu/~donaldm

      Isobel Clark

      >1. Kriging is often cited as a least squares regression method
      Simple kriging, as invented by Danie Krige in the
      1950s, was exactly a linear regression method. The
      'kriging' system has a 'left-hand side' consisting of
      the variance/co-variance matrix between sample pairs
      and a 'right-hand side' consisting of the co-variances
      between each sample and the unknown value. Krige
      derived the variances and co-variances empirically
      from 50 years of historical data.

      In the early 1960s, Matheron's work put this on a
      modelling (theoretical) footing by suggesting that the
      co-variances could be modelled by a function -- the
      semi-variogram reversed. Thus the l.h.s. became
      co-variances or semi-variograms, depending on your
      personal preference, and likewise the r.h.s.

      However, Matheron also introduced the notion that the
      weights should add up to one and invented 'ordinary'
      kriging which is not (strictly) classical linear
      regression.
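
      A small sketch of both points, with a purely illustrative model and data:
      the l.h.s. and r.h.s. can be written either with covariances or with
      semi-variograms ("the semi-variogram reversed"), and with the sum-to-one
      constraint of ordinary kriging the two forms give the same weights.

import numpy as np

# Ordinary kriging weights are the same whether the system is written with
# covariances C(h) or semi-variograms gamma(h) = sill - C(h).
# The model, sample locations and target location are illustrative only.

sill, rng = 1.0, 10.0
cov = lambda h: sill * np.exp(-h / rng)      # covariance model
gam = lambda h: sill - cov(h)                # "the semi-variogram reversed"

x = np.array([1.0, 4.0, 9.0])                # sample locations
x0 = 5.0                                     # unknown (target) location
n = len(x)
H = np.abs(x[:, None] - x[None, :])          # sample-to-sample distances (l.h.s.)
h0 = np.abs(x - x0)                          # sample-to-target distances (r.h.s.)

def ok_weights(lhs, rhs):
    """Solve the ordinary kriging system with the sum-to-one constraint."""
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = lhs
    A[n, n] = 0.0
    return np.linalg.solve(A, np.append(rhs, 1.0))[:n]

print(ok_weights(cov(H), cov(h0)))           # covariance form
print(ok_weights(gam(H), gam(h0)))           # semi-variogram form: same weights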

      >2. I read in several papers that it is possible to
      >calculate the slope of regression from parameters of the kriging system.
      All explained in my 1983 paper 'regression revisited'
      in Mathematical Geology. I can send you a copy if you
      can't find it or you can download it from:

      http://uk.geocities.com/drisobelclark/resume/Publications.html

      (note capital P)

      Isobel Clark
      http://geoecosse.bizland.com/whatsnew.htm


