# DemingRegressor

`mcup.deming.DemingRegressor`

Bases: `BaseRegressor`
Regression estimator using Deming (total least squares) joint optimisation.

Optimises jointly over the model parameters and the latent true x values, giving an exact treatment of both x and y measurement errors. Slower than `XYWeightedRegressor`, but more accurate when x errors are large or the model is strongly nonlinear.
Supports two solvers, selected via the `method` argument:

- `"analytical"`: joint optimisation with `(J^T W J)^{-1}` covariance.
- `"mc"`: Monte Carlo sampling with Welford online covariance (default; robust for nonlinear models).
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `func` | `Callable` | Model function to fit. | *required* |
| `method` | `str` | Solver to use, either `"analytical"` or `"mc"`. | `'mc'` |
| `n_iter` | `int` | Maximum number of Monte Carlo iterations. | `10000` |
| `rtol` | `Optional[float]` | Relative tolerance for MC convergence stopping. | `None` |
| `atol` | `Optional[float]` | Absolute tolerance for MC convergence stopping. | `None` |
| `optimizer` | `str` | SciPy optimizer name used for parameter fitting. | `'BFGS'` |
Attributes:

| Name | Description |
|---|---|
| `params_` | Fitted parameter array. |
| `params_std_` | Standard deviations of fitted parameters. |
| `covariance_` | Full parameter covariance matrix. |
| `n_iter_` | Actual number of MC iterations run (MC method only). |
Notes

`X` may have shape `(n,)` for scalar input or `(n, k)` for k-dimensional input per data point. `x_err` must match `X.shape` exactly; set a column to zero for any feature that is known exactly. Zero-error features are pinned to their observed values inside the joint optimisation and excluded from the latent-variable cost to avoid division by zero.
Source code in mcup/deming.py