faster Cox, fix to tests #43


Open: wants to merge 205 commits into base: master. Changes from all commits (205 commits).
b6dd8ae
updating .travis.yml to do a doc build
jonathan-taylor Sep 25, 2019
7ce3837
trying to fix .travis.yml
jonathan-taylor Sep 25, 2019
3ea185b
full model simulation
jonathan-taylor Sep 25, 2019
f01f3c0
BF: fixing seed in some tests that can fail randomly
jonathan-taylor Sep 25, 2019
a1879b2
added Lee and ROSI examples to docs
jonathan-taylor Sep 25, 2019
93636b1
updating install instructions
jonathan-taylor Sep 25, 2019
6dc72a4
change of title
jonathan-taylor Sep 25, 2019
f3c5a36
updates to python scripts to selectinf
Nov 21, 2019
c5052da
C code for update for Cox partial likelihood
jonathan-taylor Feb 22, 2020
32502b3
updated C software for cox, wrapper
jonathan-taylor Feb 22, 2020
3a41a6b
minor fixes, added Cox code
jonathan-taylor Feb 22, 2020
d6a1aa5
merging
jonathan-taylor Feb 22, 2020
5a89aa3
updating C, cleanup cox
jonathan-taylor Feb 22, 2020
74098c7
added case weights but results don't quite agree with R
jonathan-taylor Feb 22, 2020
94e2c6a
updated cox code
jonathan-taylor Feb 22, 2020
ffbc2e1
updating C software
jonathan-taylor Apr 2, 2020
b35e935
fixing imports
jonathan-taylor Apr 2, 2020
0b7d566
some regreg changes to incorporate, more np decorators
jonathan-taylor Apr 2, 2020
9b6a065
one more missing import
jonathan-taylor Apr 2, 2020
82bc491
fix to travis yaml
jonathan-taylor Apr 2, 2020
357cb4c
comment causing problem in travis
jonathan-taylor Apr 2, 2020
38eb998
older version of glmnet for older version of R
jonathan-taylor Apr 2, 2020
798718a
forcing to be an ndarray
jonathan-taylor Apr 2, 2020
f552441
fixing version of glmnet
jonathan-taylor Apr 2, 2020
048b18c
try doc build with 3.6
jonathan-taylor Apr 2, 2020
2808d30
removing doc build for now
jonathan-taylor Apr 2, 2020
44bdd2b
py35 build on appveyor failing for pandas / cython issue
jonathan-taylor Apr 2, 2020
245411a
added class for posterior sampling
snigdhagit Apr 16, 2020
11e3e1e
Merge branch 'snigdhagit-master' into test-snigdha-merge
jonathan-taylor Apr 16, 2020
0f5e4d6
reverting imports to original form, attributes can be found in other …
jonathan-taylor Apr 16, 2020
55863df
Merge branch 'test-snigdha-merge'
jonathan-taylor Apr 16, 2020
da2fcb3
corrected prior
snigdhagit Apr 18, 2020
f2c51eb
able to use degenerate gaussian randomization for e.g. followup LASSO
jonathan-taylor Apr 21, 2020
9854fe3
added a local scaling for the Langevin sampler
snigdhagit May 2, 2020
5b74daf
added prior var in test
snigdhagit May 2, 2020
58e9d85
added both samplers to posterior inference
snigdhagit May 3, 2020
202ee2e
fixed subgradient in split lasso
snigdhagit May 4, 2020
bfc82d0
posterior samplers-- some changes
snigdhagit May 17, 2020
fa23e89
added test instances
snigdhagit Jun 22, 2020
19bbd55
some cleanup
jonathan-taylor Jun 23, 2020
3932618
using data frame summary output
jonathan-taylor Jun 23, 2020
af7705a
drop the losers query
jonathan-taylor Jun 23, 2020
657a212
make sure we get no 0-sized samples
jonathan-taylor Jun 23, 2020
0b7d8a3
import integer division
jonathan-taylor Jun 23, 2020
264174f
Merge pull request #47 from jonathan-taylor/degenerate_random
jonathan-taylor Jun 23, 2020
019ce94
RF: worked on api for posterior sampling, merged snigdha's changes
jonathan-taylor Jun 23, 2020
db08c33
Merge branch 'snigdhagit-master'
jonathan-taylor Jun 23, 2020
5d0b128
added code for selective_mle related outputs
snigdhagit Jun 24, 2020
2cf0e52
added plots to the examples-MLE file
snigdhagit Jun 24, 2020
c91670c
including sim_xy directly in script
jonathan-taylor Jun 24, 2020
7aab2a9
renaming columns in output
jonathan-taylor Jun 24, 2020
0d3b6a5
updates on mle script
jonathan-taylor Jun 24, 2020
cc911ed
Merge branch 'master' into snigdhagit-master
jonathan-taylor Jun 24, 2020
e77d947
adding a dispersion option to Lee et al lasso
jonathan-taylor Jun 25, 2020
c3f38ab
a few minor changes to test_cv_mle script -- to be replaced by an exa…
jonathan-taylor Jun 25, 2020
c3b1deb
Merge branch 'snigdhagit-master'
jonathan-taylor Jun 25, 2020
7fd336a
script replaced by a notebook in compare-selection
jonathan-taylor Jun 25, 2020
7835ac8
fixinf change of pval to pvalue
jonathan-taylor Jun 25, 2020
408f26e
change of lower/upper to lower_confidence/upper_confidence
jonathan-taylor Jun 25, 2020
9159043
commit changes to test_mle
snigdhagit Jun 25, 2020
6f546e9
add approx log reference
snigdhagit Jun 29, 2020
309306c
added test for pivot b.o. approx reference
snigdhagit Jun 29, 2020
b35377e
added approximate ci b.o. approx reference
snigdhagit Jun 30, 2020
9223ccc
sigma instead of sigma_sq while setting scale parameter of posterior …
snigdhagit Jun 30, 2020
4b77d8a
added hiv test: carved posterior interval estimates
snigdhagit Jul 2, 2020
8f4e612
changing order of output of log posterior
jonathan-taylor Jul 8, 2020
56f9ba6
BF: changing prior as well
jonathan-taylor Jul 8, 2020
ae211d7
BF: two more priors needed changing
jonathan-taylor Jul 8, 2020
93bfded
matching default prior
jonathan-taylor Jul 8, 2020
c8ed7cb
using discrete family for approximate grid inference
jonathan-taylor Jul 8, 2020
28fcb50
code cleanup for readability
jonathan-taylor Jul 8, 2020
7042db4
rename selective MLE method due to signature conflict
jonathan-taylor Jul 8, 2020
1da37cf
unused grid arguments
jonathan-taylor Jul 8, 2020
2830262
moved gaussian query specific methods to that class
jonathan-taylor Jul 8, 2020
50017cb
Merge branch 'snigdhagit-master'
jonathan-taylor Jul 8, 2020
2da4d06
cleanup of approx reference code
jonathan-taylor Jul 8, 2020
ae7c0de
BF: remove statsmodells dependency
jonathan-taylor Jul 8, 2020
07cfc9e
commit before switch
snigdhagit Jul 11, 2020
69713fd
test bias
snigdhagit Jul 13, 2020
54478ed
commit before switch
Jan 9, 2021
e45a42e
MCMC free pivots for group lasso
Feb 3, 2021
bc2107e
commit changes before switch
Feb 25, 2021
c04d7a6
BF: misnamed columns
jonathan-taylor Mar 2, 2021
fc9b0a3
BF: renaming of module
jonathan-taylor Mar 2, 2021
f9a1978
fix some warnings about literal comparison
jonathan-taylor Mar 2, 2021
5df425b
standalone functions for lasso inference; added nongaussian split-las…
jonathan-taylor Mar 16, 2021
8b595e1
signature of logistic instance
jonathan-taylor Mar 16, 2021
4dc9701
Merge branch 'snpnet_lasso'
jonathan-taylor Mar 16, 2021
cffdb0f
fixing approxiamte reference tests
jonathan-taylor Mar 16, 2021
13b8d2b
changing docstring
jonathan-taylor Mar 16, 2021
66fda01
fix nan signs
jonathan-taylor Mar 16, 2021
7926ad4
ensuring Gaussian mle is scale invariant; testing mle for other famil…
jonathan-taylor Mar 16, 2021
812d4e1
new pivots based on exact reference
Apr 19, 2021
71bf5f1
added test for ci
Apr 19, 2021
c8d3190
adding level argument for approximate reference
jonathan-taylor Apr 28, 2021
80c0cd8
tests for new equicorrelated instance
Apr 29, 2021
8cd202b
commit changes
May 16, 2021
18033ae
test to compare unbiased estimates
May 18, 2021
b133174
updates to test for unbiased est
May 24, 2021
93d1a67
commit changes before switch
May 24, 2021
a4c47ba
MLE code updated
Jun 7, 2021
24ab71c
updated posterior inference
Jun 7, 2021
7062cd3
updates to approx_reference
Jun 7, 2021
93e808f
removed interp1d for now to compute reference on a grid
Jun 13, 2021
43d78d2
added option to use interp1d
Jun 14, 2021
0b623f9
updated tests
Jun 14, 2021
58792c8
added barrier affine
Jun 14, 2021
9011fcc
fixed a sign
Jun 17, 2021
9e47c05
update to test
Jun 27, 2021
05d08e9
modified mle and reference code for group lasso
Jun 28, 2021
5208437
removing reparam_map
jonathan-taylor Jul 6, 2021
ccfeb0f
approx reference test
jonathan-taylor Jul 6, 2021
d753a92
removing unused code
jonathan-taylor Jul 6, 2021
13994bf
renaming logdens_linear
jonathan-taylor Jul 12, 2021
6a9a901
rename target_linear->score_decomp, target_offset->score_resid
jonathan-taylor Jul 12, 2021
7fc46cf
some more renaming
jonathan-taylor Jul 12, 2021
175cae7
finished rename, and rewrite in terms of regression parameters for LASSO
jonathan-taylor Jul 13, 2021
6b93957
doc describing rename
jonathan-taylor Jul 13, 2021
dd4597a
small comment
jonathan-taylor Jul 13, 2021
82cb60d
computing M1, M2, M3 within query, so data splitting runs now
jonathan-taylor Jul 13, 2021
d91f463
update doc
jonathan-taylor Jul 13, 2021
a762a48
commit changes so far
Jul 19, 2021
7856e7a
commit before switch
Jul 19, 2021
73137f8
some sign fixes
Jul 19, 2021
5449179
commit before switch
Jul 20, 2021
cce962e
regress_target_score scaled by dispersion
Jul 20, 2021
cb73217
changes to selective mle + target: added comments
Jul 20, 2021
e99d75e
updated query: moved calculations for M1, M2, M3
Jul 25, 2021
071a143
commit before switch
Jul 25, 2021
66a388e
commit before switch
Jul 25, 2021
197e927
update posterior inf
Jul 26, 2021
275c953
scaled M1, M2, M3 with dispersion
Jul 26, 2021
56ca78f
created all necessary objects
Jul 26, 2021
f826098
cleaned up some more
Jul 27, 2021
2edd860
compare branches in progress
Jul 27, 2021
ebbda3e
compare branches in progress
Jul 27, 2021
02d0d6d
compare branches in progress
Jul 27, 2021
d61fd1f
compare branches in progress
Jul 27, 2021
8c386d8
some more tests
Jul 27, 2021
9ea85fa
updated exact ref
Aug 3, 2021
2b72a5f
check in progress: w master
Aug 8, 2021
5eadfe6
updated approx reference
Aug 9, 2021
bb2802c
clean up for all the tests
Aug 10, 2021
40ac8ab
delete some comments
jonathan-taylor Aug 18, 2021
9f67754
making dispersion default to 1 optional, fixing tests to use smaller …
jonathan-taylor Aug 18, 2021
b571985
suppressing dispersion to _setup_implied_gaussian; putting back in te…
jonathan-taylor Aug 18, 2021
f580e93
removing dispersion from selective mle
jonathan-taylor Aug 18, 2021
e578753
removing dispersion where possible
jonathan-taylor Aug 18, 2021
51918b2
WIP: moving targets to base module; fixing slope and screening
jonathan-taylor Aug 18, 2021
048474b
fix target calls in multiple queries
jonathan-taylor Aug 18, 2021
707c2e9
Merge branch 'snigdhagit-refactor_names' into refactor_names
jonathan-taylor Aug 18, 2021
3d3f778
BF: fixing handling of dispersion
jonathan-taylor Aug 18, 2021
c8eca0a
using NamedTuple for target specification as this arg appears over an…
jonathan-taylor Aug 24, 2021
3320bd8
test scale for posterior log likelihood
Aug 31, 2021
5fb9c81
commit changes before switch
Sep 9, 2021
68b1aa4
fix alignment of _setup_sampler
Sep 21, 2021
2ac3f27
added setup_inference post fit: can pass dispersion argument
Oct 3, 2021
990152c
update tests-- selected targets (mle)
Oct 4, 2021
add1888
removing dispersion from list returned by target forming functions
Oct 4, 2021
afa029e
give dispersion as an argument for posterior class: needs it for samp…
Oct 4, 2021
e37d0c2
add setup_inference to split_lasso, lasso
Oct 4, 2021
66e4e40
updated tests: for mle
Oct 4, 2021
4c66570
updated tests for posterior inference
Oct 4, 2021
a828983
updated tests for approx and exact reference
Oct 4, 2021
77b0650
all tests pass
Oct 5, 2021
076137f
Merge pull request #56 from snigdhagit/refactor_names
jonathan-taylor Oct 19, 2021
7fd4401
deleted unused methods for sampling from query
Nov 1, 2021
d02f25b
some more clean up
Nov 1, 2021
15aaa6e
added class for MLE based inference
Nov 3, 2021
56b0902
other changes
Nov 3, 2021
91353fc
some more clean up for query
Nov 4, 2021
5e448a1
removed regress_opt from return list
Nov 4, 2021
6f38200
changed some names of variables in posterior: for consistency
Nov 4, 2021
69e7dd1
changed some names of variables: for consistency
Nov 4, 2021
cd730d6
some more name changes for variables
Nov 5, 2021
eb0d16e
removed regress_opt from return list; some more consistency fixes
Nov 6, 2021
b295db6
Merge pull request #57 from snigdhagit/refactor_names
jonathan-taylor Nov 9, 2021
4418bf7
a little reorg -- one method for inference
jonathan-taylor Nov 17, 2021
ffd89dd
BF: fixing method name
jonathan-taylor Nov 17, 2021
e7a1c4a
more cleanup, added QuerySpec named tuple
jonathan-taylor Nov 17, 2021
a7704a4
more cleanup of selective_MLE
jonathan-taylor Nov 17, 2021
7bbcf90
specification as a property; use QS instead of Q; standardizing grid_…
jonathan-taylor Nov 17, 2021
559bc96
simplifying grid methods
jonathan-taylor Nov 17, 2021
aea5df8
renaming temporary matrices
jonathan-taylor Nov 17, 2021
8fffbb2
more cleanup; remains to unify U1,U2,U3... calculation across 4 methods
jonathan-taylor Nov 18, 2021
53645b7
adjusted cov in split_lasso; adjusted intervals for actual target in …
Nov 23, 2021
dad81a2
minor fix in return list
Nov 25, 2021
34dc78a
Merge pull request #58 from snigdhagit/refactor_names
jonathan-taylor Nov 29, 2021
40cd77b
fixing approx_reference
jonathan-taylor Nov 30, 2021
1508e06
refactor SLOPE to new form; added split_slope
jonathan-taylor Dec 7, 2021
8d05d91
moved U1-5 calculations from methods to base
Dec 14, 2021
dd1bae0
Merge pull request #59 from snigdhagit/refactor_names
jonathan-taylor Jan 11, 2022
a63dae6
WIP: screeening
jonathan-taylor Jan 11, 2022
a481857
removed redundant interaction functions; have the U quantities comput…
jonathan-taylor Jan 11, 2022
a425869
bootstrap lasso version
jonathan-taylor Dec 14, 2022
74369ba
updating docs
jonathan-taylor Apr 25, 2023
4aee7ad
update requirements, remove some np.float
jonathan-taylor Apr 25, 2023
442ca90
Merge branch 'bootstrap'
jonathan-taylor Apr 25, 2023
14f022d
update python version for readthedocs
jonathan-taylor Apr 25, 2023
5552303
adding regreg as a submodule
jonathan-taylor Apr 25, 2023
57da659
trying to build regreg first
jonathan-taylor Apr 25, 2023
f503c5c
trying path of regreg
jonathan-taylor Apr 25, 2023
6d2260d
trying again
jonathan-taylor Apr 25, 2023
456dcb8
using URL in requirements file
jonathan-taylor Apr 25, 2023
e448111
trying again
jonathan-taylor Apr 25, 2023
10 changes: 8 additions & 2 deletions .readthedocs.yml
@@ -18,12 +18,18 @@ sphinx:
#formats: all

# Optionally set the version of Python and requirements required to build your docs

python:
version: 3.6
version: 3.8
install:
- requirements: https://raw.githubusercontent.com/jonathan-taylor/regreg/master/requirements.txt
- method: pip
path: https://github.com/jonathan-taylor/regreg.git
- requirements: requirements.txt
- requirements: doc-requirements.txt
- requirements: doc/requirements.txt
- method: setuptools
path: .

submodules:
include: all

24 changes: 19 additions & 5 deletions .travis.yml
@@ -3,6 +3,7 @@ dist: trusty
python:
- 2.7
- 3.5
- 3.6
notifications:
email: false
addons:
@@ -69,6 +70,7 @@ matrix:
env:
- INSTALL_TYPE=requirements
- DEPENDS=

before_install:
- source travis-tools/utils.sh
- travis_before_install
@@ -87,8 +89,9 @@ install:
- if [ "$RUN_R_TESTS" ]; then
sudo apt-get install -y r-base r-base-dev r-cran-devtools r-cran-rcpp;
pip install rpy2 statsmodels -c constraints.txt ;
Rscript -e "library(Rcpp); Rcpp::compileAttributes('selectiveInference')";
sudo Rscript -e "install.packages(c('glmnet', 'intervals', 'adaptMCMC', 'SLOPE', 'knockoff'), repos='http://cloud.r-project.org')";
sudo Rscript -e "install.packages(c('devtools', 'intervals', 'adaptMCMC', 'SLOPE'), repos='http://cloud.r-project.org')";
sudo Rscript -e "require(devtools); install_version('glmnet', version='2.0-18', repos='http://cloud.r-project.org')";
sudo Rscript -e "install.packages('knockoff', repos='http://cloud.r-project.org')";
git clone https://github.com/jonathan-taylor/R-selective.git;
cd R-selective;
git submodule init;
@@ -107,11 +110,22 @@ script:
# No figure windows for mpl; quote to hide : from travis-ci yaml parsing
- pip install -r requirements.txt -c constraints.txt; # older rpy2
# Change into an innocuous directory and find tests from installation
- mkdir for_testing
- cd for_testing
- 'echo "backend : agg" > matplotlibrc'

- |
if [ "$DOC_BUILD" ]; then
pip install -r doc-requirements.txt;
cd doc;
jupytext --sync source/*/*.ipynb;
sudo apt-get install pandoc;
make html;
fi
#
# # Build the htmlwithout the API documentation, for the doctests
#
# fi
# Doctests only on platforms that have compatible fp output
- mkdir for_testing
- cd for_testing
- if [ `uname` == "Darwin" ] ||
[ "${TRAVIS_PYTHON_VERSION:0:1}" == "3" ]; then
DOCTEST_ARGS="--with-doctest";
2 changes: 1 addition & 1 deletion C-software
Submodule C-software updated 2 files
+245 −0 src/cox_fns.c
+85 −0 src/cox_fns.h
14 changes: 8 additions & 6 deletions appveyor.yml
@@ -28,9 +28,6 @@ environment:
- PYTHON: C:\Python36-x64
NP_BUILD_DEP: "1.13.3"
NP_TEST_DEP: "1.13.3"
- PYTHON: C:\Python35-x64
NP_BUILD_DEP: "1.13.3"
NP_TEST_DEP: "1.13.3"

- PYTHON: C:\Python37
NP_BUILD_DEP: "1.14.5"
@@ -39,9 +36,14 @@
- PYTHON: C:\Python36
NP_BUILD_DEP: "1.13.3"
NP_TEST_DEP: "1.13.3"
- PYTHON: C:\Python35
NP_BUILD_DEP: "1.13.3"
NP_TEST_DEP: "1.13.3"

# problem with pandas + cython for py35
# - PYTHON: C:\Python35-x64
# NP_BUILD_DEP: "1.13.3"
# NP_TEST_DEP: "1.13.3"
# - PYTHON: C:\Python35
# NP_BUILD_DEP: "1.13.3"
# NP_TEST_DEP: "1.13.3"

install:
- cmd: echo "Using cmd"
169 changes: 169 additions & 0 deletions doc/Gaussian queries.Rmd
@@ -0,0 +1,169 @@
---
jupyter:
jupytext:
formats: ipynb,Rmd
text_representation:
extension: .Rmd
format_name: rmarkdown
format_version: '1.2'
jupytext_version: 1.10.2
kernelspec:
display_name: Python 3
language: python
name: python3
---

## KKT conditions

$$
\omega = \nabla \ell(o) + u + \epsilon o.
$$

## Current terms used in selective MLE

- `observed_score_state`: for the LASSO (and for any linear regression) this is $S=-X^TY$; in general it should be
$\nabla \ell(\beta^*) - Q(\beta^*)\beta^*$. Call this $S$.

- `opt_offset`: this is $\hat{u}$ (changed everywhere to `observed_subgrad`)

- `opt_linear`: this is $\nabla^2 \ell(\hat{\beta}) + \epsilon I$ restricted to "selected" subspace, call this $L$

## Rewrite of KKT

$$
\omega = Lo + S + u.
$$
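As a sanity check, for squared-error loss the two forms of the KKT condition agree exactly. A minimal numpy sketch (the variable names here are ours, chosen to mirror the terms above; this is illustrative, not the package's API): with $\ell(\beta)=\tfrac12\|Y-X\beta\|^2$ we have $\nabla\ell(o)=Qo+S$ where $Q=X^TX$ and $S=-X^TY$, so $\omega = \nabla\ell(o)+u+\epsilon o = Lo+S+u$ with $L=Q+\epsilon I$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, eps = 50, 5, 0.5

X = rng.standard_normal((n, p))
Y = rng.standard_normal(n)
o = rng.standard_normal(p)        # stand-in for the optimization variables
u = rng.uniform(-1, 1, p)         # stand-in for the subgradient

Q = X.T @ X                        # Hessian of squared-error loss
S = -X.T @ Y                       # observed_score_state for the LASSO
L = Q + eps * np.eye(p)            # opt_linear

omega_original = (Q @ o + S) + u + eps * o   # grad ell(o) + u + eps * o
omega_rewrite = L @ o + S + u                # L o + S + u

assert np.allclose(omega_original, omega_rewrite)
```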

## More terms in the code

- Randomization precision `randomizer_prec`: call this $\Theta_{\omega}=\Sigma_{\omega}^{-1}$, so $\omega \sim N(0, \Sigma_{\omega})$.

- `cond_cov` $= \Sigma_{o|S,u}$, `cond_mean`, `cond_precision` $= \Sigma_{o|S,u}^{-1}=\Theta_{o|S,u}$:
these describe the implied law of $o|S,u$. They are computed in `_setup_implied_gaussian`. Specifically, we have

$$
\begin{aligned}
\Sigma_{o|S,u} = (L^T\Theta_{\omega} L)^{-1}
\end{aligned}
$$

- `regress_opt` (formerly `logdens_linear`), call this $A$: this is the regression of $o$ onto $S+u$ in the implied
Gaussian given $S,u$, i.e.

$$
E[o|S,u] = A(S+u) = -\Sigma_{o|S,u} L^T \Theta_{\omega}(S+u).
$$

- `cond_mean` is the conditional mean of $o|S,u$ evaluated at observed $S,u$: $A(S+u)_{obs}$. Or, `regress_opt_score(observed_score_state + observed_subgrad)`
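The formulas above fit together as a short numpy sketch (randomly generated matrices, names mirroring the attributes above; illustrative only, assuming $L$ is $p \times E$):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 6   # dimension of omega / score
E = 3   # dimension of the optimization variables o

# random positive-definite randomization precision Theta_omega
A0 = rng.standard_normal((p, p))
randomizer_prec = A0 @ A0.T + p * np.eye(p)

opt_linear = rng.standard_normal((p, E))      # L

# Sigma_{o|S,u} = (L^T Theta_omega L)^{-1}
cond_precision = opt_linear.T @ randomizer_prec @ opt_linear
cond_cov = np.linalg.inv(cond_precision)

# regress_opt: A = -Sigma_{o|S,u} L^T Theta_omega, so E[o|S,u] = A (S + u)
regress_opt = -cond_cov @ opt_linear.T @ randomizer_prec

score_offset = rng.standard_normal(p)         # stand-in for S + u
cond_mean = regress_opt @ score_offset        # E[o|S,u] at the observed S, u

assert np.allclose(cond_cov @ cond_precision, np.eye(E))
assert cond_mean.shape == (E,)
```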


## Target related

- `observed_target, target_cov, target_prec`: not much explanation needed $\hat{\theta}, \Sigma_{\hat{\theta}}, \Theta_{\hat{\theta}} = \Sigma_{\hat{\theta}}^{-1}$

- `target_score_cov`: $\Sigma_{\hat{\theta},S}$

- `regress_target`: regression of target onto score; formally this would be $\Sigma_{\hat{\theta},S}\Theta_S$ (the transpose of the usual way of writing a regression, not in code yet). Let's call it $B$ for now.

- `cov_product`: $\Sigma_S \Theta_{\omega}$: product of score covariance and randomization precision.

- `cov_score`: $\Sigma_S$

- `score_offset = observed_score_state + observed_subgrad`=$S+u$
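A small sketch of the target-related quantities with randomly generated covariances (the names mirror the attributes above, but the construction here is ours):

```python
import numpy as np

rng = np.random.default_rng(2)
p, k = 6, 2   # score dimension, target dimension

A0 = rng.standard_normal((p, p))
cov_score = A0 @ A0.T + p * np.eye(p)            # Sigma_S
target_score_cov = rng.standard_normal((k, p))   # Sigma_{theta-hat, S}

# regress_target: B = Sigma_{theta-hat, S} Theta_S
regress_target = target_score_cov @ np.linalg.inv(cov_score)

# sanity check: regressing back onto the score covariance
# recovers the target/score cross-covariance
assert np.allclose(regress_target @ cov_score, target_score_cov)
```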

### In `selective_MLE`

- `target_linear`: $\Sigma_{S,\hat{\theta}}\Theta_{\hat{\theta}}= \Sigma_S B^T\Theta_{\hat{\theta}}$ (changed name to `regress_score_target`)

- `target_offset`: $S+u-\Sigma_S B^T \Theta_{\hat{\theta}} \hat{\theta} = S+u - \Sigma_{S,\hat{\theta}} \Theta_{\hat{\theta}} \hat{\theta}$ (changed name to `resid_score_target`)

- `target_lin`: $A\Sigma_S B^T \Theta_{\hat{\theta}} = -(L^T\Theta_{\omega}L)^{-1} L^T\Theta_{\omega} \Sigma_S B^T \Theta_{\hat{\theta}}$ (changed name to `regress_opt_target`)

- `target_off`: $A(S+u - \Sigma_S B^T \Theta_{\hat{\theta}} \hat{\theta})$ (changed name to `resid_opt_target`)

- `_P`: $\Theta_{\hat{\theta}} B\Sigma_S \Theta_{\omega} (S+u-\Sigma_S B^T \Theta_{\hat{\theta}} \hat{\theta}) = \Theta_{\hat{\theta}} B\Sigma_S \Theta_{\omega} (S+u) - \Theta_{\hat{\theta}} B\Sigma_S \Theta_{\omega} \Sigma_S B^T \Theta_{\hat{\theta}} \hat{\theta} = \Theta_{\hat{\theta}} B\Sigma_S \Theta_{\omega} (S+u) - \Theta_{\hat{\theta}} B\Sigma_S \Theta_{\omega} \Sigma_{\omega} \Theta_{\omega} \Sigma_S B^T \Theta_{\hat{\theta}} \hat{\theta} $.
Let's call `_P` $\xi$

- `_prec`: $\Theta_{\hat{\theta}} + \Theta_{\hat{\theta}} B\Sigma_S \Theta_{\omega} \Sigma_S B^T \Theta_{\hat{\theta}}
- \Theta_{\hat{\theta}} B \Sigma_S A^T \Theta_{o|S,u} A \Sigma_S B^T \Theta_{\hat{\theta}}$

- `C`: something that can be computed with all of the above... I guess (but am not sure) that `_prec` is
the precision of the (best case, no-selection) unbiased estimate of our target when we condition on $S,u$
- More precisely,

$$
\begin{aligned}
\Theta_{\hat{\theta}} C &= \xi + (A\Sigma_S B^T \Theta_{\hat{\theta}})^T L^T \Theta_{\omega} L (A\Sigma_S B^T \Theta_{\hat{\theta}})^T \hat{\theta} - (A\Sigma_S B^T \Theta_{\hat{\theta}})^T L^T \Theta_{\omega} L A(S+u) \\
&= \xi + \Theta_{\hat{\theta}}B \left(\Sigma_S A^T L^T\Theta_{\omega} L A \Sigma_S B^T \Theta_{\hat{\theta}} \hat{\theta} - \Sigma_S A^T L^T\Theta_{\omega} L A(S+u) \right) \\
&= \xi + \Theta_{\hat{\theta}}B \left(\Sigma_S \Theta_{\omega} L (L^T\Theta_{\omega} L)^{-1} L^T \Theta_{\omega} \Sigma_S B^T \Theta_{\hat{\theta}} \hat{\theta} + \Sigma_S \Theta_{\omega}L A(S+u) \right) \\
\end{aligned}
$$

The expression $A(S+u)$ is `cond_mean` and the other term can be computed straightforwardly. We've used the fact
$$
A\Sigma_S = -\Sigma_{o|S,u}L^T\Theta_{\omega} \Sigma_S =- (L^T\Theta_{\omega}L)^{-1}L^T\Theta_{\omega}\Sigma_S
$$

<!-- #region -->



- Don't know what to sensibly call the last three things... but `_P` and `_prec` are the arguments to the
optimization problem, so these are what need computing. I did change `_prec` to `prec_target_nosel`.

- `cov_target.dot(regress_opt_target.T.dot(prec_opt))`. This is

$$-\Sigma_{\hat{\theta}} \Theta_{\hat{\theta}}B \Sigma_S\Theta_{\omega} L (L^T\Theta_{\omega}L)^{-1} (L^T\Theta_{\omega} L) = -B \Sigma_S\Theta_{\omega} L$$

- `regress_opt_target.T.dot(prec_opt)`. This is

$$-\Theta_{\hat{\theta}}B \Sigma_S\Theta_{\omega} L (L^T\Theta_{\omega}L)^{-1} (L^T\Theta_{\omega} L) = -\Theta_{\hat{\theta}} B \Sigma_S\Theta_{\omega} L$$

- `regress_opt_target.T.dot(prec_opt).dot(regress_opt_target)`: This is

$$
\Theta_{\hat{\theta}}B \Sigma_S\Theta_{\omega} L (L^T\Theta_{\omega}L)^{-1} L^T\Theta_{\omega} \Sigma_S B^T \Theta_{\hat{\theta}}
$$
<!-- #endregion -->

### Computational considerations?


#### Case 1: $\Theta_{\omega}^{1/2}$ is known


Another potential downside to all this is that these matrices will generally be $p \times p$. I think in `price_of_selection` I had written some way of doing part of this without having to form all of these matrices
explicitly. However, the difference of the last two matrices in `_prec` can be computed (if we know $\Sigma_{\omega}^{\pm 1/2}$) as identity minus a rank-$E$ matrix, I think, and
$$
A^T\Theta_{o|S,u}A = \Theta_{\omega} L \Sigma_{o|S,u} L^T \Theta_{\omega}
$$
so we want to compute
$$
\Theta_{\omega} - \Theta_{\omega} L \Sigma_{o|S,u} L^T \Theta_{\omega} = \Theta_{\omega}^{1/2}(P - \Theta_{\omega}^{1/2}L (L^T\Theta_{\omega} L)^{-1} L^T\Theta_{\omega}^{1/2}) \Theta_{\omega}^{1/2}
$$
with $P$ the projection onto $\text{row}(\Sigma_{\omega})$. So we need to compute a projection onto an $E$-dimensional
subspace of $\text{row}(\Sigma_{\omega})$. Morally, this makes sense even if $\Sigma_{\omega}$ is not full rank, but it seems a little sketchy.

We might also try computing
$$
\begin{aligned}
\Sigma_S\Theta_{\omega}\Sigma_S - \Sigma_S\Theta_{\omega} L \Sigma_{o|S,u} L^T \Theta_{\omega} \Sigma_S &= \Sigma_S \Theta_{\omega}^{1/2}(P - \Theta_{\omega}^{1/2}L (L^T\Theta_{\omega} L)^{-1} L^T\Theta_{\omega}^{1/2}) \Theta_{\omega}^{1/2} \Sigma_S \\
&= \Sigma_S \Theta_{\omega} \Theta_{\omega}^{-1/2}(P - \Theta_{\omega}^{1/2}L (L^T\Theta_{\omega} L)^{-1} L^T\Theta_{\omega}^{1/2}) \Theta_{\omega}^{-1/2} \Theta_{\omega} \Sigma_S \\
&= \Sigma_S \Theta_{\omega} \Sigma_{\omega}^{1/2}(P - \Theta_{\omega}^{1/2}L (L^T\Theta_{\omega} L)^{-1} L^T\Theta_{\omega}^{1/2}) \Sigma_{\omega}^{1/2} \Theta_{\omega} \Sigma_S \\
&= \Sigma_S \Theta_{\omega} (\Sigma_{\omega} - PL (L^T\Theta_{\omega} L)^{-1} L^TP) \Theta_{\omega} \Sigma_S \\
&= \Sigma_S \Theta_{\omega} (\Sigma_{\omega} - L (L^T\Theta_{\omega} L)^{-1} L^T) \Theta_{\omega} \Sigma_S \\
\end{aligned}
$$
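When $\Sigma_{\omega}$ is full rank the identity can be checked numerically. A sketch under that assumption (random positive-definite matrices, $L$ taken to be $p \times E$ so the products conform; names are ours):

```python
import numpy as np

rng = np.random.default_rng(4)
p, E = 6, 3

A0 = rng.standard_normal((p, p))
cov_rand = A0 @ A0.T + p * np.eye(p)      # Sigma_omega (full rank here)
prec_rand = np.linalg.inv(cov_rand)       # Theta_omega

B0 = rng.standard_normal((p, p))
cov_score = B0 @ B0.T + p * np.eye(p)     # Sigma_S

L = rng.standard_normal((p, E))
cond_cov = np.linalg.inv(L.T @ prec_rand @ L)   # Sigma_{o|S,u}

# left side: Sigma_S Theta Sigma_S - Sigma_S Theta L Sigma_{o|S,u} L^T Theta Sigma_S
lhs = (cov_score @ prec_rand @ cov_score
       - cov_score @ prec_rand @ L @ cond_cov @ L.T @ prec_rand @ cov_score)

# right side: Sigma_S Theta (Sigma_omega - L (L^T Theta L)^{-1} L^T) Theta Sigma_S
rhs = (cov_score @ prec_rand
       @ (cov_rand - L @ cond_cov @ L.T)
       @ prec_rand @ cov_score)

assert np.allclose(lhs, rhs)
```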

## Three matrices

- All the computations above can be expressed in terms of some target-specific information, like $B, \Theta_{\hat{\theta}}, \Sigma_{\hat{\theta}}, \hat{\theta}$, and

$$
\begin{aligned}
M_1 &= \Sigma_S \Theta_{\omega} \\
M_2 &= M_1 \Sigma_{\omega} M_1^T \\
M_3 &= M_1 L (L^T\Theta_{\omega}L)^{-1} L^T M_1^T
\end{aligned}
$$
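A numpy sketch of the three matrices (randomly generated covariances, $L$ taken to be $p \times E$ so the products conform; names are ours, not the package's API). Note that when $\Sigma_{\omega}$ is invertible, $M_2 = \Sigma_S \Theta_{\omega} \Sigma_S$:

```python
import numpy as np

rng = np.random.default_rng(3)
p, E = 6, 3

A0 = rng.standard_normal((p, p))
cov_rand = A0 @ A0.T + p * np.eye(p)      # Sigma_omega (full rank here)
prec_rand = np.linalg.inv(cov_rand)       # Theta_omega

B0 = rng.standard_normal((p, p))
cov_score = B0 @ B0.T + p * np.eye(p)     # Sigma_S

L = rng.standard_normal((p, E))           # opt_linear

M1 = cov_score @ prec_rand
M2 = M1 @ cov_rand @ M1.T
M3 = M1 @ L @ np.linalg.inv(L.T @ prec_rand @ L) @ L.T @ M1.T

# with Sigma_omega invertible, M2 collapses to Sigma_S Theta_omega Sigma_S
assert np.allclose(M2, cov_score @ prec_rand @ cov_score)
assert M3.shape == (p, p)
```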