
Commit 1fd8187
enh: add book chapter reference
1 parent 553a1cb

1 file changed (+3, -1 lines)


docs/assets/BrainHack2024/index.html

Lines changed: 3 additions & 1 deletion
@@ -328,14 +328,16 @@
 # Why standardizing?
 
 .boxed-content[
-.distribute.large[
+.distribute[
 * Increase reliability -- classic test theory:
   * Repetition
   * Standardization
 
 * Other
   * Pushing the *truck factor* above 1.0.
   * Engage users
+
+*Standardized Preprocessing in Neuroimaging: Enhancing Reliability and Reproducibility* doi:[10.31219/osf.io/42bsu](https://doi.org/10.31219/osf.io/42bsu), chapter 3.1 of *Methods for analyzing large neuroimaging datasets* doi:[10.17605/OSF.IO/D9R3X](https://doi.org/10.17605/OSF.IO/D9R3X) edited by R. Whelan and H. Lemaître.
 ]]
 
 ---
