---
output: github_document
---
<!-- README.md is generated from README.Rmd. Please edit that file -->
```{r, include = FALSE}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>",
  fig.path = "man/figures/README-",
  out.width = "100%"
)
```
# tidyneuro
<!-- badges: start -->
[](https://github.com/neuroimaginador/tidyneuro/actions)
[](https://codecov.io/gh/neuroimaginador/tidyneuro?branch=master)
<!-- badges: end -->
There are many packages in the R ecosystem devoted to different aspects of neuroimaging, several of them listed in the [CRAN Task View: Medical Imaging](https://CRAN.R-project.org/view=MedicalImaging): `oro.nifti`, `oro.dicom`, `RNifti`, `fmri` and `divest` (among others) for reading and writing images; `ANTsR` and `fslr` for more advanced processing methods; `RNiftyReg`, specialized in registration; and `mritc`, focused on image segmentation. Many more, covering DTI, fMRI, molecular imaging and visualization, are available on [CRAN](https://CRAN.R-project.org).
However, getting a complete neuroimaging pipeline to work usually requires combining several of these packages, since, in general, no single package offers all the desired functionality. Tracking the requirements and dependencies of a pipeline is then time-consuming: for each function used, the user has to find, by hand, the packages it depends on.
In addition, the most common way to build a pipeline in R is via scripting: once the needed functions are defined or imported from third-party packages, one or more scripts are written to run all the processing steps.
But scripts are error-prone, hard to trace and debug when they grow large, and usually need some customization to run in a different environment (another computer, another laboratory...). These drawbacks motivate the definition of self-contained, easily shareable workflows that perform the same operations as a script but with greater legibility and traceability.
The goal of this package is to address these issues by providing:
- **Simple definition of Neuroimaging flows**, which can incorporate any `R` function (including those from other packages, such as `ANTsR` or `fslr`) as one of their steps; see the sketch after this list. At definition time, the package warns about inconsistencies in the flow from inputs to outputs (undefined functions or variables).
- Creation of a **common interface for the execution** of Neuroimaging workflows, reducing the need to constantly consult the documentation of several packages. Instead of using a different syntax for each function, one just _executes_ the flow, specifying the required outputs, and all the associated functions are run in the right order to produce the results.
- Ability to **automatically track dependencies**. Every time a new function or process is inserted into a flow, the _flow_ object tries to guess its dependencies by looking them up in the loaded namespaces, and stores this information internally to be used in the export stage.
- **Workflows can be shared** across multiple systems. A _flow_ object can be exported and saved as a regular .zip file, and then reloaded on a different computer. When importing a flow, the package detects (from its internals) whether there are missing dependencies (packages not installed on the system) and prompts the user to install them.
- **Extensibility**. A node in the workflow is a single `R` function, that is, something that can be computed at runtime. This means that a process in the pipeline can be, for example, a function that generates a plot, computes image statistics, or wraps a random forest previously trained to infer a labelling of the input image. The `tidyneuro` package provides the tools to extend this functionality to other _computations_: machine learning or deep learning models, for instance.
- **Simple memory management**. The memory used by intermediate outputs when running a flow is automatically released when those outputs are no longer needed for further results, thus keeping the memory footprint as small as possible.
- Work natively with **NIfTI files**.
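As a minimal sketch of the first two points, a flow is declared with `workflow()`, populated with `step()`, and inspected with `plot()`, the same verbs used throughout the rest of this README. Here `double_intensities` is just a stand-in for any real processing function:

``` r
library(tidyneuro)

# Stand-in for any unary processing function (e.g. from ANTsR or fslr)
double_intensities <- function(img) img * 2

toy_flow <- workflow(name = "toy", inputs = "T1")

toy_flow %>%
  step(double_intensities,
       inputs = "T1",
       output = "Doubled T1")

# Display the dependency graph of the flow
toy_flow %>% plot()
```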
## Installation
<!-- You can install the released version of tidyneuro from [CRAN](https://CRAN.R-project.org) with: -->
<!-- ``` r -->
<!-- install.packages("tidyneuro") -->
<!-- ``` -->
<!-- And the development version from [GitHub](https://github.com/) with: -->
You can install the development version from [the GitHub repo](https://github.com/neuroimaginador/tidyneuro) with:
``` r
# install.packages("remotes")
remotes::install_github("neuroimaginador/tidyneuro")
```
## The `tidyneuro` approach
```{r echo = FALSE, warning=FALSE, message=FALSE}
suppressPackageStartupMessages(library(tidyneuro))

##%######################################################%##
#                                                          #
####                Auxiliary functions                 ####
#                                                          #
##%######################################################%##

# - register to a template (MNI152 in the case of structural imaging)
# Returns the transformation, as declared in the ANTsRCore package
register_to_template <- function(image, template) {
  #...
}

# Returns the image reoriented in the template space
get_img_in_template <- function(tx) {
  #...
}

# Returns the affine transformation
get_tx_to_template <- function(tx) {
  #...
}

# Returns the scale parameter as the determinant of the affine transformation
get_scale_parameter <- function(M) {
  #...
}

template <- file.path(".", "resources",
                      "volumetry", "MNI152_T1_1mm.nii.gz")

# - bias field correction, just a simple wrapper around
# n4BiasFieldCorrection
bfc <- function(img) {
  #...
}

# - brain extraction
# Uses the MALF technique with a "brain extraction" dataset
# Returns the brain mask
make_brain_extraction <- function(image) {
  #...
}

# - segmentation (GM, WM and CSF)
# Uses FSL's FAST tool to find a segmentation into GM, WM and CSF
# Returns the segmentation as a labelled image with labels 0 (background),
# 1 (CSF), 2 (GM) and 3 (WM)
get_segmentation <- function(img, mask = NULL) {
  #...
}

# Counts the voxels in each label (ROI) of the image
count_by_ROI <- function(img) {
  #...
}

##%######################################################%##
#                                                          #
####                  Volumetry flow                    ####
#                                                          #
##%######################################################%##

volumetry_flow <- workflow(name = "volumetry",
                           work_dir = "~/flow/volumetry",
                           inputs = "T1")

volumetry_flow %>%
  step(register_to_template,
       inputs = "T1",
       output = c("T1 (MNI)", "Affine transform (Native -> MNI)"),
       template = template) %>%
  step(get_scale_parameter,
       inputs = "Affine transform (Native -> MNI)",
       output = "Scale parameter")

volumetry_flow %>%
  step(bfc,
       inputs = "T1 (MNI)",
       output = "Bias field corrected T1")

volumetry_flow %>%
  step(make_brain_extraction,
       inputs = "Bias field corrected T1",
       output = "Brain mask")

volumetry_flow %>%
  step(function(A, B) { A * B },
       inputs = c("Bias field corrected T1", "Brain mask"),
       output = "Skull-stripped T1")

volumetry_flow %>%
  step(get_segmentation,
       inputs = "Skull-stripped T1",
       output = "Segmented T1")

volumetry_flow %>%
  step(count_by_ROI,
       inputs = "Brain mask",
       output = "Brain volume (MNI)")

volumetry_flow %>%
  step(function(A, B) { A * B },
       inputs = c("Brain volume (MNI)", "Scale parameter"),
       output = "Brain volume")

volumetry_flow %>%
  step(count_by_ROI,
       inputs = "Segmented T1",
       output = "Tissue volume (MNI)")

volumetry_flow %>%
  step(function(A, B) { A * B },
       inputs = c("Tissue volume (MNI)", "Scale parameter"),
       output = "Tissue volume")
```
In `tidyneuro`, a _workflow_ is an ordered collection of processes that convert inputs into outputs, such that the output of a process may be the input to another one.
By defining the appropriate functions, one can model the pipeline in the correct order and obtain a flow such as the one depicted in the following figure.
```{r example_flow, echo = FALSE, fig.dim = c(17, 9), out.width="50%", out.height="50%", fig.align="center"}
f <- function(a, b) { a + b }

my_flow <- workflow(name = "test",
                    inputs = c("Input1", "Input2"))

my_flow %>%
  step(f, inputs = c("Input1", "Input2"), output = "Output1") %>%
  inputs("Input3") %>%
  step(f, inputs = c("Input2", "Input3"), output = "Output2") %>%
  step(f, inputs = c("Input1", "Input3"), output = "Output3") %>%
  step(f, inputs = c("Output3", "Input3"), output = "Output4")

my_flow %>% plot()
```
In this case, we have 3 inputs and several outputs. The arrows indicate which inputs (or previously computed outputs) are used to compute each output.
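The flow in the figure is built with the same verbs; a sketch of its construction (where `f` stands for any two-argument function) is:

``` r
f <- function(a, b) { a + b }

my_flow <- workflow(name = "test",
                    inputs = c("Input1", "Input2"))

my_flow %>%
  step(f, inputs = c("Input1", "Input2"), output = "Output1") %>%
  inputs("Input3") %>%   # new inputs can be added on the fly
  step(f, inputs = c("Input2", "Input3"), output = "Output2") %>%
  step(f, inputs = c("Input1", "Input3"), output = "Output3") %>%
  step(f, inputs = c("Output3", "Input3"), output = "Output4")
```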
## Example
This is a basic example which shows how to solve a common problem: the computation of tissue (gray matter, white matter and cerebrospinal fluid) volumes:
```{r example}
library(tidyneuro)
## basic example code
```
```{r volumetry, echo = FALSE, dpi=96, fig.dim=c(14,16), out.width="80%", out.height="80%", fig.align="center"}
volumetry_flow %>% plot()
```
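With the flow above in place, the intended usage is to request only the outputs of interest and let `tidyneuro` run the intermediate steps in the right order. The execution verb is not shown in this README, so the following is only a hypothetical sketch: the function name `execute()`, its arguments and the file name are assumptions, not the documented interface.

``` r
# Hypothetical sketch -- `execute()` and its arguments are assumptions,
# not necessarily the actual tidyneuro API; see the package documentation.
results <- volumetry_flow %>%
  execute(inputs = list(T1 = "subject01_T1.nii.gz"),
          desired_outputs = c("Brain volume", "Tissue volume"))
```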
## Code of Conduct
Please note that the tidyneuro project is released with a [Contributor Code of Conduct](https://contributor-covenant.org/version/2/0/CODE_OF_CONDUCT.html). By contributing to this project, you agree to abide by its terms.