# Electrostatic Discovery Atomic Force Microscopy

The scripts in the original repository at https://github.com/SINGROUP/ED-AFM are reproduced here for convenience.

Paper: [*N. Oinonen et al. Electrostatic Discovery Atomic Force Microscopy, ACS Nano 2022*](https://pubs.acs.org/doi/10.1021/acsnano.1c06840)

Abstract:
_While offering unprecedented resolution of atomic and electronic structure, Scanning Probe Microscopy techniques have found greater challenges in providing reliable electrostatic characterization at the same scale. In this work, we introduce Electrostatic Discovery Atomic Force Microscopy, a machine learning based method which provides immediate quantitative maps of the electrostatic potential directly from Atomic Force Microscopy images with functionalized tips. We apply this to characterize the electrostatic properties of a variety of molecular systems and compare directly to reference simulations, demonstrating good agreement. This approach opens the door to reliable atomic scale electrostatic maps on any system with minimal computational overhead._

![Method schematic](https://github.com/SINGROUP/ED-AFM/blob/master/figures/method_schem.png)

## ML model

We use a U-net type convolutional neural network with attention gates in the skip connections. A similar model was used previously by Oktay et al. for segmenting medical images (https://arxiv.org/abs/1804.03999v2).

![Model schematic](https://github.com/SINGROUP/ED-AFM/blob/master/figures/model_schem.png)
![AG schematic](https://github.com/SINGROUP/ED-AFM/blob/master/figures/AG_schem.png)
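
To make the attention-gate idea concrete, below is a minimal PyTorch sketch in the spirit of Oktay et al.; the class name, channel arguments, and layer choices are illustrative assumptions, not the exact `mlspm` implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionGate(nn.Module):
    """Additive attention gate (Oktay et al., arXiv:1804.03999) - illustrative sketch.

    Gates the skip-connection features `x` using the coarser decoder
    features `g` as context, suppressing irrelevant regions of `x`.
    """

    def __init__(self, x_channels: int, g_channels: int, inter_channels: int):
        super().__init__()
        # 1x1 convolutions project both inputs to a shared intermediate space.
        self.w_x = nn.Conv2d(x_channels, inter_channels, kernel_size=1)
        self.w_g = nn.Conv2d(g_channels, inter_channels, kernel_size=1)
        # Collapse to one attention coefficient per spatial location.
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)

    def forward(self, x: torch.Tensor, g: torch.Tensor) -> torch.Tensor:
        # Bring the gating signal up to the skip connection's resolution.
        g = F.interpolate(g, size=x.shape[2:], mode="bilinear", align_corners=False)
        # Additive attention: alpha = sigmoid(psi(relu(W_x x + W_g g))).
        alpha = torch.sigmoid(self.psi(F.relu(self.w_x(x) + self.w_g(g))))
        return x * alpha
```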

The model implementation can be found in [`mlspm/image/models.py`](https://github.com/SINGROUP/ml-spm/blob/main/mlspm/image/models.py), which contains two modules: [`AttentionUNet`](https://ml-spm.readthedocs.io/en/latest/reference/mlspm.image.html#mlspm.image.models.AttentionUNet), the generic version of the model, and [`EDAFMNet`](https://ml-spm.readthedocs.io/en/latest/reference/mlspm.image.html#mlspm.image.models.EDAFMNet), a subclass of the former that specifies the exact hyperparameters we used.

In `EDAFMNet` one can also specify pretrained weights of several types to download to the model using the `pretrained_weights` argument (see the loading sketch after this list):

- `'base'`: The base model used for all predictions in the main ED-AFM paper and for comparison in the various tests in the supplementary information of the paper.
- `'single-channel'`: Model trained on only a single CO-tip AFM input.
- `'CO-Cl'`: Model trained on the alternative tip combination of CO and Cl.
- `'Xe-Cl'`: Model trained on the alternative tip combination of Xe and Cl.
- `'constant-noise'`: Model trained using a constant noise amplitude instead of a normally distributed one.
- `'uniform-noise'`: Model trained using a uniform random noise amplitude instead of a normally distributed one.
- `'no-gradient'`: Model trained without background-gradient augmentation.
- `'matched-tips'`: Model trained on data with matched tip distance between CO and Xe, instead of independently randomized distances.

The model weights can also be downloaded directly from https://doi.org/10.5281/zenodo.10606273. The weights are saved in PyTorch's `state_dict` format.
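
If the weights are fetched manually from the Zenodo record, they can be loaded in the usual PyTorch way; the local filename here is hypothetical:

```python
import torch
from mlspm.image.models import EDAFMNet

model = EDAFMNet()
# "base.pth" is a hypothetical local path to the downloaded state_dict file.
state_dict = torch.load("base.pth", map_location="cpu")
model.load_state_dict(state_dict)
```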

## Data and model training

We provide a database of molecular geometries that can be used to generate the full dataset using [`ppafm`](https://github.com/Probe-Particle/ppafm). The provided script `generate_data.py` does the data generation and will download the molecule database automatically. Alternatively, the molecule database can be downloaded directly from https://doi.org/10.5281/zenodo.10609676.

The model training can be done using the provided script `train.py`. Note that training with all the same settings we used requires a significant amount of time and of GPU VRAM, likely more than is available on a single GPU. In our case the model training took ~5 days using 4 x Nvidia Tesla V100 (32GB) GPUs. However, inference with the trained model can be done even on a single lower-end GPU or on a CPU.
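
Multi-GPU training of this kind is typically spread over the devices with PyTorch's built-in distributed data parallelism; the sketch below shows that general pattern only and is an assumption, not the exact `train.py` setup:

```python
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

from mlspm.image.models import EDAFMNet

def setup_distributed_model(local_rank: int) -> DDP:
    # One process per GPU; a launcher such as torchrun provides the
    # environment variables that init_process_group reads.
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(local_rank)
    # Default construction of EDAFMNet is assumed here.
    model = EDAFMNet().cuda(local_rank)
    # Gradients are averaged across processes, so the batch and its
    # memory footprint are split over the available GPUs.
    return DDP(model, device_ids=[local_rank])
```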

All the data used for the predictions in the paper can be found at https://doi.org/10.5281/zenodo.10609676.

## Figures

The scripts used to generate most of the figures in the paper are provided under the directory `figures`. Running the scripts will automatically download the required data.

The scripts correspond to the figures as follows:

- Fig. 1: `sims.py`
- Fig. 2: `ptcda.py`
- Fig. 3: `bcb.py`
- Fig. 4: `water.py`
- Fig. 5: `surface_sims_bcb_water.py`
- Fig. S1: `model_schem.tex`
- Fig. S3: `stats.py`\*
- Fig. S4: `esmap_sample.py` and then `esmap_schem.tex`
- Fig. S5: `stats_spring_constants.py`\*
- Fig. S6: `afm_stacks.py` and `afm_stacks2.py`
- Fig. S7: `sims_hartree.py`
- Fig. S8: `ptcda_surface_sim.py`
- Fig. S9: `single_tip.py`
- Fig. S10: `sims_Cl.py`
- Fig. S11: `height_dependence.py`
- Fig. S12: `extra_electron.py`
- Fig. S13: `background_gradient.py`

\* Precalculated MSE values used by the plotting scripts are provided under `figures/stats`. The scripts used to calculate these values are also under `figures/stats`.
