Commit 2ff4b06

fix doc

Signed-off-by: mgqa34 <[email protected]>
1 parent 74c4d40 commit 2ff4b06

File tree

3 files changed: +6 −6 lines changed

doc/tutorial/fdkt/README.md (+2 −2)

````diff
@@ -1,5 +1,5 @@
 # FATE-LLM: FDKT
-The Algorithm is based on paper [Federated Domain-Specific Knowledge Transfer on Large Language Models Using Synthetic Data](https://arxiv.org/pdf/2405.14212),
+The algorithm is based on paper [Federated Domain-Specific Knowledge Transfer on Large Language Models Using Synthetic Data](https://arxiv.org/pdf/2405.14212),
 a novel framework that enables domain-specific knowledge transfer from LLMs to SLMs while preserving SLM data privacy.

 ## Citation
@@ -11,4 +11,4 @@ If you publish work that uses FDKT, please cite FDKT as follows:
 journal={arXiv preprint arXiv:2405.14212},
 year={2024}
 }
-```
+```
````

doc/tutorial/fedmkt/README.md (+2 −2)

````diff
@@ -1,6 +1,6 @@
 # FATE-LLM: FedMKT

-The Algorithm is based on paper ["FedMKT: Federated Mutual Knowledge Transfer for Large and SmallLanguage Models"](https://arxiv.org/pdf/2406.02224), We integrate its code into the FATE-LLM framework.
+The algorithm is based on paper ["FedMKT: Federated Mutual Knowledge Transfer for Large and SmallLanguage Models"](https://arxiv.org/pdf/2406.02224), We integrate its code into the FATE-LLM framework.

 ## Citation
 If you publish work that uses FedMKT, please cite FedMKT as follows:
@@ -11,4 +11,4 @@ If you publish work that uses FedMKT, please cite FedMKT as follows:
 journal={arXiv preprint arXiv:2406.02224},
 year={2024}
 }
-```
+```
````

doc/tutorial/pdss/README.md (+2 −2)

````diff
@@ -1,5 +1,5 @@
 # FATE-LLM: PDSS
-The Algorithm is based on paper ["PDSS: A Privacy-Preserving Framework for Step-by-Step Distillation of Large Language Models"](https://arxiv.org/pdf/2406.12403), which introduce a novel framework for privacy preserving federated distillation. We integrate its code into the FATE-LLM framework.
+The algorithm is based on paper ["PDSS: A Privacy-Preserving Framework for Step-by-Step Distillation of Large Language Models"](https://arxiv.org/pdf/2406.12403), which introduce a novel framework for privacy preserving federated distillation. We integrate its code into the FATE-LLM framework.

 ## Citation
 If you publish work that uses PDSS, please cite PDSS as follows:
@@ -10,4 +10,4 @@ If you publish work that uses PDSS, please cite PDSS as follows:
 journal={arXiv preprint arXiv:2406.12403},
 year={2024}
 }
-```
+```
````
