Commit 8f773dc

Update GEPA version (#8628)
1 parent 96536ea

File tree

2 files changed (+4, -3 lines)


dspy/teleprompt/gepa/gepa.py

Lines changed: 3 additions & 2 deletions
````diff
@@ -136,6 +136,7 @@ class GEPA(Teleprompter):
     """
     GEPA is an evolutionary optimizer, which uses reflection to evolve text components
     of complex systems. GEPA is proposed in the paper [GEPA: Reflective Prompt Evolution Can Outperform Reinforcement Learning](https://arxiv.org/abs/2507.19457).
+    The GEPA optimization engine is provided by the `gepa` package, available from [https://github.com/gepa-ai/gepa](https://github.com/gepa-ai/gepa).
 
     GEPA captures full traces of the DSPy module's execution, identifies the parts of the trace
     corresponding to a specific predictor, and reflects on the behaviour of the predictor to
@@ -175,8 +176,8 @@ def metric(
         ...
     ```
-    GEPA can also be used as a batch inference-time search strategy, by passing `valset=trainset, track_stats=True`, and using the
-    `detailed_results` attribute of the optimized program (returned by `compile`) to get the Pareto frontier of the batch.
+    GEPA can also be used as a batch inference-time search strategy, by passing `valset=trainset, track_stats=True, track_best_outputs=True`, and using the
+    `detailed_results` attribute of the optimized program (returned by `compile`) to get the Pareto frontier of the batch. `optimized_program.detailed_results.best_outputs_valset` will contain the best outputs for each task in the batch.
 
     Example:
     ```
````
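The batch inference-time search described in the updated docstring can be sketched as follows. This is illustrative only, assuming a DSPy `program` (a `dspy.Module`), a `trainset`, a `metric` matching GEPA's expected signature, and a configured reflection LM are already available; the `auto="light"` budget setting is one of several possible choices.

```
import dspy

# Assumed to exist: `program`, `trainset`, and `metric` (GEPA-style signature).
optimizer = dspy.GEPA(
    metric=metric,
    auto="light",                       # illustrative budget choice
    reflection_lm=dspy.LM("openai/gpt-4o"),  # illustrative model choice
    track_stats=True,                   # keep detailed per-candidate statistics
    track_best_outputs=True,            # record the best output per task
)

# Pass the training batch itself as the valset to search over it.
optimized = optimizer.compile(program, trainset=trainset, valset=trainset)

# Pareto frontier of the batch, and the best outputs for each task:
results = optimized.detailed_results
best_outputs = optimized.detailed_results.best_outputs_valset
```

With `valset=trainset`, GEPA's validation-set bookkeeping doubles as a per-example search record over the batch, which is what makes `best_outputs_valset` useful here.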

pyproject.toml

Lines changed: 1 addition & 1 deletion
````diff
@@ -42,7 +42,7 @@ dependencies = [
     "rich>=13.7.1",
     "numpy>=1.26.0",
     "xxhash>=3.5.0",
-    "gepa==0.0.1"
+    "gepa==0.0.2"
 ]
 
 [project.optional-dependencies]
````
