
Commit e2f9615

use @clip-anytorch , thanks to @rom1504
1 parent 0d1c07c

File tree

3 files changed: +4 -10 lines changed

README.md

Lines changed: 1 addition & 3 deletions
````diff
@@ -499,9 +499,7 @@ loss.backward()
 
 Although there is the possibility they are using an unreleased, more powerful CLIP, you can use one of the released ones, if you do not wish to train your own CLIP from scratch. This will also allow the community to more quickly validate the conclusions of the paper.
 
-First you'll need to install <a href="https://github.com/openai/CLIP#usage">the prerequisites</a>
-
-Then to use a pretrained OpenAI CLIP, simply import `OpenAIClipAdapter` and pass it into the `DiffusionPrior` or `Decoder` like so
+To use a pretrained OpenAI CLIP, simply import `OpenAIClipAdapter` and pass it into the `DiffusionPrior` or `Decoder` like so
 
 ```python
 import torch
````
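
For context on the snippet the README now jumps straight into, the sketch below shows roughly how `OpenAIClipAdapter` gets passed into `DiffusionPrior`. It is a minimal illustration modeled on the prior-training example elsewhere in the README; the constructor arguments and mock tensor shapes are assumptions for this version, not part of the diff.

```python
# Minimal sketch (not part of this commit): pass a pretrained OpenAI CLIP into
# DiffusionPrior via OpenAIClipAdapter. Constructor arguments and shapes below
# are illustrative assumptions; consult the README for the exact signatures.
import torch
from dalle2_pytorch import OpenAIClipAdapter, DiffusionPriorNetwork, DiffusionPrior

clip = OpenAIClipAdapter('ViT-B/32')   # wraps the pip-installed `clip` package

prior_network = DiffusionPriorNetwork(
    dim = 512,
    depth = 6,
    dim_head = 64,
    heads = 8
)

diffusion_prior = DiffusionPrior(
    net = prior_network,
    clip = clip,                       # the adapter is plugged in here
    timesteps = 100,
    cond_drop_prob = 0.2
)

# mock data: 77 = CLIP's context length, 224 = ViT-B/32 input resolution
text = torch.randint(0, 49408, (4, 77))
images = torch.randn(4, 3, 224, 224)

loss = diffusion_prior(text, images)   # returns a scalar training loss
loss.backward()
```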

dalle2_pytorch/dalle2_pytorch.py

Lines changed: 1 addition & 6 deletions
```diff
@@ -172,11 +172,7 @@ def __init__(
         self,
         name = 'ViT-B/32'
     ):
-        try:
-            import clip
-        except ImportError:
-            print('you must install openai clip in order to use this adapter - `pip install git+https://github.com/openai/CLIP.git` - more instructions at https://github.com/openai/CLIP#usage')
-
+        import clip
         openai_clip, _ = clip.load(name)
         super().__init__(openai_clip)
 
@@ -1636,4 +1632,3 @@ def forward(
             return images[0]
 
         return images
-
```
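
The effect of this hunk is that `import clip` is now expected to succeed unconditionally, since the `clip` module is supplied by the `clip-anytorch` package added to setup.py below (a pip-installable repackaging of openai/CLIP). A small sketch of the API the adapter relies on, assuming clip-anytorch mirrors upstream CLIP as intended:

```python
# Sketch (not part of this commit): with clip-anytorch installed, `clip` is
# importable directly, no manual `pip install git+.../CLIP.git` step needed.
import clip
import torch

model, preprocess = clip.load('ViT-B/32')            # downloads weights on first use
tokens = clip.tokenize(['a corgi wearing a top hat'])

with torch.no_grad():
    text_embed = model.encode_text(tokens)

print(text_embed.shape)                               # torch.Size([1, 512]) for ViT-B/32
```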

setup.py

Lines changed: 2 additions & 1 deletion
```diff
@@ -10,7 +10,7 @@
       'dream = dalle2_pytorch.cli:dream'
     ],
   },
-  version = '0.0.72',
+  version = '0.0.73',
   license='MIT',
   description = 'DALL-E 2',
   author = 'Phil Wang',
@@ -23,6 +23,7 @@
   ],
   install_requires=[
     'click',
+    'clip-anytorch',
     'einops>=0.4',
     'einops-exts>=0.0.3',
     'kornia>=0.5.4',
```
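
With `clip-anytorch` in `install_requires`, installing `dalle2-pytorch` should pull in CLIP automatically, which is what makes the README's separate prerequisites step removable above. A hypothetical sanity check (assuming Python 3.8+ for `importlib.metadata`):

```python
# Hypothetical check, not part of the repository: confirm the new dependency
# is both declared and resolvable after installing dalle2-pytorch >= 0.0.73.
from importlib.metadata import requires, version

print(version('clip-anytorch'))  # installed transitively by dalle2-pytorch
print([r for r in (requires('dalle2-pytorch') or []) if 'clip' in r])
```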
