Python quit unexpectedly while running scripts #210
Comments
When you tried installing from source, did you first run …?
I have a very similar issue when trying to run neuralcoref in Google Colab. The kernel instantly crashes when creating a doc object with neuralcoref added to the nlp pipeline.
What are the versions of spaCy and neuralcoref you're using?
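A quick way to check both installed versions, assuming pip is available on the PATH:
pip show spacy neuralcoref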
spacy = 2.1.8
OK, so you'd need to either downgrade spaCy to 2.1.0, or keep spaCy as-is but build neuralcoref from source. See also #197
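For reference, the two options look roughly like this (a sketch assuming pip and git are available; the clone steps follow the project README):
# Option 1: downgrade spaCy so it matches the prebuilt neuralcoref wheel
pip install spacy==2.1.0
python -m spacy download en
# Option 2: keep the newer spaCy and rebuild neuralcoref from source against it
git clone https://github.com/huggingface/neuralcoref.git
cd neuralcoref
pip install -r requirements.txt
pip install -e .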
I tried downgrading spaCy from spacy==2.2.0 to 2.1.0. This solved the issue. Thanks for your help.
First, I installed neuralcoref and its dependencies using pip in my global Python environment.
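The exact install commands aren't shown in the report; a typical pip setup for the script below would be roughly:
pip install spacy neuralcoref
python -m spacy download en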
When I run the script containing:
import spacy
import neuralcoref

# Load the English model; the 'en' shortcut must already be installed
nlp = spacy.load('en')

# Add neuralcoref to the spaCy pipeline
neuralcoref.add_to_pipe(nlp)

doc1 = nlp('My sister has a dog. She loves him.')
print(doc1._.coref_clusters)

doc2 = nlp('Angela lives in Boston. She is quite happy in that city.')
for ent in doc2.ents:
    print(ent._.coref_cluster)
It throws an error and Python quits unexpectedly.
Then I tried installing neuralcoref from source. When I run the command
pip install -e .
in the cloned neuralcoref path, it throws an error as well.
Could anyone kindly help me resolve this issue?
Thanks in advance.