neuralcoref: spacy.strings.StringStore has the wrong size, try recompiling
spaCy works perfectly fine for me with the usual spaCy-provided models, but trying to load en_coref_md or en_coref_lg fails with the following message:
$ pip install https://github.com/huggingface/neuralcoref-models/releases/download/en_coref_md-3.0.0/en_coref_md-3.0.0.tar.gz
$ python
Python 3.7.0 (default, Jun 28 2018, 07:39:16)
[Clang 4.0.1 (tags/RELEASE_401/final)] :: Anaconda, Inc. on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>>
>>> import spacy
>>> spacy.load('en_coref_md')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/yyy/miniconda2/envs/xxx/lib/python3.7/site-packages/spacy/__init__.py", line 17, in load
return util.load_model(name, **overrides)
File "/yyy/miniconda2/envs/xxx/lib/python3.7/site-packages/spacy/util.py", line 114, in load_model
return load_model_from_package(name, **overrides)
File "/yyy/miniconda2/envs/xxx/lib/python3.7/site-packages/spacy/util.py", line 134, in load_model_from_package
cls = importlib.import_module(name)
File "/yyy/miniconda2/envs/xxx/lib/python3.7/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
File "<frozen importlib._bootstrap>", line 983, in _find_and_load
File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/yyy/miniconda2/envs/xxx/lib/python3.7/site-packages/en_coref_md/__init__.py", line 6, in <module>
from en_coref_md.neuralcoref import NeuralCoref
File "/yyy/miniconda2/envs/xxx/lib/python3.7/site-packages/en_coref_md/neuralcoref/__init__.py", line 1, in <module>
from .neuralcoref import NeuralCoref
File "strings.pxd", line 23, in init en_coref_md.neuralcoref.neuralcoref
ValueError: spacy.strings.StringStore has the wrong size, try recompiling. Expected 88, got 112
>>> spacy.__version__
'2.0.11'
My environment:
python 3.7
spacy 2.0.11
mac
Not sure if this makes any difference, but spacy was installed via conda while coref was installed via pip. This is part of the output of conda list:
spacy 2.0.11 py37h6440ff4_2
en-coref-md 3.0.0 <pip>
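To check whether conda and pip each pulled in their own copy of spaCy (a common source of this kind of binary mismatch), you can compare where each build lives with standard pip/conda commands:
$ python -c "import spacy; print(spacy.__version__, spacy.__file__)"
$ pip show spacy    # note the Version and Location fields
$ conda list spacy  # compare against the conda-managed build
If the locations disagree, the precompiled en_coref_md extension may have been built against a different spaCy than the one Python actually imports.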
Not sure if this is other people’s problem, but for me it was the version of neuralcoref: my system had 3.1 installed, but 2.0 is the only version compatible with the spaCy 2.0.12 I was using.
Additionally, no 2.0 release is available on PyPI, so you’ll have to download the tar.gz and pip install from that.
Download version 2.0 here
Then just run
pip install neuralcoref-2.0.tar.gz
(assuming you downloaded the tar.gz to the directory you’re in) or target the URL from above. Everything worked for me after that. Hope it helps!
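For completeness, here is roughly what that looks like end to end (a sketch, assuming the tar.gz is in the current directory; the version pins follow the above, and the test sentence and ._.has_coref extension are the ones from the project README):
$ pip install spacy==2.0.12
$ pip install ./neuralcoref-2.0.tar.gz
$ python
>>> import spacy
>>> nlp = spacy.load('en_coref_md')  # should now load without the StringStore error
>>> doc = nlp(u'My sister has a dog. She loves him.')
>>> doc._.has_coref
True
If spacy.load('en_coref_md') still complains, you may also need to reinstall the en_coref_md release that matches your spaCy version.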
I get the same error with spacy 2.0.13 (from conda-forge) and Python 3.6.6. I couldn’t find a way to make coref work 😦
Similar to omstuhler, I followed the readme. The download and install work fine. Importing results in:
RuntimeWarning: spacy.tokens.span.Span size changed, may indicate binary incompatibility. Expected 72 from C header, got 80 from PyObject
return f(*args, **kwds)
This is an old thread, and already closed, but for anyone coming here from a search engine:
As a workaround for the incompatibility issue, you can either downgrade spaCy to 2.1.0, or build neuralcoref from source if you want to use a more recent version of spaCy.
We plan on keeping the releases of neuralcoref and spaCy more in sync in the future. Merging this with Issue #197.
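Concretely, the from-source route can look like this (a sketch: --no-binary forces pip to compile the package locally against the spaCy you have installed, and with neuralcoref 4.0 the component is added to a regular pipeline via neuralcoref.add_to_pipe rather than loaded as an en_coref_* model):
$ pip uninstall -y neuralcoref
$ pip install neuralcoref --no-binary neuralcoref
$ python
>>> import spacy, neuralcoref
>>> nlp = spacy.load('en_core_web_sm')
>>> nlp = neuralcoref.add_to_pipe(nlp)  # registers the coref component and the ._.coref_* extensions
>>> doc = nlp(u'My sister has a dog. She loves him.')
>>> doc._.coref_clusters
[My sister: [My sister, She], a dog: [a dog, him]]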
What happens when you follow the instructions in the readme for this error?
@respectPotentialEnergy Can you try with the system-installed Python instead of the conda-installed one? Make sure to create a Python virtual environment first, though, before installing any packages and testing the neuralcoref model.
I used to face a lot of dependency inconsistencies using conda (the very thing it was supposed to solve 😏). I find it much easier to work with a clean Python virtual environment created from the system-installed Python.
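For reference, a minimal sketch of that setup on macOS/Linux (the model URL is the one from the original post; the version pins follow the earlier comments and may need adjusting):
$ python3 -m venv coref-env
$ source coref-env/bin/activate
$ pip install spacy==2.0.12
$ pip install https://github.com/huggingface/neuralcoref-models/releases/download/en_coref_md-3.0.0/en_coref_md-3.0.0.tar.gz
$ python -c "import spacy; spacy.load('en_coref_md')"  # no output means it loaded cleanly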