Use pattern matching on the string. I have downloaded the Mistral model and saved it on Microsoft Azure. Solved by updating my transformers package to the latest version.
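If it helps, here is a minimal sketch of that fix, assuming the KeyError really does come from an outdated install: upgrade with `pip install -U transformers`, then confirm the interpreter you are running sees a new enough version.

```python
# Minimal check after running `pip install -U transformers`.
# Mistral support was added in transformers 4.34.0, so older versions
# raise KeyError: 'mistral' when the model config is looked up.
import transformers
from packaging import version  # shipped as a transformers dependency

assert version.parse(transformers.__version__) >= version.parse("4.34.0"), (
    f"transformers {transformers.__version__} is too old for Mistral"
)
print("transformers", transformers.__version__, "is recent enough for Mistral")
```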
KeyError 'mistral' · Issue 27959 · huggingface/transformers · GitHub
I am still getting this error.
I'm trying to run it on an Oracle Linux server.
Mixtral and Mistral v0.2 don't use it anymore either. Number of tokens (760) exceeded maximum. The adapter files and model files. Just make sure that you install autoawq after you have installed the PR.
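On the token-length side, a minimal sketch of one way to avoid the "Number of tokens (760) exceeded maximum" error: truncate the prompt before sending it to the model. The 760-token cap is taken from the error above, and the checkpoint id is an assumption, not something stated in this issue.

```python
# Truncate the prompt so it never exceeds the token limit your stack enforces.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")  # assumed checkpoint
max_prompt_tokens = 760  # the cap reported in the error above

long_prompt = "some very long retrieved context ... " * 200
inputs = tokenizer(
    long_prompt,
    truncation=True,
    max_length=max_prompt_tokens,
    return_tensors="pt",
)
print(inputs["input_ids"].shape)  # at most max_prompt_tokens token ids
```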
I'm trying to utilize the Mistral 7B model for a ConversationalRetrievalChain, but I'm encountering an error related to token length: Traceback (most recent call last):
Along with the base model, we also have an adapter to load.
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM
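For the base-model-plus-adapter part, a hedged sketch of what that import usually leads to; the checkpoint id and adapter path below are placeholders, not taken from this issue.

```python
# Load the Mistral base model and tokenizer, then attach a PEFT/LoRA adapter.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel  # only needed if you have adapter files to load

base_id = "mistralai/Mistral-7B-v0.1"  # assumed base checkpoint
adapter_dir = "path/to/adapter"        # placeholder for the adapter files

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(model, adapter_dir)  # base model + adapter

inputs = tokenizer("Hello", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```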
I have the latest version of transformers, yet I'm still getting the KeyError: 'mistral'.
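If the error persists after upgrading, one common explanation is that the failing process imports a different transformers installation than the one pip just upgraded (another virtualenv, a conda environment, a system-wide copy). A small diagnostic sketch:

```python
# Print exactly which transformers copy this interpreter is importing.
# If the version or path is not the one you just upgraded, the failing
# script is running against a different Python environment.
import sys
import transformers

print("python      :", sys.executable)
print("transformers:", transformers.__version__)
print("location    :", transformers.__file__)
```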
Mistral's current version requires transformers 4.34.0 at minimum; Mistral is not in 4.33.3 yet.