It appears that the Flash series models are not compatible with the vLLM framework for inference.

#1
by WEISHU - opened

It appears that the Flash series models are not compatible with the vLLM framework for inference.
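For anyone trying to reproduce this, a minimal sketch of an attempted vLLM launch is below. The model ID is a placeholder, not a confirmed repository name; when an architecture is not registered with vLLM, the server fails at model load time with an unsupported-architecture error rather than serving requests.

```shell
# Hypothetical repro sketch -- substitute the actual Flash model repo ID.
# Starts vLLM's OpenAI-compatible server; --trust-remote-code lets vLLM
# load custom modeling code shipped with the checkpoint, which some new
# architectures require.
vllm serve <flash-model-repo-id> --trust-remote-code
```

If the architecture is simply missing from vLLM's registry, upgrading to the latest vLLM release (or a nightly build) is usually the first thing to try, since new model families are added frequently.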
