How to use Yinpei/racer-llava-llama3-lora-rich-betterswitch with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("lmms-lab/llama3-llava-next-8b")
model = PeftModel.from_pretrained(base_model, "Yinpei/racer-llava-llama3-lora-rich-betterswitch")
```
A RACER-LLaVA checkpoint trained with rich instructions, retrained for a better RVT-to-RACER switch.
Base model: lmms-lab/llama3-llava-next-8b