Change add_bos_token to true

#5
by JaumePrats - opened
Language Technologies Laboratory @ Barcelona Supercomputing Center org

Otherwise, the special tokens in the chat_template are not applied correctly by certain inference frameworks, such as llama.cpp.
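
For anyone who wants to check the effect locally, here is a minimal sketch of how to verify that the tokenizer prepends the BOS token after this change. The repository id below is only a placeholder for whichever model this tokenizer config belongs to, and the behaviour of downstream converters is described in hedged terms rather than as their exact implementation:

```python
from transformers import AutoTokenizer

# Placeholder repository id for illustration only; replace with the actual
# model repository whose tokenizer_config.json this PR modifies.
MODEL_ID = "BSC-LT/salamandra-7b-instruct"

tok = AutoTokenizer.from_pretrained(MODEL_ID)

# With add_bos_token set to true, plain encoding prepends the BOS token id.
# Conversion tools and inference frameworks (e.g. llama.cpp) typically read
# this flag from tokenizer_config.json to decide whether to add BOS on their
# side, so leaving it false can break the chat template's special-token handling.
ids = tok("Hola").input_ids
print(ids[0] == tok.bos_token_id)  # expected: True after this change
```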

gonzalez-agirre changed pull request status to merged