Open-Llama/configs (latest commit: 2023-05-05 17:05:41 +08:00)
4w_cn_vocab_wudao15.model: update utils (2023-04-12 17:15:40 +08:00)
10w_vocab_wudao5_pile10.model: add high-performance Llama pre-train code (2023-03-26 23:59:53 +08:00)
default_config.yaml: unified pre-training and instruction tuning; both use train_lm and dataset (2023-04-27 19:42:06 +08:00)
instruct_config.yaml: add xP3 dataset and belle_2M (2023-05-05 17:05:41 +08:00)
llama_tokenizer_extended.model: using huggingface datasets to accelerate training, using open-llama to pretrain (2023-04-24 19:13:53 +08:00)
llama_tokenizer.model: add high-performance Llama pre-train code (2023-03-26 23:59:53 +08:00)
pretrain_config.yaml: add xP3 dataset and belle_2M (2023-05-05 17:05:41 +08:00)
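The `*.yaml` files here (default_config.yaml, pretrain_config.yaml, instruct_config.yaml) are training configurations consumed by the repo's training entry point. A minimal sketch of how such a config can be loaded with PyYAML is below; the keys shown (`data`, `train`, `tokenizer_path`, etc.) are illustrative assumptions, not the actual schema of these files.

```python
import yaml

# Hypothetical excerpt mimicking the shape of a training config such as
# pretrain_config.yaml; the real keys and values in this repo may differ.
config_text = """
data:
  mode: pretrain
  tokenizer_path: configs/llama_tokenizer.model
train:
  train_batch_size: 2
  learning_rate: 2.0e-4
"""

# Parse the YAML document into a nested dict of plain Python values.
config = yaml.safe_load(config_text)

# Nested sections are accessed as ordinary dictionaries.
print(config["data"]["mode"])             # pretrain
print(config["train"]["train_batch_size"])  # 2
```

In practice the same call would be made against an on-disk file, e.g. `yaml.safe_load(open("configs/pretrain_config.yaml"))`, and the resulting dict passed to the training loop.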