Open-Llama/configs (latest commit: 2023-04-28 15:01:01 +08:00)

| File | Last commit message | Date |
| --- | --- | --- |
| 4w_cn_vocab_wudao15.model | update utils | 2023-04-12 17:15:40 +08:00 |
| 10w_vocab_wudao5_pile10.model | add high-performance Llama pre-train code | 2023-03-26 23:59:53 +08:00 |
| default_config.yaml | unified pre-training and instruction-tuning both use train_lm and dataset | 2023-04-27 19:42:06 +08:00 |
| instruct_config.yaml | update header config and add padding to concat_multiple_sequence | 2023-04-27 23:42:11 +08:00 |
| llama_tokenizer_extended.model | using huggingface datasets to accelerate training, using open-llama to pretrain | 2023-04-24 19:13:53 +08:00 |
| llama_tokenizer.model | add high-performance Llama pre-train code | 2023-03-26 23:59:53 +08:00 |
| pretrain_config.yaml | update readme | 2023-04-28 15:01:01 +08:00 |