My repo's file listing on the Hub looks like this (latest commit: "Upload tokenizer", 59505d3; file names were mangled when pasting, so only sizes and commit messages are shown):

- last-checkpoint/: Training in progress, step 100, checkpoint
- runs/: Training in progress, step 100
- 1.52 kB: initial commit
- 582 Bytes: Training in progress, step 50
- 42 MB: Training in progress, step 100
- 510 Bytes: Upload tokenizer
- 1.8 MB: Upload tokenizer
- 493 kB: Upload tokenizer
- 981 Bytes: Upload tokenizer
- training_args.bin: Detected Pickle imports (11):
- "transformers.trainer_utils.SchedulerType",
- "torch.device",
- "accelerate.utils.dataclasses.DistributedType",
- "torch.float16",
- "transformers.training_args.TrainingArguments",
- "accelerate.state.PartialState",
- "transformers.trainer_utils.IntervalStrategy",
- "accelerate.utils.dataclasses.DeepSpeedPlugin",
- "transformers.integrations.deepspeed.HfTrainerDeepSpeedConfig",
- "transformers.trainer_utils.HubStrategy",
- "transformers.training_args.OptimizerNames"
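For context on the list above: `Trainer` saves its `TrainingArguments` to `training_args.bin` with `torch.save`, which serializes arbitrary Python objects as a pickle. The Hub scans pickle files and lists every class the pickle references, which is where these eleven imports come from. A minimal sketch of the same mechanism (the filename `args_demo.bin` and the dict contents are made up for illustration):

```python
import torch

# torch.save pickles arbitrary Python objects, not just tensors. Any class
# referenced by the pickle (here torch.device, one of the flagged imports)
# shows up in the Hub's "Detected Pickle imports" scan.
torch.save({"device": torch.device("cpu"), "fp16": True}, "args_demo.bin")

# Loading objects beyond plain tensors needs weights_only=False, since
# recent torch versions default to the restricted weights-only unpickler.
loaded = torch.load("args_demo.bin", weights_only=False)
print(loaded["device"])
```

As I understand it, the scan result is informational rather than an error; the file is only a risk if you do not trust whoever produced it.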
How do I fix this "Detected Pickle imports" warning on training_args.bin?
Two more entries from the same listing:

- 6.33 kB: Training in progress, step 50
- 2.66 kB: Training in progress, step 50