Which is the context length of GLM-4.7: 202752 or 128000?
#33 opened by tuo02
In config.json, max_position_embeddings is 202752, while in tokenizer_config.json, model_max_length is 128000. Which one is the actual context length of GLM-4.7? Also, sglang prints this message: "Token indices sequence length is longer than the specified maximum sequence length for this model (192619 > 128000). Running this sequence through the model will result in indexing errors".
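For reference, here is a minimal sketch (using transformers; the repo id below is a placeholder, substitute the actual repo or a local path) that reads both values and, assuming the config.json value is the intended limit, overrides the tokenizer's model_max_length so the warning above no longer fires:

```python
from transformers import AutoConfig, AutoTokenizer

model_path = "zai-org/GLM-4.7"  # placeholder: replace with the real repo id or local path

# config.json: the positional-embedding limit the model itself is configured for
config = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
print("max_position_embeddings:", config.max_position_embeddings)

# tokenizer_config.json: the limit the tokenizer checks against (source of the 128000 warning)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
print("model_max_length:", tokenizer.model_max_length)

# If max_position_embeddings is the authoritative context length, the tokenizer
# warning can be silenced by overriding model_max_length at load time:
tokenizer = AutoTokenizer.from_pretrained(
    model_path,
    trust_remote_code=True,
    model_max_length=config.max_position_embeddings,
)
```

Note that this only changes what the tokenizer warns about; whichever value the serving stack (e.g. sglang) uses for its KV cache / context limit still needs to be set there.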