Should have a `model_type` key in its config.json
Hi,
I downloaded gemma-keras-gemma_1.1_instruct_2b_en-v3, and its config.json looks like this:
{
  "module": "keras_nlp.src.models.gemma.gemma_backbone",
  "class_name": "GemmaBackbone",
  "config": {
    "name": "gemma_backbone",
    "trainable": true,
    "vocabulary_size": 256000,
    "num_layers": 18,
    "num_query_heads": 8,
    "num_key_value_heads": 1,
    "hidden_dim": 2048,
    "intermediate_dim": 32768,
    "head_dim": 256,
    "layer_norm_epsilon": 1e-06,
    "dropout": 0
  },
  "registered_name": "keras_nlp>GemmaBackbone",
  "assets": [],
  "weights": "model.weights.h5"
}
I am following instructions to write the code from scratch, and the line that throws the error is:
tokenizer = AutoTokenizer.from_pretrained(model_path, padding_side="right")
ValueError: Unrecognized model in /Users/anu/PycharmProjects/Siglip/gemma-keras-gemma_1.1_instruct_2b_en-v3. Should have a model_type key in its config.json, or contain one of the following strings in its name: imagegpt, qdqbert, vision-encoder-decoder, trocr, fnet, segformer, vision-text-dual-encoder, perceiver, gptj, layoutlmv2, beit, rembert, visual_bert, canine, roformer, clip, bigbird_pegasus, deit, luke, detr, gpt_neo, big_bird, speech_to_text_2, speech_to_text, vit, wav2vec2, m2m_100, convbert, led, blenderbot-small, retribert, ibert, mt5, t5, mobilebert, distilbert, albert, bert-generation, camembert, xlm-roberta, pegasus, marian, mbart, megatron-bert, mpnet, bart, blenderbot, reformer, longformer, roberta, deberta-v2, deberta, flaubert, fsmt, squeezebert, hubert, bert, openai-gpt, gpt2, transfo-xl, xlnet, xlm-prophetnet, prophetnet, xlm, ctrl, electra, speech-encoder-decoder, encoder-decoder, funnel, lxmert, dpr, layoutlm, rag, tapas, splinter, sew-d, sew, unispeech-sat, unispeech
I suspect the checkpoint I downloaded is not in the format this code expects. Any advice?
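For what it's worth, the error seems consistent with that suspicion: as I understand it, `AutoTokenizer`/`AutoModel` dispatch on a `model_type` key in config.json, while the KerasNLP checkpoint above uses `module`/`class_name`/`config` keys instead. A minimal sketch of that check (the `keras_config` dict is abridged from the config.json above; `is_transformers_checkpoint` is just an illustrative helper, not a real transformers function):

```python
# Abridged copy of the config.json shipped with the KerasNLP checkpoint.
keras_config = {
    "module": "keras_nlp.src.models.gemma.gemma_backbone",
    "class_name": "GemmaBackbone",
    "config": {"name": "gemma_backbone", "hidden_dim": 2048},
    "registered_name": "keras_nlp>GemmaBackbone",
    "weights": "model.weights.h5",
}


def is_transformers_checkpoint(config: dict) -> bool:
    """Mimic the dispatch check: transformers' Auto* classes look for
    a top-level "model_type" key in config.json."""
    return "model_type" in config


# The Keras config has no "model_type" key, hence the ValueError above.
print(is_transformers_checkpoint(keras_config))            # False
print(is_transformers_checkpoint({"model_type": "gemma"})) # True
```

So it looks like the fix would be to either load this checkpoint with keras_nlp itself, or download the Transformers-format Gemma weights instead.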
Thanks,
Mohan