Active filters: sea
mlx-community/SeaLLM-7B-v2-4bit-mlx • 16 downloads • 3 likes
LoneStriker/SeaLLM-7B-v2-GGUF • 7B • 57 downloads • 6 likes
LoneStriker/SeaLLM-7B-v2-3.0bpw-h6-exl2 • Text Generation • 3 downloads
LoneStriker/SeaLLM-7B-v2-4.0bpw-h6-exl2 • Text Generation • 4 downloads
LoneStriker/SeaLLM-7B-v2-5.0bpw-h6-exl2 • Text Generation • 7 downloads
LoneStriker/SeaLLM-7B-v2-6.0bpw-h6-exl2 • Text Generation • 3 downloads
LoneStriker/SeaLLM-7B-v2-8.0bpw-h8-exl2 • Text Generation • 2 downloads
LoneStriker/SeaLLM-7B-v2-AWQ • Text Generation • 7B • 3 downloads
sail/Sailor-1.8B-Chat-gguf • 2B • 207 downloads • 3 likes
sail/Sailor-0.5B-Chat-gguf • 0.6B • 321 downloads • 4 likes
SeaLLMs/SeaLLM-7B-v2.5-GGUF • 9B • 117 downloads • 8 likes
SeaLLMs/SeaLLM-7B-v2.5-mlx-quantized • Text Generation • 2B • 9 downloads • 2 likes
NikolayKozloff/Sailor-7B-Q8_0-GGUF • 8B • 11 downloads • 1 like
QuantFactory/SeaLLM-7B-v2.5-GGUF • Text Generation • 9B • 329 downloads • 1 like
QuantFactory/SeaLLM-7B-v2-GGUF • Text Generation • 7B • 309 downloads • 1 like
NghiemAbe/SeaLLM-7B-v2.5-AWQ • Text Generation • 6 downloads