ResNet18 - ImageNet

Model Description

ResNet18 Version 1.5

This model has been optimized for deployment on Blaize Xplorer AI accelerators using the Blaize Picasso SDK, a key component for building, optimizing, and deploying graph-native AI applications on the Blaize Graph Streaming Processor (GSP) hardware. The GSP's graph-based architecture is designed specifically for efficient edge AI inference, improving performance and power efficiency while reducing latency.

Original Model

Training Dataset

  • Name: ImageNet
  • Description: ImageNet 2012
  • Source: https://www.image-net.org/
  • License: LicenseRef-ImageNet
  • Version: 2012
  • Attribution: "ImageNet: A large-scale hierarchical image database" by J. Deng, W. Dong, R. Socher, L.-J. Li, K. Li, and L. Fei-Fei.

Model Variants

This repository contains the following quantization and input resolution variants:

Variant                       | Quantization | Input Resolution (W×H) | File
Resnet18_ImageNet_amp_224_224 | AMP          | 224×224                | Resnet18_ImageNet_amp_224_224.bm
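
As a rough guide to preparing a 224×224 input, the sketch below assumes the standard ImageNet preprocessing commonly used with ResNet18 v1.5 (resize, center-crop, normalize); this is an assumption, and the exact preprocessing expected by the compiled .bm model should be confirmed against the Picasso SDK documentation. The file name example.jpg is a placeholder.

from PIL import Image
from torchvision import transforms

# Assumed standard ImageNet preprocessing for a 224x224 ResNet18 v1.5 input;
# confirm the exact requirements of the compiled .bm model in the SDK docs.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("example.jpg").convert("RGB")   # placeholder file name
batch = preprocess(image).unsqueeze(0)             # shape (1, 3, 224, 224), NCHW float32
print(batch.shape)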

Quantization Methods

  • INT8: Uses integer quantization with fixed-point scaling (see the sketch after this list). Provides the fastest inference with reduced model size.
  • BF16: Uses bfloat16 datatype. Provides the highest precision.
  • AMP (Automatic Mixed Precision): Combines INT8 and BF16 to balance precision and inference speed.
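
To make the INT8 scheme concrete, here is a minimal, illustrative sketch of symmetric fixed-point quantization in NumPy. It shows the general idea only; it is not the Picasso SDK's quantizer, which may use per-channel scales, zero points, or calibration data.

import numpy as np

# Illustrative symmetric INT8 quantization with a single fixed-point scale.
# Not the Picasso SDK's quantizer; for explanation only.
def quantize_int8(x):
    scale = float(np.max(np.abs(x))) / 127.0          # map observed range onto [-127, 127]
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q, scale):
    return q.astype(np.float32) * scale

x = np.array([0.12, -0.9, 0.33, 0.07], dtype=np.float32)
q, scale = quantize_int8(x)
print(q)                           # int8 codes
print(dequantize_int8(q, scale))   # approximately recovers x, up to quantization error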

Usage

These models are designed to be loaded and executed using the Blaize Picasso SDK. Refer to the Picasso SDK documentation for integration details. The modeltool utility can be used to inspect a model:

blaize-modeltool info -i hf://Blaize-AI/Resnet18_ImageNet/Resnet18_ImageNet_amp_224_224.bm
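
If the model file needs to be fetched locally first, one option is the huggingface_hub client. This is a generic download sketch, not a Picasso SDK requirement, and it assumes the repository id and file name shown in the command above.

from huggingface_hub import hf_hub_download

# Download the AMP variant listed above from the Hugging Face repository.
model_path = hf_hub_download(
    repo_id="Blaize-AI/Resnet18_ImageNet",
    filename="Resnet18_ImageNet_amp_224_224.bm",
)
print(model_path)  # local path to pass to the Picasso SDK tooling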

Requirements

  • Blaize Xplorer AI accelerator
  • Blaize Picasso SDK installed and configured

Additional Information

For more information about Blaize technology, visit blaize.com.

License

The models provided are third party content licensed directly to you by the third-party developer. The license is provided above, and you are responsible for reviewing and complying with the license. Blaize has not modified any model source code but has performed hardware-specific optimizations to enable compatibility with Blaize hardware. Blaize's provision of these models does not imply Blaize's review, approval, or endorsement of the original model, or any affiliation with or certification by the model developer. Under no circumstances will Blaize be liable to you or to any third party in any way for the third-party models, including but not limited to any claim, damages, or other liability, whether in an action of contract, tort or otherwise, arising out of or in connection with the models, or use or other dealings with the model. Blaize provides no warranty of any kind or any assurances that the models provided do not infringe any patent, copyright, or any intellectual property rights of third parties. By downloading the models, you agree that you must evaluate, and bear all risks associated with, the use of the third party models, including any reliance on the accuracy, completeness, or usefulness of such third party content.
