API Docs

Init module for the MCT API. Import the package as:

import model_compression_toolkit as mct

ptq

  • ptq: Module for applying post-training quantization (PTQ) to Keras and PyTorch models (see the usage sketch after this list).

gptq

  • gptq: Module for applying gradient-based post-training quantization (GPTQ).

qat

  • qat: Module for applying quantization-aware training (QAT).

core

  • core: Module containing the core configuration objects (e.g., quantization and mixed-precision settings) used by the optimization APIs.

data_generation

  • data_generation: Module for generating synthetic representative data for data-free model optimization.

pruning

  • pruning: Module for applying structured pruning to trained models.
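A minimal sketch of post-training quantization for a Keras model, using the ptq module's keras_post_training_quantization entry point; the toy model, the random calibration data, and the positional two-argument call are assumptions for illustration:

import model_compression_toolkit as mct
import numpy as np
import tensorflow as tf

# A small Keras model stands in for the real trained float model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, activation='relu'),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

def representative_data_gen():
    # Yield batches of calibration samples; random data stands in for a real dataset.
    for _ in range(10):
        yield [np.random.rand(1, 32, 32, 3).astype(np.float32)]

# Returns the quantized model together with information about the applied quantization.
quantized_model, quantization_info = mct.ptq.keras_post_training_quantization(
    model, representative_data_gen)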

exporter

  • exporter: Module for exporting a quantized model in different serialization formats (a usage sketch follows).
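A hedged sketch of exporting a quantized Keras model, continuing from the PTQ sketch above; the keras_export_model keyword arguments and the output path are assumptions for illustration:

import model_compression_toolkit as mct

# quantized_model is the output of one of the quantization APIs above.
mct.exporter.keras_export_model(model=quantized_model,
                                save_model_path='./quantized_model.keras')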

trainable_infrastructure

  • trainable_infrastructure: Module containing quantization abstractions and quantizers for hardware-oriented model optimization tools.

set_log_folder

  • set_log_folder: Function to set the logger's output directory and enable logging.
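A one-line usage sketch of enabling logging; the directory path is a placeholder:

import model_compression_toolkit as mct

# Direct MCT's log output to a chosen folder and turn logging on.
mct.set_log_folder('./mct_logs')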

keras_load_quantized_model

  • keras_load_quantized_model: Function to load a quantized Keras model that was saved by MCT.
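A minimal sketch of loading a previously saved quantized Keras model; the file path is a placeholder:

import model_compression_toolkit as mct

quantized_model = mct.keras_load_quantized_model('./quantized_model.keras')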

target_platform

  • target_platform: Module for creating and modeling the hardware-related settings that the optimization targets, i.e., the hardware the optimized model will run on during inference.

  • get_target_platform_capabilities: A function to get a target platform model for TensorFlow and PyTorch (see the usage sketch after this list).

  • DefaultDict: Utility class for creating a TargetPlatformCapabilities.
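A hedged sketch of fetching a target platform capabilities object and passing it to post-training quantization; the 'tensorflow'/'default' identifiers and the target_platform_capabilities keyword are assumptions for illustration:

import model_compression_toolkit as mct

# Retrieve the hardware-related settings the optimization should target.
tpc = mct.get_target_platform_capabilities('tensorflow', 'default')

# Continuing the PTQ sketch above, the object is passed to the quantization API.
quantized_model, quantization_info = mct.ptq.keras_post_training_quantization(
    model, representative_data_gen, target_platform_capabilities=tpc)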

Note

This documentation is auto-generated using Sphinx.