# ImageNet SDXL Quantized
This repository provides the ImageNet-1K dataset pre-encoded with the Stable Diffusion XL (SDXL) VAE encoder and quantized to uint8, enabling faster training of latent diffusion models by eliminating the need for on-the-fly encoding.
## Key Features
- Reduces quantization error by 2 dB PSNR compared to a linear encoding scheme
- Provided at both 256×256 and 512×512 resolutions
- Compatible with NumPy, JAX, and PyTorch
## Usage

### Loading the dataset
The encoded and quantized images are written as PNG files, and can be loaded without any special tools.
```python
>>> from datasets import load_dataset
>>> ds = load_dataset("jon-kyl/imagenet-sdxl-quantized", "256")  # or "512"
>>> ds
DatasetDict({
    train: Dataset({
        features: ['image', 'label'],
        num_rows: 1281167
    })
    validation: Dataset({
        features: ['image', 'label'],
        num_rows: 50000
    })
    test: Dataset({
        features: ['image', 'label'],
        num_rows: 100000
    })
})
>>> ds["train"][0]  # The SDXL encoder reduces image size by a factor of 8.
{'image': <PIL.PngImagePlugin.PngImageFile image mode=RGBA size=32x32>,
 'label': 0}
```
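Each example's `image` is a 4-channel PNG stored as RGBA; our reading (an assumption, not stated by the file format itself) is that the R, G, B, and A planes hold the four SDXL latent channels, so the image can be viewed directly as a `(32, 32, 4)` uint8 array:

```python
import numpy as np

# Sketch: view the RGBA PNG as a 4-channel uint8 latent.
# Assumption: the R, G, B, A planes are the 4 SDXL latent channels.
example = ds["train"][0]
latent_u8 = np.asarray(example["image"])
print(latent_u8.shape, latent_u8.dtype)  # (32, 32, 4) uint8
```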
### Dequantization
To dequantize the data, use the quantization module:
```python
>>> from quantization import optimized_for_sdxl as q
>>> ds = ds.with_format("numpy")  # or "jax", "torch"
>>> for split in ds:
...     for example in ds[split]:
...         # `dequantize` infers the backend and imports it lazily.
...         dequantized = q.dequantize(example["image"])
```
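For training, the formatted dataset plugs into a standard PyTorch data loader. The sketch below is only an illustration: it assumes `q.dequantize` also accepts batched `(B, 32, 32, 4)` tensors, which the snippet above does not guarantee.

```python
import torch
from torch.utils.data import DataLoader

train = ds["train"].with_format("torch")
loader = DataLoader(train, batch_size=64, shuffle=True, num_workers=4)

for batch in loader:
    # Assumption: dequantize handles a batched tensor; otherwise apply it
    # per example inside a collate_fn.
    latents = q.dequantize(batch["image"])
    labels = batch["label"]
    # ... run a training step on (latents, labels) ...
    break
```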
### Decoding with SDXL VAE
We provide a shorthand for the JAX implementation of the SDXL VAE:
```python
import jax
from encode import load_decoder

# load_decoder() returns a plain function of one argument.
decoder = jax.jit(load_decoder())

# The function accepts JAX or NumPy arrays.
decoded_jax = decoder(dequantized)
```
**Important:** When using other decoder implementations, remember to invert the SDXL `scaling_factor` before decoding.
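As an illustration (not part of this repository), here is a hedged sketch of decoding with the `diffusers` SDXL VAE. The channels-last to channels-first transpose reflects our assumption about the dequantized layout, and the division undoes the `scaling_factor` applied during encoding:

```python
import torch
from diffusers import AutoencoderKL

vae = AutoencoderKL.from_pretrained("stabilityai/sdxl-vae")

# Assumed layout: `dequantized` is a (32, 32, 4) float array, channels last.
latents = torch.from_numpy(dequantized).float()[None].permute(0, 3, 1, 2)

with torch.no_grad():
    # Invert the scaling factor (0.13025) before decoding.
    image = vae.decode(latents / vae.config.scaling_factor).sample
```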
## Details
Unlike other approaches, this implementation specifically minimizes quantization error as measured in image space by performing a grid search over saturating functions and scale factors.
Our analysis shows that the normal CDF (equivalently, a rescaled erf) with a scale factor of 0.7 gives the best results at both 256×256 and 512×512 resolutions.
### Encoding Process
1. Preprocessing: square crop along the long edge, then Lanczos resample to the target size
2. Encoding: apply the SDXL encoder
3. Scaling: multiply by the SDXL scaling factor (0.13025) to roughly normalize variance
4. Quantization:
   - apply the additional scale parameter (0.7)
   - apply the nonlinearity (normal CDF)
   - quantize to 8 bits
```python
def quantize(x: FloatArray) -> UInt8Array:
    """Pseudocode for quantization."""
    x = x * scale        # (-inf, inf)
    x = nonlinearity(x)  # [-1, 1)
    x = x * 128 + 128    # [0, 256)
    x = to_uint8(x)      # [0, 255]
    return x
```
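The released `quantization` module is the reference implementation; as a rough illustration only, the sketch below writes out the same recipe with NumPy/SciPy, assuming the nonlinearity is `erf` (matching the `[-1, 1)` range above) and that dequantization maps each code back to its bin center. The official module's exact constants, rounding, and clipping may differ.

```python
import numpy as np
from scipy.special import erf, erfinv

SCALE = 0.7  # additional scale parameter found by the grid search

def quantize_sketch(x: np.ndarray) -> np.ndarray:
    """Quantize scaled SDXL latents to uint8 (illustrative only)."""
    y = erf(x * SCALE)        # (-inf, inf) -> (-1, 1), saturating nonlinearity
    y = y * 128 + 128         # (-1, 1) -> (0, 256)
    return np.clip(y, 0, 255).astype(np.uint8)

def dequantize_sketch(u: np.ndarray) -> np.ndarray:
    """Approximate inverse: map each 8-bit code to its bin center."""
    y = (u.astype(np.float32) + 0.5 - 128) / 128  # back to (-1, 1)
    return erfinv(y) / SCALE
```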