Platform-specific autodetection and tuning performed on install
Optimized processing using the latest torch developments, with built-in support for torch.compile
and multiple compile backends: Triton, ZLUDA, StableFast, DeepCache, OpenVINO, NNCF, IPEX, OneDiff
Improved prompt parser
Built-in queue management
Enterprise level logging and hardened API
Built-in installer with automatic updates and dependency management
Mobile compatible
Main interface using StandardUI:
Main interface using ModernUI:
For screenshots and information on other available themes, see the Themes Wiki
Model support
Additional models will be added as they become available and there is public interest in them
See the models overview for details on each model, including its architecture, complexity and other info
[!TIP]
All command-line options can also be set via environment variables
For example, --debug is the same as setting SD_DEBUG=true
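The flag/env-variable equivalence could be implemented roughly as sketched below; `flag_enabled` and the accepted truthy values are illustrative assumptions, not SD.Next's actual option-parsing code:

```python
import os

def flag_enabled(argv, flag, env_var):
    """True if the flag was passed on the command line or the
    matching environment variable is set to a truthy value."""
    if flag in argv:
        return True
    return os.environ.get(env_var, "").lower() in ("1", "true", "yes")

# --debug on the command line and SD_DEBUG=true are equivalent:
print(flag_enabled(["--debug"], "--debug", "SD_DEBUG"))  # True
os.environ["SD_DEBUG"] = "true"
print(flag_enabled([], "--debug", "SD_DEBUG"))           # True
```

Either form enables the same option, so scripts and containers can prefer env variables while interactive use keeps the flags.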
Backend support
SD.Next supports two main backends: Diffusers and Original:
Diffusers: Based on new Huggingface Diffusers implementation
Supports all models listed below
This backend is set as default for new installations
Original: Based on the LDM reference implementation, significantly expanded on by A1111
This backend is fully compatible with most existing functionality and extensions written for A1111 SDWebUI
Supports SD 1.x and SD 2.x models
All other model types such as SD-XL, LCM, Stable Cascade, PixArt, Playground, Segmind, Kandinsky, etc. require backend Diffusers
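The compatibility rules above can be sketched as a small lookup; the function name and model-type strings are illustrative assumptions, not SD.Next identifiers:

```python
def supported_backends(model_type: str) -> list[str]:
    """Which SD.Next backends can load a given model family:
    SD 1.x and SD 2.x work on either backend, while all other
    model types (SD-XL, LCM, Stable Cascade, PixArt, Playground,
    Segmind, Kandinsky, ...) require the Diffusers backend."""
    if model_type in ("sd-1.x", "sd-2.x"):
        return ["diffusers", "original"]
    return ["diffusers"]

print(supported_backends("sd-2.x"))  # ['diffusers', 'original']
print(supported_backends("sd-xl"))   # ['diffusers']
```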
Collab
We’d love to have additional maintainers (which comes with full repo rights). If you’re interested, ping us!
In addition to general cross-platform code, we’d like to have a lead for each of the main platforms
The project should be fully cross-platform, but we’d really love additional contributors and/or maintainers to join and help lead the efforts on different platforms
If you’re unsure how to use a feature, the best place to start is the Wiki; if it’s not there,
check the ChangeLog for when the feature was first introduced, as it will always have a short note on how to use it