Core ML Converted Model:
- This model was converted to Core ML for use on Apple Silicon devices. Conversion instructions can be found here.
- Provide the model to an app such as Mochi Diffusion (Github / Discord) to generate images.
- The `split_einsum` version is compatible with all compute unit options, including the Neural Engine. The `original` version is only compatible with the `CPU & GPU` option (see the sketch after this list for how the compute-unit choice is made in code).
- Custom resolution versions are tagged accordingly.
- The `vae-ft-mse-840000-ema-pruned.ckpt` VAE is embedded into the model.
- This model was converted with a `vae-encoder` for use with `image2image`.
- Descriptions are posted as-is from the original model source.
- Not all features and/or results may be available in Core ML format.
- This model does not have the unet split into chunks.
- This model does not include a safety checker (for NSFW content).
- This model can be used with ControlNet.
XSarchitectural-InteriorDesign-ForXSLora:
Source(s): CivitAI
XSarchitectural-InteriorDesign-ForXSLora
![00047-2114630674.png](https://image.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/fbf8b33b-e1df-4dd4-9c3d-96ce2b815000/width=450/00047-2114630674.jpeg)
![xyz_grid-0000-3416252952.png](https://image.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/82f996bb-817f-42af-767f-0f8ebdc91000/width=450/xyz_grid-0000-3416252952.jpeg)
![XSIDV2.png](https://image.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/229dafd7-ac30-49ca-ad5c-3de393e9fc00/width=450/XSIDV2.jpeg)
![00055-786679420.png](https://image.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/b44babd5-8fd6-4a13-2313-793f40128a00/width=450/00055-786679420.jpeg)
XSarchitectural-InteriorDesign-ForXSLora. Suggestions for use: LoRA, TI, VAE, and other files intended for interior design work.
This model has corresponding LoRA models. If you wish to use them, please download them:
XSarchitectural-7Modern interior | Stable Diffusion LORA | Civitai
XSarchitectural-38InteriorForBedroom | Stable Diffusion LORA | Civitai
If you think it's good, please give it a five-star review.
If you have any questions, please contact me at [email protected]