Creating videos with Stable Diffusion is attracting a lot of attention!
Models and LoRA
Stability AI Releases Stable Animation SDK, a Powerful Text-to-Animation Tool for Developers
This reference-only ControlNet can directly link the attention layers of your SD model to any independent image, so that your SD will read arbitrary images for reference. You need at least ControlNet 1.1.153 to use it.
https://github.com/Mikubill/sd-webui-controlnet/discussions/1236
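The core idea of reference-only control is that the generated image's self-attention also attends to the keys/values computed from the reference image, so reference features leak into the generation. Below is a minimal toy sketch of that mechanism in plain Python; the function names (`attention`, `reference_only_attention`) and the tiny vector inputs are illustrative, not the extension's actual API.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(queries, keys, values):
    # Scaled dot-product attention over lists of plain-Python vectors.
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, values))
                    for j in range(len(values[0]))])
    return out

def reference_only_attention(gen_q, gen_k, gen_v, ref_k, ref_v):
    # "Reference-only" control, schematically: the generated image's queries
    # attend to a concatenation of its own keys/values and the reference
    # image's keys/values, so the reference steers the output.
    return attention(gen_q, gen_k + ref_k, gen_v + ref_v)
```

If a query matches a reference key much more strongly than the generation's own keys, the output is pulled toward the corresponding reference value, which is the intuition behind "reading" an arbitrary image for reference.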
Web UI Extensions
Reference-only Control released
Turn a video into a sequence of still images, run img2img on each frame, then turn the frames back into a video.
The trick is to use multiple ControlNets. Requires OpenCV.
Uses a method of merging between frames. Requires FFmpeg.
Can be linked with EbSynth.
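The split → img2img → reassemble loop above boils down to two FFmpeg invocations. A minimal sketch, assuming `ffmpeg` is on your PATH; the frame-name pattern, fps, and helper names here are illustrative choices, not part of any specific extension:

```python
import subprocess

def extract_frames_cmd(video_path, out_dir, fps=15):
    # Split the source video into numbered PNG stills for frame-by-frame img2img.
    return ["ffmpeg", "-i", video_path, "-vf", f"fps={fps}",
            f"{out_dir}/frame_%05d.png"]

def assemble_video_cmd(frames_dir, out_path, fps=15):
    # Re-encode the processed stills back into a video.
    return ["ffmpeg", "-framerate", str(fps), "-i",
            f"{frames_dir}/frame_%05d.png",
            "-c:v", "libx264", "-pix_fmt", "yuv420p", out_path]

def run(cmd):
    # Actually invoke ffmpeg (requires it to be installed).
    subprocess.run(cmd, check=True)
```

Between the two calls you would run each `frame_%05d.png` through img2img (with your ControlNets of choice) and write the results back under the same names.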
A Stable Diffusion WebUI extension for creating img2img videos with loopback and temporal blur, improving video stability and minimizing the flicker that is characteristic of img2img animation. Requires FFmpeg.
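The loopback-with-temporal-blur idea can be sketched in a few lines: before each img2img step, mix the previous output frame into the current input so consecutive outputs drift apart more slowly. A toy sketch with frames as 2-D lists of floats; `temporal_blend`, `loopback`, and `alpha` are illustrative names, and a real extension blends pixels or latents in much the same spirit:

```python
def temporal_blend(prev_frame, cur_frame, alpha=0.35):
    # Temporal blur: keep a share `alpha` of the previous output frame,
    # damping frame-to-frame flicker (alpha = 0 means no smoothing).
    return [[(1 - alpha) * c + alpha * p for p, c in zip(prow, crow)]
            for prow, crow in zip(prev_frame, cur_frame)]

def loopback(frames, img2img, alpha=0.35):
    # Feed each blended frame through img2img, carrying the result forward
    # so every output frame is anchored to the one before it.
    out = []
    prev = frames[0]
    for f in frames:
        blended = temporal_blend(prev, f, alpha)
        prev = img2img(blended)
        out.append(prev)
    return out
```

Higher `alpha` gives smoother but more "smeared" motion; lower values track the source video more closely at the cost of flicker.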
At this stage the plugin is only a proof of concept for the backend logic of MasaCtrl; I created it mainly for testing vid2vid consistency with Video Loopback.
Please share your tips and tricks that are not documented on the Automatic1111 wiki, in preparation for a comprehensive tutorial covering many tips and tricks.