Here is my FP8-quantized version of the Wan2.1 14B 720p i2v model, so it can run on RTX 50- and 40-series GPUs, or deliver even faster inference on larger cards. Enjoy!
wan21I2v14b720pFP8_v10.safetensors