No, lol. Well, at least I’m not 100% familiar with the Pi’s new offerings, but I don’t know about their PCIe capabilities either. Direct quote:
The tool can run on low-cost graphics processing units (GPUs) and needs roughly 8GB of RAM to process requests — versus larger models, which need high-end industrial GPUs.
Your question seems a bit silly when I try to imagine hooking my GPU, which is probably physically bigger than a Pi, up to a Pi.
I’ve been running all the image generation models on a 2060 Super (8GB VRAM) up to this point, including SD-XL, the model they “distilled” theirs from… Reading the article, I’m not really sure what exactly they think they’re differentiating themselves from…