Sick, I only need 90GB of VRAM!
I’ve got it running with a 3090 and 32GB of RAM.
There are some runtimes that let you split a model across system RAM and VRAM (it will just be slower than running it entirely in VRAM).
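For example, a minimal sketch with llama-cpp-python; the GGUF path and layer count here are placeholders, tune n_gpu_layers to whatever fits your VRAM:

```python
# Partial GPU offload: n_gpu_layers layers live in VRAM,
# everything else stays in system RAM (slower, but it runs).
from llama_cpp import Llama

llm = Llama(
    model_path="./model.Q4_K_M.gguf",  # placeholder model file
    n_gpu_layers=35,                   # tune to your VRAM budget
    n_ctx=4096,
)
out = llm("Q: Why is the sky blue? A:", max_tokens=64)
print(out["choices"][0]["text"])
```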
Yeah but damn does it get slow.
I always find it interesting how text is so much slower than image generation. I can do a 1024x1024 in probably 20s, but I get like 1 word a second with text.
Language is complex and, more importantly, much less forgiving of errors.
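There's also a mechanical reason text feels slow: generation is autoregressive, so every new token costs a full forward pass through the model, whereas a diffusion image takes a fixed number of denoising steps. A toy sketch of that loop with Hugging Face transformers (gpt2 just as a small stand-in, greedy decoding for simplicity):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

ids = tok("Text generation is slow because", return_tensors="pt").input_ids
with torch.no_grad():
    for _ in range(20):                    # one full forward pass per token
        logits = model(ids).logits
        next_id = logits[0, -1].argmax()   # greedy: take the top token
        ids = torch.cat([ids, next_id.view(1, 1)], dim=1)
print(tok.decode(ids[0]))
```

(Real backends cache attention keys/values so each step is cheaper, but the number of steps still scales with output length.)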
Hopefully we see more purpose-built hardware for this, like expansion cards with pretty much just tensor cores and their own RAM.
I’d love to see some consumer level AI stuff, sadly it all seems to be designed for server farms and by the time it ages out into consumer prices it’s so obsolete there’s no point in getting it.
Do they want consumer AI cards to exist, though?
Think about the data!
Card makers? They only want money; if there's enough consumer-level demand, they'll make them.
I guess you're right.
It's not quite consumer level I'd say, but Coral.ai has some small accelerators built around Google's Edge TPU.
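If anyone wants to poke at one, a rough sketch with Google's pycoral library (needs an Edge TPU plugged in; the model and image paths are placeholders, and the .tflite has to be compiled for the Edge TPU):

```python
from PIL import Image
from pycoral.adapters import classify, common
from pycoral.utils.edgetpu import make_interpreter

# Load a classification model compiled for the Edge TPU.
interpreter = make_interpreter("model_edgetpu.tflite")  # placeholder model
interpreter.allocate_tensors()

# Resize the input image to whatever the model expects and run it.
img = Image.open("bird.jpg").resize(common.input_size(interpreter))
common.set_input(interpreter, img)
interpreter.invoke()

for c in classify.get_classes(interpreter, top_k=3):
    print(c.id, c.score)
```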