- cross-posted to:
- sneerclub@awful.systems
- tech@kbin.social
Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis::Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.
Exactly, that’s where the “too slow” part comes in. To get more robust behavior it needs multiple layers of meta-analysis, but that means way more text generation under the hood than what’s needed for one-shot output.
Yes, but in terms of speed, the secondary layers don’t need the same parameter count or quantization as the primary model.
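A minimal sketch of that idea, with hypothetical model names and stubbed inference calls (no real backend or library API is assumed): a large primary model drafts the answer, then a smaller, more heavily quantized “critic” pass reviews it. The extra pass is where the added latency comes from, which is why it helps that the secondary layer can be a much cheaper model.

```python
from dataclasses import dataclass


@dataclass
class Model:
    name: str
    params_b: float  # parameter count in billions (illustrative only)
    bits: int        # quantization level (illustrative only)

    def generate(self, prompt: str) -> str:
        # Stub: a real pipeline would call an inference backend here.
        return f"[{self.name} @ {self.bits}-bit] response to: {prompt!r}"


primary = Model("big-drafter", params_b=70, bits=16)  # one-shot draft
critic = Model("small-checker", params_b=7, bits=4)   # cheap secondary layer


def answer(prompt: str) -> str:
    draft = primary.generate(prompt)
    review = critic.generate(f"Check this draft for obvious errors:\n{draft}")
    # A fuller pipeline could use the review to trigger a revision pass;
    # here we just return both to show the extra generation involved.
    return f"{draft}\n-- critic --\n{review}"


if __name__ == "__main__":
    print(answer("Generate a historically accurate image prompt."))
```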
If you haven’t seen it, see how fast a very capable model can actually be: https://groq.com/
Yeah, I’ve seen that. I think things will get much faster very quickly; I’m just commenting on the first-gen tech we’re seeing right now.