- cross-posted to:
- sneerclub@awful.systems
- tech@kbin.social
Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis::Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.
Worked fine for me:
System Prompt:
User Prompt:
Assistant Message:
User Prompt:
Assistant Message:
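For anyone who wants to reproduce this kind of test, the transcript skeleton above is just the message list most chat-model APIs accept. Here is a minimal sketch using the OpenAI-compatible chat format; the model name and prompt contents are placeholders of my own, not what was actually sent in the screenshot above.

```python
# Minimal sketch of the system / user / assistant turn structure shown above.
# Prompt texts and model name are illustrative placeholders only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

messages = [
    {"role": "system", "content": "You are a helpful assistant."},   # System Prompt
    {"role": "user", "content": "<first user prompt goes here>"},    # User Prompt
]

# First assistant reply
first = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
messages.append({"role": "assistant", "content": first.choices[0].message.content})  # Assistant Message

# Follow-up turn
messages.append({"role": "user", "content": "<second user prompt goes here>"})       # User Prompt
second = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(second.choices[0].message.content)                                             # Assistant Message
```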
Hm, so while the AI doesn’t “understand” (a woo word until someone can define it for me), it seems to accidentally, without any understanding, behave exactly as if it did.
It doesn’t understand; it just pulls from enough text, written by humans who did understand what they wrote, that it can retrieve the right text from that prior human understanding and give coherent answers.
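To make that concrete: under the hood the model only assigns probabilities to the next token given the text so far, and those probabilities come entirely from patterns in human-written training data. A rough sketch below uses a small open model via Hugging Face transformers; the choice of model (gpt2) and prompt are mine, purely for illustration.

```python
# Sketch: a causal language model just scores which token is likely to come next,
# based on patterns absorbed from human-written text. No "understanding" is needed
# for the top candidate to be the correct answer.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, sequence, vocab)

# Probability distribution over the very next token
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)

for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id.item())!r}: {prob.item():.3f}")
# " Paris" tends to top the list, simply because that's the pattern in the training text.
```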