- cross-posted to:
- artificial_intel@lemmy.ml
Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis::Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.
Hm, so while the AI doesn’t “understand” (a woo word until someone can define it for me), it seems to accidentally, without any understanding, behave exactly as if it understood.
It doesn’t understand; it just pulls from enough text written by humans who do understand the things they wrote that it can retrieve the right words from prior human understanding and give coherent answers.
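As a toy illustration of that point (my own sketch, nothing like Gemini’s actual architecture), even a simple bigram model can produce coherent-sounding sentences purely by echoing which words tended to follow which in human-written text, with no understanding of what any of it means:

```python
# Toy bigram "text generator": no understanding, just statistics over
# human-written text. (Hypothetical example, not how Gemini works.)
import random
from collections import defaultdict

corpus = (
    "the sky is blue because sunlight scatters off air molecules "
    "the ocean is blue because water absorbs red light"
).split()

# Count which word tends to follow which in the human-written corpus.
next_words = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    next_words[a].append(b)

def generate(start, length=8):
    word, out = start, [start]
    for _ in range(length):
        choices = next_words.get(word)
        if not choices:
            break
        word = random.choice(choices)  # pick a statistically plausible continuation
        out.append(word)
    return " ".join(out)

print(generate("the"))  # e.g. "the sky is blue because water absorbs red light"
```

The output can read as a perfectly coherent answer, but all the “understanding” lives in the humans who wrote the source text.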