

I don’t know the specifics of this reported case, and I’m not interested in learning them, but I know part of the controversy with the Grok deepfake thing when it first became a big story was that Grok was adding risqué elements to prompted pictures even when the prompt didn’t ask for them. But yeah, if users are giving shitty prompts (and I’m sure too many are), they are equally at fault alongside Grok’s devs/designers, who released it to the public without safeguards to prevent those prompts from being actionable.








My city uses pedestrian aprons at intersections to pile up the snow plowed off the road. So even after the snow on the path melts, the pile sticks around for several additional weeks or even months, making it unusable. Very cool