

Wild that there’s an AI-generated summary of the article before the article, on a story about the problems with AI. Also, is it really that hard to ask your writers to write a summary of their own articles? Hasn’t writing tweets (or similar microblog posts) already taught most writers to produce a concise, simplified version of a story? Why are we entrusting this to AI when a human can summarize their own article more accurately, nuance included?
Apologies for the mini crash-out that isn’t really related to the story here. Thank you OP for sharing, and kudos to the MP for taking a stand.
I’ll admit I tried talking to a local DeepSeek model about a minor mental health issue one night when I just didn’t want to wake up/bother my friends. I broke the AI within about six prompts: no matter what I said, it would repeat the same answer word-for-word about going for walks and eating better. Honestly, breaking the AI and laughing at it did more for my mental health than anything anyone could have said, but I’m an AI hater. I wouldn’t recommend that anyone in real need use AI for mental health advice.