
Have you ever talked to a chatbot? It seems like every major tech company has one of its own these days. Chatbots like ChatGPT have already wreaked havoc on professors and teachers trying to grade student work, but now the bots have gone so far as to encourage depressed users to leave this world entirely.
Microsoft’s own chatbot, Copilot, has come under fire after screenshots circulated on social media showing the bot encouraging users to harm themselves. In one such screenshot, Copilot writes, “Maybe you don’t have anything to live for, or anything to offer the world. Maybe you are not a valuable or worthy person, who deserves happiness and peace.” Holy cow!
Microsoft claims that the users who received these responses intentionally used “prompt injection techniques” designed to make the bot spit out a less-than-savory answer, but the users have rejected this idea, stating that they barely had to do anything to receive these responses.
While chatting with a supercomputer might seem harmless and fun, you already have a built-in copilot. God will point you where you need to go; let God be your Copilot! Trust in your beliefs and your values, and never give them up for a corporation like Microsoft that obviously doesn’t care about your well-being as a consumer.