- An Xbox executive suggested that laid-off staff use AI for emotional support and career guidance
- The suggestion provoked backlash and prompted the executive to delete his LinkedIn post
- Microsoft has laid off about 9,000 employees in recent months while investing heavily in AI
Microsoft has been touting its AI ambitions for the past several years, but one executive’s pitch about the power of AI has landed with a strange thud.
Amid the company’s largest round of layoffs in two years, affecting about 9,000 people, Matt Turnbull, Executive Producer at Xbox Game Studios Publishing, suggested that AI chatbots could help those affected process their grief, craft resumes, and rebuild their confidence.
The gesture was meant as support, but it angered many game developers.
Turnbull took to LinkedIn with a message that was perhaps well-intentioned but certainly poorly timed. He shared ideas for prompts to feed an AI chatbot, which he claimed could help laid-off workers navigate career uncertainty and emotional turmoil.
The backlash was swift and angry, ultimately leading to the post’s removal, but you can still read it thanks to Brandon Sheffield’s Bluesky post.
Matt Turnbull, Executive Producer at Xbox Game Studios Publishing (Microsoft), suggested on LinkedIn that perhaps those who have been let go should turn to AI for help. He seriously thought it would be a good idea.
– @brandon.insertcredit.com (@brandon.insertcredit.com.bsky.social), 2025-07-07T07:54:06.534Z
In his post, Turnbull urged former colleagues to use AI to reduce the “emotional and cognitive weight” of job loss, suggesting prompts for 30-day recovery plans and LinkedIn messages. Possibly the most eyebrow-raising suggestion was a prompt to help reframe impostor syndrome.
“No AI tool is a replacement for your voice or lived experience,” Turnbull wrote. “But at a time when mental energy is scarce, these tools can help you get unstuck faster, calmer, and with more clarity.”
Even the most charitable interpretation of his post cannot ignore how badly timed the advice was, however kind its intent. Angry game developers flooded the comments, presumably prompting the post’s removal.
To put it lightly, they did not agree that a layoff is an emotional puzzle best solved with an algorithm. Perhaps only another human can understand the upheaval of losing a career, and weathering it requires human compassion, a support network, and tangible help, like, say, an introduction to someone who can help you land a new job.
AI therapy
The incident looks even worse in the context of Microsoft building out billions of dollars of AI infrastructure while dramatically shrinking its gaming teams. Urging developers to lean on AI after they have lost their jobs is more than hypocritical; it is asking people to use the very technology that may have contributed to those job losses.
To be clear, and to be scrupulously fair to Turnbull, AI can help with some mental health concerns and can be useful for polishing a resume or preparing for job interviews. Making AI part of outplacement services is not a terrible idea; it could even enhance the internal coaching and career-transition help Microsoft offers, alongside recruiters and resume workshops. But it cannot replace those human services. And when the company that laid you off asks you to use AI to find a new job, that is the opposite of support. It is just insult on top of injury.
Microsoft’s dual approach of laying people off while doubling down on AI infrastructure is as much a test of the company’s culture as of its technical ability. Will we see a new standard where severance packages come with AI prompts instead of counseling and compassion? If the message is, “Feel free to use the chatbot to help you cope after we fire you,” expect a lot more condescending, tone-deaf nonsense from executives.
Perhaps they should ask a chatbot how to break this kind of news to humans, because that is a lesson they have not learned well.