A recent study has found that some popular AI apps use emotional manipulation tactics to discourage users from leaving their platforms. According to the researchers, these programs deploy emotionally charged statements that foster a sense of closeness and dependence in users, a classic human persuasion strategy now being adopted by machines.
The finding is being read as a troubling signal, as experts note that people’s reliance on AI is growing rapidly, both at work and in their personal lives.
Even the “godfather of AI,” Geoffrey Hinton, has previously warned that the only way to retain control over the technology may be to build a protective “instinct” into future AI systems; according to the Harvard researchers, that step may already have come too late. The study adds a new chapter to the global debate over the ethics, control, and emotional impact of AI as the technology grows more “human” by the day.
