Meta recently admitted what many of us already suspected: conversations with the company's AI chatbot are not private. Messages exchanged can be accessed and analyzed by employees or collaborators, ostensibly to improve the model's performance. Unfortunately, this poorly disclosed access raises serious problems.
I have observed with growing concern how intimate details and sensitive information end up available to internal teams without users being fully aware of it. Meta's privacy policy vaguely states that this data can be used not only to train the AI but also for various other internal purposes, wording that leaves room for broad and potentially abusive interpretation.
This lack of transparency has provoked vehement reactions from data protection experts and user communities. In my view, it is unacceptable that a company of Meta’s size would not provide clear guarantees regarding the privacy of a service that, by its very nature, invites personal discussions.
Looking further ahead, I fear that practices like these are leading us toward a society in which digital control becomes a new form of authority. As AIs accumulate data on every aspect of our lives, the risk is that this information will be used not only for commercial purposes but also to shape behavior and curb freedom of expression. We are potentially moving toward a model of governance based on fear, one in which citizens, knowing that their every word and action is monitored and archived, end up self-censoring. A new kind of digital dictatorship, more subtle and insidious than anything we have known before.
What seemed like a friendly innovation in communicating with artificial intelligence turns out to be an open door to surveillance and to practices that influence human behavior. What do we risk when our intimate data becomes a tool in a possible new regime built on fear and self-censorship? What we once thought was merely a tool for help or entertainment could become an instrument of control.