
In 2009, two former Yahoo employees, Jan Koum and Brian Acton, launched a modest messaging application from a small office in Santa Clara, California. WhatsApp, as they named it, started with barely anyone paying attention. Its early user count was unimpressive, hovering in the thousands, and the app struggled to find its footing. But something quiet was happening. People were tired of paying for SMS. They wanted something simpler, cheaper, faster. WhatsApp was all three. Within two years, it had crossed 200 million users. By 2013, that number had exploded past 400 million. Then came the moment that changed everything. In February 2014, Mark Zuckerberg's Facebook acquired WhatsApp for $19 billion, at the time the largest acquisition of a venture-backed company in history. The green app had a new owner, and with that transaction, a new era began. Not just for WhatsApp, but for how the world would think about the value of human conversation.
That $19 billion question was never really about the app. It was about the data. When Facebook bought WhatsApp, it was purchasing access to the communication habits, contact networks, and behavioural patterns of hundreds of millions of people. The users, of course, were not asked. They were neither consulted nor compensated. They were the product, invisible in the transaction that was supposedly made on their behalf. And that dynamic, in which users generate value they never own, sits at the heart of what is increasingly being called the data sovereignty crisis.
Data sovereignty, simply put, is the idea that people should have control over their own data: who collects it, how it is stored, what it is used for, and who profits from it. It sounds like a basic right. In practice, it is one that nearly every major social platform has quietly, systematically, and legally stripped away. The terms and conditions that users click through without reading are not accidental walls of text. They are carefully drafted agreements that transfer ownership of user-generated data to corporations. By the time most people understand what they have agreed to, the exchange has long since been made.
The scale of this is difficult to grasp. WhatsApp today serves over two billion users across more than 180 countries. Facebook, now rebranded as Meta, sits at the centre of an advertising empire worth hundreds of billions of dollars, an empire built almost entirely on the behavioural data of ordinary people sending messages, sharing photos, and keeping in touch with family. The 2021 WhatsApp privacy policy update, which forced users to share their data with Meta or lose access to the app, triggered a global backlash. Millions migrated to Signal and Telegram overnight. Regulatory bodies across Europe, India, and Brazil launched investigations. Germany’s Federal Cartel Office and the Irish Data Protection Commission, which serves as the European Union’s lead regulator for Meta under the General Data Protection Regulation, both scrutinised the policy. Yet despite the noise, the fundamental model did not change. The appetite for user data remained, and the platforms remained dominant.
It is within this landscape that a different kind of builder is asking a different kind of question. Rather than accepting that data extraction is simply the cost of connectivity, there is a growing movement of engineers and entrepreneurs who believe the architecture of social communication must be rebuilt from the ground up. One such voice comes from Abdulmalik Uthman, a software engineer currently completing his master’s degree at Northumbria University in London. Uthman has been developing Reachme, a social platform built on a privacy-first foundation that integrates livestreaming and video calls, the kinds of features that keep billions engaged on Meta’s properties while placing data ownership firmly back in the hands of the user.
What makes this approach significant is not simply that it promises privacy. Many apps have made that promise. Signal encrypts messages. DuckDuckGo avoids tracking searches. ProtonMail protects emails. The question is whether a platform can offer the richness, the social depth, the real-time connection that modern users expect and still refuse to monetise their behaviour. The conventional wisdom in Silicon Valley is that it cannot. That scale requires advertising, advertising requires data, and data requires surveillance. Reachme’s premise is that this logic is not inevitable. It is a design choice.
The European Union’s General Data Protection Regulation, which came into force in 2018, was supposed to shift this balance. It gave citizens the right to access, correct, and delete their personal data. It required explicit consent for data collection. It imposed heavy fines for violations. Meta alone has been fined over one billion euros under GDPR since its introduction. And yet, the fundamental experience of using a Meta platform has not changed for the average person. The regulations addressed symptoms without touching the underlying structure: a business model that is profitable precisely because it treats personal data as raw material.
This is the gap that builders like Uthman are attempting to fill. The technical challenge is real. Designing a platform where users own their data requires rethinking how content is stored, how identities are verified, how revenue is generated, and how trust is maintained at scale. End-to-end encryption helps, but it is not sufficient on its own. Data minimisation, collecting only what is strictly necessary, must be baked into the architecture, not bolted on as a feature. Decentralised storage models, where user data does not sit on a single corporate server, offer one path forward. Open protocols like ActivityPub, which powers federated platforms such as Mastodon, demonstrate that social networking can function without a central authority holding all the keys.
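To make the data-minimisation point concrete, here is a minimal, hypothetical sketch of what "baked into the architecture" can mean for a message relay: the stored record type simply has no fields for the sender's identity, device metadata, or contact graph, and the recipient identifier is stored only as a salted hash. All names here (`StoredEnvelope`, `minimise`) are illustrative, not drawn from Reachme or any real platform.

```python
import hashlib
import time
from dataclasses import dataclass

@dataclass(frozen=True)
class StoredEnvelope:
    """The only record a relay server keeps: nothing else exists to leak."""
    recipient_hash: str   # salted hash of the recipient id, not the id itself
    ciphertext: bytes     # end-to-end encrypted body; the server cannot read it
    expires_at: float     # record is deleted after delivery or expiry

def minimise(sender_id: str, recipient_id: str, ciphertext: bytes,
             salt: bytes, ttl_seconds: int = 86400) -> StoredEnvelope:
    """Drop everything the relay does not strictly need to route a message.

    Notably absent from the result: the sender's identity, precise
    timestamps beyond the expiry deadline, and any contact-list data.
    """
    recipient_hash = hashlib.sha256(salt + recipient_id.encode()).hexdigest()
    return StoredEnvelope(
        recipient_hash=recipient_hash,
        ciphertext=ciphertext,
        expires_at=time.time() + ttl_seconds,
    )

envelope = minimise("alice@example.org", "bob@example.org",
                    b"<encrypted payload>", salt=b"per-deployment-salt")
# The sender id was accepted for routing but never persisted,
# and the recipient id survives only as an opaque hash.
```

The design choice is that privacy here is structural rather than procedural: there is no plaintext recipient id or sender id to hand over, whatever the data-sharing policy of the operator.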
The commercial dimension cannot be ignored either. WhatsApp is free because Meta subsidises it with advertising revenue generated elsewhere in its ecosystem. A privacy-first platform must find a different answer to the question of sustainability. Subscription models, open-source community funding, and freemium structures with clearly defined boundaries around data use are among the alternatives being explored across the industry. None of them are as immediately scalable as advertising. But none of them require treating users as an inventory to be sold.
What is striking about this moment is that the demand is real. The Edelman Trust Barometer and multiple independent surveys in recent years have consistently shown that users, across age groups and geographies, are increasingly concerned about how their data is used. Young people in particular, the generation that has grown up entirely inside the surveillance economy, report discomfort with the trade-offs they have inherited. They use the platforms because they must, because their social lives exist there. But they do not trust them. That gap between use and trust is exactly where the next generation of social tools will be built.
WhatsApp changed how the world communicates. There is no disputing that. But the architecture it operates within, one where a conversation between two people in Lagos or London is also a data point in an advertising profile, is a choice, not a necessity. The infrastructure of human connection does not have to be owned by the same companies that profit from observing it. The technology exists to build differently. The regulatory pressure is mounting. The public appetite is there. The only question that remains is whether the builders who understand this will move fast enough before the platforms they are challenging simply buy them too.