
Opinion

Your Data Isn’t Yours: How Social Media Platforms Profit From Your Digital Identity

Discover how your personal data is tracked, sold, and used to control your online experience—and how to reclaim your digital rights.

Getty Images, Sorapop

Social media users and digital consumers willingly leave a detailed trail of personal data as they search, watch, and engage across as many platforms as possible. Signing up and signing on is made as easy as possible. Most people know on some level that they are giving up more data than they should, but they hope it won’t be used surreptitiously by scammers, and certainly not for surveillance of any sort.

However, in his book, "Means of Control," Byron Tau reveals in shocking detail how much of our digital data is tracked, packaged, and sold, not by scammers but by the brands and organizations we know and trust. As technology has deeply permeated our lives, we have willingly handed over our entire digital identity. Every app we download, every document we create, and every social media site we join comes with terms and conditions that none of us ever bothers to read.

That means our behaviors, content, and assets are handed over to corporations that profit from them in more ways than the average person realizes. That data, and the reuse of it, ends up controlling our lives, our freedom, and our well-being.

Let’s think about all this in the context of a social media site. It is a place where you interact with friends, post family photos, and highlight your art and videos. You may even share a perspective on current events. These social media platforms don’t just own your content. They can use your behavior and your content to target you. They also sell your data to others and profit massively off of you, their customer.


Imagine, for example, that you are a talented painter and want to paint a picture. You go to a store to purchase paint, brushes, and a canvas. When you create your painting of a beautiful landscape, you can post it online to sell without any middleman dipping a finger into your profit. Now pretend that the paintbrush company, the paint company, the canvas company, and even the store where you purchased your supplies all lay claim to your painting. They declare that they get to determine how it is priced, that they, not you, should profit from selling it, and that they have the right to hand it to another art firm for free without your consent.

Would you accept that? I think the answer would be “absolutely not.”

In another example, imagine you hire a broker to provide you with a personal assistant to help you with your busy life. This assistant is with you 24/7, and she records your behavior and what you do all day long—including your most intimate conversations with your partner in the bedroom. The personal assistant then sends everything she recorded back to the broker who sent her to you. The broker can then sell your information and use it as they please.

Would you allow this assistant and her broker into your life? Again, your answer would be “absolutely not.”

In the real world, we actually say “absolutely, yes” to both of these hypothetical arrangements when it comes to using technology. Worse still, we actively enable it without thinking twice, because it’s easier for us. With this blind trust, we become lucrative commodities for these platforms, without a say and without fair rights. We decry the loss of civil liberties around the world, and still we gladly hand over the keys to our data all day, every day.

This is not a technology problem. It’s not even a legal issue. It’s a choice we make as part of a capitalist society. These corporations consolidate power, profit, and even propaganda by manipulating our attention and our wallets. We shouldn’t let them get away with it. We should own the one thing each of us surely should own: our identities.

If we want true liberty, we must reclaim our digital rights and sovereignty. We have the right to own our data, and we have the right not to be sold for profit.

It’s time to hold all internet organizations and social media platforms accountable to strict boundaries around the use of personal data. They must honor consumer digital self-sovereignty: we are not a commodity to be sold, and we should own every shred of our data. Users should have more control over what ads and content appear in their feeds. What is seen, and certainly what is created, is ours, and it should match the online experience we all work so hard to curate.

Akshay Gupta is the chief executive officer of Sez.us, a reputation-based social media platform designed to foster civil, authentic conversation by rewarding respectful engagement and suppressing inflammatory content.
