Since when did Big Brother take over American colleges?

Opinion

Big Brother is watching through an American flag. (Moor Studio/Getty Images)

Daniel O. Jamison is a retired attorney.

In his classic novel “1984,” George Orwell described a frightening dystopian future where individual freedom is lost as technology enables “Big Brother” to watch, monitor and direct everyone. Big Brother has now come to American colleges and universities.

College students often must log into course management programs to take classes. These systems gather information for the professor on how much time is spent looking at readings and whether course messages have been checked. Professors can insist that students read course books online, which allows the professor to check if pages were actually read, when they were read and how much time was spent on each page.
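To make concrete what that record-keeping can look like, here is a minimal sketch in Python, with invented field names that stand in for no particular platform, of the per-page reading log such a system could keep:

```python
# Hypothetical sketch of the per-page reading telemetry an online course
# platform could keep; field names are invented for illustration only.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class PageViewEvent:
    student_id: str          # who was reading
    page_number: int         # which page was open
    opened_at: datetime      # when the page was opened
    seconds_on_page: float   # how long it stayed open


def seconds_per_page(events: list[PageViewEvent]) -> dict[int, float]:
    """Total time spent on each page for one student -- the kind of report
    a professor could pull to see whether pages were 'actually read'."""
    totals: dict[int, float] = {}
    for event in events:
        totals[event.page_number] = totals.get(event.page_number, 0.0) + event.seconds_on_page
    return totals
```

Multiply a handful of fields like these across every reading, quiz, login and message, and the profile built on a single student grows quickly.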

Professors can monitor how long one takes to complete assignments and quizzes. Website visits, keystrokes, link clicks, and mouse movements are recorded. Professors can insist that students activate their cameras so that test-taking can be proctored through facial recognition. Never mind that the algorithms can flag innocent movements of the test-taker as suspicious or fail to register a Black student’s face at all.
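The flagging itself can be as simple as thresholds applied to that telemetry. The sketch below is hypothetical, a crude stand-in for no specific proctoring product, but it shows how easily ordinary behavior can trip a “suspicious” flag:

```python
# Hypothetical proctoring heuristic; the thresholds and fields are invented
# for illustration and describe no specific product.
from dataclasses import dataclass


@dataclass
class ActivitySample:
    seconds_since_keystroke: float  # idle time on the keyboard
    mouse_moves_per_minute: int     # recorded mouse-movement rate
    face_in_frame: bool             # whether the webcam detected a face


def flag_suspicious(sample: ActivitySample) -> bool:
    """Crude rule-based check: a long pause, unusual stillness, or a frame
    with no detected face gets flagged as suspicious."""
    return (
        sample.seconds_since_keystroke > 120
        or sample.mouse_moves_per_minute < 2
        or not sample.face_in_frame
    )
```

A student who pauses to think, stretches, or sits in poor lighting looks, to rules like these, no different from someone cheating.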


Card swipes allow tracking of one’s use of the library. Then there are the cameras in the parking areas, which not only read your license plate to check for a parking pass but also capture who you are riding with and what you are carrying when you leave the car. Better hide what you don’t want Big Brother to see. If you have not been to the food commons for several days, someone may give you the creeps by showing up at your dorm room to check on you.

At least one presumably can still avoid taking a course that requires reading “War and Peace” in two weeks. Course papers were often written in late-night or all-night cram sessions, and one learned how well one could perform under pressure.

Now one presumably could be flagged as a potential miscreant who, instead of spending days cooped up writing a paper, socializes, plays intramural sports, watches games, takes in whatever else is happening on campus or in the area, and learns the ways and ideas of fellow students.

Early diversity programs of the 1970s allowed students of diverse races, ethnicities, and backgrounds to learn about and from each other. There is and was immense value in bringing together people who would otherwise have little or no contact so that they might get to know and appreciate one another.

This is not to say that students should neglect their studies, but how can learning take place if they are so worried about being tracked? They may become hermits and lose the forest for the trees amid mind-numbing constant studying. Big Brother’s fear-inducing monitoring would appear to make it harder for people to get to know one another. A college education should include learning through in-person social interaction and working together to rise above anger toward one another, rather than retreating to “safe spaces.”

This massive data collection is very disturbing beyond the world of academia. What data has been gathered from visits to health websites? Artificial intelligence presumably can overcome the sheer volume of data to drill down on someone. Autonomy is lost. Adolf Hitler would have been delighted with the assistance of all of this data to rid Nazi Germany of those viewed as “vermin,” not pure of blood or race, or otherwise undesirable. Don’t think it can’t happen here.

The root of the problem is the massive, indiscriminate and unwarranted collection of all of this data in the first place. According to Tara Garcia Mathewson, a University of California, San Diego chief privacy officer has stated: “We haven’t had regulator scrutiny, to a great extent, on our privacy practices or our data practices, so our data really do live all over the place, and no one quite knows who has what.”

Except for some broader protections for students with disabilities and for personal health information, federal law and regulations appear, for the most part, not to address the root problem.

The definition of “education records” in federal post-secondary student privacy law and regulations appears to include much of this massive data, but the regulations do not address the unnecessary massive collection of data in the first place. They appear to address only such topics as who can access “education records” without a student’s consent, a student’s right to see and seek to amend the records (the school is not required to make the amendment), a student’s right to file a complaint with the Department of Education, and the school’s obligation to notify students of these rights.

“Transparency” is no solution. It only lets you know what Big Brother has on you. Big Brother already has the data and can use it, plus hacking is always a concern. One can ask professors to accept typed work outside these systems, but the school or professors may not allow it. It is otherwise apparently a practical impossibility to entirely avoid being monitored.

Government may not have addressed the root of the problem, but schools can act on their own. Higher education should drop the course management systems and limit electronic school communication to email. Our troubled times seem to require surveillance cameras and key-card entry systems, but they should be narrowly tailored to documented needs.

With these changes made, “1984” should be required reading for students and administrators.


Read More

xAI Pushes Free Speech Theory Into New AI Lawsuits

Elon Musk’s xAI company is challenging AI regulations in Colorado after losing in California, arguing that limits on artificial intelligence violate free speech. As Connecticut enforces its own AI law, this case could shape the future of AI regulation, corporate accountability, and constitutional rights in the United States.

Elon Musk's AI company, xAI, is on a legal road trip. After losing in California, it filed suit in Colorado asking a court to declare the state's artificial intelligence regulations unconstitutional. The argument is essentially the same one that already failed. Meet the new boss. Same as the old boss.

For Connecticut residents, this is not just the next state in the alphabet that has passed AI legislation. Connecticut was one of the first states in the nation to adopt an AI law, requiring companies to disclose when AI is being used in critical decisions like employment, housing, credit, or healthcare. That law is already drawing scrutiny from the technology industry. What xAI tried to do in California and now in Colorado is a preview of what we may face in Connecticut.

Americans Are Doomscrolling Their Way to the Ballot Box and Only Getting Empty Promises

As the 2026 election approaches, doomscrolling and social media are shaping voter behavior through fear and anxiety. Learn how digital news consumption influences political decisions, and how to break the cycle for more informed voting.

As the spring primary cycle ramps up, voters are deciding which candidates to elect in the November general election, but too much doomscrolling on social media is leading to uninformed — and often anxiety-based — voting. Even though online platforms and politicians may be preying on our exhaustion to further their agendas, we don’t have to fall for it this election cycle.

Doomscrolling is, unfortunately, part of daily life for many of us. It involves consuming a virtually endless amount of negative social media posts and news content, causing us to feel scared and depressed. Our brains have a hardwired negativity bias that causes us to notice potential threats and focus on them. This is exacerbated by the fact that people who closely follow or participate in politics are more likely to doomscroll.

Build Better AI

AI has the potential to transform education, mental health, and accessibility, but only if society actively shapes its use. Explore how community-driven norms, better data, and open experimentation can unlock better AI.

Something I think just about all of us agree on: we want better AI. Regardless of your current perspective on AI, it's undeniable that, like any other tool, it can unleash human flourishing. There's progress to be made with AI that we should all applaud and aim to make happen as soon as possible.

There are kids in rural communities who stand to benefit from AI tutors. There are visually impaired individuals who can more easily navigate the world with AI wearables. There are folks struggling with mental health issues who lack access to therapists and need guidance during trying moments. A key barrier to leveraging AI “for good” is our imagination, because in many domains we’ve become accustomed to an unacceptable status quo. That’s the real comparison. The alternative to AI isn’t a set of well-functioning systems that already operate efficiently and effectively for everyone.

AI Has Put Humanity on the Ballot

An urgent look at the risks of unregulated artificial intelligence, from job loss and environmental strain to national security threats, and at the growing political battle to regulate AI in the United States.

AI may not be the only existential threat out there, but it is coming for us the fastest. When I started law school in 2022, AI could barely handle basic math, but by graduation, it could pass the bar exam. Instead of taking the bar myself, I rolled immediately into a Master of Laws in Global Business Law at Columbia, where I took classes like Regulation of the Digital Economy and Applied AI in Legal Practice. By the end of the program, managing partners were comparing using AI to working with a team of associates; the CEO of Anthropic is now warning that it will be more capable than everyone in less than two years.

AI is dangerous in ways we are just beginning to see. Data centers that power AI require vast amounts of water to keep the servers cool, but two-thirds are in places already facing high water stress, with researchers estimating that water needs could grow from 60 billion liters in 2022 to as high as 275 billion liters by 2028. By then, data centers’ share of U.S. electricity consumption could nearly triple.
