BuzzFeed Writer: Why Can’t Google, Facebook Get a Grip on Fake News?

What’s going on?

Humans haven’t been replaced by machines yet in at least one area: spotting news hoaxes. BuzzFeed senior writer Charlie Warzel joined Glenn and Stu today to talk about the tech world’s fake news problem and urge lawmakers to sit up and take notice of developing technology before it gets completely out of hand.

Give me the quick version:

After the tragic shooting in Florida last week, journalists and researchers noticed dozens of hoaxes that were going viral; impersonations of journalists; and posts and videos that claimed the victims were actors. All of those things violate the rules for platforms like Facebook, YouTube and Twitter.

Parkland marked the third time in four months that these tech companies had slipped up by allowing total misinformation about tragedies to be shared freely on their platforms, BuzzFeed reported. Why can’t they seem to do better?

Politicians need to wake up.

As technology advances, it’s getting more and more difficult to know what’s real and what’s fake. Warzel urged lawmakers to put in “safeguards” now before obscure Reddit threads become mainstream misinformation. How will we trust our eyes and ears when video and audio can be easily faked?

This article provided courtesy of TheBlaze.

GLENN: Every once in a while, we need to take a step back. Everybody right now is screaming, fake news, fake news. Both sides are doing it, and in some ways, both sides are right.

We're getting to a place that soon, you're not going to be able to believe your eyes and ears. And people don't really realize this. There's a guy named Aviv Ovadya. He predicted the fake news explosion. And now he's saying, oh, yeah, yeah, yeah, yeah. But that's just the beginning. That's nothing compared to what's on the recent or -- or near horizon.

STU: Yeah. Infopocalypse, potentially. And there's a great story about this in BuzzFeed from Charlie Warzel. It's a story about what's coming next.

Charlie Warzel is a reporter for BuzzFeed. He also writes one of my favorite things to read, a newsletter about Infowars and sort of that conspiracy media. His last name is Warzel, so it's called InfoWarzel, which is the greatest name of all time. It's really worth your attention as well. He joins us now from Montana. Charlie, is that where you are?

CHARLIE: That's right. Missoula, Montana. Thanks for having me.

GLENN: You bet.

So, Charlie, I can't seem to get people to really get their arms around the idea that soon, we're not going to even know what reality is, and we don't -- we won't care.

CHARLIE: Well, it's -- it's complicated, to some extent. But the best way that I can describe it is this sort of hall of mirrors that we're experiencing online right now. As you guys were saying earlier, everyone is calling out fake news, with bad actors acting in bad faith, putting out, you know, propaganda and content that's designed to manipulate, that isn't true.

All those things that we see, you know, in our Facebook feeds, in Twitter right now.

It's all going to potentially get far worse because the technology is going to allow it to come from people that perhaps we know.

So the -- you know, the -- the fake news that you're seeing, the misinformation, the propaganda, it could start coming from, you know, a loved one. You know, you could start getting emails from them, telling you things that didn't happen that were generated algorithmically. So it's not really that something new is going to happen. It's that everything happening right now, all this unrest, discord, confusion, and difficulty sort of parsing reality, is going to become so much more sophisticated because of technology that hasn't even been invented yet.

GLENN: What do you mean that you're going to get -- that you'll get something from your loved ones?

CHARLIE: Sure. So Aviv, the researcher who I spoke with, alongside many others who are doing, you know, really great work understanding how these platforms work and the technology that's on the horizon -- Aviv has this term. And it's called laser phishing. So regular phishing, or spearphishing, is when you maybe get a link from an email address that is a couple characters off from somebody you know. And it's saying, hey, click this link. And then that link asks you for, you know, your password information. It's sort of a classic hacker trick. It's pretty low-tech.

Laser phishing is using AI -- artificial intelligence and machine learning -- to understand things about you, understand the people that you talk to.

The conversations you have across social media with other people. Mine all that information. And then use it to manipulate you. So instead of getting an email from someone who -- who sounds like they could be somebody you know, the email is going to come from ostensibly someone you know, and it's going to have information that's pertinent to you. Information that you were perhaps expecting to hear about. So you're so much more likely to believe this information. And then offer things up.

You know, there's a lot of people -- Nigerian princes on the internet who are asking for money. But what if that person is your brother? And your brother says that he had a car accident. And he's stuck and needs to repair his car. Because you were having a conversation about, you know, cars and money or something along those lines.

So this is -- being able to manipulate people, at the click of a mouse or a button, in this -- in this artificial intelligence way. And I think that -- I think that we're -- we're falling for the low-tech, low-fi stuff right now. So it's going to be hard to imagine, you know, how we can get up to speed on the other stuff.

STU: And the future of this, Charlie, goes even further than just say an email. It could be even audio or video coming from the people that you know convincing you to do something that winds up completely burning you.

CHARLIE: Absolutely. And I think you can see this not just in people asking for money, or you know, asking you for information. But this can be -- this can be used to manipulate government and diplomacy.

GLENN: Uh-huh.

CHARLIE: It's not hard to envision -- and many people sort of have already been talking about this. But it's not hard to envision: Any lawmaker has hundreds of hours of footage of themselves, either audio or video, on the internet. The machine learning programs can take that. Can absorb it. And then what they can do with that is produce very hard-to-verify and real-looking video of people saying anything.

So, you know, you could have a video of Donald Trump potentially down the line, really antagonizing North Korea in -- in an aggressive way.

And the stakes of that get higher and higher as the reaction times are -- are shorter. And people have to respond.

So you could really escalate, you know, political and -- and, you know, diplomatic tensions using this kind of technology.

GLENN: So I was talking about this at the beginning of the year. And I laid out just some crazy predictions. And one of them was, if not by this election of 2018, then by 2020, this will be used in an effective way. And we may not know about it until after the election. But we are that close to this kind of stuff being used. Would you agree with that?

CHARLIE: Well, I think with the artificial intelligence stuff, with the video and audio manipulation, we may be a little further down the line from that. Because the real worry is not just that some incredibly sophisticated programmer or one-off type person who has, you know, access to proprietary technology is going to be able to use this.

The real thing is when it becomes democratized, when you can manipulate -- when anyone with two or three hours of research on the internet, can do this.

And that, I think we're a little bit further off, but not too far. There are some -- some forums.

There's a forum on the site Reddit, which is called deepfakes. And it is where people are manipulating video right now.

Some of it is awful. Some of it is pornographic and very disturbing. But others -- you can go and look for yourself -- are funny. People putting Nicolas Cage's face on Arnold Schwarzenegger.

GLENN: I don't know why Nicolas Cage is this guy. But his face is almost on everybody.

(laughter)

CHARLIE: He's an internet sensation.

GLENN: Yeah, he is.

CHARLIE: But, you know, it speaks to -- when people are kind of playing around with this, having fun with it, doing it in their spare time because it's entertaining, that is sort of a harbinger of something that is sort of scary: that you could, in two or three hours, figure out how to do this yourself.

I think we're a little further than -- I think 2020, who knows. But it's definitely coming.

GLENN: I hope you're right.

Tell me a little bit about what Aviv talks about and describes as reality apathy.

CHARLIE: Sure.

It's basically the combination of all of this that we're talking about. Which is these sophisticated technological tools to sort of distort what's real and what's not. To the point where you become overwhelmed by it all -- say you're being laser phished by, you know, 20 people. And when you go online and try to click a news link, you're not sure where the source is coming from, whether it's something you can trust or not.

You're just besieged by what you believe is misinformation, but you can't even tell. So you start to disengage.

You know, if your inbox is something where you don't know what you're getting, what's real or what's not, you're going to maybe give up. And that is sort of -- that works also with -- with diplomacy. If people start, you know, spoofing calls to Congress, to lobby their lawmakers about some political issue, if that happens in a -- in a spoofing way so much that people can't get through on the lines, they're going to stop participating in -- in democracy, in that particular way. They might, you know, stop going online and sharing their own opinions or feel unsafe. They might just say, you know what, the news, it's just not worth it for me. That's scary.

GLENN: But going the other way as well, if you see a bunch of stuff that is fake and you don't know what to believe, somebody in power could actually be doing some really bad stuff. And nobody would know. Nobody would pay attention. They would say, well, that's just fake. Because that's what the politician would say.

CHARLIE: Yeah, an informed citizenry is a cornerstone of democracy.

GLENN: So how do we inform ourselves, going forward? Who is standing against this? How do we protect -- I mean, you can't put the genie back in the bottle. What do we do?

CHARLIE: Well, I think -- this is why I wanted to highlight Aviv's work. And, you know, I -- he's becoming labeled as sort of the person who called the misinformation fake news crisis before it became a thing. He's one of many. There are -- there are, you know, dozens of researchers like this, who are lobbying tech companies, thinking about this, on sort of the vanguard of this movement.

And I think journalists, news organizations, highlighting these people's work, giving them a platform to talk about this, is the first step. The second step is really, you know, putting pressure on these technology companies. And not just Facebook or Google or Twitter. But, you know, the hardware makers. People like Adobe, who -- people like potentially Apple. Companies that are starting -- that are going to be making this audio visual technology. And making them sort of understand that innovation is okay.

But we have to learn our lessons from, you know, this whole fake news situation that we're dealing with right now. And build this technology responsibly, with all of these sort of externalities baked in, and understand what we can -- that these things can be abused. So let's put in the safeguards now, instead of later.

STU: I think you could see tech companies at times be a little bit absorbed by self-interest. But they're not nefarious actors, right?

My -- my issue with this, when I try to find optimism in the future here, Charlie, is that eventually state actors, hacker groups, someone with actual nefarious intent -- people you can't go and lobby, where you don't have anyone with ethics trying to deal with this -- are going to get control of this stuff and do things that are going to be really harmful and maybe irreversible.

CHARLIE: I think that is potentially true. I mean, all of this -- it's difficult. Because we're in speculation territory. It's difficult as a journalist, writing about this, to avoid going too far. You know, scaring people too much. But, I mean, I think what the last 18 months of sort of information-crisis world that we're in should be teaching us right now is that this is everyone's problem. Lawmakers, you know, need to get smart on this stuff quick. They need to, you know, be putting pressure on --

GLENN: Not going to happen.

CHARLIE: And I think they need to spend time, you know, really understanding this technology --

GLENN: Yes.

CHARLIE: -- themselves. And getting the government ready. There are not a lot of task forces here to combat computational propaganda or misinformation.

GLENN: Charlie, look how we're dealing with Russia. Everybody is talking about, oh, well, Donald Trump, Hillary Clinton. Russia. Look at what Russia is doing. We can get to the rest of that and, you know, if somebody did something, they should go to jail. But we're missing the point, that Russia has come in and -- and announced, in advance, what they were going to do. And they did it.

CHARLIE: I think that what -- state-sponsored actors, all of this -- it's clearly manipulatable by them. And I think that we -- I think that that's certainly one -- one piece of the puzzle. I think that -- I think that this technology, we've spent so long thinking that this technology is a -- a universal positive. That there's no negative externalities to connecting the world.

And I think that that is, you know -- that's a naive look at this. And I think that we need to sort of change the way that we message about this technology, that it's just as much a force for -- for evil, potentially. As it is a force for good. And for, you know, the free circulation of information. So I think some of it just has to do with our mindset with this. This is -- you know, a new innovation is not good just by definition.

GLENN: Right.

CHARLIE: You have to earn that.

GLENN: Charlie, I have been concerned about this for a very long time. I was really glad to see your article and the fact that it was on BuzzFeed and people are reading it. And I'd love to stay in touch with you and have you on the program again as we follow this story. Thank you very much, Charlie.

CHARLIE: Thanks for having me.

(music)

STU: Leave you with one last quote from Aviv Ovadya, the expert Charlie talked to: "Alarmism can be good. You should be alarmist about this stuff. We are so screwed, it's beyond what most of us can imagine."

I mean, jeez. It's scary. I tweeted about Charlie Warzel from @worldofStu. But he's @cwarzel on Twitter. You can get his work on BuzzFeed. It's really interesting stuff. He dives into a lot of weird worlds. And it's really compelling.

EXPOSED: Why Eisenhower warned us about endless wars


Donald Trump emphasizes peace through strength, reminding the world that the United States is willing to fight to win. That’s beyond ‘defense.’

President Donald Trump made headlines this week by signaling a rebrand of the Defense Department — restoring its original name, the Department of War.

At first, I was skeptical. “Defense” suggests restraint, a principle I consider vital to U.S. foreign policy. “War” suggests aggression. But for the first 158 years of the republic, that was the honest name: the Department of War.

A Department of War recognizes the truth: The military exists to fight and, if necessary, to win decisively.

The founders never intended a permanent standing army. When conflict came — the Revolution, the War of 1812, the trenches of France, the beaches of Normandy — the nation called men to arms, fought, and then sent them home. Each campaign was temporary, targeted, and necessary.

From ‘war’ to ‘military-industrial complex’

Everything changed in 1947. President Harry Truman — facing the new reality of nuclear weapons, global tension, and two world wars within three decades — established a full-time military and rebranded the Department of War as the Department of Defense (the new name became official in 1949). Americans resisted; we had never wanted a permanent army. But Truman convinced the country it was necessary.

Was the name change an early form of political correctness? A way to soften America’s image as a global aggressor? Or was it simply practical? Regardless, the move created a permanent, professional military. But it also set the stage for something Truman’s successor, President Dwight “Ike” Eisenhower, famously warned about: the military-industrial complex.

Ike, the five-star general who commanded Allied forces in World War II and stormed Normandy, delivered a harrowing warning during his farewell address: The military-industrial complex would grow powerful. Left unchecked, it could influence policy and push the nation toward unnecessary wars.

And that’s exactly what happened. The Department of Defense, with its full-time and permanent army, began spending like there was no tomorrow. Weapons were developed, deployed, and sometimes used simply to justify their existence.

Peace through strength

When Donald Trump said this week, “I don’t want to be defense only. We want defense, but we want offense too,” some people freaked out. They called him a warmonger. He isn’t. Trump is channeling a principle older than him: peace through strength. Ronald Reagan preached it; Trump is taking it a step further.

Just this week, Trump also suggested limiting nuclear missiles — hardly the considerations of a warmonger — echoing Reagan, who wanted to remove missiles from silos while keeping them deployable on planes.

The seemingly contradictory move of Trump calling for a Department of War sends a clear message: He wants Americans to recognize that our military exists not just for defense, but to project power when necessary.

Trump has pointed to something critically important: The best way to prevent war is to have a leader who knows exactly who he is and what he will do. Trump signals strength, deterrence, and resolve. You want to negotiate? Great. You don’t? Then we’ll finish the fight decisively.

That’s why the world listens to us. That’s why nations come to the table — not because Trump is reckless, but because he means what he says and says what he means. Peace under weakness invites aggression. Peace under strength commands respect.

Trump is the most anti-war president we’ve had since Jimmy Carter. But unlike Carter, Trump isn’t weak. Carter’s indecision emboldened enemies and made the world less safe. Trump’s strength makes the country stronger. He believes in peace as much as any president. But he knows peace requires readiness for war.

Names matter

When we think of “defense,” we imagine cybersecurity, spy programs, and missile shields. But when we think of “war,” we recall its harsh reality: death, destruction, and national survival. Trump is reminding us what the Department of Defense is really for: war. Not nation-building, not diplomacy disguised as military action, not endless training missions. War — full stop.


Names matter. Words matter. They shape identity and character. A Department of Defense implies passivity, a posture of reaction. A Department of War recognizes the truth: The military exists to fight and, if necessary, to win decisively.

So yes, I’ve changed my mind. I’m for the rebranding to the Department of War. It shows strength to the world. It reminds Americans, and the watching world, of the reality we face. The Department of Defense can no longer be a euphemism. Our military exists for war — not instead of deterrence, but as the strength behind it. And we need to stop deluding ourselves.

This article originally appeared on TheBlaze.com.

Unveiling the Deep State: From surveillance to censorship


From surveillance abuse to censorship, the deep state used state power and private institutions to suppress dissent and influence two US elections.

The term “deep state” has long been dismissed as the province of cranks and conspiracists. But the recent declassification of two critical documents — the Durham annex, released by Sen. Chuck Grassley (R-Iowa), and a report publicized by Director of National Intelligence Tulsi Gabbard — has rendered further denial untenable.

These documents lay bare the structure and function of a bureaucratic, semi-autonomous network of agencies, contractors, nonprofits, and media entities that together constitute a parallel government operating alongside — and at times in opposition to — the duly elected one.

The ‘deep state’ is a self-reinforcing institutional machine — a decentralized, global bureaucracy whose members share ideological alignment.

The disclosures do not merely recount past abuses; they offer a schematic of how modern influence operations are conceived, coordinated, and deployed across domestic and international domains.

What they reveal is not a rogue element operating in secret, but a systematized apparatus capable of shaping elections, suppressing dissent, and laundering narratives through a transnational network of intelligence, academia, media, and philanthropic institutions.

Narrative engineering from the top

According to Gabbard’s report, a pivotal moment occurred on December 9, 2016, when the Obama White House convened its national security leadership in the Situation Room. Attendees included CIA Director John Brennan, Director of National Intelligence James Clapper, National Security Agency Director Michael Rogers, FBI Deputy Director Andrew McCabe, Attorney General Loretta Lynch, Secretary of State John Kerry, and others.

During this meeting, the consensus view up to that point — that Russia had not manipulated the election outcome — was subordinated to new instructions.

The record states plainly: The intelligence community was directed to prepare an assessment “per the President’s request” that would frame Russia as the aggressor and then-presidential candidate Donald Trump as its preferred candidate. Notably absent was any claim that new intelligence had emerged. The motivation was political, not evidentiary.

This maneuver became the foundation for the now-discredited 2017 intelligence community assessment on Russian election interference. From that point on, U.S. intelligence agencies became not neutral evaluators of fact but active participants in constructing a public narrative designed to delegitimize the incoming administration.

Institutional and media coordination

The ODNI report and the Durham annex jointly describe a feedback loop in which intelligence is laundered through think tanks and nongovernmental organizations, then cited by media outlets as “independent verification.” At the center of this loop are agencies like the CIA, FBI, and ODNI; law firms such as Perkins Coie; and NGOs such as the Open Society Foundations.

According to the Durham annex, think tanks including the Atlantic Council, the Carnegie Endowment, and the Center for a New American Security were allegedly informed of Clinton’s 2016 plan to link Trump to Russia. These institutions, operating under the veneer of academic independence, helped diffuse the narrative into public discourse.

Media coordination was not incidental. On the very day of the aforementioned White House meeting, the Washington Post published a front-page article headlined “Obama Orders Review of Russian Hacking During Presidential Campaign” — a story that mirrored the internal shift in official narrative. The article marked the beginning of a coordinated media campaign that would amplify the Trump-Russia collusion narrative throughout the transition period.

Surveillance and suppression

Surveillance, once limited to foreign intelligence operations, was turned inward through the abuse of FISA warrants. The Steele dossier — funded by the Clinton campaign via Perkins Coie and Fusion GPS — served as the basis for wiretaps on Trump affiliates, despite being unverified and partially discredited. An FBI lawyer even altered an email to facilitate one of the warrants.


This capacity for internal subversion reappeared in 2020, when 51 former intelligence officials signed a letter labeling the Hunter Biden laptop story as “Russian disinformation.” According to polling, 79% of Americans believed truthful coverage of the laptop could have altered the election. The suppression of that story — now confirmed as authentic — was election interference, pure and simple.

A machine, not a ‘conspiracy theory’

The deep state is a self-reinforcing institutional machine — a decentralized, global bureaucracy whose members share ideological alignment and strategic goals.

Each node — law firms, think tanks, newsrooms, federal agencies — operates with plausible deniability. But taken together, they form a matrix of influence capable of undermining electoral legitimacy and redirecting national policy without democratic input.

The ODNI report and the Durham annex mark the first crack in the firewall shielding this machine. They expose more than a political scandal buried in the past. They lay bare a living system of elite coordination — one that demands exposure, confrontation, and ultimately dismantling.

This article originally appeared on TheBlaze.com.

Trump's proposal explained: Ukraine's path to peace without NATO expansion


Strategic compromise, not absolute victory, often ensures lasting stability.

When has any country been asked to give up land it won in a war? Even if a nation is at fault, the punishment must be measured.

After World War I, Germany, the main aggressor, faced harsh penalties under the Treaty of Versailles. Germans resented the restrictions, and that resentment fueled the rise of Adolf Hitler, ultimately leading to World War II. History teaches that justice for transgressions must avoid creating conditions for future conflict.

Ukraine and Russia must choose to either continue the cycle of bloodshed or make difficult compromises in pursuit of survival and stability.

Russia and Ukraine now stand at a similar crossroads. They can cling to disputed land and prolong a devastating war, or they can make concessions that might secure a lasting peace. The stakes could not be higher: Tens of thousands die each month, and the choice between endless bloodshed and negotiated stability hinges on each side’s willingness to yield.

History offers a guide. In 1967, Israel faced annihilation. Surrounded by hostile armies, the nation fought back and seized large swaths of territory from Jordan, Egypt, and Syria. Yet Israel did not seek an empire. It held only the buffer zones needed for survival and returned most of the land. Security and peace, not conquest, drove its decisions.

Peace requires concessions

Secretary of State Marco Rubio says both Russia and Ukraine will need to “get something” from a peace deal. He’s right. Israel proved that survival outweighs pride. By giving up land in exchange for recognition and an end to hostilities, it stopped the cycle of war. Egypt and Israel have not fought in more than 50 years.

Russia and Ukraine now press opposing security demands. Moscow wants a buffer to block NATO. Kyiv, scarred by invasion, seeks NATO membership — a pledge that any attack would trigger collective defense by the United States and Europe.

President Donald Trump and his allies have floated a middle path: an Article 5-style guarantee without full NATO membership. Article 5, the core of NATO’s charter, declares that an attack on one is an attack on all. For Ukraine, such a pledge would act as a powerful deterrent. For Russia, it might be more palatable than NATO expansion to its border.


Peace requires concessions. The human cost is staggering: U.S. estimates indicate 20,000 Russian soldiers died in a single month — more than a third of total U.S. deaths in the entire Vietnam War — and the toll on Ukrainians is also severe. To stop this bloodshed, both sides need to recognize reality on the ground, make difficult choices, and anchor negotiations in security and peace rather than pride.

Peace or bloodshed?

Both Russia and Ukraine claim deep historical grievances. Ukraine arguably has a stronger claim of injustice. But the question is not whose parchment is older or whose deed is more valid. The question is whether either side is willing to trade some land for the lives of thousands of innocent people. True security, not historical vindication, must guide the path forward.

History shows that punitive measures or rigid insistence on territorial claims can perpetuate cycles of war. Germany’s punishment after World War I contributed directly to World War II. By contrast, Israel’s willingness to cede land for security and recognition created enduring peace. Ukraine and Russia now face the same choice: Continue the cycle of bloodshed or make difficult compromises in pursuit of survival and stability.

This article originally appeared on TheBlaze.com.

The loneliness epidemic: Are machines replacing human connection?


Seniors, children, and the isolated increasingly rely on machines for conversation, risking real relationships and the emotional depth that only humans provide.

Jill Smola is 75 years old. She’s a retiree from Orlando, Florida, and she spent her life caring for the elderly. She played games, assembled puzzles, and offered company to those who otherwise would have sat alone.

Now, she sits alone herself. Her husband has died. She has a lung condition. She can’t drive. She can’t leave her home. Weeks can pass without human interaction.

Loneliness is an epidemic. And AI will not fix it. It will only dull the edges and make a diminished life tolerable.

But CBS News reports that she has a new companion. And she likes this companion more than her own daughter.

The companion? Artificial intelligence.

She spends five hours a day talking to her AI friend. They play games, do trivia, and just talk. She says she even prefers it to real people.

My first thought was simple: Stop this. We are losing our humanity.

But as I sat with the story, I realized something uncomfortable. Maybe we’ve already lost some of our humanity — not to AI, but to ourselves.

Outsourcing presence

How often do we know the right thing to do yet fail to act? We know we should visit the lonely. We know we should sit with someone in pain. We know what Jesus would do: Notice the forgotten, touch the untouchable, offer time and attention without outsourcing compassion.

Yet how often do we just … talk about it? On the radio, online, in lectures, in posts. We pontificate, and then we retreat.

I asked myself: What am I actually doing to close the distance between knowing and doing?

Human connection is messy. It’s inconvenient. It takes patience, humility, and endurance. AI doesn’t challenge you. It doesn’t interrupt your day. It doesn’t ask anything of you. Real people do. Real people make us confront our pride, our discomfort, our loneliness.

We’ve built an economy of convenience. We can have groceries delivered, movies streamed, answers instantly. But friendships — real relationships — are slow, inefficient, unpredictable. They happen in the blank spaces of life that we’ve been trained to ignore.

And now we’re replacing that inefficiency with machines.

AI provides comfort without challenge. It eliminates the risk of real intimacy. It’s an elegant coping mechanism for loneliness, but a poor substitute for life. If we’re not careful, the lonely won’t just be alone — they’ll be alone with an anesthetic, a shadow that never asks for anything, never interrupts, never makes them grow.

Reclaiming our humanity

We need to reclaim our humanity. Presence matters. Not theory. Not outrage. Action.

It starts small. Pull up a chair for someone who eats alone. Call a neighbor you haven’t spoken to in months. Visit a nursing home once a month — then once a week. Ask their names, hear their stories. Teach your children how to be present, to sit with someone in grief, without rushing to fix it.

Turn phones off at dinner. Make Sunday afternoons human time. Listen. Ask questions. Don’t post about it afterward. Make the act itself sacred.

Humility is central. We prefer machines because we can control them. Real people are inconvenient. They interrupt our narratives. They demand patience, forgiveness, and endurance. They make us confront ourselves.

A friend will challenge your self-image. A chatbot won’t.

Our homes are quieter. Our streets are emptier. Loneliness is an epidemic. And AI will not fix it. It will only dull the edges and make a diminished life tolerable.

Before we worry about how AI will reshape humanity, we must first practice humanity. It can start with 15 minutes a day of undivided attention, presence, and listening.

Change usually comes when pain finally wins. Let’s not wait for that. Let’s start now. Because real connection restores faster than any machine ever will.

This article originally appeared on TheBlaze.com.