BLOG

Newfangled Math Meets Newfangled Technology (Don't Try This at Home)

What do self-driving cars, new math and fen-phen have in common? A lot more than you'd think.

On this scintillating segment of The Glenn Beck Program, Glenn and his co-hosts discuss whether the government will allow self-driving cars to become a reality since they'll save an estimated 4,000 lives annually.

"Washington is already saying they have to be safer than that. Well, they're already much more safe than a car, than a regular car and people driving them. They're not having accidents unless people are interfering," Glenn said.

That's when co-host Stu Burguiere chimed in with his theory on risk aversion.

"I think people, generally speaking, react badly to new types of risk, even if it lowers the other type of risk that goes on," Stu said.

So far, so good — until introducing analogous statistical information on fen-phen proved to be his downfall.

"Oh, my gosh. He's so on his mathematical high horse," Glenn said jokingly.

Enjoy the complimentary clip above or read the transcript below for details.

GLENN: I was talking to a class of 18-year-olds. About 50 of them. I was teaching a class, and we were talking about the future. And I said, "How many people have their driver's license?" Everybody raised their hand. I said, "This will be the last generation that will have driver's licenses. Your kids will not get a driver's license."

JEFFY: No way.

PAT: I'm sure not.

GLENN: And they didn't grasp that. And it really shows how the American people have no concept of what's right around the corner.

JEFFY: Wow.

GLENN: And I said, "Your kids are actually -- your kids will actually say to you, they let you drive? You could drive?"

Yeah. Yeah. They did. Sounds really unsafe, now that I think about it. Yeah, but they did let us do that.

There was that story from the Washington Post earlier this week about how, with this technology, Washington will be the only thing that will stop this. And I don't know if you can put the genie back in the bottle.

But Washington is saying, wait a minute here on these self-driving cars. I'm not sure we want these self-driving cars. Because if they only save about 4,000 people a year, is that enough?

I didn't even understand that. They're -- Washington is already saying, they have to be safer than that. Well, they're already much more safe than a car, than a regular car and people driving them. They're not having accidents unless people are interfering.

And if you had all self-driving cars -- and this will be the case. If everybody is in a self-driving car, supposedly you won't have any accidents, or you'll have very few accidents because they'll be able to navigate around each other and they'll talk to each other. And so there won't be that problem.

How does the government justify saying, "Hey, if we could only save 4,000 people a year, maybe we shouldn't allow these to happen?"

STU: I think people generally speaking react badly to new types of risk, even if it lowers the other type of risk that goes on.

We see this with medications all the time, where -- there was one I was reading something about recently that had a -- they pulled it off the market, you know, decades ago. But they pulled it off the market. At the time, they believed it was a 20:1 reward to risk ratio. That it did create some new health problems for some, but for every one life that it ended, it saved 20 lives for what it was actually trying to solve. It was a 20:1, and they pulled it off the market.

GLENN: What was it?

STU: I don't remember. It was years ago.

GLENN: Did it kill them?

STU: Yeah. It was one of the weight loss drugs.

JEFFY: Fen-phen.

STU: Fen-phen.

PAT: Fen-phen killed one in 20 people?

STU: No, no. That's not what I said at all.

GLENN: That's what I heard too. Pat, that's what I heard. That's what America heard.

PAT: That's what I heard. I'm not sure fen-phen killed anybody.

STU: No, no. It didn't kill -- that's not what I said at all. I said for every 20 people -- it was a 20:1 reward to risk ratio. So the reward was 20 for every one of the risk. Right? So every one person it killed, it saved 20. That was what the doctors thought at the time. For every one person it killed, it saved 20.

PAT: Okay.

STU: Twenty times the amount of people saved for the one that it killed.

GLENN: I still hear that for every 20 --

STU: God, math is -- why do I bother? Why do I bother?

GLENN: Oh, my gosh. He's so on his mathematical high horse.

STU: I mean, that was not a difficult one.

GLENN: That's what I heard. Pat, you still hearing that?

PAT: Well, I mean, I know what he's saying.

GLENN: Pat, Pat.

PAT: Yes, that's what I'm hearing too, Glenn.

GLENN: That's exactly what you're hearing.

PAT: Yes. That's exactly what I'm hearing.

STU: But the point is, this happens with medications all the time, when you have something new. I think it's the same thing that we talk about with airplane flights. People are like, "Well, I don't have control of it." So they freak out because the plane might crash into a mountain. I did not just say everyone on a plane dies, just so we're clear.

GLENN: I'm getting on a plane tomorrow. I'm worrying about crashing into a mountain.

JEFFY: As long as you're not the one in 20.

GLENN: Every 20 flights, one crashes into a mountain?

PAT: That's what he just said. That's what Stu just said, and I don't believe it. I don't believe that's true.

GLENN: I don't think we can trust him and his stats.

PAT: Not at all.

STU: No, not at all.

GLENN: And his newfangled math.

STU: But the point is, when you have these new risks introduced, people just focus on the negative, they don't focus on the problem that it's solving.

GLENN: Well, here's where it's really going to come down to. Because when it comes to self-driving cars and robotics, it is not going to come down to whether or not they're safer. Like, for instance, the trucks -- the automatic trucks that are going to be introduced soon, these will be self-driving trucks. I can guarantee you that they will be safer than what is, you know, currently happening in the United States and around the world.

But truck driving is like the number one job in many states. So all of those jobs will be lost. And they're teamster jobs. So what happens with the teamsters? What do they do when they start to lose and hemorrhage jobs -- how much do they have with the government saying, "No, no, no"?

For instance, I think what we'll settle on is, you have to -- you can't have those self-driving -- you have to have a teamster in a seat in one of those trucks.

PAT: Yeah.

GLENN: And they won't do anything, they'll just be there to oversee it. And that will be the compromise. You have to have a human body in one of those trucks, even though the trucks are self-driving and don't have any reason for a human to be in them.

So it will be the job loss.

I was talking to a guy who lives in Silicon Valley, is on the edge of virtual reality, and is really coming up with amazing things. And he said, "Glenn, I'm really concerned." He said, "I spent the weekend with a bunch of my colleagues in Silicon Valley that are all doing AI." And he said, "I'm really concerned, and they're very concerned, about a coming race war in America."

I wasn't aware of what he was talking about. My mind went to another kind of race war. He's talking about a human-versus-robot race war. And I said, "What?"

He said, "The guys in Silicon Valley are very afraid right now because they don't -- they don't know how they're not all killed sometime soon in the future." And I said, "By the robots?" And he said, "No, by the people. Because they're going to be seen as the guys who are taking away all of their jobs because they're developing the robots." And he said, "People don't have any clue. They think Siri is AI. That's not AI."

When AI is introduced in robot form, it will change everything. This is how good AI is at this point, and they've demonstrated it: you can bring a robot into this room, and you could say, "Just turn on the lights and get things ready," without giving it any more instructions. And it can't leave the room. It can only use the things in the room.

The robot will go over and try to turn on the light. The robot will look at the light; the lightbulb is burned out. It doesn't work. It will then scan the room to see if there's a lightbulb in it. It will go grab a new lightbulb. It goes and reaches up and tries to reach for the light. It can't reach the light. It scans the room. It gets a chair, and it gets a table. It climbs on the chair onto the table, unscrews the lightbulb, puts the new lightbulb in, comes down off the table, onto the chair, onto the floor, puts everything back, and goes into sleep mode. Room's ready.

That's where we are now. Nobody programmed it to "go change the lightbulb."

It saw the need, figured out a plan, and figured out how to get to the lightbulb and how to make things right.

PAT: It's really hard to believe that we're there, because that's none of our experience with robotics. I mean, if you've ever had one of those vacuum cleaners that's supposed to vacuum your house --

GLENN: The Roomba.

PAT: You just turn it on, and it just goes. It's bouncing off the walls. It's doing the same area for an hour and a half. It's going outside.

JEFFY: Ours worked pretty good.

PAT: Comes back around. It mows the lawn.

Just vacuum my floor, will you!

GLENN: Right. That's what they're saying. They're saying that nobody understands how fast --

PAT: Yeah, because that's not our experience right now.

GLENN: -- how fast this is coming. And when it does, it will just eat jobs. And in Silicon Valley, they're not working for a lower unemployment rate. They're working for a 100 percent unemployment rate. They're saying, "Why should people work? Why do manual labor? The robots can do all the manual labor."

STU: And theoretically if you had enough money to spend -- I mean, we all like vacations.

JEFFY: Yeah.

STU: The only reason -- not the only reason, but one of the main reasons we work is to provide the basic needs of our lives. If you can do that and have money to do other things and enjoy yourself, you know, that would theoretically be the goal. But we have no idea what the ramifications of that are.
