'Singularity' Author Warns That Future AI Could Pose These Threats

William Hertling, author of the “Singularity Series,” joined Glenn on today’s show to talk about the future of technology and artificial intelligence. They tackled these questions and more:

  • What will it look like when humans and smart machines are “coexisting”?
  • Will we keep losing jobs to automation?
  • When will robots be able to diagnose our illnesses and replace doctors?
  • How will the human experience change as technology advances?
  • Will we be able to “opt out” of AI?

With every upside to the advancements in AI, there looks to be a downside. Tell us in the comment section below whether you are excited or ready to pull the plug.

This article provided courtesy of TheBlaze.

GLENN: I have been immersing myself in -- in future tech, to try to understand what is coming our way and what the -- the moral issues are of the near future.

What it means to each of us in our lives. What it means to be asked the question, am I alive?

Is this life? We have so many questions that we have to answer. And we're having trouble with just some of the basic things. And no one is really thinking about the future.

When you think about the future, and you think about robots or you think about AI, Americans generally think of The Terminator. Well, that's not necessarily what's going to happen.

How do we educate our kids?

So I've been reading a lot of high-tech stuff. And in my spare time, I've been trying to read some novels. And I'm looking for the storytellers, the people who can actually tell a great story that is really based in what is coming. The -- the futurist or the -- the near future sci-fi authors, that can show us what's on the horizon.

And I found a series of books. It's called the -- the Singularity series. And I found them over the Christmas vacation. And I just last night finished the fourth one.

And they are really, really well-done. They get a little dark. But they also show the positive side of what could be. And it's a balanced look, and a way to really understand the future that is coming and is on the horizon.

William Hertling is the author, and he joins us now. William, how are you, sir?

WILLIAM: I'm doing great. Thanks so much for having me on.

GLENN: Congratulations on a really good series.

This is self-published?

WILLIAM: Yep. It is self-published. I could not find a publisher who saw the vision of the series. But I self-published it, and people love it. So it gets the word out there.

GLENN: Yeah. You've won several awards for it, and I hope -- you know, I don't know what your sales have been like, but I hope your sales are really good. Because I -- I think it -- well, let me ask you this: What was the intent of the series for you?

WILLIAM: You know, what happened was, about ten years ago, I read two books back-to-back. One was Ray Kurzweil's The Singularity Is Near, which I know you've read as well.

GLENN: Yep.

WILLIAM: And the other one was Charles Stross's Accelerando, which is a fictional book about the singularity.

And what I really realized at that point in time was that we had the biggest set of changes that were ever going to face humanity. And they were coming. And they were in the very near future. Right? They're certainly coming in my lifetime. They're probably coming within the next ten years. And there's very little out there about that.

And as you said, most of the stories that are in media today are about these terminator-style stories. AI rises up. They take control of the machines. And we fight them in the battle. Which, of course, makes for a great movie. I would love to see the Terminator many times over, but what happens when it's not like that? What happens when it's sort of the quiet kind of AI story. And that's really what I wanted to explore. What happens when there's this new emergence of the first AI that's out there, and people realize they're being manipulated by some entity? And what do they do about it? How do they react?

GLENN: So I find this -- first of all, you lay it out so well. And the first book starts with the emergence of AI. And then moves -- I think the next book is, what? Ten years later, five years later --

WILLIAM: They're all ten years apart. Yeah. They basically explore different points of technology in the future.

GLENN: Right. So the last one is in the 2040s or in the 2050s. And it's a very different thing then than it starts out as.

WILLIAM: Yeah.

GLENN: And the thing I wanted to talk to you about is, first of all, can you just define -- because most people don't know the difference between AI, AGI, and ASI, which is really important to understand.

WILLIAM: Sure. So AI is out there today. It's any time programmers write a piece of software, but instead of having a set of rules -- you know, if you see this, then do that -- the AI software is trained to make the decisions on its own. So AI is out there today. It's how you have self-driving cars. It's what selects the stories that you read on Facebook. It's how Google search results come about.

And AGI is the idea that artificial intelligence will become more general, right? All of the things that I mentioned are very specific problems to be solved. How to drive a car is a very specific problem.

GLENN: So a good -- a good example of AI would be Deep Blue, the chess-playing IBM computer.

It has no general intelligence. It does that.

WILLIAM: Exactly, right. And we have IBM's Watson, which is really good at making diagnoses about cancer. But you can't have a conversation with it about how you're feeling.

GLENN: Right.

WILLIAM: But AGI would. AGI would appear to be like a human being, conceivably. In that, it could talk and reason about a wide variety of topics, make decisions. Generally, use its intelligence to solve problems that it hasn't even seen before.

GLENN: Now, AGI can pass the Turing test?

WILLIAM: Yeah, so the Turing test is this idea that you've got a person in one room, chatting with someone in another room, and they have to decide: Is that a human being, or is it a computer?

And the computer passes the Turing test if the person can't distinguish between a computer and a human.

GLENN: How close are we to that?

WILLIAM: Well, I think we probably all have been fooled at least a couple of times when we've either gotten a phone call or made a phone call and we think that we're talking to a human being on the other end. Right? But it actually turns out that we're talking to a machine that routes our phone call somewhere.

So, you know, we're there for a couple of sentences. But we're still pretty far away if you're going to have any meaningful conversation.

GLENN: And AGI is when the computer has the computing power of a human brain?

WILLIAM: Yeah.

GLENN: Okay. Now, that's not necessarily a scary thing. But it's what happens when you go from AGI to ASI, artificial super intelligence. And that can happen within a matter of hours. Correct?

WILLIAM: It can. There's a couple of different philosophies on that. But if you can imagine that -- think about the computer that you have today, versus the computer that you had ten years ago. Right?

It's vastly more powerful. Vastly more powerful than the one you had 20 years ago. So even if there's not these super rapid accelerations in -- in intelligence. Even if you just today had a computer that was the intelligence of a human being, you would imagine that ten years from now, it's going to be able to think about vastly more stuff. Much faster. Right?

So we could see even just taking advantage of increasing in computing power, we would get a much smarter machine. But the really dangerous, or not necessarily dangerous, but the part -- the really rapid change comes from when the AI can start making changes to itself.

So today, programmers create AI. But in the future, AI can create AI. And the smarter AI gets, then in theory, the smarter the AI it can build. And that's where you can get this thing that kind of spirals out of control.

GLENN: So to get a handle on how fast this can all change: an Apple iPad 2 would have ranked among the top five supercomputers in 1998. Okay?

That was a top five supercomputer.

WILLIAM: Yeah.

GLENN: That's how fast technology is growing on itself.

All right. So, William, I want you to kind of outline what -- we're going to take a break, and I want you to come back and kind of outline why all of this stuff matters. What -- what is in the near future, that we're going to be wrestling with? And why people should care. When we come back.

GLENN: As you know, if you're a long-time listener of the program, I'm very fascinated with the future and what is coming. The future of tech and artificial intelligence.

William Hertling is an author and a futurist. He is the author of what's called The Singularity Series. It's a series of four novels that kind of break it down and tell you what's coming, in an entertaining fashion. I highly recommend The Singularity Series. If you are interested in any of this, you need to start reading it; you will really enjoy it.

STU: William, I know Glenn is a big fan of your work and has been reading a lot about technology. I think a lot of people who are living their daily lives aren't as involved in this. I think a third or a half of the audience, when they hear AI, don't even connect that to artificial intelligence until you say it.

I know as a long-time NBA fan, I honestly think of Allen Iverson when I hear AI. Can you make the case, with everything going on in the world, why should people put this at the top of their priority list?

WILLIAM: Well, it's the scale of the change that's coming.

And probably the nearest thing that we're really going to see is over the next five years, we're going to see a lot more self-driving cars and a lot more automation in the workplace. So I think transportation jobs account for something like 5 percent of all jobs in the United States.

And whether you're talking about driving a car, a taxi, driving a delivery truck, all of those things are potentially going to be automated. Right? This is one of the first really big problems that AI is tackling. And AI is good at it. So AI can drive a car. And it can do a better job. It doesn't get tired. It doesn't just go out and drink before it drives, and it doesn't make mistakes.

Well, that's not quite true. They're going to make mistakes, but they're going to make fewer mistakes than your typical human operator. So you know business likes to save money. And it likes to do things efficiently. And self-driving cars are going to be more cost-effective. They're going to be more efficient. So what happens to those 5 percent of the people today who have transportation jobs? Right?

This is probably going to be the biggest thing that affects us.

GLENN: I think, William, you know, that Silicon Valley had better start telling the story in a better fashion. Because as these things hit, we all know politicians on both sides will just blame somebody. They're telling everybody, "I'm going to bring the jobs back."

The jobs aren't coming back. In fact, many, many more are going to be lost. Not to China, but to robotics and AI. And when that happens, you know, I can see, you know, politicians turning and saying, "It's these robot makers. It's these AI people."

WILLIAM: Yeah. Naturally. And yet, unfortunately, the AI genie is out of the bottle, right? Because we're investing in it. China is investing in it. Tech companies around the world are investing in it.

If we stop investing in it, even if we said, hey, we don't want AI, we don't like it, all that's going to do is put us at a disadvantage compared to the rest of the world. So it's not like we can simply opt out. It's not really -- we don't have that option. It's moving forward. So we need to participate in it. And we need to shape where it's going. And I think this is the reason why it's so important to me that more people understand what is AI and why it matters. Because we need to be involved in a public conversation about what we want society to look like in the future.

As we go forward, if even more jobs are eliminated by AI, what does that mean? What if we don't have meaningful work for people?

GLENN: I think the thing I like about your book series is it starts out really hopeful. And it shows that, you know, this technology is not going to be something that we really are likely to refuse. Because it's going to make our life incredibly stable and easy in some ways.

And I kind of would like you to talk a little about, you know, the stock market and the economy and war and everything else. Something that you talk about in your first novel. And show us, when we come back, the good side, and then what it could turn into.

STU: So Allen Iverson is taking our transportation jobs?

GLENN: Yes, yes.

STU: Okay. That's what I got from that.

GLENN: We're talking to William Hertling. He is an author and futurist. The author of many books. His latest is Kill Process. I'm talking to him about The Singularity Series. And the first one in there is Avogadro Corp. And it starts out around this time. And it starts out with a tech center in Portland. And a guy is working on a program that will help everybody with their email. And all of a sudden he makes a couple of changes. And unbeknownst to him, it grows into something that is thinking and acting and changing on its own.

And, William, I would like you to take us through this. Because the first book starts out really kind of positive. Where you're looking at this -- and there's some spooky consequences -- but you're looking at it going, you know, I could see us -- I'd kind of like that. And by the end, in the fourth book, we've all been digitized. And we're in a missile, leaving the solar system because earth is lost.

A, do you think this is -- is this your prediction, or do you just think this is a really good story?

WILLIAM: Well, you know, I think a lot of it has the potential to be real. And I think one of the things you probably know from reading my books is that I'm fairly balanced. What I see are the risks and the benefits. I think there's both.

GLENN: Yeah.

WILLIAM: I get very upset. There are so many people that are very dogmatic about artificial intelligence and the future. And they either say, hey, it's all benefits and there are no risks. Or they only talk about the risks without the benefits.

And, you know, there's a mix of both. And it's like any other technology. Right?

GLENN: We don't know.

WILLIAM: All of our smartphones -- we all find our smartphones to be indispensable. And at the same point in time, they affect us. Right? And they have negative effects. And society is different today than it was years ago because of our smartphones.

GLENN: But this is different though than anything else that we've seen like a smartphone. Because this is -- this is like, you know, an alien intelligence.

We don't have any way to predict what it's going to be like, or what it's going to do. Because it will be thinking. And it most likely will not be thinking like a human.

But can we start at the beginning, where, just give me some of the benefits that will be coming in the next, let's say, ten years that people will have a hard time saying no to.

WILLIAM: Sure. I mean, first of all, we already talked about self-driving cars, right? I think we all like to get into our car and be able to do whatever we want to do and not have to think about driving. That's going to free us up from a mundane task.

We're going to see a lot more automation in the workplace. Which means that the cost of goods and services will go down. So we'll be able to get more from less. So that will seem like an economic boom, to those of us that will afford it. Right? We will be able to enjoy more things. We'll have better experiences when we interact with AI. So today, if you have to go to the doctor, you'll wait to get a doctor's appointment. You'll go in. You'll have this rushed experience, more than likely, if you're here in the US. You'll get five minutes of their time, and you're hoping they will make the right diagnosis in the five minutes they're with you. That's going to be I think one of the really big changes over the five, ten years from now is we'll see a lot more AI-driven diagnosis.

So when you're having medical issues, you can go in, and you can talk to an AI that will be more or less indistinguishable from talking to the nurse when you walk into the doctor's office.

And by the time the doctor sees you, there will already be a diagnosis made by the AI. And it likely will be more accurate than what the doctor would have done. And all they'll do is sign off on it.

GLENN: Yeah, I had a hard time -- until I started reading about Watson, I had a hard time believing that people would accept something from a machine. But they are so far ahead of doctors, if they're fed enough information.

They're so far ahead on, you know, predicting cancer and diagnosing cancer than people are. I think it's going to be a quick change. You're going to want to have the AI diagnose you.

WILLIAM: Right. Because that's going to be the best. Right? When we go to the doctor, we want the best. We don't want the second best.

GLENN: Right.

WILLIAM: So we're going to see a lot of that. And then, you know, ten to 15 years out -- you know, it's funny, I had a conversation with my daughter one day, and she asked, hey, Dad, when am I going to get to drive a car?

And I thought about her age, and I thought about that. And I was like, well, I'm not sure you're ever going to get to drive a car. Because where you are and when self-driving cars are coming, you may never drive a car.

And so you'll just get one, and it will take you where you want to go.

So there are going to be both subtle and yet dramatic changes in society when you think about, hey, we're going to have a generation of people who will never have learned how to drive a car. Right? So their time will be free to do other things. They'll be different than we are.

GLENN: Do you see the -- you know, in your first book, you talk about, you know, AI changing, you know, the emails that are being sent and doing things on its own. And really manipulating people.

We are already at the point where we accept the manipulation of what we see in our Facebook feed. But that's not a machine trying to do anything but give us what we want.

WILLIAM: Right.

GLENN: Do you see us very far away from, you know, hedge fund computers that can -- that can really manipulate the markets in a positive way or computers that can begin to manipulate for peace, as you put in your book, your first one?

WILLIAM: It's a good question. We're definitely going to see that. At a minimum, right? We can imagine that if you have an authoritarian government, they're going to distribute information to pacify people.

And that's not a good thing often. In some ways, it is. You know, if you have armed unrest, people will die. So there's a balance there. I think what we'll see is we'll just see lots of different people use technology in lots of different ways.

So maybe we don't have, you know, a hedge fund manipulating the markets in a positive way. Maybe it starts with a bunch of hackers in another country, manipulating the markets to make money. Right?

So I think we are going to see that distribution, that manipulation of information. And it's hard.

It's out there now, right? A lot of the content that you read on the web, whether it's a review of a restaurant or a business, a lot of it is generated by AI. And it's hard to tell what's AI versus a person writing a genuine review.

GLENN: Talking to William Hertling. He's an author and futurist. Author of a great series of novels called The Singularity Series. William, the -- the idea that intelligent -- not AI. Not narrow AI. But, you know, super intelligence or artificial general intelligence just kind of comes out of nowhere, as it does in your first novel, where it wasn't the intent of the programmer, is interesting to me.

I sat with a -- one of -- a bigger name from Silicon Valley, just last week. And we were talking about this. And he said, whoever controls AI, whoever gets this first is going to control the world.

He was talking to me privately about a need for almost a Manhattan Project for this. Do you see this as something that is just going to be sprung on us, or will it be taken, you know, in a lab? Intentionally?

WILLIAM: I think the odds are probably strongly biased towards in a lab. Both because they have the kind of deeper knowledge and expertise, and because they have the kind of raw computing power, right? So the folks at Google will have millions of times the computing power of somebody who is outside a company like Google. So that alone -- it's like they have the computers everyone else will have in 15 to 20 years, right? That kind of computing power. And that makes AI a lot easier of a problem to solve.

So I think it's most likely to come out of a lab.

GLENN: If you're looking at, for instance, the lawsuit that was just filed against Google about the way they treat people with different opinions, et cetera, et cetera. My first thought is, good God, what are those people putting into the programming?

I mean, that -- that doesn't -- that doesn't work out well for people. Is there enough -- are there enough people that are concerned about what this can do and what this can be, that we have the safeguards with people?

WILLIAM: You know, I -- I really think we don't. I mean, think about the transportation system we have today and the robust set of safety mechanisms we have around it. Right?

So we want to drive from one place to another. We have a system of streets. We have laws that govern how you drive on those streets. We have traffic lights. Cars have antilock brakes. They have traction control. All these things are designed to prevent an accident.

If you get into an accident, we have all these harm reduction things. Right? We have seatbelts and airbags. After the fact, we have a whole system of litigation, right? We have ambulances and paramedics and hospitals to take care of the damage that results. In the future, we'll need that same sort of very robust system for AI. And we don't have anything like that today.

GLENN: And nobody is really thinking about it. Which is --

WILLIAM: Yeah, nobody is thinking about it comprehensively. And one thing you can imagine is, well, we'll wait until we have a problem, and then we'll put those safety mechanisms in place.

Well, the problem, of course, is that AI works at the speed of computers, not at the speed of people. And there's this scene in one of my books -- I'm sure you remember reading it -- where there's a character who witnesses a battle between two different AI factions.

GLENN: Yes.

WILLIAM: And the whole battle takes place, a lot of things happen between the two different AI factions, all in the time it takes the human character's adrenaline to get pumping.

And by the time he is primed and ready to fight, the battle is over. And they're into negotiations on how to resolve it, right?

GLENN: It's remarkable in reading that. That's a great understanding of -- of how fast this will -- things will move.

It's like one of the best action-novel war scenes I've ever read. Really, really good. You know, page after page after page of stuff happening. And you get to the end, and you realize, "Oh, my gosh, the human has hardly even moved. He hasn't even had a chance to think about the first step that happened." And it's already over.

WILLIAM: Exactly. So this is why we need to be thinking about, how are we going to control AI? How are we going to safeguard ahead of time? We have to have these things in place, long before we actually have AI.

STU: Isn't it true though, William, that eventually some bad actor is going to be able to develop this and not put those safeguards in? And we're not going to have a choice. Eventually, the downside of this is going to affect everybody.

WILLIAM: You know, it's very true. And it's part of the reason why I say we can't opt out of AI, right? We can't not develop it. Because then we're just at a disadvantage to someone who does. And it gets even scarier as you move out. So one of the things I talk about in my third book, which is set around 2035, is neural implants -- basically a computer implanted in your brain, the purpose of which is mostly to get information in and out. Right? Instead of having a smartphone in our hands where we're trying to read information on the screen, we can get it directly in our head. It makes interaction much smoother, easier. But it can also help tailor your brain chemistry. Right? So you can imagine, if you're someone who has depression or anxiety or a severe mental disability, that a neural implant could correct those things. You basically would be able to flip a switch and turn off depression or turn off anxiety.

STU: Wow.

GLENN: So, William, I'm unfortunately out of time. Could I ask you to come back tomorrow and start there? Because that's really the third book. Start with the neural implants and where it kind of ends up with technology. Because it is remarkable. And in reading the real science behind it, it's real. It's real.

WILLIAM: It sure is. It's coming.

GLENN: Yeah. Could you come back maybe tomorrow?

WILLIAM: Sure. I would be happy to.

GLENN: Okay. Thank you so much, William. William Hertling, author and futurist. He is the author of The Singularity Series.

STU: You should get one of those things, Glenn. That thing could alter your brain. William Hertling is the author of all these books. There's four of them in this series, The Singularity Series. Plus, Kill Process just came out. That's WilliamHertling.com.

Let me ask you this, Glenn. Is this the right way to think about it? This comes in from Twitter, @worldofStu: how to understand the difference between AI, artificial intelligence, and AGI, artificial general intelligence.

So if there's a self-driving car, and it's AI, you say, take me to the bar, and it says calculating route. Beginning travel.

Okay? If you say it to AGI, take me to the bar, it responds, your wife says you drink too much and my sensors say you put on a few pounds, routing to the gym.

GLENN: I have a feeling, you're exactly right.

STU: That's terrible.

Today is the 75th anniversary of D-Day, the largest amphibious invasion in history.

The Allied invasion force included 5,000 ships and landing craft, 11,000 planes, and almost three million Allied soldiers, airmen, and sailors. Despite such numbers, the location and timing of the invasion was still an enormous gamble. The Nazis fully expected such an invasion; they just didn't know precisely when or where it would be.

Despite the enormous logistics involved, the gamble worked and by the end of June 6, 1944, 156,000 Allied troops were ashore in Normandy. The human cost was also enormous – over 4,900 American troops died on D-Day. That number doubled over the next month as they fought to establish a foothold in northern France.

There were five beach landing zones on the coast of northwestern France, divided among the Allies. They gave each landing zone a name. Canada was responsible for "Juno." Britain was responsible for "Gold" and "Sword." And the U.S. had "Utah" and "Omaha."

The Nazis were dug in with bunkers, machine guns, artillery, mines, barbed wire, and other obstacles to tangle any attempt to come ashore. Of the five beaches, Omaha was by far the most heavily defended. Over 2,500 U.S. soldiers were killed at Omaha – the beach so famously depicted in the opening battle sequence of the 1998 movie, Saving Private Ryan. The real-life assault on Omaha Beach included 34 men in that first wave of attack who came from the same small town of Bedford, Virginia. The first Americans to die on Omaha Beach were the men from Bedford.

America has a national D-Day Memorial, but many people don't know about it. Maybe that's because it wasn't a government project and it's not in Washington DC. It was initiated and financed by veterans and private citizens. It's tucked away in the foothills of the Blue Ridge Mountains, in the small town of Bedford, Virginia. Why is the memorial for one of the most famous days in modern world history in such a tiny town? Because, as a proportion of its population of just 3,200 at the time, no community in the U.S. sacrificed more men on D-Day than Bedford.

There were 34 men in Company A from Bedford. Of those 34, 23 died in the first wave of attacks. Six weeks after D-Day, the town's young telegraph operator was overwhelmed when news of many of the first deaths clattered across the Western Union line on the same day. Name after name of men and families that she knew well. There were so many at once that she had to enlist the help of customers in the pharmacy's soda shop to deliver them all.

Among those killed in action were brothers Bedford and Raymond Hoback. Bedford was the rambunctious older brother with a fiancée back home that he couldn't wait to return to. Raymond was the quieter, more disciplined younger brother who could often be found reading his Bible. He fell in love with a British woman during his two years in England training for D-Day. Like in that opening sequence of Saving Private Ryan, Bedford and Raymond barely made it down the ramp of their Higgins Boat in the swarm of bullets and hot steel before they were cut down in the wet sand.

Bedford and Raymond Hoback's mother, Macie, learned of both their deaths from two separate telegrams, the first on a Sunday morning, the second the following day. Their younger sister, Lucille, remembered her mother's devastation, and her father walking out to the barn to cry.

The day after D-Day, the killing field of Omaha Beach was already transforming into the massive supply port that would help fuel the American drive all the way to Berlin over the next year. A soldier from West Virginia was walking along the beach when he saw something jutting out of the sand. He reached down and pulled it out. He was surprised to find it was a Bible. The inside cover was inscribed with: "Raymond S. Hoback, from mother, Christmas, 1938." The soldier wrote a letter and mailed it with the Bible to Raymond's mother. That Bible, which likely tumbled from Raymond's pack when he fell on D-Day, became Macie Hoback's most cherished possession – the only personal belonging of her son that was ever returned.

Of the 23 Bedford men who died on Omaha Beach, eleven were laid to rest in the American cemetery in Normandy.

These men, many of them barely out of their teens, didn't sign up to march to the slaughter, of course. They had hopes and dreams just like you and me. Many of them signed up for adventure, or because of peer pressure, and yes, a sense of honor and duty. Many of the Bedford Boys first signed up for the National Guard just to make a few extra bucks per month, hang out with their buddies, and enjoy target practice. But someone had to be first at Omaha Beach, and that responsibility fell to the men from Bedford.

Over the last several years, the D-Day anniversary has gotten increasingly sad. Because each year, there are fewer and fewer men alive who were actually in Normandy on June 6, 1944. The last of the surviving Bedford Boys died in 2009. Most of the remaining D-Day veterans who are still with us are too frail to make the pilgrimage to France for the anniversary ceremonies like they used to.

It's difficult to think about losing these World War II veterans, because once they're all gone, we'll lose that tether to a time when the nation figured out how to be a better version of itself.

Not that they were saints and did everything right. They were as human as we are, with all the fallibility that entails. But in some respects, they were better. Because they went, and they toughed it out, and they accomplished an incredibly daunting mission, with sickening hardship, heartbreak, and terror along the way.

So, what does the anniversary of D-Day mean in 2019?

In one sense, this anniversary is a reprimand that we've failed to tell our own story well enough. You can't learn about the logistics of the operation and above all, the human cost, and not be humbled. But as a society, we have not emphasized well enough the story of D-Day and all that it represents. How can I say that? Because of an example just last weekend, when common sense got booed by Democratic Socialists at the California Democrats' State Convention. When Democratic presidential candidate John Hickenlooper said during his speech that "socialism is not the answer," the crowd booed loudly. When did telling the truth about socialism become controversial?

Sure, socialists, and communists and other anti-American factions have always been around. America certainly had socialists in 1944. But the current socialists trying to take over the Democratic Party like a virus don't believe in the D-Day sacrifices to preserve America, because they don't believe America is worth preserving. They are agitating to reform America using the authoritarian playbook that has only ended in death and destruction everywhere it is followed.

Ask a Venezuelan citizen, or an Iraqi Christian, or a North Korean peasant why D-Day still matters in 2019.

The further we move away from caring about pivotal events like June 6, 1944, the less chance of survival we have as a nation.

At the same time, the D-Day anniversary is a reminder that we're not done yet. It's an opportunity for us to remember and let that inform how we live.

Near the end of Saving Private Ryan, the fictional Captain Miller lies dying, and he gives one last instruction to Private Ryan, the young man that he and his unit have sacrificed their lives to rescue in Normandy. He says, "Earn it."

In other words, don't waste the sacrifices that were made so that your life could be saved. Live it well. The message to "earn it" extends to the viewer and the nation as well – can we say we're earning the sacrifices that were made by Americans on D-Day? I cringe to think how our few remaining World War II veterans might answer that.

Honor. Duty. Sacrifice. Gratitude. Personal responsibility. These used to mean a lot more. I don't want to believe it's too late for us to rediscover those traits as a nation. I want to believe we can still earn it.

The challenge to "earn it" is a lot of pressure. Frankly, it's impossible. We can't fully earn the liberty that we inherited. But we can certainly try to earn it. Not trying is arrogant and immoral. And to tout socialism as the catch-all solution is naïve, and insulting to the men like those from Bedford who volunteered to go defend freedom. In truly striving to earn it, we help keep the flame of liberty aglow for future generations. It is necessary, honorable work if freedom is to survive.

The end of Lincoln's Gettysburg Address is remarkably relevant for every anniversary of June 6, 1944. This is what D-Day still means in 2019:

"It is rather for us to be here dedicated to the great task remaining before us – that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion – that we here highly resolve that these dead shall not have died in vain – that this nation, under God, shall have a new birth of freedom – and that government of the people, by the people, for the people, shall not perish from the earth."

Letter from Corporal H.W. Crayton to Mr. and Mrs. Hoback – parents of Bedford and Raymond Hoback who were both killed in action on June 6, 1944

July 9, 1944
Somewhere in France

Dear Mr. & Mrs. Hoback:

I really don't know how to start this letter to you folks, but will attempt to do something in words of writing. I will try to explain in the letter what this is all about.

While walking along the Beach D-day Plus One, I came upon this Bible and as most any person would do I picked it up from the sand to keep it from being destroyed. I knew that most all Bibles have names & addresses within the cover so I made it my business to thumb through the pages until I came upon the name above. Knowing that you no doubt would want the Book returned I am sending it knowing that most Bibles are a book to be cherished. I would have sent it sooner but have been quite busy and thought it best if a short period of time elapsed before returning it.

You have by now received a letter from your son saying he is well. I sincerely hope so.

I imagine what has happened is that your son dropped the Book without any notice. Most everybody who landed on the Beach D-Day lost something. I for one as others did lost most of my personal belongings, so you see how easy it was to have dropped the book and not know about it.

Everything was in such a turmoil that we didn't have a chance until a day or so later to try and locate our belongings.

Since I have arrived here in France I have had occasion to see a little of the country and find it quite like parts of the U.S.A. It is a very beautiful country, more so in peace time. War does change everything as it has this country. One would hardly think there was a war going on today. Everything is peaceful & quiet. The birds have begun their daily practice, all the flowers and trees are in bloom, especially the poppies & tulips which are very beautiful at this time of the year.

Time goes by so quickly as it has today. I must close hoping to hear that you receive the Bible in good shape.

Yours very truly,

Cpl. H.W. Crayton

It's not as easy as it used to be for billion-dollar entertainment empires like The Walt Disney Company. It would be more streamlined for Disney to produce its major motion pictures in its own backyard. After all, abortion in California is readily available, as well as a protected, cherished right. And since abortion access is critical for movie production, right up there with lighting equipment and craft services, you would think California would be the common-sense choice for location shooting. Alas, even billion-dollar studios must pinch pennies these days. So, in recent years, Disney, among other major Hollywood studios, has been farming out production to backwater Southern lands like Georgia, and even Louisiana. Those states offer more generous tax breaks than Disney's native California. As a result, Georgia for example, played host to much of the shooting for the recent worldwide box office smash Avengers: Endgame.

But now it looks like it's Georgia's endgame. The state recently passed what is known as a "heartbeat" bill – a vicious, anti-woman law that would try to make pregnant women allow their babies to be born and actually live. It's a bridge too far for a major studio like Disney, which was largely built on creating family entertainment. How can Disney possibly go about making quality movies, often aimed at children, without access to unfettered abortion? It's unconscionable. Lack of abortion access makes it nearly impossible to shoot movies. So, what's a major studio to do? Disney might have considered migrating its business to Louisiana, but that state too has now signed a heartbeat bill into law. It's utter madness.

These monstrous anti-abortion bills, coupled with having to live under President Trump, have led Disney to seek a new home for its legendary movie magic. Last week, Disney's CEO, Bob Iger, announced that all future Disney movies will now be filmed on location in the Sub-Saharan African nation of Wakanda.

"Disney and Wakanda are a match made in heaven," Iger told reporters. "Wakanda was, until recently, a secret kingdom, much like our own Magic Kingdom. With this new partnership, we'll not only get to continue our legacy of making movies that parents and children everywhere enjoy together, but we'll get to do so in a safe space that reveres abortion as much as we do."

As home to the most advanced technology in the world – and with the planet's highest per-capita concentration of wokeness – Wakanda offers women painless, hassle-free abortion on demand. As the Wakandan health ministry website explains, the complete absence of any white-patriarchal-Judeo-Christian influence allows women in Wakanda to have complete control of their own bodies (with the exception of females who are still fetuses). As winner of the U.N.'s 2018 Golden Forceps award (the U.N.'s highest abortion honor), Wakanda continues its glowing record on abortion. That makes it an ideal location for Disney's next round of live-action remakes of its own animated movies, in which the company plans to remove all male characters.

Iger says he hopes to convince Wakandan leadership to share their top-secret vibranium-based abortion procedure technology so that American women can enjoy the same convenient, spa-like abortion treatment that Wakandan women have enjoyed for years.

Wakanda is one of only four African countries (out of 55) that allow unrestricted abortion. Disney plans to boycott and/or retaliate against the other 51 African nations, as well as any U.S. states, that restrict abortion. Specific plans are being kept under wraps, but sources say Disney's potential retaliation may include beaming Beverly Hills Chihuahua into the offending territories on a continuous, indefinite loop.

When asked how Wakanda's futuristic capital city and distinctly African landscape would be able to double for American movie locations, Iger said, "I guess America will just have to look more like Wakanda from now on."

One potential wrinkle for the Left-leaning studio is the fact that Wakanda has an impenetrable border wall-shield-thing designed to keep out foreign invaders as well as illegal immigrants. Iger said he understands Wakanda's policy of exclusivity, adding, "After all, not everyone gets into Disneyland. You have to have a ticket to get in. Anyone is welcome, but you have to go through the process of getting a ticket." When one reporter pointed out that Iger's answer sounded like the conservative argument for legal immigration under the rule of law, Iger insisted that the reporter was "a moronic fascist."

What if the unthinkable happens and Florida also enacts its own "heartbeat" law? That would be problematic since Walt Disney World is located in Florida. Iger responded that Disney would "cross that bridge if we get to it" but that the most likely scenario would entail "dismantling Disney World piece-by-piece and relocating it to the actual happiest place on earth – Wakanda." As for whether Disney would ever open character-themed abortion clinics inside its theme parks, Iger remained coy, but said, "Well, it is the place where dreams come true."

When pressed about the cost of ramping up production in a secretive African kingdom that has no existing moviemaking infrastructure (which could easily end up being much more expensive than simply shooting in California), Iger said, "You can't put a price tag on abortion freedom. Wakanda Forever and Abortion Forever!"

With the Wakanda solution, Disney may have found a place where Minnie Mouse can finally follow her heart and have true freedom of choice. And that will be welcome relief to traditional families all over the world who keep the Walt Disney Company in business.

*Disclaimer: The preceding story is a parody. Bob Iger did not actually say any of the quotes in the story. Neither is Wakanda an actual nation on planet Earth.

"Journeys of Faith with Paula Faris," is a podcast featuring conversations about how faith has guided newsmakers and celebrities through their best and worst times. The Church of Jesus Christ of Latter-Day Saints is a much maligned religion so Glenn joined the podcast and took the time to explain what it means to him and how it changed his life.

Through his suicidal days and his battle with drugs and alcohol, it was his wife, Tania, and his faith that saved him. All his ups and downs have given him the gift of empathy, and he says he now understands the "cry for mercy" — something he wishes he'd given out more of over the years.

You can catch the whole podcast on any of the platforms listed below.

- Apple Podcasts
- Google Podcasts
- TuneIn
- Spotify
- Stitcher
- ABC News app

One of these times I'm going to go on vacation, and I'm just not going to come back. I learn so much on a farm.

You want to know how things work, go spend a summer on a farm. You're having problems with your son or daughter, go spend a summer on a farm.

My son changed. Over two weeks.

Getting him out of bed, getting him to do anything, is like insane. He's a 15-year-old kid. Going all through the normal 15-year-old boy stuff. Getting him on the farm, where he was getting up and actually accomplishing stuff, having to build or mend fences, was amazing. And it changed him.

Our society does not allow our kids to grow up, ever. I am convinced that our 15-year-olds could be fixing all kinds of stuff. Could be actually really making an impact in a positive way in our society. And what's wrong with our society is, we have gotten away from how things actually work. We're living in this theoretical world. When you're out on a farm, there's no theory here. If it rains, the crops will grow. If it rains too much, the crops won't grow.

If there's no sun, they won't grow. If there's too much sun, they'll shrivel up and die. There's no theory. We were out mending fences. Now, when I say the phrase to you, mending fences, what does that mean? When you think of mending fences, you think of, what?

Coming together. Bringing people together. Repairing arguments.

I've never mended a fence before until I started stringing a fence and I was like, "I ain't doing this anymore! Where is it broken? Can't we just tie a piece of barbed wire together?"

That's called mending fences.

And why do you mend fences? So your animals don't get out and start to graze on somebody else's land. When your fence goes down, your cow is now on somebody else's land. And your cow is now eating their food.

We look at the phrase "mending fences" as saying, "Hey, you know, we were both wrong." But mending fences has nothing to do with that.

Mending fences means build a wall. My neighbors and I, we're going to get along fine, as long as my cows don't go and steal their food, or their cows don't come over and steal my cow's food.

We're perfectly neighborly with each other, until one of us needs to mend a fence, because, dude, you got to mend that, because your cows keep coming over and eating my food.

You know what we need to do with Mexico? Mend fences.

Now, that's a phrase. You hear build a wall. That's horrible.

No, no, no. We need to mend fences.

In a farming community, that means putting up an electric fence. That means putting up barbed wire.

So the cows — because the cows will — they'll stick their head through barbed wire. And they'll eat the grass close to the road. Or eat the grass close to the other side of the fence. And they'll get their heads in between those fences. And they can't get out sometimes. Because the grass is always greener on the other side. You look at these damn cows and say turn around, cow — there's plenty of stuff over here.

No. They want the grass on the other side of the fence.

So you mend it.

And if it's really bad, you do what we do. We had to put an electric fence up. Now, imagine putting an electric fence up. That seems pretty radical and expensive.

Does it really work? Does it shock them? What does that feel like to a cow?

The cows hit it once, and then they don't hit it again. They can actually hear the buzz of the electric fence. There's a warning. Don't do it. Don't do it. They hear the current and they hit it once and they're like, "I'm not going to do that again."

So you mend fences, which means, keep your stuff on your side. I like you. We're good neighbors. You keep your stuff on your side and I'll keep my stuff on my side and we'll get together at the town hall and we'll see each other at the grocery store. Because we're good neighbors. But what stops us from fighting is knowing that there is a fence there.

This is my stuff. That's your stuff. But we can still trade and we'll help each other. But let's stop talking about building a wall. Because that has all kinds of negative imagery. Mending fences is what we need to do.

You can have a tough fence. It could be a giant wall. It could be an electric fence. But you need one. And that's how you come together.

The side that's having the problem mends the fence.