The reports of American decline are greatly exaggerated

America is wealthy, influential, powerful, and increasingly equal. Don't let anyone tell you otherwise.

The narrative of American decline gets louder every day. Our parents had it better. Our influence overseas has waned. Our country is poorer and more divided than ever.

We say these things, and we mean them in the moment, and our knowledge of history — or lack thereof — allows us to truly believe them. "American decline is the idea that the United States of America is diminishing in power geopolitically, militarily, financially, economically, socially, culturally, in matters of healthcare, and/or on environmental issues," observes a Wikipedia entry on the subject. "There has been debate over the extent of the decline, and whether it is relative or absolute," the article adds; yet that the decline is happening at all is presented as all but established fact.

Jason Fields

Jason Fields is a writer, editor, podcaster, and photographer who has worked at Reuters, The New York Times, The Associated Press, and The Washington Post. He hosts the Angry Planet podcast and is the author of the historical mystery "Death in Twilight."