Lost Wars

The terrorist attacks of September 11, 2001, profoundly changed America’s national security equation, perhaps forever.

Those attacks spawned the wars in Afghanistan and Iraq, which have all but consumed the U.S. military for more than a decade and a half.

There have been several books, some good and some less so, that have tried to help us come to grips not only with why we embarked upon these wars but also with why we can’t “win.”

Andrew Bacevich’s review of Daniel Bolger’s book, “Why We Lost,” offers some key insights. Here is how he begins:

The author of this book has a lot to answer for. “I am a United States Army general,” Daniel Bolger writes, “and I lost the Global War on Terrorism.” The fault is not his alone, of course. Bolger’s peers offered plenty of help. As he sees it, in both Afghanistan and Iraq, abysmal generalship pretty much doomed American efforts.

The judgment that those wars qualify as lost — loss defined as failing to achieve stated objectives — is surely correct. On that score, Bolger’s honesty is refreshing, even if his explanation for that failure falls short. In measured doses, self-flagellation cleanses and clarifies. But heaping all the blame on America’s generals lets too many others off the hook.

Why exactly did American military leaders get so much so wrong? Bolger floats several answers to that question but settles on this one: With American forces designed for short, decisive campaigns, the challenges posed by protracted irregular warfare caught senior officers completely by surprise.

Since there aren’t enough soldiers — having “outsourced defense to the willing,” the American people stay on the sidelines — the generals asked for more time and more money. This meant sending the same troops back again and again, perhaps a bit better equipped than the last time. With stubbornness supplanting purpose, the military persisted, “in the vain hope that something might somehow improve.”

Want more? You can read the full article here

AI and You!

Few subjects today have captured the public’s imagination more than artificial intelligence (AI) and machine learning. A niche tech subject just a few years ago, AI has now gone mainstream.

Part of this is because we are surrounded by digital apps like Siri and Cortana that inform and entertain us daily (just ask Siri, “What is zero divided by zero?”).

But AI will play a much more profound role in our lives in the future, though we may have to wait for it. Here is part of what Steve Lohr shared recently in a New York Times piece:

There are basically three big questions about artificial intelligence and its impact on the economy: What can it do? Where is it headed? And how fast will it spread?

Three new reports combine to suggest these answers: It can probably do less right now than you think. But it will eventually do more than you probably think, in more places than you probably think, and will probably evolve faster than powerful technologies have in the past.

This bundle of research is itself a sign of the A.I. boom. Researchers across disciplines are scrambling to understand the likely trajectory, reach and influence of the technology — already finding its way into things like self-driving cars and image recognition online — in all its dimensions. Doing so raises a host of challenges of definition and measurement, because the field is moving quickly — and because companies are branding things A.I. for marketing purposes.

An “AI Index,” created by researchers at Stanford University, the Massachusetts Institute of Technology and other organizations, released on Thursday, tracks developments in artificial intelligence by measuring aspects like technical progress, investment, research citations and university enrollments. The goal of the project is to collect, curate and continually update data to better inform scientists, businesspeople, policymakers and the public.

Want more? You can read the full article here

Thinking Well?

As human beings, we pride ourselves on being rational…after all…we’re not lemmings running off the edge of a cliff…right?

I thought we were, that is, until I read a short op-ed by David Brooks. Here is part of what he said about how rational we are:

Richard Thaler has just won an extremely well deserved Nobel Prize in economics. Thaler took an obvious point, that people don’t always behave rationally, and showed the ways we are systematically irrational.

Thanks to his work and others’, we know a lot more about the biases and anomalies that distort our perception and thinking, like the endowment effect (once you own something you value it more than before you owned it), mental accounting (you think about a dollar in your pocket differently than you think about a dollar in the bank) and all the rest.

It’s when we get to the social world that things really get gnarly. A lot of our thinking is for bonding, not truth-seeking, so most of us are quite willing to think or say anything that will help us be liked by our group. We’re quite willing to disparage anyone when, as Marilynne Robinson once put it, “the reward is the pleasure of sharing an attitude one knows is socially approved.” And when we don’t really know a subject well enough, in T. S. Eliot’s words, “we tend always to substitute emotions for thoughts,” and go with whatever idea makes us feel popular.

Want more? You can read the full article here

The Great War?

For most Americans today, World War I is consigned to the history books. We learned that the United States entered the war reluctantly but fought the good fight. We also got the notion that one result of the war was that America became a great power, and grew greater still over the course of the 20th century.

That’s why I found Michael Kazin’s New York Times piece, “The Great Mistake in the Great War,” so interesting. Here is how he began:

One hundred years ago, Congress voted to enter what was then the largest and bloodiest war in history. Four days earlier, President Woodrow Wilson had sought to unite a sharply divided populace with a stirring claim that the nation “is privileged to spend her blood and her might for the principles that gave her birth and happiness and the peace which she has treasured.” The war lasted only another year and a half, but in that time, an astounding 117,000 American soldiers were killed and 202,000 wounded.

Still, most Americans know little about why the United States fought in World War I, or why it mattered. The “Great War” that tore apart Europe and the Middle East and took the lives of over 17 million people worldwide lacks the high drama and moral gravity of the Civil War and World War II, in which the very survival of the nation seemed at stake.

World War I is less easy to explain. America intervened nearly three years after it began, and the “doughboys,” as our troops were called, engaged in serious combat for only a few months. More Americans in uniform died away from the battlefield — thousands from the Spanish flu — than with weapons in hand. After victory was achieved, Wilson’s audacious hope of making a peace that would advance democracy and national self-determination blew up in his face when the Senate refused to ratify the treaty he had signed at the Palace of Versailles.

But attention should be paid. America’s decision to join the Allies was a turning point in world history. It altered the fortunes of the war and the course of the 20th century — and not necessarily for the better. Its entry most likely foreclosed the possibility of a negotiated peace among belligerent powers that were exhausted from years mired in trench warfare.

Intrigued? You can read the entire article here

The Writing Process

There are a few writers who help define what writing is for all of us. John McPhee is one of them. That’s why I was intrigued by a review of his newest book, “Draft No. 4.” Here is part of what the reviewer had to offer:

Followers of John McPhee, perhaps the most revered nonfiction narrative journalist of our time, will luxuriate in the shipshape prose of “Draft No. 4: On the Writing Process,” a collection of eight essays that first appeared in The New Yorker, his home for more than 50 years. Writers looking for the secrets of his stripped-bark style and painstaking structure will have to be patient with what is a discursive, though often delightful, short book. McPhee’s publisher is presenting it as a “master class,” but it’s really a memoir of writing during a time of editorial cosseting that now seems as remote as the court of the Romanovs. Readerly patience will be rewarded by plentiful examples of the author’s sinewy prose and, toward the end, by advice and tips that will help writers looking to become better practitioners of the craft and to stay afloat in what has become a self-service economy.

Virtually no part of McPhee’s long career, full of months-long or years-long research trips and hours or days staring at a blank computer screen, resembles the churn-it-out grind of today’s professional web writer. Except the earliest part, which he returns to often: the English class at Princeton High School whose teacher, Mrs. McKee, made him write three pieces a week (“Not every single week. Some weeks had Thanksgiving in them”) for three solid years and encouraged her students to critique one another, to the point of hissing and spitballs. Her constant deadlines led him to devise a crucial tactic: Force yourself to break from “wallowing in all those notes” and determine an ending, then go back to worrying about the beginning. Which leads to the first formal rule he provides, and then only a quarter of the way through the book: When you’re getting nowhere and “you don’t know what to do. Stop everything. Stop looking at the notes. Hunt through your mind for a good beginning. Then write it. Write a lead.”

Want more? You can read the full article here

Challenges

Victor Davis Hanson is a force of nature. Recently, he commented on the state of our nation and the challenges we’ve built for ourselves. Here’s how he began:

Our Baby Boomer elites, mired in excess and safe in their enclaves, have overseen the decay of our core cultural institutions.

Since the Trojan War, generations have always trashed their own age in comparison to ages past. The idea of fated decadence and decline was a specialty of 19th-century German philosophy.

So we have to be careful in calibrating generations, especially when our own has reached a level of technology and science never before dreamed of (and it is not a given that material or ethical progress is always linear).

Nonetheless, the so-called Baby Boomers have a lot to account for — given the sorry state of entertainment, sports, the media, and universities.

The Harvey Weinstein episode revealed two generational truths about Hollywood culture.

One, the generation that gave us the free-love and the anything-goes morals of Woodstock discovered that hook-up sex was “contrary to nature.” Sexual congress anywhere, any time, anyhow, with anyone — near strangers included — is not really liberating and can often be deeply imbedded within harassment and ultimately the male degradation of women.

Somehow a demented Harvey Weinstein got into his head that the fantasy women in his movies who were customarily portrayed as edgy temptresses and promiscuous sirens were reflections of the way women really were in Los Angeles and New York — or the way that he thought they should be. It was almost as if Weinstein sought to become as physically repulsive and uncouth as possible — all the better to humiliate (through beauty-and-the-beast asymmetry) the vulnerable and attractive women he coerced.

Want more? You can read the full piece here

Intellectual Property

There was a time when movies were based on either a book (typically a very good book) or an original screenplay (typically by a great screenwriter). That was then; this is now.

I’d always had the notion that something was changing, but I found Alex French’s article in The New York Times Magazine, “How to Make a Movie Out of Anything — Even a Mindless Phone Game,” so revealing, and so frightening. Here is how he began:

In 2013 a movie producer named Tripp Vinson was thumbing through Variety when he stumbled upon a confounding item: Phil Lord and Christopher Miller, a pair of writers and directors, were working on something called ‘‘The Lego Movie.’’ Vinson was baffled. ‘‘I had no idea where they were going to go with Legos,’’ he says. ‘‘There’s no character; no narrative; no theme. Nothing.’’

Since Vinson got into the business, something has changed in Hollywood. More and more movies are developed from intellectual property: already existing stories or universes or characters that have a built-in fan base. Vinson thinks it started in 2007, when the Writers Guild went on strike. ‘‘Before the strike, the studios were each making 20-­something movies a year,’’ he says. ‘‘Back then, you could get a thriller made. After the strike, they cut back dramatically on the number of films they made. It became all about I.P.’’ — intellectual property. With fewer bets to place, the studios became more cautious. ‘‘The way to cut through the noise is hitching yourself onto something customers have some exposure to already,’’ he says. ‘‘Something familiar. You’re not starting from scratch. If you’re going to work in the studio system, you better have a really big I.P. behind you.’’

This trend toward I.P.-­based movies has been profound. In 1996, of the top 20 grossing films, nine were live-­action movies based on wholly original screenplays. In 2016, just one of the top 20 grossing movies, ‘‘La La Land,’’ fit that bill. Just about everything else was part of the Marvel universe or the DC Comics universe or the ‘‘Harry Potter’’ universe or the ‘‘Star Wars’’ universe or the ‘‘Star Trek’’ universe or the fifth Jason Bourne film or the third ‘‘Kung Fu Panda’’ or a super-­high-­tech remake of ‘‘Jungle Book.’’ Just outside the top 20, there was a remake of ‘‘Ghostbusters’’ and yet another version of ‘‘Tarzan.’’

Want more? You can read the full article here

Super Soldiers

America has depended on the men and women of the Special Operations Command to deal with the threats of this century. Army Rangers, Navy SEALs, Air Force and Marine Corps special operators and others have been on the front lines, “on the wall,” protecting us from enemies who would do us harm.

But few know the history of special operations, and fewer still know it began beyond our borders. That’s why I found Max Boot’s review of “Rogue Heroes,” Ben Macintyre’s history of Britain’s SAS, so fascinating. Here is part of what he said:

Once upon a time, when the president wanted to use military force without becoming embroiled in a major conflict, the cry would go out: “Send in the Marines!” Today the role once played by the Marine Corps — as the troops of choice for low-profile missions without a formal declaration of war — has been largely supplanted by the United States Special Operations Command. With tens of thousands of “operators” and a multibillion-dollar budget, Socom has become virtually an independent military service.

Given the ubiquity and importance of Special Operations today, it is a little startling to realize just how novel they are. While there have long been specialized units, like Rogers’ Rangers of the French and Indian War, professional Special Operations forces date back only to World War II. All of the combatants employed them, but it was the British who were most assiduous in creating small units of swashbucklers.

The regular army establishment, of course, sniffed at the idea of a self-proclaimed military elite, and not without cause. Field Marshal William Slim, the liberator of Burma, wrote, “Armies do not win wars by means of a few bodies of supersoldiers but by the average quality of their standard units.” But Winston Churchill was enchanted by the supersoldiers and countenanced the creation of myriad units like the Commandos, the Long Range Desert Group, Popski’s Private Army, the Special Operations Executive, the Special Boat Service and the Chindits.

None were more storied than the Special Air Service (S.A.S.), which survives to this day and inspired the creation of foreign counterparts like the United States Delta Force and the Israeli Sayeret Matkal. The origins of the S.A.S. are recounted with verve by the veteran British historian and journalist Ben Macintyre, who has made a specialty of writing about clandestine operations in World War II and beyond. (His most recent book was about the British double agent Kim Philby.) This is hardly the first time the S.A.S. story has been told — a number of its veterans wrote entertaining memoirs, among them Fitzroy Maclean’s “Eastern Approaches” — but “Rogue Heroes” is the best and most complete version of the tale, because Macintyre was granted access to a hitherto-secret scrapbook known as the SAS War Diary.

Want more? You can read the full article here

Up or Down

There is a lot of bad news out there: church shootings, North Korean nukes, catastrophic storms, and on and on. It’s easy to wonder if the world is going to hell in a handbasket.

That’s why I found a recent piece by David Brooks so revealing as well as uplifting. Here is part of what he said:

The popular gloom notwithstanding, we’re actually living in an era of astounding progress. We’ve seen the greatest reduction in global poverty in history. As Steven Pinker has documented, we’ve seen a steady decline in wars and armed conflict. The U.S. economy is the best performing major economy in the developed world.

In 1980 the U.S. had a slight edge in G.D.P. per capita over Germany, Japan, France and the U.K. But the U.S. has grown much faster than the other major economies over the past 37 years, so that now it produces about $54,000 of output per capita compared with about $39,000 for Japan and France.

During the mid-20th century the West developed a group-oriented culture to deal with the Great Depression and the World Wars. Its motto could have been “We’re in this together.” That became too conformist and stultifying. A new individualistic culture emerged (pivot) whose motto could have been “I’m free to be myself.” That was great for a time, but excessive individualism has left society too fragmented, isolated and divided (hatchet). Something new is needed.

Want more? You can read the full article here

Silicon Valley: Your Friend?

Almost from its inception, the World Wide Web produced public anxiety — your computer was joined to a network that was beyond your ken and could send worms, viruses and trackers your way — but we nonetheless were inclined to give these earnest innovators the benefit of the doubt. They were on our side in making the web safe and useful, and thus it became easy to interpret each misstep as an unfortunate accident on the path to digital utopia rather than as subterfuge meant to ensure world domination.

Now that Google, Facebook, Amazon have become world dominators, the questions of the hour are, can the public be convinced to see Silicon Valley as the wrecking ball that it is? And do we still have the regulatory tools and social cohesion to restrain the monopolists before they smash the foundations of our society?

By all accounts, these programmers turned entrepreneurs believed their lofty words and were at first indifferent to getting rich from their ideas. A 1998 paper by Sergey Brin and Larry Page, then computer-science graduate students at Stanford, stressed the social benefits of their new search engine, Google, which would be open to the scrutiny of other researchers and wouldn’t be advertising-driven. The public needed to be assured that searches were uncorrupted, that no one had put his finger on the scale for business reasons.

Intrigued? You can read the entire article here