2017-11-24

Feeling We're In This Together

Income Inequality, part 2

There are a lot of different ways to measure inequality: the top X percent versus the bottom Y percent. But any X or Y we might choose reveals about the same trends, and about the same differences between nations. One measure is called "the 20:20 ratio" -- it's the ratio of the income of the top 20 percent to the income of the bottom 20 percent. It's a very common metric, and the UN uses it, so let’s look at that one.

Update: Not much change. Data from the OECD (HERE) indicate that the 20:20 ratio for the US has stayed about the same over the most recent years for which data are available.
2013: 8.6
2014: 8.7
2015: 8.3
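The arithmetic behind that metric is simple: total income of the top quintile divided by total income of the bottom quintile. Here's a minimal sketch of the calculation -- the household incomes below are made up for illustration, not OECD data:

```python
# Compute the 20:20 ratio from a list of household incomes.
# The incomes below are illustrative, not real data.

def ratio_20_20(incomes):
    """Total income of the richest 20% divided by total income of the poorest 20%."""
    ordered = sorted(incomes)
    n = len(ordered) // 5            # households per quintile
    bottom = sum(ordered[:n])        # poorest 20%
    top = sum(ordered[-n:])          # richest 20%
    return top / bottom

sample = [12_000, 18_000, 25_000, 34_000, 41_000,
          52_000, 63_000, 78_000, 110_000, 260_000]
print(round(ratio_20_20(sample), 1))  # → 12.3 for this made-up sample
```

A sample like this one, with a ratio above 12, would be even more unequal than the recent US figures of 8 to 9.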

When the ratio of the top quintile to the bottom quintile is less than 5, then we find a society generally maintaining some shared assumptions about wealth and about each other.

Roughly, when that ratio is about 5 or less, the attitude of the populace will resemble something like this:
“If there are somewhat wealthier folks among us, that’s OK. I can accept that some people are luckier, or more skillful at work that society prizes, or they’re more driven to work hard, and they end up wealthier. That’s fine – and as it should be. The relatively wealthy serve as a reminder to me of what good schooling and hard work and a little luck might make available to my children. If the town doctor has a big house on a hill, that’s OK – he’s smart and had a lot of training, and he’s using that to help us when we get sick, so more power to him. Maybe my kid can get a scholarship and be a doctor.”
That kind of thinking was still pretty much the largely-unspoken norm on the day 37 years ago when I first held my newborn daughter in my arms.

But that attitude loses purchase, begins to slip away, if the rich-poor gap grows too large. The outlook that prevailed through my life and my parents' lives up until 1980 has now come to seem quaint -- an echo of a bygone time. Few, it seems, think like that anymore.

The two key features of that outlook were: (1) that the higher levels of wealth were attainable by those who weren't already rich; and (2) those who had wealth deserved it. These two features are connected, for when upper-class wealth seems attainable – when the perception of most people is that anyone with the right combination of talent, drive, and luck can become upper-class – then those who do make it to society’s top wealth echelons are presumed to deserve it. But when the gap becomes as enormous as it has in the US, the folks at the bottom and middle can no longer see the wealth of the ones at the top as either attainable or deserved.

By the time my little girl was graduating from college in 2000, the world she was commencing into had become profoundly different from the one she was born into. The country had become a place where we could no longer feel we were all in this together.

Now, I know that the idea that there once was, up until about 37 years ago, a halcyon time of general social solidarity overlooks the deep racism that has divided our country throughout its history, and that given the reality of that deep and hostile racial divide, gauzy nostalgic impressions of togetherness are delusional. Very true. Even so, whites could see the wealth of rich whites as attainable, and blacks could see the wealth of wealthier blacks as attainable. But for the last 15 years or so, even that has fallen apart.

A relatively equal society – where the ratio of top quintile to bottom quintile is less than 5 (as it is in places like Japan and Scandinavia) -- can sustain a shared understanding among its members. But if, as in the U.S., that ratio is 8 or 9, there’s a disconnect. We lose the shared understanding of the legitimacy of things. The wealthy are beyond attainability, beyond any credible story of deservingness. We lose the sense that we’re in this together. The wealthy become “them.” And "they" don’t care about "us" -- so we don’t care about them. Anomie and division set in; anger and alienation become the social mood.

Sensing the resentment of most of society, the wealthy, in turn, retreat behind gated communities, which further increases the disconnect. We begin to believe the game is rigged; we don’t have a chance. When we believe that, we become more likely to behave in ways that make that a self-fulfilling prophecy.

Rich and poor alike feel the division, the disconnect. The result is that phenomenon I mentioned: everything that’s tough about modern life is exacerbated. Higher levels of depression, higher levels of consuming things that aren’t good for us: from drugs to alcohol to junk food to mindless TV shows to mindless consumer products.

When you compare nation to nation, there’s no correlation between wealth and life expectancy or mortality. No correlation. Rich countries have about the same life expectancies and mortality rates as relatively poor countries, until you get into the really poor end of the spectrum. As long as a nation has per-person income above about $9,000 a year, further increases do nothing to increase life expectancy. That’s the nation-to-nation comparison.

But when we do a zip-code-to-zip-code comparison, we get a different picture. The poorer zip codes have higher mortality than the richer zip codes. If you took several of the poorest zip codes, created a new island in the Pacific, put them all there, maintained their per-person incomes as they were, made a new island nation of them, they’d have decreased mortality. They’d be fine. But because they live near the wealthier areas, they perceive that difference. They see all around them the inescapable fact that they live in a society that is set up to work for others, but not for them. They are reminded daily that they are not in a society of mutual care. And that wears them down much more than relative material deprivation.

* * *
This is part 2 of 3 of "Income Inequality"
See also
Part 1: Modern Life Made Tougher

2017-11-23

Modern Life Made Tougher

Income Inequality, part 1
"The central task of the religious community is to unveil the bonds that bind each to all. There is a connectedness, a relationship discovered amid the particulars of our own lives and the lives of others. Once felt, it inspires us to act for justice. It is the church that assures us that we are not struggling for justice on our own, but as members of a larger community. The religious community is essential, for alone our vision is too narrow to see all that must be seen, and our strength too limited to do all that must be done. Together, our vision widens and our strength is renewed." (Mark Morrison-Reed, Singing the Living Tradition #580)
"To unveil the bonds that bind each to all." That's the “central task of the religious community,” says Rev. Morrison-Reed. Unveil the bonds. The bonds are already there, but are veiled, hidden. We don’t see them. But in religious community – that is, community explicitly oriented toward ultimate concern – together we remove the veil for one another. We learn from and with each other to perceive the bonds. We learn to pay attention – to take in the moment just as it is.

Our choir sang words that translate as “listen to the wind blowing through the night, breathing peace to all.” Listen. Attention itself cultivates peace. Becoming mindful of those ever-present yet often undetected bonds “inspires us to act for justice.” Because we’re together, and see how thoroughly we are conjoined, we know we are not alone in struggling for justice. “Alone,” as Morrison-Reed says, “our vision is too narrow to see all that must be seen, and our strength too limited to do all that must be done.” But once we see those bonds unveiled, and live out of the awareness of them, the vision widens and the strength multiplies.

So, what shall we do with that wider vision and multiplied strength? I’m here particularly to talk today about income inequality, and what we as a people of faith, energized by a deep awareness of our bondedness, can do about that.

On November 2, 1980, my daughter was born. She was born into a world that certainly had poverty, but did not see the sort of wealth disparities we have now. Two days after she was born, Ronald Reagan won the election for president. And over the course of her life so far – she turned 37 this month – there’s been a massive transfer of wealth to the wealthy.

In 1979, the poorer half earned 20% of the nation’s pre-tax income. By 2014, just 13%. If the US had the same income distribution it had in 1979, each family in the bottom 80% of the income distribution would have $11,000 more per year in income.

From 1947 to 1979, we all grew. In those 32 years:
  • For the bottom 20%, income rose 116%.
  • For the second quintile, income rose 100%.
  • For the middle quintile, income rose 111%.
  • For the fourth quintile, income rose 114%.
  • For the top 20%, income rose 99%.
The gain of the top 20% was about the same as – though actually slightly less than – the other quintiles.

But from 1979 to 2007, it was a completely different story. In those 28 years:
  • For the bottom 20%, income rose 15%.
  • For the second quintile, income rose 22%.
  • For the middle quintile, income rose 23%.
  • For the fourth quintile, income rose 33%.
  • For the top 20%, income rose 95%.
In 1980, the richest one percent of people got eight percent of the income. Eight times the average income would seem to be plenty. Who could want more than that? Surely that’s more than enough. But in 2011, the richest one percent brought home 20 percent of all income.
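The "eight times the average" figure follows directly from the share: when 1 percent of the people receive S percent of the income, that group's average income is S times the overall average (income share divided by population share). A quick check of that arithmetic, using the shares quoted above:

```python
# Multiple of the overall average income earned by a group:
# (group's share of total income) / (group's share of the population).

def multiple_of_average(income_share_pct, population_share_pct):
    return income_share_pct / population_share_pct

print(multiple_of_average(8, 1))   # 1980: top 1% averaged 8x the overall average
print(multiple_of_average(20, 1))  # 2011: top 1% averaged 20x the overall average
```

The same formula shows why a 20:20 ratio of 8 or 9 is so stark: each quintile is 20 percent of the population, so the ratio compares two groups of identical size.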

"During the 1950s and 60s, CEOs of major American companies took home about 25 to 30 times the wages of the typical worker. In 1980, the big-company CEO took home roughly 40 times. By 1990 it was 100 times. By 2007, CEO pay packages had ballooned to about 350 times what the typical worker earned.” The ratio is down a little since then – but in 2016 CEOs were still making 271 times what the typical worker made. Don’t let this lull you with a sense of improvement: it’s fluctuating a bit within the range of the egregiously horrible.

Modern life is tough. Living the way we do is hard on people: anxiety, depression, unsure friendship, consumerism, lack of community. Not all of that would go away if suddenly tomorrow all income and wealth distribution were at 1979 proportions again. Yet everything that’s tough about modern life is made worse by such huge disparities.

* * *
This is part 1 of 3 of "Income Inequality"
See also
Part 2: Feeling We're In This Together

2017-11-20

Truths about Thanksgiving

That Festival in 1621
  • In 1620, the Mayflower landed at Plymouth Rock bringing 102 Puritans. The new arrivals did not call themselves "Pilgrims" -- they called themselves “saints” because they thought of themselves as God’s elect. Only in the 20th century did “Pilgrim” come to refer to the Mayflower Puritans.
  • These Puritans settled in an area that was once Patuxet, a Wampanoag village. It had been abandoned four years prior because of a plague that had earlier been brought by European traders. Before 1616, the Wampanoag numbered 50,000 to 100,000, occupying 69 villages scattered throughout the region that is now southeastern Massachusetts and eastern Rhode Island. The plague killed up to two-thirds of them. Many also had been captured and sold as slaves.
  • The English saw almost nothing of the Wampanoag that first winter -- only a rare glimpse of a fleeting shadow of the land's inhabitants -- until March 1621, when Samoset, a Monhegan from Maine, came to the village. The next day, he returned with Tisquantum ("Squanto"). Tisquantum had been abducted as a boy in 1614 from the very village the Mayflower Puritans found abandoned. Tisquantum was sold as a slave in Spain, then escaped to England. After several years, he was able to get back to Turtle Island (what we call North America). When he returned to his village, he discovered there were no other surviving Patuxet -- the rest had either been killed in battle or died of disease brought from Europe. He’d learned English so he could talk to the settlers and serve as a translator. He showed them how to plant corn, fish, and gather berries and nuts. The crop seeds the colonists had brought with them failed, so without Tisquantum’s help, there probably wouldn’t have been a harvest to celebrate that fall.
  • The Puritan colonists did not wear black clothing, large hats with buckles, or buckled shoes. The 19th-century artists who painted them that way did so because they associated black clothing and buckles with being old-fashioned. In fact, their attire was bright and cheerful.
  • The harvest celebration in 1621 was not a solemn religious observance. It was a three-day festival that included drinking, gambling, athletic games, and even target shooting with English muskets -- a not-so-subtle way to warn the indigenous peoples that these colonists could shoot them.
  • The Wampanoag chief, Massasoit, and 90 warriors made their way to the settlement in response to the sounds of the gunfire. They thought the colonists were under attack, so they came prepared for battle to help defend the colonists.
  • The Wampanoag were probably not invited, and the settlers were probably rather nervous having them around. An 11-foot high wall had been erected around the entire Plymouth settlement for the very purpose of keeping the indigenous peoples out. Moreover, mere days before the feast, a company of settlers led by Miles Standish had actively sought the head of a local Indian leader.
  • The Wampanoag were not wearing woven blankets on their shoulders and large, feathered headdresses. They wore breechcloth with leggings -- and perhaps one or two feathers in their hair in the back.
  • The main course was venison, rather than turkey. The Wampanoag stayed for three days, during the course of which they contributed five deer. There are references to "fowl" -- which would have included ducks, geese, and various birds -- so it's possible that some turkey could have been consumed, albeit without cranberry sauce. Other foods that may have been on the menu: cod, bass, clams, oysters, Indian corn, native berries and plums, all washed down with water, beer made from corn, and another drink the settlers called “strong water.” Pumpkin pie? Nope. In those days, the settlers boiled their pumpkin and ate it plain. They didn’t have flour mills to make flour for a crust, nor cane sugar, nor the "pumpkin pie spices" (cinnamon, nutmeg, clove).
  • The 1621 harvest celebration was not in November, which would have been much too late. It was some time between late September and the middle of October.
  • Everything we know about that 1621 feast came from a description in one letter by colonist Edward Winslow. That letter was lost for 200 years. After it was rediscovered, a Boston publisher, Alexander Young, in 1841 printed up the brief account of the feast. Young dubbed the episode “The First Thanksgiving.” White Americans, craving a romanticized story of their past, latched on to it.
  • The colonists celebrating in 1621 did not call their event "Thanksgiving." For them, “thanksgiving” was a day of fasting -- and this was a feast, the opposite of their thanksgiving observance.
  • Calling any event involving white settlers in North America "the first Thanksgiving" overlooks the fact that, for thousands of years before Europeans arrived, Indigenous people throughout Turtle Island (North America) celebrated seasons of Thanksgiving. 'Thanksgiving' is a very ancient concept to the first nations of this continent.
  • The 1621 celebration was a one-off that was not repeated -- and, in any case, wasn't thought of as a "Thanksgiving." The first European-recognized Thanksgiving came in 1637, when Governor Winthrop of the Massachusetts Bay Colony proclaimed a Day of Thanksgiving.
That Proclamation in 1637
  • It wasn’t until 1863 that the US National Holiday of Thanksgiving was declared by Abraham Lincoln, who set Thanksgiving Day as the last Thursday of November. In 1941, Franklin Roosevelt changed Thanksgiving Day from the last to the fourth Thursday: November 1941 had five Thursdays, and by moving the holiday up a week he gave merchants a longer Christmas shopping season that year -- and all subsequent years with five Thursdays in November, which occur, on average, twice every seven years.
  • There is no historical link between today's holiday and the 1621 celebration. The linkage is purely mythical, created in 1841 by publisher Alexander Young. The historical roots of our current holiday begin, instead, in 1637.
  • In 1637, Governor Winthrop proclaimed a Day of Thanksgiving. The proclamation focused on giving thanks for the return of the colony's men who had traveled to what is now Mystic, Connecticut, where they had gone to participate in the massacre of over 700 Pequot men, women and children. Foremost in Winthrop’s proclamation was thanks for their “great victory.”
  • The Pequot had gathered for their annual green corn dance when Dutch and English mercenaries surrounded the camp and proceeded to shoot, stab, butcher and burn alive all 700 people. William Bradford wrote:
    “Those that scraped the fire were slaine with the sword; some hewed to peeces, others rune throw with their rapiers, so as they were quickly dispatchte, and very few escapted. It was conceived they thus destroyed about 400 at this time. It was a fearful sight to see them thus frying in the fyer, and the streams of blood quenching the same, and horrible was the stincke and sente there of, but the victory seemed a sweete sacrifice, and they gave the prayers thereof to God, who had wrought so wonderfully for them, thus to inclose their enemise in their hands, and give them so speedy a victory over so proud and insulting an enimie.”
  • The roots of the American Thanksgiving holiday are a celebration of the massacre of hundreds of Native people. It grew into a general celebration of genocide. For example, a Proclamation of Thanksgiving in 1676 thanks God that the "heathen natives" had been almost entirely wiped out in Massachusetts and nearby.
  • A century later, the Thanksgiving Proclamations weren't about genocide of the indigenous peoples -- that was no longer a concern. The proclamations did, however, continue to be connected with violence. The Continental Congress, in the midst of the Revolutionary War, issued Thanksgiving Proclamations each year from 1777 to 1784. The 1777 proclamation, for example, declared it an "indispensable duty" of all "to acknowledge with gratitude their obligation to" God and to "implore such farther blessings as...to...smile upon us in the prosecution of a just and necessary war." Thus was the way paved for Lincoln, in the midst of the Civil War, to make Thanksgiving a US National Holiday. Lincoln's 1863 Thanksgiving Proclamation concludes by recommending that his fellow citizens "implore the interposition of the Almighty Hand to heal the wounds of the nation and to restore it as soon as may be consistent with the Divine purposes to the full enjoyment of peace, harmony, tranquility and Union" -- which seems an implicit request for God's favor on the Union Army in battle.
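As an aside, the calendar arithmetic behind Roosevelt's 1941 change checks out: November has 30 days (four full weeks plus two), so it contains five Thursdays exactly when November 1 falls on a Wednesday or a Thursday -- 2 of every 7 years on average. A quick verification over the Gregorian calendar's full 400-year cycle:

```python
import calendar

# November has a fifth Thursday exactly when Nov 1 is a Wednesday or Thursday.
five_thursday_years = sum(
    1 for year in range(2000, 2400)
    if calendar.weekday(year, 11, 1) in (calendar.WEDNESDAY, calendar.THURSDAY)
)
print(five_thursday_years)  # close to 400 * 2/7, i.e. about 114 years per 400
```

The count is not exactly 400 × 2/7 because the Gregorian leap-year pattern makes weekday frequencies slightly uneven, but "twice every seven years" is right as an average.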
The Day of Mourning

In 1970, the Commonwealth of Massachusetts was arranging celebrations of the 350th anniversary of the Plymouth Rock landing. Wampanoag Wamsutta (Frank) James was invited to speak -- then disinvited after the event organizers discovered his speech was one of outrage for atrocities and broken promises. So instead of participating in the official proceedings, Native people gathered at Cole’s Hill overlooking Plymouth Rock. Every year since, Native Americans have been gathering there from around the country at noon on the fourth Thursday of November to observe a National Day of Mourning.

Do Away with Thanksgiving?

There’s nothing wrong with gathering with loved ones to give thanks for our blessings and sharing a meal. After all, Native people have done so for thousands of years. But when we do, let us acknowledge the true origin of this holiday.

Along with remembering all the good in your lives, all the blessings you enjoy, remember also the pain, loss, and agony of the Indigenous people who suffered at the hands of those Puritans now called “Pilgrims.” And when you list your gratitudes, include thanks that you have the capacity to face the truths of the past, to learn from them to love others better, and to love the rich diversity of humanity.

2017-11-17

The Truth Behind the Fad

The Mindfulness Fad, part 3

So how do you get genuine mindfulness? Some things to keep in mind – to be mindful of, in the old sense:

(1) There is no true multi-tasking. Your brain cannot actually do more than one task at a time. What we call multi-tasking is really just switching back and forth among multiple tasks. You’re still only doing one thing at a time, but you’re only doing it for a few seconds before switching to something else, and then switching back.

(2) This so-called multi-tasking lowers productivity. “Students and workers who constantly and rapidly switch between tasks have less ability to filter out irrelevant information, and they make more mistakes.” (Time)

(3) The more we multi-task – that is, switch rapidly among tasks – the worse at it we become. Unlike just about everything else in life, multi-tasking is one thing that we get worse at the more we do it. In other words, the people who spend more of their time unitasking – focusing on one thing at a time and really getting into the zone in that activity – are better able to juggle multiple balls in the air when an occasion arises that they have to. The ability to see the multiple things as features of one thing – the present moment – helps filter out irrelevant information and keep the focus on what’s most needful about each thing.

I mentioned the two aspects of mindfulness: (a) bringing attention to immediate experience – particularly, noting mental events as they happen; and (b) being open, curious, and accepting of whatever it is that you’re noticing. How do you do that? You can just decide to do that. Pay attention to bodily sensations, what thoughts are arising in your mind, investigate your immediate thoughts and feelings with nonjudgmental curiosity. Simply deciding to do that will typically last maybe 10 seconds. Maybe even a couple minutes. Then your usual habitual way of being kicks in.

If you want to change your habits – become habitually more attentive to and openly curious about immediate experience, that’s going to take some work. Sorry about that. No easy walk to freedom.

You could start with a class. Google “mindfulness classes near me.” Prolonged reinforcement of the core concepts in a classroom setting will help shift your brain’s neural habits. Of course, you'll need to keep practicing after the class ends. One of the things the class will emphasize is a daily meditation practice – so if you just aren’t able to take 30 minutes out of your hectic day, then you probably aren’t going to get much more mindfulness than you already have.

Another option is to skip the class and get a book. I’ve read a lot of books about mindfulness and meditation and Zen. The very first one I read some 16 years ago is still the best first one – and if you only ever read one, this would be the one: Mindfulness in Plain English, by Henepola Gunaratana. (Available in its entirety as a free PDF HERE). If you’re ready to change your life, it’ll tell you how to do the practice that, if you stick with it, day after day, will be transformative.

These are the truths behind the fad: (1) There's no easy path to transformation and liberation, and (2) There is a path that'll get you there if you stick with it.

Strengthening the mindfulness muscle is kinda like strengthening any muscle – you make it stronger by exercising it. Kinda like, but also kinda unlike. With muscles, there’s a fairly predictable timeline by which exercise increases strength. If you have a normal and healthy physiology, and you adopt a regimen of exercise, and stick to it, then you will get stronger. There’s a rough curve by which, with some wobble in the graph, you will progress toward the limit to which that regimen can take you. Mindfulness strengthening doesn’t go like that. It’s not a reliable product of putting in the time doing the exercise. The spirit has its own schedule. Committed serious spiritual practitioners can go for years when their practice just seems void and useless. Then they can hit a patch where they actually seem to be regressing. They’re acting as cranky, unkind, disconnected -- as withdrawn, on the one hand, or as controlling, on the other – as they ever had before they started any spiritual practice. There is no smooth curve of progress.

But there do, overall in the long run, tend to be certain fruits of the practice. A related difference between the physical and spiritual is this: With physical exercise, you become different by becoming different. With mindfulness exercise, you become different by becoming exactly who you are. Slowly, one finds that the overlay of judgments about who you think you should be drop away, and your true self shines forth a bit more.

It helps to have a group to practice with: a weekly group alongside the daily practice at home. Such groups aren’t hard to find. There’s one available here that I lead every Saturday at 10:00. Behind the fad, the third-eye chakra tea, the essential oils diffusers, the voice-activated guided meditation device, the acupressure meditation mats, the apps, the studios, the bells and whistles, there’s . . . you. And the reality you’re in and not separate from. And an authentic practice for being who you are and loving what is, every moment.

Whether you go for the version with the monks and robes, temples and statues, and sangha community or the version with teachers in professorial casual at retreat centers or studios and no ongoing community to speak of, there’s something real there – something worth our . . . attention.

* * *
This is part 3 of 3 of "The Mindfulness Fad"
See also
Part 1: Origins of the Mindfulness Fad
Part 2: Mindfulness Goes Secular

2017-11-15

Mindfulness Goes Secular

The Mindfulness Fad, part 2

Jon Kabat-Zinn
Westerners are weird about anything we identify as a religion. If something is a religious practice, we’re skeptical, suspicious. Some of us already have a religion – and we typically think we can have only one, so we aren’t interested in another. And those of us who don’t have a religion often don’t want one – so, again, we’ll stay away from a practice if we think it’s religious. But if it doesn’t feel like a religion, we will be game to try anything: Marxism, Freudian psychoanalysis, Pilates.

If you have a cool therapy, technique, or analysis that you want to offer to people, you may be able to get access to schools, prisons, the military, meetings of civic groups, and workplace programs to talk about your helpful practice. But if you're the messiah of a new religion, or a preacher, teacher, or guru of an established one, those institutions are much more likely to be closed to you. We're suspicious of religion. We want our public institutions generally to be neutral about religion, as we also want them to be neutral about partisan politics. We understand that religion and politics divide us. But a program that promises to be helpful to people of any religion, or none, and people of any politics, or none, has a chance to be welcomed in our public institutions.

So Jon Kabat-Zinn, a molecular biologist and a long-time student and practitioner of Zen, knew that if Americans were going to accept the techniques he’d been learning in Zen, he had to find a way to present them so they didn’t feel religious. The program Kabat-Zinn devised and launched in 1979 is essentially a series of Zen trainings, but instead of calling it Zen training, he called it Mindfulness-Based Stress Reduction – MBSR. As MBSR, it doesn’t have any of the trappings associated with religious traditions: no priestly types in robes, a minimum of chanting and bowing, funding itself more from fee-for-class and less from donations, not much sense of continuing relationship once the term of the class is over.

And it really took off. You pay four or five hundred dollars for a class that meets once a week for eight weeks, and you get the teaching and learn the practices without being subjected to authorities in funny clothes or membership in an ongoing community.

Google teaches mindfulness to its employees – and so do more stuffy companies like McKinsey and BlackRock. There are mindfulness apps for your smartphone – more than two dozen, some offering $400 lifetime subscriptions. There are mats and cushions and clothing lines and incense and bells and “mindful lotus tea” (6 dollars for 20 bags). Meditation-related businesses in 2015 generated $984 million in revenue. As journalist David Gelles put it: “For an enterprising contemplative, it’s never been easier to make a buck.”

“There are Mindful Meats, Mindful Mints and the Mindful Supply Company, which makes T-shirts.” You can paint your bedroom in “Mindful Gray.” A dairy-free mayonnaise-substitute called “Mindful Mayo” is $4.50 a jar.

There’s a book – actually two books – titled One-Minute Mindfulness: one by Donald Altman, 2011, and one by Simon Parke, 2015. But mindfulness needs to be more than a brief reprieve between checking Facebook and the next episode of Stranger Things. A newer 2017 book by S.J. Scott seems to acknowledge that mindfulness will take a little more time. Its title is: Ten-Minute Mindfulness.

Much of this mindfulness rage is faddish, and, I will say, not helpful. In fairness, I think there's also a lot of it that is helpful. And, full disclosure: I own and use daily: a zabuton (mat), a zafu (round cushion), inkin bell (bell on a stick), bell gong ("singing bowl"), wooden clappers, a Buddha statue, a meditation timer app on my tablet, and an ample supply of incense. The MNDFL studios and some of the apps strike me as rather pricey and often unnecessary, but for folks who can afford it, who don't have an established practice, who are looking around for what might work for them while fitting within their schedule, I'm glad they've got the options.

Yes, a lot of what peddles itself as a mindful product has nothing to do with actually doing anything. Moreover, it’s important to be aware that many of the supposed advantages of mindfulness can be gotten in other ways. For instance, it's true that 30 minutes of meditating does reduce stress and lower blood pressure and promote a more positive feeling about your life. You can also get those results from 30 minutes of stretching or exercising or, for that matter, watching an I Love Lucy re-run.

Many people find that the practice improves work performance, but perhaps you remember the Hawthorne effect? They kept making the lighting brighter, and productivity kept going up. Then they started dimming the lights more and more, and productivity still went up. It turns out that any change that you think will make you more focused and productive, probably will. It’s a version of the placebo effect. Maybe mindfulness training will get you a little further than the placebo effect – and so would getting more sleep.

There’s also worry that companies pushing mindfulness on their employees are just trying to get more out of them without otherwise improving their pay and working conditions. Some writers have worried that “McMindfulness” placates people into acceptance of political and social injustices (e.g., Virginia Heffernan, Kristen Ghodsee, Ruth Whippman). I don’t have this worry. It’s my own experience, confirmed in numerous accounts from other people, that being rooted in the here and now, and feeling the joy in each moment, also awakens compassion, and makes us more, not less, energized to take action for justice.

Your capacity for joy – your ability to feel and be present to and sustain joyousness – is equal to your capacity for sadness and pain – your ability to feel and be present to grief – for they are the same capacity. Mindfulness increases that carrying capacity for both joy and sorrow at the same time, for it is in the numbed-up mindlessness of pursuit of continual distraction that we push both of them away. Genuine mindfulness will then contribute to, rather than detract from, social activism for a more just and peaceful world, and workplace activism for fair wages and working conditions.

Next: How to get genuine mindfulness

* * *
This is part 2 of 3 of "The Mindfulness Fad"
See also
Part 1: Origins of the Mindfulness Fad
Part 3: The Truth Behind the Fad

2017-11-13

Origins of the Mindfulness Fad

The Mindfulness Fad, part 1

Mindfulness is such a fad. It has really been all the rage. A Congressman wrote a book about it. A Congressman! Rep. Tim Ryan (D-OH) happened upon a Jon Kabat-Zinn book that had a section on Mindfulness in Politics, and he was so inspired he went to a 5-day mindfulness meditation retreat with Kabat-Zinn. Then he wrote a book called A Mindful Nation: How a Simple Practice Can Help Us Reduce Stress, Improve Performance, and Recapture the American Spirit. It has chapters devoted to:
  • Mindfulness in our schools: how it can increase our children’s attention and kindness.
  • Mindfulness in our hospitals and doctors’ offices: how it can improve our health and our healthcare system.
  • Mindfulness in our military, police, and firefighters: how it can improve performance and build resiliency for the military and first responders – and how, later on, mindfulness is the path for coming to terms with PTSD.
  • Mindfulness in the workplace: how it can help us rediscover our values and reshape our economy.
It’s been five years since that book came out, and Tim Ryan is still in Congress, now in his 8th term.

Then a couple of years ago, Time magazine ran a cover story on "The Mindful Revolution."

Things called meditation studios have started opening up. There’s a company called MNDFL -- which is “mindful” without the vowels because, I don’t know, vowels are so not in the present moment. With studios in Brooklyn’s Williamsburg and in Manhattan’s Greenwich Village and Upper East Side, it bills itself as New York’s premier meditation studio. It got a nice article in Vogue titled “Introducing Manhattan’s Must-Visit Meditation Studio.” Classes run 30, 45, or 60 minutes and start at $10. When classes are not in session, the studio is open for self-guided practice.

What’s all the hype about? Mindfulness has two aspects:
  • bringing attention to immediate experience – particularly, noting mental events as they happen.
  • being open, curious, and accepting of whatever it is that you’re noticing.
Do you need to take classes to do these two things? Maybe. You’d be entering into a social phenomenon, whatever else you’d be doing. So it's worth asking, how did “mindfulness” get where it is today?

Mindful used to mean “bearing in mind” – as in remembering. Someone who was mindful of their duties was remembering those duties – not forgetting them. If you were mindful of a slippery surface, you were remembering and keeping in mind that the surface was slippery.

Then in 1881, Thomas William Rhys Davids produced some translations of Buddhist scriptures, and he translated the Pali word sati as “mindfulness.” Perhaps a better translation of sati would be recognizing reality. Alternative translations include remembering the present, paying attention, and being present. According to the Buddha, sati is one of the seven factors of enlightenment. (The other six, by the way, are investigation, determination, joy, tranquility, concentration, and equanimity.)

But it was Jon Kabat-Zinn, a century later, who took Rhys Davids’s word “mindfulness” and made it into a mass phenomenon. He did it by stripping away anything that smelled religious. By “smelled religious,” I don’t mean a certain kind of belief. After all, believing, as Marxists do, that conflicts between employee and employer are the central driving force of iron laws of history that will necessarily eventually lead to government control of the means of production is roughly the same type of belief as that the second coming of Christ is imminent. Believing, as Freudians do, in Oedipus complexes is the same category of belief as believing in original sin. Believing, as Jeremy Bentham did, that one should always act to produce the greatest happiness for the greatest number is the same kind of belief as believing one should adhere to the Ten Commandments. It’s not the kind of belief propounded that accounts for St. Paul, Mohammad, and Gotama being founders of religions while Marx, Freud, and Bentham are not. Rather, it’s the other features typically found in what we recognize as religion:
  • priests or monks with distinctive robes;
  • special meeting places, usually architecturally distinctive, with a distinctive name like temple, church, mosque, or vihara, inside which are material symbols – altars and crucifixes or Torah scrolls or Buddha statues – or chalices;
  • unison practices like hymn singing or unison reading or chanting from sutras;
  • congregational community – which other forms of spiritual development, like yoga classes or meeting with a spiritual director, do not provide;
  • a distinctive economics: there’s no admission price for worship, or fee for classes, but rather a donation-based economics;
  • generally presumed exclusivity. While it is conceptually possible for one person to be both Christian and Buddhist, say – or both Unitarian Universalist and any other major world religion -- the general presumption is that, when it comes to religion, you choose just one, and it becomes a part of your identity. By contrast, the practices and teachings of cognitive behavioral therapy, or yoga, or marathon running, or wine connoisseurship, or BeyoncĂ© fandom don’t imply not being anything else.
Sati is a practice and teaching from Buddhism -- a religion with all the above trappings of religion. Kabat-Zinn's project was to promote the practice and teachings, but dissociate them from the accompaniments of a religion.

Next: Why was this necessary?

* * *
This is part 1 of 3 of "The Mindfulness Fad"
See also
Part 2: Mindfulness Goes Secular
Part 3: The Truth Behind the Fad

2017-11-09

Persecutions Sometimes End

Witches, part 2

By the fall of 1692, people in Salem were beginning to come to a semblance of their senses. Many were questioning the sheer number of accusations – finding it improbable they could have that many witches. They began to question the trustworthiness of those who claimed to have been afflicted by the witches. Maybe the accusers were the ones who were lying? Suddenly – as suddenly as it had started – the witch craze was over. The numerous people still in custody were released.

We can easily surmise that behind the persecutions were resentments and grudges. There was also a fertile context of theological rigidity. As Stacy Schiff writes in The Witches: Salem 1692,
“Salem is in part the story of what happens when a set of unanswerable questions meets a set of unquestioned answers.”
As my colleague Rev. Erica Baron, herself a pagan trained in and active with the Temple of Witchcraft, put it:
“This is a story of a community willing to believe the worst about each other on some of the flimsiest evidence imaginable.”
1692 Salem was extreme, but every community harbors resentments, quarrels, grudges, jealousies.
Those tensions sometimes rend the fabric of community, and healing is in order.

Fourteen years after the witch persecution, in 1706, Ann Putnam, who had claimed to be afflicted by witchcraft and had accused over 60 people, apologized. Many of the jurors who had handed down guilty verdicts also apologized, signing a letter to the community and to descendants of those convicted and executed. They wrote:
“We confess that we ourselves were not capable to understand, nor able to withstand the mysterious delusions of the powers of darkness and prince of the air, but were for want of knowledge in ourselves and better information from others, prevailed with to take up with such evidence against the accused as on further consideration and better information, we justly fear was insufficient for the touching the lives of any...whereby we fear we have been instrumental with others, though ignorantly and unwittingly, to bring upon ourselves and this people of the Lord, the guilt of innocent blood....We do, therefore, hereby signify to all in general (and to the surviving sufferers in especial) our deep sense of and sorrow for our errors in acting on such evidence to the condemning of any person.”
1692 Salem was extreme, but women have long been the go-to group to blame for whatever is frustrating for the powerful or, for that matter, the relatively powerless. No one has been executed in this country for being a witch for 300 years, but women continue to feel the burn of judgments that they are "witches" if they speak out against abuses they endure.

In the Salem of 1692, the sheer number of the accusations triggered a sudden shift, an opening of eyes -- a realization that this many women can't all be witches. Today we may see a dim echo of that parallel. The sheer number of accusations may – with any luck – trigger a similar eye-opening shift. This time the accusations are not against, but by, mostly women -- speaking up about sexual harassment and assault. But the growing realization is, again: this many women can't all be "witches."

In 1692, incredible accusations against mostly women were deemed credible. These days, highly credible accusations by mostly women have been disregarded and dismissed. Maybe we are prepared as a society now to see that women willing to speak up about unwanted advances are not some version of witches. As humans, we all want to be attractive and friendly. Women bear the additional burden of not being attractive or friendly in what someone might perceive as "the wrong way" -- whatever that is -- yet they still face harassment and assault, no matter how careful they've been, because it turns out harassment doesn't really have to do with attractiveness, or with insufficiently prim dress or behavior. Mostly women, and a few men, face a double persecution: first subjected to harassment or assault, and then subjected to a grueling and demeaning process if they speak up. Even on the rare occasions when they win a significant monetary settlement, it comes with enforced silencing.

Ending the second persecution will go a long way to also ending the first. When victims can report harassment and assault and be taken seriously and believed, the impunity which allows that mistreatment to go on and on will be over.

In fall of 1692, in Salem, a persecution of mostly women very suddenly stopped. In fall of 2017, across the US, will another persecution of mostly women similarly suddenly stop? May it be so. May it be so.

* * *
This is part 2 of 2 of "Witches"
See also
Part 1: Witches!
I am indebted to my colleague Rev. Erica Baron, upon a sermon of whose I have relied.

2017-11-04

Witches!

Witches, part 1

One of the connections that the dominant US culture makes with Halloween is witches. So this Halloween I want to reflect with you about witches.

Witch is from the Old English wicce, meaning "female magician, sorceress." As Christianity spread through England, it came to mean "a woman supposed to have dealings with the devil or evil spirits and to be able by their cooperation to perform supernatural acts."

There were men, supposedly, who practiced witchcraft, too -- wizards and sorcerers. In fact, wicce is the Old English feminine and wicca the masculine for such practitioners. But this "dealings with the devil" idea has long had a much stronger association with women. The Laws of Ælfred, established in about 890, for example, identified witchcraft as specifically a woman's craft, whose practitioners were not to be suffered to live among the West Saxons. Behind this, we see that women's wisdom, power, and authority were resented and suspect.

The Biblical verse Exodus 22:18 declares, in the King James, "thou shalt not suffer a witch to live." The word rendered as "witch" meant "female sorcerer." That the feminine was specified apparently indicates either that casting spells was much more common among women among the ancient Hebrews -- or that the patriarchal interests of the time were more threatened by women than by men engaging in sorcery.

With that as background, let us turn to Salem, Massachusetts in 1692. It all began in January of that year with accusations against three people:
  • Sarah Good, a beggar who was disliked for constantly, well, begging – and showing little gratitude for what she received while cursing those who declined;
  • Sarah Osborne, also peripheral to the community, who was a widow engaged in a protracted court battle over the settlement of her husband’s will; and
  • Tituba, the Indian slave of the town minister’s family.
The two Sarahs maintained they were innocent of any devil consorting, but Tituba gave a long and lurid confession with all manner of strange details about her pact with the devil and blasphemous rituals. That really got the whole community worked up, and the search was on for others who might have participated. Eventually, 19 people would be executed: 14 women and 5 men.

Stacy Schiff, in her book, The Witches: Salem 1692, writes:
The youngest of the witches was five, the eldest nearly eighty. A daughter accused her mother, who in turn accused her mother, who accused a neighbor and a minister. A wife and daughter denounced their husband and father. Husbands implicated wives; nephews their aunts; sons-in-law their mothers-in-law; siblings each other. Only fathers and sons weathered the crisis unscathed. A woman who traveled to Salem to clear her name wound up shackled before the afternoon was out. In Andover, the community most severely affected – one of every 15 people was accused. The town’s senior minister discovered he was related to no fewer than 20 witches. Ghosts escaped their graves to flit in and out of the courtroom, unnerving more than did the witches themselves. Through the episode surge several questions that touch the third rail of our fears: Who was conspiring against you? Might you be a witch and not know it? Can an innocent person be guilty? Could anyone, wondered a group of men late in the summer, think himself safe? How did the idealistic Bay Colony arrive – three generations after its founding – in such a dark place? Nearly as many theories have been advanced to explain the Salem witch trials as the Kennedy assassination. Our first true-crime story has been attributed to generational, sexual, economic, ecclesiastical, and class tensions; regional hostilities imported from England; food poisoning; a hothouse religion in a cold climate; teenage hysteria; fraud, taxes, conspiracy; political instability; trauma induced by Indian attacks; and to witchcraft itself, among the more reasonable theories. You can blame atmospheric conditions or simply the weather: Historically, witchcraft accusations tended to spike in late winter. Over the years, various parties have played the villain, some more convincingly than others. The Salem villagers searched too to explain what sent a constable with an arrest warrant to which door. 
The pattern was only slightly more obvious to them than it is to us, involving as it did subterranean fairy circles of credits and debits, whispered resentments, long-incubated grudges, and half-forgotten aversions. Even at the time, it was clear to some that Salem was the story of one thing behind which was a story about something else altogether. In 300 years we have not adequately penetrated nine months of Massachusetts history. Things disturb us in the night. Sometimes they are our consciences. Sometimes they are our secrets. Sometimes they are our fears.
* * *
This is part 1 of 2 of "Witches"
See also:
Part 2: Persecutions Sometimes End
I am indebted to my colleague Rev. Erica Baron, upon a sermon of whose I have relied.


2017-11-03

No More Disposability

Environmental Racism, part 2

There have been some victories in fighting back against environmental racism.
  • In 1989, Louisiana Energy Services (LES) sought to build a privately-owned uranium enrichment plant just outside Homer, LA, straddling a road connecting two African American communities, Forest Grove and Center Springs. Residents sued, and the Nuclear Regulatory Commission's Atomic Safety and Licensing Board found that racial bias did play a role in the site selection process.
  • Diamond, LA, a small African-American neighborhood, was sandwiched between two large Shell Oil plants. For years, residents lived with an inescapable acrid, metallic odor and a chemical fog that seeped into their houses. They experienced headaches, stinging eyes, allergies, asthma and other respiratory problems, skin disorders, and cancers. Periodic industrial explosions damaged their houses and killed some of their neighbors. Protests eventually led Shell to agree to relocate the residents.
Let me lay some historical context for this. The economics of slavery wasn’t just that the landowners got cheap labor. They got cheap, dangerous labor. Slaves could be and were subjected to dangerous levels of heat, and to mosquito-borne illnesses such as malaria. After the Civil War and a brief period of Reconstruction, we re-created our habits of enslavement under other names: the share-cropping system and the prison system, for example. We also continued our long-standing national habit of subjecting people of color to greater environmental risks. The segregation created under the Jim Crow era, and continued through red-lining practices, did not merely concentrate minorities together – it concentrated them together in the more dangerous areas – or in areas that we were then more likely to make dangerous by putting pollution there.

It’s been 30 years since the UCC report, Toxic Waste and Race in the United States. Where are we today?
  • More than half of all people who live close to hazardous waste are people of color.
  • Black children are twice as likely to suffer from lead poisoning as white children.
  • Childhood asthma, linked to exposure to pollution, has actually been declining since 2011 – after rates doubled in the 1980s and 90s. So that’s good news. Still, more than 14 percent of black children have asthma, compared with about 8 percent of white children. And black children are also much more likely than white children to suffer severe complications.
  • The response to Hurricane Maria’s devastation of Puerto Rico has been lackluster compared to the response to Harvey in Houston and Irma in Florida.
The pattern of treating people of color as mattering less continues.

I’ve noticed that sometimes among Unitarian Universalists there’s what feels like a split. Some of us are more oriented to Climate Change. Nothing else really matters if we don’t ensure a future for the planet. Some of us don’t get so worked up about CO2 because unarmed young black men are being shot by police. But at root, it’s the same issue.

If we create a world where we don’t trash people, we can’t trash the planet. Environmental justice is not about environmental equity -- not about redistributing environmental harms. It's about abolishing them for everyone.

It's about, "No more disposability." A world without disposable people, without disposable communities, without disposable species, will require a world without disposable plastics. "No throwaway planet" and "no throwaway people" are one concept, one idea, one value.

* * *
This is part 2 of 2 of "Environmental Racism"
See also
Part 1: Racism and the Environment

2017-11-02

Racism and the Environment

Environmental Racism, part 1

It was 30 years ago, in 1987, that the United Church of Christ conducted a study. (The UCC is generally regarded as the next most liberal historically-Protestant denomination after the Unitarian Universalists. I went to a UCC seminary, and the joke they tell on themselves is that UCC stands for “Unitarians Considering Christ.” Some years ago, the UUA and UCC teamed up to create the "Our Whole Lives" sexuality education curriculum – a joint project of our two denominations.)

The UCC’s work in the area of environmental racism has been way ahead of ours. Thirty years ago the UCC Commission for Racial Justice undertook an extensive study of the subject. Their report, Toxic Waste and Race in the United States, found that:
  • Communities with a commercial hazardous waste facility averaged 24% minority.
Even more striking,
  • communities with two or more [commercial hazardous waste] facilities -- or one of the nation's five largest landfills – averaged 38% minority.
Meanwhile
  • communities with no such facility averaged just 12% minority.
Socio-economic status – class -- also played a significant role, but race was still more significant.
Minority groups continue to be burdened with a disproportionate number of environmental hazards.

The report is available HERE.

Let's now jump to a much more recent situation: Flint, Michigan. The city has just under 100,000 people, 41% poor and 57% African-American.

In 2014, Michigan state authorities, to save money, switched the water supply of Flint from Lake Huron to the Flint River, known for its pollution. Almost immediately, boil advisories had to be issued because fecal coliform bacteria were flowing into the homes of Flint. Because the Flint River is polluted to begin with, water from that river is corrosive: Flint River water was found to be 19 times more corrosive than water from Lake Huron. Treatment with anti-corrosive agents would go a long way toward addressing that, and federal law requires such treatment. But the state Department of Environmental Quality violated that federal law and simply didn’t treat Flint’s water with anti-corrosive agents.

So this corrosive water, unmitigated in its corrosion, began flowing to Flint. It was coming in through aging pipes, and because it was so corrosive, it leached lead out of the pipes. Lead content in the drinking and bathing water in Flint shot so high it met the EPA’s definition of "toxic waste."

In fairness to the state of Michigan, as fair as we can be, the switch to the Flint River was always meant as a temporary measure for two years while a new pipeline from Lake Huron was being constructed. OK, good to note. But, still! It is not OK for the water in people’s homes to be toxic waste for two years – or even for one day. Is there any doubt that what happened to Flint would never have happened to a predominantly middle-class and white city?

Black lives matter. Black lives matter because all lives matter. Yet black lives are treated as mattering less than white lives. One of the ways we see black lives counting for less is environmental racism – that is, burdening minority groups with a disproportionate number of hazards, such as toxic waste facilities, garbage dumps, and sources of pollution.

Consider the case of Altgeld Gardens, a housing community in Chicago that was built on an abandoned landfill and is surrounded by 53 toxic facilities and 90% of Chicago’s landfills. Mercury, ammonia gas, lead, DDT, PCBs, PAHs, heavy metals, and xylene are among the known toxins and pollutants affecting the area, and residents suffer excessive rates of prostate, bladder, and lung cancer; children born with brain tumors; fetal brains developing outside the skull; asthma; ringworm; and other ailments. The population of Altgeld Gardens is 90% African-American, and 65% live below the poverty level.

Chester, PA, has five large waste facilities (including a trash incinerator, a medical waste incinerator, and a sewage treatment plant) with a total permitted capacity of 2 million tons of waste per year – compared to merely 1,400 tons allowed in all the rest of Delaware County, PA.
Chester residents suffer a cancer rate 2.5 times higher than anywhere else in Pennsylvania and a mortality rate 40% higher than the rest of Delaware county. Chester is 65% African American.

Our nation's floodplains have high populations of blacks and Hispanics, placing them at higher risk when a flood comes. And, in fact, when Hurricane Katrina hit New Orleans in 2005, institutionalized practices had segregated minorities into the most vulnerable low-lying areas. Not only that, but New Orleans’ evacuation plans relied heavily on the use of cars – and 100,000 city residents, disproportionately minority, had no car. Hundreds who could not evacuate died. And then, after the hurricane, the federal response, according to many black leaders, was slow and incomplete. At the time of Hurricane Katrina, New Orleans was 60.5% African American.

These are the results of racism manifesting in the way environmental issues are handled.

* * *
This is part 1 of 2 of "Environmental Racism"
See also
Part 2: No More Disposability

2017-10-17

Biases and Anxiety

What Other People Think? part 3

We Need Our Group. So Find One that Values Changing Your Mind.

I find it helpful to keep in mind: We are able to not care what some people think – THOSE people – by caring instead about what other people think – OUR people, as we have generalized them. I also find it helpful to remember the limitations and biases of my brain, and how seeing the world the way I do is largely a fluke of my social situation, mixed with some genetic predispositions.

Is there something we can do about this? A little bit, yes. Rather than Polonius, “to thine own self be true,” go back to Socrates, “know thyself.” Identify your own confirmation biases. You were made to be oriented toward tribal bonding, so try to be aware of how that’s at work in your opinion formation.

Since we are such a group-oriented species, see if you can hook up with a group that values evidence over any particular story for interpreting that evidence. This is tricky, because every group likes to think of itself as valuing the evidence, but almost all of them fail to notice how highly selective they are about the evidence they value.

But here’s an example. I understand that members of the Yale Political Union
“are admired if they can point to a time when a debate totally changed their mind on something. That means they take evidence seriously; that means they can enter into another’s mind-set. It means they treat debate as a learning exercise and not just as a means to victory.” (David Brooks, New York Times)
Can you identify times when evidence changed your mind? Can this congregation become the sort of tribe that admires members who talk about when evidence broke through their confirmation bias and changed their mind?

Nonanxious Presence

Edwin Friedman (1932-1996)
As I look further at this question – “Should we care what other people think? How much, in what ways, under what circumstances?” – there may be an underlying issue of anxiety. The question of how other people are judging us tends to come up when we’re feeling some anxiety about where we stand with the people around us. The question, “do they like me?” naturally raises some anxiety, and one strategy for dealing with that anxiety is to tell ourselves we don’t care. It's a tried-and-true strategy for coping with anxiety without ever acknowledging to ourselves that the anxiety arose. The drawback is that it disconnects us.

Here’s an alternative that derives from the work of rabbi and family therapist Edwin Friedman: nonanxious presence. Nonanxious presence is one of the slogans I try to live by – not always successfully, but I try.

If I tell myself, “I’m not going to care what other people think,” I conceal from myself the anxiety that prompted me to say that. If I tell myself, “be nonanxious,” I’m bringing awareness to the fact that, yes, a little bit of anxiety is there, and I’m now intentionally going to move past that.

If I say, “I don’t care what they think,” I’m disconnecting. If I say, “nonanxious presence,” I’m telling myself to stay connected, stay present.

Another popular strategy for dealing with do-they-like-me? anxiety is to go the opposite way – instead of “I don’t care what they think,” I start doing and saying things I think they will like.

Those are the two main strategies: blow ‘em off, or bend over backwards to appease. Nonanxious presence is neither of these. It’s an approach that, first, recognizes the anxiety. I notice anxiety first in myself, and then notice how anxiety is functioning in the system around me, the anxiety of the people who aren’t liking me. Whatever reason they may say they disapprove of me, underneath that, there’s anxiety. Something about me is challenging their assumptions, their status quo, their world picture, and that’s anxiety-producing for them. Bringing awareness to my anxiety, and the anxiety in the system, I make a decision not to be ruled by that anxiety. This is easier said than done, and it’s a skill that takes a while to develop.

Supposing I’m able to move into being nonanxious, the next part is presence. I’m going to bring my presence to the situation -- MY presence – who I am. This is not appeasing, or saying what you think they want to hear so they’ll like you. Friedman’s term is self-differentiation. Self-differentiation is the capacity to be present to, but not caught up in, surrounding emotional processes – not taking on the anxiety in the system. It also involves reaching clarity about your principles and vision, and a willingness to be exposed and vulnerable.

Nonanxious presence neither disregards what other people think, nor is it controlled by the natural human impulse for approval.

The Good, the Bad, and the Ugly

In sum, other people, some of them, form judgments of us -- and though they probably think about us less than we imagine, making judgments of others and relating to how others judge us is inherent in being the social species that we are. This involves some good, some bad, and some ugly.

The good is that we care, we want to connect, and bond, and have a shared story, and not be psychopaths.

The bad is that we’re oblivious to evidence that doesn’t support our story, we suffer confirmation bias, and we despise people with different opinions.

The ugly is anxiety – in ourselves and in the systems of which we’re a part. This anxiety can make us disconnect on the one hand, or lose ourselves and our integrity in seeking after approval on the other hand.

May we find ways to embrace and celebrate the good, compensate for the bad, and effectively manage the ugly.

* * *
This is part 3 of 3 of "What Other People Think"
See also
Part 1: We All Care What Other People Think of Us
Part 2: The Self and Its Worldview

2017-10-16

The Self and Its Worldview

What Other People Think, part 2

George Herbert Mead (1863-1931)
What is this “self” thing to which Polonius tells us to be true? The great philosopher and social psychologist George Herbert Mead, whose career spanned the first three decades of the 20th century, understood the self as a generalized other. I always thought that was very helpful. The self IS others – certain important others – generalized into a single personality. A person’s “generalized other” is her conception of the important other people in her life in general – an amalgamation of the people with whom she identifies. (She literally identifies with them in the sense that she gets her identity from them.) The ones with whom the child identifies during the formative years, she generalizes into a shared set of attitudes and assumptions which are her attitudes and assumptions, defining who she is.

As the context of your life shifts, and the people you’re around, and the people you identify with, shift, who you are shifts. It does. Maybe just a little. Maybe a little bit more.

The question is: how much, how fast? When it happens too much, too fast, that’s a problem. You need a core sense of self that’s pretty stable over time.

You might hear advice such as: “Don’t let other people tell you who you are. Don’t let their voice be more powerful than your own.” What this means is: “Don’t let what people are telling you now replace too much, too fast, of what you have previously learned from other people.”

On the other hand, not shifting at all is also a problem. You need a core sense of self that’s pretty stable – but not totally static. Life is for growing and learning, and growing and learning means taking in influences from some other people.

Indeed, our vaunted rationality is more about social bonding than about discerning truth. A few years ago, Hugo Mercier and Dan Sperber published an article, “Why Do Humans Reason?” If reason evolved to discern truth or make better decisions, then natural selection would have weeded out confirmation bias (which Wikipedia defines as: "the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities").
“If a fact comes in that doesn’t fit into your frame, you’ll either not notice it, or ignore it, or ridicule it, or be puzzled by it—or attack it if it’s threatening.” (George Lakoff, qtd in Yudhijit Bhattacharjee, "Why We Lie," National Geographic)
"I trust this site to tell the truth."
Confirmation bias is a huge distortion – an enormous obstacle to adopting the belief that best fits all the available evidence. Confirmation bias exists because forming beliefs that fit the evidence is not the purpose of human reasoning. Forming social bonds is the purpose of reasoning.
“Most of us are quite willing to think or say anything that will help us be liked by our group. We’re quite willing to disparage anyone when, as Marilynne Robinson once put it, ‘the reward is the pleasure of sharing an attitude one knows is socially approved.’” (David Brooks, New York Times)
Human thinking is fundamentally relational because for our ancestors going back millions of years survival had more to do with strong relationships and social bonds of support than it did with reaching conclusions that fit the evidence. Competition between groups placed a premium on group solidarity, and group solidarity was reinforced by sharing an ideology – a characteristic pattern of reasoning.

The genus Homo has been around for between 2.5 and 3 million years, and the scientific method for less than 400 years. Clearly, coming up with a story that really fits best with all the evidence that has been or could be gathered is a low priority for brains like ours. But having a story that we share with our tribe-mates is a high priority.

So powerful is our own worldview, so convinced of the power of its arguments do we become, that we can’t imagine how someone on the other side would answer those arguments. When I have talked to someone on the other side of some opinion that I have, and they’ve told me their answer, even when I understand it at the time – which is itself a rare occurrence – I don’t retain it. A few days later, I’m back to being unable to conceive how the arguments on my side could possibly be answered. Since it's important to me to understand other people, I find this forgetfulness (about details of how they defend viewpoints different from mine) perplexing and vexing.

In a study a couple years ago, participants were told “Donald Trump said vaccines cause autism.” (And Trump has repeatedly suggested there’s a link.) Participants who were Trump supporters showed a stronger belief that vaccines do cause autism. That’s not surprising: For them, Trump is a credible source, so they believe what he said. Then
“the participants were given a short explanation—citing a large-scale study—for why the vaccine-autism link was false, and they were asked to reevaluate their belief in it.”
The explanation was cogent enough so that participants “now accepted that the statements claiming the link were untrue.” They got it. They understood that, in fact, there is no link between vaccines and autism. But they didn’t retain it.
“Testing them again a week later showed that their belief in the misinformation had bounced back to nearly the same level.” (Bhattacharjee, National Geographic)
If it doesn’t fit our worldview, it doesn’t stick. Even in cases where information is accepted and agreed with in the moment, if it doesn’t fit our worldview, we forget it.

That’s how powerful our worldview is. And where did that worldview come from? It came from identifying with certain other people, and forming a generalized sense of how they thought.

This is bad news for truth, but it’s good news for integrity. Integrity with our worldview, with the sense of self that has that worldview, usually trumps new information that doesn't fit our worldview.

* * *
This is part 2 of 3 of "What Other People Think"
See also
Part 1: We All Care What Other People Think of Us
Part 3: Biases and Anxiety

2017-10-15

We All Care What Other People Think of Us

What Other People Think, part 1

At last year’s congregational auction here at CUUC, the high bidder for the privilege of giving me a sermon topic wanted me to address the issue of other people’s opinions of us: should we care what other people think? So today is the day I take up this topic.

It turns out there’s an answer to this question: Yes, we do and should care what other people think. Blessed be, amen, our closing hymn . . .

Oh, you know I’m not going to short-change a topic as rich as this one. (What would you think of me?)

The question is one of integrity, isn’t it? “To thine own self be true,” as Polonius said. And integrity – the wholeness of a life in which actions and principles are consistent – is important.

We’ve all met people who say, no, they are not affected by what other people think of them. Maybe you used to be one of those people – maybe you still are. I spent some years as a youth imagining that I was immune to the opinions of others – but this was mostly so I could say snide and contemptuous things about lemming-people who followed the crowd and wouldn’t think for themselves.

Social psychologist Mark Leary did a very interesting study. He began by surveying a large group of students on their self-esteem and how much it depended on what other people think. Some of the students, question after question, reported that they were completely unaffected by the opinions of others. Other students said they were strongly affected by what other people think of them.

Leary then selected two groups: students who most strongly said they were not affected by others, and students who most strongly said they were affected by others’ opinions of them. Here’s what each student did. They had to sit alone in a room and talk about themselves for five minutes, speaking into a microphone. They were told that there was someone out of sight in another room listening to them, and that listener was making a determination, based on what they heard, whether they wanted to interact with the speaker in the next part of the study. So the speaker is talking for five minutes into a microphone. And they can’t see the listener, but at the end of every minute as they’re talking, a number appears on a display in front of them – a number between 1 and 7. The students have been told that a 1 means "the listener really doesn’t want to interact with you. From what they’re hearing, they don’t like you." A 7 means that they very strongly do want to interact with you. Numbers in between indicate varying degrees of in-between interest.

Imagine how this would feel. You start talking about yourself, and at the end of one minute you see a 4. OK, they’re on the fence. You keep talking. At the end of the second minute, you see a 3.

Oh.

At the end of the third minute, it’s a 2.

I'm losing them!

At the end of the fourth minute, it’s up to a 3.

Ah, that last minute must have had a little more interest in it.

Then at the end of the last minute, it’s back down to a 2 again.

Ugh.

Now, Leary has rigged it. Half the students, by random draw, got 4-3-2-3-2 – and it didn’t matter how charmingly or how boringly they talked about themselves. The other half of the students got rising numbers: 4-5-6-5-6.

Not surprisingly, the students who had said that they cared about other people’s opinions had big reactions to the numbers. When one of these students got the sequence of falling numbers, their self-esteem sank.
“But the self-proclaimed mavericks suffered shocks almost as big. They might indeed have steered by their own compass, but they didn’t realize that their compass tracked public opinion, not true north” (Jonathan Haidt, The Righteous Mind 91)
Leary says that we each have an internal sociometer that continuously measures how much the people around us value having us around. When our sociometer needle drops, it triggers an alarm and changes our behavior. Leary writes:
“the sociometer operates at a nonconscious and preattentive level to scan the social environment for any and all indications that one’s relational value is low or declining.” (qtd in Haidt 92)
We ALL care what people think of us. It’s just that some people have such low self-awareness that they think they don’t. And why do they have that delusion? Because they think people will think better of them if they profess to be the kind of person who doesn’t care what other people think.

Actually, there is one group of people who truly do not have the internal unconscious sociometer and can be said not to care what other people think of them: psychopaths. Psychopaths would care what others think only as part of a plan to manipulate or exploit them. They don’t have shame or guilt: the social emotions that correspond to being attentive to what others think of us.

At this point you may be thinking, OK, we are all, if we aren’t psychopaths, attentive to others’ opinions, but even so, there are some people who are clearly always seeking approval and others who, while attentive to overt signs of approval and disapproval, aren’t always trying to get approval. Some people seem to need constant reassurance, and others seem to be more “self-defined” or self-differentiated. They take for granted that they are "acceptable enough," and, unless explicitly shown or told they aren't, they don’t seem to think about it.

To get at what’s going on here, we have to unpack the notion of self a little bit. What is this “self” thing to which Polonius tells us to be true?

Next: The Self

* * *
This is part 1 of 3 of "What Other People Think"
See also:
Part 2: The Self and Its Worldview
Part 3: Biases and Anxiety

2017-10-03

Quintessence of Glorious Dust

Yay! Death! part 3

At one point in his book, To Be a Machine: Adventures Among Cyborgs, Utopians, Hackers, and the Futurists Solving the Modest Problem of Death, after describing his encounters with certain transhumanists keen to inhabit robot bodies that would colonize the galaxy, Mark O’Connell reflects:
“Some essential element within me reacted with visceral distaste, even horror, to the prospect of becoming a machine. It seemed to me that to speak of colonizing the universe – of putting the universe to work on our projects – was to impose upon the meaningless void the deeper meaninglessness of our human insistence on meaning. I could imagine no greater absurdity, that is, than the insistence that everything be made to mean something.” (20)
Yes. Being who you are means making meaning in that particular way that you do. And that’s beautiful within the boundedness of its finitude. But to burst those bounds, to raise the prospect of subjecting everything to one particular way of meaning-making, is to lose meaning.

I am, then, so grateful for that boundedness, that finitude which frees me to play my part without overrunning all the other parts. What has meaning in context loses meaning when stripped of its context – and the context that allows our lives to have meaning is that they are so brief, so bound to a particular place – and a single lifespan of time.

The transhumanists speak of “the sense of ourselves as trapped in the wrong sort of stuff, constrained by the material of our presence in the world.” (55) They therefore seek a “more suitable computational substrate.” But it is just this material constraint that gives us the context for making some meaning of ourselves.

Would it be “you” if you were uploaded into a robot’s computer brain? Yes, it would be “you” in the same sense that the mountains and rivers and stars are already you. It would be "you" in the same sense that it would also be "me." The other sense of you is rooted in the sinews and guts of your particular body, and to put your brain processes into a machine would be to create something as different from you as your child. It would, in some sense, be your child. But it wouldn’t be you. (Nor is the future self into which you are slowly, willy-nilly, turning, and of which you are also the parent, as Wordsworth recognized, writing, "The Child is father of the Man.")

For one thing, a human brain, being organic, is, for better or worse, unpredictable. Each one has its own style, its own predilections, but also has a fair amount of randomness built into it. We often find ourselves doing something, and we make up a story that makes the action seem like part of a coherent purpose. That’s often a story made up after the fact to rationalize some bit of randomness. Our brains are like a school of fish, or a murmuration of starlings “where elements interact and coalesce to form a single entity whose movements are inherently unpredictable.” One could, I suppose, build randomness into a computer emulation of your brain, but what seems to appeal to the transhumanists is getting rid of the flaw of our unpredictable randomness. Get rid of that, and an essential aspect of our humanness has been stripped away.

Our materiality, the randomness and surprise our “wetware” produces, the urgency and preciousness of life that comes from its brevity, the dying animal that we are: this is our glory and our part to play in the vast cosmos.

At the end of one of his chapters, Mark O’Connell describes being back home writing up his experience with one of the sects of transhumanism. He writes:
"What a piece of work is man, I thought. What a quintessence of dust. Some minutes later, my wife entered the bedroom on her hands and knees, our son on her back, gripping the collar of her shirt tight in his little fists. She was making clip-clop noises as she crawled forward, and he was laughing giddily, and shouting “Don’t buck! Don’t buck!” With a loud neighing sound, she arched her back and sent him tumbling gently into a row of shoes by the wall, and he screamed in delighted outrage, before climbing up again. None of this, I felt, could be rendered in code. None of this, I felt, could be run on any other substrate. Their beauty was bodily, in the most profound sense. I never loved my wife and our little boy more, I realized, than when I thought of them as mammals. I dragged myself, my animal body, out of bed to join them.” (68-69)
Blessed be.



* * *
This is part 3 of 3 of "Yay! Death!"
Part 1: Yearning for Immortality
Part 2: Two Epiphanies