
Half-life

We've sold a home and not yet bought another, living in a holding pattern until we can move to where we belong.  We certainly don't belong in this apartment.  This purgatory does strange things to me.

I can sleep and eat (and even cook in a limited way) and drive to work, but the things that made my life mine are not here.  I can't put a record on the stereo, I can't shake up a mixed drink, I can't pull a book off the shelf and read.  All of that is on a truck parked in a nameless warehouse.  This is temporary lodging.  And that prevents me, in some subtle way, from starting to really live here in Durham.  I can't let myself move forward until I clear this final hurdle and get into the house that will be mine.

Alice and I were looking at live shows last night and bought tickets to see a band this weekend.  Why hadn't I done that already?  Do I not want to attach the memory of a live show to this apartment?  Do I want to avoid attaching any memories to this apartment - to avoid getting attached to it?  Maybe that's it.  It hurts to make a life in a place and then leave it.  Better to be temporary, to have a half-life, for a month than to really grab it and then have to tear myself away.

Ubiquitous cameras will make it hard to hide...your feelings

Nicholas Carr has written a nice tight piece of speculative futurism called "automating the feels".  The setup:  recently a company that produces corporate training materials came out with software that uses the computer's camera to make sure you're actually looking at their training videos.  Nick picks up that ball and runs with it - read his post.  But I got to thinking about what this kind of technology could tell us about ourselves.

Let's say, as Nick predicts, future communications will have a sort of side channel of emoticons automatically generated by the camera in your phone or computer.  Sort of like a soundtrack, but feelings.  You'd get emails about some project with running commentary about the sender:  "Bored.  Bored.  Bored.  Impatient." It would be impossible to lie about certain things; social niceties such as "looking forward to seeing you again" would come across very differently if the reader knew that while typing that, your expression was "disgusted".

Some people might learn things about themselves.  "The camera keeps telling me I'm angry whenever I text these people.  Come to think of it, it's right.  So why am I still hanging out with them?"  Or, "The computer kept telling me I'm happiest when I'm spellchecking.  Maybe I should have become an editor instead of an engineer."

Presumably it wouldn't be long before AI or fuzzy logic was used to fine-tune these algorithms to individual users.  Then your phone would just know that you're annoyed all the time, and it wouldn't bother to say so unless it's, you know, worse.  But that might be doing a disservice to people you correspond with who have never met you.  And the algorithms will get subtler as time goes on.  At first they'll only be able to detect rage and glee, but eventually every emotional distinction there's a word for will be detectable.  And maybe more.  Maybe eventually we'll have to start coming up with new words for emotional states that our computers tell us about but we have no names for.
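That per-user calibration idea can be sketched in a few lines of toy code - everything here is hypothetical, of course; no such camera API exists.  The filter learns a user's "normal" mood and only reports departures from it:

```python
# Toy sketch of a per-user emotion filter.  The emotion labels would come
# from some hypothetical camera-based detector; here they're just strings.
from collections import Counter

class EmotionFilter:
    def __init__(self, history):
        # Learn the user's baseline: their most frequently detected emotion.
        self.baseline = Counter(history).most_common(1)[0][0]

    def annotate(self, detected):
        # Suppress the baseline mood; only surface departures from it.
        return [e for e in detected if e != self.baseline]

# A user whose camera usually reads "annoyed":
f = EmotionFilter(["annoyed", "annoyed", "annoyed", "bored", "annoyed"])
print(f.annotate(["annoyed", "annoyed", "furious", "annoyed"]))  # ['furious']
```

So the chronically annoyed user's messages stay quiet until something, you know, worse shows up.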

There are so many possibilities.  In many areas (such as lying), our society functions on an imbalance of available information.  People choose how much of themselves to reveal.  Throwing the covers off that imbalance would be pretty disruptive, the same way the internet disrupted retail markets by making it possible to instantly get the price of a product at every store that sells it worldwide.  Eventually we might have to wear a balaclava just to send a neutrally toned message.  But I'm sure the machines will have an emoticon for that!

Two cities, two concerts

In the last week I've seen concerts in both Detroit and Cleveland, and I couldn't help but notice how differently people behaved.  TL;DR:  Detroiters clap.

I didn't start going to small venues to see rock bands in earnest until I moved to Cleveland.  The way people act here does vary from venue to venue, but I kind of figured this was just normal:  stand where you can see the stage, keep your mouth shut, and do not under any circumstances dance.  I might prefer a little more freedom, but I can deal with it.

In Detroit I saw Swans (again) with the opening act Low at the Magic Stick.  I would have paid good money just to see Low - see their incredible album The Great Destroyer - so I wasn't going to miss this show.  Their set was beautiful, and left me thinking I need to pick up more of their albums.  They have a loud-soft dynamic whose loud bits are a perfect pairing with Swans, but their soft bits are very soft indeed.  The guys behind me must not have seen each other in a while, because their conversation went on too long and got too loud.  In a rare moment of lucidity, I turned around to face them, jerked my thumb over my shoulder at the band, and asked the guys "are they boring you?"  They kept it down after that.  Unfortunately, I'd gotten up early and spent much of the day in airports, so I was too tired to enjoy Swans.  When I left at 1AM, they were still playing.

All the fans in Detroit clapped and cheered loudly after every song, and yelled in appreciation when the band started a song they recognized.  At one point Low said to the audience, "thanks for clapping."  That was what got me thinking about this.

In Cleveland three days later I saw ...And You Will Know Us By The Trail Of Dead at the Grog Shop.  This was a super high energy rock show in an intimate space, and the band worked hard to fire up the 70-odd fans by climbing down into the audience to play their instruments.  A few people were actually jumping around, but with all that effort and energy, three quarters of the audience was still just standing there nodding rhythmically.  At other equally energetic shows, there have been times where I've gotten into the music and started dancing in place a bit, only to draw glares from people standing nearby.  There is no clapping and little cheering at a typical Cleveland concert.  Maybe one out of ten attendees will yell in appreciation after a great song; the rest stand there like they've seen it all before.  Bands playing in Cleveland must feel like their shows are a flop.

I don't know what to make of all this.  Maybe there's an element of hipster insecurity to Cleveland audiences, a fear that if they admit the show is good, then people will think they haven't seen enough shows to know what's really good.  Or maybe Clevelanders really have seen it all before.  These two cities are too much alike for there to be big differences.  They're both under the gun in more ways than one, and moments of escape are important.  Seeing a rock band should free you, get you out of your own head, wipe the slate clean and reenergize you for when you have to get back to the grind.  Because the grind is waiting for you.  That's why we have rock music.

The "to think about" list

I looked at my to do list today and realized that I only really plan to do, soon, the top three or four things.  But the list is twenty or thirty items long.  I need to be more honest about this.

The last time I had a "to do list" that really worked was when I had a Palm Pilot and I used it to keep track of remodeling a house.  That was tidy.  But in the real world of personal projects that seem important and then fade, or work projects that split and change focus, the linear to do list is messy.

It turns out that many of the tasks that have percolated downwards in my list are things I either can't do yet because of resources, or don't know how to do yet.  If doing something is just a matter of my time and its priority coming together, then that's a real "to do".  If something else is missing, then it doesn't belong on a "to do list".

So I think I'll calve off a "to think about" list.  Then the things I can actually do won't get lost, and there's still a place for problems I haven't solved yet.

Fear of an Earworm

I'm halfway through the article "Fear of a Black President" by Ta-Nehisi Coates in the latest issue of The Atlantic.  It's an insightful and probing discussion of how Barack Obama's race has affected his presidency.  But the most memorable thing about the article so far, for me, is the author's name.

When I became aware of Ta-Nehisi Coates as a cultural critic a few years ago, I assumed the author was a woman.  There's something vaguely feminine about the prefix "Ta" and the ending "isi".  In the discussion of this latest article - I read about the article before I actually read the article - I learned that Coates is a dude.  OK, fine.  Filing away that fragment of cognitive dissonance, I dug into this important cultural artifact. 

Mr. Coates points out Obama's embrace of his race:  his unabashed blackness in preferring one rap star over another, in being photographed while a little black boy feels his hair.  These are cultural signals, and they're emanating from the highest office in the nation.  All previous emanations from that office signaled white culture.  Shades of white, from the Stepford-meets-Mad-Men Kennedy White House to the we-mean-business-but-we-horse-around-too presidency of Bush the younger, but all white.  Barack Obama is black, and the fact that he enjoys it and doesn't feel obligated to hide it is revolutionary for this nation.

How, then, does one pronounce Ta-Nehisi?  I'm pretty sure I've got Coates down, but ... tah neh HEE see? tah NEH heh see? 

I read, too, about Trayvon Martin's death and how the tenor of the national conversation about it changed after Obama addressed the issue.  All he said was 'if I had a son, he would look like Trayvon' and 'let's understand exactly what happened here, understand everything and investigate everybody'.  And suddenly what had been an uncontroversial tragedy became racialized political fodder. 

TAH neh hissy?

The name floated through my head while I chopped vegetables, while I drove to work.  It nagged me like an earworm, like a child's choir singing Go Tell It On The Mountain in my head endlessly.  Race relations in America faded into the background.

Some people have excellent visual memories.  Mine is auditory.  To dial a phone number, I read it out loud to myself and speak it back as I dial (usually sotto voce).  If I just look at the number, no matter how long I stare, I forget it by the time I've dialed the third digit.  I'm good at remembering how to pronounce words and names.  I need to know I'm saying it right.  It bothered me when my Chinese coworkers wouldn't even tell me if I was pronouncing the name of their damn city correctly (for the record:  Quanzhou sounds roughly like "chwen-joh").  So it drove me nuts not knowing how to pronounce this name that was so foreign to me that I couldn't even deduce its gender.

Finally I had an idea.  Tennessee.  Whenever I think of this author, I am mentally going to pronounce his name Tennessee.  Like Tennessee Williams.  Or not so much.  But anyway:  problem solved.  LA LA LA LA LA LA LA.  Now I can go finish the article, and find out how Obama can be black in some ways but not others, unburdened by questions of pronunciation.

Gooooo tell it on the mooooountaaaaain....

Ancient Chinese science fiction

When we think of science fiction, what comes to mind is space operas and barely-credible technologies.  It's very much a genre of the modern age.  But could science fiction have existed in other times and places?  When the Chinese invented gunpowder, what kind of future technology might they have imagined?

Fantasy (fiction) has always existed.  Think of the Epic of Gilgamesh - it's one of the oldest examples of literature of any kind, and although people had a different relationship with their myths than we do today, surely nobody took it as literal truth.  Fantasy is born out of an incomplete understanding of the physical world; the stories are narratives to fill in the gaps and possibilities opened up by not-knowing.  Why does the Nile flood every year?  Let's make up some supernatural beings--with personalities like ours--to explain it.

The genre of science fiction, by contrast, dates back only to the late 1800s and the rapid evolution of mechanical design that accompanied the Industrial Revolution.  Jules Verne and H.G. Wells caught the imagination of a lot of people.  It was born when a generation saw so much technological change that they began to expect change, and they began to wonder what changes the future would bring.  They speculated. 

That notion got me to wondering:  could science fiction have evolved alongside other surges of technology?  In the medieval Arab world, when they mastered the distillation of alcohol ("al-kohl"), did they cure some disease and invent some new kind of boat at around the same time, leading people to wonder what was next?  Was there speculative fiction in ancient China when they pioneered papermaking and invented the compass?

The thing is, "fiction" as we understand it today - as leisure reading for entertainment - is a product of the age of the printing press and of disposable income and time.  Gilgamesh, on the other hand, was probably pregnant with symbolism and lessons for members of Mesopotamian society; I imagine it was required reading.  Literacy rates were low and everybody was busy just surviving, so stories had to be important to be passed on.  So with some disappointment I must conclude that it's unlikely that science fiction flourished in the silicon valley of the Orient.  There would have been no audience, and it wouldn't have been thought important enough to write down.  Too bad.

Hardwired for abstractions

Recently I was part of a discussion about whether a particular musician's work was art or schlock.  Someone commented that those who call it schlock are members of the same class of self-appointed arbiters of taste that pollute every artistic field. 

And I thought, yes, the leaders in every field of human endeavor, not just art, maintain their lead by constantly creating new abstractions, forcing others to learn them.  Trading coconuts for bananas became gold currency which became paper currency which became credit which became junk bonds which became packaged mortgages.

And that's what I suspect differentiates genetically modern humans from other species:  we are hardwired for abstractions.  A successful quest for food or sex leads us to continue pursuing the winning strategy.  In apes, it ends there.  But in humans, we're saddled with self-awareness, which forces us to create a worldview.  Every success or failure has to be integrated into that worldview with an abstraction, or else it makes us feel anxious that we don't know how the world works.  So:  I killed the deer because I left a heap of apples for it to eat.  I got laid because I used that cologne.  These explanations give us a handle on what to do next.  They might be utterly wrong, but they dispel the angst of believing we're ignorant.

This post was an abstraction.  But was it art or schlock?

People learn best what they figure out for themselves

The title of this post has been a piece of my "working wisdom" for so long that I've forgotten where I picked it up.  Many thanks to whatever thoughtful soul told me that in the mists of my past.  I know it's true - but why?

An easily understood example comes from the sensory experience of wine tasting.  You can be told that "corked" wine (a bad bottle) smells like wet cardboard.  Or you can be told that Burgundy smells "barnyardy".  But you won't really get it until you smell it with your own nose in a wine that really shows it.  Many of the common descriptors in wine tasting were a mystery to me until I experienced really clear examples of them.

In teaching, there's a subtle difference between handing out conclusions and letting students draw their own.  (I haven't done much teaching, but I spent 12 years in college, so I think I'm qualified to talk about it at least a little.)  It's almost sleight of hand, a piece of verbal trickery designed to get people to draw a particular conclusion without saying it outright.  Being told, or taught, something directly is like being handed a map with two stars on it.  You may see where point A and point B are and how to get from one to the other, but you'll have a tough time making the trip without the map in hand.  Experience the trip, though, and the memories of the physical path will be burned into your neural pathways.  Your brain will remember the solved problem of how to get from A to B.

In his counseling circles, Parker Palmer makes a distinction between "open, honest" questions and leading questions.  In that context, there is no cut-and-dried factual answer, so to lead the learner is to impose your own interpretations and conclusions on their experiences.  That would be pernicious enough by itself.  But more insidiously, it narrows the field of inquiry, steering the learner away from interpretations that might be helpful.

What do I think about it?  I think that when you mentally process a new idea, you have to decide how worthy it is for space in your brain.  In the case of received wisdom, when someone else tells you what they learned, you can always make excuses and say you might have come to some other conclusion if you'd been there.  As a result, you don't feel a terribly strong need to remember it.  With direct experiences, it's harder to rationalize away your own observations.  And you remember - because you did the work yourself.

Science, justice, and living

I'm a scientist, and as such, I'm a big believer in using evidence and reason to understand the world.  But I admit it's not the only way.  Our legal system doesn't rely solely on rigid rules, and neither should we in our daily lives.

A recent Ars Technica opinion piece talked about why physicists so often try to speak authoritatively about subjects far from their actual expertise, and come across as jerks in the process.  The basic problem is that physicists (and many other technical types) believe that they are experts in the most fundamental, most important kind of knowledge, and in addition they're experts at using logic to defend almost any position.  This dovetails nicely with my last post and the comments on it.

One of the points made over and over again by Parker Palmer in A Hidden Wholeness was that it's a mistake to ignore other kinds of knowledge, for example what you might call intuition.  Or spiritual knowledge.  Or social insight.  You get the idea.  What makes physicists look like jerks is the implicit value judgement that these sources of information are inherently inferior to the firsthand observational knowledge of the natural world that physics is based on.  For many years I was that guy:  I couldn't tell you how many times I was called arrogant because I refused to take anything but what I called "facts" seriously.  I know now that "facts" excludes a whole lot of truth.

Science and justice:
Our legal system was born out of the historical period when empirical rationalism was the leading theory on how to guide people's actions.  That is in contrast to, say, taking theological recommendations as was done in earlier ages.  So the justice system and our Constitution were designed to influence our society like an engineer would design road widths to influence the flow of traffic through different areas.  For example, we decided that drugs are bad for society, so we outlawed them and put punishments in place for those who use them.  It's all theoretically based on cause and effect, though of course some laws are better supported by evidence than others.

But the legal system isn't a machine, impartially reshaping everyone who comes through it.  There is a non-rationalistic, non-empirical element built in:  the jury.  The jury is there to enforce what you might call poetic justice.

Science and living:
What does this have to do with our lives, the choices we make every day?  I've always tried to live my life according to principles, making an informed choice about the best way to live and then sticking to those choices.  When I learn something new, I revise my choices.  It's all very scientific.  But I've been thinking lately that it needs an element of poetic justice.  A truth other than the factual kind, a truth from my inner life in addition to those from the way I've come to see the rest of the world.  My wife has described me as the "king of self-denial", and it's true, I have immense restraint when it comes to doing what I *think* is best as opposed to what I want in the moment.  I've never trusted my impulses.

The thing is, I'm now coming to distrust my very scientific informed choices.  When I look back, I can see that a lot of them were just retroactive rationalizations for following a subconscious impulse.  That impulse might have been to avoid something feared, to approach something desired, or to strike at something hated.  But all those impulses were hidden.  To make a close analogy, as the Ars piece noted, training in rhetoric means arguing in defense of a predetermined position; that position may not be the one you would choose, or even one you think is right, but the audience never learns that it was assigned to you.  The point of the exercise is simply to lay down logical arguments to support it.  It's a deeply unscientific, even antiscientific, practice, and I think we do it all the time to defend our actions retroactively.  We do what we want, and then afterwards we come up with a story about why we did it.

If we're acting on our desires anyway, why not bring them out of the subconscious and into the light?  Why not see our motivations for what they are, and give ourselves a chance to decide which of them to give in to?  Our rules for having a good life may say one thing, but our sense of poetic justice--or maybe just poetry--may have something else to say.

The benefit of the doubt

"Never attribute to malice that which can adequately be explained by stupidity."

You've probably heard that one.  Here's a corollary:

"Never attribute to stupidity that which can adequately be explained by ignorance." 

When you find yourself in a serious disagreement with someone, first give them the facts as you see them.  Stick to what you've observed firsthand, not your interpretations or hearsay.  You'd be amazed at the people you can find common ground with just by sharing your experiences.  Second, if they agree that you saw what you said you saw but they still don't agree with you, silently decide for yourself whether or not they're just dumb.  If you think they are a smart person, you may reluctantly conclude that they are malicious.

Now here's the tricky part:  do them the favor of turning this procedure around on yourself.  Listen to the facts as they see them - carefully restrict them to their own firsthand observations.  Give their observations the same weight as your own in your interpretation of events.  If after careful consideration of ALL the facts you still disagree with them, then silently decide for yourself whether or not you are just not as smart as they are.  If you think you're not dumb, but you can't come to some kind of agreement with them, you might be malicious.

I like to trot this one out during election season, but it never works.

To name something is to begin to kill it

I have a group of good friends who've gelled over the last several months.  Having noticed this, I'm tempted to say to them, hey, we're the somethings - something we have in common - but I'm resisting.  To name something is to create expectations for it, set patterns for its evolution, and to limit it.

There's a saying that the vitality of a form of expression is in inverse proportion to the number of books that have been written about it.  Think of the difference between hip-hop and rock in the early 1980s.  Hip-hop had barely entered the public consciousness--it had only recently been named so people who weren't familiar with it could talk about it--whereas rock had been the subject of hundreds of books.  Hip-hop was thriving with its practitioners trying new things practically every week; rock was moribund, more a commercial enterprise than an art form.

What does my group of friends have to do with a couple of musical genres?  All of these are communities of people trying new things, each member creating something and watching what the others do, together forming a larger composition - a genre, a group of friends, a scene, an academic discipline, anything.

To name such a thing is to put the first knife into it.  Other blows will follow.  There will be an "elevator speech" that those in the know use to describe it to the uninitiated.  And there's a little prestige that comes with recognizing a thing that can be named - seeing patterns and showing them to people makes you look smart.  After conversations about the new thing, there will be articles, documentaries, books.  Those introduced to the thing at each stage will then seek out what they were told this thing is, but what they look for is only what the thing was yesterday.  They want the thing to freeze so they can experience it the way they were told it was.  With each of these blows, the thing becomes less vital, dies a little.  They start at virtually the moment the thing comes into existence.

I want my group of friends to be flexible and adaptable.  I want us to be able to let people in and let people go, to try doing things together that we've never done, to find new ways of expressing ourselves and new channels of communication.  I'm frankly scared that if I even so much as stand up and announce that we are a group, then even that will limit us.  I can live without the pride of claiming that tiny burst of prestige.  Maybe I'll just let us be.  Just being seems to be working out for us so far.

Ingredients

My wife has been accused of viewing a restaurant's menu more as a list of ingredients than as a list of finished dishes.  She often asks them to make something special for her.  Here's the thing:  I'm kind of the same way about technical results.  When a coworker gives me a report with some experiments and the conclusions they've drawn, I'm likely to go straight to the data and come up with my own conclusions.  Data, of course, are the ingredients of a technical argument.

(I can't help it.  All my training and inclinations drive me to take observations and generate abstractions from them.  I will not be satisfied with someone else's abstractions if I get a chance to see the data.  Trust plays a role here, as does the fact that people learn best what they figure out for themselves.  As for Alice, she might order "off the menu" simply out of personal preference, but trust might have something to do with it too.  She might look at a menu, see a parade of bad combinations, and simply lose faith in the ability of the kitchen to produce something good without her guidance.)

The work of the chef and the work of the scientist are not so different.  I've always said that mine is a creative job.  They call us "knowledge workers".  We start with the elemental and construct something novel.  Ingredients become new forms of sustenance, or new areas of human knowledge.

***************

I'm writing this post in Blogger In Draft, with the "Google Scribe" feature on.  As you type, it offers suggestions on the word you might be in the middle of typing, or the one you might want to type next.  It is the strangest damn thing I've ever experienced.  It feels like someone is constantly interrupting me, or trying to finish my sentences.  I'm tempted to use it to write a blog post about nothing, just by letting it dictate the next word and seeing what the infinite monkeys produce.  Self-driving cars are already in development--Google itself is part of that effort--so will automatically generated art be next?
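The infinite-monkeys experiment is easy to approximate with a toy bigram model - a sketch only; Scribe's actual machinery is surely far more sophisticated.  Learn which word follows which in some sample text, then let the monkeys type:

```python
# Toy next-word suggester: learn bigrams from sample text, then generate.
import random
from collections import defaultdict

def build_bigrams(text):
    # Map each word to the list of words that have followed it.
    words = text.split()
    table = defaultdict(list)
    for a, b in zip(words, words[1:]):
        table[a].append(b)
    return table

def babble(table, start, n, rng):
    # Starting from `start`, repeatedly pick a recorded next word.
    out = [start]
    for _ in range(n):
        choices = table.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

table = build_bigrams("the cat sat on the mat")
print(babble(table, "cat", 3, random.Random(0)))  # cat sat on the
```

Feed it this blog instead of six words and you'd get a post about nothing, dictated one plausible word at a time.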

Why music?

When you were a teen, you probably had songs that you listened to over and over, lying in your bed or huddled in the corner wearing headphones.  You weren't the only one.  Did you ever wonder why music is so popular with young people?

Well, what is our music about?  Mostly it isn't about our schoolwork or our jobs, our laughter or our arguments, the safe things that occupy our everyday lives.  It's about more ... I think the right word is "intense" ... feelings.  Love.  Anger.  Ecstasy, delirium.  Scary stuff - it makes you act in ways you normally wouldn't.  I've begun to think that music is our way of getting used to those raw states of mind so we're prepared for them when they happen to us.

Why would we need that?  Here's my guess:  we are, as a culture, notoriously blind to our own emotional and even physical states.  Our minds are swimming with distractions created by consumerism; we're manipulated by advertising and fantasies in our media that are all constructed to maintain this capitalist world.  All that outside influence takes a toll:  we may not have the time and the mental clarity to understand our own bodies and feelings - or we may simply give in to the distractions in order to escape from the hard work of figuring it out.  So we're unprepared to deal with lust, territorialism, fear, shame, and all the rest.  Music takes all those raw states and presents them safely, as entertainment.  Repeated exposure makes the messy stuff less frightening when it happens to us.

(Want a laugh?  When we have these intense feelings, we think:  I know what that is - and we start communicating in song lyrics.)

Here's a related thought, about adulthood.  I just got over appendicitis, and during the recovery I was repulsed by the idea of having a beer or a glass of wine.  Alcohol is about experiencing a different state of mind.  When I feel sick, I don't want alcohol, because I'm already away from normal - I want to be normal again.  So maybe music functions to familiarize us with the scary aspects of our ordinary state of mind, and alcohol serves to take us away when it gets boring.

Plain English

I consider it a virtue to speak in plain English, as opposed to whatever specialized diction is relevant to the topic.  It's hard work--buzzwords are easier--but I think I'm doing a favor to the people I'm talking to.

Every trade has its own terminology.  Acronyms like "HTML" are obvious, but it becomes quite maddening when terms like "quality" and "merit" take on unexpected meanings in manufacturing and human resources.  This trade talk is a convenient shorthand, but it also acts, for better or worse, as a group identifier and that makes it exclusive.

You don't have to be in a trade for its lingo to rub off on you, either.  For example, people who have received a lot of counseling sometimes engage in "therapy-speak".  In one highly personal conversation, my mind raced to fill in whole paragraphs of meaning behind terms a friend was tossing off by reflex - terms that have other, simpler meanings in plain English, but it was clear my friend meant more.  And I may have been wrong.  I may not have known all the associations and implications of their shorthand.

I don't want to be misunderstood.  Not when I'm excited about the topic, and certainly not when it's highly personal.  I discipline myself to speak in plain English, using the meanings of words that everybody knows.  If you read this blog (for example, the "recommended posts" at the bottom of the right column), I hope you'll agree that I've talked about some pretty abstract material in understandable ways.  It might surprise you that I do the same thing at work, replacing the shorthand of project names with their concrete goals.  Why?  My boss has four other employees like me, all working on radically different technologies; the last thing I need is for him to recall an outdated meaning of a term I've just used.
(Emily Dickinson famously grounded a difficult subject by saying that she knew she was reading poetry when "I feel physically as if the top of my head were taken off".)
Buzzwords are tempting.  They make you look smart, like you're in touch with how the experts define every aspect of your field.  They're also faster.  But when you go to all the effort of learning them, are you really better at solving problems in your field? 

Sometimes it's difficult or it takes longer for me to speak in plain English, because I have to think about what my audience knows and explain in detail in terms they'll understand.  In personal matters it's about being aware of, and honest about, my physiological and emotional responses (like Dickinson was), instead of packaging sets of them into code words that may not reflect my reality.  It's about knowing my audience, respecting their time, and giving them a chance to benefit from what I'm saying.  Otherwise, why bother?

Goodbye, TableTalk.salon.com

I just stumbled across this, which is happening as we speak.  A 16-year-old online community is being shut down - without an archive (via Scott Rosenberg).  It hurts a little, because I posted there from 1998 through the early 2000s before moving to Salon's other forum, The WELL.  I built friendships, met people face-to-face, and read a lot of great stories.  For example, this hilarious tale of two dogs finding an elk carcass and refusing to leave it.  What does the shutdown say about the permanence of content on the Internet?

The official announcement gives reasons for the shutdown and for the lack of an archive, but that's not why I'm posting this.  In this Metafilter discussion there's an interesting comment by user meehawl:
"I see my commenting history as more akin to conversation than anything else. It always suprises me that people might want to save what they've said online. This is not to snark at those who want to keep what they've said. It's just not my viewpoint. I'd happily see my ephemeral conversational metafilter history deleted."
Sounds like Facebook, doesn't it?  One of the first things I noticed about Facebook was that they make it a lot harder to find old content than pretty much any other site on the Web.  Facebook is not archival.  They don't make money off people looking up advice and interactions from years ago, so they don't let you.  It's not that it would be difficult for them to put in a search form; they purposely left it out so you'd concentrate on the here and now - and so they can build an advertising profile on you.

Discussion forums like Table Talk are starting to look old-fashioned in comparison to Facebook, but people who got on the Internet before Web 2.0 developed an expectation that everything they ever said was in the cloud and would stay there forever, searchable by a simple Google query.  Not that you necessarily wanted it to be - it just was, because of the simplicity of the HTML used to build forums.

That age has ended.  Closed databases like Facebook--and The WELL--are not indexed, either for reasons of revenue or privacy.  Every byte on the Internet has a cost and a value, and business decisions are being made about your public history.  Who knows, maybe eventually it will be unnecessary to be careful what you put out there - the pictures of you drunk, the off-color jokes - and link rot will take away everything you don't pay to curate and archive.

Meditation as housecleaning

I sat down to read on Sunday and found it quite difficult to focus.  I'd removed all distractions - I was by myself, comfortable, with some background music on - but a thousand thoughts interrupted me.  Gotta add that chore to my to-do list ... what did that guy mean when he said that ... it became comical when I remembered I was out of business cards and found myself getting out of my chair to put a couple of them where I'd find them later.

This was my mind tying up loose ends.

I kept returning to the book (my book club was meeting to discuss it Monday) but I felt bad that I wasn't fully engaged.  I liked the material, but it took a couple of hours before I could read more than a paragraph without thinking of something else.  Then I had a bit of an epiphany and jotted myself a note: the interruptions happened when I sat down to read because the environment I set up for reading is exactly the environment my mind needs to do its housecleaning.

Let's pursue this housecleaning analogy:
Let's say I have the house to myself on a Saturday.  I want to use the day to work on a big project and then relax later.  I have some breakfast and start walking around the house.  In the kitchen, I set down my coffee cup and I see that there are dirty dishes.  I wash them and clean the countertops.  I can't just start on my project with the house looking like a disaster.  Leaving the kitchen, I go upstairs to put on some old jeans.  On the way, I see dog hair on the floor, so I decide to vacuum.  But when I get to the bedroom I see clothes lying around.  I tidy them up and make the bed.  I vacuum.  When I go to the basement to get the handheld vacuum for the stairs, perhaps I notice my half-finished project laid out on a table, and start working on it.  Or perhaps I observe that the fireplaces are full of ash, and it's cold out so I might prepare firewood for later.  This goes on until I'm either too tired to continue or there is no more that I can do.  And then finally I settle down.  Perhaps in the living room, with a fire and a beer and some music.  Or perhaps in my office, with a book.  Or maybe, if I have the energy, work on that project.
When I got to the book club and mentioned my epiphany, Jack Ricchiuto told me that my experience was a textbook example of meditation.  In meditation, he said, there is an object of focus; at first the mind wanders, but you return your attention to the object.  After half an hour to a few hours, the mind clears.  I was quite surprised; meditation has been recommended to me before, but I always imagined it to be ... I don't know, something like magic.  Not something that could happen accidentally.  But this experience was quite familiar to me.  I guess I've been meditating from time to time all my life.

The guilt of ideas by association

Over at The Renaissance Mathematicus, thonyc is asking some very pointed questions about science - but the questions are really relevant to everyone.  First, how does a person's personal ethics affect their professional authority as seen by others?  Second, if your work might build upon the work of someone of questionable ethics, do you eschew acknowledging their professional contributions (thereby causing your own work to suffer) or do you refer to their work as you would anyone else's (at risk of legitimizing their less wholesome aspects)?  The case in point is a historian who is a leading expert on Galileo and also a Holocaust denier.  I'll be following the conversation.

Experiences become narratives that become bricks in the path that got us here.

Have you ever noticed that when you and your partner are telling stories about the things that have happened to you, the stories get tighter and tidier as time goes on?  Details get omitted if they don't contribute to the way you're interpreting it for the friends you're talking to at the moment.  You and your partner may even engage in a subtle public battle over the meaning of your shared past, like a watered-down real-life Who's Afraid of Virginia Woolf.

Our lives have a layer of incidental detail that influences the way we experience them.  In the retelling, we strip off these details to make neat narratives that are easy to relate and easy to hear and make us look clever and entertaining.  These narratives are malleable for a time, like clay, but eventually they ossify, hardening into bricks.  When we look into our past, these bricks fit together without gaps to form a path ... a path sturdily constructed to answer the question "how did I get here?".

We simplify our stories to make them easy, but the truth is that our lives are not steps from one stone to the next.  They're buzzing with possibility, with different things to notice and different ways to think and feel.  A life lived consciously cannot be distilled into a novel.

This phenomenon came up in a conversation I had with some friends; the Buddhist tradition explicitly fights it, making an effort to include all the details when events are recalled.  It honors the listener by assuming they're smart enough to come to their own conclusions.  Todd Kashdan recommends that when you have a good experience, you should try not to pigeonhole it with an explanation - the moment you do that, the event is dead.  If you allow the experience to be surrounded by all the tiny details that came along with it, you allow new meanings to continue to be created, and your happiness to go on.

Consumerism vs the liberal arts education

The other day I was reminded of the quaint concept of the "liberal arts education" when the shooting of Rep. Giffords in Arizona was blamed on the suspect's ownership of Mein Kampf.  Whatever the merits of that argument, or the question of whether the shooter had even had such an education, I wonder whether today's college students so much as know what a liberal education means.

When I was an undergrad in the late 80s, I got it, barely.  The professors explicitly told us.  They pushed back against students who regarded taking English courses as the educational equivalent of eating broccoli and wanted to dive right into their majors instead.  I get the impression that the last 20 years haven't been kind to the professors' arguments; as the cost of a bachelor's degree has vastly outstripped inflation, students have focused on ROI.  That's a consumerist viewpoint, and as a result, universities have come to resemble trade schools more and more.  This has even trickled down to secondary education, in the form of the No Child Left Behind Act.  The objective is to produce qualified wage-earners rather than engaged and well-rounded citizens.

Well, consumerism is a self-perpetuating system that trains people to earn more money by making things so that they can spend more money buying the things other people made.  The whole cycle is predicated on the brief rush of pleasure of an acquisition.  But happiness isn't just increasing the number of pleasant moments in your life - unless you consider addiction a happy condition.  The pleasure of acquisition is myopic; happiness requires perspective.  The liberal arts education used to give students the tools to get that perspective, but consumerism wants the resources that went into that education.  (Start your major early!)  I'd even go so far as to say that people with real perspective weaken consumerism, because they're capable of choosing to earn less in order to have a more satisfying life. 

"Kids these days?"  No, that's not my point.  The saving grace in this situation is the current generation's charity work and community involvement.  The grades and extracurricular activities expected of college-bound high school students today astonish me.  If I had graduated from high school in 2010 instead of 1987, barely in the top 20% of my class and with only one theater production to show for all those evenings and weekends, I'd never have been accepted to the University of Michigan.  The exposure to the broader community and the chance to be altruistic go a long way towards providing perspective.  The competitiveness of the college admissions process might just be a backhanded way of giving these young folks the tools they need to have a better life.

Self-help for corporations

On the face of it, a corporation is no different from a shelf full of beanie babies - it's a collection of real things that "exists", as a collection, only in an abstract sense.  The thing is, corporations exhibit some collective behavior that resembles the way people behave.  For example, can you blame BP's Gulf oil leak on individuals?  It's probably more accurate to say it was a result of a corporate culture:  standards and priorities that were set by and shared by most of BP's employees. 

Much has been said about the inherent amorality of corporations:  that they report only to the bottom line.  There are individual people who behave amorally too - but unless they're sociopaths, their conscience intrudes.  Then they start thinking about how to lead an existence that's more in tune with their surroundings.

Self-help books give such people mental, emotional, and social advice.  Which makes me wonder:  what would a self-help book for a corporation look like?  How can you guide the latent sentience of a group?

Corporate culture isn't top-down; examples are set from above, but they have to be reinforced, or reinvented, at every level.  There are many cases where a corporation lacks a cohesive culture or where parts of it have a culture different from the rest.  Take my employer - it grew rapidly by acquisition, so each site had its own "feel" for quite some time after acquisition.  Turnover, and the repetition of a consistent message from headquarters, have built a shared outlook.  But it wouldn't have happened without buy-in.  Every individual contributed.

Where does individual morality come from?  Strictly behavioral lessons, like "don't hit people or they'll hit you back," don't explain the full spectrum of human ethics.  Altruism (part of which may be genetic) comes into play.  Spirituality too.  And, perhaps most of all, empathy.

What kind of "book", then, can influence the individual employees of a corporation to set priorities in accordance with empathy, spirituality, and altruism?  Certainly their own moral codes are guided by these things.  Maybe they just need to sense that they're allowed--even required--to make those things a priority within their company.  We all want to act with integrity.