Nicholas Carr has written a nice tight piece of speculative futurism called "automating the feels". The setup: recently a company that produces corporate training materials came out with software that uses the computer's camera to make sure you're actually looking at their training videos. Nick picks up that ball and runs with it - read his post. But I got to thinking about what this kind of technology could tell us about ourselves.
Let's say, as Nick predicts, future communications will have a sort of side channel of emoticons automatically generated by the camera in your phone or computer. Sort of like a soundtrack, but of feelings. You'd get emails about some project with running commentary on the sender: "Bored. Bored. Bored. Impatient." It would be impossible to lie about certain things; social niceties such as "looking forward to seeing you again" would come across very differently if the reader knew that, while typing it, the sender's expression read as "disgusted".
Some people might learn things about themselves. "The camera keeps telling me I'm angry whenever I text these people. Come to think of it, it's right. So why am I still hanging out with them?" Or, "The computer kept telling me I'm happiest when I'm spellchecking. Maybe I should have become an editor instead of an engineer."
Presumably it wouldn't be long before AI or fuzzy logic was used to fine-tune these algorithms to individual users. Then your phone would just know that you're annoyed all the time, and it wouldn't bother to say so unless it's, you know, worse. But that might be doing a disservice to people you correspond with who have never met you. And the algorithms will get subtler as time goes on. At first they'll only be able to detect rage and glee, but eventually every emotional distinction there's a word for will be detectable. And maybe more. Maybe eventually we'll have to start coming up with new words for emotional states that our computers tell us about but we have no names for.
There are so many possibilities. In many areas (such as lying), our society functions on an imbalance of available information. People choose how much of themselves to reveal. Throwing the covers off that imbalance would be pretty disruptive, the same way the internet disrupted retail markets by making it possible to instantly get the price of a product at every store that sells it worldwide. Eventually we might have to wear a balaclava just to send a neutrally toned message. But I'm sure the machines will have an emoticon for that!