Someday in the foreseeable future, someone is going to write a postmodern short story called “Friends and Followers.” The question is whether we should hope it’s a postmodern postmortem. That’ll be one of those “hindsight is 20/20” things. Is social media good for us? Does it make us happier? Does it make us more employable? Hell, does it make us more employed? You know what I think on that subject. Or you haven’t been paying attention.
But today I want to ask a question we can answer. As usual, it has some setup, and probably an empty promise about more entries on the same subject later. But bear with me, because I always come around again eventually, and unless I die unexpectedly, or you die unexpectedly, we will, together, come back to old topics. Hey, maybe we’ll die unexpectedly together when the world ends. Because of social media.
The setup: I have 841 Facebook friends. I follow 519 people on Twitter. I am fascinated by the idea that I could care either about the author/curator or about the post itself on every single item that hits my feeds (and of course both, whenever possible, but the honest-to-God truth is that sometimes the people you love post boring-ass shit to the internet).
So I got into it. On the face of it, it’s a simple information overload dilemma. Delete:
Then it’s more a question of overall cojones and some tedium, right? No more information problems.
But here’s the thing, though: it’s a two-way street, innit? Here, ladies and gentlemen, is the rub:
That is to say, some of the people you’re connected to whom you don’t know well, or never knew well, are people you feel safe rejecting, but some of them are also people you originally became connected with in hopes of something more. So the question is —
Can social media be used by the individual to create or strengthen relationships with other human beings? Revised 3/25 — Can social media be used by the individual to create or strengthen relationships with other individuals outside of the internet/virtual networks?
In this context, the question is not abstract; it decides whether or not someone is worth keeping in your network. Because if there is no chance, keeping ’em around is just masochism, and regressive. Might as well move on.
So that’s the question, and that wraps this post.
No hang on.
I do have one final thought. A postscript question. I’d like to know: with the advent of social, mobile technology, by what percentage has eye contact declined in the first world?
I went to college with some fiercely intelligent, brave, and good people. They introduced me to progressive politics in ways I hadn’t thought about before. These new ideas included the concepts of institutional oppression, critical race theory, and white guilt. I didn’t really understand them until I left college and came home to my parents’ suburban home in Illinois. Then I saw firsthand what they spoke of, and witnessed my parents’ resistance to the idea that their prejudices were something larger than the accusations of leftist nutjobs. However, as time progressed, I saw that even this street went both ways.
One thing librarians understand is that meaning is created through contextualizing knowledge. Facts and figures mean nothing on their own; we determine ethical behavior by putting a story together about those facts and figures. Underneath the political and economic divides, there is a human nature, in the same way that there is a feline nature for cats. I’m not getting romantic here; it’s bio-evolutionary. One thing that is naturally true about human beings is that we like a good story, and that on the whole, we’d prefer a good story that makes us comfortable assigning blame to a complicated mess where things are very unclear.
The basic truth is that the moment there came into existence a second human being on planet Earth, nothing was ever all one person’s, one entity’s, or one side’s fault again. From this truth, we gain the knowledge that agency is the best solution to any given problem, and that denial of agency is not a methodology that solves problems. Because if every conflict you are involved in in your life is at least partially your responsibility, then you have at least partial agency in resolving the conflict. The word “disempowered,” which means “to be without agency,” is a truly apt political term to describe the harm majorities do to minorities every day in the United States.
However at the line between a minority member and a majority member, there are several complicating factors: he who represents the majority only does so with some factors of his identity, and he who represents the minority belongs to majorities as well. Which majority meets which minority depends upon which lens you are looking through at that line where they meet.
This is the hardest thing to communicate: even deeply complex, rich and compelling ideas have nothing on the complexity of a real life human being. We do not fit into binaries because we are all both empowered and disempowered, we are privileged and victimized, we are all hopeful and afraid. We are, individually, both sides of every political issue. These things are understood best by mediators, information science nerds, writers and those who are fascinated by the narratives at war with each other inside every person. We live in a postmodern world where not only is every person oppressive and oppressed, but every person oppresses himself. And you wonder why our famous postmodern authors speak of loneliness.
“The books we publish are all in the public domain so there is no real need for readers to continue to pay for them.” —planetebook.com
“When authors put their books on TUEBL they are joining the 21st century in realizing that releasing your ebook for free only boosts your sales [Source 1, Source 2, Source 3]. TUEBL helps market books and authors, and also integrates flattr to allow users to donate to their favourite authors directly.” —tuebl.com
A fuckload of classic literature:
- 1984 by George Orwell
- A Christmas Carol by Charles Dickens
- A Portrait of the Artist as a Young Man by James Joyce
- A Tale of Two Cities by Charles Dickens
- Aesop’s Fables by Aesop
- Agnes Grey by Anne Brontë
- Alice’s Adventures in Wonderland by Lewis Carroll
- Andersen’s Fairy Tales by Hans Christian Andersen
- Anne of Green Gables by Lucy Maud Montgomery
- Anna Karenina by Leo Tolstoy
- Around the World in 80 Days by Jules Verne
- Beyond Good and Evil by Friedrich Nietzsche
- Bleak House by Charles Dickens
- Crime and Punishment by Fyodor Dostoevsky
- David Copperfield by Charles Dickens
- Down and Out in Paris and London by George Orwell
- Dracula by Bram Stoker
- Dubliners by James Joyce
- Emma by Jane Austen
- Erewhon by Samuel Butler
- For the Term of His Natural Life by Marcus Clarke
- Frankenstein by Mary Shelley
- Great Expectations by Charles Dickens
- Grimms’ Fairy Tales by the Brothers Grimm
- Gulliver’s Travels by Jonathan Swift
- Heart of Darkness by Joseph Conrad
- Jane Eyre by Charlotte Brontë
- Kidnapped by Robert Louis Stevenson
- Lady Chatterley’s Lover by D. H. Lawrence
- Les Misérables by Victor Hugo
- Little Women by Louisa May Alcott
- Madame Bovary by Gustave Flaubert
- Middlemarch by George Eliot
- Moby Dick by Herman Melville
- Northanger Abbey by Jane Austen
- Nostromo: A Tale of the Seaboard by Joseph Conrad
- Notes from the Underground by Fyodor Dostoevsky
- Of Human Bondage by W. Somerset Maugham
- Oliver Twist by Charles Dickens
- Paradise Lost by John Milton
- Persuasion by Jane Austen
- Pollyanna by Eleanor H. Porter
- Pride and Prejudice by Jane Austen
- Robinson Crusoe by Daniel Defoe
- Sense and Sensibility by Jane Austen
- Sons and Lovers by D. H. Lawrence
- Swann’s Way by Marcel Proust
- Tarzan of the Apes by Edgar Rice Burroughs
- Tender is the Night by F. Scott Fitzgerald
- Tess of the d’Urbervilles by Thomas Hardy
- The Adventures of Huckleberry Finn by Mark Twain
- The Adventures of Tom Sawyer by Mark Twain
- The Brothers Karamazov by Fyodor Dostoevsky
- The Great Gatsby by F. Scott Fitzgerald
- The Hound of the Baskervilles by Arthur Conan Doyle
- The Idiot by Fyodor Dostoevsky
- The Iliad by Homer
- The Island of Doctor Moreau by H. G. Wells
- The Jungle Book by Rudyard Kipling
- The Last of the Mohicans by James Fenimore Cooper
- The Legend of Sleepy Hollow by Washington Irving
- The Odyssey by Homer
- The Merry Adventures of Robin Hood by Howard Pyle
- The Metamorphosis by Franz Kafka
- The Picture of Dorian Gray by Oscar Wilde
- The Portrait of a Lady by Henry James
- The Prince by Niccolò Machiavelli
- The Scarlet Pimpernel by Baroness Orczy
- The Strange Case of Dr Jekyll and Mr Hyde by Robert Louis Stevenson
- The Tales of Mother Goose by Charles Perrault
- The Thirty-Nine Steps by John Buchan
- The Three Musketeers by Alexandre Dumas
- The Time Machine by H. G. Wells
- The Trial by Franz Kafka
- The War of the Worlds by H. G. Wells
- Treasure Island by Robert Louis Stevenson
- Ulysses by James Joyce
- Utopia by Sir Thomas More
- Vanity Fair by William Makepeace Thackeray
- Within A Budding Grove by Marcel Proust
- Women In Love by D. H. Lawrence
- Wuthering Heights by Emily Brontë
Click on the motherfucking hyperlinks, bitches.
Here! Have a fuckload of modern literature, too!
- A Clockwork Orange - Anthony Burgess
- A Study In Scarlet - Sir Arthur Conan Doyle
- Abraham Lincoln, Vampire Hunter - Seth Grahame-Smith
- An Abundance of Katherines - John Green
- Artemis Fowl - Eoin Colfer
- Bossypants - Tina Fey
- Breakfast At Tiffany’s - Truman Capote
- Bridget Jones’s Diary - Helen Fielding
- Catcher In The Rye - J.D. Salinger
- Charlie And The Chocolate Factory - Roald Dahl
- City of Bones - Cassandra Clare
- Clockwork Angel - Cassandra Clare
- Damned - Chuck Palahniuk
- Darkly Dreaming Dexter - Jeff Lindsay
- Dead Until Dark - Charlaine Harris
- Ender’s Game - Orson Scott Card
- Everything Is Illuminated - Jonathan Safran Foer
- Extremely Loud and Incredibly Close - Jonathan Safran Foer
- Fahrenheit 451 - Ray Bradbury
- Fight Club - Chuck Palahniuk
- Go The Fuck To Sleep - Adam Mansbach
- I Am America (And So Can You!) - Stephen Colbert
- I Am Number Four - Pittacus Lore
- Inkheart - Cornelia Funke
- It - Stephen King
- Life of Pi - Yann Martel
- Lolita - Vladimir Nabokov
- Marked - P.C. Cast and Kristin Cast
- Memoirs Of A Geisha - Arthur Golden
- My Sister’s Keeper - Jodi Picoult
- Never Let Me Go - Kazuo Ishiguro
- One Day - David Nicholls
- Paper Towns - John Green
- Percy Jackson and the Olympians: The Lightning Thief - Rick Riordan
- Pretty Little Liars - Sara Shepard
- Slaughterhouse Five - Kurt Vonnegut
- Snow White And The Huntsman - Lily Blake
- The Book Thief - Markus Zusak
- The Bourne Identity - Robert Ludlum
- The Giver - Lois Lowry
- The Hunger Games - Suzanne Collins
- The Kite Runner - Khaled Hosseini
- The Lovely Bones - Alice Sebold
- The Notebook - Nicholas Sparks
- The Outsiders - S.E. Hinton
- The Perks of Being A Wallflower - Stephen Chbosky
- The Princess Diaries - Meg Cabot
- The Things They Carried - Tim O’Brien
- The Time Traveler’s Wife - Audrey Niffenegger
- The Ultimate Hitchhiker’s Guide To The Galaxy - Douglas Adams
- Tuesdays With Morrie - Mitch Albom
- Uglies - Scott Westerfeld
- Vampire Diaries: The Awakening - L.J. Smith
- Water For Elephants - Sara Gruen
- Wicked - Gregory Maguire
We can safely assume that the postmodern sadness and/or loneliness that is prevalent among middle class Americans today is, among other things, a byproduct of material comfort, and thus by definition, a first world condition. Looking at American history, we see cyclical action-reaction states of identity that move between “I want to be part of something bigger than myself” and “I want to be recognized and appreciated as an individual.”
The Great Depression and WWII, for example, were times when Americans wanted desperately to be part of something larger. Thus the WWII propaganda advocated citizens doing their duty to the country, even at home, even among women and children. The ’50s took that mentality to an extreme (the House Un-American Activities Committee, etc.), and by the time Vietnam rolled around, America was back to feeling a lack of individual authenticity. Out of that rose the civil rights movement, feminism, and personal computing, among other major characteristics of American culture today.
The sixties were arguably an incredibly patriotic time for the American people, because though much of the decade was dedicated to reforming the government, the ideal that the government could be a reliable and just entity was still alive and kicking. Compare that to today, when cynicism about what government is and does is pervasive among middle class Americans, especially Gen Y. The trouble is that, cyclically, we are back to a time when we desire to be part of something larger than ourselves—this being one of the foundations of the postmodern sadness we experience—but since the government has been dismissed as a viable manifestation of “something larger,” we have turned, unfortunately, to corporations. And to some degree, religion.
So it is that owning Apple products or Android products (or—gasp! and aren’t you rebellious—open source products) has become a personality identifier, that whole areas of profitable journalism are now basically marketing, that we see phrases emerge like “personal branding,” and see large sums of money being exchanged for data that is collected from websites which primarily exist for users to express their identities. Facebook, Twitter, and Pinterest have effectively replaced political communities.
In high school, in AP Gov, we learned that once every thirty years or so in America there’s a political realignment, and the parties’ hallmarks and party lines change. So “Republican” and “Democrat” come to mean something different than they did before, in common parlance, and the issues change.

The rough draft of the Declaration of Independence contains a distinct difference in the seminal quote we all know. The rough draft reads “life, liberty and the pursuit of property.” Why? Because at the time, only landowners could vote. Which is to say, our forefathers felt that representation was not only important, it was a key factor in happiness. Representation, upon reflection, is a combination of acknowledgment of a citizen’s individuality and of her role as part of a larger group.

It never occurred to them, probably, that Americans would stop caring, or stop educating themselves, or stop thinking for themselves. But the end result of all three is that to get out of having to think or learn or understand, we’ve resorted to faith-based politics. You “believe” in climate change or you don’t. “That’s just your opinion,” I see people say on Facebook, despite the fact that my opinion has a lot of empirical evidence behind it. When that happens, we don’t argue the issues; we argue about what criteria define an issue. So rather than arguing about ways to deal with global warming, we argue about whether there IS global warming, with one of the main points in the argument somehow ending up being whether or not one believes in it. We separated church and state a long time ago, but nobody thought that one day the existence and acknowledgment of sociopolitical issues themselves would be a matter of faith; we just assumed that the viewpoint on an acknowledged issue might be faith-based.
I think it’s probable that these party realignments, today’s based on either religion or corporate ties, correlate to the cyclical identity binary, and that progress will probably stalemate until such time as we can determine a compromise between group identity and individual identity that is balanced enough to break the cycle.
This is where my profession comes into play. The necessary change, to my mind, that will enable Americans to reclaim authenticity while maintaining their group identities is in how we approach what has been termed “information overload.” This is my feeling: the combination of the amount of information now available and the emerging view of the individual as a credible source has left many Americans too intimidated or overwhelmed to successfully curate the information they pay attention to in order to stay informed and active in their communities. It is nice to think that citizen journalism is a thing now, and that people can report from the ground facts that might never make it into mainstream media. But the other side of that coin is that mainstream media has been curating what counts as news for years, and we are just beginning to do so ourselves. As a result, people who are into technology tend to spend more time reading about technology than world news, people who are into politics are less likely to know about technological projects that help address sociopolitical issues, and so on and so forth.
But if the answer to too much information is censoring anything that is not a topic of interest, then we lose—both the opportunity to stay abreast of what’s the what, and the opportunity to become interested in new things. Two big, defining factors of middle class American happiness might be the social advantage that comes from the intelligent discussion of relevant issues in one’s community, and the ability to progress into new territories of learning and fun. Of course, these are two things that seem to be lacking to my eye (cynicism, yes) at the moment, but another thing that is lacking seems to be happiness, so my theory holds for now, anyway.
A while ago I read about Buckminster Fuller’s “Comprehensive Designer” in Fred Turner’s From Counterculture to Cyberculture—a role no human could ever really fill:
“According to Fuller, the Comprehensive Designer would not be another specialist, but would instead stand outside the halls of industry and science, processing the information they produced, observing the technologies they developed, and translating both into tools for human happiness. Unlike specialists, the Comprehensive Designer would be aware of the system’s need for balance and the current deployment of its resources. He would then act as a ‘harvester of the potentials of the realm,’ gathering up the products and techniques of industry and redistributing them in accord with the systemic patterns that only he and other comprehensivists could perceive.” (Turner, Fred. From Counterculture to Cyberculture. University Of Chicago Press. May 15, 2008.)
But my feeling is that though unreachable, it’s a laudable goal, and aiming for it can only improve the lives of the people who do. A more doable, less broad version of the same job might be called “librarian.” That’s because it is primarily the librarian’s job to find as many frames of reference (patterns that information can fit into) as possible for any given material or electronic item, in order to maximize the usability of that item.
However, there is another word for people who find more uses for information than originally intended. That word is “hacker.” Traditionally, we have avoided that word, which most people believe to be a result of an old definition used to describe malicious, non-consensual access of computers or data by people who were pretty good at computers and computer security. I think another, less well known reason we have avoided using the word “hacker” is that traditionally, no matter how they do it, hackers are “rebels” fighting “against the system,” who frankly don’t care what you don’t know, can’t be bothered to share information, and are not exactly socially adept. Hackers may not be trying to break into your computer, but neither does being friends with one sound like a good time. And yet a more creative definition of the verb “to hack” basically means “to repurpose”: to take information in one context and put it in another. People who excel at this, whether on computers, in art, in practical everyday processes, or wherever else, are people who are going to be capable of seeing many perspectives, of not being held back by dogma or fear, of being motivated and productive. There is, as our friends at Creative Commons would say, a “share and share alike” feature to this definition too, whereby the hacker shares her information and her methodologies, enabling her communities to grow with her.
I have termed this in my mind “the compassionate hacker,” but I have seen the sort of characteristics I’m talking about embodied in other words in other fields, too—“flipped academic” in academia (http://theinnographer.com/flipped-academic/), “found art” in the art world, and so on. The great re-contextualization. The job of the librarian has become less a matter of information navigation than of instruction in information navigation. To address the postmodern unhappiness we feel, and our revolving needs as individuals and members of groups, we must each become our own librarian—there is simply too much information, and too much emphasis on the individual as a credible source, for librarians to competently mediate today. Instead, librarians can teach the skills necessary to become a “compassionate hacker” in whichever fields people are in, and those people in turn will become teachers in their own right. This speaks to Fuller’s idea of the “Comprehensive Designer,” which focused on people being individual microcosms of the larger macrocosms they were also part of. In this way, we balance our needs for authenticity and perhaps can finally step outside a cycle that is slowly causing our demise.
Today, Maurice Sendak, the prolific children’s author known for unfailing honesty, died of complications from a stroke at age 83 in Danbury, Connecticut. For those of us who were once quite familiar with his picture books, there is a morbid transition here, one of many small farewells to childhood that, added together, are jointly called “growing up.”
For numerous young adults today, Mr. Sendak’s death is not only another cut tie, but also an echo of an old sentiment that American culture has rejected in conjunction with the increasing popularity of the scientific method for discovering truth. Sendak and his work are both part of the mythology of our lives, and his death represents something bigger than his life in that sense: it speaks to something about the meaning of our lives—and even if we cannot articulate what specifically this thing is, we feel it deeply, and it overshadows, if briefly, the rational knowledge that Sendak was just a man. For us, as is the case with all mythological characters—real or representational—he was more. We can learn something from this haunting echo, and in so doing, we can honor the life of Maurice Sendak.
It is perhaps the following premise that is the root of all art: experience is derived from neither fact nor truth, but rather from meaning. This, and also the rarely contested notion that happiness is primarily an experience, suggests that perhaps the common debate we know as science vs. religion is simply not useful. For the purposes of this post, I will address this quandary primarily as one of information navigation—that is, framing a debate is essentially an informational endeavor; doing it right ought to lead to personal or societal progress or both, and doing it wrong will lead to the opposite. Now, looking at both the individual and the societal scene in America today, any cynic (which is the thing to be these days) can make a strong case for societal regression. For my part, I believe it to be true, but I am not concerned with that issue in this post. Rather, I think convincing the reader that finding meaning in his own mythologies will make him personally happier and more fulfilled, which carries over to his interactions with his own society, will have more long-term benefit. Thus, the thrust is this: instead of science vs. religion, what we ought to endeavor to manifest is science and mythology.
It would probably behoove me now to define “mythology.” There is the common use of the word “myth” to mean “lie,” “fable,” or “fictional/metaphorical story which explains how something came to be,” including examples such as creationism, or the Hades/Persephone explanation for the seasons. To the extent that the fictions are taken seriously, they are frequently deemed spiritual or religious. But there are other ways to use the word mythology. American mythology, for example, might include the Boston Tea Party: “no taxation without representation,” an idea, a myth, a reality (we hope). Think of all the mythologies there are and could be: the mythology of postmodernism, the mythology of capitalism, the mythology of the Bush family, the mythology of the Narnia books, the mythology of rituals involving candles, the mythology of yours truly, Joanna Tova Price. Oh, and lest we forget: the mythology of science. Mythology is the other side of the coin; it is the meaning that accompanies truth.
We can, for example, talk about the truth of gravity objectively: that which is incontestably true but has little effect on us. Of course the notion that we are unaffected by gravity is ridiculous, but I here submit that it’s not the truth of gravity but the meaning of gravity that affects us. The meaning of gravity never contradicts the science of gravity, because they are not trying to share the same realm of authority. If a man were to say, “bungee jumping is my favorite activity,” or “the fact that humans will never fly without apparatuses makes me jealous of birds,” we would not consider these statements to be attacks on science—but we would understand that they affected his experience of the world, and thus his relative happiness. He discusses the mythology of gravity, and we know it to be a different thing than the science of gravity. We do not get upset at the fact of falling down or of broken dishes; rather, we are upset because of what such events mean. In the past we have given mythology as much authority over the domain of meaning as science has over the domain of truth. This is a thing we need to return to, a thing which scholars have recently titled “The New Sincerity.”
Sendak’s characters are most frequently concerned with what things mean, as opposed to what things are true, and when people discuss “the illusion of innocence” which Sendak’s books make short work of, what they mean is that Sendak gives an honest account of a child’s relationship with meaning. For while children have much to learn about how to find out the truth, they certainly excel at finding out the meaning in ways that adults must inevitably forget as they grow older (oh, the echo!).
But we are not wholly helpless in this domain, even as adults. We do need to reconnect with our mythologies, personal and otherwise. Religion is certainly an example of a mythology one might turn to, but it is hardly the only one. My favorite recent example of a personal mythology is Elizabeth Gilbert’s; she recently had her great-grandmother’s cookbook republished. You can read about it here. Another example that pops into my head is from a panel I attended at the Writers and Writing Programs Conference in Chicago in 2009, where poets discussed poetry and “citizen journalism.” Poetry does not represent the facts of a situation, but rather the meaning. In this way, poetry is never concerned with science, only with mythology.
There is also no reason to give up science, or even to lessen the authority of science. Rather, we need to place less value in the mythology of science, and turn towards a place where Sendak lived for years: a place where meaning has as much value as truth. Sendak was the gay son of two poor Jewish immigrants, and he described his childhood as difficult because much of his extended family was killed overseas during those years, in the Holocaust. It may be that these facts profoundly influenced his amazing ability to understand how children relate to meaning. Maybe the reverse is true instead. Probably, in some complicated way, both are true. But the facts are not what matter most here; rather, it is his role in the mythology of our lives that needs attention. Let us recognize the ghostly echo that follows the death of Maurice Sendak for what it is: a call to take back your disowned mythologies, to give them authority, and to love them once more.
Note: In the coming days, I will be exploring different ways to do this, individually and in groups, and I hope to post some thoughts on my webpage when it goes live. I’ll update you all when that happens.
Big, public events offer us the opportunity to look at the way our interactions with information on a grand level affect us on every level, individually and as a whole. Today, President Obama speaks in Ann Arbor. I suppose this is a bit of a mini-vacation for him, because it’s not as if he has to win our vote; this is a pretty liberal town. I remarked to a couple of friends that the concept of standing in line all night for a ticket to see the man himself was abhorrent, wasting so much time supporting what is essentially a circus. However, I want to clarify a couple of points. The first is that it is not the political process itself that I find so off-putting: the process for electing a president and the processes for making laws in the USA are two things worth taking seriously.
It is rather the context that we place politics in that is no longer worth my time. We live in a time where political news and Hollywood gossip are treated the same way in the press. We live in a time where good politics are sensational politics. We live in a time where the bottom line is selling a story, emphasis on sell.
So. You could write a feature story on me that discusses my flaws—of which there are a few—and you could write a feature story on me that discusses my better qualities. The first article would make me seem unqualified for whatever it is I am attempting, and the second article would make me seem practically overqualified. Both stories would, in a sense, be true. Neither would say anything that was factually inaccurate. However, in another sense, neither would actually tell the truth because neither would represent me as I am—a combination of both my strengths and my weaknesses. A good feature discusses both.
Furthermore, it will generally be true that those with great strengths also have great weaknesses. You will not find a presidential candidate whose weaknesses are limited to biting his nails or not cleaning his room enough. We hopefully expect our presidents to be better than us at the skills it takes to run the country, but we should not expect them to be better than us at life generally, because we are all only human, and our character traits generally balance each other out. Specifically, what I am trying to say here is that I don’t care how many times Monica Lewinsky did what to President Clinton in the Oval Office. I don’t care if Mr. Red is for family values while carrying on an extramarital affair with a man. His extramarital affair should not distract the media—and the public—from having a real and serious conversation about our core values, and the kind of America we want our kids to have.
The sensationalism with which we approach politics only hurts us. It gives people who are—more than Democratic or Republican—just plain crazy too much power. Palin is an embarrassment to Republicans, Democrats, and humanity generally, and yet she makes headlines because she’s just so sensational. In a country where politics were taken as seriously as they should be, the press would let Palin face her own incredibly embarrassing behavior privately as we moved forward in a serious conversation about the future of this country.
The way we approach politics today is an example of information behavior. Politics currently reside inside a particular context, or frame of reference, that is not inaccurate but rather is the wrong truth: the truth that only serves to make us dumber and hurt us more. When I decided not to go see Obama today, it was not because I don’t respect the Office of the President, or the work that a president, or Mr. Obama specifically, does. Rather, it’s because I have no interest in showing up to cheer for a battle already won when there are real issues to address, real issues that deserve serious attention and critical thought.
There is a different, better context to place politics in. Another, better truth to what is happening and what can be done in this country. As citizens, we should demand that politics be approached with the gravity they deserve. After all, at the end of the day, this is not about celebrity gossip, in Hollywood or on the Hill. If you only do one thing as a result of reading this op-ed, let it be this: decide for yourself what politics are really about, what they’re really for and what they really do. When you’ve decided, don’t let the circus separate you from that conviction.
-Post Secret (http://www.postsecret.com/)
One of the themes of this blog is the way human beings interact with information, and this essay, titled “Caught in a Bad Romance,” argues that on an intimate level, social media becomes toxic as it becomes ubiquitous. I will say at the outset that in some ways, I am not the best person to be making this argument. I have a smartphone with an unlimited data plan. I not only use Facebook regularly but am one of those obnoxious Facebook gamers who cause all of those shoot-me-now posts about how I need one more little piece of pixelated crap to complete this one quest, which is itself contextless and meaningless, and so on. I use AIM because it pulls together all the other instant messenger services I use into one place, and because it’s a client, which means that when I browse away from Facebook or GMail, AIM will happily keep my Facebook and GTalk messengers running. I have four different blogs and I’m an avid Twitter user. In other words, I really, really want your attention.
This piece, which I find myself sitting down to write after months of avoidance and frustration, will make me not only all of the above but also a raging hypocrite, because it will quickly become apparent that I am largely (though not wholly, thank God) arguing against exactly what I am, or have become. I am never off the grid, and it is not in spite of this, but because of this, that at last I find myself essentially more disconnected, and alienated, from real human relationships.
On October somethingth, Steve Jobs died. The person that Jobs was—his genius and his flaws—is perhaps the most powerful bridge for bringing you across the distance into my world. To me, Jobs will always be two things: the father of the personal computer, and a lonely guy. Why do I think he was lonely? Steve Jobs was a consummate liar, and liars are lonely. Two well-known examples are when he lied to Wozniak early in their careers in order to steal money from him, and when he signed a document in court stating that he was sterile and denying paternity of his daughter (he took it back and acknowledged her two years later).
There’s something discomforting about not telling the truth. I submit that it’s more than the obvious discomfort of doing something we’re taught on a fundamental level is “wrong”: each time we lie, we become a little lonelier, because we become a little more disconnected from the person we’re lying to, and from the larger thematic connection that conversation shares with human socialization. The liar alienates himself or herself. Although he or she may not consciously realize this, the discomfort attached to lying has more to do with that alienation than with any fear of being found out or any explicit rule about being wrong. The point isn’t that Steve Jobs was a liar—because it doesn’t take much to figure that out. The point is that because Steve Jobs was a liar, he inevitably must have been a lonely human being. It’s about how a lonely, socially screwy genius might have invented the platform for ubiquitous social media to fill a space inside himself.
For the literal-minded out there, this is not a true story in a factual sense, although the facts about Wozniak and Jobs’ daughter are true. This is a true story in the way a parable is: it tells a truth about the nature of some human phenomenon through story. This is a story that your brain may wonder about but your gut knows is true.
Let’s first talk about the ubiquity of computing. The success of the iPad, Apple’s alternative solution to the netbook, says something about our expectations of portable connection. Specifically, we’re more interested in the mobility of social media than we are in the mobility of actual work. That is, one does not do serious word processing, or create presentations, or do design work on an iPad. The iPad is primarily designed for social media and entertainment. For my part, I believe that media is an inherently social thing, that our brains are in a social mode when we watch movies or read novels and articles or play games, even if we’re doing so alone.
Apple’s insight here is that the American desire to be social is mediated by the American fear of coming forward in a completely, almost unbearably honest way, with all the awkward and unfortunate bits that make up the human character. Americans are afraid of coming clean and not being good enough. This, often called a “fear of rejection,” is more powerful than it first appears—it’s not about what other people think, or the labels. Rather, what I want you to consider is that it comes down to a question of loneliness. If you are found “not good enough,” it does not necessarily say anything about your character. There are plenty of examples in history where someone who was absolutely right was thoroughly rejected. Galileo wasn’t wrong, but you can bet your ass that when his entire world turned on him, he was lonely. It turns out that being right does not mean much when it comes to human relationships. I’m asserting here that Americans are not afraid of being wrong, they’re afraid of loneliness.
Steve Jobs’ fortuitous return to Apple turned out to be about taking that fear and wrapping it in consumerism, and delivering it underneath the Christmas tree in the American home. I imagine Jobs as a lonely guy, and as a resourceful guy. I imagine he might have recognized his own loneliness as a resource. You have to admire the ability of a man to take his own discomfort and sell it at great profit. Surely this is some kind of genius.
Thus far, I have tried to show that the liar’s greatest fear is loneliness, and ironically, that lying causes loneliness. Now I mean to show that social media is a form of lying. At least, off the bat, you can probably admit that social media is not entirely honest. Most of us—quite rightly—only put our best faces forward on the internet. Our Facebook profiles, Twitter accounts and Foursquare check-ins don’t exactly reflect real life. If they did, those photos that aren’t the good kind of bad (just the bad kind of bad), and those moments when we find we really could say in 140 characters or less something totally unpleasant about ourselves, would be documented for the whole world to see. Not to mention we’d have a Foursquare king of the bathroom, king of your parents’ basement, and so forth. There are numerous pointed moments we’d rather not let the whole world see, but more importantly, there are many more moments when what is going on cannot be articulated succinctly, cannot be expressed meaningfully through social media. Most of real life is like that, not concise or clear.
At the same time, there are social strictures that cause us to text someone instead of calling them, consistently read the Facebook or Twitter feeds of people we don’t talk to, not post those bad-bad photos, check in to the club every Saturday, and generally try to represent ourselves in a way that we only feel a little bit of the time. They exist because the deception is comforting. Abstracting the ambiguity of real life to the consumer’s version of happiness seems much easier than looking someone in the eye with all of the confusing, or terrifying, or infuriating, or insanely tedious, or exciting, or wondrous, or happy, or freeing, or empowering moments that make up real life. It’s hard to be honest about how awkward real life really is.
If you’re Steve Jobs, you have to look someone in the eye and say, “I’m the kind of guy who pretends my sperm doesn’t work to get out of acknowledging my daughter.” If you’re Steve Jobs, you have to look someone in the eye and say, “I’m the kind of guy who steals money from my friends.” If you’re Steve Jobs, maybe you don’t even want to look in the mirror and think that. This is maybe not a bad thing. So he doesn’t want to think he’s a “bad person.” So he doesn’t want to be a “bad person.” His desire to not be an asshole is itself a good thing. His resulting cowardice is probably a bad thing, but more relevant to this parable, it’s a common, human thing. In the face of our own wrongdoings, what humans want more than anything else is to not be wrong, and it’s not, at the end of the day, about what other people think about us. It comes down to what we think about ourselves, and what happens to the way we relate to the world after we’ve perceived ourselves to be bad or wrong. And what happens is we feel lonely.
It’s worth repeating that though Jobs is a great example because he is both Mr. Dishonest Person in an extreme way and Mr. Social Mobile in an extreme way, this being “wrong” or “bad” thing happens to everyone all the damn time. It’s part of the ambiguous nature of real life. The potentially new thing I am trying to get you to see is that when we lie, be it blatantly to someone’s face, or through social media by not presenting—not even having the ability to present—the whole truth, we make ourselves a little lonelier.
To the extent that we allow social media to mediate social interaction, we lie a little bit. The great irony is that though social media is frequently labeled as “connecting,” it is more often than not alienating. We become lonelier in the concrete experience of real life. Social media is pretty, convenient, and most importantly, it’s always there. It is also a poor representation of the rough-hewn, ambiguous and refreshing beauty that makes up the awkward truths of real life. The awkward truths in Steve Jobs’ real life were seriously awkward, but not quite as awkward when they were mediated by technology, and even better, not quite as true.
If it were merely a poor representation of humanity and human culture, social media could take its place on the long list of media that have tried and failed to do something real. But social media is devious in that it not only falsely claims to bring people together, it actually creates distance. It is social amputation. Those people everyone knows who spend more time interacting with people on their phones than interacting with the people they’re out with are the obvious amputees. You may very well be a less obvious one. I was very nearly one, and if you recognize that you could be too, you are not obligated to say it—knowing it is enough. The awareness will empower you, and that’s what I hope this essay will do: empower American social media users to be a little less lonely.
The ubiquity of social mobile local, of social media, of social technologies, etc., etc., does many things, and not all of them are bad. But one of them is unspoken, and I mean to say it as clearly as possible: On an intimate level, the ubiquity of social media means the ubiquity of loneliness.
For my internship at the game archive, I did an interview with a local community TV station. A sneak preview of the show, which is on the future of libraries, is right here. In case you’re wondering which person is me in this video: I’m the girl. ;)
[This is part 2; the first post gives my reasoning for saving the publishing industry. This post discusses how to save it.]
Previously, I spent some time on the worth of the publishing industry. In review, publishing as an institution is able to discriminate in an educated way. The publishing industry understands the worth of the great American novel, and it understands the worth of Twilight and it also knows the differences between their values, while still valuing both. It is the industry’s ability to generate and maintain standards, while still appreciating the different value of different kinds of literature, that makes it valuable. The logistical ability to create access to the content—e.g. with printing presses, through book stores and libraries, or with e-content—is a bonus, but I believe it is secondary.
However, I also see merit in the small, independent publishers, and self publishers, in the arts and crafts of the chap book or zine. It’s a beautiful thing when people create work with pride, and do not align themselves or hope to win the favor of big institutions.
As a reader, I have a great love for both industry publishing and independent publishing. When I think about my own appreciation, I see industry publishing as a way of entering entirely new universes—take Harry Potter, for example; independent publishers would not have the resources to create and maintain such universes for such wide audiences. But it is the independent publishers that provide community to readers and lovers of words. There are no middlemen. When I thought about it, I concluded that fan communities are a little bit like independent publishing, because they too generate mass amounts of content that are really only meant for the community, to enrich the small, intimate system that is a fan community. The value of the independent publishers and the self-publishers, then, is that while their communities might not be in the same physical location, they feel local and personal, like the mom-and-pop store, or the bar or coffee shop where all the neighborhood regulars hang out.
Finally, this led me to an approach that combines those two qualities into one, bigger market. A Creative Commons license allows authors to maintain as much right to their content as they want, and give away as much as they want, with pretty much any conditions they want. This is a very general statement, and of course “some restrictions apply,” but it is freeing nonetheless because it allows authors to share their universes in ways they were not able to before.
This, then, is the big idea: we have seen the iPhone and Android succeed as platforms on which software developers write apps and make their own money, with a small percentage going back to Apple and Google. This is a wildly popular market both for software developers and consumers. There is absolutely no reason why the same thing could not be done with books.
That is to say, what if buying a book let you enter a code on a website that admitted you to a community of readers? Those readers could upload films, short stories, long stories, artwork, animation, comics, or any form of creative expression set in the universe of the book(s) they enjoy, for money, a small percentage of which would go back to the publisher. The author wins because publicity would skyrocket. And because the reader would theoretically have to buy the book(s) to join the community, the publisher wins too: it makes money off of readership in a new way, and the arrangement allows for the feeling of a local community, which would support the publishing industry in ways it cannot find support among readers now. With Pottermore, J.K. Rowling is on the right path, but I think we could go farther. The creative works would be the apps, and the universe the books were created in would be the platform. Artists could offer some of their work for free to build reputation, and then sell their more advanced stuff. There could be forums and RPGs; people might develop computer games or board games. The advantage for the reader is that the universe is infinitely expandable, and there’s the social aspect, which is not a small benefit either. Finally, the author could participate in these communities, heightening the relationship between author and reader and achieving what is seen as very marketable today: the ability for someone who is liked by many people to communicate with his or her fans on a personal level (see Twitter as an example of celebs talking to their fans).
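Since the proposal borrows the app-store model, the money side can be sketched in a few lines. This is only a back-of-the-envelope illustration: the 30% platform cut mirrors the app-store convention mentioned above, and the prices and function names are hypothetical, not figures anyone has proposed.

```python
# Hypothetical revenue split for a fan-works marketplace built on a
# book "universe" platform. The 30% cut is borrowed from app-store
# convention purely for illustration.

def split_sale(price, platform_cut=0.30):
    """Return (creator_share, publisher_share) for one fan-work sale."""
    publisher_share = round(price * platform_cut, 2)
    creator_share = round(price - publisher_share, 2)
    return creator_share, publisher_share

# A fan sells a $4.99 short story set in the publisher's universe:
creator, publisher = split_sale(4.99)
print(creator, publisher)  # 3.49 1.5
```

The point of the sketch is just that the publisher earns per fan-work sale without producing the work, while the creator keeps the larger share, which is the same incentive structure that made app stores attractive to developers.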
Of course, this could take its form outside of the internet, as well. Publishers could create compilations of the best rated short stories by fans. Meetups for fan bases in big cities would probably be easily doable. If I were a publisher, I’d throw parties, in fact.
Monetizing readership in this way, I believe, is a win for everyone involved. As a reader, and as a writer, the concept is thrilling to me. I’d like to see the publishing industry revive itself, I’d like to see readership communities grow, and I’d like to encourage creative expression. This is one of those few times where I think a good business decision could also make the world a better place.