"New Technology" is an oxymoron. As soon as a technology is created it ceases to be new. It is a container, a bucket into which we load content. The relationships among the bucket, the content therein, and the people who use it are dynamic. This blog explores those relationships.

About Me

We reach "calm" by different paths, but we recognize it when we see it. That is not surprising. We have known it all our lives. Our first calm came when we were babies. Babies come with an on-off switch. They are either squirming, laughing, crying; or they are asleep. Ever wonder why there are so many images of sleeping babies on social media? Perhaps because they capture that memory of our first calm.
Then we grow away from calm. Just growing our body requires effort. My mother would come into my room at night and rub my knees until the pain went away. We now know that kids can grow inches a week, and it hurts! Growing into an adult life can also hurt, but it is also joyful. Despite some bruising of body and spirit, we spend our life meeting challenges, chasing dreams, reaching goals; missing others.
When we realize that there is more life in the rearview mirror than out the windshield, our perspective changes. That meeting with the "boss" you used to coach in youth soccer seems less vital. You notice you are living different priorities than many of your colleagues. You realize you want to find calm. Writing helps me find it. Maybe sharing what I write will help you find it.

Monday, February 20, 2012

Harry Potter's parents were killed because their personal data, data that they thought was secure and would not be used without their consent, was compromised by Peter Pettigrew, the very person to whom they had entrusted the data. Pettigrew leaked the data to Voldemort who, having thus learned the Potters' location data, came and killed them. It's a lot like shopping at Target. No, no, wait. I'm serious. A student just sent me a link to a story about a pregnant teen being "outed" to her parents by Target's "targeted" advertising. Aside from the murders, the story contains a lot of parallels to the tragic tale of the Potters' demise.

Here's what happened. It seems that Target does a lot of data-mining when you shop there. According to the Forbes article, "Target assigns every customer a Guest ID number, tied to their credit card, name, or email address that becomes a bucket that stores a history of everything they’ve bought and any demographic information Target has collected from them or bought from other sources." Then Target mashes the data around, looks for patterns that might reveal clues to purchasing preferences, and mails the owner of the "data bucket" a personalized flyer full of coupons that will "help" them save money on those items in which Target's "bucket algorithm" asserts they are interested. Well, some time in the not too distant past, Target's algorithm elves zipped out a flyer to someone the algorithm assured them was pregnant.

Problem: The recipient of the flyer was an unmarried high school student whose irate father showed up at the local Target store demanding to know why Target was encouraging his daughter to get pregnant. Further problem: the algorithm was right; the daughter was not only pregnant, but due to give birth almost exactly when the Target algorithm predicted. Dad apologized to the manager. A creepy tale for our time, but neither as unique nor as simple as it appears at first blush.
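For the curious, the "Guest ID bucket" mechanism the article describes can be caricatured in a few lines of Python. This is a toy sketch only: the class, the signal items, and the scoring rule here are my hypothetical illustration, not Target's actual model.

```python
from collections import defaultdict

# Hypothetical signal items; the real model reportedly used ~25 products.
PREGNANCY_SIGNALS = {"unscented lotion", "prenatal vitamins", "cotton balls"}

class GuestBuckets:
    """One 'bucket' per Guest ID, holding everything that guest has bought."""

    def __init__(self):
        self.buckets = defaultdict(list)  # guest_id -> purchase history

    def record_purchase(self, guest_id, item):
        self.buckets[guest_id].append(item)

    def score(self, guest_id, signals=PREGNANCY_SIGNALS):
        # Naive pattern match: fraction of signal items in this guest's history.
        bought = set(self.buckets[guest_id])
        return len(bought & signals) / len(signals)

store = GuestBuckets()
for item in ["unscented lotion", "prenatal vitamins", "bread"]:
    store.record_purchase("guest-42", item)

print(store.score("guest-42"))  # 2 of 3 signal items present
```

A real system would weight items, fold in demographic data, and estimate probabilities rather than a raw overlap fraction, but the "bucket plus pattern" shape is the same.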

Perhaps our naive assumption that the data we trail behind us in cyberspace will be used to our benefit can be traced to Google's famous founding motto, "Don't Be Evil." Yet, in the last week it has been revealed that Google has been circumventing privacy settings in Apple's Safari web browser, enabling Google to do globally much the same type of data tracking that Target has been doing within its organization. The revelation of "Safari-gate" has prompted calls for an FTC investigation of all things Google. Calls which, by the way, have fallen on deaf ears at both Google and the FTC. One wonders how long such stonewalling can last. Still, it has been going on for quite a while, and by now I would guess that Google and Facebook are the two companies that know more about the lives of millions of people in the world than any other entities. No doubt governments would love to know more, but they don't have Google's or Facebook's budgetary and technology resources. Besides, the CIA will probably soon be able to buy the app for their iPads - after giving Apple its 30% piece of the pie. The companies targeted by these negative headlines staunchly assert that any excessive gathering of personal data has merely been the result of unintentional missteps in their efforts to provide the services we demand of them. I wish that were a bald-faced lie.

You see, the fly in the ointment for those crying to stem the current tsunami of data mining, crunching and selling is this: we freely provide most of the data being mined. No one holds a gun to our head and demands that we use our Preferred Customer Card at the local grocery, clothing, or hardware store. We pay for the Groupon that feeds data into that bucket. We fail to install "Do Not Track" software. We blithely click "Like" and "+1" all over the web. We Tweet and Retweet our little fingers off, pouring more and more data into the busy maw of the data miners. Do we really think that all those "services" are provided to put money into our pockets? Let me tell you an old story about a free lunch . . . . Those "services" generate huge profits for a kaleidoscope of companies whose entire raison d'être is to lighten our wallets; to slide cash out of our accounts and into theirs. And that's OK.

No, really, it is OK. That is the core of capitalism, of a marketplace economy. It is what the nation has been about since our earliest days, and no one seems to have come up with a better system. The more nuanced issue is fairness and intent. My simplistic perspective is "tell me what you are asking from me, tell me what you know about me, and tell me what I am getting in return." If that information is open and up front, and if I can easily choose "not to play," then fine. I will not gripe. But that is not, it seems, how data mining works. Data miners work on the assumption of "what they don't know won't hurt them." They take our data, often in surreptitious ways, and use it to significantly increase their profits, or they simply repackage and sell the data to others. But, opine the data-miners, we gave it to them, the data are in their hands as a result of our own actions or inactions. No harm, no foul.

Increasingly, I have grown less convinced of the case for "no harm."

The case of revealing the teen pregnancy is one obvious example of harm being done. It is probably no big deal in the life of a large corporation like Target, but it is certainly a big deal in the life of that youngster and her family. The discordant dialogues within families are difficult enough without being brought to light by the blunders of a clueless crew of anonymous digital hucksters.

But I believe there is a deeper and more primary harm, and that is the re-conceptualization of the private. Our species began in private. Privacy was imperative or the faster, stronger creatures would kill us. We were relatively harmless little packages of protein - easy prey if the carnivores could find us. Then, across the millennia, we evolved into clans and tribes, towns and cities, nations and empires. We put on public faces to perform the public tasks necessary to maintain the complex institutions integral to civilization. Privacy became not so much a case of the survival of the species as it was a comfort, a soothing retreat from the rough elbows of public life. A private place became a space apart, became something to be valued and pursued. In America, one became fully vested in the dream by owning a "home of one's own." Nothing was more painful in the recent recession than losing that cherished private place, your home.

Yet now various hip "cyberati" inform us that "privacy is so 20th century." In the 21st we share it all, posts and reposts on your timeline from womb to tomb. Every private thought and action is made public, often at the very instant of its occurrence. Yet, if that were really the undisputed state of the current culture, why would the various intrusions into our data stream cause such indignation? Perhaps it is because we are upset by the realization that in a purely public world we lose the unique opportunity to construct truth from our private existence, because that existence is no longer private. Our insight into our personal past now flickers on Ancestry.com, open to anyone with the price of admission. Our personal present scrolls by on a variety of social media. The comfort of conversation is peppered with quick consults of the electronic oracle to settle questions or verify assertions of fact, history or locale. The distinction between public and private has blurred beyond definitional agreement. We seem to recognize those spheres only by the most egregious trespasses: "Not only do I not need to know that, I am offended by having been made aware of it," and, "How dare you seek to intrude upon that part of my life?"

Our inability to consistently or accurately discern the various shades of gray between those blacks and whites, between obvious good and unfettered evil, may well arise from the fact that good and evil often seem to wear the same masks and live in the same digital spaces, and those spaces are increasingly public spaces. Our lives, taken as a whole, have become more public than private. Which leads to this question: Is there an evolutionary advantage to lives lived primarily in public? If I am being asked to jettison the comforting quiet of the private in favor of the roar of lives lived in full public view, what do I gain? As an individual? As a species? To date the dominant response seems to be "better shopping." That is not yet enough for me. I'm still willing to settle for humble wine before a fire that is neither HD nor crackling in surround sound, but is quietly comfortable in a private, friendly circle built for two.

Wednesday, February 8, 2012

It is bad form to pose a question that is almost as long as the job candidate's presentation, so for once I followed my self-imposed mandate: shut up and listen. The presentation on Internet Art had been very interesting - thought-provoking even, as evidenced by the fact that more words than doodles adorned the open page of my trusty notebook. Problem was that few if any of the notations were of the "ascertain the suitability of the candidate" variety. They were all more skewed to the "nature of art in the world and the place of the Internet in that dialogue" end of the spectrum. My colleagues would not be pleased were I to trot them out at the end of a meeting that had already passed “quittin’ time.” I had already tossed out one brief question, which was swiftly ignored in favor of more formal matters of communication scholarship: methodological, theoretical and bibliographic issues. I started a new doodle.

And here's what I was thinking as I doodled:

You talked about an Internet art show where one of the formal curatorial criteria was to have no criteria, in that no submissions would be rejected. Takes me back to days when my daughters swam for the Faculty Club in the Triangle Swim Association. Everybody swam, everybody got a ribbon. No swimmer's efforts would be rejected. As a parent I understood that the dominant orthodoxy of the time was "everybody wins." I could appreciate that, but it muddied the previous clarity of a more hierarchical, though admittedly less friendly, system: the blue ribbon is first, the red is second and the white is third. If you didn't get one, well, swim faster next time. Tough swimming love, but at least clear. “Ribbons for all!” was more confusing. I remember that when my older daughter was swimming in the "six-and-under" group, she decided it was really important to get a ribbon of every color. She had every color but the cool black one with the gold letters. How do you tell a five-year-old, "Well, honey, to get that ribbon you have to be last. So go out there and do your worst!"

An art show in which there are no rejections injects similar confusion into the scenario. I can sympathize with the idea of breaking the bounds of the old "beaux arts salon" mentality in order to free up space for new methods of expression. But that same lack of criteria almost demands abuse. I mean, if you publicly declare that anything will be accepted, you are just asking folks to see if you really mean anything. You are begging people to foreground the "con" in contemporary art. “Let's see if they’ll accept this!” And we have all been to brick and mortar shows where yes, they did. Still, one thing about the Internet is that there are no walls on the gallery; that space is in many ways quite unlimited, so perhaps it is fine to give everyone a ribbon - as long as one is willing to accept the fact that the ribbon has no meaning.

There are probably as many different definitions of art as there are artists and art critics. The Internet seems to be most amenable to what I have thought of as "therapeutic art." Therapeutic art is art done primarily for the benefit of the artist. Therapeutic art is to the artist what the gym is to most people. Most folks go to the gym and work out because it makes them feel good - either physically when the endorphins kick in, or the more emotional and intellectual pleasure of knowing you are doing something "healthy." Most folks are not professional athletes who go to the gym to condition themselves to better perform the physical demands of their job, they go to the gym to feel good. The same is true of therapeutic art - it is a balm, it calms us amid life's ragged race.

The Internet art discussed in the presentation - often incredibly complex, intense effort and energy by thousands of people resulting in "products" that were often the fleeting or ephemeral manifestation of "process" - seems to have a strong therapeutic component. The repetitive attention to detail resulting in complex patterns seen in this type of Internet art is more akin to quilting than say, painting or sculpting. Those art forms are often defined by a jerky process, start and flow, stop and stare, trial and error enacted in the isolation of a studio. The analogy of Internet art to quilting fits even more snugly if we consider a quilting bee, when a group of artists in direct communication with one another work to create a communal work.

I tend to see quilting-bee, therapeutic art as different from what I think of as transcendent and/or definitional art. Transcendent definitional art has more to do with the definition of the discrete self that transcends the characteristics of the group. In this type of work the artist seeks an expression of an evolving or established singular self. The value in the creation of this type of art lies in the simultaneous expression of, and the physical crystallization of, the self in the artifact. That assertion obviously runs counter to the notion in general semantics that the word is not the thing. To a certain degree I am asserting that the creation is the creator, or at least has a holographic relationship to its creator in that he or she can be re-visualized through the artifact. I went to see the recent excellent exhibit of Rembrandt's works at NCMA, here in Raleigh. Since his death in October of 1669, Rembrandt has been his paintings. In the absence of the artist, the artifact becomes the primary manifestation of the self. And, of course, therein lies a threat, a danger in the fragility of the artifact. To the extent that the creation can be physically destroyed, so a portion of the expressed self is placed at risk.

Internet art, or at least art contained by the Internet, can be both advantaged and disadvantaged by its electronic home.

In therapeutic art the Internet offers the advantage of a seemingly infinite "quilting table." Millions of people can pull up a chair, stick in a needle and add their swatch to the pattern. However, there are risks attached to mistaking this therapeutic art for transcendent or definitional art. One can certainly find balm in the affirming shared activity of thousands - but these actions can also bury the uniqueness of the self in the complexity of massively networked activities. In its darker moments, networked art can feel more like the group-mind throbbing of a beehive, rather than the cozy comfort of a quilting bee.

Obviously, the throbbing of the hive also engenders the communal power that is often germane to political art. And hence the Internet proves fertile ground for activists who wish to create group artifacts that espouse particular political perspectives. But, in the interest of full disclosure, I need to admit that the one idea that alienates me from many of the current dominant trends in thinking about the relationship between the Internet and art is that I think the phrase “political art” is an oxymoron. Naturally one can discern political themes in artifacts, but when one starts a work with the objective of asserting a particular political perspective then one is doing public relations, or marketing, or good old-fashioned “politikin’.” You can produce great music, powerful images, and memorable moments doing politics, but since the message is predetermined by policy, I have trouble seeing it as art, which, remember, I define as an exercise in self-exploration and definition.

For that kind of transcendent definitional art, the Internet can offer safe harbor for the artifact. A lot of sticky issues lie behind that simple sentence. They all stem from my "anti-semantic" assertion that the artifact is the artist. The question is this: To what extent is a facsimile of the artifact really the artifact? Much of my own art is, at least in part, digital. Drawings and/or photographs are scanned into Photoshop, are digitally manipulated in that environment, and then printed out - sometimes repeatedly - on different media. I would assert that subsequent iterations of those files could produce an artifact that would be identical to, and hence the same thing as, any other output of the file. One could make commercial differentiations based on signature, chronological order, etc., but in terms of the artifact being the artist, the Internet could offer an artifact better "security through replicability" than any museum vault.

For artifacts created non-digitally the case initially seems less clear. Still, when we consider sculpture, music, paintings, etc., we enter an arena where, if we cannot already create functionally identical replicas, we will soon have that capability. Differentiating between different castings from the same mold was an arcane debate in the 1900s. It will become functionally meaningless as our ability to do 3D rendering from either digitally created files or digital files created from scans of three-dimensional objects comes of age. I recently read of a woman whose entire lower jaw had been rendered in 3D with a computer “printing” technique called “laser melting,” in which “layers of a metallic powder are built up and fused together with a laser.” When the jaw was implanted, the patient could talk, chew and breathe with an ease long absent from her life. Thus could Michelangelo's David be converted to a series of 1s and 0s, or perhaps qubits, that will allow for the creation of perfect replicas of any size, texture and color, including ones exactly matching the “original.” Seems to blur that entire notion of “the original,” no?

True, the idea of "Hey! Run me off another David!" makes me a little queasy. And I can envision an entertainment conglomerate buying up the rights to the David and selling "personalized copies" with "a true-to-life rendering of your very own face." Yeech. I am still a tad too attached to the idea of "the hand of the master." If the artifact is the artist, then - ethically and artistically - the replicated artifact must remain true to that which the artist created. When we step off that path we run the risk of bumping into the Stepford Wives. But in the final analysis, the artifact is a stimulus that triggers the firing of neurons in the mind of the individual experiencing that stimulus. To the extent that the replication duplicates that pattern of stimuli, then I think we can assume that the recreated artifact recreates the artist.

No, no. You do not want to go there. You do not want to start down the "Well, if what we are really looking for is ‘neurons firing a specific pattern’ why don't we just . . ." road. Why? Because identical stimuli fire neurons resulting in radically different “patterns of perception” in different people. You look at someone on the street and wonder, "Who let them walk out of the house like that?" They glance in the mirror and think, "Lookiiing gooood!" Same stimulus, worlds of difference in "the eye of the beholder."

I strongly believe that we need to somehow interact with an actual artifact, that we cannot try to make a leap to some kind of direct neural stimulation. For me the art that strikes the deepest chord is art that, through the creation and sharing of an artifact, creates a relationship between the artist and another individual, often an unknown individual. When I was a younger man, with dreams of life on the stage still large before my eyes, it seemed important that the artifacts in which I played a part should be seen by many. Similarly, ensemble work seemed the most fulfilling - the troupe performing for the many. Not so much anymore. These days "from my head and heart to yours," seems more fitting. The feeling of, if not the reality of, an interpersonal, dyadic relationship is increasingly important to me - in my teaching, in my life, in my art. So I am interested in media containers, be they physical or digital, that are amenable to those kinds of expressions.

The Internet, as a container, is not designed specifically for art. As a matter of fact the Internet is perhaps the most flexible, least content-specific container yet devised. If it has a "content preference" it is only the one that we bring to it. Currently people who self-identify as artists seem fascinated by the nooks and crannies of the Internet container that are new to us: the ability to display an artifact to, and receive feedback from, large numbers of individuals. We are also intrigued with the ability to involve large numbers of individuals in an Internet process that "feels" expressive, the ephemeral nature of the "artifact" notwithstanding. Those are certainly legitimate expressive uses of the container. But, they are not the ones that attract me.

Art situated in electronic social networks - either unique networks created for an expression or commercial entities like Facebook employed for an expressive project - is, for me, too reminiscent of middle school cafeterias. Those are spaces that, under the guise of sociability and inclusion, are actually more prone to the public competition and posturing for which at least the largest social network was originally intended. No doubt interesting work will grow in those spaces. However, another positive aspect of this flexible container that we currently call the Internet is that we are free to walk away from the areas in it that have no appeal. So, I choose to stroll away from the cafeteria that boldly hangs out the sign "Home of Internet Art" and seek greener pastures elsewhere.

And it just so happened that this morning I chanced upon a space in the container that I found far more appealing: vipartfair.com. It appears to be the Internet equivalent of a huge art fair in a major city: lots of artists displaying their artifacts in virtual booths. Nobody implying I need to "like" them. Just the "stuff,” and I browse at my leisure. The interface is rich, but takes some getting used to. Still, it seems quite well done - "grown-up," if that makes sense. Nobody is rushing around, hollering and posturing. Nothing "pops up" or "rolls over." Despite a wildly eclectic collection of artists, the “fair” has that "touch of calm and insight" that I associate with transcendent definitional art. No doubt others would find it tiresome. There are obviously some “Con”temporary works mixed in with the interesting pieces, but all in all, an intriguing show. However, I note with some dismay that the exhibit "closes" tomorrow. I suppose that artists in the new container will often mold the container to retain not only the form, but also the assumptions, of older containers. I might like it. I'm not sure. It seems that "closing" the fair on a specific day reduces the "cheapening effect" that accompanies the convenience of 24/7 accessibility to everything. Besides, when I signed up to enter the fair the curators assured me that I would receive notice of, and access to, future fairs. So I assume more of the art I like awaits me.

So here we are, back in the eye of the beholder, in the Internet of the user, in the art of the moment. There was a time when I would have felt the need to close with "the proper perspective." There may be one. Still, when I think about the myriad ways in which art and the Internet may intersect, it strikes me as foolhardy to assert that I know what that "proper perspective" is. You want truth, certainty? Not here. Maybe in that cafeteria back down the hall a bit . . . .

Wednesday, February 1, 2012

With an impending IPO we are once again knee-deep in all things Facebook. Part of that frenzy is an interesting article in PC World about how to block Facebook's new Timeline feature. That feature, on the off chance you have managed to avoid hearing about it, is an unfolding womb-to-tomb dossier that records in excruciating detail and precise chronological order every post, picture, like, dislike, poke, twiddle and bleep that you have ever entered on Facebook. The article's author, Sarah Jacobsson Purewal, was apparently concerned about the ease with which others could jump back years, into the midst of her posts as a less circumspect youngster. The steps necessary to bar that door are absolutely byzantine.

For me it was a "be careful what you ask for" moment. You see, almost exactly 7 years ago I wrote an article in Flow asking for such a device. I was asking for a digital "trunk in the attic." Let me quote myself:
The gradual, expressive, maturing of the digital environment makes me hopeful that an old communicative fantasy of mine may be edging toward reality. I have always been delighted by the creativity of others. Nothing gives me more pleasure than to be in the presence of another’s insight or expression and find myself reduced to a state of delighted confusion: How did they do that? And, how did they even think of that? The desire for real answers to those questions often drives me to Google to find an author’s or artist’s or musician’s or scientist’s email address and ask them. You would be amazed at how often they respond. The problem, of course, is that on occasion they have been so rude as to die before answering my questions – sometimes decades ago.

The frustration of their ultimate inaccessibility always reignites my desire for a “virtual biography.” I want to know what Einstein ate, what the streets he walked along looked like. I want to share the music to which Georgia O’Keeffe listened; I want to hear the sounds of London that Shakespeare heard. I want to be able to participate in some way in the experiential reality that must have shaped the creative flame within those souls. And I want to feel the firelight and hear the wind that whipped around the farmhouse winter nights in South Dakota when my father was a boy. I want to see the pages of the books that entranced my mother as a young girl in rural Pennsylvania. I want interactive, real time biographies that move beyond words on a page or flickering images sprung from the imagining of filmmakers and TV producers.

Such living histories would be incredibly difficult and expensive to create. To reassemble the past from fragments of mostly discarded data, to attempt to reconstruct from them a facsimile of the creative, reflective, experiential reality of one long dead is a daunting, if not impossible, task. However, assembling such works to chronicle lives in the present, using digital technology, has become surprisingly feasible.

Think about it. All you really need is a “capture device” – something that can record the visual, auditory and textual experiences of a life, a “structuring device” – something that allows one to edit, order and organize those collected experiences, a “storage device” – someplace to store both the collected data and the constructed representations, and a “publishing-distribution device” – something that allows for the sharing of the constructed representations with others.

In short, all you needed was Facebook. Of course, at the time Facebook was barely a year old and was primarily a tool that let socially challenged Ivy League college students hook up. So I look at Timeline with a kind of horrified fascination: "No! No! That is not what I meant, that is not what I meant at all!" What I envisioned for the "Digital Trunk in the Attic," or my later, more intricate, notion of a "Legacy Library" was employing the power of digital technology to capture the insights of those among us who have enriched the human condition with beauty and wisdom. Funny cats, dancing hamsters, or dancing drunks - these were not the artifacts I had in mind.
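The four "devices" quoted above - capture, structure, store, publish - can be sketched as a single toy pipeline. Everything here (the class name, the dict-based entry format) is my hypothetical illustration of that architecture, not any real product's API:

```python
import json
from datetime import datetime, timezone

class LifeArchive:
    """Toy sketch of the capture/structure/store/publish pipeline."""

    def __init__(self):
        self._entries = []  # storage device: the collected raw moments

    def capture(self, kind, content):
        # capture device: record a visual, auditory or textual experience
        self._entries.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "kind": kind,        # e.g. "photo", "audio", "note"
            "content": content,
        })

    def structure(self, kind=None):
        # structuring device: filter and order the collected experiences
        selected = [e for e in self._entries if kind is None or e["kind"] == kind]
        return sorted(selected, key=lambda e: e["when"])

    def publish(self, kind=None):
        # publishing-distribution device: a shareable representation
        return json.dumps(self.structure(kind), indent=2)

archive = LifeArchive()
archive.capture("note", "Wind around the farmhouse, South Dakota, winter.")
archive.capture("photo", "scan of a book from rural Pennsylvania")
print(archive.publish("note"))
```

Strip away the scale and the social graph, and this is essentially what Timeline does - which is precisely why it is a dossier rather than a legacy: the pipeline is neutral about what gets poured into it.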

As the Republican primary unfolds around us with all its hedge-funded hoopla, I cannot help but recall the 1988 vice-presidential debate in which Lloyd Bentsen chided Dan Quayle, "Senator, you're no Jack Kennedy." So I say to Facebook's Timeline: "I proposed the Legacy Library. The Legacy Library was a friend of mine. You, sirrah, are no Legacy Library, you are no digital trunk in the attic!"