
Wednesday, April 30, 2008

Brilliant, Scary, Visionary, and Strange

Anyone who has read this blog at all knows that if there's one thing I'm not, it's a so-called techno-libertarian, or whatever you want to call them. Leaving completely aside my communitarianism and everything else, there's the simple fact that at heart I'm an old-fashioned, read-the-dead-tree-newspaper-with-breakfast-every-morning, make-my-students-hand-in-actual-papers-to-my-actual-hands fuddy-duddy. The most advanced technology I use in the classroom is chalk. I don't know what the deal is with "Web 2.0," and I tend to be suspicious of those who claim to know. I don't like Blackberries, and I worry about the people who do.

I don't imagine that'll change. But still, just now, via Making Light (the wonderful blog of the Nielsen Haydens), I stumbled upon and watched with fascination a video of a 15-minute talk given by Clay Shirky (some sort of high-tech guru, I presume) that put the whole vision of an endlessly interconnected world of media and information into a context where, for a few moments at least, I really got it. I'm dubious about much of the history he invokes, and his math to calculate just "where do people find the time?" sounds a little crackpot to me...and yet the whole thing, his imagined evolution of us from passive tv-watchers to interactive Wikipedia-page-writers, was brilliantly persuasive. In 15 minutes, he travels from the wrenching changes of the industrial revolution (and its essential technology, gin), to the unexpected wealth of the post-WWII world (and its essential technology, the television sitcom), to the "cognitive surplus," to Pluto, and beyond. Watch the whole thing to the end, to make sure you get the somewhat scary (but oh so truthful) story of the 4-year-old and the dvd player.

I’m not yet convinced that the world he’s describing is real, or that I’ll be happy raising my children in it...and blogger though I may be, I still refuse to buy a Blackberry, and our rather simple dvd collection is more than adequate for keeping a lid on our entertainment needs, thank you very much. But still, I think I can understand the strange promise of that Blackberry, webby, downloadable, borderless, interactive world a little better now. Give it a look, and decide for yourself.

37 comments:

Rob Perkins said...

Clay Shirky missed the point, and overstated what that four-year-old was looking for and why.

Early seasons of "Dora the Explorer" used a computer game metaphor to encourage interaction. The opening title was even a computer screen. That show had a mouse pointer on the screen *every single time Dora asked the child viewer for feedback*.

This contrasts with a show like "Blue's Clues", where the host asked for spoken feedback. (And like lumps, my kids never talked back to the TV. It's a TV and they knew it.)

Naturally, the kid was looking for the mouse she knew should control that pointer. She did *not* assume that the screen was broken. She assumed that the mouse was missing.

(And in that sense, the producers of Dora also missed the point on an appalling level, by assuming that a child would *respond* to that kind of interaction by talking to the screen or pointing at it instead of looking for the mouse. I am certain that they did.)

Beyond that, though, his insights are actually kind of pedestrian, and as old as the desktop publishing revolution. Remember "Student Review"? I think it's still being published just off BYU's campus! The same publication you wrote for pseudonymously back in the late 80's!

So he's right in that media sharing and different time sinks are with us to stay. TV producers who think wiki and YouTube are fads and that people will all go back to television are not thinking with their expensive educations. If they retained any of it in the first place.

And he might even be right that we'll all come to assume that screens without interactivity will cease to be appealing.

But his little girl wasn't thinking that, when she went looking for the mouse.

Russell Arben Fox said...

Well, geez Rob, way to take the wind out of my sails. Next you'll be telling me that Clay Shirky is some old hack who still fiddles with Japanese transistors in his basement that nobody in the industry takes seriously.

Your commonsense point about "Dora the Explorer" makes sense, of course; we've had kids who've watched it for long enough that I remember those old episodes. So Clay got that one wrong. As for the rest of his stuff being "pedestrian"...well, maybe. But to me, who only learns enough about technology to avoid thinking about it for as long as possible, his talk of the "cognitive surplus" and all the rest was weirdly compelling.

Rob Perkins said...

Sure it was: you were a part of the desktop publishing revolution, the Mac and the laser printer, which freed small journalists from the need to approach the heirs of Pulitzer and Hearst in order to publish news. Instead of needing a press costing hundreds of thousands of dollars, writers could federate around a $10,000 rig and still make a newspaper.

And instead of watching TV or playing coin-op video games (which is a large part of what I did), you became a writer for both the newspapers affiliated with the campus as early as your freshman year, at least until the attitudes of the Pulitzer heirs forced a choice on you as to which one. I remember that.

So, in actuality, Russell, you were an early participant in Clay Shirky's trends. That's my reasoning for calling it pedestrian; it's been going on very steadily for 25 years or longer.

The rhetoric is even the same! Hence the reference to the Student Review. Before the Mac, no newspaper off BYU's campus lasted longer than a year or three.

The actual technologies are ancillary to the point, I think. The Internet lets you blog without considering the awesome level of human cooperation required to let you blog.

Shirky's point about TV as a surplus time sink is well-taken, but honestly, wasn't David O. McKay warning against it as a waste of valuable time 50 years ago?

So, yeah, from one point of view, quite pedestrian. Joss Whedon once put the term "somnambulant public" into the mouth of one of his supermen, ironically a boy who spent all his time watching all the television he could.

Perhaps I'm not impressed with Shirky's insights because I've now seen at least three waves of such "this'll change society" rhetoric without it changing much about society over the last 20 years.

That would be desktop publishing as a revolution (Macs and the Student Review), the Internet as a revolution in gadfly and at-times anarchic publication (Drudge Report), and now as a revolution in interactivity (wikis and blogging). But it's all one to me, since the rhetoric about these sorts of sea changes has always been the same. I'll bet that if I pushed a bit, I could even find that rhetoric present in the 1964 Civil Rights Act's fanfare, women's suffrage, emancipation, and the Declaration of Independence and the Constitution.

(So, yeah, pedestrian, but also kind of Jeffersonian, in that there is a "revolution" of one kind or another setting people a little freer, if they choose it? Hah! A pedestrian revolution!)

I'm just as impatient with most tech visionaries, for the record here. Steve Jobs, for example, is no longer impressive to me, though he was at the head of the DP revolution back in the 80's. Ray Kurzweil strikes me as something of a complete joke, since he draws linear projections about computing power, and sticks to the notion that our computers will soon awaken as artificial intelligence.

Put simply, I don't buy it, because I've learned a thing or two about physics and don't agree that the seat of human intelligence is completely in the brain. Yeah, that's a religious notion, but I think you'll recognize why a mainstream Mormon might think so.

Meantime, I can tell you exactly why to get a Blackberry or an iPhone. The primary reason is to be insufferable in casual conversation, since that wikipedia thing will always be at your hip.

And if I were to give advice to you, as a student to a professor, it would be to embrace the technologies which enhance your teaching and my learning. PDF-formatted papers in email are not scary. Nor is the idea of letting me use a scrubbed laptop in the testing center to write essay response tests, rather than those damnable blue books.

Hope you will.

Russell Arben Fox said...

Rob, I see your point now about bringing the Student Review and all that into the conversation. You're right: I do remember how people--and I!--were talking about desktop publishing and the way information was being "liberated" by all the revolutions which the Macintosh and the programs we ran on it introduced; we did look upon the "new world" of publishing in terms of empowerment, interactivity (though I suppose we didn't use that word exactly), and so forth. I guess I'd forgotten those days; thanks for bringing them back to my mind.

I've never, to my recollection anyway, really been struck before by the sort of high-tech visions Shirky is pushing; this is kind of a first for me. Usually I've been much more a dissenter. But somehow his rhetoric caught me unawares. You should blog on this, Rob; turn your comments into a longer post, about your loss of faith in the visionaries, if only because you see them--perhaps rightly--as just individual points along an extended continuum, stretching way back in time, points that for whatever reason want to see themselves (and want others to see them) as on the cusp of something dramatically "brilliant, scary, visionary, and strange," when the truth may be that they're simply plugging along the same road that everyone else is already on.

Anonymous said...

But assuming the mouse was missing *is* the point -- a screen without a mouse seemed broken to her, not in the "Manufacturer's Warranty" way, but in the engineers' way, where something that doesn't work as the user expects is broken.

In other words, the DVD violated her expectation that screens should be available as a site of interaction, whatever she was watching (it probably wasn't Dora, in fact -- I used that as a ref for an American audience, but she is English, and so was likely watching something else.)

Rob Perkins said...

@Clay, of course it was a consumer's response to an engineering detail; a four-year-old will know nothing of the warranty.

I have five kids myself. Three are still interested in that kind of children's programming, and I've never seen them expect our television to have a mouse. They simply go over to the iBook I have set aside for them in the corner, where their skill at using the software installed there surpasses that of most of the adults I see.

If there is a criticism, it is not that your original point is invalid. It's simply that the metaphor failed with me because of my own kids, who never interacted with the faux-interactive children's TV so popular with educators these days.

Instead, they all had the presence of mind from the age of three to know what the TV could do and what it couldn't.

Aldo said...

Rob, preliminarily, congratulations on the 5 kids. Brave woman, your wife. Somehow, I picture a Rob circa 1988 and can't imagine you espoused (but that goes for all of us, I suppose).

Anyway, I think Clay's point is still valid in that the way the next generation thinks about human interaction is radically different from what it was before. Whether this shift is revolutionary or evolutionary is ultimately unimportant. What is noteworthy, IMHO, is that my 16 y.o. has a very different notion of what it means to "keep in touch" than anyone in prior generations.

Thanks to facebook, he knows more of the minutiae of life in our small Connecticut hometown than anyone residing in Tokyo ought to know. Were we to move back, he could pick up life without missing a beat, and then stay "in the loop" of his Tokyo social circle to the extent he wishes. (Even this betrays my "old guy" thinking; I don't know that my son would acknowledge that "different circles" even exist.)

For us, and I think Clay was leading to this, life was compartmentalized; there was active/up-time and inactive/down-time. Now the boundaries are gone. While the kids get it intuitively, we old farts (maybe not you, Rob) need someone like Clay to say "Hey, the boundaries are down". And I believe that the boundaries being down does change society substantially, even if incrementally.

Unknown said...

Hi Rob!

I have the same reaction to tech visionaries that Rob does. Here's a good rule of thumb: if anybody tries to predict the future farther than six months ahead, he's probably wrong.

Here is where I think Shirky got it wrong. He's conflating two things: interactivity and creativity. It is absolutely true that media - virtually all media - is and will continue to become more interactive. But Shirky is making the claim that people will use this increased interactivity to make things. Like Wikipedia, or virtual worlds, or whatever; and that as media becomes interactive more people will start doing this.

If this is even more than marginally true I'll eat my hat. It may be true that such efforts will become more visible, because that's what the internet is good at - giving power to individuals. But here's the rub: creative people, the kind of people who make Wikipedia or open-source software or blogs - were creative before the internet came along. They wrote for Student Review. They bought chemistry sets and blew stuff up in their basements. They bought Apple IIs or Commodores and wrote code in their bedrooms. They joined acting troupes. The internet just lets them do all of that stuff better and faster.

And noncreative people? This is going to sound unbelievably elitist. Maybe it is, I don't know. But noncreative people were noncreative long before TV gave them an excuse. Shirky's claim appears to be that the internet will make noncreative people into creative people. That is an enormously difficult thing to do. Noncreative people aren't creative because thinking is hard work, and lots of people just don't enjoy doing it. So when they recreate, they do things that don't require thinking. Like TV. Or video games. The best interactive media in the world isn't going to change that except in isolated cases.

Russell Arben Fox said...

But here's the rub: creative people, the kind of people who make Wikipedia or open-source software or blogs - were creative before the internet came along....But noncreative people were noncreative long before TV gave them an excuse. Shirky's claim appears to be that the internet will make noncreative people into creative people. That is an enormously difficult thing to do.

Heh heh heh. You're playing to my fuddy-duddy prejudices here: my basic suspicion that much of the high-tech stuff which absorbs so many people's time isn't all that impressive. You guys are making me feel like I allowed Shirky's rhetoric to sell me a bill of goods.

Unknown said...

I build this stuff, and you're right. Some of it's pretty good. The internet was a major invention in the history of the world, no doubt about it. Some of it's just hype - look how fast the Palm Pilot went away. But none of it is going to change basic human nature.

D. Ghirlandaio said...

Art engages you as something made by someone else, that you experience not passively but actively and to which you then respond. Henry James didn't make room for readers? Can you be passive and read The Ambassadors?
Only bad television is passive, and even it takes effort. To do a remix of Hamlet and do it well you need to know the original inside and out. To understand or "get" the remix it would help to know the original.

It takes two people to communicate. To communicate is to challenge. To challenge articulately takes time and patience. To listen well takes time and patience and means you have to shut up. What this idiot describes is a world of narcissists rewriting each other's half-finished sentences in a haze of semi-social self-absorption.
Futurists know nothing about communication and culture. If I needed more proof that sci-fi and cerebral fiction is based on a misunderstanding, this is it. The whole thing makes me cringe.
It's the dumbing down of communication and art into interactive "gaming."

Why do games like chess have rules? Because games without rules are boring. Because in playing against an opponent you're playing against an opposing philosophy: Kasparov or Karpov? Baseline game or serve and volley? The rules are simple. The players are complex.
Dreaming is not art. Art is the articulate description of dreams. All Shirky is talking about is a waking dream: if nobody took the time to do anything well, then we could do everything together. We could all write like Robert Heinlein and Ayn Rand and make movies like Ed Wood. Wouldn't it be fun? The soundtrack for his intro, the corporate muzak power-pop, just about seals it.

Anonymous said...

We'll have many more people creating and so there will be a greater store of worthwhile ways to be distracted, some even of immense value. But, we will also have even less filtering, and so even a greater percentage of what is consumable (read: what is "shared" with us) will be crap (as difficult as this is to believe). And yet we won't have any more time than we have now to try to sort through it. And so, most people will fill their time with a handful of favorite toys - most of which will be of questionable value. Very much like the TV era, only more so.

I love the internet, and it has certainly changed everything. The amount of information we've got mere seconds away at any given time is ... there is barely a word. Awesome. But with what is gained, there is always something lost. I haven't received a personal handwritten letter in many years. That's a loss.

I always remember this article by the novelist Mark Helprin. If it doesn't bring out the luddite in you, nothing will.

http://www-rcf.usc.edu/~clingerm/helprin.html


~

Unknown said...

Let me say something about gaming. I think gaming gets something of a bad rap. To a large extent that is deserved, as most games to date have been fairly mindless. But I think that this is a genre with promise. Look at the evolution of movies from their invention until now. At first movies were dismissed as being mindless entertainment for the masses, and certainly not art. That perception lasted, what, 30 years or so, until someone figured out how to use the new medium to tell amazing stories in ways other media couldn't, and then movies became accepted as being as much an art form as the novel.

I think the same thing will happen to video games. There will always be mindless video games. But once someone figures out how to use the medium to tell really great stories that couldn't be told any other way, I think some of them will rise to the level of art.

Rob Perkins said...

Here's another angle; I'm into my third day trying to form an essay about just how futurism bores me. I intend to eviscerate Vinge, Kurzweil, Brin, and Steve Jobs in 900 words or less, and... it's hard!

Hi Aldo. Man, that's a blast from the past... Howzit? You're in Tokyo?

I should post our most recent family pic someplace.

@Glen, I build it too. But the Palm never really went away, we just call it "iPhone" these days. That's the altitude I see these gizmos at. Palm->Win Mobile->iPhone/Blackberry, all seeking to solve the same problems and be hyper-pagers to the masses.

I'd have to be a dolt not to recognize the fundamental shift the Internet offers, but the older I get the more I equate it with any automation, including steam factories and putting cannon on a ship with a compass from China.

But to answer Aldo, the different circles of your son's friends certainly exist, unless it is possible for him to combine his two groups of friends online somehow, in a more fundamental way than Facebook.

Circles of acquaintance and friendship can grow and flourish online. I've actually ended up offering marriage and child-rearing advice to women who play World of Warcraft, for example, and I am in business with a man in Ohio whose face I've never seen in person, and am employed by another man in Ohio with whom I have personal contact only one week out of every three years, apart from e-mails and AOL IMs.

Unless this just proves that I am an example of the point you're trying to make? Unless I'm one of those old guys who instinctively gets it, because I have been doing it since long before pagers could talk back?

That I've seen the boundaries as "down" for years, and therefore might see the bumps in the road which Clay Shirky did not point out in his short speech?

I've got to go off and think about this for a bit.

Rob Perkins said...

@Thomas --

I love the Internet, too. Without it, my current circumstance is impossible.

But it is certainly built on the least robust technologies, using the absolute minimum level of voluntary human cooperation, and it is owned in large part by people who would love nothing more than to own it more completely, and exact much larger tolls for its use than the usurious amounts they already charge.

It's going to collapse under its own weight unless some very hard decisions are made by the owners of the networks which comprise it. That collapse could (not *will*, but *could*) come as soon as four years from now.

@Glen re gaming and art -- It should not surprise you that I play World of Warcraft, but its use to me is as a friend-building circle, like that of the poker table, since its natural interaction group size is five players. The possibility for community building is realized there.

And its art production is at least as expensive as a movie, even if the stories told are somewhat banal. WoW is kind of like a "Star Wars" milieu, but with the opportunity to play along. It's the inheritor of D&D, in a real sense.

Even so, I do *not* suggest that people take it up. What a time-sink. Almost as bad as joining a neighborhood association board!

Unknown said...

I'd suggest that The Steve exists on a completely different plane than the other futurists you're talking about. Jobs doesn't try to predict the future; if you listen to his public pronouncements he tends to be very conservative, and in fact has gotten much more so over time. He's not really talking about fundamental changes in society. He's just building better gizmos, which I think most of us can agree he does very well. What Shirky, Kurzweil, et al., do is very different. They aren't businessmen. They aren't building anything. They're just talking heads.

D. Ghirlandaio said...

The problem with movies is that they make it easy to be passive. The problem with TV is that it makes it even easier, and gaming even more so. I don't love the internet any more than I love or would have loved the printing press when it was invented. The printing press put a lot of great craftsmen out of business. It facilitated communication over distances, a good thing, but it simplified it as well. Still I'd call it a net plus.

When tools make it easier to become thoughtless it's a problem. Futurists don't think of such questions. New things are cool; newness is the telos. But newness is not interesting thought.
Life is old wine in new bottles. Getting all hot for the shape of the bottle is what Madison Avenue encourages. Advertising is for futurists.

Technics is about bottles; art is about wine. The Gutenberg Bible is not about the printing press. The Sopranos, Deadwood, et al. are not about TV. Titian's paintings are not about oil paint, and Nadar's and Warhol's works are not about philosophy. At some level they're all about the impact of technology on life. Over the past few hundred years art has become preoccupied with understanding what it means to live with newness, because we are preoccupied with the question ourselves, and we make art.
Other people are enthralled with newness itself; they think newness is the answer. Most of them are technicians, and they illustrate their answers and call it art. Soviet Socialist Realism, The Fountainhead, and advertising. Norman Rockwell too: modern nostalgia. All illustration.

"We'll have many more people creating and so there will be a greater store of worthwhile ways to be distracted,"
Distracted from what!!?

Art isn't creation; it's observation and description. And doubt.

Anonymous said...

"Distracted from what!!?"

I think we intuit that there is a deeper engagement with reality, and that, depending on who we are, much of what we do is a way of avoiding that kind of engagement. However much I don't want to be this way, the majority of what I do on the internet I do to distract myself from the unpleasant realities of my life. That's sad, even pathetic, but there it is.

~

Unknown said...

"Art isn't creation"

I take issue with this. Let's think about the Wright Brothers for a second. One way to interpret what the Wrights did is purely technical - they wanted to build a new device, they went off and worked for a while, pulled off a brilliant technical coup, and made a machine that could fly.

But another, equally instructive way to interpret what they did is as one of the most influential pieces of performance art ever. What does it tell you about the human condition that two guys from a small town in Ohio would seclude themselves in their garage for several years and emerge with a contraption made of piano wire and fabric that could take flight, giving them views of earth and of humanity that nobody had ever seen before? And that they would take their device all over the world demonstrating it, and that the people who saw their performances would subsequently never see themselves or their world the same way again?

Here's what it tells me: it tells me that there's always reason for hope. It tells me that the world is a very surprising place. It tells me that Orville and Wilbur had it all over Christo and Jean-Claude.

Your homework: interpret the Apollo program as art; both performance art and as strictly aesthetic static art. What does it communicate to you?

Rob Perkins said...

Hah! Plenty of people have called the Apollo program a "stunt", since it failed to live up to the ideal they had of a colony on the Moon, rather than just a demonstration that the U.S. is capable of lifting missiles out of the gravity well. For delivery back into it.

And since a stunt is a kind of performance art... and a hoax (remember those people?) is a very refined kind of performance art!

Of course art is creation. Even description, doubt and observation are creative acts, because our senses perceive things imperfectly.

Rob Perkins said...

My intent is to place Steve at a certain point in the continuum, and mock his misuse of the term "revolutionary" a bit.

OK, I'll just post the draft, over at http://www.parasiticmeme.com/?p=22. And you'll see that I left Steve out of the mix...

D. Ghirlandaio said...

"But another, equally instructive way to interpret what they [the wright brothers] did is as one of the most influential pieces of performance art ever."

You're not referring to what they did but to the story that became of it. Time and distance mythologize events. You need to separate events from their use. The Wrights were a couple of dry sticks out of the midwest, full of can-do industriousness. If you want to make them into Duchampian figures it might make for a good novel, but that's your recontextualization of them and what they did. You might end up saying something about early 20th century America, but not the Wrights themselves. You could do the same for Hiroshima or 9-11. Stockhausen got himself in hot water for making that argument.

"giving them views of earth and of humanity that nobody had ever seen before?"

Balloons had been around for a while, and so had airborne photography. Here's Paris and Boston.

Unknown said...

"You're not referring to what they did but to the story that became of it."

No, I'm not. I'm "recontextualizing" the definition of art. Art isn't just something that sits on museum walls. Art is something - anything - that communicates something of value to humans about what it's like to be human. By that measure, the effort to build an airplane - something that had never been done before, and that most people thought to be flatly impossible - is one of the most audacious acts of hope and defiance ever performed. It certainly speaks as profoundly about what being human is about as putting paint on canvas or carving shapes out of stone.

D. Ghirlandaio said...

I'm "recontextualizing" the definition of art. Art isn't just something that sits on museum walls. Art is something - anything - that communicates something of value to humans about what it's like to be human.

Performance art is the rediscovery and reinvention of theater by those who were schooled to be opposed to it. Read "Art and Objecthood," the silly but historically important essay by Greenberg's most important protege. It was written during the period that produced performance art. It's a relic of prescriptive grammar for the arts. You'll understand what people were rebelling against.
You're free to describe anything any way you want. The question is whether it will stick. In the long run, your definition--itself defined by wishful thinking and ignorance of history--won't.

D. Ghirlandaio said...

"Art is something - anything - that communicates something of value to humans about what it's like to be human."

Then anything -any object or event- is art. Which would be fine if it weren't so general as to be useless. Best to say that any object or event is fodder for art, especially historically significant events. Saying the Wright brothers' flight or 9-11 is art is to think too broadly to be interesting. It doesn't redefine art; it vague-ifies it.

Unknown said...

"In the long run, your definition -defined itself by wishful thinking and ignorance of history- won't."

Let me ask you something. You're clearly a highly educated person. But have you ever invented anything? Ever come up with an idea for a machine, a technique, a new way of making something or doing something, and then worked to make it happen?

I have. I do it for a living. I don't claim to be in the same class as Orville and Wilbur Wright, Robert Goddard, Wernher von Braun, Max Faget, or Charles Babbage. But I'm involved in the same endeavor, and I think I know something about what it's like. And I guarantee you that every one of those men did what they did because they wanted to show, in no uncertain terms, what humans could accomplish. I think that's the same thing that motivated Bach and Rodin and Picasso. And I think their accomplishments deserve to be placed in the same category.

You seem to believe that the effort to develop new things that actually do something useful is somehow less profound than making things that just sit there being pretty or provocative. I simply disagree. But you know what? I'll choose to disagree with you without insulting your intelligence or your education, which is a step up from the way you've treated me. I'll even give you the last word. Go ahead and take it.

D. Ghirlandaio said...

My grandfather invented this. He held 20 patents and was a midwestern dry stick.
"I think that's the same thing that motivated Bach and Rodin and Picasso. And I think their accomplishments deserve to be placed in the same category."
Many technology enthusiasts agree with you. As I said before, with different examples: if you want to put the works of Wernher von Braun and Rodin in the same category you're free to do so. You're also free to compare apples and oranges. The question is: what do we learn by doing so? Wanting to see your work as art makes it so, for you, but the argument from historical precedent is thin.

I'm not a Luddite. I just got off the phone (no joke, and it's a Blackberry) with a friend of mine who's one of the 4 or 5 people on the planet responsible for the details behind the most advanced motion-capture effects in Hollywood. He's telling me I just gotta pick up the new 12" Wacom pad. It's a monitor! And I'm gonna do it. It's a neat tool. And I'll try to find a way to do something interesting with it. But it's not art; it's a tool.
Invention without reflection has not in the past been thought of as art. I doubt it will be in the future.
I'm sorry to be rude, but the defense of that deluded sales geek, or what he represents, is too much.

Rob Perkins said...
This comment has been removed by the author.
Rob Perkins said...

(The deleted comment is this one, but with a sentence completed.)

Something is not clear to me. Have you assumed that Glen defends Clay Shirky?

I took it that Glen is impatient with Shirky and tech "visionaries". But I agree with him that you have not been polite in making your point, which is what, again? That not every creative act is worthy of being called art?

Art?

That old short Latin word "ars"?

The one that survives in engineering's phrase "state of the art", which describes the most recent standardized techniques used to produce a complex product? (That is to say, the 45-nanometer IC fabrication process is "state of the art"?)

The word used to describe how a Dickens character in Oliver Twist got away from authorities?

Work the etymology on "artifact", which is certainly what the Flyer is, and you find that its use acknowledges that it is a work of human creativity, of art!

We enshrine both the Flyer and the B-52 in museums, after all, many times across the street from other museums which hold the things you have narrowly called art.

Your definition insufficiently captures the scope of the word, even in modern common use.

I don't usually call what I do "art", in that I don't usually think of designing user interfaces and workflows for industrial software as artful or even all that impressive.

But maybe I should. It's a marvelously specialized job I have, combining old Fortran and C++ programs with whatever new stuff Microsoft throws at me, just to keep up with the platform wars. D., Glen, you decide. Is UI design in the service of industry "art"? If a UI goes away like many of Apple's UI's do, is that art?

There is irony, by the way, in that you and Glen and I agree that Shirky is selling snake oil. Or a blue pill. We should be kinder to one another, since we are implicitly federated in our rescue of Russell from Shirky's Matrix. If I could find another metaphor to mix here, I'd drink it right down...

So there, I did it: a tiny work of silly art which amounts to almost nothing, except to illustrate an ironic point. Does it qualify?

:-)

Unknown said...

Rob said...

"D., Glen, you decide. Is UI design in the service of industry "art"?"

I think it could be; it depends on how good you are at it, Rob :)

I would not call the paintings of Thomas Kinkade art. I would call the paintings of Michelangelo art. Whether it's art or not depends not one whit on the medium; it depends on the skill of the artist. So I'd say that the UI of, say, Microsoft Word isn't anywhere close to art. But the UI of Myst? Or of a really, really well-designed website? Yep, very possibly.

D. Ghirlandaio said...

My second comment was in response to the discussion of gaming, which I think still deserves a bad rap. Early film is some of the most beautiful. Check out Louis Feuillade; you'll have a great time. Think silent Flash Gordon serials directed by Jean Renoir.
My third was in response to the Wrights as performance artists, and by that point I'd lost patience, but that was a mistake, and I apologize. I jumped too fast.

There's art in the way you choose your tie, and even in the way you walk, with an amble or a lope or a strut. Art is rhetoric, in both the good and bad senses. What bothers me is the argument that mechanism itself is art, that art is function. You could call this an esthetic, but it's the esthetic of autism. For most people -and I think this is true- art isn't the invention of the airplane; it's the bicycle parts and the starched white shirts and the way the Flyer was both old-fashioned and modern. As one of you said, it's all the associations that come with it.

The Wikipedia page says that when Wilbur died their father wrote: "A short life, full of consequences. An unfailing intellect, imperturbable temper, great self-reliance and as great modesty, seeing the right clearly, pursuing it steadfastly, he lived and died." Dry language but still very rich. There's poetry to it but I wouldn't describe the language as "creative."

Of course there's an art to interface design. The interface is where human flexibility meets the inflexible machine. And a beautiful web page is a beautiful interface as well.
This discussion is pretty much burned out, and I'll apologize again, but my only point- and I should have been clearer- is that art/poetry/whatever is inventive communication, but invention itself is not communication. The Flyer is as beautiful as a Brancusi, probably more so because it's more complex: it carries more kinds of meaning.
And I'm sure, just as there are two kinds of beauty in serve-and-volley and baseline play, and in the aggressiveness of Kasparov and the patience of Karpov, there are arguments among designers over the beauty of circuit boards. But the beauty is in the communication of the sensibility of the maker, not the function of the thing itself. To put it another way, art isn't doing something well; it's doing something that communicates well all the reasons you have for doing it. [It's funny but I think it makes sense] Anything becomes art that can do that. You could defend the esthetics of autism but I won't.
On a general note I don't think you need to recontextualize anything to see the beauty of the Flyer. And I'd argue, but only as a theater critic, that the spectacle of the first flight, being spectacle, is not as beautiful as all the stories that followed it. Anyway, enough for now.

Unknown said...

D Wrote:

"To put it another way, art isn't doing something well, it's doing something that communicates well all the reasons you have for doing it. [It's funny but I think it makes sense] Anything becomes art that can do that."

D, that's exactly what I meant. My beacon in that regard is Da Vinci, who made no distinction between art and invention, and did both as well as anyone who has ever lived.

And, apology accepted; I hope we meet again.

D. Ghirlandaio said...

Now I know where you got von Braun.

D. Ghirlandaio said...

And since I go for doubles: da Vinci is the originator of your argument. I think he's problematic as an artist and an intellect but that's a point to be debated more than fought.

Unknown said...

"I think he's problematic as an artist and an intellect"

Really? Why?

jmil said...

Interesting stuff. You guys might want to check out the most recent Malcolm Gladwell article in the New Yorker. http://www.newyorker.com/reporting/2008/05/12/080512fa_fact_gladwell

His end argument is basically that scientific discoveries are "in the air" and an eventuality. If Newton hadn't discovered calculus, Leibniz would have. (Bad example, bit of a Newton worshiper here) Anyway, he contrasts this idea of eventuality in science with the individual, un-recreatable character of works of art.

Also, I'm going to have to agree with Ghirlandaio's definitions of art. The others are a bit willy-nilly. Why do you want to be an artist so badly? Isn't it pretty great to be an inventor/scientist?

Unknown said...

Personally, I don't claim to be an artist.

Also, we need to distinguish between science and technology. Somebody at some point would have developed the calculus if Newton hadn't done so; absolutely right. Would someone have eventually landed on the moon if NASA hadn't done it in 1969? Not nearly as clear.

I think my original point about art wasn't communicated clearly. I didn't mean to say that art and technology were the same thing. I didn't mean that an artifact, something designed by engineers to perform a specific function, is art - although such artifacts can indeed be quite beautiful. What I did mean was that the act of designing something new, especially something so unlikely that it appears at first glance to be impossible, such as the first airplane or a spacecraft capable of landing a human on the moon, is motivated by impulses similar to those which motivate the best art. And if one examines the greatest technological achievements, they communicate to the viewer in the same way art does - and sometimes that's a rewarding way of thinking about them: as human achievements with tremendous emotional impact.