Monday, June 30, 2008

GREED-E

Ok, I admit it. WALL-E is a great movie. It operates as a simple love story, as a hero vs. villain melodrama, and as post-apocalyptic science fiction, and it succeeds at all three.

However, if only to maintain my curmudgeonly reputation, I have to find a few things to pick on. For now, I'll limit myself to two.

Behind the closing credits, there's a wonderful sequence of graphics that essentially recapitulates the history of art in the course of a few minutes. There are prehistoric-looking drawings, graphics resembling the work of ancient scribes, and medieval illuminations. There are also references to specific artists such as J.M.W. Turner, Georges Seurat, and Vincent van Gogh. I'll have to see it again to put my finger on it, but there's something about these stylistic allusions that suggests the Pixar artists are not simply paying homage to these great artists. They are smugly boasting, as if to say, "Ha! With our digital tools, we can do anything any other artist has ever done."

The more egregious fault, of course, is that although the entire movie is a heavy-handed screed against consumer culture, it's preceded by an ad for the WALL-E video game, due out next month. The discreet BnL ad hidden near the end is tongue-in-cheek, but the WALL-E video game ad is certainly not. Moreover, a quick Web search reveals that the Disney/Pixar folks are zealously pursuing every possible licensing opportunity for WALL-E toys, games, bedclothes, etc., just as with every other Disney property. It's as if the message were: "Humankind is doomed if we don't change our acquisitive ways, but meanwhile, buy some more junk from us!"

Thursday, June 26, 2008

ConFusion

Ever notice the rash of new job titles being created for positions involved with some aspect of developing and maintaining Web sites? It seems that every task or activity in traditional publishing has a corresponding position in Web site creation, but with a completely misleading title. In addition, some jobs correspond more to traditional broadcasting than to publishing, and they too now carry the prepended "Web" designation.

Web Master, Web Designer, Web Developer, Web Producer, and dozens more with "Web" in the title. Salary.com lists 18 different job titles starting with "Web...", and that's not even counting "Web Press Operator," a traditional printing job. Then there are the Web jobs that don't even have "Web" in the title: Information Architect, Experience Designer, Content Coordinator.

And the salary ranges are all over the place. Though you'd never guess it from the titles, some of these jobs correspond to traditional graphic design jobs, some to writing and editing jobs, some to software engineering and computer science jobs, and others to no traditional job at all.

Can job titles be copyrighted or trademarked? Maybe there's a revenue opportunity here.

Tuesday, June 24, 2008

The Mac Is Not Perfect

It's been a little over a year since I switched my primary home computer from a Windows XP Pro machine to a MacBook Pro. I guess anything with "Pro" in the name is ok by me. During that time, I've learned a lot more about being productive on the Mac. However, there are still some things that, in my opinion, Windows does better. (Gasp! Can he be serious?)

Yup, I'm serious. I'm certainly no Windows evangelist, and I do admire the Apple emphasis on product design. But some things are just plain harder to do on a Mac.

1) Using the mouse. Since Macs have always had mice, the system was designed around using a mouse for some functions. Windows, on the other hand, evolved on a mouse-optional platform, so you can do absolutely everything from the keyboard. When I'm doing text-intensive work, keeping my hands on the keyboard is much more convenient and comfortable.

2) Resizing from anywhere. On Windows, you can resize a window by grabbing any edge of that window and dragging. On the Mac, you have to use the lower right corner. Smaller target equals harder task ... that's Fitts's law, Human Factors 101. In fact, I like how Windows gives you three clickable window states: restored, maximized, or minimized. On the Mac, you can use the green "dot" in the upper left corner of the window, but the results are unpredictable!

3) Menubar location. On Windows, the menubar is always right at the top of the window you're working in, so it's nearby. On the Mac, it's at the top of the primary display. If you have dual monitors, it may be on a different monitor altogether, a good foot and a half or so from your work area!

4) Closing windows vs. exiting the application. On Windows, closing all the open windows for an application exits the application. If you have sneaky little windows hiding behind other windows, that can be confusing, but generally, it's a pretty simple model. On the Mac, closing a window just ... closes the window. The application is still sitting there, waiting for your next urge. Now maybe this is just the result of my years of Windows use, but the Windows model seems more intuitive to me.

5) The mouse plugs into the keyboard?! Boy, is that ever a pain in the neck. For that matter, how come there are no "natural" Mac keyboards? I've used these for years on Windows, and I find them really ... uh, natural. Could that be one of the rare cases where design aesthetics trumped usability?

6) ... Well, I can't think of any others at the moment, but I'm sure I will later.

Meanwhile, there are a few areas where I was hoping the Mac would offer a better experience than Windows, but found that it doesn't. Chief among these? The Mac still keeps nagging me to install software upgrades that will require restarting my system. Why is that? I worked on various *n*x systems for years and never had to restart to install software. Even the X Window System just had a command to reload its resource database after you edited the configuration files.
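If memory serves (and I'm assuming a stock X11 setup here, so take the exact incantation with a grain of salt), you'd just edit your ~/.Xresources file, run "xrdb -merge ~/.Xresources", and the new settings would take effect with nothing restarted, let alone the whole machine.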

More later.

Sunday, June 15, 2008

Details Matter

I'm not sure which is harder to believe: that Dr. Henry Jones, Jr., aka Indiana, professor of archeology, world traveler, and speaker of umpteen languages, ancient and modern, mispronounces the word "nuclear," or that multiple-Academy-Award-winner Steven Spielberg couldn't correct him!

Of course, neither of these was the hardest thing to swallow in this movie, but what the heck. It was fun.

Friday, June 13, 2008

The Turing Test

In 1950, computer scientist Alan Turing proposed a way to test a computer's intelligence. The gist is that you have a person hold a typed conversation with an unseen partner and see whether he or she can tell if that partner is a computer or another person. If the tester can't tell the difference, the computer is said to be intelligent.

Well, there are a million problems with this test, but one of the most important ones is the gaping question of who is doing the judging. I've known people who could talk to the recorded time on the telephone and not know they were talking to a machine. On the other hand, I've known people who can talk to you, and can detect the precise nanosecond at which your mind starts to drift and think about whether you put on matching socks that day.
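To make the setup, and the judge's outsized role in it, concrete, here's a toy sketch in Python. The canned replies, the stand-in "human," and the gullible judge are all my own inventions, purely for illustration; a real test would use a real person and a real program.

    import random

    def machine_reply(question):
        # A deliberately shallow contestant: canned answers, no understanding.
        stock = {
            "weather": "Lovely, if you like that sort of thing.",
            "poetry": "Count me out. I never could scan.",
        }
        for topic, answer in stock.items():
            if topic in question.lower():
                return answer
        return "That's a hard one. What do you think?"

    def human_reply(question):
        # Stand-in for the human contestant.
        return "Honestly, I'd have to think about that one."

    def imitation_game(judge, questions):
        # One round of the test: the judge interrogates a hidden contestant,
        # then guesses "machine" or "human".
        contestant, truth = random.choice(
            [(machine_reply, "machine"), (human_reply, "human")]
        )
        transcript = [(q, contestant(q)) for q in questions]
        return judge(transcript) == truth  # True means the judge wasn't fooled

    def gullible_judge(transcript):
        # A judge who calls everything human does no better than a coin flip.
        return "human"

    print(imitation_game(gullible_judge, ["How's the weather?", "Do you like poetry?"]))

Notice that the verdict depends entirely on the judge function. Swap in a sharper judge and the same machine fails; swap in a credulous one and it passes half the time by pure chance.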

Another way to think of Turing's test is: If you want to know if a computer is intelligent, talk to it and see if it seems intelligent. I think this test could be applied to humans, too, and I know quite a few who would flunk the test.

Setting aside the whole question of what intelligence actually is, we could probably put together a list of tasks that, in humans at least, require intelligence to complete. Recognizing a person in a photograph. Looking at a drawing of a room or a building, and then drawing that same room or building from a different angle. Reading some text and then answering questions about it. Completing analogies. You know, standard SAT-type stuff.

Now for any one of these tasks, it's possible today to build a machine that does really well. Does that mean that artificial intelligence has already been achieved? Or does it require the whole amalgam of skills to be intelligence?

I'll come back to these questions, if I remember.

Friday, June 6, 2008

The End of Art?

Paul Krugman in today's New York Times discusses the trend of making all books, music, movies, and other creative content available in digital form on the Internet. The abundance and availability of content will push prices down to near zero, so most creative media work will be free or nearly so. This means that creators have to look for other ways to make money.

Krugman suggests tours, merchandise, and other ancillary sales could help make up the difference. He also hints at the subscription model, in which such work is essentially syndicated to paying customers, and some portion of the revenue goes back to the creator.

It all sounds pretty bleak for creators. Think about radio and TV in the days before cable. Programming was expensive to produce, so sponsors paid the costs and got the advertising in return. This meant, of course, that the content had to appeal to the advertisers and, more importantly, to their target demographic. This is what led to TV's becoming what Newton Minow referred to as "a vast wasteland."

There are a few possible bright spots:

It used to be that you had to be mainstream enough, to appeal to a large enough audience, to get published, get recorded, get broadcast, etc. Now, almost anyone can produce a blog, a Web comic, a mini-animation, a video clip, or even a digital movie. Getting an audience is, of course, a problem, but people have found ways of finding content that interests them.

Relevance-based advertising means you can get advertisers even if your content is narrowly focused, offensive or subversive. Of course, your audience may not click on the ads, but at least there's the possibility.

Premium content might still be able to command some price. There have been a number of cases of authors turning their on-line content into traditionally published books and successfully selling them. Publishing on-line may actually help you find a ready audience, as well as helping you hone your craft.

As Alvin Toffler predicted, the pace of change keeps accelerating, so people have to adjust to more and more changes in the space of a lifetime. This certainly creates opportunities for creative artists in terms of media to work in, subject matter, potential audiences, etc.

At least, that's what I'm hoping.