The late Douglas Adams once remarked of 2001 that one hears Arthur C. Clarke complaining that Stanley Kubrick left out this idea and that idea and another one over there, and in the end one is left with a fierce admiration for Kubrick. Now it actually is 2001, and currently playing in theaters is Steven Spielberg's version of a project Kubrick had had in the planning stages for decades: AI. And having now compared Kubrick's treatment to Spielberg's film, read the pans of both professional critics and barely literate buffoons on Usenet who totally failed to parse what they were seeing, and seen the grumblings of many in the string of screenwriters Kubrick called in to work on the project, it seems clearer to me than ever: Kubrick, genius; everyone else involved, not.

My reaction to the film has thus far been mostly contrarian — wanting to slap the people who say "realy crap dont waste yor mony" but also wanting to slap the people who've hailed it as a flawless masterpiece. The consensus among those who've had intelligent things to say has been that the film is severely flawed but still worth seeing, and I'll go with that. Just so you know where I stand. I should also get out of the way my reaction to the whole Oedipal element of the film, which is a bit idiosyncratic. Here's the situation: a pair of parents have a kid stuck in cryonic suspension, and get a "Mecha," a robot with really good AI, as a sort of replacement. The mother activates a program that causes the AI boy to imprint upon her — his purpose is thenceforth to love her, just as the purpose of the sex bot we meet later on is to satisfy the ladies. But the biological kid suddenly gets better, and a couple of misinterpreted incidents prompt the parents to want to ditch the AI kid, so the mother abandons him in a forest. He spends the rest of the film trying to reunite with her.

When I was eight years old my mother flipped out (as was not uncommon) and, rather than pulling an Andrea Yates, announced that my brothers and I were being kicked out of the house and would have to live on the street. This wasn't a calm declaration: she was completely insane, literally foaming at the mouth as she screamed, "WHY AREN'T YOU PACKING?! YOU DON'T LIVE HERE ANYMORE! GET OUT!" I don't think it's a stretch to say that the majority of my psyche was forged right then and there: just as an example, my abhorrence of drugs and alcohol is primarily rooted in my extreme anxiety when people I care about suddenly start acting strangely, unpredictably. I'm terrified of being abandoned by those I love and wind up bracing myself, keeping my guard up all the time, so it won't hurt so much when it inevitably happens. And of course, it goes without saying that this and similar incidents crippled any relationship I might have had with my mother, with whom I rarely speak nowadays. As came up in the last film I discussed here, Sexy Beast: scare tactics may achieve results in the short term — though I hadn't been doing anything wrong before the outburst, I sure as hell didn't disobey for a good while afterwards — but watch out for the long term, because you'll wind up losing big. So anyway, yeah — when she kicks the kid out of the car and he's screaming for her not to leave him, it was a bit much for me to take and I consequently tuned out or at least turned down that aspect of the film so I could concentrate on the rest.

The middle section of the film follows the AI boy's quest, motivated by an overly literal reading of Pinocchio, to get turned into a real boy so his mother will love him and take him back. Roger Ebert insists this is a mistake: "'What responsibility does a human have to a robot that genuinely loves?' the film asks, and the answer is: none. Because the robot does not genuinely love. It genuinely only seems to love." Note that this is Ebert's answer, not the film's. But when someone else advanced the same notion on Usenet — "His 'love' is not real. It is programmed." — someone a bit more clever supplied the retort: "Well so is yours. The only difference is that his is on chip and yours is on gene." The thing is, we know why, say, pain exists — creatures that lack some sort of damage avoidance system (as the film calls the robots' equivalent) tend not to survive long enough to pass the trait on — and, in the film, the Mechas can reproduce the outward expression of pain exactly. How about the inward experience? Why do we meat-based machines actually feel pain, and at what point does the programming of same become an actual feeling? What is consciousness? We don't know. I don't know, and Roger Ebert doesn't know, and you don't know — if you think you do, you're kidding yourself. Ebert also asks, "From a coldly logical point of view, should we think of David [...] as more than a very advanced gigapet?" Here Ebert assumes that if so, we're free to mistreat him or toss him away — which is the exact opposite of the whole point of gigapets, very, very simple programs that people still felt compelled to obsessively tend to. On ifMUD there's a Perl script called "Alex" which acts very vaguely like a parrot; he's probably the most popular "person" on the system.
Sid Meier's Colonization allows you to react to gifts from the smiling natives by attacking and enslaving them; how anyone can actually do this is beyond me, despite the fact that they're just chunks of code with some cartoony art. So when you've got a robot that looks just like a kid and screams, "Don't burn me! Please!", what the hell difference does it make whether it's "really" scared? If you can calmly melt such a creature into slag, I don't want to know you.

Anyway. I didn't much care for the first two of the three acts, the first being a cavalcade of erratic behavior with little insight into the characters to back it up, the second being lots of sound and fury signifying not a hell of a lot. Then comes the third act, set 2000 years after the end of the second, after the oceans have frozen and Earth is inhabited entirely by the descendants of the machines. NOT ALIENS! That people, not just mouth-breathers but perfectly intelligent people as well, could have missed this leaves me boggled. Anyway, I totally dug on the Super-AI. For one, it was extremely nice to see special effects used for something other than horror and combat and explosions. (I especially liked the disassembly of the cube car. Very cool.) For another, to see the Earth inherited by AI not played as a dystopia was very satisfying and true to Kubrick's vision: intelligence is intelligence, and AI like this would likely make us prouder great-grandparents than our biological descendants would. Natural selection chooses for survival skill; the world will become a much better place when it enters a post-Darwinian age where we instead select for what is good. That Spielberg diverges from Kubrick's plans for how the third act goes from there is disappointing, but still, it's nice to see the left turn stay in the movie at all.

Finally, I can't help but notice that the date stamp for this ramble about AI is 10010. Heh.