In The Contrarian Corner sections of this web page I'll be expressing views on topics relating to computers, electronic media, and digital culture generally that don't necessarily match, or are even opposed to, "the prevailing wisdom."
[The term "contrarian" derives from the 17th century, when it meant "... a person who takes a contrary position or attitude; specifically: an investor who buys shares of stock when most others are selling and sells when others are buying...". -- From Merriam-Webster.]
While computer visionaries have touted notions of the smart house, the smart kitchen, the smart office, and the like--the list is long--considerably more attention has been paid to how such things might be made smart than to the consequences for the humans living in the presence of such smart things.
The assumption--usually unspoken, and equally usually untested--is that you will like living with the smart whatever-it-is.
It will automatically do this and that for you. It will "know" you and your inmost wishes, not unlike the proverbial English butler. Surely, the prospect of living with the electronic surrogate of an English butler will be...irresistible.
The essence of the English butler qua metaphor is that he is perfect. He behaves perfectly. Anticipates your needs perfectly. Knows exactly how you like your eggs done, your tea flavored, your clothes laid out.
And, whatever else the English butler is, he is unobtrusive. You do not have to unceasingly correct his actions, or respond to repeated queries like "Is that the right amount of starch, Sir?" "Is your tea sweet enough, Madame?"
Similarly, smart things--in order not to annoy and be more trouble than they are worth--must necessarily share the hallmark trait of the proverbial English butler: that they behave perfectly.
For, if things are not done perfectly, the charm of the butler...or smart thing...evaporates.
That is the crux of the problem. Can the smart this-or-that be perfect? If not, maybe--like a bumbling butler or one who is perpetually "in training"--it's better not to have it...
Consider the smart kitchen.
Some versions imagine events like the refrigerator telling you--by synthesized voice??--that you are out of milk, or that the expiration date on your milk carton has gone by. What then? Are you to run out and buy some more milk, or fresher milk?
Why doesn't the smart refrigerator--if indeed it's so smart--order some milk? Or, better, get the milk (i.e., have it delivered)? Otherwise, it would not be unlike a compiler that knows enough to tell you "missing parenthesis," but not enough simply to take the initiative and insert it.
OK. The fridge orders some milk. But maybe you are going on vacation, and don't want any more milk just now. Or, your fitness-fanatic cousin is about to visit, and, yes, you'll need more milk but it has to be the non-fat type, not the 1% milk you usually get. Or, your doctor has just confirmed that you are lactose-intolerant, and that you need to lay off milk for now.
So, the simple smart fridge concept--e.g., it detects you're out of milk and tells you, or maybe sends out an order for more--is not sufficient. It needs to know and do more.
And that's the problem. It needs to know lots more.
In fact, like the English butler, it has to be perfect.
Otherwise, it will inevitably pepper you with questions: How much non-fat milk per day does your cousin usually consume, Sir? Should I order some 1% for you, Madame, or would you be equally happy with the non-fat?
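To see how quickly the fridge's knowledge requirements balloon, here is a purely hypothetical sketch (no real product is being described; every name and field below is invented for illustration) of the decision logic needed just to reorder milk correctly:

```python
# Hypothetical sketch: the minimum a "smart" fridge must know to reorder milk.
# Every field is a fact the fridge must somehow acquire and keep current.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Household:
    milk_remaining_liters: float
    on_vacation: bool            # no point restocking an empty house
    guest_prefers_nonfat: bool   # the visiting fitness-fanatic cousin
    lactose_intolerant: bool     # the doctor's latest verdict
    usual_milk_type: str = "1%"

def milk_order(h: Household) -> Optional[str]:
    """Return the type of milk to order, or None if no order should be placed."""
    if h.lactose_intolerant or h.on_vacation:
        return None                      # ordering at all would be a mistake
    if h.milk_remaining_liters > 0.5:
        return None                      # not out of milk yet
    return "non-fat" if h.guest_prefers_nonfat else h.usual_milk_type
```

Even this toy version demands four up-to-date facts about the household, and each fact the fridge lacks becomes another butler-style question it must ask.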
It is well to recall the plethora of kitchen-oriented "labor-saving devices" we already have. Blenders, mixers, food processors, on and on. Do they save time and labor? Not really. The time people once spent chopping things by hand they now spend assembling, cleaning, and packing away these gadgets.
A vignette in Newsweek magazine tells of the problems faced by Infoseek founder Steve Kirsch who "...spent four years and $10 million building his high-tech dream house in Silicon Valley's Los Altos Hills.":
Consider the motion sensors in every bathroom and closet: flipping the light switches off produces enough motion to automatically turn them back on. Then there's the homewide stereo system. You can play a CD throughout the whole house, but it turns on all the televisions at the same time. The front door unlocks when it detects motion inside, which means it graciously opens even if a family member sees that it's an intruder ringing the doorbell. And the back door has the opposite problem: it senses someone's presence and promptly locks. Add to those headaches dozens of remotes to control everything from the garage to the window blinds; an autofeeding fish tank that doesn't, and such a complex mess of audio and video controls that Kirsch's wife, Michele, says with a sigh, "Sometimes I wish I could just push a button and turn off the TV."
From: Stone, Brad. "My House is a very Fine House...But." Newsweek, July 5, 1999, p. 55.
Reading one morning's messages recently, I saw a notice regarding a guest to give a talk on "Smart Vehicles," and how such vehicles might be outfitted with "vision-based systems for driver assistance and/or vehicle navigation," in particular:
"...the ability to autonomously follow a leading vehicle while recognizing infrastructure (e.g. traffic lights, traffic signs, lane markings) and other traffic participants (e.g. vehicles and pedestrians)."
It all sounded intriguing, especially the implied advances in vision systems, and I went to the talk. But, throughout, I couldn't shake the feeling that this was yet another instance of something being seen as technically feasible--and hence ought to be done--but without adequate consideration of its potential real-world impact and consequences.
Surely a factor is the sheer intellectual challenge of creating a vision-based vehicle guidance system. It would be a "neat thing" (or whatever is the equivalent current expression) to see if, and to what extent, it could be done.
The problem is--even allowing that such a system could be put together, commercialized and installed in cars--that it leaves unanswered some crucial questions regarding just where the responsibilities of the humans involved begin and end: those of the driver (now passenger..??) of the vehicle, and the designers and vendors of such a system.
Some technologies aimed at driver augmentation are indeed blessings. Surely, few if any of us would want--except at antique car rallies--to attempt to start up an automobile by playing with the adjustment of spark and throttle levers and cranking the engine over by hand. The automatic choke and the self-starter were very welcome ways for technology to take over elements of the driving chore.
Similarly with the automatic transmission. While the manual shifting of the gears undoubtedly gives the more experienced driver greater control of the car in snow or in skids, under ordinary driving conditions most people--myself included--would rather leave gear-shifting to the car.
Power brakes are a boon, as is power steering. They are all ways to augment the driver--amplifiers which reduce the physical burden of guiding a ton or two of metal, plastic, and glass about.
But the foregoing technologies don't impinge directly upon what we may call "driver responsibility." Power brakes, for instance, relieve muscles, but don't replace driver judgment.
The anti-lock brake system (ABS), however, does take significant control away from the driver, and has proved to be a mixed blessing.
The problem with ABS early on was that people didn't necessarily know how to use it.
From the auto engineer's viewpoint, the ABS was a way to help motorists in difficult braking situations, e.g., on ice or snow. However, it turned out that most people, whether skilled or indifferent drivers, tended to operate ABS systems as they did conventional braking systems. That is, they would "pump" the brake pedal when trying to slow down or stop on a snow-covered street.
With ABS, though, you don't do that. Instead, you are supposed to stomp on the brake and hold it down; the ABS system does the "pumping" for you, calculatedly distributing the braking action across the four wheels of the vehicle.
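A toy calculation makes the point about why pumping defeats the ABS. While the system holds the tires near their maximum grip nearly all the time, a driver pumping the pedal applies the brakes only part of the time. The specific numbers below (speeds, decelerations, duty cycles) are assumptions chosen purely for illustration, not measured figures:

```python
# Toy model: stopping distance when braking force is applied only a
# fraction (duty_cycle) of the time. d = v^2 / (2 * a_effective)

def stopping_distance(v0_mps, decel_mps2, duty_cycle):
    """Distance (m) to stop from v0_mps with deceleration applied duty_cycle of the time."""
    effective_decel = decel_mps2 * duty_cycle
    return v0_mps ** 2 / (2 * effective_decel)

v0 = 27.0    # roughly 60 mph, in m/s
peak = 4.0   # assumed achievable deceleration on packed snow, m/s^2

abs_stop  = stopping_distance(v0, peak, 0.9)  # ABS holds near peak grip ~90% of the time
pump_stop = stopping_distance(v0, peak, 0.5)  # a pumping driver brakes only ~half the time

print(abs_stop, pump_stop)  # about 101 m versus about 182 m
```

Crude as it is, the model shows why the engineers want your foot planted on the pedal: the machine can modulate the brakes far faster and more steadily than a human leg can.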
People were also alarmed--should they in fact stomp and hold the pedal down--by the violent "chattering" of the brake pedal underfoot as the ABS mechanism comes into play. (In my own Jeep Cherokee, the underfoot sensation and accompanying sound are those of the brake mechanism falling apart..!!)
The antidote to all of this was a spate of educational campaigns on TV and in the popular press aimed at clueing in the assumedly none-too-bright motorist on how to apply the pedal properly when using the ABS. Notably, no consideration was given to the possibility of adding a switch in the auto to toggle the ABS in or out, depending upon whether the driver wanted ABS to come into play or felt competent to do their own braking. Rather, the auto engineers and regulatory types took the view--not dissimilar to the way certain systems programmers view the "user"--that people are too hopelessly inept for their own good and in need of intervention by wiser souls...like themselves.
In-car "electronic maps" or on-board navigation systems are currently much publicized, coming variously in the form of visual displays and/or spoken directions. They can be helpful, except when they are not--that is, when they malfunction and get you to the wrong place.
But beyond mere inconvenience from malfunction, there's legal liability, an issue that's barely mentioned in discussions of such systems.
Suppose you are taking someone to the hospital in a medical emergency. You are in unfamiliar territory, and are using your in-car navigation system. It misdirects you, and you arrive too late either to prevent death or severe medical complications. What or who is responsible?
It's my guess that the days of in-car navigation systems--at least those whose makers claim to guide you to specific locations (such as a hospital)--are numbered. If we haven't already had the big liability case enter a court, sooner or later, in our litigious land, we will.
That we have had paper maps for decades, and the publishers of such are not sued because some user of them could not get to some vital place in time, will not matter. Particularly if the in-car electronic guidance system is claimed to be "automatically updating" or some such. The "smarter" some system is claimed to be, or seems to be when it is functioning well, the more the user will--unreasonably or not--expect of it. The in-car map should know.
I recall at the Media Lab a visit some years back from representatives of a German auto company. They talked enthusiastically of the automatic auto guidance system they were planning.
The system would allow the human driver of the vehicle to turn the car's navigation and control over to a computer-based, in-car system. The instrumentation of the car would be coordinated with electronic implants embedded in the highway to guide the car in the midst of traffic.
Toward the end of our conversation, I raised the issue of responsibility and legal liability should a mishap occur. I noted that in the US, with its battalions of lawyers, lawsuits would inevitably arise, the effect of which would defeat the installation of such a system.
Their reply was to the effect that the US was relatively unsophisticated when it came to such matters, not quite as socially/legally advanced as certain European countries, and advances such as they envisaged would likely appear in Europe first, and perhaps the US might someday catch up.
A Boston Globe editorial observed the potential impact of the automated backseat driver coming up from behind and "taking the wheel":
The car is getting smarter than the driver--an unsettling trend despite the good intentions of safety engineers. Nissan is experimenting with a vehicle that not only warns motorists who are about to do something stupid but overrides them if they don't heed the warning.
This back-seat driver in the dashboard can automatically hit the brakes if the car is about to run into something. Also, the sensors outside the car--and in the steering wheel, the seat, the gas pedal, and the brakes--allow a computer to track a driver's every twitch, anticipating a lane change too close to an oncoming bus or the need for more gas as the car enters a highway...
...Japan [is] marketing adaptive cruise control, a system that automatically adjusts a car's speed to maintain a safe distance between it and the car ahead.
Also, sophisticated navigational aids --global positioning systems--are standard in Japan and becoming so in Europe...
US government and auto industry representatives are discussing the life-saving potential and marketability of such innovations as infrared windshield sensors for improved night vision, drowsiness detectors that send a blast of peppermint-scented cold air into a driver's face, and devices that sound an alarm if there is someone in a driver's blindspot or if the vehicle is inching too close to the edge of the road.
All ingenious ideas, but do consumers want to pay for them?...
"Auto suggestion." Editorial, The Boston Globe, Monday, June 21, 1999, p. A22.
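The adaptive cruise control the editorial mentions can be caricatured as a simple feedback loop. The sketch below is my own hypothetical simplification, not any manufacturer's algorithm; the headway and gain values are invented for illustration. What it makes plain is that the "safe distance" is a designer's judgment, baked into the constants before the driver ever touches the pedal:

```python
# Hypothetical caricature of adaptive cruise control: command an acceleration
# proportional to the error between the actual gap and a fixed time-gap target.

def acc_accel_command(own_speed_mps, gap_m, headway_s=2.0, k=0.3):
    """Commanded acceleration (m/s^2); positive closes the gap, negative brakes."""
    desired_gap = own_speed_mps * headway_s   # the designer's idea of "safe distance"
    return k * (gap_m - desired_gap)

# At 27 m/s (~60 mph) the system insists on a 54 m gap. Trail a car at 40 m
# and it brakes on your behalf, whatever you may think of the matter:
print(acc_accel_command(27.0, 40.0))  # negative: the car slows itself
```

Two numbers, `headway_s` and `k`, thus embody a judgment about following distance that used to belong entirely to the driver.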
Shortly after the editorial appeared, a Globe reader wrote in an interesting rebuttal about how a good, but not perfect, automated auto might cause more harm than good:
A child dashes out in front of a car. The driver instinctively swerves. The "smart" car, sensing it is about [to leave] the road, overrides the driver's decision. The car avoids a survivable collision with a utility pole and kills the child...Software algorithms are no match for human judgment.
The buying public rejected "smart" seat belts and talking microwave ovens. They will also reject I-know-better-than-you engineering in automobiles. If they don't, hopefully the auto maker's liability lawyers will wake up in time.
[Goldstein, Andrew C. "'Smart' cars could be a dumb idea." Letters to the editor, The Boston Globe, Thursday, June 24, 1999, p. A24.]
Apt comments all. Especially the one about the I-know-better-than-you mindset.
It's revelatory to reflect on the outlook and mindset of some of the people who are advocates of, and enthusiasts for, automated highways.
A 7.6-mile portion of a carpool lane on Interstate 15 near San Diego, California, has been used to experiment with 10 Buick LeSabres traveling in convoy fashion about 12 feet apart at 65 miles per hour. Involved in the demonstration system are the Buick Division of General Motors, the US Department of Transportation, the California Transportation Department, the University of California at Berkeley, and representatives of more than 100 members of something known as the National Automated Highway System Consortium (NAHSC).
From: "'Smart' highway travel poses what-if's." The Boston Sunday Globe, July 13, 1997, p. A12.
Of special interest is the outlook of the project's program manager, a former NASA engineer and researcher. "The need is there," he is quoted as saying. The text of the article explains: That's the need to unclog metropolitan highways, reduce traffic fatalities, and ease fuel demands.
Then, we have the classic argument from inevitability: "The technology is coming," the program manager is quoted as saying. The article enlarges:
That is, the technology of cruise control that adjusts a car's pace and position in relation to other hard objects; of navigation by the Global Positioning System, which talks to satellites; of the collision avoidance systems which warn Air Force One of closing traffic; and of super-IQ computers that recently out-pondered a world chess master...
Yes, indeed, all that is coming. Period. Better accept it...
This section of the article closes with another comment from the irrepressible program manager: "That makes AHS [the Automated Highway System] just another step in improving the condition of mankind."
There may be other concerns, though. The article notes some rising opposition to the AHS, including concern from civil libertarians that highway automation is one step from federal identification and location of individuals, and thus an invasion of privacy. There is also a fear of malicious hackers invading the system and causing vehicular mayhem. Lastly, there is some concern that the on-board electronics required to use the AHS--should one wish to do so--may be an option that many low-income persons could not afford.
I would not want to go up in a plane flown by a pilot who is "pretty good," or undergo brain surgery by someone who is "pretty good." I want them to be perfect. If they are not, then I want to wait until the cockpit or surgical gloves are filled by someone who is.
So, am I really saying that any "smart" system having to do with non-trivial activities, such as driving a car or giving navigational directions has to be perfect? That anything short of perfection is unacceptable? That seems unreasonable--the perfect as the enemy of the good. Can't a "smart" system be pretty good, or almost perfect?
No, it can't. It has to be perfect.
Computer systems, for instance, which presume to take control of your car away from you, to navigate you about, have to be perfect. Otherwise, they are no good to you...
The morning I wrote this sentence I drove in to MIT from my home in Arlington, a distance of about 8.3 miles via the route I took. My driving had to be perfect, or I would have spent a good deal of the morning filling out accident reports, visiting emergency rooms, or worse.
By being "perfect" in this context, I don't mean that I took every turn with auto-driving textbook finesse. It has to do, rather, with the things I didn't do, the things that didn't happen.
That is, I did not crash into any cars, sideswipe any cyclists, squash any squirrels, collide with any dogs. I didn't run any red lights, flatten any pedestrians, run my wheels into any street repair excavation.
In that sense--like the doctor's injunction to "...first, do no harm"--I, as operator of my Jeep Cherokee, was perfect. (Knock wood...).
Until the automatic driver can do that, and do it every day--day in and day out, week in and week out, year in and year out--it's not good enough.
Similarly, until the smart house can run itself like the English butler would, it won't be good enough. It will be an interesting idea, but one with not many takers...