Saturday, January 20, 2007

If Your Software Won't Let me Lie (Pt II): Lying to Your Parents

Wouldn’t it be great if you could know exactly where your kids are all the time? All day, every day? Wouldn’t it be great to be sure they’re not ever getting up to anything you wouldn’t do, and they’re never in any danger… ever?

Really?

Here is an article about how parents in Japan will soon be able to monitor their kids’ whereabouts by tracking them with the GPS (Global Positioning System) device on their cellphones.

The Customer is Not Always Right

There is often a chasm of difference between what the customer thinks he wants, and what he actually wants. It’s the designer’s job to trace the customer’s expression of his needs (“I want the software to do these 25 things and be colored blue”) back to the premises, or root needs, that underlie them (“I want the software to be profitable, keep a 75% returning customer base, and look professional”).

At face value, it probably sounds like an excellent idea to most parents to be able to track the exact location of their kids 24 hours a day; they might, for example, imagine thwarting a kidnapping or a burgeoning drug addiction. Imagine a world, though, where this technology really worked and was adopted widely and used constantly. When you were a kid, did you ever sneak out? Lie about where you were spending the night? Did you go on a road-trip adventure that your parents never knew about? A disreputable party? What if you hadn’t been able to do any of those things, ever? What if you had spent your entire teenage years never once able to lie to your parents about your whereabouts? What if you yourself had to live with an entire generation of people who had never been allowed to break the rules?

That’s an exaggeration, of course. I'm just illustrating that individuation (the process by which children break from the mold of their parents’ social conditioning and experiment their way towards developing a unique self) is contingent in no small measure upon screwing up (and madly brainstorming your way out of it), breaking rules, and lying. Any software that seriously keeps kids (or anyone else) from doing these things will damage their ability to become full people who make meaningful and interesting contributions to society.


Fortunately, We’re All Brilliant Liars

The good news for kids, with respect to control technologies, is that any assortment of kids will always be smarter, quicker, and more resourceful than their parents; and they will always have access to more cutting-edge technology.

Likewise, the general human masses (kids or adults) will always find clever ways around any roadblocks that official technology produces… usually within weeks of that technology’s general adoption. Any new control technology (e.g. Digital Rights Management [DRM]) takes about one to three years to move from inception to market. It takes, on the other hand, two to four weeks for a distributed team of 500 of the world’s bored hackers to come up with a workaround, distribute it on the net, and break the control. Is the sharing protection (DRM) on your copyrighted iTunes tracks bugging you? Go online and download one of a dozen free, third-party programs to strip the protection from them. There are so many iTunes hacks out there, they’re competing with one another for the sleekest user interface. I'm not saying this is a good thing or a bad thing; it just is. Distributed groups of hackers are smarter, and exponentially faster, than companies or government organizations that move to market through formalized processes.

So if we can crack DRM within a month, kids will have no trouble whatsoever getting around more intrusive control technologies. They're better with technology, and their motivation to override anything that seriously restricts their freedom is greater than our idle need to share our copyrighted iTunes tracks. This is easy to illustrate. I recently read a story about how some middle and high schools are trying to staunch the overwhelming tide of collective technology by instituting “no cellphones in school” rules; kids are already thinking of innovative ways around them. I'll also link again to the story about Spanish high school kids hacking each other's cellphones to gather blackmail material on other students. Over Thanksgiving I spent a long plane ride chatting with a fairly average, hip young middle school kid who could out-talk me on processor configuration and out-code me in Visual Basic. Think that kid is going to put up with his parents (who are Luddites by comparison) tracking him with the GPS device on his cellphone?

The likely scenario is that kids will hack together a half dozen little apps you can load onto your cellphone to make your GPS broadcast coordinates of your own choosing… making it twice as easy to lie to your parents about where you are as it was before you got the GPS phone in the first place.

So, to recap: people have to be able to lie to each other, at least sometimes and under some circumstances. If you produce software that doesn’t let people lie, they’ll either not use it (e.g. presence, where we block anyone on IM whom we don’t want to have full access to our daily rhythms), or they’ll hack it to pieces until the data loses all validity. In other words, ask for too much control and you get none; you get a wobbly system, overcompensating for the original overcompensation.


How Much Information is too Much?

Let’s break personal status information down into three categories that might be projected by social software:

  1. Time Grain: frequency with which status information is updated

  2. Detail: specificity of information (“at school” vs. “in the janitor’s closet with the young, attractive French teacher”)

  3. Level of Aggregation: high-aggregation information reflects the status of a large group of people (“Flight 714 is over Albuquerque, and my mother is on that flight”); low-aggregation information refers to three or fewer individuals (“My mother is in the gift shop in the C wing of the Houston airport.”)

To create status-broadcasting tools that won’t cause overcompensation, you can emphasize at most two of those variables at once; the more specific you get with any two, the less specific you must be about the third. Some examples (with a rough sketch of the rule after the list):

  • The Weasleys’ clock in the Harry Potter series, which had a hand for each family member, pointing to one of several wide, vague categories: home, school, work, in mortal peril, etc. (Note: a few years back, Microsoft Research created an actual working version of this clock.) (+Aggregation [by individual], +Time Grain [updated instantly], -Detail [five broad categories])

  • A technology that uses GPS to track the location of school buses (+Detail [exact location], +Time Grain [updated instantly], -Aggregation [tracks a formal group])

  • Dodgeball (see previous blog entry). (+Aggregation [individual], +Detail [exact location], -Time Grain [user rarely broadcasts information, and at her own discretion])
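
To make the rule concrete, here is a minimal sketch, in Python, of how you might model the three variables and flag a feature that tries to max out all of them at once. The class, the field names, and the simple “more than two emphasized” check are just my own shorthand for the rule of thumb above, not anybody's shipping code:

    from dataclasses import dataclass

    @dataclass
    class StatusFeature:
        # Each flag marks a variable as "emphasized" in the personal/intrusive direction:
        #   fine_time_grain  -> status updates constantly (e.g. live GPS)
        #   high_detail      -> status is very specific (exact location, not just "at school")
        #   individual_level -> status describes one person rather than a large group
        name: str
        fine_time_grain: bool
        high_detail: bool
        individual_level: bool

        def risks_backlash(self) -> bool:
            # The rule of thumb from above: emphasizing all three at once
            # is what invites the hack-it-to-pieces overcompensation.
            return sum([self.fine_time_grain, self.high_detail, self.individual_level]) > 2

    # The examples from the list above, restated:
    examples = [
        StatusFeature("Weasley clock", fine_time_grain=True, high_detail=False, individual_level=True),
        StatusFeature("School-bus GPS", fine_time_grain=True, high_detail=True, individual_level=False),
        StatusFeature("Dodgeball", fine_time_grain=False, high_detail=True, individual_level=True),
        StatusFeature("24/7 kid GPS tracker", fine_time_grain=True, high_detail=True, individual_level=True),
    ]
    for f in examples:
        print(f.name, "->", "backlash likely" if f.risks_backlash() else "probably tolerable")

Only the last example trips the check, which is the point: the round-the-clock kid tracker is the one design that dials all three variables up at once.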

The bigger or more formalized the group you’re paying attention to, the more constantly and specifically you can broadcast information about it without major sociological backlash. The more individual or personal the information, though, the more vital it becomes to allow people to obscure the truth about themselves: to whom they choose, and when they choose.

If parents demand GPS tracking for their kids, companies will produce the functionality and it will sell. It just won't last or work very well; it will send the social system into a couple of wild swings, and then die out. Digital solutions that are actually going to last and weave themselves into the fibre of social life have to fit our pre-existing social patterns. As before: the nature of our relationships has to define the interface, not the other way around.
