Saturday, January 20, 2007

If Your Software Won't Let me Lie (Pt II): Lying to Your Parents

Wouldn’t it be great if you could know exactly where your kids are all the time? All day, every day? Wouldn’t it be great to be sure they’re not ever getting up to anything you wouldn’t do, and they’re never in any danger… ever?

Really?

Here is an article about how parents in Japan will soon be able to monitor their kids’ whereabouts by tracking them with the GPS (Global Positioning System) device on their cellphones.

The Customer is Not Always Right

There is often a chasm of difference between what the customer thinks he wants, and what he actually wants. It’s the designer’s job to trace the customer’s expression of his needs (“I want the software to do these 25 things and be colored blue”) back to the premises, or root needs, that underlie them (“I want the software to be profitable, keep a 75% returning customer base, and look professional”).

At face value, it probably sounds like an excellent idea to most parents to be able to track the exact location of their kids 24 hours a day; they might, for example, imagine thwarting a kidnapping or a burgeoning drug addiction. Imagine a world, though, where this technology really worked and was adopted widely and used constantly. When you were a kid, did you ever sneak out? Lie about where you were spending the night? Did you go on a road-trip adventure that your parents never knew about? A disreputable party? What if you hadn’t been able to do any of those things, ever? What if you had spent your entire teenage years never once able to lie to your parents about your whereabouts? What if you yourself had to live with an entire generation of people who had never been allowed to break the rules?

That’s an exaggeration, of course. I'm just illustrating that individuation (the process by which children break from the mold of their parents’ social conditioning and experiment their way towards developing a unique self) is contingent in no small measure upon screwing up (and madly brainstorming your way out of it), breaking rules, and lying. Any software that seriously impedes kids (or anyone else) from doing these things will damage their ability to become full people who make meaningful and interesting contributions to society.


Fortunately, We’re All Brilliant Liars

The good news for kids, with respect to control technologies, is that any assortment of kids will always be smarter, quicker, and more resourceful than their parents; and they will always have access to more cutting-edge technology.

Likewise, the general human masses (kids or adults) will always find clever ways around any roadblocks that official technology produces… usually within weeks of that technology's general adoption. Any new control technology (e.g. Digital Rights Management [DRM]) takes about one to three years to move from inception to market. It takes, on the other hand, two to four weeks for a distributed team of 500 of the world's bored hackers to come up with a workaround, distribute it on the net, and break the control. Is the sharing protection (DRM) on your copyrighted iTunes tracks bugging you? Go online and download one of a dozen free, third-party pieces of software to strip the protection from them. There are so many iTunes hacks out there, they're in competition with one another for sleekest user interface. I'm not saying this is a good thing or a bad thing; it just is. Distributed groups of hackers are smarter, and exponentially faster, than companies or government organizations that move to market through formalized processes.

So if we can crack DRM within a month, kids will have no trouble whatsoever getting around more intrusive control technologies. They're better with technology, and their motivation to override anything that seriously restricts their freedom is greater than our idle urge to share our copyrighted iTunes tracks. This is easy to illustrate. I recently read a story about how some middle and high schools are trying to stem the overwhelming tide of collective technology by instituting "no cellphones in school" rules; kids are already thinking of innovative ways around them. Also, I'll link again to the story about Spanish high school kids hacking each other's cellphones to gather blackmail material on other students. Over Thanksgiving I spent a long plane ride chatting with a fairly average, hip young middle school kid who could out-talk me on processor configurations and out-code me in Visual Basic. Think that kid is going to put up with his parents (Luddites by comparison) tracking him with the GPS device on his cellphone?

The likely scenario is that kids will hack together a half dozen little apps you can load onto your cellphone to make your GPS broadcast coordinates of your own choosing… making it twice as easy to lie to your parents about where you are as it was before you got the GPS phone in the first place.
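
To make that concrete: here's a toy sketch (in Python, with entirely hypothetical names; no real phone works quite like this) of why the parent's tracking console can't trust anything it receives. The reporting layer is just software, and whoever holds the handset controls the report.

```python
# Toy illustration: a phone-side position reporter that can be trivially
# overridden. All names here are hypothetical; this is not any real phone API.

from dataclasses import dataclass

@dataclass
class Fix:
    lat: float
    lon: float

class StubGps:
    """Stands in for the real GPS hardware."""
    def read(self) -> Fix:
        return Fix(40.7580, -73.9855)  # actually at the arcade

class PositionReporter:
    def __init__(self, gps):
        self.gps = gps        # the real GPS chip
        self.override = None  # coordinates of the user's own choosing

    def spoof(self, lat: float, lon: float) -> None:
        """Broadcast these coordinates instead of the real ones."""
        self.override = Fix(lat, lon)

    def report(self) -> Fix:
        # The parent's console downstream sees only this return value;
        # it has no way to tell a real fix from an override.
        return self.override or self.gps.read()

phone = PositionReporter(StubGps())
phone.spoof(40.7736, -73.9566)   # "studying at the library"
print(phone.report())            # Fix(lat=40.7736, lon=-73.9566)
```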

So, to recap: people have to be able to lie to each other, at least sometimes and under some circumstances. If you produce software that doesn't let people lie, they'll either not use it (e.g. presence, where we block from IM anyone we don't want having full access to our daily rhythms), or they'll hack it to pieces until the validity of the data is ruined. In other words, ask for too much control and you get none: a wobbly system, overcompensating for the original overcompensation.


How Much Information is too Much?

Let’s break personal status information down into three categories that might be projected by social software:

  1. Time Grain: frequency with which status information is updated

  2. Detail: specificity of information ("at school" vs. "in the janitor's closet with the young attractive French teacher")

  3. Level of Aggregation: high-aggregation status information reflects the status of a large group of people ("Flight 714 is over Albuquerque," and my mother is on that flight); low-aggregation information refers to three or fewer individuals ("My mother is in the gift shop in the C wing of the Houston airport.")

To create status-broadcasting tools that won't cause overcompensation, you can only emphasize two of those variables at once. The more specific you get with any two, the less specific you must be about the third. Some examples (with a toy sketch of the rule in code after the list):

  • The Weasleys' clock in the Harry Potter series, which had a hand for each family member, pointing to one of several wide, vague categories: home, school, work, in mortal peril, etc. (Note: a few years back, Microsoft Research created an actual working manifestation of this clock.) (+Aggregation [by individual], +Time Grain [updated instantly], -Detail [five broad categories])

  • A technology that uses GPS to track the location of school buses (+Detail [exact location], +Time Grain [updated instantly], -Aggregation [tracks a formal group])

  • Dodgeball (see previous blog entry). (+Aggregation [individual], +Detail [exact location], -Time Grain [user rarely broadcasts information, and at her own discretion])
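
If you like your design rules in code form, here's a toy model of the two-of-three rule. The axis names come from the list above; the data structure and the check are my own illustration, not anything from an actual product:

```python
# Illustrative model of the "pick two of three" rule for status broadcasting.
# Each flag is True when that axis is pushed toward maximum intrusiveness.

from dataclasses import dataclass

@dataclass
class StatusDesign:
    name: str
    time_grain: bool   # True = updated constantly / instantly
    detail: bool       # True = highly specific ("in the janitor's closet")
    individual: bool   # True = tracks identifiable individuals, not a crowd

    def likely_backlash(self) -> bool:
        # The rule of thumb from the post: push all three axes at once
        # and people will route around (or corrupt) the system.
        return self.time_grain and self.detail and self.individual

designs = [
    StatusDesign("Weasley clock", time_grain=True, detail=False, individual=True),
    StatusDesign("School-bus GPS", time_grain=True, detail=True, individual=False),
    StatusDesign("Dodgeball", time_grain=False, detail=True, individual=True),
    StatusDesign("Kid-tracking GPS phone", time_grain=True, detail=True, individual=True),
]

for d in designs:
    print(f"{d.name}: {'backlash likely' if d.likely_backlash() else 'sustainable'}")
```

Run it and the kid-tracking phone is the only design that trips the rule, which is the whole point of the paragraph that follows.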

The bigger or more formalized the group you're paying attention to, the more constantly and specifically you can broadcast information about it without major sociological backlash. The more individual or personal the information, though, the more vital it becomes to allow people to obscure the truth about themselves—to whom they choose, and when they choose.

If parents demand GPS tracking for their kids, companies will produce the functionality, and it will sell. It just won't last or work very well; it will send the social system into a couple of wild swings, and then die out. Digital solutions that are actually going to last and weave themselves into the fiber of social life have to fit our preexisting social patterns. As before: the nature of our relationships has to define the interface, not the other way around.

Tuesday, January 9, 2007

If Your Software Won't Let me Lie, I'm Not Going to Use It (Pt I)

  • Presence Information: Typically, presence data takes the form of a little icon next to your name in IM and some email programs/web services. It tells other users when you are online, offline, away, or busy.
  • Social Penetration Theory: A fairly simplistic model of intimacy and how it evolves. A person graduates their level of intimacy with another person (which is irritatingly, but accurately, summarized by the catch-phrase "Into-Me-See") like peeling an onion: by adjusting the type, frequency, and privacy level of the information they share with him/her.
Argh, finally. This new piece of mobile software, "The Swarm" (still in development), proposes a presence model that starts to jibe with how people negotiate social boundaries per person, and over time (see: Social Penetration Theory).

The thing is, we define our relationships on a daily basis by the degree, consistency, and nature of the information we share with one another. We make distinctions (sketched in code after the list):
  • by time grain: I tell my employer (in July) that my cat was hit by a car (in June). I tell my mother on Thursday that it happened on Monday. I call my sweetie the minute it happens.
  • by detail: I tell my employer I had a nice time in Cancun. I tell my mother about the lovely hotel and the awful prices. I tell my sweetie, at comedic length, about trying not to gawk at the giant, disfiguring mole on the nose of the maître d'.
  • by sensitivity: I tell my employer my meeting with the new client was "awkward." I tell my mother that the client turned out to be an "old friend" from "those shady years." I tell my sweetie that the client was "that guy who was nice enough not to press charges."
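
If you wanted to model those distinctions in software, it might look something like this toy sketch. The relationships, field names, and values are all invented for illustration:

```python
# Hypothetical sketch: a disclosure policy per relationship, along the three
# axes above. Purely illustrative; nothing here comes from a real product.

from dataclasses import dataclass

@dataclass
class DisclosurePolicy:
    delay_days: int   # time grain: how stale the news is allowed to be
    detail: str       # "headline", "story", or "comedic length"
    sensitivity: str  # how much of the awkward truth gets through

policies = {
    "employer": DisclosurePolicy(delay_days=30, detail="headline", sensitivity="sanitized"),
    "mother":   DisclosurePolicy(delay_days=3,  detail="story", sensitivity="euphemized"),
    "sweetie":  DisclosurePolicy(delay_days=0,  detail="comedic length", sensitivity="unfiltered"),
}

def share(event: str, relationship: str) -> str:
    p = policies[relationship]
    return (f"Tell {relationship} '{event}' after {p.delay_days} day(s), "
            f"at {p.detail} detail ({p.sensitivity}).")

print(share("cat hit by a car", "employer"))
print(share("cat hit by a car", "sweetie"))
```
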
Presence information is just that: information about what you're up to, right as it happens. As such, it is a very intimate handshake to be making with someone you've just met at a party who casually invites you "to IM sometime." Combine this with the growing ubiquity of presence data in mobile/social digital tools, and you have a problem.

I just finished a gig doing usability research & surveys on a presence-based communications product. Some presence models are much more granular than "here, gone, away, busy": some interfaces pull detail about what you're up to out of other programs (Outlook, for example); some let you write your own status messages.

They also allow for a tiny bit of individual regulation. When you add someone to your IM contact list, they see your presence information; if you don't want them to see it anymore, you block them or remove them from your list. This leaves you with binary options per individual: you give someone all of your information, or none of it. That's polar and awkward. Do I really want my old workmates seeing the status messages I write into Google Chat to make my friends laugh? ("Luke, I am your Puff Daddy." – Puff Darth) Do I want the guy I met at the conference to see what time I get home at night (status icon changes from away to online)? Would I rather block them from chatting with me entirely? How does sharing, or explicitly blocking, this information change my relationships with these people?

The point is, we human beings like to, er, temper reality to degrees when we allow individuals into our personal information space. In other words, we like to lie. Lying, or at least withholding plenty of things as we see fit, is an essential part of social functioning. In a sustainable presence model, the nature of the user's relationships dictates these intimacy-defining information exchanges, not vice versa.

The presence model in The Swarm is (from what I can see in the brief article) one cognitive step closer to aligning presence with relationship onion-layers. It allows you to customize your status information per individual. Your boss sees you're home sick; your mom sees you're taking the afternoon off; your friends see you're playing "Brazilian Nurses' MudWrestling Deathmatch IV."
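
In code terms, the difference between the old binary model and this one is the difference between a block list and a per-contact map. Here's a rough sketch of the idea (my own illustration; I have no idea how The Swarm actually implements it):

```python
# Rough sketch of per-contact presence, in the spirit of the onion-layer
# model. My own illustration, not The Swarm's actual design. "Contacts"
# here can be individuals or group labels like "friends".

class Presence:
    def __init__(self, default_status: str = "online"):
        self.default = default_status
        self.per_contact = {}  # contact -> the status string that contact sees
        self.blocked = set()   # the old binary model: all-or-nothing

    def set_status_for(self, contact: str, status: str) -> None:
        """Onion layers: each relationship gets its own version of the truth."""
        self.per_contact[contact] = status

    def status_seen_by(self, contact: str):
        if contact in self.blocked:
            return None  # binary model: they get nothing at all
        return self.per_contact.get(contact, self.default)

me = Presence(default_status="busy")
me.set_status_for("boss", "home sick")
me.set_status_for("mom", "taking the afternoon off")
me.set_status_for("friends", "playing Brazilian Nurses' MudWrestling Deathmatch IV")

print(me.status_seen_by("boss"))      # home sick
print(me.status_seen_by("stranger"))  # busy (the default layer)
```
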
Stay tuned for: If Your Software Won't Let me Lie (Pt II): Lying to Your Parents