Monday, May 9, 2011

Just Because You Can . . .

As reported in Ars Technica, a Wyoming couple just sued the Aarons rent-to-own chain for allegedly placing a hardware device in their computer that surreptitiously took screen shots and webcam shots and sent them back to Aarons.  According to the complaint, the couple found out after Aarons mistakenly sent an employee to repossess their computer (they had paid it off, but Aarons’ system missed that) and he showed them a picture of one of them using the computer.  When they asked him where he got the picture, he allegedly told them that he wasn’t supposed to show it to them.  Allegedly, Aarons decided to use the hardware device in question specifically because it would be very hard for people to detect or do anything about.

I’ll briefly address some lawyer points at the end, but let’s step back a second and think.  Aarons was trying to do something completely legitimate:  Use the computer to help it recover the machine if it wasn’t paid.  I don’t think anyone would fault the company if it had put a hardware kill switch or a GPS tracker in the machine and expressly told the renters it was doing so.  Most particularly, a renter would be hard pressed at the point of sale to object to those measures (at least so long as Aarons promised to disable its access or remove the devices on full payment).  Moreover, if Aarons took those measures, it would be in its interest to emphasize them to renters for deterrent effect (yes, I know that people could try to circumvent them if they knew, but few would try, and a good device could make it very hard for anyone who did, reducing circumvention to a minor annoyance).

Thoughts on Aarons’ Strategy

So what happened?  What follows is conjecture, but my best guess is that, at some point, a salesman convinced someone at Aarons that it would be really “cool” to be able to watch over its property in real time.  There’s a romance to doing surreptitious and sneaky things.  If your job consists of running property recovery for Aarons, there’s rarely any romance and you’re probably viscerally angry at deadbeats and their endless stream of lies and excuses.  So it probably felt great to be able to take “high tech” measures to finally stick it to those jerks and catch them in their lies.

The funny thing is that Aarons seems to have already known the whole thing was a bad idea before the incident became public.  First, it went to the trouble of using a mechanism that couldn’t be detected easily.  Second, the guy who went out to recover the computer was allegedly told not to mention the picture.  In other words, if Aarons had just thought the process through to the end, the impracticality probably would have been obvious.  It might be fun to imagine giving the lie to the deadbeats by showing them photographic evidence of themselves using the computers, but Aarons would rarely, if ever, really want to do that.

So the first lesson here is to think measures like this (time bombs, kill switches, “phone home” devices and code, key loggers, watermarks, etc.) all the way through, past the fun revenge fantasy and into day-to-day use.  Day-to-day use should be practical, boring and cost-effective.  If it isn’t, ignore the salesman and consider something less “cool.”

Thoughts on Disclosing

This leads to a more general topic, which was also raised by a recent episode of South Park:  When should you hide what you’re doing from customers/end users, when should you “bury” it in terms and conditions, and when should you emphasize it in a way that people will notice?  All joking aside (and the episode was very funny), this is a subject that a lot of businesses and their lawyers don’t appear to think through strategically.  As the Aarons example (allegedly) indicates, people sometimes bury the lead for no good reason.

Here are my thoughts:

Buried Disclosure:

To start, let’s all recall that “unfair and deceptive practices” are illegal under U.S. federal law and most state laws.  This needs to be at the top of your mind as you consider concealing something from customers/end-users that they would care about.

That said, burying a disclosure is usually not a strategy of deception.  As a practical matter, you have to “bury” just about everything you disclose for the simple reason that people have very limited attention spans.  If you try to get them to pay attention to everything, you’re fooling yourself:  You’ll exceed attention spans and they’ll pay attention to nothing.  So buried disclosure is the appropriate strategy for everything that doesn’t need to be kept truly secret and isn’t worth singling out for emphasis.

In thinking about this kind of disclosure, bear in mind that just about no one reads it up front.  So there’s no point in worrying that anything you say will scare people off.  Under normal circumstances, the number of people who read your disclosure will be minuscule, and you’re probably better off if those people go away.  Normally, the only reason you’re disclosing at all is to have something to point to later if anyone complains.  So forget about the decision process of the person who clicks on “accept.”  Write your disclosure with the endgame in mind.  That means that, where possible, it’s usually in your interest to be as clear and detailed as you can about what you’re really doing.

There is a caveat to the strategy of clear upfront disclosure.  Although almost no one will read your disclosure, you should consider the possibility that one of the people who does will publicize it.  So think about how it would play in the public eye.  If you’re just warning people about the obvious (e.g., this mapping app will access your location data), that’s not a concern.  But if you’re doing something that people might get upset about, honest but vague disclosure might be better (e.g., “we may share your information with companies with which we do business to enhance your experience and for other business purposes”).  Consumer advocates, reporters, bloggers, elected officials and regulators are always on the prowl for fresh outrages.  If you describe in detail something that other people are doing but burying in vague language, you’re volunteering yourself as the whipping boy.  In assessing the risk, picture the reaction of an enthusiastic, idealistic and well-meaning intern who hasn’t yet graduated college and has only a rudimentary understanding of your business (or any business, for that matter).

Emphasis:

As noted, most disclosure is automatically buried because it exceeds people’s willingness to pay attention.  That means that you should think about whether there’s anything you really want people to understand when they consent.  It also means that there’s a limit to how many things you can really emphasize.

The first priority is anything you expect to incentivize or deter people.  If they don’t know about it, it won’t affect their behavior the way you want.  Going back to Aarons, it probably should have embedded a kill switch in its rent-to-own machines and then warned customers about it in a way that would make an impression.

Beyond deterrents and incentives, you should save emphasis for consents that face special legal standards under general law (such as, in some jurisdictions, arbitration) or any special regulations that apply to you.

Secrecy:

To repeat, if you’re thinking of hiding things, you need to keep in mind that unfair and deceptive practices are illegal.  We’re not talking about concealing your algorithms, source code and other trade secrets.  That’s necessary, and customers/end-users almost never care about it.  We’re talking about gathering and using information in ways people wouldn’t expect, or adding features (as Aarons allegedly did) that work to your advantage but against the real or perceived interests of your customers/end-users.  If you’re concealing something because people would be upset if they found out, you’re usually taking the wrong approach.  They probably will find out eventually, at which point you’ll discover that your strategy amounted to carefully aiming a loaded gun at your own foot with the safety off and then hoping that nothing would ever set it off.

That’s not to say that secrecy is never a good strategy.  To be frank, however, I’m struggling to come up with meaningful examples.  I suppose you might not want to mention a purely internal “blue sky” analytics project that you haven’t decided to pursue, if it’s clearly understood that it will be disclosed (and all the old data discarded) if you ever decide to roll it out.  If you’re going to do that, by the way, don’t call the project something creepy.  There are probably other situations where concealment is advisable (comments welcome), but they are the exception.  Most of the time, there’s just no benefit to sneakiness.

Boring Lawyer Stuff

This is shop talk, so anyone who isn’t interested in whether the legal claims have much of a chance under the applicable statutes can stop now.

The complaint alleges interception of electronic communications in violation of 18 U.S.C. § 2511 (Electronic Communications Privacy Act) and unauthorized access to a protected computer in violation of 18 U.S.C. § 1030 (Computer Fraud and Abuse Act).

I don’t see the first claim at all.  The courts have given the ECPA such a crabbed interpretation that it rarely covers even outright e-mail interception.  The complaint doesn’t clarify exactly which communications it alleges were intercepted.  Whatever the plaintiffs have in mind, however, it will almost certainly fail to meet the definition of either “electronic communication” or “intercept.”

By contrast, the CFAA claim seems very strong.  The only interesting question here (assuming the plaintiffs can prove their factual allegations) is whether the laptop in question is a “protected computer,” i.e. whether it was “used in or affecting interstate or foreign commerce or communication.”  Assuming the plaintiffs used it to send and receive e-mails and to conduct e-commerce, that’s probably not hard to prove.  But the reported cases I’ve seen all concern business computers that fit more easily into the definition.  I haven’t found any reported cases applying the definition to consumer computers.  If you know of any, feel free to share.