Monday, February 11, 2013

Jones-Type Search Using Publicly Available Information.

With the recent decision that placing a GPS tracking device without a warrant is unconstitutional, United States v. Jones, 132 S.Ct. 945 (2012), it follows that the solution is to outsource it to a defense contractor, in this case the good people at Raytheon. It would seem somebody concerned about the project (known as RIOT) decided to leak a promotional video to the Guardian in the U.K. The video is an interesting example of scavenging publicly available information to produce a composite picture of a person's activities.

Two interesting features of the algorithm: 1) the use of embedded photograph metadata (in this case GPS position and time) to gather additional information about the target; 2) the problem of assessing negative information in a pattern-recognition context.
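On the first point, cameras typically store location in a photo's EXIF block as degrees/minutes/seconds plus a hemisphere letter, which an analytics tool would convert to decimal degrees before plotting. A minimal sketch of that conversion (the function name and sample coordinates are mine for illustration, not anything from the RIOT video):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds plus a hemisphere
    reference ('N'/'S'/'E'/'W') to signed decimal degrees."""
    decimal = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative by convention.
    return -decimal if ref in ("S", "W") else decimal

# Hypothetical GPS fields pulled from a posted photo:
lat = dms_to_decimal(37, 46, 30, "N")   # 37.775
lon = dms_to_decimal(122, 25, 0, "W")   # about -122.4167
```

Paired with the EXIF timestamp, each photo becomes a (who, where, when) record, which is all a tracking composite needs.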

To elaborate on point #2: the subject checks in at the gym more than anywhere else, and when he does check in (presumably via Foursquare), it is at 6 a.m. People who are asleep at 6 a.m. do not check in at the gym. Not checking in is negative information that computers do not include in their assessment of the probabilities. People can infer that the subject sleeps most nights, but the computer has problems with this type of reasoning. Similarly, most people don't check in at work via social-network tools. These kinds of background, unreported features distort regression algorithms that work by brute-force analytics. Don't worry though, I'm sure they'll get that fixed in RIOT 2.0.
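A toy calculation makes the distortion concrete (the numbers are invented for illustration): if an algorithm conditions only on days with a check-in, it vastly overstates how often the subject is at the gym, because the silent days never enter the denominator.

```python
# Invented 30-day observation window: 10 gym check-ins,
# 2 check-ins elsewhere, and 18 days with no check-in at all.
days = ["gym"] * 10 + ["bar"] * 2 + [None] * 18

# Naive estimate: condition only on days the subject checked in,
# which is all the analytics tool ever sees.
checkins = [d for d in days if d is not None]
p_naive = checkins.count("gym") / len(checkins)    # 10/12, about 0.83

# Corrected estimate: treat the silent days as evidence too.
p_corrected = days.count("gym") / len(days)        # 10/30, about 0.33
```

The naive figure is more than double the corrected one, and the gap grows with the fraction of unreported days, which is exactly the background-feature problem described above.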

2 comments:

  1. Since we read Jones, I've been checking to make sure that my intuitive sense of what's offensive tracks with what the law defines as the invasion. This came to my attention in Jones, because I think that treating a trespass to place a GPS tracker as an unconstitutional search instead of engaging with the tracking that follows (the creepy part) is nuts.

    I run into trouble when I try to apply that idea here--all of that publicly available information is just floating around out there and that's mostly fine--Nick (from the video linked above) voluntarily checks in on Foursquare and uses LinkedIn--but it starts getting creepy when someone analyzes and compiles all that information to form a picture of Nick's life. But performing that kind of analysis isn't any more invasive in the sense that no additional information about Nick is revealed. The creepiness comes from rearranging public information to make it more meaningful. My point is that if we want to locate the invasion of privacy (if any) at the onset of creepiness, then in this case, invasion doesn't equal additional (or any) intrusion.

    This raises all kinds of questions about how the law ought to respond. Preliminarily, we have to decide whether privacy creepiness is something that law ought to respond to at all. Assuming so, we have to think really hard about the mechanism for doing so. Privacy regulation typically controls collection, disclosure and disposal--none of which are particularly useful controls when the offensiveness stems from the use of the information. This leaves us with the idea that we'd have to regulate use, but regulating the use of admittedly public information seems tricky at best...

  2. This software seems to lack sufficient justification for the government when considering the corresponding intrusion. If the ultimate intention is crime prevention, I struggle to understand how past behavior would predict future conduct in the wake of criminal activity. I suppose a deviation from one's typical RIOT-compiled routine would be an indication of something to explore; however, I wonder how RIOT can offer more than these speculative leads. I also don't understand the necessity of compiling a databank of such information rather than having a tool like RIOT available to apply in a case-by-case scenario (ideally authorized by a warrant).

    Additionally, using my personal information to tailor marketing campaigns for my interests is something I’ve come to expect, and often appreciate, from the likes of Amazon and Target. But when the federal government takes my personal information that I may not knowingly supply (my longitude/latitude from pictures I post on Facebook) and uses it to predict my behavior, I feel a bit too intimate with Big Brother.
