Sunday, March 3, 2013

There’s Not an App for That...But There Is an FTC Report.



Cara wrote eloquently on February 5 about the FTC’s settlement with the social networking app Path. As she noted, the case was of particular interest because it was not an instance of an entity violating its own privacy policy (as we saw with the Google Buzz settlement) but rather a case of an app having a poorly designed interface for informing consumers of information collection. That same week, the FTC took another step down the mobile app rabbit hole: it released a report on mobile app privacy.

The report reveals some interesting developments in the FTC’s approach to what is sensitive information in the sphere of mobile privacy. The FTC’s approach appears to be largely based on what consumers consider private. It makes clear that the FTC considers geolocation data to be sensitive. Searches for health and political affiliation information as well as the user’s communication with his or her contacts may be private, too. The report notes that a user might consider his or her photos, videos, address book, and calendar private in certain circumstances.

The report offers recommendations to platforms, app developers, advertising networks and other third parties, and app trade associations, but the majority of the content focuses on platforms. The FTC considers these platforms (Apple, BlackBerry, Google, Amazon, Microsoft) to be the “gatekeepers” for the industry, which makes sense.

There are a variety of recommendations, but I was most intrigued by the FTC’s suggestion that apps include “just-in-time” disclosures. The idea is that an app would present a disclosure (perhaps a pop-up window or a notice akin to a text message) at the relevant moment, when information collection is about to occur beyond what the consumer would normally expect from the app. If what the FTC is seeking in this area is some sort of meaningful consent, this approach seems logical. At the moment right before the information is collected, the user has the opportunity to think, “Well, hey, here’s the information I would be providing to this app. Do I really want to hand that over?”

According to the FTC, some users will respond, “No, I don’t want to share that information.” In the report, the FTC cites a survey that found that 57 percent of mobile app users have “either uninstalled an app over concerns about having to share their personal information, or declined to install an app in the first place . . .” This, the FTC argues, suggests consumers have significant concerns about privacy in apps. Are people choosing not to share information, or to delete an app, because of a harm they anticipate? Do they care that they do not feel in control of personal information on their devices? Anecdotally, I can think of a number of times when I downloaded an app and then deleted it when it asked for personal information, not because I necessarily saw any harm in sharing it, but because I was too lazy to type it in. But there have certainly been other times when the app wanted to access Facebook, and I did not feel the need for all of my friends to see everything I did with the app.

Does the report focus on the right types of information as being “sensitive”? Does the concept of a “just-in-time” disclosure give consumers enough control over their information? Does it seem feasible for mobile app platforms and developers to effectuate these reforms?

2 comments:

  1. I find it interesting that people might have a "just in time" notification about data collection. On one hand, such a notification could keep users informed of what is about to happen, and why. On the other hand, such notifications would be popping up in the midst of using the app. I know I'd probably find that as annoying as pop-up ads elsewhere. And if consent is required to use the app for its intended purpose, I have a feeling that very few people will decline.

    For me, apps come with a lot of questions about contracts of adhesion. The user has no bargaining power, no ability to change contract terms, and must accept the agreement on a take-it-or-leave-it basis. I don't view app consent to be meaningful, but that doesn't mean it's any less legally binding.

    Times like these make me wonder how modern technology might allow for better agreements. For instance, based on what information you choose to allow or not allow, apps might just offer more or fewer services. If you opted out of GPS locations, you could still get directions, just not navigation. Obviously there would be a base level of required info for the app to function, but maybe customizable app agreements could help alleviate some other concerns about data collection and meaningful consent. Maybe the FTC should be pushing for something less intrusive on the app experience, and more easily used on the front end.

  2. An article in the New York Times this week highlights that privacy is no longer just a regulatory interest, but that companies are also increasingly pushing one another to prove that consumer data is safe and within consumer control. The article notes that "to some degree, these developments [made by companies] signal that the industry is working hard to stave off government regulation, which is moving at a glacial pace anyway."

    It will be interesting to see whether it is the industry leaders or a regulatory body like the FTC that ends up defining what become accepted best practices. Perhaps the two of us could come up with some kind of term to be used in the media to define this "privacy" race. Can you tell I taught students last week about how policy makers are describing the Internet freedom war in Cold War terms?

    http://www.nytimes.com/2013/03/04/technology/amid-do-not-track-effort-web-companies-race-to-look-privacy-friendly.html?src=recg
