Jul 3 2018

Response to the Journal

(This post is running concurrently on the Return Path blog.)

It is now widely understood that the Internet runs on data. I first blogged about this in 2004 (14 years ago!) here. People have come to expect a robust, and free, online experience. Whether it's a shopping app or a social media platform like Instagram, these free experiences provide a valuable service. And like most businesses, the companies that provide these experiences need to make money somehow. Consumers are coming to understand and appreciate that the real cost of a "free" internet lies in advertising and data collection.

Today, the Wall Street Journal ran an article exploring the data privacy practices of Google and some of the third-party developers who use its G Suite ecosystem. Return Path was among the companies mentioned in this article. We worked closely with the journalist on this piece and shared a great deal of information about the inner workings of Return Path, because we feel it's important to be completely transparent when it comes to matters of privacy. Unfortunately, the reporter was highly, and somewhat carelessly, selective about what information he chose to use from us, and he also cited a number of vague sources who claimed to be "in the know" about the inner workings of Return Path. We know that he reached out to dozens of former employees via LinkedIn, for example, many of whom haven't worked here in years.

While the article does not uncover any wrongdoing on our part (in fact, it mentions that we have first-party relationships with, and consent from, our consumers), it does raise a larger privacy and security concern about Google's practice of allowing developers access to Gmail's API to create email apps. The article goes on to explain that computers scan this data, and that in some rare cases the data is reviewed by actual people. It mentions a specific incident at Return Path in which approximately 8,000 emails were manually reviewed for classification. As anyone who knows anything about software knows, humans program software; artificial intelligence comes directly from human intelligence. Any time our engineers or data scientists personally review emails in our panel (which, again, is completely consistent with our policies), we take great care to limit who has access to the data, supervise all access, and deploy a Virtual Safety Room (VSR), from which data cannot be removed and in which all data is destroyed after the work is completed.

I want to reaffirm that Return Path is absolutely committed to data security and consumer data privacy. Since our founding in 1999, we’ve kept consumer choice, permission, and transparency at the center of our business. To this end, we go above and beyond what’s legally required and take abundant care to make sure that:

  1. Our privacy policy is prominently displayed and written in plain English;
  2. The user must actively agree to its terms (no pre-checked boxes); and
  3. A summary of its main points is shown to every user at signup without the need to click a link.
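The checklist above can also be thought of as a server-side invariant. As a minimal, hypothetical sketch (not Return Path's actual code; all names are illustrative), a signup handler enforcing points 2 and 3 might look like:

```python
def validate_signup(form: dict) -> bool:
    """Accept a signup only if consent was actively given and the
    policy summary was displayed to the user."""
    # Consent defaults to False server-side, so a missing field or a
    # pre-checked box can never pass as agreement (point 2).
    consented = form.get("consent", False) is True
    # The client must confirm the plain-English summary was shown (point 3).
    summary_shown = form.get("summary_shown", False) is True
    return consented and summary_shown
```

The key design choice is that consent defaults to false on the server, so nothing short of an explicit user action can count as agreement.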

While a privacy expert quoted in the article (and someone we've known and respected for years) says that he believes consumers would want to know that humans, not only computers, might have access to their data, we understand that, unfortunately, most consumers don't pay attention to privacy policies and statements. That is precisely why we developed succinct, plain-English "just-in-time" policies years before GDPR required them. When filling out a form, people may not think about the impact that providing the information will have at a later date. Just-in-time notices work by appearing on the individual's screen at the point where they input personal data, providing a brief message explaining how the information they are about to provide will be used.
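To illustrate the mechanism, here is a minimal, hypothetical sketch of just-in-time notice logic; the field names and the wording of the messages are illustrative, not our actual copy:

```python
# Each form field maps to a short, plain-English message shown at the
# moment the user enters that piece of data.
NOTICES = {
    "email": ("We use your email address to connect your mailbox and to "
              "analyze commercial, non-personal messages for market research."),
    "name": "Your name personalizes your account and is never sold.",
}

DEFAULT_NOTICE = "We collect this only to provide the service you signed up for."

def just_in_time_notice(field: str) -> str:
    """Return the brief notice to display when the user focuses this field."""
    return NOTICES.get(field, DEFAULT_NOTICE)
```

In a real form, the message returned here would be rendered next to the field the moment the user begins entering data, rather than buried behind a link.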

It's disappointing, to say the least, that the reporter called this a "dirty secret." It looks pretty much like the opposite of a secret to me.

In addition to our own policies and practices, Return Path is deeply involved in ongoing industry work related to privacy. We lead many of these efforts, and maintain long-term trusted relationships with numerous privacy associations. Our business runs on data, and keeping that data secure is our top priority.

Further, I want to address the scare tactics employed by this journalist, and many others, in addressing the topics of data collection, data security, and who has access to data. It's common these days to see articles that highlight the dangers that can accompany everyday online activities like downloading an app or browsing a retail website. And while consumers certainly have a responsibility to protect themselves through education, it's also important to understand the value of data sharing, open ecosystems, and third-party developers. More than that, it's important to draw a distinction between companies that have direct relationships with, and consent from, consumers and ones that do not.

While they may not be top of mind, open ecosystems that allow for third-party innovation are an essential part of how the internet functions. Big players like Facebook and Google provide core platforms, but without APIs and independent developers, innovation and usability would be limited to big companies with significant market power and budgets, to the detriment of consumers. Think about it: would Facebook have become as wildly popular without the in-app phenomenon that was Farmville? Probably, but you get the point: third-party applications add a level of value and usefulness that a platform alone can't provide.

Consumers often fall into the trap of believing that the solution to all of their online worries is to deny access to their data. But the reality is that, if they take steps like opting out of online tracking, the quality of their online experience will deteriorate dramatically. Rather than being served relevant ads and content that relate to their browsing behaviors and online preferences, they'll see random ads from the highest bidder. Unfortunately, some companies take personalization to an extreme, but an online experience devoid of personalization would feel oddly generic to the average consumer.

There's been a lot of attention in the media lately, and rightfully so, on privacy policies and data privacy practices, specifically as they relate to data collection and access by third parties. The EU's new GDPR regulation has driven much of this discussion, as has the potential misuse of private information belonging to millions of Facebook users.

One of Return Path's core values is transparency, including in how we collect, access, and use data. Our situation and our relationship with consumers are different from those of other companies. If anyone has additional questions, please reach out.