Illustration by Sarah Leigh
January 22, 2019

Introducing Consent-Based Calibration

Collecting and maintaining user data just got a lot more risky, and potentially, a lot more expensive. This month’s implementation of California’s Consumer Privacy Act (CCPA) has a lot of organizations hustling to comply with the most rigorous data privacy regulation enacted in the U.S. to date. The CCPA demands a wide swath of participation, requiring that companies and organizations (serving California residents) that meet one or more of the following criteria comply:

(1) Companies and organizations that have annual revenues of more than $25 million; (2) that have collected 50,000 or more personal data records; or (3) that earn more than 50 percent of their income from the sale of personal data.

The compliance checklist itself is familiar territory; it isn’t where the implementation burden actually falls. What’s critical about the CCPA is that it has some teeth in terms of user protections. Even if an organization doesn’t experience a data breach, it is at risk of being sued. That’s because the new law states that once a user has requested their personal data report, an organization has 30 days to deliver the user’s record. If it fails to do so, the user can receive up to $750 in compensation (or actual damages, whichever is higher) from the organization that has failed to comply. In addition, the organization can face a fine of $7,500 per personal record.
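To make those stakes concrete, here is a minimal, illustrative sketch of how that exposure compounds, using only the dollar figures cited above. The function name and inputs are hypothetical, and real liability depends on actual damages and enforcement decisions:

```javascript
// Illustrative only: rough exposure estimate using the figures cited above.
// Assumes a flat $750 statutory payment per unfulfilled user request and a
// $7,500 fine per personal record at issue.
function estimateCcpaExposure(unfulfilledRequests, recordsAtIssue) {
  const perUserCompensation = 750; // up to $750 per user (or actual damages)
  const perRecordFine = 7500;      // potential fine per personal record
  return unfulfilledRequests * perUserCompensation + recordsAtIssue * perRecordFine;
}

console.log(estimateCcpaExposure(100, 100)); // 825000
```

Even at modest scale, the per-record fine dominates, which is part of why holding fewer records in the first place is attractive.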

But here’s the thing: As a digital agency responsible for presenting the best possible set of solutions, our first question in response to CCPA isn’t just “How can we help our clients comply?” It’s “What is the best way to balance consumer privacy with content or product delivery?”

That’s where consent-based calibration comes in: a new approach that addresses concerns about data privacy, unwieldy personalization software, and content-dense websites, and delights users at the same time. Consent-based calibration shifts our main concern from collecting and protecting user data to actually respecting user privacy.


Consent-based calibration is built on one simple idea and two core concepts:

The Idea:

We don’t need to know exactly who you are to serve your needs—or ours.

Organizations don’t need to know exactly who you are on a personal level in order to engage you in a meaningful way. In fact, the basic tenets of personalization fail in ways that make the benefits not worth the risks. Consent-based calibration asserts that it’s more respectful, impactful, and impressive if an organization doesn’t know your every click and can still serve up a deeply meaningful experience. Out of this idea come consent-based calibration’s two central concepts:

Concept 1:

Replacing personal experiences with optional personas isn’t just ethical, it’s effective.

Instead of entering a digital platform and consenting to GDPR data collection, a user would enter the same digital platform and be offered a choice of experiences: How would you like to experience our site? We can offer you a choice of experiences, and you can shift between them at any time. Instead of collecting data on the user specifically to tailor every next step, consent-based calibration suggests next steps based on the persona rather than the person, and at key checkpoints, the user can opt into a different persona, or a different experience.
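The mechanics described above can be sketched very simply. This is a minimal illustration, not a production design; the persona names and content slugs are hypothetical. The key point is that suggestions are keyed to the persona alone, so nothing about the individual user is tracked or stored server-side:

```javascript
// Hypothetical personas, each mapped to the content that typically serves it.
const personas = {
  donor:      ["impact-report", "donate", "annual-letter"],
  volunteer:  ["events-calendar", "signup-form", "training-guides"],
  researcher: ["publications", "data-downloads", "methodology"],
};

// Next steps come from the chosen persona, not from tracked behavior.
// If no persona has been chosen, offer the choice of personas again.
function suggestNextSteps(persona) {
  return personas[persona] ?? Object.keys(personas);
}

console.log(suggestNextSteps("volunteer")); // ["events-calendar", "signup-form", "training-guides"]
```

Because the mapping is static and shared by every user of a given persona, there is no per-user profile to breach, sell, or subpoena.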

One of the most widely agreed upon issues with personalization—and with AI in general—is that it lacks the ability to understand that a user may want to radically change their course or interests: It fails to introduce us to the parts of ourselves that are unknown. Consent-based calibration assumes that a person would like to actively participate in creating their own experience. It fosters respect by offering the choice to change that experience at any time. And ultimately, it shifts the now-ubiquitous single prompt of “Do you consent to us collecting your data to provide you with an experience that we define for you behind the scenes?” to an ongoing conversation of “Are you interested in interacting with our site in one of these typical user patterns? How about now?”

This shift from a single ask, which gives the organization all of the decision-making power moving forward, to an ongoing consent-based calibration process sows respect between organization and user. “We don’t need to know exactly who you are in order to share with you what you might be looking for” is the kind of consent-based mutuality that could only be developed by an organization that isn’t driven by commodification.

Applying consent-based calibration would address many of the ethical and practical issues organizations and individual users face regarding data, content, and user experience. It refuses to accept that personalization means a more personal experience, and instead, calls it out for what it is: A commodification engine.

Concept 2:

Massive content management systems should be replaced with lightweight, interactive experiences that rely on browser-side data storage.

We’ve worked with hundreds of clients, and across all of them, one rule tends to hold: Less than half of an organization’s content drives 99 percent of its website’s meaningful engagement. Having run hundreds of analytics audits and discovery workshops with leaders across industries, we can say that everyone shares a similar burden: too much content, not enough meaning, not enough engagement. Critically, the data collection required for personalization only doubles down on that burden: Massive amounts of content generate massive amounts of data—and massive amounts of data make organizations increasingly vulnerable to data breaches and users vulnerable to repetitive personalization experiences.
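What “browser-side data storage” can look like in practice is this simple: the user’s chosen experience lives entirely on their own device. The sketch below is a minimal illustration under assumptions, not a prescription; the function names are hypothetical, and `storage` is any object with a `getItem`/`setItem` shape. In a browser you would pass `window.localStorage`; an in-memory stand-in is used here so the example is self-contained:

```javascript
// The user's chosen persona is the only "profile" the site keeps,
// and it never leaves the browser.
function choosePersona(storage, persona) {
  storage.setItem("persona", persona);
}

function currentPersona(storage) {
  return storage.getItem("persona") ?? "unchosen";
}

// In-memory stand-in with the same getItem/setItem shape as localStorage.
const memoryStorage = (() => {
  const m = new Map();
  return {
    getItem: (k) => (m.has(k) ? m.get(k) : null),
    setItem: (k, v) => m.set(k, String(v)),
  };
})();

console.log(currentPersona(memoryStorage)); // "unchosen"
choosePersona(memoryStorage, "volunteer");
console.log(currentPersona(memoryStorage)); // "volunteer"
```

Because the persona is the only state, the organization’s backend never accumulates the per-user behavioral records that create breach liability in the first place.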


Organizations doing good in the world, the kinds of clients that ThinkShout serves, don’t need to leverage tools developed for extraction in order to deliver experiences meant for action. Consent-based calibration is one simple response to the need for organizations to create experiences in which their users feel understood and mutually respected. We know it’s possible; now we just need to do the work of making ideas like these probable.

Read more on Consent-Based Calibration
Ethics lens

Restoring Trust in Tech Starts With New Boundaries for User Consent

Consent-based calibration shifts control of the data that defines who we are online back to users.
Business lens

Questioning Personalization Is Good For Your Engagement Model

Imagine a world where you get online and are greeted, not by a now-ubiquitous banner asking you to consent to allowing your cookies to be tracked, but by a clear and visually dynamic set of options, each one offering a different website browsing experience.
Design lens

The Case for Designing Digital Platforms That Respect Our Privacy

Consent-based calibration isn’t a new approach to relationships—person to person or person to organization—it’s a different way to think about who designs our day-to-day experiences, and ultimately, who controls our lives.
Technical lens

The Technical Benefits of Consent-Based Calibration

Accurate content personalization is a large technical feat, combining client-side behavior tracking, data analytics, machine learning, and hand-built segmentation to deliver valuable experiences to end users. Consent-based calibration bypasses many of these technical challenges.