Restoring Trust in Tech Starts With New Boundaries for User Consent
I got my start in the relatively early days of the Internet: the late '90s and early aughts were littered with exciting dot-coms, barely one of which had a viable business model. One of my favorites at the time was Kozmo.com, which pioneered the on-demand delivery model that is ubiquitous today with the likes of Uber Eats, DoorDash, and Amazon. Back then, though, the actual manifestation of Kozmo was nuts! You could browse the website, which was essentially a visual convenience store, order your Ben & Jerry's and a DVD (because streaming was still a distant dream), and have it all delivered within an hour by someone on a scooter. They even threw in a branded bottle opener for kicks! I don't know how much Kozmo lost on every order, but it was significant, and, unsurprisingly, it went out of business in short order in the spring of 2001, along with so many others in the dotcom crash. Kozmo so defined the first dotcom boom-and-bust cycle that its story was told in a 2001 documentary titled e-Dreams.
Before Kozmo crashed, though, investors and the popular press fully bought into the company's vision. Why? Part of it was blind exuberance, but another big factor was the promise of the data being collected and the personalized experiences not far behind. Kozmo learned where you lived, your movie and ice cream preferences, and when you liked to relax. Just imagine the possibilities those insights presented in an age when telephone surveys and census data counted as cutting-edge insights. I was sold on the idea. When my "old economy" friends would naively inquire how these dot-coms made money, I would scoff and brag about the big data future. Oops.
But now that technology and culture have caught up to those early ideas, and our big data future is here, I think we can all agree that it is far from the utopia those early dreamers envisioned. Our data reality has brought us walled gardens where we are fed an ever-narrower stream of information catering to our worst instincts; where giant corporations gather terabytes of private information, creating detailed profiles about each of us that are sold to even bigger corporations so they can target us with sophisticated marketing campaigns; where we are torn between the convenience of a "smart" speaker in our homes and the risk of being surveilled; and where the dream of equal access to knowledge has given way to the nightmare of politicized echo chambers and the erosion of truth. Those early tech dreamers are now corporate titans who use their billions to increase their power through lobbying and the squashing of competition.
But all is not lost! The Internet is still free and open, and we have the power to create experiences that respect our audiences' individuality without eroding their identities.
Rather than gathering every possible piece of information without a user's consent, just ask users what type of journey they would like to take. This is not a radical concept: when you walk into a store, you are (in theory) greeted and presented with items to purchase based on your stated needs, as well as the assumptions a salesperson might make about who you are. A person simply asks what you are looking for and, if they can help, begins to guide your shopping experience. Personalization likewise makes assumptions about who you are based on what it knows, but rather than clarifying by asking directly, it deepens those assumptions by monitoring your behavior. Unlike real-life shopping experiences, personalization allows for no user-controlled checks on its assumptions, and that's where consent-based calibration comes in. When you meet people at a party, depending on your mood that night, you get to control how you define yourself, and it's okay if you're outgoing and owning the dance floor one night, and relaxing in the corner talking with a friend another. Our digital experiences should reflect how we engage with the real world. They should allow us to choose our own personas during an experience, and maybe even change them halfway through.
We are not a composite of the data collected about us without our consent. We are people whose opinions, needs, feelings, and moods are constantly changing, and a digital experience should calibrate to match, with our consent. This paradigm is not only ethical and in the best interests of people; it also benefits the organizations they engage with. Consent-based calibration builds the foundational trust needed to deepen engagement and, ultimately, to improve whatever KPIs define the health of an organization's digital presence.
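To make the idea concrete, here is a minimal sketch of what consent-based calibration might look like in code. Every name here (Persona, SessionCalibration, and so on) is hypothetical, invented for illustration rather than drawn from any existing library. The essential moves are the ones described above: the user explicitly chooses a journey, can switch personas halfway through, and can withdraw consent at any time, returning to a neutral default.

```typescript
// Hypothetical sketch of consent-based calibration. Nothing is
// inferred from surveillance; the user states who they want to be.

type Persona = "browsing" | "buying" | "researching";

interface Calibration {
  persona: Persona;
  consentGiven: boolean; // nothing is calibrated until the user opts in
}

class SessionCalibration {
  private state: Calibration = { persona: "browsing", consentGiven: false };

  // The user chooses a journey; the choice itself is the act of consent.
  choosePersona(persona: Persona): void {
    this.state = { persona, consentGiven: true };
  }

  // Personas can change mid-session, like moods at a party.
  switchPersona(persona: Persona): void {
    if (!this.state.consentGiven) {
      throw new Error("Ask first: no consent on record.");
    }
    this.state.persona = persona;
  }

  // Withdrawing consent resets the experience to the neutral default.
  revokeConsent(): void {
    this.state = { persona: "browsing", consentGiven: false };
  }

  current(): Calibration {
    return { ...this.state };
  }
}
```

The design choice worth noticing is that consent is state the user controls, not a checkbox buried in onboarding: every calibrated behavior reads from it, and revoking it genuinely resets the experience.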
We have the power and the tools to build these products. Let's get started and create a better Internet.