I don’t use Snapchat mostly because nobody I know uses it. I’m not the target demographic, apparently. But that doesn’t keep me from talking about how much I dislike the user experience. I know that when I register such complaints I probably sound like a codger who wants his buggy whip back. But even so, I never really “got” why someone would build a platform where content is ephemeral. Isn’t the whole point of social platforms to catalog our lives? Maybe I’m missing something.
Kara’s Personal Mission
This past weekend I listened to Kara Swisher’s interview with Evan Spiegel, the CEO of Snap. I’m a big fan of the Recode Decode podcast, where you’ll find Kara speaking with tech leaders (mostly) about contemporary topics. For a long time, and even more so since the last Presidential election’s fake-news hijinks, Kara has been pressing social media companies about their responsibilities regarding content. The “media” companies like Facebook and Twitter say they’re platforms, not media companies. Therefore, they aren’t responsible for the content, they’re not responsible for whether it’s true or false, and they’re not responsible for the impact that lies have on their users and society. This is despite the fact that they want to be the platform where everyone gets their “news”.
Kara’s point, which is increasingly becoming the point of regulators, is that if you’re going to distribute information then you have to take responsibility for the truth of that information. I could buy into the platforms’ argument if they weren’t so brazen about using my data to their own ends. If the price of my use of your platform is that you’re going to store my personal data forever and get unfettered use of it to serve me content, then I do think you have some responsibility to make sure that what you serve is in fact truthful. If you’re using my data to feed me lies, that’s a huge problem. But all this presumes I’m at peace with having all this personal data floating around the ether.
I am missing something
I’ve been trained by Facebook to believe that content that never disappears is a great thing. Facebook reminds me of my old content all the time to encourage me to repost the memories that they’ve saved for me. Having content that eventually disappears isn’t part of their business model. To make money, my data has to be reanalyzed, reused, and shared to further profit goals. So I post, get likes, share and reshare.
Evan raised an interesting question in the interview with Kara. Why do I use these social platforms with their persistent data? He argues that my motivation is not to communicate but instead to broadcast for positive reinforcement. I post not to share but to be validated by clicks on the “Like” button. Persistent posts exist to allow for continued interest and more clicks.
Snap takes a very different approach. For one thing, that user experience I don’t like does not include a “Like” button. For the user, that eliminates the competition for attention. But more importantly, the content is transient. The snaps and stories disappear very quickly. So the goal is not to compete for scarce attention, it’s to actually communicate with other humans. As stated earlier, I’m not enough of a user to know whether communication is truly the goal or what other Pavlovian mechanisms may exist within the application, but it does seem like a loftier aspiration than attention seeking.
What I’m not missing
Regardless of the “why” of Snap, its business model is still the same as other social platforms. Gather data. Analyze. Serve content.
I like Evan. I even like his business philosophy about helping users communicate. However, I wonder how long we’re going to be willing to trade our personal information so that it can be turned into something valuable by these companies. I wonder how long regulators are going to turn a blind eye to such practices regardless of whether we’ve been notified, opted in, or are able to manage our data.
I think the time is short for personal data business models. Regulations are going to continue to turn against these models. However, I believe other models are going to emerge that help connect people with content. The pattern analysis available in ML platforms these days may allow us to go from a bottom-up personalization model, one that gathers your personal data and then aggregates it, to something more top-down, where anonymous patterns in behavior lead to models that can predict your goals and help make the connection. It’s the same objective but one that can, perhaps, be done in a manner that keeps my information mine.
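To make the top-down idea concrete, here’s a minimal sketch of what aggregate behavioral modeling can look like without any per-user profile. The session data and page names are hypothetical; the point is that predictions come from pooled, anonymous transition counts, never from an individual’s stored history.

```python
from collections import Counter, defaultdict

# Hypothetical anonymous session logs: page sequences with no user identifiers.
sessions = [
    ["home", "pricing", "signup"],
    ["home", "docs", "pricing", "signup"],
    ["home", "pricing", "contact"],
    ["docs", "pricing", "signup"],
]

# Aggregate page-to-page transition counts across all sessions.
# This is "top-down": only pooled behavior is kept, never a personal profile.
transitions = defaultdict(Counter)
for session in sessions:
    for current_page, next_page in zip(session, session[1:]):
        transitions[current_page][next_page] += 1

def predict_next(page):
    """Suggest the most common next page for visitors currently on `page`."""
    counts = transitions.get(page)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("pricing"))  # → "signup"
```

A real system would use far richer models, but the design choice is the same: the model learns from patterns in everyone’s behavior in aggregate, so no single visitor’s data needs to be retained or re-identified.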
Privacy by Design
Our recent journey to become GDPR compliant raised my interest in all things related to personal information. SoloSegment is fortunate that our products don’t rely on personal information to work. In the past, we had captured some data that is now considered personal information, but it wasn’t fundamental so we made changes to the product to eliminate personal data (e.g. IP addresses) that we don’t need at the edge of our infrastructure. We only store anonymized data that helps our customers understand their customers.
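As an illustration of that edge-side approach, here’s a minimal sketch of an event scrubber. The field names and the choice to hash the session key are assumptions for the example, not a description of SoloSegment’s actual pipeline; the idea is simply that personal data (like IP addresses) is dropped before anything is stored.

```python
import hashlib

# Hypothetical set of fields treated as personal data and removed at the edge.
PERSONAL_FIELDS = {"ip_address", "user_agent"}

def scrub_event(event):
    """Return a copy of the event with personal fields removed and the
    session key replaced by a truncated one-way hash, so what gets stored
    downstream is anonymized."""
    clean = {k: v for k, v in event.items() if k not in PERSONAL_FIELDS}
    if "session_id" in clean:
        digest = hashlib.sha256(clean["session_id"].encode()).hexdigest()
        clean["session_id"] = digest[:16]
    return clean

raw = {"ip_address": "203.0.113.7", "session_id": "abc123", "page": "/docs"}
print(scrub_event(raw))  # IP is gone; the session key is no longer the original
```

Scrubbing at the edge, before storage, is what makes this “privacy by design” rather than privacy by deletion: the personal data never enters the system in the first place.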
If we knew more about our customers’ visitors, we could do all sorts of fancy things related to personalization. However, investing in a product that relies on personal information seems to be fighting against the headwinds of regulation and customer desire. We’re going to need a different model. Our customers are going to need a different model.
So, you’ll see some changes on the website in the coming weeks. We’ll begin talking about some machine learning products that we’ve recently introduced that help improve site search outcomes without relying on information that users provide. Instead, we’ll be using advanced analytical techniques to help companies automate the insight they can gather from behavioral models rather than from the personal data they collect from their users. We may not have arrived at the answer, but we have set out on the right journey. Don’t bet on personal data models. Bet on privacy by design.