Service Over Surveillance: Building the Next Generation of Tech

By Travis Montaque

Corporate social responsibility has long been an important business pillar across many industries, from retail to consumer packaged goods. But in tech, we still haven’t fully defined, let alone lived up to, what it means to be a socially responsible company.

That needs to change.

Data privacy has been a hot topic in the mainstream lately, as the government looks to crack down on big tech. Apple recently voiced its stance on data privacy, stressing the importance of keeping personal things personal.

I’m here to say that Holler agrees with Apple. People should not be treated as products for sale, and we as a collective need to return to our most basic principles: customer service and customer experience as core drivers of business outcomes. Tech companies should not surveil consumers for the purpose of manipulating their behavior, especially when the practice is detrimental to both human beings and to society.

Think about one of the most basic concepts in machine learning, the exploration-exploitation trade-off, in which a machine explores and tests various actions until it learns how to consistently deliver a desired outcome as defined by its creator. That outcome is usually profit, with little regard for how the machine arrives at it.

Service over surveillance. This is the principle I’ve been pushing internally at Holler as we continue to develop and distribute technology, and it is the technology industry’s best chance at fixing what’s broken in how the Internet works today.
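
To make that trade-off concrete, here is a minimal epsilon-greedy sketch in Python. It is purely illustrative (a hypothetical reward function standing in for clicks or revenue, not anyone’s production system), but it shows how such a loop optimizes whatever reward its creator defines without ever asking how that reward is earned.

```python
import random

def epsilon_greedy(options, reward_fn, rounds=1000, epsilon=0.1):
    """Toy exploration-exploitation loop over a set of options."""
    totals = {o: 0.0 for o in options}  # accumulated reward per option
    counts = {o: 0 for o in options}    # times each option has been tried

    def average(o):
        return totals[o] / counts[o] if counts[o] else 0.0

    for _ in range(rounds):
        if random.random() < epsilon:
            choice = random.choice(options)     # explore: gather more information
        else:
            choice = max(options, key=average)  # exploit: best average reward so far
        reward = reward_fn(choice)              # creator-defined outcome, e.g. clicks
        totals[choice] += reward
        counts[choice] += 1

    return max(options, key=average)

# Hypothetical usage: the "reward" is whatever the creator chooses to optimize for.
best = epsilon_greedy(["ad_a", "ad_b"], reward_fn=lambda ad: random.random())
```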


Old Tech Practices Rely on Selling YOU

Over the last decade, as social media channels have risen in consumer popularity, individual control over personal data has plummeted in tandem. Tech companies first took our data (often without people even realizing it), which then gave them incredible control.

Surveillance capitalism, an economic system centered on the commodification of personal data for profit, is built on taking personal information about you, me, and virtually everyone else who isn’t living completely off the grid.

Advertisers have benefited from this wealth of data as well, gaining the ability to target people with ads based on things like location, age, and interests. Our social feeds have since been flooded with highly targeted advertisements that, over time, have become practically indistinguishable from our organic content. If there were a creepy-o-meter for these practices, it would have broken by now. Yet we’ve all gone along with it.

For a while, it seemed like we weren’t going to be able to escape the hole we had dug ourselves into: advertisers addicted to targeting data that makes them feel they aren’t wasting precious media budgets, the stock market buoyed by the billions of ad dollars flowing to ad tech companies, and real regulation nowhere to be seen. But finally, with Google’s move to phase out cookies in Chrome over the next couple of years and Apple’s recent proposal to render the Advertiser ID essentially useless, there is light at the end of the tunnel when it comes to the protection of data, and therefore the protection of people.


Why Holler is Different

Holler is a tech company, but the way we capture and use data is very different from how the tech giants before us did it. We decided we wanted to do things the right way, and now we’re aggressively taking the steps necessary to achieve that.

We do not capture personal information about YOU and your actions specifically and sell it for monetary gain. In fact, we’re already moving to do away with collecting the Advertising ID. Instead, our ability to drive profit is centered on being exceptional at context rather than identity. Being exceptional at context allows us to serve users when they need us and puts the power in their hands. We monetize by giving businesses the opportunity to enter the conversation only when the context calls for it, with content that’s useful to the conversation rather than an ad (which I’ll explain more about below). Our AI also runs on your device, not in the cloud, which means it never has to send your private messages to us.

To demonstrate further what we do differently, I’ll break down our core guiding principle:

The conversation is yours, not ours. We’re just here to make it better.

At Holler, our system can understand someone’s intent, on the device, when they type “I love pizza” into a group chat, because our technology understands the context of the words you type. Our AI recognizes that you love pizza, so it serves you a pizza Sticker to help make your conversation better.
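
To illustrate the shape of that flow (a deliberately simplified, hypothetical sketch, not Holler’s actual model), on-device suggestion can be thought of as matching the message against a small set of intents locally and returning only a suggestion, with the message itself never leaving the device:

```python
from typing import Optional

# Hypothetical on-device suggestion: the message is analyzed locally and
# never sent to a server; only a sticker suggestion is produced.
INTENT_KEYWORDS = {
    "pizza": {"pizza", "pepperoni", "slice"},
    "coffee": {"coffee", "latte", "espresso"},
}

STICKERS = {
    "pizza": "sticker_pizza_love",
    "coffee": "sticker_coffee_cheers",
}

def suggest_sticker(message: str) -> Optional[str]:
    """Return a locally chosen sticker id for the message, or None."""
    words = set(message.lower().split())
    for intent, keywords in INTENT_KEYWORDS.items():
        if words & keywords:          # any keyword present in the message
            return STICKERS[intent]
    return None

print(suggest_sticker("I love pizza"))  # -> "sticker_pizza_love"
```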

Why Context?

Why do we lean on context instead of personal data? Because we believe there is no need to know exactly who you are in order to provide useful content that helps you have better conversations. That’s why, instead of the personal graphs used by leading social and search platforms, we use a context graph.
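
A rough way to picture the difference (hypothetical structures for illustration only, not Holler’s actual graphs): a personal graph keys content to who you are, while a context graph keys content to what the conversation is about right now, so no user identifier is needed at all.

```python
# Hypothetical contrast between the two approaches.

# Personal graph: content is keyed to an identity and its history.
personal_graph = {
    "user_12345": {"location": "NYC", "interests": ["pizza", "soccer"]},
}

# Context graph: content is keyed to conversational topics, no identity required.
context_graph = {
    "pizza": ["pizza stickers", "pizza GIFs"],
    "travel": ["boarding-pass GIFs", "packing checklists"],
}

def relevant_content(detected_topics):
    """Look up content purely from conversational context, with no user ID."""
    return [item for topic in detected_topics for item in context_graph.get(topic, [])]

print(relevant_content(["pizza"]))  # -> ['pizza stickers', 'pizza GIFs']
```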

In the tech industry, there has long been a conflict of interest between businesses making money and doing what’s best for people. In the past, the latter has suffered so the former could prosper, and with that came severe abuses of data privacy. Still, as a society, we continued to use these free tools, even after we realized our data was being collected purely for profit.

But despite this mass complacency about data privacy, as I continued to build Holler, I knew I wanted to do something different. I knew I had to. Instead of following you around the Internet, we decided to serve you only in the moment you need us (or when you Holler). We’re confident we can serve people successfully and help businesses succeed as well, in a way that’s privacy-safe for the people we serve and care about.


Our Commitment Moving Forward

1. No Surveillance

We have no interest in building up a personal picture of you so we can sell access to the highest bidder later. You’ll get what you want, when you choose, and it won’t be weird.


2. On-Device Suggestions


What you say on your device stays on your device. All of our suggestions are made on-device, so we don’t take that conversation data anywhere else (no storage in the cloud, on servers, or in other secret hideaways).


3. Transparency

We’re working to make sure users always know what’s going on within our experience, in a way that is clear and easy to understand. This article is our first step in being more transparent about what we’re doing at Holler.


It’s Time for the Next Generation of Tech Companies to Step Up

Holler is a Next-Gen Tech Company. We’ve learned from the mistakes of so many before us, and we’re making the conscious decision to forge our own path based on the beliefs we’ve held since day one. We’re laser-focused on innovation and breakthrough products, but at the same time we remain anchored in a moral and social ethos that allows us to have a net-good impact on society.

And I’m not just referring to our stance on privacy. We’ve been firm in our opinions on other important topics as well, like reinventing corporate culture and being open about diversity, inclusion, and belonging in the workplace. We’ll continue to be outspoken about what we feel is right, and to do the work needed to confidently stand behind those claims.

My team at Holler knows our purpose in the world is larger than the products we build. And as tech leaders, it’s crucial to realize that the actions we take today will shape the future of the people who use our services. We must choose wisely.

This is all still a work in progress at Holler, but we’ve already begun taking on the challenge. The responsibility is great, and building things this way (the right way) is never easy. It takes tons of creativity and grit to go the distance. But the outcome will be well worth it.

Want to get in touch?
Drop us a line!
