Like many Americans with too much time on our hands, I tried an experiment earlier this year: I significantly reduced the time and energy I devote to mainstream social media sites. What I learned changed both my ongoing behavior and my thoughts on where I’d consider working in the future.

My experiment started with a mild inconvenience: I had cleared all the cookies from my phone’s web browser, logging me out of Twitter and Facebook. (I haven’t had either company’s primary app installed for years; the apps aren’t necessary to use the services.) Faced with a login screen for each, I decided to leave myself disconnected. I also logged out of both services on my work PC. This effectively limited my time on these sites to when I was at home on my personal computer.

Much like some TV shows are classified as “appointment television,” meant to be watched at a specific time, my lazy gesture turned Twitter and Facebook into “appointment browsing.” I now check each site about once a day, with exceptions when I’m on vacation or on the rare occasion that something newsworthy is happening in my area. I found that Facebook provides very little meaningful or personal content, and that Twitter’s “conversations” are invariably dominated by the loudest voices in the room, with dissenters set upon by those voices’ sociopathic fans. Anti-curiosity reigns: many people use social media not to open their minds but to confirm what they already believe. This problem has survived several rounds of my muting and cutting ties with the most problematic people, pages, and publishers on each service.

My company provides plenty of junk food for free. I know this food is bad for me; I still eat it. Similarly, I know that Twitter’s and Facebook’s informational junk food is bad for me, and yet it still scratches some subconscious itch. Deprioritizing those two services requires me to fill the information void with something better. There are, of course, other social networks I use for different reasons: LinkedIn, for professional development; Nextdoor, for neighborhood commerce and fear; Strava, for aspirational cycling; and Discord, for my Pokémon Go habit.

Another side effect of reducing microcontent is that I’ve been reading more news articles and books. The articles are often from sources I pay for — more on that later — and the books increasingly come from the Seattle Public Library, which lends out a huge selection of e-books and audiobooks. Borrowing puts me under time pressure to finish, re-borrow, buy, or abandon any book within three weeks. I also use Goodreads, a helpful social network for books that just so happens to be owned by Amazon.

Many modern media companies, notably Facebook and YouTube, treat “time on site” as a key metric: the more time users spend on the platform, the better. That reasoning is tied directly to advertising revenue, not to the quality of the information these platforms provide or to its societal value. We all have a few friends and relatives who share garbage content: conspiracy theories, viral videos, and the like. These snackable nuggets don’t get enough pushback. In Sri Lanka, Mexico, and India, viral videos have led vigilantes to kill wrongly accused people. In Germany, Facebook use specifically — not just Internet access generally — is tied to anti-refugee violence. Anti-vaccination misinformation is so mainstream in Italy that the government has significantly loosened vaccination requirements for children. People are dying over bogus provocative videos, but deleting those videos would reduce “time on site,” so it’s not in the platforms’ interest to do so.

I find it troubling that a generation of Internet users has come of age at a time when just about any online interaction involves three parties: the consumer, the producer, and the advertiser. In addition to installing ad blockers on my home PC and phone, I’ve started paying more producers for their content. Some, like the Puget Sound Business Journal, require payment. Others, like the local sites The Urbanist and Capitol Hill Seattle, make payment strictly optional. I also pay for e-mail service and file hosting from companies that don’t use those services as feeders for ad targeting systems. Some cynical friends and co-workers would attach a “yet” to that last sentence, but I really do believe that the way to assert control over modern media is to insist that users become customers, not targets.

Lastly, I’ve amended my LinkedIn profile to set two constraints on where I’m willing to work. I’d already said I didn’t want to work in the suburbs, not even in edge cities like Bellevue, Washington, and certainly not in small towns like Cupertino, California. I’ve since added that I want to work on products people are willing to pay for: “no solely ad-supported, metrics-driven services, please.” I’ve worked on marketing technology in the past; from the privileged position of a software engineer with 15 years of experience, I’m exercising my right to opt out of working on any more ad tech. I also urge my friends who work at companies like Google and Facebook to think about where their salaries, their valuable stock units, and their meals come from. How differently would their employers behave if their money came from users instead of advertisers? Wouldn’t it be great to work at a company whose primary business function hasn’t inspired “blockers”?
