Privacy Is a Luxury Good
And what that does to every system built on the assumption that users can opt out.
The simplest status symbol in 2026 is being unreachable.
Not a watch. Not a car. Unreachability.
Vogue called it in their June 2025 feature: complete unreachability is “the ultimate power move.” Digital detox retreats saw a 50% surge in search interest that year. Luxury resorts are advertising no mobile signal as a feature, not a failure. CEOs are switching to dumb phones. And the people who can genuinely close a laptop and not open it again until Monday: they’re the new aspirational class.
None of this is accidental. It tells you something precise about what privacy has become.
What Happened to Privacy
Privacy didn’t used to require money. It required distance and obscurity. You were private because nobody was watching you closely enough to care, not because you could afford to opt out.
The surveillance layer changed that. And it changed faster than most people tracked.
Employers now monitor workers for productivity, performance, keystrokes, meeting attendance, bathroom break duration. A 2024 Accenture survey found that 62% of executives said their companies were collecting data on people at work. Fewer than a third said they felt confident the data was being used responsibly. That gap, 62% collecting and under a third confident it's being used responsibly, isn't an accident. It's the system working as designed.
The US GAO published a report in December 2025 on digital surveillance of workers. The key finding: it’s more widespread than most employees know, and the regulatory landscape is sparse enough that employers have significant latitude.
Meanwhile, the executive who decided to implement that surveillance can afford a $1,500/night retreat in the mountains with no signal.
That’s the class structure of attention in 2026.
The Pattern
Privacy used to roughly track wealth. If you had resources, you had a bigger house, more walls, more physical distance from neighbors and employers. If you didn't, you were more visible.
What’s new is the precision and the inversion of defaults.
The default is now surveillance. Opting out requires effort, money, or leverage that most people don’t have. You can buy a dumb phone, but you’ll sacrifice the banking app, the two-factor auth, the emergency contact you actually need. You can leave social platforms, but your professional network is there. You can request a reduced monitoring agreement from your employer, but only if you have enough standing to make that request without ending the conversation.
The person without leverage gets the default. The default is surveillance. And the default has gotten more aggressive every year since 2020.
This is not new. What’s new is how visible it’s become. And how explicitly the luxury market has started selling the exit.
What This Does to Systems
Here’s why this matters to builders.
When you design a system with surveillance as the default, you make an assumption: users will behave the same whether or not they know they’re being watched. That assumption is wrong.
People who know they’re monitored don’t behave naturally. They perform compliance. They route around the system when they think they won’t be caught and perform for it when they know they will. The data you collect isn’t a measure of behavior. It’s a measure of behavior under observation. Those are different signals.
I wrote about this in Friction Is a Vulnerability: friction that forces unsafe workarounds doesn’t disappear, it just goes underground. Surveillance doesn’t solve that. It compounds it. Now you have the friction and the workaround and a monitoring layer that finds the workaround and treats it as a compliance failure rather than a design signal.
The person who can’t opt out of the surveillance layer is also the person generating the most distorted data. And that data feeds the product decisions, the algorithm adjustments, the risk models.
You built your system on a corrupted signal, and the corruption came from the people with the least power in the room.
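The distortion is easy to see in a toy model. The sketch below is purely illustrative and not from any cited source: every number and variable (the `policy_target`, the `leverage` distribution, the strength of compliance performance) is a hypothetical assumption. It models users whose logged metric gets pulled toward what the dashboard rewards in proportion to how little leverage they have, and shows the aggregate metric overstating real behavior.

```python
from statistics import mean
import random


def simulate_metric(n_users: int = 10_000, seed: int = 0) -> tuple[float, float]:
    """Toy model of behavior under observation.

    Each user has a true behavior level, but monitored users 'perform
    compliance': the logged value is pulled toward the policy target,
    and the pull is strongest for users with the least leverage to
    opt out. All parameters are invented for illustration.
    """
    rng = random.Random(seed)
    policy_target = 1.0  # the value the monitoring dashboard rewards

    true_vals, logged_vals = [], []
    for _ in range(n_users):
        true_level = rng.gauss(0.6, 0.2)  # real behavior, unobserved
        leverage = rng.random()           # 0 = no exit, 1 = can opt out
        performing = 1.0 - leverage       # less leverage -> more performance
        # Logged metric drifts from true behavior toward the target.
        logged = true_level + performing * (policy_target - true_level)
        true_vals.append(true_level)
        logged_vals.append(logged)

    return mean(true_vals), mean(logged_vals)


true_mean, logged_mean = simulate_metric()
# The logged average sits well above the true average, and the excess is
# contributed disproportionately by the low-leverage users -- the signal
# is most corrupted exactly where the watching is most coercive.
```

Any model, risk score, or product decision trained on `logged_vals` is fitting the performance, not the behavior.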
The Design Question
There’s a version of this problem that gets framed as an ethics question. Should we surveil workers? Is the monitoring proportionate? Is the data used responsibly?
Those are real questions. But there’s a harder, more practical one underneath: who are you actually designing for?
If your system requires users to be willing to be watched in order to function, you’ve made a bet. The bet is that willingness is uniform. It isn’t. It correlates with economic security, with leverage, with how replaceable the person thinks they are. The most surveilled users are also the most distorted users.
Design for the user who cannot buy silence. Not as a moral exercise but as a precision exercise. The product you build for the person with no exit is the honest product. It accounts for coercion. It accounts for what the monitoring actually does to the signal. It doesn’t rely on users choosing to participate when choosing otherwise costs them something.
The person who can close the laptop and go to a mountain retreat is not your design constraint. The person who can’t is.
The Actual Status Symbol
The CEO with the dumb phone and the no-signal retreat isn’t opting out of technology. They’re opting out of the surveillance layer that comes bundled with it. They’ve made the calculation that being reachable costs more than it returns.
Most of the people building software for you never get to make that calculation. They live inside the default.
That’s the design surface. That’s where the real system runs.
Design for the user who cannot buy silence.
Next: Once you accept that silence is priced, local-first starts looking less like a preference and more like reliability engineering. If the network is where the surveillance lives, keeping more of your stack local is how you keep more of the signal honest.
Resources
Vogue / Postdigitalist — “Being Offline the Latest Luxury” (June 2025) — 50% surge in digital detox retreat search interest; Vogue calling complete unreachability “the ultimate power move.”
Accenture — Workforce Data Collection Survey 2024 — 62% of executives confirm companies collecting data on workers; fewer than a third confident in responsible use.
US GAO — Digital Surveillance: Potential Effects on Workers (December 2025) — Federal report on the scope and gaps in worker surveillance regulation.
The Analog Executive (Morphic, January 2026) — Pattern established: executives opting for analog as a status and attention preservation move.
Friction Is a Vulnerability (Morphic, March 2026) — Friction that forces unsafe workarounds; surveillance amplifies the problem rather than solving it.