Disconnected

claustrophobic and philosophical

A psychologist hosts a dinner party where guests debate the ethics of brain-computer interfaces that enforce honesty and erase privacy, unaware they are participants in an experiment designed to expose their hypocrisy about the very technology they claim to understand.

Story

The table had been set for hours. Samantha watched from the kitchen doorway as Lisa arranged the last of the stemware, each glass catching light from the recessed fixtures overhead. The food was excessive, deliberately so. Whole roasted fish with skin that crackled like paper, vegetables still wearing their garden dirt in artful smears, bread torn rather than sliced. Everything designed to suggest authenticity, to anchor people in their bodies.

She had chosen this house for its windows. Floor to ceiling glass on three sides, the city spreading out below like a circuit board. Most of those lives were plugged in right now, consciousness diffused across the network, experiencing each other with a clarity that made this dinner party feel like a costume drama.

Victor arrived first, which had been planned. He set his bag down in the entryway and looked at her with something that might have been concern.

"You're sure about this," he said.

"I stopped being sure about things three years ago," Samantha said. "Now I just do them."

The others arrived in clusters. Emily and Eva together, which Samantha had not anticipated but should have. Eva had not been invited, but Emily's plus-one privilege gave her cover, and Samantha decided in the moment to allow it. Sometimes the experiment improved itself. Alexander came alone, checking his phone every few seconds in a way that suggested he was monitoring something more important. Marcus and Olivia arrived separately but within minutes of each other, and the way they avoided eye contact told Samantha everything about the current state of their relationship.

Max was last, apologizing about traffic in a way that suggested he had practiced the apology. Mark had been in the kitchen with Lisa since early afternoon, which meant he had heard Samantha's instructions, had understood something was happening beyond the official premise of the evening.

They settled around the table with the awkwardness of people who had agreed to perform intimacy without the infrastructure that usually supported it. Lisa poured wine. Mark brought out the first course, a soup that steamed dramatically in the cool air from the windows.

"I should say why we're here," Samantha said, and the conversations that had been starting died away. "I told most of you different things when I invited you. Some of you I told I had news about a breakthrough in therapy protocols for BCICS withdrawal. Some of you I told I wanted to discuss the new legislation. Emily, I told you I had documents about the animal communication project."

Emily set down her spoon. "And none of that was true."

"All of it was true," Samantha said. "But none of it was the reason."

Victor was watching her with an expression she recognized from college, from late nights arguing about whether consciousness could be understood or only experienced. He knew what was coming.

"I wanted to have dinner with people who had opinions about BCICS," Samantha said. "Strong opinions. The kind you build identities around. And I wanted to see if those opinions could survive an evening of actual conversation."

"That's a dinner party," Max said. "You're describing a dinner party."

"I'm describing an experiment," Samantha said. "Which you all agreed to participate in when you signed the consent forms disguised as dinner RSVPs."

The silence that followed had texture. Samantha could feel them processing, deciding whether to be angry or intrigued.

"You're a psychologist," Olivia said finally. "You can't just experiment on your friends."

"You're right," Samantha said. "Which is why after tonight, we won't be friends anymore."

Emily laughed, sharp and sudden. "Jesus Christ, Sam. You could have just ghosted us like a normal person."

"I could have," Samantha said. "But I wanted data."

Lisa brought out the fish. Its eye stared up at the ceiling, accusatory and dead. Marcus reached for the wine and filled his glass to the rim. Victor had told her about Marcus, about the addiction and the failed treatments and the way Olivia had tried to save him by forcing him offline, which had only made everything worse.

"So we're specimens," Alexander said. He did not sound particularly bothered. "What are you hoping to observe?"

"Hypocrisy," Samantha said. "The gap between what people say they believe and what they actually believe when pressed."

"You think we're hypocrites," Eva said, her voice level, curious rather than defensive.

"I think everyone is a hypocrite," Samantha said. "I think BCICS was supposed to fix that, and instead it just moved the hypocrisy somewhere else."

"Offline," Victor said.

The conversation had structure now. People needed structure the way they needed oxygen. Without it they panicked, revealed themselves in ways they would regret later. With it they could perform almost indefinitely.

"Let's start with connection," Emily said, pulling out a recorder and setting it on the table. "Are we more or less connected than we were before BCICS?"

"That's not a real question," Alexander said. "Connection is quantifiable now. We can measure the density of neural links, the frequency of consciousness overlap, the depth of shared experience. By every metric we're more connected."

"By every metric you've chosen to measure," Olivia said. "What about the feeling of being known without having to explain yourself? What about privacy not as a legal concept but as a space where you can be uncertain without being observed?"

"Privacy is just another word for isolation," Eva said. She had been quiet until now, and Samantha watched the others react to her voice, watched them remember she had not been invited. "People used privacy to hide, to lie, to maintain versions of themselves that didn't match reality. BCICS ended that. And yes, it's uncomfortable. Growth is uncomfortable."

"This isn't growth," Marcus said, words blurred at the edges. "This is replacement. We're not becoming better humans. We're becoming something else."

"Why is that bad?" Max asked. "Evolution doesn't care about single generations. Maybe we're transitional. Maybe we're the bridge between what we were and what we're supposed to become."

"Supposed to according to who?" Olivia asked. "You? The government? The companies that subsidize this technology because it makes populations easier to control?"

"Easier to protect," Max said. "Crime is down seventy percent. Education outcomes have improved across every demographic. People are living longer because preventative medicine actually works when you can monitor neural patterns in real time."

"People are living longer," Victor said, "which means legacy doesn't matter anymore. Which means nobody's planning for a future they won't be around to see."

Alexander shook his head. "Legacy was always narcissistic. The idea that your genetic material should persist after you're gone is just ego dressed up as philosophy. If we can extend life indefinitely, we don't need legacy. We can just keep living."

"And keep consuming," Emily said. "Keep producing. Keep feeding the system that depends on us staying plugged in."

"You say that like it's sinister," Eva said. "But maybe the system is just reality now. Maybe fighting it is like fighting gravity."

"Gravity doesn't have quarterly earnings," Victor said.

Lisa cleared the fish. The plates came back with white bones and olive pits, the residue of consumption. Mark brought out the main course, lamb that had been cooked so long it fell apart at the touch of a fork. They used food as punctuation in the argument, as a way to create pauses without admitting they needed to think.

"Tell me about intimacy," Samantha said. "Eva, you've been with your partner for four years. You've been connected through BCICS for three of those years. Has it made you closer?"

Eva's expression shifted, became private in a way that seemed impossible for someone who lived her life in transparent communion with millions of others. "Yes," she said. "And also no. We know each other completely now. Every doubt, every fleeting attraction, every moment of boredom or frustration or fear. There's no mystery left. And I thought that would be freeing, but mostly it's exhausting."

"Because you can't lie," Olivia said.

"Because I can't lie to myself," Eva said. "The things I used to be able to ignore, I can't ignore them when he can feel them happening in real time. So we just exist in this state of total knowledge, which is supposed to be the highest form of love, but sometimes I think it's just a different kind of loneliness."

Marcus was staring at his plate. "I talk to simulated profiles," he said. "Not people. Constructs designed to make me feel understood without the requirement that I understand them back. And it's better than being alone. It's better than unplugging and feeling like I've been skinned."

"The withdrawal," Samantha said.

"It's not just withdrawal," Marcus said. "It's the silence. When you've been connected, when you've felt yourself as part of something larger, coming back to just yourself feels like amputation. Like you're this tiny, limited thing locked in a body that can barely process what it means to exist."

"That's addiction talking," Olivia said, but her voice was gentle.

"That's reality talking," Marcus said. "The addiction is just what we call it when reality becomes unbearable without technological intervention."

"So we're all addicts," Alexander said. "Everyone who uses electricity, everyone who relies on medical technology, everyone who can't navigate without GPS. We've been technological beings for centuries. BCICS is just more efficient."

"Efficient at what?" Emily asked. "At making us more productive? At extracting value from consciousness itself?"

"At making us honest," Max said. "Which you'd think would be universally valued, but apparently honesty is only good when it's selective."

"Honesty without context is just cruelty," Olivia said. "Knowing every fleeting thought someone has about you, every moment of doubt or irritation or desire for something else, that's not intimacy. That's surveillance."

"It's reality," Eva said. "Those thoughts existed before BCICS. We just couldn't see them."

"And maybe there was a reason for that," Victor said. "Maybe privacy wasn't about hiding. Maybe it was about protection. About having space to be uncertain without being judged for it."

"But you are judged for it," Alexander said. "That's what minds do. They judge. BCICS just makes the judging symmetrical. You know what I think of you, I know what you think of me, and we can both decide whether the relationship is worth maintaining based on complete information rather than carefully curated performances."

"And if complete information makes all relationships impossible?" Samantha asked.

The question hung there. Lisa refilled the wine glasses. Mark cleared plates. The city outside continued its evening rhythm, lights shifting in patterns that suggested consciousness even though most of that consciousness was elsewhere, distributed across networks that turned individuals into nodes.

"I broke up with my husband through BCICS," Emily said. "I felt the exact moment he understood it was over. Not the words, not the conversation, just the knowledge appearing in his consciousness like a headline. And he felt my relief, which was worse than if I had lied about it. The honesty didn't make it easier. It just made it impossible to pretend we were both devastated."

"But you were pretending before," Eva said.

"I was allowing him dignity," Emily said. "Which apparently is the same as lying now."

"Dignity is a performance," Alexander said. "BCICS ended performance. That's literally the point."

"The point according to the people who profit from it," Victor said. "The companies that sell honesty as a product and connection as a service and charge the government to monitor the collective consciousness for signs of dissent."

Max set down his fork with unnecessary force. "You're describing crime prevention. You're describing a society where violence is predicted before it happens, where radicalization is stopped before it spreads, where children are protected because their abusers can't hide their intentions."

"I'm describing thought police," Victor said. "Dressed up in the language of safety and connection and collective good."

"Those aren't contradictory," Max said. "Safety requires monitoring. Connection requires transparency. The collective good requires individual sacrifice."

"Individual sacrifice," Samantha repeated. "That's what we're calling it now."

"What would you call it?" Max asked.

"Compliance," Samantha said. "Enforced through technology and justified through fear."

"Fear of what?" Eva asked. "Fear of a world where people can't hurt each other through deception? Fear of actually having to live with the consequences of our thoughts instead of hiding them behind politeness?"

"Fear of a world where interiority doesn't exist," Samantha said. "Where every thought is public and every feeling is data and every human becomes a transparent container for information that can be extracted and monetized and controlled."

"But we chose this," Alexander said. "Nobody forced BCICS on anyone. We chose transparency. We chose connection."

"Did we?" Victor asked. "Or did we choose a technology that promised to solve loneliness, and the transparency was just the price we paid without understanding what we were giving up?"

The lamb was gone. The vegetables had been consumed. Mark brought out dessert, something architectural and unnecessary, sugar shaped into forms that suggested meaning without delivering it.

"I need to tell you something," Samantha said. "About the animal communication project."

Alexander looked up. "You said you had documents."

"I lied," Samantha said. "But I have been following the research. And the early results are disturbing in ways I think you've been suppressing."

"I haven't suppressed anything," Alexander said.

"You've been describing pets as sociopaths in your internal communications," Victor said. "You've been finding that most animals, when you can finally understand what they're thinking, turn out to be operating on principles that would be diagnosed as pathological in humans."

"They're not human," Alexander said. "Of course their psychology doesn't map to ours."

"But BCICS was supposed to create connection," Samantha said. "It was supposed to reveal the deep commonalities between consciousness itself. And instead it's revealing that consciousness is more alien than we thought. That even the animals we've lived with for thousands of years are fundamentally other in ways we can't reconcile."

"So what?" Eva asked. "So animals aren't moral beings. We already knew that."

"But we didn't know it," Olivia said. "We projected onto them. We used them to practice love and loyalty and trust. And now we're learning that they never felt any of that. They were just responding to stimuli in ways we misinterpreted as affection."

"Which is probably what we do too," Marcus said. "Respond to stimuli and call it love."

The silence that followed had weight, had the feeling of something ending or beginning or both.

"Is that what you think?" Samantha asked. "That love is just misinterpreted stimulus response?"

"I think BCICS showed us that what we called love was usually just projection," Marcus said. "And now that we can see each other clearly, we're finding out that most of us don't actually like what we see."

"Speak for yourself," Eva said, but her voice lacked conviction.

"I am," Marcus said. "That's the problem. I'm always speaking for myself now. There's no distance between what I think and what I say. And it turns out that most of what I think is bitter and small and afraid. And everyone can see it. And they can see me seeing them see it. And it never ends."

"Then unplug," Olivia said.

"I can't," Marcus said. "You know I can't. The withdrawal would kill me. Not metaphorically. Actually kill me."

"That's not confirmed," Alexander said.

"It's confirmed enough," Victor said. "You've had sixteen deaths among long-term users who attempted to disconnect. You're calling it coincidence, but the pattern is clear."

"What pattern?" Max asked.

"BCICS changes brain structure," Victor said. "After a certain point, the organic tissue can't function independently anymore. It needs the network. And when the network goes away, the tissue fails."

"That's speculation," Alexander said.

"That's what your own data shows," Victor said. "Which is why you're fighting the disclosure requirements. Which is why you're lobbying to keep the long-term studies classified."

"I'm lobbying to prevent panic," Alexander said. "To give us time to develop solutions before people start making decisions based on incomplete information."

"Like the decision to disconnect before it becomes impossible?" Emily asked.

"Like the decision to reject a technology that has improved billions of lives based on edge cases and worst-case scenarios," Alexander said.

"Billions of lives," Samantha said. "Or billions of users? Because I'm not sure those are the same thing anymore."

The dessert plates were empty. Mark brought coffee, which seemed absurd given the time and the tension, but people took it anyway. Lisa stood against the wall, watching, and Samantha wondered what she thought of all this, whether she would go home and tell her children about the rich people arguing about whether consciousness was worth protecting.

"I want to tell you why I really organized this dinner," Samantha said.

"Something more than the experiment?" Emily asked.

"The experiment was real," Samantha said. "But it wasn't the only reason. I wanted to see if any of you could defend your positions without hypocrisy. If the people who claim BCICS is saving humanity could explain why they unplug for important conversations. If the people who claim it's destroying humanity could explain why they keep using it. If any of you could survive an evening of the kind of honesty you claim to value."

"And?" Victor asked.

"And you can't," Samantha said. "None of you can. You're all performing. Max supports BCICS because the government pays him to. Alexander defends it because his career depends on it. Eva loves it because it gives her a community she couldn't find in the physical world. Marcus hates it because it made him confront how unhappy he is. Olivia fights it because it took her brother away from her. Emily writes about it because controversy generates traffic. And Victor undermines it because he's always needed to be the person who sees through things."

"And you?" Victor asked. "What's your interest?"

"I thought I could use it to become whole," Samantha said. "I thought if I let people see everything, all the trauma and loneliness and fear, that somehow the exposure would heal me. But it didn't. It just turned my pain into data. And now I'm going offline permanently, and I wanted to do this first. To show all of you what you look like from the outside. To make you see yourselves the way you force everyone else to see you."

"This is cruel, Sam," Olivia said.

"This is honest," Samantha said. "Which is what you all claim to want."

Eva was crying, which surprised Samantha more than anything else that had happened. Eva, who lived her entire emotional life on public display, who had made a career out of transparency.

"You don't understand," Eva said. "None of you understand what it's like to be connected. To feel yourself as part of something infinite. And then to come back to just yourself and feel how small that is. How insufficient. I'm not defending BCICS because I'm weak. I'm defending it because I've experienced something beyond individual consciousness, and now individual consciousness feels like a cage."

"Then stay connected," Samantha said. "Nobody's stopping you."

"You are," Eva said. "All of you are. By making it shameful. By calling it addiction. By treating expansion of consciousness like it's a disease instead of an evolution."

"Because it is a disease," Marcus said. "Because it hollows you out and fills you with something that isn't you and calls it connection."

"How do you know it isn't you?" Alexander asked. "How do you know the isolated self is the real self and the connected self is the impostor? Maybe we've had it backwards. Maybe the lie is individualism, and BCICS is just showing us the truth we've been avoiding."

"The truth that we're not individuals?" Olivia asked.

"The truth that we never were," Alexander said. "That selfhood is a useful fiction we maintained because we didn't have the technology to transcend it. But now we do. And fighting it is like fighting the fact that the earth revolves around the sun."

"But the reality is that people are dying," Emily said. "The reality is that users can't disconnect without potentially fatal consequences. The reality is that BCICS companies are lobbying to make connection legally mandatory for certain professions. The reality is that we're creating a world where opting out might not be possible."

"Good," Max said, and the word landed like a stone. "Opting out shouldn't be possible. Not if the collective good requires participation. Not if the benefits are universal and the only people who suffer are those who refuse to contribute."

"Contribute what?" Victor asked. "Our consciousness? Our autonomy? Our right to be unknown even to ourselves?"

"Your willingness to be part of something larger," Max said. "Which is what society has always required. BCICS just makes it literal."

Samantha stood. The movement felt enormous in the suddenly quiet room. "I want to show you something." She took out her phone and projected a video onto the wall. It showed the dinner from above, from cameras she had installed without telling them. It showed Marcus drinking too much and Eva crying and Max defending positions he had criticized in private conversations with Samantha. It showed Alexander checking his phone every thirty seconds and Emily recording without permission and Olivia touching her brother's arm in a gesture that suggested forgiveness she had not yet voiced.

"This is what honesty looks like," Samantha said. "Not what you feel in the moment. What you do when you think nobody's watching. Which is exactly what BCICS was supposed to eliminate, but it didn't. It just moved the performance somewhere else."

"You recorded us," Emily said. "Without consent."

"You signed consent forms," Samantha said. "You just didn't read them because you trusted me."

"This is violating," Olivia said.

"This is transparent," Samantha said. "This is what you all claim to want. Total visibility. Complete honesty. No privacy. Well, here it is. How does it feel?"

Victor was smiling, which Samantha had expected. He had known this was coming, had helped her plan it, had understood from the beginning that the point was not to win the argument but to make everyone lose it.

"You're making a point about hypocrisy by being hypocritical," Max said. "You're criticizing surveillance while surveilling us. You're condemning violation while violating our trust."

"Yes," Samantha said. "Because that's what BCICS does. It violates you and calls it connection. It surveils you and calls it honesty. And you all defend it or attack it based on whether you're winning or losing the game, not based on any actual principle."

"So what now?" Alexander asked. "You've exposed us. You've shown us we're all flawed and inconsistent. What does that prove?"

"It proves that the question isn't whether BCICS is good or bad," Samantha said. "The question is whether we're willing to sacrifice everything that makes us human to become something else. And if we are, we should at least be honest about what we're sacrificing."

"Which is?" Eva asked.

"The right to be unknown," Samantha said. "Even to ourselves. Especially to ourselves. The right to contain contradictions. To change our minds. To be wrong privately before we're wrong publicly. To learn without being monitored. To fail without it becoming data. To be human in all the ways that humans have always been human, which includes being inconsistent and hypocritical and irrational and afraid."

She disconnected her phone from the projector. The image disappeared, leaving only the reflection of their faces in the dark windows.

"I'm going offline tomorrow," she said. "Permanently. And yes, I know the risks. I know I might die. But I'd rather die as myself than live forever as a fraction of something I don't understand and didn't consent to becoming."

"You did consent," Max said. "You signed the terms of service."

"I signed a document I didn't read that gave away rights I didn't know I had," Samantha said. "Which is what consent looks like in a world where everything is designed to make you give it up without understanding what you're losing."

The dinner was over. People stood, uncertain whether to leave or stay, whether to be angry or grateful or both. Lisa and Mark began clearing the table with the efficiency of people who had seen too much and learned to process it later. Victor approached Samantha, and for a moment they stood together looking out at the city, at all those separate containers of consciousness that were not separate anymore, that had been opened and connected and merged into something that might have been beautiful or terrible or both.

"Did you get what you wanted?" Victor asked.

"I got data," Samantha said. "Whether I wanted it is a different question."

"Are you really going offline?"

"I don't know," Samantha said. "That's the first honest thing I've said all night."

The guests left in the same clusters they had arrived in, except Eva left alone, and Marcus and Olivia left together, and Emily stayed behind to ask Samantha whether she could write about this, and Samantha said yes because surveillance was only wrong when it was hidden, apparently.

The table was empty now. The whole foods and careful presentations reduced to scraps and waste. Mark washed dishes while Lisa wrapped leftovers that nobody would eat. The city continued its evening rhythm outside the windows, and Samantha stood in the space between the performance and whatever came after, between the version of herself she had been and the version she might become, and wondered whether the gap between them was freedom or just another kind of loneliness, another way of being disconnected in a world that had decided connection was the only value worth protecting.

She turned off the lights. In the darkness, the city looked like a brain, all those firing neurons that might have been consciousness or might have been something else, something that had no name yet because the language to describe it had not been invented, and might never be invented, because the people who could have invented it were too busy being part of it to step back and see what it had become.