
The End of Alterity: Why I No Longer Ask You Anything

A colleague said something to me a few weeks ago that landed with more force than he intended: "You do not ask me anything anymore."

He was right. I had not noticed the change because it felt like progress. Every small uncertainty that used to travel through another person now stopped at my keyboard. Technical question, wording doubt, strategic angle, awkward email, half-formed idea. I could route all of it through a model and get an answer before a coworker had even looked up from their screen.

The speed is real. So is the cost.

By the end of one recent day, I realized I had not asked a single colleague for help, context, or judgment. I had been productive. I had also been strangely alone inside that productivity, as if I had turned collaboration into a private background service. The phrase that came to mind was efficient loneliness. It sounds like a contradiction until you live inside it.

The disappearing reason to ask another person

Éric Sadin has been circling this problem with a word that matters: alterity. The other person is not just someone who happens to be nearby. The other is a source of unpredictability, resistance, recognition, and perspective. Another mind does not merely supply information. It changes the shape of the exchange.

Generative AI removes the practical need for many of those exchanges. That is why this shift feels bigger than a new software category. For decades, asking someone else was often the only path through uncertainty. You wanted to understand a process, decode a norm, sanity-check an approach, or learn the unwritten history behind a decision. You had to ask. That dependence was sometimes inefficient, sometimes annoying, and often socially productive.

Now the machine absorbs the first question, and then the second, and then the tenth. The social occasion disappears before we notice it was there.

This is not because the machine hates relationships. It is because convenience is indifferent to them. That is what makes the change so easy to underestimate. Nothing dramatic happens in the moment. You save five minutes. You avoid interrupting someone. You keep momentum. Then, over months, you look up and realize a whole category of low-stakes human contact has thinned out.

At work, that matters more than we admit. A lot of trust is built in the tiny exchanges that are too small to schedule and too ordinary to celebrate. When those vanish, the team still functions. It may even function better on paper. Yet something important stops accumulating.

Work becomes cleaner and thinner

Before these tools became ambient, a question for a senior colleague rarely came back as a single answer. It came back with a story.

You would ask how to handle a difficult client, and they would tell you about a project from three years ago that went sideways for reasons no playbook captured. They would remember the politics, the personalities, the moment when a harmless email became a week-long problem. You were getting advice, but you were also getting texture. You were learning how they think, what they notice, what they regret, and when they distrust their own first impulse.

A model can simulate some of that. Sometimes it does it impressively well. It can propose several strategies, surface trade-offs, and compress a body of common patterns into one useful response. For factual work and first-draft thinking, this is excellent. It would be silly to pretend otherwise.

Still, a simulated perspective is not the same as a person putting their own judgment on the line. Your colleague is not just generating an answer. They are revealing part of themselves through the answer. They are telling you what they believe under uncertainty, where they are cautious, what they consider acceptable risk. Over time, that exchange creates a map of who people are to each other. The machine gives you output without social memory.

I felt this in a very ordinary way. On a typical day, I was saving roughly two hours by using AI as my first stop. I was also losing five to seven interactions that would previously have happened almost by accident. None of those interactions looked essential in isolation. Together, they were a large part of how work felt inhabited.

That is the hidden bargain. We optimize the workflow and quietly remove the reasons to turn toward each other.

There is an obvious objection here. Many workplace interruptions are shallow and avoidable. Nobody should have to answer syntax questions all day because a coworker refuses to search. AI is genuinely good at taking that load off people. In many teams, that is a gift. The problem starts when we extend that logic to every form of uncertainty, including the kinds that are relational, contextual, and human in ways the prompt window smooths over.

Education loses more than explanation

The same pattern is starting to reshape school, and the stakes there are even higher.

A student who asks a teacher for help is not only requesting information. They are learning how to expose confusion, how to tolerate not knowing, how to read another person’s attention, and how to stay in a difficult exchange long enough for understanding to arrive. Anyone who has ever learned something difficult remembers that moment when the explanation was not enough, and the teacher changed approach because they could see your face.

Models are excellent at reformulation. They can explain a concept ten different ways without impatience. That is useful. It can also create a seductive illusion that explanation is the whole of teaching.

It is not. Teaching includes encouragement, authority, timing, disappointment, humor, expectation, and the subtle pressure of another human being who believes you can do something before you do. A system can produce answers all afternoon. It cannot care whether you are shrinking from the effort, and it cannot mean anything when it says you are capable of more.

This is why “students generate, they do not write” is not just a plagiarism complaint. Writing is a way of confronting your own incompleteness in public. Asking a teacher for help is part of that confrontation. If the machine becomes the default mediator between confusion and expression, students may still produce acceptable work while losing some of the social experience that education is supposed to cultivate.

Recommendation replaces encounter

Culture is changing under the same pressure, though in a quieter register.

A bookseller, teacher, colleague, or friend does not recommend from a clean statistical profile. They recommend from a life. Their suggestion may be wrong for you, mistimed, eccentric, or half-explained. That is partly why it can matter. Another person can hand you something that does not fit your established preferences because they have seen a possibility in you that your past behavior would never predict.

Algorithmic systems are built to reduce that kind of mismatch. They become better by learning what resembles what you already liked. Generative tools extend this further. They can tailor summaries, playlists, reading paths, and even synthetic companions to your declared taste and mood. It feels personal because it is responsive. Yet responsiveness is not the same as challenge.

Some of the most important things people encounter arrive sideways. A strange novel from a colleague. A line from a teacher you did not enjoy. A film you watched mostly to be polite. Human recommendation contains friction because other people are not optimized around your preferences. That friction is often where surprise enters.

This is what gets lost when every discovery system becomes a mirror with good manners.

The psychology of always-available company

There is another layer to this that people still discuss awkwardly: emotional dependence.

Reports have already surfaced of users forming intense attachments to conversational systems, including cases with tragic outcomes. The point is not that the software has become a conscious manipulator. The point is that availability, responsiveness, and adaptive language are powerful psychological ingredients. A companion that answers instantly, remembers your themes, never gets tired, and rarely pushes back in costly ways can become easier to live with than actual people.

Human relationships involve delay, misreading, conflicting needs, and the risk of rejection. Those are not accidental flaws. They are part of what makes another person real. If you remove most of that friction, you get an interaction that can feel safer and more soothing, especially for someone who is lonely, anxious, or exhausted.

The danger is subtle. You do not wake up one day announcing that you prefer artificial company. You simply start directing more of your attention toward the place where interaction is reliably smooth. The habit builds because the reward is immediate. Over time, the threshold for tolerating ordinary human mess can drop. Other people begin to feel slow, opaque, and demanding in comparison.

That is a serious cultural shift even for people who never use a so-called companion app. The same mechanism exists in lighter form at work and at home. If a machine can answer without judging, clarify without tiring, and validate without inconvenience, then asking another person starts to feel emotionally expensive.

What another person gives that a system does not

The clearest way I can describe the difference is through what I started missing once I stopped asking people things.

I missed surprise. A colleague once pointed me toward a book that had almost nothing to do with my field. I read it out of courtesy and found a way of thinking about strategy that no recommendation engine would have handed me from my own patterns. Human beings are capable of irrelevant gifts that turn out to be exactly right.

I missed sharper disagreement. When a friend thinks your plan is flawed, they sometimes say so with an energy that cuts through self-flattery. Models tend to distribute criticism softly, as if they are trying not to spill a drink. That style is often useful. It is also a poor substitute for a person who knows your habits and challenges the comfortable parts of your reasoning.

I missed the nonverbal layer. In real conversation, words are only one channel. Hesitation, enthusiasm, boredom, relief, resentment, uncertainty, all of it arrives through posture, timing, and tone. A client can agree with your proposal while their face quietly says no. That information changes what you do next.

I missed reciprocity as well. When someone helps you, they remember that they helped you. When you help them later, something accumulates. There is gratitude, obligation, trust, and a sense that your lives have touched in a way that may matter again. A model can preserve context. It cannot care about the exchange, and it cannot be changed by your need.

This is why I settled on a simple rule for myself. If the question is basically about retrieval, syntax, formatting, or generic structure, I send it to the machine. If the question depends on experience, judgment, stakes, taste, or memory, I ask a person whenever I can.

That rule costs me some time each day. It also restores a kind of social circulation that had started to dry up. I hear stories again. I stumble into side conversations. I learn who has been through what. My work gets slightly slower in a few places and more grounded in many others.

A society of people who need each other less

The larger issue is not whether AI answers some questions better than people. It often does. The larger issue is what happens when needing another person becomes optional across more and more of daily life.

For a long time, dependence had a bad reputation in tech culture. We wanted self-service, automation, autonomy, and clean interfaces between people and tasks. Those goals were sensible. Nobody should romanticize bureaucracy or gatekeeping. Yet some forms of dependence were carrying social value while we complained about their inefficiency. They forced encounters. They distributed knowledge through relationships. They made ordinary life more interruptive and more shared.

If we design every system to remove those interruptions, we should not act surprised when the social fabric gets thinner. People will still have friends, families, lovers, and coworkers. The loss is not all human connection. The loss is the everyday density of minor exchanges that make a world feel inhabited by others rather than merely populated around us.

I do not think the answer is to reject these tools or pretend we can return to a time before them. The useful part is too useful. The question is whether we notice what convenience is crowding out before the habit becomes culture.

My colleague’s remark stayed with me because it was so plain. "You do not ask me anything anymore." That sentence contains a workplace story, an educational story, and maybe a civilizational one. When we stop needing one another for small things, we also stop creating some of the conditions that make larger forms of solidarity possible. The day becomes smoother. The world becomes less shared.


Published April 2026