The model dropped on a Wednesday. Kate found out the way she found out about most things that would reshape her life — in a Slack thread, between a standup summary and a link to someone's farewell post.
She read the announcement on her phone in the parking lot of her daughter's orthodontist. Nora was inside getting her expander adjusted. The March sun was weak through the windshield and Kate had eleven minutes before the appointment ended, so she read the technical report the way she read most technical reports now — the abstract, the benchmarks, the eval tables. She looked at the SWE-bench numbers for a long time. Then she closed the tab and opened the weather.
That evening, at dinner, her husband Miles asked her what she thought about it. Miles was a principal engineer at a company that made compliance software for credit unions. He asked the question the way people asked questions when they already had the answer and wanted to see if you'd arrived at the same one.
"I think the evals look good," she said.
"The evals look good," he repeated, and then said nothing else, and Nora talked about a girl at school who had been caught using Claude to write a book report about a book that didn't exist, and everyone laughed, and Kate loaded the dishwasher and thought about the fact that her engineering team was forty-three people and her gross margin was sixty-one percent and those two numbers were, for the first time in her career, parsing as related.
She was the Chief Product Officer at Aldian, a B2B platform that helped mid-market industrial distributors manage pricing and quoting. Seventy-one million in ARR. The kind of company that didn't get written about in the press and didn't want to be. Their customers were people who sold fasteners and hydraulic fittings and they needed the software to work and they needed the phone answered when it didn't. Kate had been there four years. She liked it in the way she imagined long-married people liked each other — less with passion than with a deep, proprietary understanding of the thing's specific shape.
The week after the model dropped, her CEO, Raj, asked her to "take a look" at what it could do. He said it casually, at the end of a one-on-one about something else entirely, the way you'd ask someone to check if a window was locked.
She understood what he was asking.
She set up a sandbox on a Friday afternoon. She gave the model the spec for a quoting rules engine they had estimated at six weeks of work for two senior engineers. She gave it their style guide, their API docs, their test framework. She was careful and thorough because she wanted to be fair and because she was aware, in a way that felt almost physical, that she wanted it to fail.
It didn't fail. It produced the implementation in nine minutes. Not nine hours. Nine minutes. The code was clean. Not inspired — she wouldn't have called it elegant — but clean, idiomatic, well-tested. It looked like the work of a competent senior engineer who had no ego and no opinions and an unlimited capacity for following instructions precisely.
She ran the test suite. Everything passed. She read through the code line by line, looking for the kind of subtle architectural mistake that would only surface at scale, the kind of thing a staff engineer would catch in review. She found two minor issues. She told the model about them. It fixed both in under a minute and explained why its original approach had been suboptimal with a clarity that reminded her, for a disorienting moment, of the best post-mortems she'd ever read.
She closed her laptop and went outside and stood in her backyard for a while. The neighbor's kid was jumping on a trampoline. The dog needed water. She filled the dog's bowl and then she texted Raj: Can we talk Monday.
She did not text Miles.
The Monday conversation with Raj lasted twelve minutes. He had already talked to the CFO. He had already talked to the board chair. Kate understood that her sandbox experiment had been one of several, conducted in parallel, by people who had been asked casually, at the ends of meetings about other things.
Raj said the phrase "workforce planning" and Kate heard it land in the room like a piece of furniture being moved into a space that wasn't quite ready for it. They agreed to a pilot. Two teams. One running the traditional sprint cycle, one running what Raj kept calling "the new workflow," which meant one engineer and the model. They would compare velocity, defect rate, customer satisfaction. Eight weeks.
"We're not trying to prove anything," Raj said. "We're just trying to understand."
Kate nodded. She knew that when executives said they weren't trying to prove anything, the conclusion had already been reached and the process existed to make the conclusion feel arrived-at rather than imposed.
The pilot was a formality. She knew it and Raj knew it and she suspected that at least some of the engineers knew it too, because the good ones were already using the model on their own, quietly, the way people in the early days of GPS still kept an atlas in the trunk.
The results came back in six weeks, not eight, because the AI-assisted team had run out of backlog. They had completed everything assigned to them and then, without being asked, had started picking up items from the next quarter's roadmap. The solo engineer on that team, a woman named Priya who had been at Aldian for two years, told Kate in their debrief that she felt "weird." Kate asked her to say more. Priya said she felt like she was a contractor managing a subcontractor who was better than her at everything but didn't know what mattered. She said the hardest part of her day was deciding what to ask for. She said she went home less tired but more drained.
Kate wrote that down. Then she put it in a drawer, mentally, because the velocity numbers were very clear and the defect rate was lower and the customer satisfaction scores were identical, which meant that no one on the outside could tell the difference, which was the only finding that actually mattered.
In May, Raj made the decision. They would restructure engineering from forty-three people to nineteen, then to twelve by end of year. The reductions would be called a "reorganization around AI-native development." Kate was asked to lead the communication plan. She said yes because it was her job and because saying no would not have prevented it from happening and would have merely meant that someone else did it with less care.
She told Miles that evening. He was grilling chicken thighs and listening to a podcast about the Ottoman Empire. She stood in the doorway and said, "We're cutting thirty-one people by December."
Miles turned off the podcast. He didn't turn off the grill.
"How do you feel about it?" he asked, which was the question therapists asked, and she understood that he was asking it carefully because some version of this conversation was happening in his own building, in his own Slack threads, and whatever she said next would also be a data point in his own private calculation about how afraid to be.
"I think the work will still get done," she said.
"That's not what I asked."
"I know."
The first round of layoffs happened on a Tuesday in June. Kate sat in on every call. Raj had offered to handle them, but she said no. These were her people. She had hired nine of them personally. She had written their performance reviews and approved their promotions and she felt, with a conviction that she recognized as possibly irrational, that it mattered who said the words.
The conversations were fifteen minutes each. She had a script. HR was on the line. The severance was generous — sixteen weeks, extended benefits, outplacement support that included, and she noted this with the part of her brain that couldn't stop pattern-matching, an "AI upskilling" module. Several people cried. One person, a backend engineer named Daniel who had been at the company for seven years, asked her a question she thought about for months afterward.
"Do you think I did something wrong?" he asked.
"No," she said. "You were excellent."
"Then I don't understand," he said, and she could hear that he genuinely didn't, that the framework he had built his career inside of — work hard, get better, be reliable, be excellent — had quietly been made irrelevant while he was inside it, following its rules.
After the last call she drove to the grocery store because they were out of milk. In the parking lot she sat in her car for five minutes with the engine running. Then she went inside and bought milk and a bag of clementines and a bottle of wine that was more expensive than what she usually bought, and she understood that the wine was a small, private severance she was paying herself.
By September the company felt different in ways that were hard to describe and easy to measure. Revenue was up eleven percent. Engineering spend was down forty-four percent. Gross margin had crossed seventy percent for the first time in the company's history. The board was pleased. Raj used the word "inflection" on the Q3 call and no one pushed back on it because the numbers made it true.
Kate's product org had changed shape. She still had product managers, though two of the four had been quietly moved into what the company was calling "solution architecture" roles, which meant they spent most of their day specifying requirements with a precision that had previously been unnecessary. The specs had to be exact now. Not because the model couldn't handle ambiguity — it could, often better than a junior engineer — but because ambiguity introduced variance, and variance meant reviewing more output, and reviewing output turned out to be the new bottleneck. Kate found herself telling her PMs things she had never said before, things like "be more literal" and "don't leave room for interpretation." She was aware that she was training her team to write for a machine the way technical writers had once trained themselves to write for an audience, and she wasn't sure whether this was a skill or a loss.
The twelve remaining engineers were different people than the forty-three had been. Not literally — eight of them had been part of the original team — but their jobs had changed so completely that their titles felt like artifacts from a previous era, like calling someone a "computer" because they used to do arithmetic by hand. They reviewed code. They evaluated architectural decisions. They maintained a sense of the system as a whole, which the model could not do, or could do only in the way that a very detailed map can represent a city — accurately, comprehensively, without understanding what it's like to live there.
Priya, who had been on the pilot team, was now the most senior engineer. She had a new title — Staff Engineer, AI Systems — and a salary that was fifteen percent higher than before, and she told Kate in their October one-on-one that she was thinking about leaving.
"Why?" Kate asked.
"I don't build anything anymore," Priya said. "I just decide things and check things. I feel like a building inspector. I used to be an architect."
Kate didn't try to talk her out of it. She understood the feeling, or a version of it. Her own job had become less about shaping a product and more about shaping a process. The product was fine. The product was, by most measures, better than it had been. Features that would have taken a quarter now took three weeks. The backlog, which had been a constant source of tension between her and sales for four years, had effectively disappeared. Customer requests went from intake to production in days. She should have felt liberated. Instead she felt the way she imagined a conductor might feel if the orchestra could play perfectly without being conducted — still standing at the podium, still holding the baton, but aware that the music would sound the same if she put it down.
Miles was laid off in October. His company didn't use the word "layoff." They called it a "strategic realignment of engineering investment." He came home at two in the afternoon on a Thursday and Kate was on a call and she saw him through the glass door of her home office, standing in the kitchen, still wearing his badge, and she knew before he said anything.
That night they sat on the patio after the kids were asleep. Miles was calm. He had sixteen weeks of severance and they had savings and the mortgage was manageable and he said all of this in a measured way, like he was reading from the same script Kate had read from in June, and she realized that everyone was reading from the same script now, that there was a shared language for this experience that had developed with the speed and efficiency of the technology that had caused it.
"I'll find something," he said.
"Of course you will," she said.
But they both knew the market had changed. Kate saw the resumes. She was hiring for one role — one — a senior engineer to manage their AI pipeline, and she had received two hundred and forty applications in the first week. A third of them were from people with ten or more years of experience. Staff engineers. Directors. People who had, eighteen months earlier, been turning down recruiters. She interviewed five and hired one and the other four sent gracious follow-up emails that she could tell had been written by the same model that had taken their jobs, and she thought about that recursion for a while and then stopped thinking about it because there was nowhere productive for the thought to go.
Nora, who was fourteen, asked at dinner one night if she should still learn to code. She was taking AP Computer Science in the fall. She asked it casually, the way she asked most important questions — while looking at her phone, as if the answer didn't matter.
Kate said yes. Miles said yes. They said it in unison, automatically, and then caught each other's eyes, and in that glance was the shared recognition that they had answered from instinct rather than conviction, that they had given the parental answer rather than the honest one, and that they would not revisit it in front of her.
Later, after Nora was in her room, Miles said, "What would you have said if she'd asked whether she should become a software engineer?"
Kate rinsed a glass. "She didn't ask that."
"She did, though."
Kate set the glass on the rack. She could hear the neighbor's sprinkler system start up, the programmed click and hiss of water hitting concrete before finding the grass.
"I would have said it depends on what kind of engineer she wants to be," Kate said, and she heard how insufficient that was even as she said it, how it was the kind of answer that deferred the real question into a future that was arriving faster than the deferral assumed.
In January 2028, a model came out that could do what Kate did.
She learned about it the same way she'd learned about the first one — obliquely, in passing, through a link someone shared without commentary, the way people had started sharing these announcements, the way you'd forward an article about a storm that was still offshore.
It wasn't marketed that way, of course. The release blog post talked about "autonomous product reasoning" and "strategic prioritization from unstructured inputs." But Kate read the technical demos and understood what she was looking at. You could give it a corpus — customer interviews, usage data, support tickets, competitive analysis, revenue figures — and it would produce a product strategy that was, if not good, then coherent in a way that would have taken her team two weeks of synthesis to achieve. It could write PRDs. It could prioritize a roadmap. It could look at a feature request and trace its implications through a system architecture and a pricing model simultaneously, which was something Kate had always considered a rare and specifically human skill, the ability to hold business logic and technical logic in the same thought.
She didn't set up a sandbox this time. She didn't need to. She read the demos and understood them the way a doctor understands a diagnosis — not with surprise, but with the grim clarity of a pattern completing itself.
What she noticed, in the weeks that followed, was how quickly the question changed. In 2027, the question had been can AI do this work? By early 2028, the question was why would a human do this work? It was a different question. The first one was empirical. The second one was existential, and it didn't have a technical answer, and people kept trying to give it one anyway.
Raj scheduled a board meeting for February. Kate prepared materials. She was not asked to present a strategy for integrating the new capabilities. She was asked to present a "forward-looking organizational design," which she understood to mean a plan for her own diminishment, and she prepared it thoroughly because she was good at her job and her job now apparently included drafting the spec for her own replacement.
The board approved a structure in which Kate's four-person product team would become Kate and a model. She would retain the title. She would retain the salary. She would be, in the language of the approved plan, "the human in the loop." She had used that phrase herself, months earlier, about her engineers. She noted the irony with the detachment of someone who had used up their capacity for that particular feeling.
What surprised her was how little her day-to-day changed. She still made decisions. She still talked to customers. She still sat in meetings with Raj and the head of sales and debated pricing tiers and contract terms. But the texture of her work had shifted. She no longer produced analysis; she evaluated it. She no longer built roadmaps; she approved them. The model generated options and she selected from them, and the selection felt meaningful in the moment and somewhat arbitrary in retrospect, because the options were usually close together, and the model's reasoning was usually sound, and she was increasingly aware that her "judgment" was becoming a thin layer of human preference spread over a substrate of machine competence, like butter on toast that didn't need it.
Around this time she started noticing things outside of work.
The woman who cut her hair had been replaced by a different woman. Kate asked what happened to Vanessa. The receptionist said Vanessa was "transitioning." Kate assumed this meant she'd moved to another salon until the new stylist, making conversation, mentioned that her previous salon had closed because the booking platform they used had started offering AI-generated style recommendations that linked directly to tutorial videos, and half their clients had started cutting their own hair. She said this cheerfully, the way people say things that would have been strange a year ago and have already become part of the scenery.
Nora's AP Computer Science class had been restructured. They were no longer learning to write code. They were learning to "specify and validate computational solutions," which was a phrase that appeared in the new AP curriculum framework and which meant, in practice, that Nora was learning to describe what she wanted a program to do and then evaluate whether the generated program did it. She seemed to enjoy it. She was good at it. Kate watched her do homework one evening — Nora talking to her laptop, refining a specification for a sorting algorithm through conversation, never touching a keyboard — and felt something she couldn't name, a feeling adjacent to pride but shadowed by something else, a sense of watching her daughter become fluent in a language that had no literature yet.
Miles had not found a job. He had interviews. He had second interviews. He had conversations with recruiters who were enthusiastic and then went quiet. The market for senior engineers had not collapsed — it had inverted. Companies wanted one or two very senior people to oversee AI-driven development, and they wanted them at a fraction of the previous cost, because the leverage had shifted. Miles was competing for roles with people who had been VPs. He started taking contract work, reviewing AI-generated codebases for compliance and security, which paid well by the hour but had no trajectory, no arc, no sense of building toward something. He did it at the kitchen table and Kate could see him through the glass door of her office the way he had once seen her, and neither of them commented on the symmetry.
One evening in March — a full year since the first model — they went to a dinner party at their friends David and Lina's house. David was a corporate attorney. Lina was a radiologist. The dinner conversation moved to AI the way dinner conversations now always moved to AI, with the gravitational inevitability of water finding a drain. David said his firm had reduced its associate class by sixty percent. Lina said her group had been "restructured" around a diagnostic model that was outperforming second-year residents on imaging reads. She said it casually. Everyone said everything casually now.
Kate drank her wine and listened and noticed that the conversation had a quality she recognized from early pandemic dinner parties — the same performed normalcy, the same careful calibration of alarm, the same unspoken agreement not to say the thing that everyone was thinking, which was: I don't know what I'm going to be doing in two years.
Driving home, Miles said, "David looked terrible."
"I know," Kate said.
"Lina too."
Kate turned onto their street. The houses looked the same. The lawns were mowed. The porch lights were on. Everything looked the same and she thought about how long everything can look the same after the thing that made it possible has changed, the way a bridge can look solid for years after the supports have corroded, the way you don't know it's failing until something crosses it that it can no longer hold.
The summer of 2028 was when it stopped being about technology and started being about time.
A new model dropped in June. Another in August. Kate stopped reading the technical reports. She read the summaries, the way people read storm warnings — not for the meteorological detail but for the trajectory, the estimated landfall, the list of things to bring inside. The August model could manage projects. Not assist with them. Manage them. It could take a quarterly objective, decompose it into workstreams, assign tasks, anticipate dependencies, adjust timelines based on real-time progress, and escalate risks with a judgment that Kate's best program managers would have called excellent if it had come from a person. It didn't come from a person.
Aldian's headcount was thirty-one. It had been a hundred and fourteen eighteen months earlier. Revenue was up twenty-three percent. Kate had stopped tracking the margin because looking at it made her feel something she didn't want to examine — not guilt, exactly, but a cousin of it, the awareness that she was standing on ground that had been cleared of people she knew and that flowers were growing there now and that the flowers were very beautiful.
She managed a team of three. Herself and two others — a designer named Soo-yun who had been with the company since its Series A, and a new hire, a twenty-four-year-old named James whose title was "AI Product Strategist" and whose actual job, as far as Kate could tell, was to be young enough to not find any of this strange. James had never worked in a product org that had more than five people. He had never written a PRD that wasn't co-authored by a model. He was competent and pleasant and Kate sometimes watched him work with the feeling of an immigrant watching a native speaker — admiring the fluency, aware of the accent she could never quite lose.
In July, Raj told her that the board was considering an acquisition offer. A larger company, a competitor, wanted to buy Aldian. The multiple was extraordinary — fourteen times ARR, almost double what it would have been two years earlier. Kate asked why the premium. Raj said it was because Aldian's cost structure was "best in class," which meant that they had cut deeper and faster than most and the resulting margin profile made them attractive in the way that a lean animal is attractive to a predator or a mate, depending on your perspective.
Kate asked Raj what would happen to the team. Raj said the acquirer had their own product function. He said this looking at a point slightly to the left of her face.
She understood.
That same month, Nora got her first job. She was fifteen. She worked at a frozen yogurt shop in a strip mall off Lamar. Kate drove her to her shifts and watched her tie her apron and thought about how the shop was one of the few places where the work was still what it appeared to be — a person standing behind a counter, handing things to other people, the transaction irreducibly physical, irreducibly human. The shop's accounting was automated. Its inventory management was automated. Its scheduling was automated. But someone still had to hand a cup of yogurt to a nine-year-old, and Kate found this comforting in a way that embarrassed her slightly, the comfort of a person clinging to the particular.
Miles had stopped looking for engineering roles. He was doing consulting work, which meant he was doing what Kate was doing — evaluating AI output, making judgment calls, providing "the human layer," which was a phrase that had entered the lexicon the way "gig economy" had a decade earlier, describing a new reality in language that made it sound intentional. He was making less money than before. Not dramatically less. Enough less that they had renegotiated the family's unspoken budget — fewer dinners out, the cheaper wine again, a conversation about whether Nora really needed the orthodontist visits to continue at the current frequency that was about money but was conducted entirely in the language of dental health.
The acquisition closed in September. Kate received a retention package that would keep her employed for six months while the "integration" proceeded. She knew what integration meant. She had been on the other side of it. She had been the person writing the integration plan, deciding which roles were "redundant" and which were "critical," and she knew that these words were not descriptions but decisions dressed up as observations.
On her last real day — the day her responsibilities officially transferred to a product lead at the acquiring company, a man she'd met twice on Zoom who seemed competent and tired — she cleaned out her desk. She didn't have much. A mug. A notebook she hadn't written in for months. A framed photo of Nora and her younger son, Ben, at a pumpkin patch, squinting into the sun. She put these things in a canvas bag and walked through the office, which was mostly empty because most people worked from home now and the ones who came in sat in an open room that had been designed for eighty and held, on a busy day, nine.
Soo-yun was there. They hugged. Soo-yun said, "What are you going to do?" and Kate said, "I don't know yet," and Soo-yun nodded in a way that meant she was about to say the same thing about herself soon and they both knew it.
Kate drove home. It was a Tuesday. The drive took fourteen minutes. She passed a billboard for an AI tax preparation service that said Your Life Is Complicated Enough and a closed Mattress Firm and an open Mattress Firm and a new restaurant she'd been meaning to try and a gas station where a man was standing next to his truck staring at his phone the way everyone stared at their phones now, with the posture of a person receiving information they hadn't asked for and couldn't stop.
She got home and put the canvas bag on the kitchen counter and let the dog out and stood in the backyard. It was warm. September in Austin. The heat hadn't broken yet and the grass was dry and the sky was the particular blue of central Texas in early fall, a blue that didn't know or care what the economy was doing, and she stood there for a while, in the yard, in the afternoon, on a Tuesday, and she thought: I am forty-four years old and I am very good at something that no longer requires a person.
She went inside and started making dinner because the children would be home soon and they would be hungry and that, at least, was a problem she knew how to solve.
The retention period was a strange season. She went to work three days a week and did very little. The acquiring company had its own product function, its own AI pipeline, its own version of James. Kate attended meetings where she was introduced as "Kate, who's been invaluable during the transition," which was the corporate equivalent of a retired jersey — honorific, non-functional, hung where people could see it on their way to doing the actual work.
She used the time the way she imagined academics used sabbaticals. She read. She took walks. She noticed things.
She noticed that the Starbucks near Aldian's office had reduced its hours. Not closed — reduced. Open at six, closed by one. She asked the barista, a man named Tomás who had made her oat milk latte for three years, what had happened to the afternoon shift. He said there wasn't enough foot traffic anymore. The office park that fed the Starbucks had four buildings. Two were fully leased. One was half-empty. One had a banner that said FLEXIBLE WORKSPACE SOLUTIONS which Kate understood to mean it was empty and pretending not to be.
She noticed that Nora's school had cut its robotics program. Not for lack of interest — for lack of funding. The district's commercial property tax revenue had dropped for the first time in eleven years, and the cuts were distributed in the way cuts are always distributed, which is to say they fell on the things that mattered most to the fewest people with the loudest voices, and robotics did not have loud voices. Nora was upset. Kate wrote an email to the school board and then deleted it because she couldn't figure out what she was asking for.
She noticed that the conversations at school pickup had changed. A year earlier the other parents had talked about renovations and vacations and youth soccer politics. Now they talked about those things and also, underneath, around, in the negative space, they talked about work in a new way. Not complaining about bosses or hours — the old liturgy — but something more tentative, more careful. A mother she knew slightly, a woman named Christine who had been a marketing director at a SaaS company, was now running an Etsy shop selling hand-poured candles. She talked about it with the particular enthusiasm of a person who needed you to understand that this was a choice.
A father named Greg, who had been a financial analyst, was getting his real estate license. He mentioned it once, briefly, and never brought it up again, and Kate could feel the weight of what he wasn't saying, which was that the real estate market in Austin was softening because the people who had driven it — the tech workers, the remote employees, the dual-income couples with equity compensation — were spending less and saving more and some of them were leaving, moving to smaller cities, to parents' houses, to places where the math still worked.
She noticed a new store on South Congress. It was called Handmade and it sold things that were made by people and that was its entire value proposition. The sign in the window said Everything here was made by a human being. It sold ceramics and cutting boards and leather wallets and knitted scarves and the prices were high and the store was always full and Kate went in once and looked at a hand-thrown coffee mug that cost forty-five dollars and thought about the fact that "made by a person" had become a luxury designation, like organic or free-range, a signal of something that used to be the default and was now a choice you made with your wallet to preserve a world you weren't sure still existed.
She noticed that Miles had started running. Not for exercise — Miles had always exercised — but with a new seriousness, a new structure. He ran every morning at six. He mapped his routes. He tracked his splits. He came home flushed and precise and she recognized what he was doing because she had done it herself in earlier seasons of uncertainty: he was building a domain where effort still correlated with outcome, where the inputs and outputs still made sense, where he could be better on Wednesday than he was on Monday and the evidence was incontrovertible.
In November, Kate went to Thanksgiving at her sister's house in Houston. Her sister, Laura, was a pediatrician. Her brother-in-law, Kevin, managed a logistics company. The dinner table included Laura's mother-in-law, who was a retired schoolteacher, and Kevin's brother, who sold commercial insurance, and his wife, who had just been laid off from an accounting firm that had reduced its audit staff by half.
The conversation was careful. No one said "AI" the way no one at a dinner party in 2009 said "underwater." But the shape of it was everywhere — in Kevin's brother talking about how his clients were renegotiating their coverage because their headcounts were down, in Laura mentioning that the pediatric practice was fine because "you still need a person to look in a kid's ear," and the way she said still hung in the air for a moment, the word doing more work than she'd intended. Laura's mother-in-law, who was eighty-one, asked what everyone was so worried about. She said it kindly. She said, "There have always been changes. We had the internet. We had computers. People adjusted."
No one argued with her. Kate watched the table absorb the comment the way a body absorbs a mild shock — a flinch, a reset, a decision to continue. She thought the old woman might be right. She thought she might also be wrong in a way that was impossible to explain without sounding hysterical, because the difference between this and the internet wasn't the technology, it was the speed, it was the fact that the interval between "this might affect my job" and "this has replaced my job" had compressed from a decade to a quarter, and human beings could adapt to almost anything but not to everything at once and not this fast.
She helped wash dishes. Through the kitchen window she could see the kids in the backyard, Nora and Ben and their cousins, playing a game that involved a football and rules that seemed to change every few minutes by mutual, chaotic agreement. She watched them for a while. They were not thinking about the economy. They were not thinking about the future. They were adjusting the rules as they went and it was working and everyone was still playing.