Everyone is telling you to upskill.

LinkedIn says the skills used in most jobs will change by 70% before 2030. The World Economic Forum estimates 39% of core skills will shift in the same window. Corporate training budgets are climbing. Coursera, Udemy, and a dozen platforms you've never heard of keep showing up in your feed. "Future-proof your career" has become background noise, the kind that makes you feel like you're already behind if you stop to think about it.

So you learn. You take courses. You watch conference talks. You earn certifications. You spend weekends on things that feel responsible. You're not slacking off. Nobody could accuse you of standing still.

But then Monday morning comes. You're in a meeting, presenting work that's the same quality as last year. The logic is sound. The slides are clean. No one pushes back. And yet the room feels different. Six months ago, the same presentation would have drawn a "Can you walk us through that again for the rest of the team?" Today, you get "Great, thanks" and the meeting moves on. The temperature in the room dropped by one degree. You can't point to what changed.

On the commute home, you open a search bar. Type a few words. Delete them. Type different words. Close the app. You don't know what to search for, because nothing specific went wrong. You're doing the same work, at the same level. Calling it "skill decay" doesn't match the feeling.

If any of this sounds familiar, you probably don't have a learning problem.

I'm a Notion Certified Consultant. I help companies design how work flows through their organizations. And I spent the second half of last year stuck in exactly this feeling. My consulting work hadn't gotten worse. My clients weren't complaining. But the responses were getting lighter, in a way I couldn't name. For six months, I assumed the problem was me, that my skills had gone stale. It didn't quite fit, but I didn't have a better explanation.

The explanation, when it came, pointed somewhere I wasn't looking. What had gone stale wasn't the skill. It was the question the skill was aimed at.

This issue traces the shape of that disappearance. Not a story about needing to study harder or learn new tools. A story about what happens when the question your answer was built for quietly moves to a place your answer can't reach.

For me, it crystallized on the train ride home from a client meeting.

I'd been talking about Notion. How to design database relations, how to set up views, how to keep the whole thing from rotting in daily use. Six months ago, that same talk would have gotten a "Yes, that's exactly the problem we've been having." Same slides, same order, same precision.

Today, the nods were lighter.

Not dismissive. The client was smiling, taking notes, asking questions. But six months ago the questions were "Could we apply this to the sales team too?" Leaning-forward questions. Today they were "So this toggles inside the view?" Checking questions. The type of question changed. The nods were just a centimeter shallower. Not a single word in my explanation had gotten worse. My delivery hadn't dulled. Only the reach had shrunk.

On the train home I started typing "Notion latest features" and stopped. I hadn't fallen behind on features. My precision was intact. My materials were intact. Something was missing, but I couldn't find what. Because nothing had degraded. I was giving the same answers, at the same quality. Calling it "my skill went stale" didn't fit the feeling.

Not stale, but not landing. Not worse, but lighter.

Let me swap one image. The answer didn't rot. It's swinging at empty air. Like a practice swing: the form is the same, the body is the same, but there's nothing at the point of contact. The ball moved to a different spot six months ago.

So what moved?

Same person, same day, opposite results

I want to bring in one piece of research here. Not as numbers, but as a physical sensation.

758 consultants at a large firm were given 18 tasks (Dell'Acqua et al., HBS WP 24-013 / Organization Science 2026). Half used AI, half didn't. Same firm, similar training, similar people. On one type of task, the AI-assisted group's quality rose 40%. Speed rose 25%. Among those who started with lower skill, the jump was 43%.

If it stopped there, the story would be simple: AI makes people better. Ordinary conclusion.

Here's the thing. When the same people tackled a different type of task, the AI-assisted group's accuracy dropped by 19 percentage points.

Same people. Same day. Same AI. Same training. The only variable was the type of task. A 40% boost flipped to a 19-point drop. Try explaining that with skill. You can't. The person didn't change.

As numbers, it looks strange. But translate it back to the lighter nods, and it makes physical sense. Same talk, same precision, but on one particular day the swing just doesn't connect. That feeling.

What happened? The task sat in a different place. Inside a certain line, AI and humans together can crack the question. The moment a question crosses that line, the same approach starts swinging at air. The line is drawn not inside the person but on the question's side.

The line is hard to predict from the outside. If you could tell in advance which tasks fall inside and which fall outside, nobody would struggle. You can't, so you charge in with the same approach, and only the result flips.

Researchers call this line the jagged frontier. I like the term, because the line isn't straight. One task is inside, the next is outside, the one after that is inside again. It zigzags. From the outside it all looks like "the same knowledge work," yet the contact vanishes only when you cross the line.

When I first encountered this term, the first thing I thought of was that day with the lighter nods.

The ball was the thing that moved

Back to that day. My explanation hadn't degraded. The client's ears hadn't gone bad. What moved was the position of the question this person wanted answered. Six months ago, the question was "Which features, in what order?" Today, the question was probably something else. It didn't have a name yet. But its location had definitely shifted.

The skill didn't move. The question did.

It took me six months to land on that sentence. Six months of thinking "I need to study harder" on the train home, not knowing what to study, and finally arriving at this phrasing. The arrow I'd been pointing at myself suddenly swung somewhere else.

The answer is still in my hands. But the question it was aimed at is no longer there. Instead of "I need to relearn," it becomes "Where did the question my answer was aimed at move to?"

Shift one arrow, and what to do next changes. Before deciding what to learn next, there's something to do first.

This is where the paid section begins.

Before deciding what to learn next, how do you check where your existing answers are aimed? I'll share three questions I run through in 30 seconds whenever the urge to rebuild a learning plan hits, and a Notion database template you can drop straight into your workspace. These aren't a precise diagnostic. They're more like a habit of looking at "what moved" before looking at "what to learn." This applies to knowledge work; care work and physical trades are a different story.
