The Training You Took Six Months Ago Is Already Outdated
The average American company spent $1,254 per employee on training last year. AI skills were the fastest-growing category. Seventy-five percent of organizations plan to spend even more on AI training next fiscal year. Reskilling budgets keep climbing. Certifications keep multiplying. Companies are investing more in knowledge transfer than ever.
Here is what no one puts in the budget memo: half of what gets taught in those sessions is outdated within six months.
The best practice from March is obsolete by September. The certification you earned last quarter covers a software version that no longer exists. The onboarding manual peaked in accuracy the day it was printed. After that, quiet decay.
Was the training wasted? Was the teaching bad? Probably not. The problem was not how the knowledge was delivered. It was what was being delivered. I spent nearly a decade as a certified Notion consultant, confidently handing over right answers at 50,000 yen an hour. It took one 30-person company to show me that those answers were perishable goods.
Six months of careful work
Two months of design. One month of migration. Three months of training. I built procedures, created templates, handled every edge case one-on-one. Each property was customized to fit that company's actual workflows. "Use a relation here instead of a select." "Set this view filter this way, and you will see only today's tasks when you open Notion each morning." One by one, over six months, I stacked these small, correct decisions.
On the last day, I watched everyone working inside Notion. Honestly, I felt a real sense of accomplishment. Six months of careful construction, running. I looked at that screen, then sent my invoice.
Three months later, they were back on spreadsheets.
I did not go check on them. The project manager emailed me. Subject line: "Quick question." The body: "We are thinking of going back to the old way." When they shared their screen, the database I had designed was still there. Last modified: two months ago. Next to it sat the familiar spreadsheets. No one had said anything. They had quietly drifted back.
I was one of the first certified Notion consultants in Japan. I was confident in my knowledge, and I think I delivered it with care.
"We did exactly what you taught us"
At first, I thought my teaching was the problem. Maybe I should have been more thorough. Maybe the documentation was not detailed enough. Maybe I needed more sessions. But no matter how many times I replayed it, that was not it. The documentation had been read. The notes were still there. It was not that they did not understand.
What I noticed was worse.
The "right design" I had spent six months delivering had gone stale. Notion had shipped several updates since the end of training. The property setup procedures I taught had changed. View behavior was different. Relation mechanics had shifted. Some of what I had confidently handed over as "the right answer" was no longer right.
The teaching was not bad. The answers had an expiration date.
This is not just a Notion story. The expense reimbursement flow someone taught you in onboarding: six months later, the system changed, and the procedure broke. Someone ran another training session for the new system. AI knowledge decays even faster. The prompt engineering advice from six months ago does not work the same way now. The model changed. The shelf life keeps shrinking.
The people at that company had followed my instructions faithfully. They set up properties the way I told them. They used views exactly as I showed them. They followed the rules. It stopped working. Because the right answer had moved. The more faithfully you follow an old answer, the more it locks you in. When someone told me, "We did exactly what you taught us, but it is not working," I did not know what to say. They did everything right. The answer itself had rotted.
I was selling right answers at 50,000 yen an hour. If those answers had a three-month shelf life, I was selling perishable goods. Beautifully packaged, carefully delivered, rotting in the fridge within ninety days. Worse, the rot was invisible. People kept using the old answers, feeling something was off but assuming they were still correct.
Perishable goods at premium prices. Realizing that hit hard.
Last month, someone reached out: "I asked ChatGPT how to design my Notion workspace, built it exactly as it said, and it does not work." I looked at it. The design was fine. Property types were correct. Relation direction was right. But it did not match how their business actually flowed. The answer was correct. The context was wrong.
"Why do you think it is not working?" I asked. A pause. "I do not know. ChatGPT said it was right."
If the answer is correct, it should work. But it does not. So you ask AI again. Same answer. Still does not work.

We are in an era where correct answers arrive in five seconds. The cost of acquiring information has dropped to nearly zero. But when everyone holds the same correct answer, something happens: some people make it work and others do not. Same answer, different outcomes. The ones who succeed chew on the answer in their own context. "In our case, this part works differently, so I would adjust it here." That extra step of rethinking is what separates the two groups. Not the quality of the answer. Whether someone stopped to rethink it for their own situation.
As answers get cheaper, the price of thinking goes up. As the speed of decay increases, the value of knowing how to think increases with it.
So what can you deliver that does not rot?
Over the years, seventy percent of the consulting requests I received were the same conversation. "How should I design my database?" "I do not understand when to use which view." "Can you check my permission settings?" Every one of these has an answer. I can deliver it. The moment I hand it over, the client feels relieved, and I feel like I have done my job.
But that answer has a short shelf life. Next year's Notion is not this year's Notion. Next year's AI is not this year's AI. What I call "the right answer" today might be unusable in six months. Do I keep delivering it, knowing that?
There is a specific discomfort in selling right answers. The client says, "Thank you, that was so helpful." Meanwhile, somewhere in the back of my mind: this will be outdated in six months. I push that thought aside. Move to the next consultation. Deliver the same kind of answer. Get thanked. It rots in six months.
If something survives, it is not the answer itself. Answers change. Tools change. Best practices get updated. So what stays intact even after the update?
I think the most important point has already landed. Right answers have a shelf life. Tools and workflows change every few months.
What follows is about what did not rot. Among all the workplaces I have seen, there were people who were fine when the right answer expired. In an era where AI gives answers away for free, what survives? And what I started doing after I stopped selling answers.
The rough database that outlasted my templates
There were people who stayed fine when the right answers went stale.
Something interesting happened at a manufacturing company I worked with. I went through my usual process: explaining, proposing designs, building a migration schedule. But one person had started using Notion on her own. She worked in accounting. She had begun entering her own expense reports into Notion, her own way. No one knew. No templates. No help docs. She just kept touching it and shaping it herself.
One day, the person sitting next to her asked, "What is that?" She showed him. He said, "Could we do our purchase order tracking like that?" It spread from there. What got used across the team was not my carefully designed templates but her structurally rough database.
Three months later, her team was the only one where Notion had stuck. The other teams had gradually drifted back from my polished designs to their old ways.
Here is what was interesting. When Notion shipped an update, the other teams said, "The procedure changed, I do not know what to do," and stopped. My right answers got old, and they could only wait for the next right answer. But her team adjusted on their own. "Something broke, so we tried changing it this way." They were not relying on handed-down answers. When the answers expired, they did not break.
What rotted was the right answers I delivered. What did not rot was the stance: touching it yourself, thinking about it yourself. That stance lasts far longer than any answer.
Training, AI, books: with anything you receive and use as-is, you stop the moment the right answer changes. Only what you have touched with your own hands, broken, and rebuilt survives the next change. Learning how to touch things lasts longer than learning the right answer.
But touching things yourself was not always enough.
Three lines and a screenshot beat my explanation
Something similar happened in the Noticon Discord.
I wrote a detailed thread explaining how to set up relations. Diagrams included. Accurate, I think. The response was flat. People read it. No one said anything.
That same week, a member posted: "I tried this and it worked." One screenshot and three lines. The thread came alive. Questions followed. Someone else replied, "I tried it too and it worked."
Honestly, it stung a little. But what those three lines delivered was different from what my explanation delivered. My explanation said: "Here is the right answer." Those three lines said: "I tried, I got stuck, and here is how I got it working." Not an answer. A trace of thinking. The people who received it could follow the trace and try it themselves.
When you hand over a right answer, the recipient uses it. When it goes stale, they stop. When you hand over a trace of thinking, the recipient thinks. When the answer goes stale, they go find the next one on their own.
The next time you learn something, try sharing not "here is what worked" but "I tried this, got stuck here, changed this, and then it worked." That alone turns a delivered answer into something that sticks. The unfinished process stays in people longer than the finished answer.
Touching things yourself. Sharing the process. Those two carry you a long way. But there was something even those could not reach.
"Something feels wrong"
Let me tell you about why I stopped doing Notion consulting.
There were several reasons. The decisive moment came during a conversation with a construction site manager.
He came to me for Notion advice. As I was walking him through my recommendations, he said, mid-explanation:
"Something feels wrong."
He could not say why. His gut told him something was off.
At first I thought it was the usual resistance to a new tool. But as I listened, I realized it was something else. This man had spent twenty years watching people and materials move on job sites. The design I proposed was correct from a data-flow perspective. But it did not match the order in which people actually moved on his site.
Specifically, I had placed database entry at the start of each process. Open Notion in the morning, enter today's tasks first. The data would flow cleanly. But on his site, the first thing people did each morning was check the previous day's progress. New task entry came after. My design meant the first screen every morning said "enter something." Twenty years of morning routine, disrupted.
His instinct was right. I had been designing around data flow. He was seeing human movement. Same Notion screen. Completely different things being observed.
That "something feels wrong" was not a judgment framework or a way of thinking you could teach. It was something that had seeped into this man over twenty years on-site. The way the morning light hit the materials. The order in which people moved. Things with no words that had always been there. No matter how carefully I designed, I could not hand that over from the outside. It was never something that could be delivered.
When you receive a new tool's instructions, when you memorize a procedure from training, when you get an answer from AI: if something feels wrong, do not ignore it. That discomfort is something in your context reacting. An answer can be correct and still not fit your situation. That judgment only comes from inside you.
What I did not deliver is what lasted
Looking back, something interesting emerges.
The accounting person who built the rough database on her own. The Discord member who posted a screenshot and three lines. The construction site manager who said "something feels wrong." All three worked. None were things I designed.
The accounting person started touching Notion without being told. The Discord member shared his own trial and error instead of the right answer I had written. The site manager rejected my design. All three happened naturally.
Here is the uncomfortable part. Everything I carefully designed and delivered rotted. What I did not deliver is what lasted. When you design with intention, you end up back in the same structure: me delivering the right answer. The more precisely you deliver, the more the recipient becomes "a person who receives." A person who receives stops when the right answer goes stale.
Touch it yourself. Share the process. Trust your discomfort. These three were not delivered to anyone. They came from inside each person.
What does not rot
So I stopped.
Honestly, delivering answers was easier. "This is the right answer," and the client relaxes. You get thanked. "What an expert," they say. The invoice is straightforward. Right answers are easy to price. But that relief rots in three months. The "what an expert" fades in three months.
In the Noticon Discord, I once threw out a question without giving an answer, and someone replied, "So what should I do?" Months later, in a different thread, someone wrote: "I tried that question-framing thing you mentioned, and it worked." The answer was not something I delivered. It was something they found on their own.
Maybe stopping was itself an answer. I could have kept looking for better ways to deliver. But the better I got at delivering, the more the other person stayed fixed as a receiver. That had been bothering me for a long time.
If you are receiving things right now from training, from manuals, from AI: is there something in there that will not stick unless you touch it with your own hands? Among the things you are using as-delivered, is there something that will not actually work unless you chew it over in your own context?
I stopped selling right answers that rot in six months. I am looking for what does not rot. Still looking. But writing about the search here every week might itself be something that does not rot. At the very least, it feels like more honest work than beautifully packaging answers and delivering them with an expiration date.
Sources
Exploring Internal Stickiness: Impediments to the Transfer of Best Practice Within the Firm -- Szulanski 1996. A study on the four barriers that make knowledge transfer difficult within organizations
The Diagnostic Role of Gut Feelings in General Practice -- Stolper et al. 2009. Research on how physicians' "something feels wrong" functions in diagnosis
Why Does Peer Instruction Benefit Student Learning? -- Tullis & Goldstone 2020. A study on why someone who just figured it out teaches better than an expert
Humanity's Last Exam: A Benchmark of Expert-Level Questions to Assess AI -- Nature 2025. A report on AI reaching expert-level answer generation
The End of the Knowledge Tax -- KP Reddy 2026. On the collapse of business models built on information asymmetry
ATD 2025 State of the Industry Report -- Association for Talent Development 2025. Average direct expenditure per employee: $1,254. AI training as the fastest-growing topic in 2024
