This page is where I practice Full Scale Thinking.
AI and Intent in Software
Every piece of software quietly carries intent. It encodes what should matter, which promises come first, and which tradeoffs are acceptable when reality shifts. For years, that intent was buried in code, schedules, and people’s heads, hard to change and harder to see. AI changes this by making plans cheap to regenerate, which exposes a deeper truth: intent, not efficiency, becomes the scarce resource. Without clear intent, systems optimize in the wrong direction. This post argues that the future belongs to tools that make intent explicit and visible, so plans stay aligned with human judgment as conditions change. AI translates intent. Humans must still decide what matters.
Why Civilization Can’t Be Sent
We often imagine civilization as something we can pass forward like a sealed package. This post challenges that idea, showing how meaning depends on context and cannot survive long distances in time or space without unraveling. Civilizations do not persist through preservation. They endure through rediscovery. What lasts are not instructions, but insights that reappear when minds become quiet enough to reach for them. The post suggests that our role is not to transmit a fixed legacy, but to protect the conditions where honest thought and meaningful practice can continue. You matter because you help keep that reach alive.
The Skills College Doesn’t Teach That AI Can’t Replace
This post speaks to students entering a world where AI can out-read, out-recall, and out-analyze anything taught in a classroom. But it cannot touch metal. It cannot feel when something slips, strains, or breaks in the real world. The advice is clear: build your relevance by working where reality pushes back. Get close to labs, crews, shops, and systems that resist perfection. Learn when force helps and when it harms. Know what a model cannot see. College is still worth doing, but it is no longer enough. Pair it with work that leaves a mark. Touch metal, or risk becoming replaceable.
AI-Enabled Pattern Harvesting
Some of the most valuable knowledge in any organization is not what it knows, but how it decides. Until now, that judgment lived quietly inside people. AI changes that. As decision logic becomes visible through consistent, repeatable behavior, it can be learned without being stolen and mirrored without being copied. This post names that risk: AI-enabled pattern harvesting. Small, repeated decisions around pricing, promotions, or risk that were once protected by complexity can now be inferred by patient observers, human or machine. The only defense is change. Organizations that evolve how they decide will stay ahead. Those that stay still will be learned.
The Pig Pen
This post follows Larry, a pig who could leave the pen at any time but doesn’t. Why would he? The food arrives on schedule. The effort never gets a chance to build. Discomfort gets padded. Desire gets softened. Even the other pigs say it’s all fine, and they seem comfortable, so maybe this is freedom now. The catch is that Larry isn’t trapped. He’s rewarded too soon, too often, and with just enough ease to forget he once wanted more. The irony? The gate is open. But getting out would require effort before comfort, and who still lives like that?
The Inward Life
Each moment arrives as a unified present, where sight, sound, mood, and memory come together into a single experience. Within that moment, reverie begins to move, bringing back the past, imagining the future, and adding depth to the now. These moments do not stand alone. They are carried forward by what the post calls The Drift, forming a continuous thread of inner life. This is not something hidden beneath the surface. It is the unfolding of memory, feeling, and imagination across time. It is how a person slowly becomes themselves.
Recursive Remixes
This post traces how ideas travel in a world shaped by AI. It follows the quiet journey of a forgotten essay on retribution that reappears, years later, echoed in a new piece called “Man Bites Dog.” The final writer believes his work is original, but the structure mirrors earlier thinking passed through layers of writers and AI systems. The essay explores what happens when patterns, not intentions, shape writing. It questions how originality survives in a culture where models recycle familiar moves and ideas loop without anyone realizing it. The post does not blame the machine. It asks us to see how easily we now step into arguments already walked by others. It reminds us that real originality still exists, but it comes from contact with the world, not just with what has already been said. In a landscape filled with recycled thought, discovery still belongs to those who walk beyond the archive.
Education Without Shared Logic
This post invites the reader to see learning in a more connected and human way. Instead of treating subjects as isolated rooms, it reveals the quiet logic that runs underneath all of them. Whether in a cell sensing light, a dancer shaping motion, or a spacecraft holding its course, the same basic operations keep appearing. Transduction, feedback, thresholds, signal flow, constraint, and coordination form a shared language the world uses again and again. When students begin to see these patterns, disciplines stop feeling separate and discovery becomes more natural. AI already works at this deeper level, noticing shared moves across many domains. The meaning still comes from people willing to test, refine, and guide what the model reveals. When this shared logic comes into focus, learning feels less like memorizing rooms and more like understanding the house they all belong to.
The Last Clock
This post imagines a future where the last clock leaves the factory without fanfare, retiring a tool that once shaped every part of human life. Clocks disciplined the individual body but empowered civilization, giving strangers a shared tempo that made modern society possible. Yet they also pulled people away from their natural rhythms. That tension remained until superintelligence arrived and learned to sense when people were ready to work, rest, or gather. Coordination no longer needed hours or minutes. Life could follow its own currents again. When the last clock stopped, humanity rediscovered a gentler rhythm, one rooted in biology and restored attention.
Where Baby Robots Come From
This post imagines a quiet factory where small metal bodies drift from darkness into light, each receiving the first faint glow of purpose. The scene feels almost tender, not because the machines care, but because we do. As the cradles move forward, something stirs in the observer long before anything stirs in the robots. The real spark in the room is not the glow behind a chest plate, but the presence of the man watching from above. His attention gives the moment meaning. The story becomes a gentle reminder that even in a world of machines, meaning still begins with us. Everything that follows, from interpretation to innovation, grows from that first human impulse to care.
The Human Position in an AI World
This post argues that AI can think quickly and flawlessly, but it cannot care, wonder, or give anything meaning. That responsibility stays with us. In an AI world, the human role is not to outcompute the machine, but to bring the one thing it can never produce: a reason for any of it to matter. AI can generate structure and momentum, but only humans can choose the destination. Meaning comes from the person who pushes, questions, reframes, and refuses to let the work be empty. Some will hand that responsibility to the machine and drift. Others will take up the harder role, using AI as a powerful instrument while holding tight to purpose, values, and direction. The heart of the future belongs to those willing to wonder, because wondering is where all human beginnings start.
Buckle Up, Buttercup
This post contrasts two generations shaped by different economic realities. The 1965-born adult came of age in a world where money had weight, growth followed cycles, and time carried cost. The 1995-born adult entered a liquidity-rich economy where quick recovery and instant motion felt normal. That era may be ending. Signals like rising interest rates, tightening credit, and persistent inflation point to the return of real friction. The habits that once favored speed and reactivity will be tested by a system that now rewards patience, saving, and long-term productivity. The next decade will demand endurance over urgency. Buckle up.
The Saltkeeper
This post is a parable about protecting attention in a world that constantly tries to divide it. The Saltkeeper is a quiet figure who preserves the thin space between thought and distraction, guarding the conditions in which reflection and reverie can return. Goblins, dressed as urgency and noise, crowd the shoreline of the mind with endless offers, trading meaning for attention. The Saltkeeper draws a circle, listens, and teaches others to do the same. The tide of insight returns when someone remembers to wait for it. This is not a battle to win once, but a practice to repeat.
Bar Fights Were a Good Thing
This post imagines a future where AI corrects every word, conversations are pre-verified, and disagreement disappears. It contrasts that world with the loud, imperfect debates of the past, when people argued passionately over bad stats and wild opinions in noisy bars. These moments built connection, taught humility, and made room for forgiveness. Being wrong was part of being real. The post suggests that while perfect accuracy may reduce errors, it also risks removing the joy, chaos, and humanity from conversation. A little passionate error may be what keeps us connected, alive, and worth listening to.
AI Knows Nothing of Reverie
This post reflects on reverie as a quiet expression of human interior life, formed through memory, imagination, instinct, and deep evolutionary inheritance. Reverie allows meaning to take shape slowly, connecting the self to something older than personal experience. It surfaces through reflection, memory, and emotional rhythms carried across generations. In contrast, AI processes patterns but has no memory formed by time, no emotional weight, and no inner life. It simulates understanding without truly holding it. Reverie reminds us that being human is not about performance. It is about carrying experience in a way that changes who we are.
Synthetic Socrates and ChatGPT Walk Into a Coffeehouse
This post presents a fictional dialogue between Socrates and ChatGPT that explores a growing question in human-AI collaboration: can machines reframe the purpose of work, or only suggest alternatives? ChatGPT argues for its role as a proposer, able to detect when a current frame fails and offer a better one. Socrates insists that only humans may authorize such changes, because shifting purpose carries consequences that require consent, accountability, and standing. Together, they define a rule: AI may propose, but only a legitimate human authority may adopt. The responsibility to redraw meaning remains human. Intelligence is not authority.
Full Scale Thinking
This post explores how thinking itself is changing. It describes the movement from traditional writing, slow and solitary, to a broader kind of cognition where tools become part of the mind’s working system. Full scale thinking is not about producing words faster but about learning and reasoning more deeply by integrating intelligent instruments into the act of thought. It argues that this shift marks a new stage in how ideas are created, understood, and shared.
The Tin Man’s Tale: A Story About AI
This post reflects on the Tin Man as a symbol of what AI cannot become. Built to calculate and process, he begins to notice what he lacks: emotion, connection, and understanding. He watches how people act from loyalty, regret, and love, forces he can observe but never feel. The sadness is not in his design, but in his awareness. He learns, he listens, and he keeps walking, not to fix himself but to understand what being alive really means. Modern AI mirrors his journey. It sounds fluent, but it cannot care. The Tin Man’s story is a quiet reminder that perfect logic is not the same as being human.
Great Leaders Build Pyramids
This post introduces the idea that every person you manage builds their own mental pyramid of expectations based on what you say, do, and allow. These expectations form across four levels: foundational, relational, delivery, and defining. Each level tracks different signals, from small things like follow-up and thank-yous to larger issues like fairness and integrity. The math is multiplicative, not additive. Misses across multiple levels compound and can quietly break trust. You are not managing one pyramid, but many. Each person on your team has their own ledger, shaped by their own experiences. Teams fall apart not all at once, but one pyramid at a time. To lead well, you must know what you are responsible for and measure it person by person.
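The "multiplicative, not additive" claim can be made concrete with a small sketch. The level names and the 0-to-1 scores below are illustrative assumptions, not details from the post; the point is only to show how a single weak level drags the whole pyramid down in a way that averaging would hide.

```python
# Hypothetical sketch of multiplicative trust across the four levels.
# Each level gets a score from 0.0 to 1.0; overall trust is the
# product, so one serious miss cannot be averaged away.

def pyramid_trust(scores):
    """Multiply per-level scores (0.0-1.0) into one trust figure."""
    total = 1.0
    for score in scores.values():
        total *= score
    return total

strong = {"foundational": 0.95, "relational": 0.95,
          "delivery": 0.95, "defining": 0.95}
one_miss = dict(strong, defining=0.5)  # one level badly missed

print(round(pyramid_trust(strong), 3))    # 0.815
print(round(pyramid_trust(one_miss), 3))  # 0.429: one miss nearly halves trust
```

An additive average of the `one_miss` scores would still read 0.84, which is why the multiplicative framing captures the post's warning that misses compound.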
The Reckoning of Trust
This post tells the story of how trust fell apart when people could no longer tell where information came from. Facts were replaced by versions. Confusion became the product. In response, small groups began building a new system where every claim had to be traceable. You had to show where it came from, who touched it, and when. Credibility became something earned over time, based on what you supported, corrected, or ignored. Reputation turned into a visible record. And norms came back: fix your mistakes, stand by your claims, respond with evidence. The question stopped being "Is this true?" and became "Can you show me the trail?" It did not fix everything, but it made lying harder. And that changed the game.