The Challenge
Two Questions That Shape My Approach
I work in AI every day. I spend my weeks navigating autonomous agents, security architectures, and enterprise adoption curves for a world that's changing faster than any framework can track. Then I come home to the same question every parent is facing, whether they realize it yet or not: what does education mean when machines can do most of what we spent the last century teaching humans to do?
Nobody has figured this out. Not schools, not governments, not the parents I know who work in tech. Everybody's improvising. I don't have a curriculum for this—I have principles I'm iterating on and practices I'm rooting in as a dad of two girls. I'm sharing them because friends keep asking.
Question One
What skills and traits should we encourage so our children can be successful adults in a world transformed by AI?
Question Two
How can I use AI to be more present with my children, not less?
These questions are connected. The way I use AI in our family life is itself a form of teaching. Kids learn from watching what we do, not just what we say.
Theme One
Raising Humans When the Rules Are Changing
For generations, we optimized education for knowledge retention. Memorize the facts. Learn the formulas. Accumulate expertise through years of study. That made sense when access to information was scarce and expertise was hard to come by.
Knowledge retention is now the least valuable skill. When anyone can access expert-level knowledge through natural language, the bottleneck shifts. The question becomes: what do we nurture when access to information is no longer the limiting factor?
What matters now
- Curiosity over memorization
  The drive to explore, question, and dig deeper—this is what AI amplifies.
- Critical thinking
  Evaluating outputs. Spotting nonsense. Knowing when something doesn't add up.
- Creativity
  The thing AI can enhance but can't originate. The spark that starts with "what if?"
- Asking good questions
  This is the new literacy. The quality of the question determines the quality of what you get back.
- Emotional intelligence
  Collaboration, empathy, reading a room—the deeply human skills AI can't replicate.
- Comfort with tools as partners
  Neither magical nor threatening. Just powerful instruments that reward skill.
We're not training them to compete with AI—we're raising them to collaborate with it, to direct it, to know when to trust it and when to push back.
This is a fundamental shift in what it means to prepare a child for adulthood. The goal isn't to make them AI-proof—it's to make them AI-fluent. Comfortable with the tools, confident in their own judgment, clear on when to lean in and when to question what they're seeing.
Theme Two
AI as a Parenting Amplifier
The Human-Centered AI philosophy isn't just something I talk about at work. It's something I try to live at home. If AI should augment humans rather than replace them, then the question for parenting becomes: how do I use these tools to be more present with my kids, not less?
The answer I keep coming back to: let AI handle the prep work so I can focus on the connection.
Project
No Thank You Evil — AI-Assisted Adventures
My daughters love tabletop roleplaying games. The problem? Creating adventures takes time I don't always have. Writing NPCs, balancing encounters, building worlds—it's creative work, but it's also preparation work that competes with actually playing together.
How it works
- AI generates adventure frameworks, NPC personalities, and encounter ideas
- I review and adjust to fit what my daughters are excited about
- The prep that used to take an hour now takes minutes
- That freed-up time becomes actual play—imagination, laughter, connection
Built with intention
- Trained on age-appropriate developmental psychology and neuroscience research
- AI tailors tasks, roles, and challenges to each child's developmental stage
- Puzzles and encounters calibrated to stretch without frustrating
- The tool is shaped by real expertise—not generic prompts
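To make the workflow concrete, here is a minimal sketch of the generate-then-review loop described above. Everything in it is illustrative: `draft_adventure` is a hypothetical stand-in for the actual AI call, and the age thresholds are placeholders, not real developmental research.

```python
from dataclasses import dataclass

@dataclass
class Adventure:
    theme: str
    npcs: list[str]
    encounters: list[str]

def draft_adventure(theme: str, child_age: int) -> Adventure:
    """Stand-in for the AI generation step; a real version would prompt
    an LLM with the developmental guidance baked into the prompt."""
    # Calibrate puzzle complexity to rough age bands (illustrative
    # thresholds only, not actual developmental psychology).
    steps = 1 if child_age < 6 else 2 if child_age < 9 else 3
    encounters = [f"{steps}-step puzzle guarding the {theme} treasure"]
    npcs = ["friendly gnome guide", "riddle-loving dragon"]
    return Adventure(theme=theme, npcs=npcs, encounters=encounters)

# The human review step: the draft is a starting point, and I adjust
# it to fit what my daughters are actually excited about before we play.
adventure = draft_adventure("enchanted forest", child_age=7)
adventure.npcs.append("talking fox my daughter invented last week")
```

The shape matters more than the details: the machine produces a calibrated first draft in seconds, and the human edit at the end is where the parent's knowledge of the specific child comes in.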
I'm not a neuroscientist or developmental psychologist. But now those insights are built into every adventure automatically. The AI brings expertise I don't have, calibrating challenges to where each child actually is—not where I guess they might be. That's one more thing I don't have to research, don't have to second-guess, don't have to hold in my head while I'm trying to be present.
The "fully polished" adventure might take hours to prepare from scratch—and even then, I'd be guessing about developmental appropriateness. The AI-assisted version gets me 70% of the way there in minutes, with the developmental expertise built in. That's more than enough to have a magical evening with my kids.
The AI handles the scaffolding. I bring the presence.
The goal isn't to automate parenting. It's to automate the things that aren't parenting so I can do more of the things that are.
The Deeper Connection
Why Specification Matters
The single most important finding from watching autonomous agents operate in the real world is that the quality of the output is determined by the quality of the human specification. I've written about this in the context of security—the difference between an AI agent that negotiates a great deal and one that sends 500 unsolicited messages to the wrong people is the clarity of the human's intent. Same technology. Same architecture. The gap is the spec.
That insight scales to education uncomfortably well. You don't get to write a good spec for something you don't understand. You can't evaluate AI output in a domain where you have no knowledge. You can't exercise judgment—taste, discernment, critical thinking—about work you've never engaged with deeply enough to internalize.
I've been writing about something I call the invisible operating system—the vast substrate of tacit assumptions, social contracts, and unstated values that every human processes automatically. AI lacks this substrate. The quality of what AI produces depends almost entirely on the quality of what humans can specify. That's true in security architectures. It's true in organizational leadership. And it's true at the kitchen table with my daughters.
Which means the foundations—reading, math, writing, the cognitive work of struggling with hard things—aren't the boring prerequisites before the exciting AI part. They're the investment that makes AI collaboration actually work.
Going Deeper
A Voice I Trust on This
I follow Nate B. Jones closely for his work on AI strategy and implementation. He recently published a talk that is the deepest exploration I've seen of the practical side of this parenting challenge—and it resonates strongly with how I'm approaching things at home.
Recommended Watch
Nate B. Jones — Parenting in the Age of AI
Nate lays out the case for building cognitive foundations before giving kids AI leverage—the calculator analogy, specification as the new literacy, why kids need to debug their own intent before they debug code. Practical, grounded, and worth every minute if you're a parent wrestling with these questions.
His framing of "foundation before leverage" aligns with what we're doing at home. The idea that the struggle of learning—reading real books, doing math by hand, writing with effort—is itself the cognitive infrastructure that makes AI collaboration possible later. Not a nostalgic preference. An investment.
Where I find myself extending his thinking is the connection to my professional work. The specification quality that separates a useful AI agent from a dangerous one is the same capacity we're trying to build in our kids: the ability to articulate what you actually want, clearly enough for a system to act on it. That's a human skill, practiced manually, and it starts long before anyone opens a chat window.
Bringing It Together
Modeling the Relationship
Here's what I've realized: the way I use AI in front of my daughters is part of preparing them for an AI-changed world. They're watching. They see me ask questions, evaluate answers, push back when something doesn't make sense, and use these tools to create things we couldn't create alone.
They see AI as a collaborator, not a replacement. As a tool that rewards curiosity and good questions. As something that amplifies what we bring to it.
That's the lesson I want them to carry forward. Not fear of being replaced. Not blind trust in machine outputs. A healthy, productive partnership—human creativity and judgment, amplified by computational power.
The same philosophy, whether I'm at work or rolling dice at the kitchen table.