We're Raising a Generation of AI Button-Pushers (And It's Our Fault)
I’ve had several conversations over the last week about AI education (or the lack thereof). They’ve run the full spectrum, from business contexts to our kids. One thing is clear to me: if we don’t treat AI like a subject worth teaching, we’ll end up with a generation (and a workforce) that knows how to click buttons but not how to think critically about what comes out the other side.
Here's what's happening globally, what the US and Canada are (not) doing, and why the responsibility to teach AI falls on all of us.
Kids in Beijing Will Have Homework on AI
China isn't waiting. Starting in fall 2025, it is integrating AI into the national curriculum as a compulsory subject for all primary and secondary students. Children as young as six will learn about robotics, algorithmic thinking, and machine learning. Beijing requires at least eight hours of AI instruction per year for every student. The message is clear: AI literacy belongs right alongside math and science.
South Korea Is Moving Fast Too
South Korea aims to have AI coursework across all grade levels of its national curriculum by 2025, starting with high school. New AI textbooks arrived in March 2025 for grades 3, 4, 7, and 10, covering English, math, informatics, and Korean for special education. About 30% of South Korean schools already use AI-powered textbooks, with full implementation planned across additional subjects like social studies and science.
Saudi Arabia Just Joined the Race
For the 2025-2026 academic year, Saudi Arabia is introducing over 6 million students to a newly developed artificial intelligence curriculum, marking the country's first nationwide integration of AI in general education. This isn't a pilot program; it's a full national rollout.
Japan Is Writing the Playbook
Japan's Ministry of Education has already rolled out guidelines for generative AI in classrooms. Schools are encouraged to use it in English learning, group work, and accessibility support. But here's the kicker: usage is paired with explicit lessons on the risks, including privacy, misinformation, and the need to verify. They aren't just handing kids the tool; they're teaching them the why and how behind it.
US & Canada: Still in Pilot Mode
Here at home? No national AI curriculum.
United States: The Department of Education sent guidance to grantees on leveraging federal grant funds to improve education outcomes through AI. A few states, like California, have resource kits for local leaders. Adoption is patchy: RAND found most teachers are experimenting, not standardizing. Meanwhile, 86% of university students reported using AI tools to assist with their schoolwork, and the numbers keep climbing. The kids are already ahead of the system.
Canada: No nationwide program, either. Some provinces and school boards (Ontario, Alberta, BC) are offering guardrails and toolkits. Nonprofits are filling gaps with coding and AI youth programs, but there's no consistent framework yet.
Why This Matters (Yes, Even to Adults)
AI isn't just a school problem. It's a workplace problem, too.
One of my industry friends recently got an email that still included the ChatGPT dialogue. If we don't teach people to read critically, proofread, and edit, this is what slips through. That's not "using AI." That's outsourcing thinking. Left unchecked, it will dumb down our entire society.
I see this everywhere: presentations with obvious AI-generated bullet points that say nothing, emails that sound like they were written by a robot, and reports that regurgitate information without any human insight. We're training people to be passive consumers of AI output instead of active collaborators.
The Fix: Teach It Forward
Whether it's kids, coworkers, or clients, AI education should look a lot like digital hygiene: small, repeatable habits.
For schools & families:
Explain what AI is (pattern math, not magic; see the short sketch after this list)
Show why prompts (aka instructions) matter
Build "trust but verify" into every assignment
Keep privacy rules simple and non-negotiable
Ask kids to reflect: What did you add that AI missed?
Practice spotting AI-generated content together
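To make "pattern math, not magic" concrete at the dinner table, here's a minimal, hypothetical sketch in Python (my own illustration, not something from any curriculum, and nothing like how a real chatbot is built): it counts which words tend to follow which in a sample sentence, then guesses the most common next word. Real models are enormously more sophisticated, but the core idea is still pattern statistics, not understanding.

```python
# Toy "next word" predictor: count which word follows which in a sample
# sentence, then guess the most common follower. A teaching sketch only,
# not how real chatbots work, but the underlying idea is the same: patterns.
from collections import Counter, defaultdict

text = "the cat sat on the mat and the cat ate on the rug"
words = text.split()

# Bigram counts: for each word, tally the words seen immediately after it.
followers = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    followers[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word`, or None if never seen."""
    if word not in followers:
        return None
    return followers[word].most_common(1)[0][0]

print(predict_next("the"))  # -> "cat" (it followed "the" most often here)
print(predict_next("cat"))  # -> "sat" (ties go to the first one seen)
```

That last point is exactly why "trust but verify" belongs in every assignment: a system that predicts what usually comes next can sound confident while being wrong.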
For teams & organizations:
Write down house rules (what's allowed, what's off-limits)
Require a human pass for tone, facts, and names
Make space to share AI wins, fails, and tips each week
Label AI-assisted work when it matters
Train people to edit AI output, not just accept it
Proof That Guidance Works
Beijing proves you can bake AI literacy into school schedules. South Korea proves you can move fast at scale. Saudi Arabia shows what a full national commitment looks like from day one. Japan proves you can stress ethics just as much as tools. The US and Canada show what happens when guidance lags: kids (and employees) learn on their own, for better or worse.
The bottom line? Literacy first. Tools second.
Final Takeaway
AI isn't going away. The question is whether we raise generations of button-pushers or critical thinkers. We can't wait for formal curriculums to catch up. Other countries aren't waiting, and neither should we.
Teach it forward: at the dinner table, in classrooms, in boardrooms. If we don't, we'll get more copy-pasted chat logs in our inboxes, more presentations that say nothing, and more people who mistake AI output for thinking… and nobody wants that.