Students Are Skipping the Hardest Part of Growing Up (Clay Shirky)
by Michael Keany
Are Students Outsourcing the Hardest Part of Growing Up?
In this thought-provoking reflection, shared by Larry Cuban and drawn from Clay Shirky’s recent New York Times op-ed, a new concern emerges in the conversation about AI in education.
While much attention has focused on cognitive offloading — students using AI to complete academic tasks — Shirky suggests the bigger risk may be emotional offloading.
In short:
Students are increasingly using AI not just to think for them… but to interact for them.
And that shift may impact development in ways educators are only beginning to understand.
Beyond Academic Integrity
Shirky recounts an early classroom observation where students used ChatGPT to generate responses to a professor’s questions — reading them aloud instead of answering themselves.
Initially, this appeared to be a familiar issue:
Students using technology to avoid cognitive effort.
But over time, Shirky recognized something more profound.
AI was being used as a social buffer.
Students were not only outsourcing thinking — they were outsourcing the emotional risk of participation:
The fear of sounding unsure
The discomfort of making mistakes
The anxiety of speaking spontaneously
In essence, AI had become a form of emotional armor.
The Rise of Emotional Offloading
Shirky argues that AI is increasingly functioning as what he calls a “social prosthetic.”
Young people are turning to AI tools to:
✔ Craft text messages
✔ Initiate conversations
✔ Write apologies
✔ Navigate social situations
In one reported instance, students who cheated used AI to compose their apologies to faculty — avoiding even the vulnerability of authentic remorse.
This suggests the issue extends beyond academic shortcuts.
AI is becoming a mediator of human interaction.
Why This Matters for Development
Growing up involves navigating uncertainty:
Trying new roles
Managing embarrassment
Learning social expectations
As sociologist Erving Goffman noted decades ago, entering new social situations requires individuals to interpret norms and respond in real time.
Historically, this process was messy — but essential.
Today, AI offers a way to smooth over that messiness.
Rather than practicing interaction, students can script it.
Rather than risking discomfort, they can outsource expression.
Shirky warns that this may weaken the ability to handle the give-and-take of real human relationships.
Just as overreliance on calculators can erode arithmetic skills or GPS can weaken spatial awareness, reliance on AI may undermine social competence.
A Generational Shift
Young adults are at the center of this trend.
One analysis found that individuals aged 18–25 accounted for nearly half of ChatGPT usage — a figure that does not even include teenagers.
This matters because adolescence and early adulthood are defined by:
Identity formation
Social experimentation
Emotional resilience building
AI tools may help young people avoid embarrassment.
But they may also allow them to bypass the very experiences that build confidence and competence.
Implications for Schools
For educators and leaders, the concern is not simply whether students are learning content.
It is whether they are developing as people.
Academic dishonesty has long been a concern — and institutions have developed systems to address it.
But emotional offloading presents a different challenge.
Schools must now consider how to:
✔ Preserve authentic interaction
✔ Encourage unscripted dialogue
✔ Build resilience in the face of uncertainty
If AI enables students to avoid vulnerability, educators may need to create more opportunities for:
Real-time discussion
Face-to-face engagement
Spontaneous thinking
Because these are the contexts in which social growth occurs.
Leadership Takeaway
The rise of AI invites us to rethink not only how students learn…
but how they grow.
Shirky’s argument suggests that the deepest educational mission remains unchanged:
Helping young people become capable human beings.
That requires more than knowledge acquisition.
It requires navigating ambiguity, discomfort, and interpersonal risk.
And those are experiences no algorithm can substitute for.
Final Thought
AI may reduce effort.
But growth requires struggle.
If students are skipping the hardest part of growing up —
Schools may need to help them put it back.
Original Article
------------------------------
Prepared with the assistance of AI software
OpenAI. (2026). ChatGPT (5.2) [Large language model]. https://chat.openai.com