
Cortex, Schmortex

I think I was in my early thirties when I first heard a scientific observation that has since passed into common knowledge, namely, that our brains aren’t fully developed until age 25. While I don’t remember exactly when I first heard this factoid, I am confident that it was after my twenty-fifth birthday. In other words, I don’t think there has ever been a day on which my immaturity could be explained by the neuroscientists’ assertion that my brain was still a work in progress.

And yet my brain was a work in progress in my early thirties—not that you could have told me that in my early twenties. I remember with special embarrassment one conversation from the spring of my senior year in college. Graduation was approaching quickly, and I was speaking with a professor who was a great friend and a second father to me. I told him that while I knew I would continue to read and learn and grow, I really felt like I was mostly done. He smiled broadly, and said, “Oh, Mark, I’m going to remind you that you said that.” But he never had to remind me, because within months the absurdity was obvious even to me; my self-assessment had been too ridiculous for ridicule.

So what exactly do the neuroscientists mean when they say that our brains are not fully developed until age 25? Do they mean to imply that we do usually reach full maturity by 25, or only that we don’t usually reach full maturity before that age? And does anything important depend on whether they’re right? In particular, does this hypothesis have any relevance to the way parents and teachers should interact with adolescents?

What’s the claim? 

To begin with, we’re dealing here with a generalization across an entire population. So the claim is not that no brain is fully developed before age 25, nor is it that all brains are fully developed by age 25. The claim is rather that the average age at which a person’s brain is fully developed is 25, with some maturing earlier and some maturing later. Age 25 is only the midpoint of a fairly wide range, and even the midpoint conceals significant sex-based variation. If we look at the two sexes separately, women’s brains typically reach full maturity (by this standard) at ages 23-25, and men’s at ages 25-27.

But what does “full maturity” mean in this context? The standard on which this generalization is based is entirely physical. Neuroscientists have used imaging technologies to look at brains of all ages, and they’ve found that while most structures in the brain get very close to their adult size in early childhood, the prefrontal cortex is still developing into the mid-twenties. How, exactly, is it developing? Partly by increase and partly by decrease. The myelination of the nerves in the brain is increasing, which allows for faster signaling. But at the same time the prefrontal cortex is undergoing “synaptic pruning”—the elimination of some neural connections. So, doing fewer things faster, perhaps? What an interesting basis for the specialization of labor! Adam Smith, call your office.

But the experts say it is not just a matter of doing fewer things faster; according to neuroscientists, the prefrontal cortex regulates planning, impulse control, judgment, and complex reasoning. It also plays an important, though not exclusive, role in our ability to empathize: to put ourselves in the place of others and enter imaginatively into their experience of the world. So the claim seems to be that people younger than 25 (or, say, 24 for women and 26 for men) are much more likely to engage in risky behaviors, on impulse, without fully considering the consequences for others.

This has the ring of truth for most of us, I suspect. It is broadly consistent with the standard timeline on which we gradually permit young people to take up adult behaviors like working, driving, marrying, leaving school, enlisting in the military, voting, and smoking or drinking—though reasonable minds can certainly differ about whether we have the order quite right. (Does it really require more maturity to smoke or drink than it does to get married?) Interestingly, though, our society’s graduated march from adolescence toward adulthood is not based on the neuroscientists’ discoveries; it predates those discoveries by decades. So what, if anything, can we actually learn from the observation that our brains are not fully mature until age 25 or so?

To be clear, I certainly don’t mean to question the factual findings; if the experts say that the prefrontal cortex is growing until age 25, and that they can tell what kinds of thinking happen there because of the lights they see on their fMRI machines, I’ll take them at their word. I think it’s fair to observe that there is a materialist presupposition at work—that is, a presupposition that we can assess (qualitatively) the maturity of minds by measuring (quantitatively) the size of brain structures—but let’s not quibble. I’ll accept even the materialism, at least hypothetically, so we can get to the question of more practical importance for the rest of us (and especially for parents and teachers of this age group): So what? 

So what? 

We sometimes hear the “not fully developed until 25” proposition used as a way of mitigating a young adult’s personal responsibility for his or her own behavior. For example, the criminal justice system treats juvenile offenders much more leniently on the theory that they are not yet fully capable either of controlling their impulses or of appreciating the consequences of their actions on others. It was this line of reasoning that led the U.S. Supreme Court to rule, in Roper v. Simmons, 543 U.S. 551 (2005), that it is unconstitutional for a state to impose the death penalty for any crime committed when the defendant was under 18. The trouble is that this rule has the unintended consequence of making it more attractive for gangs to recruit juveniles for the commission of serious crimes, precisely because they will be treated more leniently. So while there is a logical argument for reduced culpability in such cases, there is a moral hazard in saying so—a moral hazard that might have terrible consequences for the juveniles initiated into a life of crime, and for their victims. Indeed, the same premise—poor impulse control and disregard for others—might just as easily be used to justify the incarceration of all juvenile offenders until age 25 on the theory that we can’t trust them on their own out in society until their brains are fully mature. So, if the neuroscientists can be cited as support either for less punishment or more punishment, have they told us anything useful? 

Another curiosity about the “not fully developed until 25” proposition is that whenever we take a decision away from adolescents or young adults on this ground, we raise the important question of who gets to decide instead. Traditionally, parents decided things for their children until their children were competent to decide for themselves. But in some areas today, notably with medical interventions, we see laws pushing the age of consent downward in order to let children consent to procedures with the gravest lifelong consequences without even notifying their parents. The argument for treating 14-year-olds as fully capable of making such important decisions without even consulting their parents is, to say the least, irreconcilably inconsistent with the idea that they can’t be expected to know that it is wrong to carjack someone at gunpoint.

Good parents, who typically have much more insight into their children’s intellectual and moral development than neuroscientists can acquire with their fancy imaging techniques, take a much different approach. Good parents typically treat children as responsible for whatever they, in fact, decide to do. For toddlers, we may invent the mitigating circumstances on our own: “missed the nap,” “hungry,” “out of sorts” for some other reason. But a slammed door is a slammed door, and parents who have detected a surly tone of voice are notoriously unimpressed by the assertion that they’re imagining it. I think as teachers we have to take the same approach, and the challenge is to make sure we know our students well enough to make sound judgments (often very quickly) of the intentions behind antisocial or otherwise disruptive behavior.

The fact that the prefrontal cortex apparently controls such critical academic skills as planning, judgment, and complex reasoning may also contribute to a lowering of expectations in schools, e.g., a postponement of some of the best books ever written because the students are not, on the whole, “ready for that.” I think we should resist this, not least because handling things for which we’re not ready is one of the most important things we do in life. Students need to learn how to be ready for the things they’re not ready for! And what sense would it make, structurally, for us to postpone any kind of learning until our students’ brains are “fully developed” if that time won’t arrive until almost all of our charges have left school? The point is obvious when applied to high school seniors, but I think it applies even in lower grades. In my logic classes, I have learned that some significant number of freshmen cannot translate “All that glitters is not gold” into anything other than a statement about metallurgy. The neuroscientists might tell me that it’s because their prefrontal cortexes are not yet fully developed. But whether or not that’s true, shouldn’t I be trying to help them overcome that challenge? Here, as in the other cases I’ve discussed above, I am deeply skeptical of the idea that age 25 is a floor for anything in our intellectual, moral, or social development.

Despite my overall skepticism about the relevance of the age at which the brain is “fully developed,” I think it might be useful as a ceiling rather than a floor—that is, as the age at which one should be presumed to be ready for anything. The fact is that we need people in their early and mid-twenties to engage in all kinds of risky but pro-social behaviors, like starting businesses, defending their country, getting married, and having children. And recently, they’ve been delaying some of these same behaviors. For example, I’m no sociologist, but I can’t help but wonder if falling birth rates are partly a natural consequence of people not using their most risk-tolerant years to create a family. Perhaps we can get the neuroscientists to tell them not to wait too long. With apologies to Karl Marx and Friedrich Engels, the twenty-somethings of the world have nothing to lose but their debilitating dependencies. They have a world to win.

The breadth, the depth, the richness of the world that awaits them brings us back once more to the materialist premise we accepted hypothetically above: that we can say something useful about our mental lives by measuring the size of brain structures. The reason I can so readily believe that my brain wasn’t fully developed in my early twenties is that I can so readily think of profoundly formative experiences that occurred later in my life. In that sense, my brain wasn’t fully formed until I had married my wife, buried my father and mother, held my newborn children, and so on. And in that sense, my brain is not fully formed today—at least I hope it’s not, and I’ll look both ways when I cross the street. Our goal in education and parenting should always be to prepare our young charges for rich, full lives. For my money, the age at which they’re ready for that is the age they were yesterday.

About the Author

Mark Grannis

Philosophy, History

Mark Grannis teaches logic and history at The Heights. He is the author of The Reasonable Person: Traditional Logic for Modern Life, and, most recently, he has been posting his classroom quotes of the day at markgrannis.substack.com.

