Originally published on Scientific American MIND’s Guest Blog on November 16, 2016.

Neurobiology was the first class I shuffled into as a dopey undergraduate freshman. Dr. Brown’s class began at 8AM. I wore that bowling jacket I bought from the Orem Deseret Industries, Utah’s version of Goodwill.

I’d spent much of my childhood in small towns: middle and junior high school in the Texas Hill Country; high school in rural Utah.

In high school, I would jog through the countryside—down by the River Bottom’s Road—and rehearse conversations and ideas that troubled me. I hadn’t learned the language of social justice or of science. I felt uneasy with many of the ideas I’d been taught but lacked the vocabulary to pinpoint why.

Dr. Brown’s first lecture covered visual perception, ocular dominance columns, and the idea that brain structure and function were intertwined. To use my parlance at that age, this was a Revelation.

The lecture outlined a completely novel way of thinking: the notion that between my ears, behind my forehead and nose was a collection of cells—of neurons, an organ—responsible for how I saw and perceived the world.

I was young, I was a drug-free virgin, and this was without question the greatest catharsis I had ever experienced.

Here wasn’t simply a foundation for my behavior, but for others’ as well. My theological leanings faded as I began to learn why I was Me.

In response, I worked my ass off.

I bought a twenty-pack of colored Staedtler Triplus Fineliners to sketch and learn neural pathways in the brain and spinal cord. I guzzled gallons of Diet Dr. Pepper (I didn’t drink “hard” caffeine at the time) to wage war on Organic Chemistry, whose written exams I defiantly took in ink. After the first deflating day of my Molecular Biology lab, I admitted defeat and transferred out. My tenacity extended only so far.

I studied neuroscience and philosophy as an undergraduate, and learned that strokes in particular brain regions could shatter personalities, leave you speechless, or render you unable to read. That a tumor in your amygdala could cause an insatiable desire for child pornography, or provoke a shooting rampage.

Such evidence convinced me that to understand behavior—how and why we do what we do—I needed to understand the brain, how it develops and forms, how it functions and dysfunctions.

During graduate school, I spent countless, excruciating hours learning how to process magnetic resonance imaging (MRI) data. I learned how to program—how to read and write in Unix and MATLAB. I studied Calculus and Fourier Transforms and differential equations to pass my coursework; things that, frankly, don’t interest me. All with the goal of studying the living human brain.

Medical school was a struggle; I have a shoddy memory for things like proteins, nonsensical Latin nouns, and the consonant-salads that are most drug names.

During my first year of medical school, I was reviewing our chest dissection alone in the cadaver lab late at night. I picked up our cadaver’s right lung and accidentally held it close to my body. Worrying it had touched my scrubs, I looked down and noticed how the lung fit the contours of my own chest. That was the first time I realized that I was completely made of organs. As I looked at the dozen cadaver tanks in that basement hall, I thought that we are all completely made of organs: lungs, livers, bones, brains.

Life is fragile and it is short.

I love neuroscience because of its universality. Because it can be studied and tested and corrected—an ever-expanding, ever-improving existential philosophy. It is as much a tool to diagnose and treat as to tolerate and love.

That decade of study has sculpted the way I think, has quite literally sculpted my brain.

But for me, this neuroscientific worldview wasn’t intuitive, certainly not something I would have cooked up on the River Bottom’s Road. If science were intuitive, we wouldn’t have to study. Its dialogue spans centuries and represents lifetimes of trial and error. I spent a decade learning the language.

I am deeply grateful for my embarrassing wealth of opportunities. I did not come from money and received financial aid each of the eleven years I studied after high school. I come from the Podunks and yet I’ve studied at Cambridge and Oxford Universities. I’ve observed clinical work in Italy and Nicaragua and China, where I saw that all people bleed and love and need. Me, the bumpkin who bought that stupid bowling jacket. I have been ridiculously lucky.

It’s hard to remember that someone could possibly disagree with my present existential leanings—why don’t people “get it”?

And yet eighteen-year-old me would have—without a doubt—endorsed Donald Trump’s rhetoric. Eighteen-year-old me would have worn that red trucker cap and thought I was an elitist dick.

It’s tempting to recoil from my eighteen-year-old ignorance, to retaliate at my unwitting prejudice. But, minus those experiences, I can’t think of why I’d’ve changed.

If the national dialogue frustrates me, I owe it to my eighteen-year-old self to speak up.

About author / Daniel

I was born in Dallas and spent my childhood scampering through the countrysides of central and eastern Texas, with brief escapades in Maryland and Utah. I began medical school in San Antonio, where I met my wife and future psych co-resident Kristin Budde. After my PhD, we moved together to New Haven, where I finished med school. I enjoy writing about neuroscience as a way to think through some of the problems that come up in clinic. I spend a great chunk of my time thinking about and researching how to develop useful biomarkers of brain disease. When I'm not at the hospital or working on research stuff, I'll be fixing up my 1920s New England house. I just recently refinished an old Blue Jay sailboat, which was a great new dad project (sanding is a good activity when you're sleep deprived).
