A thinker's notebook: writers, autism, and Wittgenstein
Three mini-reflections with language at the root

I’ll be honest: a push to finish my book proposal draft this week is leaving me depleted. Typically, I’d pull an idea from my running notebook and develop it into a full post. But right now, I don’t have the energy.
So I’m trying something different.
I have plenty of bubbling thoughts. Rather than skip this week, why not share a few informal ideas? Maybe you’re a little tired of the long slog too, and would welcome a few bite-size pieces in lieu of a full candy bar.
What’s a “writer”?
When I was a kid, maybe eight or nine, I thought there were only a few kinds of writers: journalists, poets, and, mostly, book authors. My sense of the word leaned heavily toward that last category, because that’s what I consumed. Writer, to me, was just a more casual, breezy term for author, unless some qualifier indicated otherwise (e.g., songwriter). So writer = author, which in my mind meant you wrote books: fiction, which I mostly read, or biography, which I sometimes read—or the more boring (to me then) and blurry category of nonfiction books.
Obviously, that was a child’s idea and not at all correct. But even now, I find myself puzzled by the label writer. As I grew up and realized writer could mean anything from advice columnist to film reviewer to cultural critic to creator of TV episodes, I felt… I don’t know, frustrated at the imprecision? Those jobs demand wildly different processes and skills. Why should the primary label we use be based on how their work is shared?
In a related sense, scientists are writers. Their findings don’t mean much if they aren’t written in papers and published. Philosophers are writers. Lawyers are writers. Legislators are writers. Judges are writers.
Recently, I saw someone complain that Substack is just made up of writers reading each other’s writing. Their point was that there isn’t a separate group of people consuming content, but I was more interested in their starting premise. We’re all writers here? I don’t think so. I see philosophers, marketers, historians, people summarizing complicated science. Am I a writer? I don’t consider myself one. I do writing, but I’m not a writer.
Maybe that person’s Substack experience is limited to literary types. Or, more likely, they think the mere act of producing writing makes you a writer. They’re not alone. And sure, in the strictest sense. But writing is just a medium for sharing ideas. Talking is another. Imagine Substack as an ancient forum, where in one corner someone’s critiquing the latest temple, in another someone’s spinning gossip, and in another someone’s sharing food preparation techniques. Would we call them all talkers?
But there’s no need to go back in time. Sticking with the present, should we group TV hosts, telemarketers, sports announcers, and therapists under the umbrella term talkers?
If I had to describe what I do here, it’s not primarily writing. That’s the mode but not the purpose. Mainly, I’m analyzing—and researching, collecting, connecting. I put thoughts into writing because that’s my preferred way of sharing them. But maybe calling myself an analyzer is no better than writer.
Last night I started watching River, an old BBC crime series. The main character, a detective played by Stellan Skarsgård, is grieving the death of his partner on the police force. He delivers this brief monologue before walking out on his therapist:
Therapist: Even if you were just colleagues, there was love…
Skarsgård’s character: In books and films and plays it’s always so compelling, so complex. There should be more than one word for love. I’ve seen love that kills, and I’ve seen love that redeems. I’ve seen love that believes in the guilty, and love that saves the bereaved. What we will do for love. Die for it even.
He’s right. There are so many kinds of love: parental love, romantic love, platonic love, self-love, neighborly love, divine love. For a concept so fundamental to our lives, why is there just one word—love—to which we affix adjectives, rather than different terms entirely?
The register is different, of course (though I’d be heartily amused to watch a therapeutic monologue on this topic), but I feel the same about writer: one word, too many referents, all tenuously grouped together.
Autistic people researching autism
Something that’s consistently amused me since my diagnosis is this: there’s a kind of person, grouped under the label autism by virtue of certain shared traits, whose mind naturally turns toward deep, focused exploration.
And what do many of us point that focus toward? Autism itself.
So you end up with a community of people who are unusually motivated to research things as a hobby, working to understand the very cognition or “disorder” they’ve been told they have. It’s not just that autistic people know their experience; it’s that they are uniquely driven, by their mental orientation, to research and systematize it.
What’s the right analogy? I keep fumbling for one. People with unusual strength and a propensity to lift things, who happen to be born in a rocky landscape? No, that’s terrible. But you get the idea (maybe).
This is why I see autism-focused subreddits as invaluable data troves. Autistic people offering firsthand insights into their own experience, motivated by a drive to understand, reflect, and synthesize. It’s an extraordinary record. I’ve often wondered why researchers don’t scrape those archives as datasets, unleashing large language models to map connections and point to new areas of research.
My intuition is that autistic people are on the frontlines of understanding their own condition, coalescing around traits that research hasn’t yet—but will eventually—confirm. That happened with monotropic focus, which was first theorized by autistic autism researcher Dinah Murray and others. I suspect it will happen, too, with concepts like social object permanence (which has also been identified in connection with ADHD, as Hanna Keiner pointed out to me; she also introduced me to Dinah Murray).
Just a few days ago, I came across a 2017 paper titled Whose Expertise Is It? Evidence for Autistic Adults as Critical Autism Experts. The authors argue that autistic people are more scientifically knowledgeable about autism than non-autistic people. That’s not exactly shocking, though maybe I overestimate how much people with other diagnoses know about those conditions.
Somewhat more interesting are other points: autistic adults were more likely to describe autism experientially, to frame it as a neutral difference, and to challenge the medical model. The paper concludes: “Autistic adults should be considered autism experts and involved as partners in autism research.”
I agree. But I come to that conclusion less from the somewhat obvious idea that autistic people will both understand and experience autism differently from non-autistic people, and more from this observation: autistic people are not only uniquely positioned to understand autism, they have a tendency to deeply research and synthesize autism as an intellectual hobby. That combination leads to novel insights.
I haven’t checked whether this argument has gained ground in research circles since the 2017 paper, though I know the idea of involving autistic people in study design is a recurring topic.
But I’m getting at something different. More a methodological point: don’t just study how autistic people behave in a controlled setting or what answers they give to itemized questionnaires. Study what they’re saying on their own time about their minds, their experiences, and their understandings of the world.
Wittgenstein: the philosopher’s job
This one I’m really excited about. I hesitated to share it so soon, though, because what I’m really saying at this stage is just: There’s this philosopher who argued something I find incredibly compelling.
I’m not a philosophy expert, not even an amateur. I’ve taken some college survey courses, but that’s it. I also have a terrible memory of my own experiences. So, appropriately discount the following statement: This is the only philosophical argument I’ve ever encountered where I immediately thought, yes, of course, absolutely.
Here’s the background. As a young student at Cambridge, Ludwig Wittgenstein (1889-1951) took up the classic problems of philosophy. But later, his outlook radically shifted. So much so that he posited the following: nearly all of philosophy, including his own earlier work, was based on false premises.
Questions like What is time? or What is the mind? were, he argued, flawed from the outset. The ordinary-language concepts of time and mind aren’t fixed, metaphysical objects. They’re manmade terms that evolved within specific social and linguistic contexts to help us communicate.
The problem, as he saw it, was that philosophers had been tricked by language into believing there were fundamental contradictions to resolve—like what’s the difference between the mind and the brain?
But imagine a language with only one word for both, say, brind. (It’s just clunky enough for Anglo-Saxon. I wouldn’t be surprised if it were already in Shakespeare.) In that language, you couldn’t even ask what the difference was between mind and brain. The question is impossible. This shows that these deep philosophical questions are contingent on human usage. They arise because of how we use language.
This was Wittgenstein’s revolutionary insight: there’s no universal, absolute principle philosophers can construct by pure intellect. All they can do is describe and compare.
To describe, philosophers must gather examples of how the relevant words are used in ordinary language. To compare, they must set those uses side by side. Only through that process does meaning emerge, and the meaning is bounded by that very exercise.
That’s critically important: the meaning illuminated through this process is limited to its context. So: you can’t study the word spice in ordinary usage and then apply that meaning to Frank Herbert’s Dune. Similarly, you can’t analyze the meaning of the bishop in chess by asking, What’s the religious significance of moving diagonally? But, Wittgenstein argued, philosophers had been falling into just such traps for millennia.
He did allow that certain shared human activities serve as anchors around which language develops. But those activities provide context, not universal truths.
And crucially, he wasn’t dismissing the role of science. Science deliberately defines terms—akin to the way bishop has a precise, rule-bound meaning in chess—to advance empirical investigation. Wittgenstein wouldn’t, for instance, argue that gravity is merely a language construct rather than a real phenomenon studied through observation and experiment.
Philosophy, in contrast with science, is abstract. It doesn’t produce empirical knowledge about the world but instead clarifies how we think and speak about it. His point is that philosophy can’t answer the question what is gravity? by deducing some metaphysical essence through reason and logic alone. Science escapes some of the traps Wittgenstein warned about (though not all) precisely because it formalizes language to serve specific investigative aims.
What excites me most about all this is, first, how intuitively right it feels. But second, I see echoes of it everywhere, beyond the philosophical context. That’s what I’m going to develop in the future.
“The problems are solved, not by coming up with new discoveries, but by assembling what we have long been familiar with,” Wittgenstein wrote. Yes.
(Bracing myself for people who actually know philosophy to tell me I’ve got it all wrong.)
Looking for more to read? Check out these past posts:
Evolutionary mismatch: just one part of the neurodivergence story
"I can't make it sincere enough": Karen Read, Amanda Knox, and the performance of innocence
Stay curious,
Laura
I enjoyed this post very much and am glad to have caught it. I was interested in each of these three points and was especially struck by the Wittgenstein point. Thank you for sharing! (And I hope to go back through your previous posts as soon as I get a chance!)
Your studying your own condition reminds me of "metacognition" and other terms writers have used to point out the recursive, Escher-like quality of this phenomenon. Neuroscientists have brains in which they seek to understand the nature of the brain: is there some sort of huge gap that will ensue from such a thing? If fish study fish, would they ever be able to "joots" (Jumping Outside Of The System: D. Hofstadter) and see that they're in water and other thinking beings are not? Etc.
Following from your Wittgenstein stuff: I was heavily influenced by the similar but now marginalized discourse of General Semantics. Its formulator, Alfred Korzybski, argued that "the map is not the territory." The menu is not the meal. You can't bite your own teeth, etc. One of his acolytes took him at his word and argued there's a bug in the structure of Indo-European languages: the copula and its forms, esp "is." (am/is/are/was/were/be): we especially get into trouble when we use "is" as an identity:
EX: "Robert is boring." I might "be" boring to you, but I "am" also other very many other "things" to other people, and myself. Robert and some of his utterances have STRUCTURE that can be delineated mathematically, and that's all that Korzybski took for knowledge: something that can be characterized in math with structure. He sought to mathematize our conscious use of language. (He ran into a lot of problems but I find his magnum opus, Science and Sanity [1933] cranky, repetitive, and marvelous. He's really great about amorphous nouns like God, freedom, democracy, etc: Be much more specific, please!)
I loved how Wittgenstein radically pivoted in 1953 with his Philosophical Investigations. His Tractatus was the ur-text for Analytical Philosophy, then he abandoned it. My favorite philosophers - like Rorty - think the "linguistic turn" harmed Philosophy, and I agree.
I look forward to your multivarious investigations into autism, Ms. Moore, and I hate to break it to ya, but you're a writer. And a good one!