On Moonlight

a film analysis, autobiography, & holistic societal critique

by Aaron Freed

Introduction (2025)

A brief history of On Moonlight

On 2017-03-11, I began writing commentary on Moonlight after a high school classmate, Adele Romanski, won the Academy Award for Best Picture as one of its producers. I forgot to stop writing until 2017-04-28. My film critique turned into a 62,000-word book that delved into autobiography and a critique of society itself. (Later addenda have already brought its word count to nearly 74,000, and I imagine it’ll be over 80,000 once I’m done.)

The protagonist of Douglas Adams’ Dirk Gently’s Holistic Detective Agency may or may not have been a con artist and may or may not have been insane, but he held that to solve a crime, one must also solve the entire society in which it occurred. That book’s events proved him correct. That was effectively my approach in this book.

I wrote this over eight years ago. I’ve changed since then; I no longer agree with everything I wrote at the time. Some parts now feel hopelessly naïve, while I’ve changed my stances on a few specific topics I covered. I’ve left these mostly intact with occasional remarks on particularly strong discrepancies. Most of these are [delimited in brackets, italicized, printed in blue text, and conclude with “–Future Aaron”]. However, a lack of such disclaimers should not be taken to mean my stances haven’t changed: I intend this introduction as a blanket disclaimer.

I also expanded §9 so massively that I decided to simply open it with a blanket notice about its expansion, though this was less a change in meaning than a substantial expansion of my original point. As a result, this section is suffixed with ‘(2017/2025)’. However, any references to the present day within that section should still be taken to refer to 2017.

I wrote a few other new sections and subsections in 2025. These are suffixed with ‘(2025)’, and in these, references to the present should be taken to refer to 2025. I realize that many readers probably won’t recall this by the time they get there, so I will clarify this at the start and end of each such section.

Apart from that, I’ve edited my original words at most lightly, and mainly for two reasons: for brevity, or to leave some information private (mostly about others, occasionally about myself). Noting minor edits for brevity would defeat their purpose, so I didn’t, but I tried to leave the spirit of my original words intact.

Quick notes on America’s political divide

Parts of this book will no doubt be contentious. I want to emphasize that the main targets of my ire were and remain certain politicians who shall mostly remain nameless, a few specific news organizations that shall certainly not remain nameless, and bigots. If you don’t fall into any of those categories and anything I’ve written comes across as a personal attack, I’m sorry; please try not to take it personally. If you do fall into one of those categories, I’m sorry that you fall into one of those categories.

Snark aside, I apologize in advance if my attacks on any particular politicians or media outlets make you feel defensive. I tried to avoid direct personal attacks on average citizens, but I fear readers of more right-wing persuasions may nonetheless take what I’ve written about specific politicians as personal attacks on themselves. I want to clarify that I intend no such attacks on readers who don’t fall into the above categories. In my estimation, conservatism at its best challenges advocates of change to prove their proposals are well-thought-out improvements to society, and as I write in §9, I believe its recent debasement is a major cause of our present political divide. Unfortunately, that divide is now so severe that I’m not sure this problem is even avoidable, but perhaps this observation can serve as an occasion for all of us to reflect on how we got here.

My biggest regret with my approach in the body of this book is that I focused so much attention on our internal political division without closely examining where that division came from. Most Americans bear no responsibility for our political division. The blame for that falls squarely on our leaders.

I want to be clear: I do not consider a philosophical or political disagreement a valid reason to dislike someone. I have countless disagreements of both kinds with close friends. As long as you try to act according to your ethical principles, I hardly care how our principles differ. It’s alarming how much we shout at each other and how little we listen to each other. Mature adults should be capable of civil, good-faith disagreement.

Thus, I primarily blame the media figures and politicians who’ve shown no capacity for good faith. Political leaders who direct divide-and-conquer attacks at vulnerable citizens are the biggest culprits; media that report such attacks matter-of-factly without fact-checking or pushing back on them are only slightly less at fault.

I can’t say I have solutions for every issue I raise, but I want to be clear: I neither hold most Americans responsible for them nor think most Americans intentionally contribute to them. I believe we all have a responsibility to think critically about why they’ve been happening and how we can keep them from worsening. But if your ideal top marginal income tax rate or healthcare system doesn’t match mine, that’s OK. It doesn’t have to.

I still have a T-shirt from the Pine View Drama League with the following quote:

“The strange power of art is that sometimes it can show that what people have in common is more urgent than what differentiates them.”
John Berger

I fully believe this. I also think we all owe it to each other to focus less on our ideological differences and more on what unites us. Posterity may indeed depend on it.

A few last introductory notes

If ‘(2017/2025)’ follows a section’s title, I originally wrote that section in 2017 and substantially expanded it in 2025. Unless otherwise noted, all references to the present in such sections should still be taken to refer to 2017. On the other hand, if a section name is followed by ‘(2025)’, it was all written in 2025, and references to the present should be taken to refer to 2025. I apologize for any confusion this causes.

I used a lot of endnotes – originally eighty-three, with at least one more added since. I’ve tried to make these as accessible as possible: clicking an endnote number will take you to the endnote; clicking the number in the endnote will take you back. Hovering your mouse cursor over dotted text frequently displays a summary of the endnote’s contents, although several endnotes were too long to display as tooltips. I generally note such truncations.

Following my original writing is a 4,500-word afterword delving into my thoughts on the last eight years. Spoiler alert: I’m nowhere near as optimistic now as I was then. I try to find a few silver linings, but there aren’t many.

Parts of my 2017 commentary are undoubtedly too outdated to remain relevant eight years on. Yet I see no easy way to separate the autobiography and film critique from the surrounding material. However flawed this presentation may be, my hope is that my life story may help people who are currently hopeless.

Aaron Freed
Tallahassee, FL
2025-08-08 (last revised 2025-08-25)

Contents

  1. Introduction (2025)
    1. A brief history of On Moonlight
    2. Quick notes on America’s political divide
    3. A few last introductory notes
  2. Contents
  3. Overview (2017)
    1. The 89th Academy Awards
    2. Moonlight’s storytelling innovations & societal critique
    3. The present danger to the marginalized
  4. Personal background (2017)
    1. A general overview
    2. The subjective experience
      1. Unawareness: My first abortive attempt at college
      2. Denial: Online socialization
      3. Acceptance: Finding (and losing) love
      4. Retail and autism don’t mix
      5. TV ratings: a much better fit
  5. On autism (2017)
    1. The pain of autism and what might help
    2. Autism and comorbidity with mental disorders
  6. Other marginalized groups (2017)
    1. People of non-binary gender
    2. The mentally ill
    3. The mentally and physically disabled
    4. The non-monogamous
    5. People on the asexual and aromantic spectra
    6. Sex workers
    7. Current and former adult film stars
    8. Drug users
    9. Identity issues are class issues
  7. Moonlight’s societal critique (Continued) (2017)
  8. Ethics, sexuality, and shame (2017)
    1. Casual sex is often ethically good
    2. Pornography is an artistic genre
      1. “People having sex on camera”
      2. “Intended to cause sexual arousal”
      3. “I know it when I see it”
    3. How we could do better
  9. The rarity (and value) of true conservatism (2017/2025)
    1. Why the national GOP is reactionary (and local politics are odd)
    2. “Nothing is security to any individual but the common interest of all”
    3. Milton Friedman supported basic income, and other little-known facts
    4. Defining ‘conservative’, and more reasons the national GOP isn’t
    5. “Stand[ing] athwart history, yelling, ‘stop’”: The value of skepticism
  10. Falsehoods in the news (2017)
    1. “But her emails”
    2. The pernicious myth of false equivalency
    3. The media’s reactions to the election
    4. Possible solutions to the news crisis
  11. Falsehoods in politics (2017)
    1. “Reality-based community”
    2. Falsehoods about healthcare
    3. Falsehoods about the economy
    4. Falsehoods about guns
    5. Blue lies
    6. Misconceptions about political organization
    7. “Times of universal deceit”
  12. Attempting to anticipate tomorrow’s problems today (2017)
    1. Automation and the future of work
    2. Security in information technology
  13. Some consequences of the 2016 election (2017)
    1. Personal reflections on Jewish heritage and resurgent anti-Semitism
    2. It got worse (2025)
  14. Sources of marginalization (2017)
    1. Unarticulated biases
    2. The Golden Rule’s fundamental flaw
    3. Structural problems in society
    4. “We Do What We’re Told (Milgram’s 37)”
    5. The nature of power
    6. Authority’s effects on people
  15. Problems with hierarchies (2017)
    1. Inefficiencies of hierarchies
    2. Rethinking hierarchy and restructuring society
    3. On A Vindication of Natural Society (2025)
    4. A note on politics and human fallibility
  16. Where we go from here, and why Moonlight matters (2017)
  17. Afterword (2025)
    1. “Ignī ferrōque”
    2. “The wrong lizard might get in”
    3. «Ἐν ὠχρῷ σελήνόφωτῐ́»
    4. “Through myself and back again”
    5. 「物の哀れ」
  18. Idiolexicon (2017/2025)
  19. Recommended Nonprofit Organizations (2025)
  20. Acknowledgements

Overview

The 89th Academy Awards

Like presumably thousands of others, I left a brief congratulatory note on Adele Romanski’s Facebook page after she won the 89th Academy Award for Best Picture. I was still in shock at the time, even though the headline-grabbing incident had occurred hours earlier, and my note was woefully inadequate to the scope of her achievement. I’m still processing what it means. Until I learned more about the film, I simply didn’t comprehend many aspects of its victory; actually seeing it clarified even more. The process forced me to confront assumptions I’d made about how the world works, and the film had irrevocably changed my life for the better before I’d even seen it. Since seeing it, I am noticeably more confident and self-assured and have a much more optimistic view of humanity’s future in general and the power of storytelling in particular.

I doubt Moonlight will affect everyone else this profoundly, but I believe it’s a truly life-changing film. I intend, with this appreciation, to document many of the ways it’s transformative, though I don’t believe I will or even can touch on them all, and doing so will also require looking at many aspects of society. I will also share my life story, not because I feel it’s intrinsically notable, but because I believe it will prove of incomparable benefit to others like me and their loved ones, and because many of the reasons this film is so important to me are deeply personal. My journey towards self-acceptance was excruciating, and I was completely unprepared for it. Moonlight is, in many ways, the closest I’ve ever seen anyone else come to writing my life story; I have little superficially in common with its characters, but their struggle to accept their identities is the first such fictional struggle I’ve ever seen that even remotely resembled mine. All the others I’ve ever seen have greatly understated how difficult and painful coming to terms with one’s identity is. My account of my journey may feel uncomfortably personal at times; Moonlight may feel that way to some of its viewers, too. And yet, I believe both are far stronger for the personal details. Moonlight reflects fundamental truths about life that have seldom been expressed in fiction before and from which other storytellers can learn. Ultimately, the political, the personal, and art itself are all interconnected, and here I intend to discuss them all.

A few paragraphs of background may be in order. Like around 108 other people,⁽¹⁾ I’m one of Adele’s fellow members of Pine View’s Class of 2001. We first attended school together in sixth grade, when I was eleven; I’m currently thirty-three, so I’ve been acquainted with her for legitimately more than two-thirds of my life. I don’t think we were close enough to qualify as friends for any of this time, but ‘acquaintances’ probably significantly undersells it: it’s a small school, so we probably saw each other in at least one class in most of our seven years there together, and we were both actively involved in Pine View’s Drama League for most of our high school years.

I knew that group was special even in those days, which is why I kept returning despite my lack of personal aptitude for dramatic performance. Adele was a natural, and everyone knew it; our senior class voted her “most dramatic”. (I suspect she’d already had lengthy dramatic training before high school, but I can’t know this for sure without asking her directly, and I’m sure she has better things to do with her time than reminisce about the distant past.) Still, I hung around for all four years and contributed however I could to productions that themselves ended up winning (far less culturally significant) awards. The contributions of which I was proudest, and which probably made the most difference, were to the sound design of several productions. In most cases, I can no longer tell you what specific work I contributed to which specific productions,⁽²⁾ but I have fond and vivid memories of that time; indeed, many of my fondest memories of my entire childhood are inextricably associated with those plays.

The Drama League was very likely the first time I got to see collaborative artistic creation up close,⁽³⁾ and it was awe-inspiring. When we started, we were a ragtag bunch of teens who didn’t entirely know what we were doing, but over the course of the weeks and months in which each production developed, we created something that, to my mind, could only have been described as art. And I felt like we also developed a sense of camaraderie, of the sort that can only be developed by people who are devoted to a craft and spend weeks and months collaborating on it. We went our separate ways after high school, but while I’ve forgotten a lot of details about probably the majority of my classmates who weren’t in the Drama League, I still have vivid memories of nearly all its participants, including many who weren’t even members of our class.

Adele is now a world-famous Hollywood producer whose acceptance speech was seen by tens of millions of viewers as it occurred, and probably tens of millions more as the news of the announcement error was reported. Due to the nature of Internet viewing, we can’t know exactly how many people have seen her speech now, but I’ve been in TV ratings for a year and a half, and I wouldn’t be surprised if it were closing in on a hundred million. That’s truly staggering. I genuinely can’t think of a single person alive today whom I’d prefer to have won that award or to have given that speech.⁽⁴⁾

And this film was, to all appearances, created in no small part at Adele’s impetus. It was she who encouraged Barry Jenkins to make a second film; she also recommended Mahershala Ali for his role and conducted much of the location scouting. Dede Gardner and Jeremy Kleiner became attached to the project later, and the film almost certainly couldn’t have been made without them either, but a significant part of its initial impetus was Adele’s.

For all the reporting on the spectacle of the show and the announcement mix-up (which, as an event unprecedented in the ceremony’s eighty-nine-year history, was genuinely newsworthy), there’s been comparatively little discussion of the film’s content or of what its victory means in our present tumultuous cultural moment. As a media researcher with a background in political science, I’ve been mulling this over since the awards, and much of the commentary I’ve seen in the mainstream press missed its full significance. I’ve spent a long time writing this book to attempt to correct this, revising as I’ve gone, and I’ve had to let it sit several times before resuming because I was still piecing my thoughts together. It grew much longer than I anticipated when I started (it’s as though every time I think or write about the film, I notice another layer), and I doubt I’ll be able to explore the film’s full significance either (I’m not convinced anyone actually can), but I’ve at least reached a point that I’m comfortable sharing with the world.

Not all of this is directly relevant to the film; some of it is only relevant indirectly. I can’t convey my full appreciation for Moonlight’s unflinchingly honest look at reality, or for the resounding acclaim it’s garnered, without contrasting it with the outright fraudulence that pervades so much of our culture or explaining the (often excruciating) experiences that have shaped my perception of the world. The premise of Douglas Adams’ Dirk Gently’s Holistic Detective Agency, recently loosely adapted for a superb BBC America series, is that to solve a crime, Dirk needs to solve society itself (hence holistic). To explain Moonlight’s full importance, I must do the same thing. One final note is that, since I want to be accessible to as wide an audience as possible, I’ve tried to avoid obscure academic jargon. However, academic terms are sometimes useful shorthand for complex concepts, so I haven’t been able to avoid them entirely. I try to define these terms within the text where they may be unfamiliar, or where academic usages differ from popular usage, but since there are quite a few of them, I also provide an idiolexicon (i.e., a glossary I wrote myself) before my acknowledgements.


Moonlight’s storytelling innovations & societal critique

In its own quiet way, I believe Moonlight is a potentially revolutionary film. I don’t use this term lightly. I’ve spent a long time studying anti-authoritarian leftist politics, and the term revolutionary has been cheapened by modern marketing to describe window cleaning products. Moonlight is not that. Moonlight depicts a way of thinking about the world that has the potential to generate paradigm shifts in society that, arguably, are entirely unprecedented.

This assertion, too, requires further explanation. One of the earliest, most profound lessons of my political science education was that the world is irreducibly complex and chaotic. There’s simply too much information out there for any one person to understand; if we didn’t ignore details we perceived as less significant, the world would appear to us as nothing but incomprehensible chaos. Indeed, a lot of this occurs subconsciously; the brain discards signals it perceives as irrelevant purely so we can focus. So we construct our own mental models of the world in order to understand what we perceive, often by drawing patterns between perceived connections. But, inevitably, these models omit details, and so their explanatory power is imperfect. Most people don’t even realize they’re ignoring anything, and the result is unconscious biases that may strike when least expected. We who are conscious of this can make efforts to adjust for it, but these efforts will still be imperfect. At best, we can continue adjusting our imperfect perceptions and remain aware we’re doing this.

Beyond this, our memories are imperfect. The brain stores memories based on what we perceive as important, which is naturally subjective and based on existing memories. Beyond that, the act of thinking about a memory changes the memory. What we are able to recall replaces our previous account of how we experienced the event. This is one reason witnesses are so unreliable, and so specifically susceptible to tampering: leading questions can actually change someone’s recollection. And naturally, if the narratives we use to understand reality have been constructed from subjective perceptions and recollections, then our perception is inherently subjective and unreliable.

This is, of course, where the unavoidable biases in history, journalism, and so forth come from: there are simply too many details to report, so some inevitably get omitted, and the choice of what to emphasize, what to understate, and what to omit entirely reflects the author’s priorities, whether intentionally or not. These biases may not even be consciously political, but “If it bleeds, it leads” still favors constructing some kinds of narratives and disfavors other kinds. Howard Zinn’s A People’s History of the United States is a great example of this; it’s imperfect and contains factual inaccuracies, but Zinn wrote it to emphasize perspectives that were underrepresented in previous histories of the U.S. He openly admits his bias in doing so at the start of the work, and therefore, I find him vastly more honest than preceding historians who’d hidden behind veils of academic impartiality while omitting or downplaying the perspectives of women, the poor, minorities, and other marginalized groups. Since we’re incapable of perfect perception, we can’t have impartial perspectives on any aspect of life, particularly politics, and for many reasons, my sympathies align with the marginalized.⁽⁵⁾ Zinn’s work sparked an academic debate that continues today in fields like historiography and political science, and many of today’s most acrimonious media battles consist of essentially the same debate.

Moonlight examines many aspects of society that had been largely ignored in mainstream cinema: queer people,⁽⁶⁾ African-Americans, the impoverished, drug addicts, the drug trade, and some other communities that aren’t explicitly mentioned but are, at least under the surface, represented more truthfully than they are in traditional storytelling. It does so honestly and respectfully and – this is perhaps the most important part – many of those communities were directly involved in doing so.

Think back to the 88th ceremony. There was a justifiable uproar about the nominees’ overwhelming whiteness. The 89th Best Picture winner has an entirely black cast, and its writers and director are also both African-American. That is an incredible about-face to occur within a year, and it’s completely historically unprecedented. No previous Best Picture winner had an entirely black cast. I think it’s easy for white people to underestimate how much this signifies to African-Americans.

Representation matters. I discuss this at length below, but a lot of what we perceive about ourselves and the world around us comes from the media – books, plays, films, television, music, sports, video games, comics, art, news coverage, and so on. We interpret our lived experiences through stories we tell ourselves about the world, because we have to simplify reality’s complexity to understand it; the narratives we thereby construct help us understand our experiences. But inevitably, we omit details, so our understanding is at best imperfect. Whites’ and blacks’ lived experiences in America are simply qualitatively different. This is a large part of what Black Lives Matter has been protesting, but it’s broader than those protests, and extends to everyday topics most whites probably never consider. Sometimes, a piece of popular culture touches on one of these issues tangentially; in one episode of This Is Us, a black child’s adoptive white mother has to ask advice about hair care. Most white people have no experience to suggest that that would be an issue, so we don’t think about it, and it usually gets left out of our stories. And, of course, we also often don’t consider much more significant issues, like sentencing disparities in the drug war, discriminatory lending practices, police profiling, and numerous kinds of unthinkable violence that disproportionately affect black communities.

The Oscars So White protests weren’t really about the Oscars, though, or at least not exclusively so. They were about Hollywood’s output as a whole, and the Oscars provided a convenient metonymy; they were the symptom, not the cause. There was a dearth of quality 2015 films starring minorities; ergo, stories about their lived experiences weren’t being told. And the very next year, the film acclaimed by enough of the industry, critics, and the public to be declared, by the industry itself, the very best film produced that year has an entirely black cast, black writers, and a black director. I reiterate this because it’s important, and it has gone all but unmentioned in most of the coverage of the awards I’ve seen.

And its cast and creators aren’t the only way it’s a trailblazing picture. It’s also the first picture focusing on queer issues to win Best Picture. While African-American representation has justifiably earned complaints for decades, queer representation has arguably been even more problematic: for decades of the industry’s history, queer people were at best entirely excluded from the stories the industry told, and at worst presented in horrifically unflattering ways that undoubtedly contributed to unimaginable oppression and paranoia among the queer community. The industry’s treatment of queer people overall remains far from ideal, but even comparing where things are now to where they were a decade ago shows an almost incomprehensible degree of positive development.

The film also acknowledges an aspect of society seldom covered in mainstream filmmaking: the systemic nature of the cycles of poverty, the drug trade, and drug addiction. (Moonlight’s critiques of society are rarely explicit; it generally avoids direct commentary, preferring to offer an objective report rather than an editorial, but its presentation of facts implies several critiques.) Many people fall into these lifestyles simply for lack of other options. Fictional depictions of poverty are frequently “Hollywood poverty”: ‘poor’ characters have access to and squander far more resources than real-life people in poverty ever get. Unrealistic fictional depictions of poverty are so common that the pop culture wiki TV Tropes has a trope entitled “Friends Rent Control”, after the TV show: characters are rarely or never actually shown working, and their homes or apartments would be absurdly expensive in real life. (Another trope, “Informed Poverty”, covers other unrealistic depictions of poverty.) This creates wildly unrealistic expectations of what poverty is and why people experience it. In real life, if a family is in poverty and their window gets broken, it stays broken, and that’s not due to irresponsibility; they simply can’t afford to fix it. (This is also one reason that, though “broken windows” policing policies may often be well-intentioned, they’re horrifyingly out of touch with how real-life poverty works.)

Moonlight reminds me of The Wire for several reasons (though the comparison feels imperfect for reasons I’ll address soon), but foremost are both works’ bracingly honest looks at the systemic nature of poverty and drugs in America. The two works are almost unique in encompassing this in their critiques of American society. In Moonlight, poverty is almost a black hole: those caught in its gravitational field have very few hopes of escaping it. The drug trade, as one of those few hopes, attracts even people who’ve experienced untold grief caused by drugs. The film is almost unique in Hollywood for not condemning these people for this at all. It doesn’t glamorize drugs in any way: their awful toll is fully portrayed on screen through Paula’s unquestionably abusive treatment of Chiron in its second act. Yet it doesn’t heavily condemn her either; she, too, is a casualty of drugs, and the third act shows her to be an ultimately good, repentant person. It may portray Juan as an enabler for selling her drugs, but he too remains sympathetic; he’s a positive father figure to Chiron, who’d otherwise have lacked one, and while his profession is portrayed as unfortunate, it also provides him a living.

Kevin, in the third act, provides another look at a way out of poverty, but his way of life is hardly any more glamorous and, in some ways, could be even more unstable than Chiron’s; he’s barely scraping by day-to-day with his existence as a cook and waiter. He took years to find his feet, and almost didn’t; a single unforeseen expense could ruin him. His main source of fulfillment seems to come from fatherhood, but he’s fully aware how precarious his position in society is.

Films rarely tackle societal problems as horrifying and pervasive as poverty and drug addiction, yet retain this much compassion for their characters. If Moonlight has a major villain, it is ultimately society. The human characters closest to villains are the bullies of the first two acts, but while their actions create a rift between Kevin and Chiron that takes years to close, the label of villain would assign them too much importance: they don’t even appear in the final act, and their only lingering effect on the protagonists’ lives after the denouement is a direct consequence of America’s legal system. Moonlight is not ultimately a story of humans against humans; it’s a story of humans against the established social order.

I should also emphasize the quality of this story. Moonlight is a phenomenal accomplishment purely as a technical work of filmmaking. I’m not truly qualified to comment on its beautiful cinematography, its wonderfully paced editing, its immaculate direction, or any number of its other flawless technical aspects, but people with far more technical knowledge of those fields have written enough about them for me to accept that I’m not arriving at my amateur assessment that they’re impeccable purely from impulse or bias. I’ve studied language enough to write pages about its script’s sheer poetry, and maybe I will after rereading it. I’ve also studied music enough to write pages about the beauty and innovation of Nicholas Britell’s score, and maybe I will after listening to it several more times. Moonlight is one of the few films I’ve ever wanted to watch again in its entirety immediately after finishing it. I haven’t yet done so, but I intend to do so soon, and suspect I’ll watch it several more times after that. The fact that a film of this quality is telling these communities’ stories, and having such an impact on pop culture in the process, is astonishing; the fact that those communities made it largely by themselves is almost unprecedented.

It’s not merely the technical quality, either. Modern usage has devalued the term ‘authenticity’ to the point of near-meaninglessness, but Moonlight is a deeply personal, honest film that closely reflects its creators’ lived experiences in a manner almost entirely unprecedented in cinematic history. I know of no adequate comparisons for the film except, as I mentioned above, The Wire, but that comparison still feels imperfect: The Wire is a journalist and sociologist’s look at a city in its entirety, with its many flaws and points of pride, while Moonlight tells a deeply personal tale of what it’s like for one person to grow up in a city (albeit with the same degree of verisimilitude). It is, perhaps, as though The Wire had been told as Omar Little’s coming-of-age story.

Back to top · Table of contents · My portfolio · Contact me · Website index

The present danger to the marginalized

I could go on about the picture’s quality, or its having been produced on a shoestring budget serving as a sign to all potential producers, directors, and scriptwriters that they can make a truly great film for a fraction of the usual cost, or the personal inspiration I take from its stories resonating in the current moment (I’ll return to this later), or any number of other factors, but I want to contrast it now with our politics. There’s so much cause for cynicism and despair right now, and this is an historical moment requiring a level of engagement from citizens never before seen in my lifetime. Like most of my generation, I was appalled and alarmed by 2016’s election results. This administration* has taken a disquieting authoritarian turn few of us are comfortable with, just as it campaigned on doing. This isn’t what most of us voted for, and indeed, the outcome of the election didn’t reflect the popular vote (nearly three million more people overall voted for Clinton than for the president*⁽⁷⁾).

The election’s effects have been chilling and immediate, and they directly affect Hollywood. Asghar Farhadi’s initial inability to get a visa made international headlines, but some policies could be vastly more injurious. For example, our adult film industry employs some workers on visas from countries that, to put it euphemistically, forbid such works. If their visas are revoked and they’re deported, they may be in grave physical danger. These policies also threaten the United States’ stability as an entertainment center. If companies think erratic government policies may affect a given production, they may simply choose to film elsewhere. This demonstrably harms the industry and our economy as a whole.

Government budget cuts also cause serious harm. As mentioned above, I participated with Adele in a public school drama program that no doubt taught her at least some of the skills that she now uses as a Hollywood producer. I in no way mean to demean the scope of her accomplishments as a producer (after all, without them, Moonlight likely wouldn’t even exist), but since we all stand on the shoulders of giants, I must ask: if that program hadn’t existed, would Moonlight have still been made? If it had been made, would it still have been as good? Many schools already have programs far less comprehensive than Pine View’s, and some skills – particularly musical and linguistic ones – are much stronger if learned during childhood. And learning these skills doesn’t strengthen only artistry; it’s by now well established that teaching musical instruments to children also strengthens their mathematical abilities, for instance. How many potential filmmakers, musicians, actors, and other artists are we already losing purely from the absence of programs to nurture their abilities? And have the abilities of the ones we’re still getting been reduced as a result?

And of course, many of these policies are simply ethically wrong. I already described the potential harm to performers and producers. Tearing children away from the only families they’ve ever known due to national borders is wrong. Proposing religious tests for refugees fleeing some of the most dangerous regions in the world is wrong. This administration* has proposed – and to some extent, with ICE’s assistance, is carrying out – literal ethnic cleansing. I won’t pretend this is unprecedented in our history; George Takei will be entirely willing to tell you how Japanese-Americans were treated during the 1940s,⁽⁸⁾ and African-Americans who suffered under Jim Crow have equally horrifying stories to tell. Further back, it gets even more horrifying; American Indians (or Native Americans; the community is split on the preferred term) were subjected to outright genocide, and I presumably don’t even have to mention the horrors of slavery. However, such heinous policies are entirely unprecedented in my lifetime, and since they’re now serious proposals, marginalized communities are in greater danger than they’ve been at any previous point of my life.

And what kind of malefactor proposes cutting funding to Meals on Wheels? Numerous programs currently under fire provide undeniable benefits to sick, poor, disabled, or otherwise disadvantaged individuals with no other means to care for themselves, and we cannot rely on private charity to assist them; private charity has failed to avert our present homelessness crisis, for example.

Correspondingly, I find it indescribably personally inspiring to hear an influential Hollywood producer say it’s important to keep telling marginalized groups’ stories. I can scarcely imagine a single nobler calling, and there’s far more territory to be explored here than many people realize. There are numerous kinds of marginalization that many people don’t even think about until they’re confronted with the idea that they’re forms of marginalization. As long as society doesn’t listen to the personal perspectives of the marginalized, their marginalization will continue, and society won’t come close to solving all of its problems. I consider myself a member of several groups that are marginalized to varying degrees, and I’ll now discuss one in depth. (I want to emphasize that I don’t hold a grudge over this; I’ve come to believe that ignorance and misunderstanding were the main causes of most of the hardships I faced [that weren’t self-inflicted, anyway], and I’m not inclined to hold grudges over those.) I could discuss others at length as well, but I don’t want this book to be entirely or even mostly about me. However, without at least providing my personal history, I also can’t fully convey why this film and the potential it offers for future filmmaking are so important to me.

Back to top · Table of contents · My portfolio · Contact me · Website index

Personal background

A general overview

Broadly, most people who meet me probably assume I’m a rather nerdy twenty-something (or even teenaged; I still get carded at bars), cisgender, heterosexual male of relatively well-off Ashkenazi background who’s faced few major personal hurdles. This isn’t at all correct; the least significant inaccuracy here is that, as stated above, I’m actually thirty-three. Accepting the surface appearance as the full truth would be ignoring aspects of my life that have caused me grave suffering and adversity. I’ve been sufficiently privileged to receive opportunities to overcome much of that hardship, and my own existence is now relatively stable and happy despite the great turmoil in the world. I’m thankful for this every day. I’m thankful for my friends, my family, everyone who’s worked to make society more accommodating to people with my issues, and my ability to learn from some of my past mistakes, but I also fully recognize how much work remains to be done, and I feel we need to extend credit to everyone who’s doing that work.

I have high-functioning autism. Autism-spectrum disorders had virtually no media representation until roughly fifteen years ago.⁽⁹⁾ I wasn’t told I had the disorder until I was eighteen, which is apparently fairly common for sufferers my age; indeed, it was quite common for sufferers not even to be diagnosed until 2002, the year the disorder first received widespread attention, and the year I became aware I’d been diagnosed.⁽¹⁰⁾ (In fact, a 2002 Time article brought the disorder to the attention of both my parents and my best friend at the time. That was the first time I became aware it even existed.) If the condition had been better known, I’m certain I’d have at least found out about my diagnosis, if not actually been diagnosed, at least twelve years earlier, and I can’t even begin to imagine how much better my life would have been.

I’m not blaming this solely on the media. I saw numerous professionals earlier in my life who appear to have completely failed to diagnose symptoms that, in retrospect, should’ve been obvious to anyone with the requisite training. I’m not blaming them entirely, either, though; English-language materials about autism simply weren’t widely available yet. Autism was popularly assumed, even among psychological professionals, to affect only a certain kind of person, and I wasn’t that kind of person. This is a systemic problem that I can’t simply blame on one source; I have to attribute it to ignorance and move on. Again, I don’t hold a grudge over this; I accept that it’s simply how things are. People are still working to change those problems, and this is another in a long list of things for which I don’t believe I can properly express gratitude.

Autism is far better represented now, but the general public still often doesn’t fully understand our problems. I’ve long had trouble because many Americans consider averting one’s eyes a sign of rudeness.⁽¹¹⁾ I do this unconsciously, without meaning offense. I simply find eye contact with strangers incredibly uncomfortable. I feel I’m revealing far more information about myself through establishing eye contact than I’m comfortable doing until growing better acquainted with someone, particularly since I never feel I learn as much from eye contact as others do.⁽¹²⁾ I’ve tried several times to train myself to look strangers in the eye, and it never stuck; I’m never capable of doing so without conscious effort, and expending such effort even briefly can be so draining that I need time alone or in the company of only close friends to recuperate afterwards.⁽¹³⁾ This has caused major problems with past employers, whose understanding of the disorder often felt incomplete to me.

Employment is a particularly sore subject for us. We’re unemployed or underemployed at rates far above the national average. How much above depends to a large extent on the state. Some states provide much better care for us than others do, and our employment rates there reflect that. We also frequently suffer workplace discrimination. The Americans with Disabilities Act requires employers to accommodate our areas of difficulty where possible, but such accommodations aren’t always actually provided, and proving discrimination is often more trouble than it’s worth, so employers often get away with it. I’ve known some autistic people who are profoundly disabled and will likely require assistance for the rest of their lives. But that isn’t true for all of us. I’ve had my current job for a year and a half, and I got promoted within three months due to my skill level. I’m not yet in a position to be self-sufficient with my current income, but I’ve already acquired a skill set that I fully believe can provide me a path to self-sufficiency, and I’m still working on enhancing it further.

In truth, many of us have strengths that will prove invaluable to numerous organizations (particularly ones whose work doesn’t require interacting with the general public) that many ‘neurotypicals’ don’t possess.⁽¹⁴⁾ But we often have incredible difficulty finding work with firms that recognize this. I’ve heard far too many stories about HR departments that dismiss candidates for STEM jobs based on criteria that might be appropriate for department stores, but which are inappropriate for STEM and disproportionately exclude us, to think they’re isolated incidents – and it’s not merely limited to STEM. As a result, companies often not merely pass over qualified applicants, but also hire unqualified ones, and we’re unusually likely to be affected by this. (Plenty of HR people are undoubtedly perfectly qualified and proficient at what they do, but quite a lot undoubtedly aren’t.)

We also struggle in college. I flunked out of my first attempt and lost several scholarships worth probably hundreds of thousands of dollars. I took several years to earn a degree and had several missteps along the way. I can’t fault Pine View for my unpreparedness for college; they had no more idea what I was dealing with than I did, and even if they’d known about my disorder, I doubt they’d have had the requisite experience and knowledge to help me deal with it. That knowledge simply wasn’t accessible in the ’90s and early ’00s. I didn’t learn about my diagnosis until I started dealing with the issues of collegiate life, and by then, it was far too late to salvage my semester or scholarships; I’d already flunked the semester. It could’ve been worse. I might never have learned about my diagnosis, in which case I’d never have gotten the treatment I needed to get well enough to graduate. I could also have received far less support than my family gave me, which was honestly far more than I probably deserved. Without them, I could’ve ended up institutionalized, homeless, or even dead.

I didn’t take my diagnosis well. Indeed, at first I refused to accept it, and I almost completely wasted the next few years.⁽¹⁵⁾ I was, frankly, an insufferable, self-centered jerk for several years, and I can’t even begin to comprehend how much grief I caused those closest to me. My family’s continued support after that ordeal reveals a patience and grace I doubt I personally possess and find nearly incomprehensible. After accepting my diagnosis, I let it define my entire self-perception for several more years, which was equally foolish; I mostly wasted these years as well.⁽¹⁶⁾ I spent so long doing work I hated largely because I believed autism limited my professional prospects to the service sector, despite its manifest unsuitability for someone with my skill set and areas of knowledge. I had almost none of the requisite skills for that sector and many that are completely useless for it.

I’m not saying I don’t have a disorder; I absolutely do. But a surprising number of aspects of autism are largely about different ways of perceiving the world. That can be and often is limiting, but in the right circumstances, it can also be empowering. My divergent life experiences can give me a perspective and lines of thought that might not occur to others. But it took a painfully long time living with autism to come to this stage of acceptance of it, and being capable of recognizing that I possess these abilities does not guarantee that I can persuade others that I possess them.

The number of autism diagnoses is growing, yet our problems remain sadly under-recognized by the general public. Some works have done commendable jobs highlighting some of our issues; The Accountant, of all films, plays like an autistic Jack Reacher. It provides a nuanced, sympathetic portrayal of a relatable antihero with high-functioning autism, and it highlights the difficulties he faced as a child and still faces today. It’s flawed in some ways, but I have few complaints about its depiction of autism, and I’d recommend it overall. I also have to note Community, whose creator Dan Harmon has spoken in depth in at least one interview about his own experiences with autism.⁽¹⁷⁾ The character Abed Nadir is one of the rare examples in popular media of an autistic character who’s portrayed entirely positively. Indeed, he’s by far the show’s most consistent voice of reason. I can’t be entirely uncritical of the show’s presentation of autism, though, since it rarely directly addresses the difficulties we face in society: our struggles aren’t merely limited to sometimes feeling like outsiders and having difficulty grasping the norms of social interaction. They can run far deeper than that. (Regardless, Community remains the best live-action TV comedy I’ve seen since Arrested Development.)

Overall, HBO’s Temple Grandin is probably the best portrayal of our issues I’ve seen. It probably isn’t surprising that it turned out as well as it did given that Grandin herself appears to have been heavily consulted for it; Claire Danes’ performance in the title role is one of the two greatest television performances I’ve ever seen (alongside Tatiana Maslany’s in Orphan Black). The film does a fantastic job depicting how differently autistic people can perceive the world, though Grandin’s perception is quite different from mine, and it also does a fantastic job depicting many of the obstacles we often face in our professional and academic lives (Grandin also had to deal with misogyny, which the film also depicts well). I can’t stress enough how necessary films like this are. It’s the only film I’ve ever seen that reflects my life as closely as Moonlight does, but it does so in different ways: Moonlight is about struggling to accept one’s identity in a society that is intrinsically hostile to people with that identity, while Temple Grandin more closely addresses the specific obstacles autistic people face in our lives. (It also doesn’t explicitly delve into how different our nonverbal communication is from neurotypicals’, which is a theme that no work I’ve yet seen has addressed; however, Danes’ performance does, to a rather large extent, take this into account.)

And obviously, I also have to mention The Good Doctor. To be honest, I haven’t seen much of it yet; it debuted at a time when I was, on the one hand, not in a particularly stable mental state, and on the other, extremely busy. It’s the most-watched new show of 2017-2018, and it has an autistic main character, which is obviously a huge step in the right direction – it could, in fact, credibly be described as historic. It remains to be seen whether this will inspire more creators to tackle the issues facing autistic people. The show has received mixed reviews, with praise for the cast’s performances and for the show’s treatment of autism but with criticism for its overall melodramatic tone.

Despite these positive exceptions, many of our issues aren’t being discussed at all in popular media, and most others are still under-discussed. I’m not fully sure why, but I suspect the causes are mostly benign: creators don’t want to offend us or the disabled community in general. I can respect this impulse, but I still find it inadequate. We may have a (likely deserved) reputation as aloof, but there are almost certainly many artistically skilled autistic people who’d be willing to consult to ensure we’re portrayed realistically and tastefully, and some are probably even artistically skilled enough to assist in the creation process overall.⁽¹⁸⁾

Another issue: how many people are even aware that around a quarter of autistic people are women? Media often present autism as an exclusively male disorder. As a result of this lack of awareness, women are less likely to be diagnosed in the first place, and since their symptoms often present differently, that makes it even harder for them to be recognized. (On that note, Sesame Street just introduced its first autistic character, and she’s female: another case of public arts funding providing an invaluable public good, although thanks are also in order to HBO, who’ll presumably keep the program afloat indefinitely if the government pulls the plug.)

Back to top · Table of contents · My portfolio · Contact me · Website index

The subjective experience

Unawareness: My first abortive attempt at college

I think one of the two biggest costs of autism’s under-representation in the media, though, is the difficulty of coming to terms with one’s identity as autistic.⁽¹⁹⁾ I already outlined several difficulties I faced. My subjective experience with those difficulties was far worse than I made it sound, though, since I didn’t explain at all what was going through my head. The obstacles I faced frequently felt entirely insurmountable, and no evidence available to me suggested that I was even capable of overcoming them. I was entirely hopeless for most of my adult life. I’m not entirely sure I can explain my journey towards accepting my disorder without jumping around, so I apologize in advance.

A lot of people say high school is one of the worst times of their lives, but that wasn’t my experience. My parents hadn’t even found a school that worked for me until Pine View, which I first attended in the mid-’90s and which was the main reason we moved to Sarasota County; before then, I’d been in about six different schools in as many years. I had a teacher who actually cared about me in first and second grades, but other than that, elementary school was mostly awful. Middle school wasn’t as horrible as elementary school, but far worse than high school. I was bullied far more in middle school; the bullying mostly ended after my freshman year. High school also had plenty of social after-school activities where people were quite welcoming, including the Drama League. It helped that a lot of other Leaguers were outcasts; the League was a place for us to be outcasts together. I also learned how to drive at this time, which is a task many autistic people have trouble accomplishing.

But when I graduated from high school, it felt like the end of the world. Pine View was the only place I’d ever felt at home among my peers, and all of my support systems no longer existed. I had no idea how to perform even the most basic everyday tasks; I doubt I even knew how to use a schedule. My chances of ever again seeing most of my high school friends regularly were slim. I chose to attend the same college as my best friend at the time mostly because he was going there (it also was close to home), and we remained close. (As mentioned above, he later brought the aforementioned Time article to my parents’ attention.) But that wasn’t enough to counteract the shock of change. For autistic people, getting used to change is frequently overwhelming. I didn’t even know I had autism yet, so I had absolutely no explanation for what I was experiencing.

I remember little about the first college I attended, even though it’s next to the campus of the one I now attend.⁽²⁰⁾ I remember the dorms, two professors, a few details from class, and my first roommate, an earnest, soft-spoken Muslim from Pakistan. (Since I saw how peaceful he was, the post-9/11 jingoism and Islamophobia never tempted me.) I remember occasionally going down to a scenic area by a local bay. That’s about all I remember. I’m sure I was clinically depressed. I’m told I forgot to shower for days and sometimes slept in the bathtub. I remember none of this. I do remember a writing/game development project that I became obsessed with at the expense of class; I eventually abandoned it years later,⁽²¹⁾ but it was probably the only thing actually getting me out of bed. It might’ve been the only thing keeping me from contemplating suicide, too.

Denial: Online socialization

When I learned about my diagnosis, as I said, I completely refused to accept that I had the disorder for years. I wound up dropping out, and I’m told I stayed in my room almost exclusively. I remember none of this either. I spent a lot of time on Internet message boards, which were the only place I felt I was getting meaningful human interaction. I didn’t know anyone offline who shared the interests I had at the time, and I probably would’ve found it impossible to make friends in person even if I had, because I simply didn’t feel I had any social skills (which I might not have had at the time).

I spent far too much time overall on online message boards and blogs for many years, but the experiences weren’t all negative. Communicating with others through text taught me an invaluable amount about expressing myself in written form; in particular, it taught me how others will react to my words, which I did not always intrinsically understand before I started posting online. Indeed, a lot, though by no means all, of the writing I did for many years was for online communities, and I’ve learned to be far more precise and careful with both written and verbal expression as a result.

The Internet also introduced me to many new ideas and creative works, many of which I’d otherwise have had no chance of understanding or experiencing. I first developed an understanding of transgender people through online communities long before their issues were widely known; I knew several people who went through the process of transitioning and were extremely candid and frank about their experiences. I also developed a more thorough understanding of politics and power online, and it was partly owing to my exposure to political ideas through online communities that I ultimately chose to major in political science (the other major factor was seeing the horrendous abuses of power occurring at the time; Abu Ghraib in particular woke me up to politics’ full importance).

I also gained a greater understanding of myself through posting online. I would estimate that I wrote an average of two thousand words a day over a period of about fifteen years. Writing so much clarified my thought processes about many issues and taught me what I valued in a way I don’t believe I’d have fully understood otherwise. It also taught me how important it was to keep writing. Indeed, the very process of writing this book has been a source of understanding and clarity I don’t believe I’d otherwise possess: while I was already satisfied with my existence before writing it, doing so has enabled me to understand fully how much progress I’ve made, and it’s caused me to consider aspects of my life that I hadn’t considered in years.

There was one final benefit of my involvement online that I’ll address shortly, because it was one of the formative experiences of my entire adult life. It very well may outweigh all the others, including the writing. However, there were also serious problems, because I wasn’t merely posting online; I was outright obsessed with it. I neglected everything else in my life. Even after I began leaving the house again, I often spent time online to the exclusion of other activities; it unquestionably caused me to take longer to get a degree than I’d otherwise have taken, to have far less in-person human interaction than I truly needed, and to learn far less about how to live a functional life.

Communicating with others online can be a particularly beneficial source of understanding for autistic people, but it shouldn’t be our only significant source of interaction with peers, and it was mine for over a decade. I gained great personal benefits from posting online for the first few years, but the amount I could gain from doing so waned with time, and I spent far too much time doing so for far too long before finally finding a healthy balance. I rarely spend more than an hour or two per day posting online now, and I often spend much less. That’s really all most people should ever need.

My parents eventually had to issue me several ultimata even to get me to leave the house. They eventually made me attend the local community college, where I did fairly well. This at least got me out of the house again. They started me with one class at a time, which was probably wise; I doubt I’d have done well with more. I worked my way up to more classes later. I still didn’t accept my diagnosis, though, and it took me a while to figure out what I wanted to study.

Acceptance: Finding (and losing) love

I think what truly put me on the path of becoming a functional human being was falling in love. Autistic people generally don’t find romance often, and I’ve been no exception. It may be at least partially because we’re often awkward and nervous on first dates. That doesn’t make for good first impressions.⁽²²⁾

And my relationship with her was a completely different experience from in-person dating, because I’d initially met her online. Our relationship lasted for over two and a half years; a lot of it was long-distance, but we spent a total of about two months together in person and they were by far the happiest months of my life to that point. My family, who accompanied me on the first visit, agreed that I was completely different in person with her than I was with anyone else. I attentively hung onto every word she said, because it was the most important thing in the world to me.

But it didn’t last. Long-distance relationships are tough even in the easiest of cases, and ours wasn’t the easiest; we both had serious emotional issues. She’d been in a terrible state before we’d met, too. She didn’t discuss it much, but I’m sure she had posttraumatic stress disorder. I won’t discuss its likely cause; I don’t feel comfortable revealing details that personal, but I’d apparently drawn her out of the house for the first time in seven years. We ultimately vacationed in the Scottish Highlands the last time I visited her. It was wonderful, but something felt different towards the end. She felt more distant, and we weren’t as physically affectionate as we’d been during my first two visits.⁽²³⁾ We talked about her maybe visiting me in the United States for the first time, or my visiting her again, but several months passed and we hadn’t made any concrete plans this time.

I still don’t know what caused us to grow apart, but I have suspicions. I overheard an argument between her and her mother and freaked out. It sounded horrible. I remember thinking it sounded verbally abusive, but in retrospect, I’m not certain whether my judgment was correct; I don’t remember enough of what her mother said to be able to say reliably anymore, and I might’ve overreacted.⁽²⁴⁾ In any case, I offered to marry her and invited her to come live with me. I think that freaked her out in turn; she’d never lived away from home. When I said goodbye to her at the airport, I remember having a terrible premonition that I’d never see her in person again. This turned out to be accurate.

We broke up amicably about eight months later. I wasn’t remotely happy about it, but I knew there wasn’t any choice; I still loved her, but it was obvious we weren’t communicating enough to continue carrying on a façade of a relationship. I think I’ll always love her in some fashion, but it did eventually stop hurting; I’m not sure when, but it was many years later. We still talk every now and then, though it seems to come and go; during some years, we’ll talk several times a month, and during other years, months will go by without us contacting each other, but we remain on good terms. She eventually got better; she eventually did visit the U.S. and moved into her own apartment several years ago. I think she works with children now. She’d actually talked about visiting me at one point a few years ago, but hasn’t brought it up lately; she may have gotten involved with someone in the interim who didn’t want her visiting an ex. I don’t know; it’s not in my nature to pry about matters like that.

Still, she remains one of six people in my life who’ve been most important to me, alongside my best friends from middle school and high school, my best friend now, and my parents; I’d like to dedicate this book to all six (and to Moonlight’s cast and crew). Conversely, I haven’t spoken to many other people I knew online in years. Some just drifted away, while some are from communities I deliberately cut ties with.⁽²⁵⁾

I took the breakup really poorly; I fell into another depression that probably took me about nine years to escape. I didn’t really escape it until I met my current circles of friends, and a lot of us were introduced to each other by our parents. But I’m getting ahead of myself. Before the breakup, I’d decided to study political science. By this time I’d also realized I had autism, which I’d denied for years. I think what woke me up was having people observe how differently I’d acted with my ex than I did with everyone else. I made eye contact with her without difficulty, but of course, I already knew her; we’d communicated online and on the phone for almost a year before we finally met in person. So naturally, I acted wholly differently with her. And that world of difference made me finally realize I had a disorder.

But realizing I had a disorder isn’t the same as accepting it. Many of these years were probably just as horrible as the ones right after high school. Evidently I was incredibly angry for much of this time; I don’t remember it well. I think I was going through Kübler-Ross’ five stages of grief over my diagnosis. It again felt like the end of the world. My parents sent me to some programs intended to teach me life skills; I think they cost somewhere in the six figures. I don’t think I learned much; I was too angry to learn much. I remember some people from each of these places and a bit about the college classes I attended, but little else. Eventually my parents decided the programs were too expensive and weren’t teaching me enough, and that I was already close enough to a degree: I had almost enough credits for an associate’s by the end of the last program (probably about a year and a half after I last saw my ex in person and almost a year after our breakup), so I chose to focus mostly full-time on school. I ended up failing a couple of classes, but ultimately graduated with a bachelor’s degree in political science.

Retail and autism don’t mix

About a year before then, I’d also started working in retail. This was actually beneficial to me at first. My first boss was extremely understanding about my diagnosis. He never explicitly confirmed this, and I never asked because it’s none of my business, but I’m almost certain he’s gay, so I suspect that gave him an understanding of being an outsider. He and the rest of management were supportive at the time, and it helped that for a while, I was good at selling store memberships to people. But I got complacent and my sales dropped off. I also got increasingly nervous and brusque with people, and I’m completely hopeless at multitasking, which wasn’t required in my first position there but was when I moved to restocking items a few years later; I probably always will be.

Retail is simply a poor fit overall for autistic people, I suspect. I learned some things that helped me in the first few years, but after that, I gained little benefit from staying there. Unfortunately, I stayed for seven and a half years, and I’m pretty sure my experiences there convinced me my professional prospects were hopeless. I didn’t want to work in politics; I felt the system was too compromised to consider working within it for another politician, and I didn’t want to run for office myself.⁽²⁶⁾ I also had no confidence in my ability to do anything else for money.

It didn’t help that my first boss left about halfway through my stint at the store; he had serious health issues. My second boss didn’t seem to possess any understanding of my disorder, and as far as I can tell, we had a rather serious personality clash from the beginning and took an almost instant disliking to each other. I was commonly reprimanded for a number of things that my previous boss had let slide. Some of them were legit complaints. I’m not very personable with strangers; I’ll probably never be personable with strangers, and that’s a necessary quality for most retail employees.

Perhaps it’s necessary to address the issue of decision trees here. I apologize in advance, as this explanation may get a bit technical. A decision tree is a model of decisions and their possible consequences used to reach a decision. These are often used in machine learning and operations research, but they model the thought processes most people subconsciously use when evaluating their options. My reasoning does not remotely resemble a neurotypical’s in many cases.

I’ll use the example of being asked about the location of a given item I was unfamiliar with, which was a recurring problem I faced at work. The general approach to this is to look up the item on a computer, lead the customer to its location, place it in their hand if it’s there, and offer to order it if not. This is a reasonably simple set of instructions, and it probably causes neurotypicals little trouble.
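For the technically inclined, that workflow really can be written down as a tiny decision tree. The sketch below is purely illustrative; the function name and the outcome strings are my own inventions, not anything the store’s systems actually used:

```python
# A toy sketch of the item-lookup workflow as a decision tree.
# The function name and outcome strings are illustrative only.

def handle_item_request(in_system: bool, on_shelf: bool) -> str:
    """Walk the decision tree for a customer asking where an item is."""
    if not in_system:
        # The store doesn't carry the item at all.
        return "apologize and suggest an alternative"
    if on_shelf:
        # Happy path: lead the customer there and hand them the item.
        return "place the item in the customer's hand"
    # Carried, but missing from its shelf (mislaid, shoplifted, or sold out).
    return "offer to order the item"
```

Written out like this, it looks trivial, which is exactly the point: the branches themselves were never the problem for me, as the rest of this section explains.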

It isn’t that simple for me. To begin with, I have a perfectionist streak I’ve never entirely been able to suppress. If I didn’t know the location of a piece of merchandise, I took it personally, particularly if it was a fairly well-known item. It didn’t matter that knowing the location of every item in the store was far beyond my job description. I still faulted myself for it.

There’s also the matter that customers are, shall we say, messy. Frequently, they would leave an item elsewhere than where they found it. This frustrated me far more than it would frustrate other store employees, because I recognized the full extent of the possible ramifications, which consistently rankled me. If an item wasn’t in the correct location, that meant that when a customer asked for it, I wouldn’t be able to place it in their hand. While I could offer to order it for them, this request was less likely to result in a sale than actually having the item. In the era of Amazon, people shop at brick-and-mortar stores for convenience: they can get the item right away. If it isn’t in stock, that convenience vanishes, and it’s probably cheaper to order from Amazon anyway.

As a result, if an item wasn’t in the location the computer specified, that was a source of irritation to me as well. I get into autistic body language in greater depth in a later section, but in short, we’re frequently misread. In this case, customers likely correctly read that I was irritated, but they likely misread it as irritation at them personally, which, of course, wasn’t the case; I was irritated with the customer who had mislaid (or shoplifted) the item being requested.

Beyond this, the company procedure specifying to offer to order the item carried with it, shall we say, problems. Here we come to the matter of two conflicting directives. Company procedure carried, as it does in most of retail, the directive that the customer comes first. This is certainly understandable: a retail business depends upon sales to customers. But I also had a specific quota of other tasks (usually restocking items) I was expected to fulfill within a given time period. These two directives are at explicit odds with one another: time spent helping customers is time not spent restocking items. Management could have chosen to emphasize one or the other of these tasks, and while they came down slightly more often on the side of “help customers,” they spent enough time on the side of “increase your restocking speed” that I found it difficult to find a balance.

In fact, the restocking quota I was expected to meet was one management themselves acknowledged was almost impossible: nine shelves per hour. There is a lot of merchandise on a shelf. Even if I had been able to spend the entire hour restocking without interruption, and even if the shelves had been in good condition (we were expected to stock in alphabetical order, and if items were disordered [as they usually were], this made stocking more complicated), nine shelves per hour would’ve been a tall order. If management knew the quota was unreachable, why did they use it at all? I never received a satisfactory answer. Regardless, they also said in almost every performance review that I should raise my restocking speed. And subconsciously, the nine-shelves-per-hour quota was in my mind the entire time. It didn’t matter that no one actually reached it; I still faulted myself for not reaching it even though I knew it was unrealistic.

I was consciously aware of the dichotomy between needing to increase my restocking speed and needing to improve my customer service the entire time I spent helping customers, and it produced a fight-or-flight response. Everyone experiences fight-or-flight responses sometimes. They’re most commonly responses to direct threats to survival. Autistic people, however, experience them far more commonly than neurotypicals do. It certainly isn’t advisable for a retail employee to fight a customer, so I spent almost my entire time on the store floor in flight mode. Even if I wasn’t directly helping a customer, I faced the possibility of being stopped for a question. As a result, I subconsciously made myself scarce – which, of course, is another behavior frowned upon in retail.

There’s also the matter of concentration. I have ADHD (attention deficit hyperactivity disorder), which normally makes my attention wander. However, I’ve been medicated for it since roughly the age of five, and the medication tends to create the exact opposite effect: I concentrate intently on one task until I finish it. Being interrupted breaks my concentration, which causes several responses that are unpredictable to someone unaware of my conditions – including, often, a fight-or-flight response.

I wouldn’t hold any of this personally against someone who interrupted me when I was concentrating on restocking items, to be clear. It was in my job description to help, and they would have no way of knowing I’d respond to a simple question with what is essentially a state of panic. But I would experience a state of general irritation about the entire set of circumstances – that society is structured to the convenience of people who are more extroverted and better suited to multitasking.

So, as a result, my customer service wasn’t great – maybe it was even poor. Some of this was due to personal failings, but much of it was likely beyond my control; to be clear, I made a distinct effort to improve, and by some measures, I even succeeded.

But there was a problem with communication between management and me, and it definitely wasn’t exclusively my fault. It goes far beyond the issues autistic people usually face in communicating with neurotypicals; it reaches an extent I would consider negligent on management’s side. In one performance review, they promised to give me weekly face-to-face feedback about my progress. They gave me exactly zero such meetings after that review, but nonetheless eventually gave me an ultimatum based on alleged communication they’d never actually made explicit to me, despite acknowledging that I had, in fact, made progress. They seem to have felt I should’ve picked up on implicit cues – even though I’m autistic, and they knew this. Indeed, they claimed this ultimatum was based on issues with my communication with management rather than any of the issues they’d raised in previous performance reviews (mostly restocking speed and customer service). My restocking speed and customer service weren’t raised as issues in this ultimatum, and my communication with management hadn’t been a major focus of any previous review.

I think they started from a desired conclusion and cobbled together a set of excuses to justify it. They needed to get rid of people and were looking for excuses to do it without actually firing them or laying them off. They tried to put a good face on things in all the staff meetings, but retail is shrinking. They were doing a lot better than some of their competitors, but better in this case may not actually mean good: their largest competitor actually went completely out of business during my tenure at the store. I also consistently got the impression that the store’s second general manager (the one in charge for the second half of my tenure) personally disliked me. I believe I was treated in a discriminatory fashion and don’t believe I could’ve done anything to keep my position.

I could’ve fought this. It still feels to me like a flagrant ADA violation: they promised communication that never arrived and then blamed me for their miscommunications. But by then I was fed up, and it wasn’t worth it. I also didn’t think I had enough direct evidence to make a case. I feel like I was railroaded out of the company, but by then, the feeling was mutual: I was so disgusted that I wanted nothing to do with them. Why bother fighting for a position in a hostile work environment that I hated anyway? The only other possible outcome in my favor would be a monetary settlement that likely wouldn’t be worth the anguish it’d cause. So I didn’t bother – which, if they were malicious instead of ignorant, would’ve been exactly what they were counting on. I still don’t know which was the case, and I still like some of my former coworkers, but I haven’t set foot in the store since.

TV ratings: a much better fit

Working at that place put a horrible dent in my self-image. I think that’s why I stayed there so long; I felt I had no other prospects. I was horribly wrong. About seven months after quitting there, I began working with my current employer.⁽²⁷⁾ It isn’t year-round, which was my only major complaint with the position when I started; it pays too little for me to be self-sufficient purely for that reason.⁽²⁸⁾ But I did well enough to be promoted from a data entry job (referred to on the floor as ‘editing’) to a research job after two cycles (roughly three months’ work). This is apparently rare; during training they said they sometimes promoted people this quickly, but only a few times a year. I’m not entirely sure why I was one of them, but I have suspicions; I was nervous and asked a ton of questions when I started editing, so they showed me editors’ rankings within the facility. Apparently I was in around fifteenth place among several hundred editors for speed and accuracy combined. I assume I did even better the second cycle. (I also quit asking so many questions after seeing how well I was doing.)

Research has a much steeper learning curve: if editing is algebra, research is calculus. Editing probably has dozens of rules to remember; research probably has hundreds, and it isn’t taught in an organized fashion. New editors get a structured two-week class teaching the step-by-step process of entering a diary, covering the most common edge cases they’ll encounter and specifying what advertisers and TV stations want (naturally the most important part of TV ratings). You’ll still have questions after it, but rarely more than about ten an hour. Research is nothing like that; I suspect there aren’t enough of us (probably about thirty between both shifts) to make creating an organized curriculum cost-effective. You’re given a few hours of basic instruction; then you’re thrown right into assigning cable systems to households, even though you’ll inevitably have dozens of questions within an hour. (Our manager must be one of the hardest-working people in the facility; she trains new researchers and answers everyone’s questions. I certainly couldn’t do her job.) Training alone takes two cycles, and you still won’t know everything afterward; identifying broadcast stations is particularly complex.⁽²⁹⁾

Regardless, during roughly my second cycle in research I was one of three people in night shift recognized for going the entire cycle without a recorded error; I repeated this feat in the most recent cycle. This has, naturally, been incredibly beneficial for my self-esteem. I’m almost certain that a large part of what was ailing me was professional.

However, it wasn’t all professional. I heard about this job because my best friend mentioned working there and I applied. I met her sometime in 2015 as part of a support group for autistic people.⁽³⁰⁾ Many of these people have been good friends, but she’s the best friend I’ve had since my ex, honestly. I’d had supportive friends online, but online interaction can only do so much for you. Had this support group existed sixteen years ago, I suspect I’d have been incalculably less miserable.

As would my family have been. I haven’t directly mentioned their experiences much, because I don’t entirely comprehend what they went through. I don’t think I even know how to comprehend it. I know it must have been at least as trying for them as it was for me, and probably more. Having recounted all these experiences, I have absolutely no idea how they had the patience to get through them. Then again, I don’t think I would have the patience to go through my own experiences again, either. The fact that we made it through these experiences at all is honestly kind of miraculous.

Back to top · Table of contents · My portfolio · Contact me · Website index

On autism

The pain of autism and what might help

I mentioned support groups. There are other parents in some of these support groups whose kids are essentially going through the same things now that I went through sixteen years ago. They won’t leave their homes. They don’t have friends. No amount of encouragement from others, even peers who might potentially become friends, appears to be able to encourage them to leave their homes. They’ve flunked out of schools. They don’t (and can’t) work. They barely talk to anyone. And I don’t know what to tell any of their parents. Some of what my parents did helped me, but a lot of my recovery was either partially due to external events (which obviously are uncontrollable) or was simply progress I had to make myself, and that process takes time. A long time. That’s not what any parent of an autistic child will want to hear. And it shouldn’t be what any parent of an autistic child has to hear. There should be a better solution. There should be several better solutions, because one that works for one autistic person may not work for all of them.

I’ve made colossal amounts of progress in the last sixteen years. I still have a long way to go, but I have confidence I can make that progress now. I had no such confidence starting out. Accepting autism as part of my identity was an excruciatingly long process, and I think a major reason was that I simply had no role models and no way of knowing things could ever improve. There are very few famous spokespeople for autism. Temple Grandin, John Robison, and… who else? I’m sure there are a few others, but the scarcity of public examples of how to live with the disorder makes it feel like a hopeless diagnosis. It feels like you’ll never learn to be ‘normal’; you feel doomed to a lifetime of care from others, dead-end jobs, and little control over your own life.

That’s not necessarily the case, though. I feel like I have more control over my own life than I’ve ever had before. I feel like almost anything I want is open to me professionally – if not now, eventually. I just have to avoid sectors of work that require skill sets that I don’t possess (like retail). There are a lot of things I still wish I knew. I wish I knew how to avoid coming off as awkward on first dates. I wish I were better at reading body language, and got more out of eye contact. These are things most people seem to understand how to do implicitly, as though through instinct. Autistic people generally have to learn them. But, for the first time, I actually feel capable of learning them.⁽³¹⁾

But how do you convey to someone else who’s hopeless that it’s possible to get to this point? This is where the lack of media representation is a serious problem. There are very few stories about journeys like this, and the lack of such stories makes one’s condition feel absolutely hopeless after receiving an autism diagnosis. It takes a long time for a lot of people who are diagnosed even to accept that they have a disorder; it certainly took me a long time. And just accepting that you have a disorder isn’t at all the same thing as actually accepting the disorder itself. I didn’t come to accept the disorder until very recently. I denied that I had a disorder at first; then I accepted that I had a disorder, but felt it limited what I could accomplish. That lasted for almost a decade. It wasn’t until recently that I came into a state of acceptance that was neither denial of my diagnosis nor viewing my disorder as an insurmountable obstacle. How do you help someone else get past that?

Seeing other autistic people do well might help. I’ve earned one degree. I’m nearly done with a second. I’m doing incredibly well at work. I have enough skills that even though this work will eventually be automated, I’ll likely find another position either at another company through references from this one or within this one (I very well may already have enough technological skill to assist with automation if there’s a relevant position within a workable commute from my home).

But another thing that might help is better, more widespread media representation. If there were more stories about autistic people who have managed to overcome their challenges, and more importantly, about how they did so, then getting the diagnosis might not feel so completely hopeless, because they might see a realistic path out of their despair. The existence of such stories might even be enough to stop the denial that is so many people’s first reaction to the diagnosis.

A third thing that could help is for neurotypical people to realize how their preferences and thought processes are normalized in society, and to be more aware of how others have different ones that are not. Some of us are very obviously autistic, but some of us get pretty good at hiding it. A rather large number of people who meet me don’t seem to realize I’m autistic these days. I’m still bad about eye contact, but I’m not as nervous as I used to be.⁽³²⁾ In other words, random people you meet might be autistic and you might have little to no idea.

I have a possibly slightly crackpot theory, actually: autism might be one extreme variation of a ‘typical’ brain, but there could also be an opposite extreme – pathologically extroverted, shall we say, for lack of a better term. And it’s entirely possible that those with that brain type are drawn to vocations that dictate social norms: police, psychologists, recruiters, HR, teachers, and so on. This isn’t true of everyone in such fields, but many such people seem to be extroverted to the point of perhaps being irritating and disruptive to others. And these fields also often rely on unspoken biases (“common sense”) in judging others. People whose preferences society normalizes can be entirely unaware that others don’t share them; if people keep giving them the message that their actions are normal and healthy, they may never question that assumption or even realize what they’re doing.

But are they even a majority? Maybe they’re not; maybe they’re just the kind of people likeliest to project their worldviews onto others and to be placed in positions with the power to do this. There’s good reason to think this is possible: extroverts are likelier to seek out positions involving contact with many people than introverts are, after all. But if pathological extroverts set society’s norms, is that really any better or more representative than it would be if autistic people were doing so?

Lack of eye contact is seen as a maladaptive response requiring correction, and it’s apparently an acceptable reason to fire employees; similarly, spinning and rocking in children are deemed maladaptive symptoms requiring a cure. At the same time, sensory overload, the idea that visual stimuli and peripheral noise are barriers to concentration, is widely accepted; educators and companies legally have to accommodate it to facilitate productive work environments. I’m not saying spinning and rocking can’t be distracting too, but ultimately it’s an arbitrary demarcation ungrounded in logic.

Making eye contact is normal because American custom says it’s rude not to. Not spinning or rocking is normal because custom says so. Limiting sensory overload is normal because custom says so. And, again, many of these are cultural; the Scots don’t seem to see averting one’s eyes as rude (though many are still comfortable looking strangers in the eye; they’re just not offended if you avert yours).

My observations here aren’t original, and I’m honestly cribbing a bit from others who’ve made similar ones (I’m not identifying them because they won’t want public scrutiny). Regardless, I’d had ideas along these lines even before others’ thoughts clarified mine, and I think people would greatly benefit both themselves and society overall by considering whether their preferences are normalized in everyday interaction and whether that inadvertently hurts those who don’t share them. “Do unto others as you would have them do unto you” is a flawed principle, because many people don’t want the same things you do; a more beneficial one is “Do unto others as they would like to be done unto.”

Championing diversity from a number of perspectives, including thought-related ones, will benefit more than just autistic people. I mentioned above that many companies disregard qualified candidates partly due to unwillingness to consider neurodiversity.⁽³³⁾ Displaying greater receptivity to ‘atypical’ thought and behavior patterns can benefit companies’ bottom lines: they’ll hire better qualified candidates more often. People should also learn to accept diversity as a general principle and to question the biases underpinning their assumptions about what is or isn’t socially acceptable, which often exclude specific demographics. African-American Vernacular English is a highly pertinent example unrelated to brain functioning; many people dismiss it as signifying reduced intelligence or education, but in reality, it’s simply a separate dialect of English. (Of course, Moonlight is a highly perceptive, poetic, intelligent film largely written in AAVE; maybe it will help change this.)

Another thing that might greatly benefit many people (not merely autistic people) is allowing workers to listen to their own music more often. Many people with and without autism find that ambient noise impairs their concentration;⁽³⁴⁾ many desk jobs rarely require enough attention to external audio stimuli for headphone use to be a problem. My current employer lets us listen to music on headphones, and I suspect that’s 25% of why I enjoy the job (the other 75% is the work itself and the fact that I’m good at it). I’d probably still be fine without it, but listening to music on headphones makes ambient noise less distracting; I don’t concentrate as well otherwise. It also calms me in moments of stress, and it’s difficult to overstate the value of that. I’ve had severe fight-or-flight responses for most of my life; music is one of the few things that can alleviate them.

And of course, there’s one final thing that would help incalculably: far, far more government recognition of and response to our problems. As I said above, our unemployment rates vary widely from state to state depending upon how effective state governments’ assistance is. The tiny amount the federal government has done, though, is shameful. The fact that resources are so utterly scarce for those who have just received diagnoses and their families, in particular, is an injustice I doubt I can adequately condemn without copious amounts of profanity, and alleviating it is entirely within government’s social contract. I don’t at all expect the president* to fix it; he mocked a disabled reporter, after all. But maybe our next Democratic president will help. Hillary Clinton proposed a massive, comprehensive mental healthcare program. I’m not sure how it could’ve passed Congress unless she’d gotten the Senate and House back under Democratic control, but it would’ve been a huge improvement for us if she had. I hope the elections over the next four years turn out much better.

I’m not fully sure how a society that treated us fairly would look. Some of ‘our’ problems aren’t fully our problems. If others can’t communicate with me, that isn’t just my fault; surely their ignorance of how to communicate with me is an equal factor. Perhaps others should be taught to communicate with us, too. Still, we have trouble with tasks like college, employment, and household chores that others intrinsically seem to understand, or at least to learn more easily. Government support could teach us how to overcome these obstacles, and our families certainly deserve more resources than they’re getting. This is a crisis, to be clear: about one in sixty-eight people born in the U.S. is autistic. There are millions of us, and little is being done for us. Our difficulties remain poorly understood, and there’s little sign the public is being educated about them to the necessary extent.

I would like to emphasize, again, that I have no major complaints about the personal treatment I’ve received (apart from my first employer after my first boss left; I’m still not sure if they were malicious or just ignorant about autism’s severity – Hanlon’s Razor suggests the latter, but it often felt like the former both to me and to my parents). Particularly in the past few years, I’ve been unbelievably fortunate; I’ve received opportunities I doubt I even deserve. Some of my past behavior was truly reprehensible, and even now, I can’t in good faith paint myself as anything like the sort of person I’d like to be. But I’ve always suspected I only got those opportunities because my family is comfortable, white, and upper-middle-class; I can’t even begin to express the debt I owe them here.

My recognition of how greatly advantageous these privileges have been for me has incalculably shaped my worldview. Understanding the lived experience of autism has led me to ask ‘what-if’ questions about my life. I don’t believe I can even comprehend how difficult lower-class and minority autistic individuals’ lives are, but the unfathomable disparity of their experiences and mine may provide the foundation of my entire understanding of power and injustice.

I’d also like to note the fundamental offensiveness of the “vaccines cause autism” crowd. I’ll sidestep their horrific, destructive scientific illiteracy, which isn’t even worth dignifying with a response; I simply wish to note that, if one carries their narrative to its logical conclusion, they’re effectively saying they’d rather we die from preventable disease than live with autism. This is seldom recognized, and I feel obligated to mention it. Out of principle, I’ve almost never engaged directly with either vaccine-autism truthers or Nazis. As far as I’m concerned, there’s also little difference between them.

Back to top · Table of contents · My portfolio · Contact me · Website index

Autism and comorbidity with mental disorders

[Content warning: suicide. I’m linking to several major suicide prevention resources in advance:

If you are at all actively entertaining suicidal ideations, please call one of these hotlines. Anyone who would attempt to shame you or retaliate against you for doing so doesn’t deserve to breathe oxygen.

As furious as I was when I wrote this section in 2017, I wasn’t even fully aware how stark the problem is. The U.S.’ overall life expectancy is seventy-two years of age. Its life expectancy for autistic people is thirty-six. I wrote this addendum the day I turned forty-two. Being six years past my life expectancy is sobering.

To be clear, our high childhood mortality rate is a major cause of the discrepancy. But there’s a second: our suicide rate is also nine times the median. The only adequate expressions I know of my feelings about this are obscenities, so I’ll leave it at that. –Future Aaron]

I was going to end the autism discussion with that, but while I was revising this book, political scientist Will Moore ended his life. I wasn’t that familiar with his work, and certainly didn’t personally know him, but he was considered one of the world’s experts on political violence.

That isn’t, however, why I’m addressing his suicide here; the relevance is that he was autistic, and his suicide note made it absolutely clear to me that his suicide was a direct result of how society treats autistic people. I hold society directly culpable for his death. I don’t expect everyone to share my ethical priorities, but I intend to establish that, if autistic people were treated better, Moore would almost certainly still be alive today.

I’ve commented before that autistic people have high unemployment rates. We also have high comorbidities with other mental disorders. It’s not clear how many of these are specifically caused by brain chemistry and how many are specifically caused by society’s treatment of us. Some of them may be, and indeed probably are, caused by both. I’ve described my own experiences with depression in depth; it’s estimated that up to 57% of us may suffer from it at some point, and we attempt suicide at roughly four times the rate of the populace at large. I also have ADHD, which caused me great trouble earlier in life but, since I’ve been given suitable medication for much of my life, has not been a major obstacle; ADHD and autism also have high comorbidities.

Despite the common comorbidity with depression, we often aren’t diagnosed with that either. Some of this may simply be because our symptoms present differently. Part of it is that our body language is also different from neurotypicals’. Not a lot of people even realize this. We have trouble communicating with neurotypicals because we don’t intrinsically understand their body language, but part of why they have trouble communicating with us is that they don’t intrinsically understand ours, and often assume they do. Further complicating this, our body language may vary widely from person to person. Even our facial expressions may have different meanings than commonly expected.

It doesn’t help that it’s not merely body language. Our voice tones aren’t always the same, either. A flat, emotionless affect can be a symptom of depression. It can also be a symptom of autism. When the two are combined, how do you even tell which is causing it? Sleep problems and difficulty concentrating, other common symptoms of depression, are also common symptoms of autism. And autistic people may not be able to communicate the symptoms of depression effectively to therapists: again, autistic body language is not intrinsically comprehensible to neurotypicals, or even necessarily to others with autism. Furthermore, since many autistic people may have difficulty with language, describing feelings therapists would recognize as depression may pose difficulties. Even high-functioning autistic people, who often have high fluency of language, may nonetheless have difficulty identifying and describing their emotions.

And so we come to Moore’s suicide note. A number of people who’ve read his note have commented that it didn’t sound like a product of depressed thinking. But that’s the thing about depression: it convinces you your thoughts are rational. They’re not. It’s a product of being in such extreme pain for so long that you stop noticing it. It’s like the proverbial frog in simmering water. It occurs gradually, so you don’t notice it, and since you’re no longer conscious of the pain, you assume depression isn’t controlling your thoughts anymore. But of course, it is. It doesn’t really feel like pain any longer – it’s more like the absence of pleasure, the absence of anything. People think of depression as an extreme case of sadness, but that’s wrong. It may start out that way, but over the long term, it’s better considered a case of anhedonia, the inability to obtain pleasure or fulfillment from activities that once provided them. And with that inability, your entire sense of purpose goes away, and that’s where the illness becomes truly dangerous.

One of the most insidious things about mental illnesses is that they convince you you’re completely in control of your mind and looking at things rationally, when of course, your thoughts are about as far from rational as possible. But Moore’s note was written in a calm, matter-of-fact tone, as though he were a detached observer of his own life. He felt he had no reason to be depressed, so he couldn’t have been depressed, right? But depression doesn’t work that way. You can have every advantage in life and still be depressed. That’s not a personal failing. It’s simple biology.

It’s not entirely clear what causes depression; it may be partially a product of external circumstances and partially a chemical imbalance in the brain. There’s another theory that it’s simply a malfunctioning of the brain’s fight-or-flight response, since the first occurrence is often linked to a specific event. A ‘switch’ in the brain gets stuck to ‘flight’, and the resultant yearning to hide from the outside world often has little to no connection with events in the depressed person’s life.

When I was depressed, improvements in external circumstances alone didn’t always alleviate the problem. My relationship alleviated my depression for a time, but flooding the brain with oxytocin and other hormones associated with love can do that. It very likely altered my brain’s chemical composition during the relationship, but once it ended, I fell right back into my depression. Getting my bachelor’s degree didn’t even register in my mind. My self-perception was so negative that I couldn’t perceive what a huge accomplishment it was.

For me, the key paragraph in Moore’s suicide note is the one where he discusses society:

I didn’t “fit” in society. That isn’t a problem of society. Setting aside moments of petulance, I viewed it as a plain fact. There it was. What to do about it? Ask society to adapt to me? Hah!

Yes, actually, society should have adapted to Moore. Society is profoundly cruel to people who don’t fit in, and Moore was a victim of that. Paradoxically, our culture is both extremely individualistic and extremely intolerant of eccentricities. It’s an awful place to be introverted: you’re expected to put on a smile and be ‘positive’, 24/7. Wanting solitude is ‘weird’. Who needs introspection? Silence makes people deeply uncomfortable.

Society frequently expects those who don’t conform to its norms to adapt to them without asking if those norms are even healthy for them. For that matter, it rarely even considers if those norms are healthy for those who do conform to them. I’ve repeatedly been asked to alter my communication to suit others’ preferences. Never once in my life have I seen others asked to alter their communication to suit my preferences.

My second depression, which lasted for around a decade, was hardly improved by spending most of it at a job that considered my own communicative preferences intrinsically ill-suited to it and directly blamed me for having them. To be fair, I don’t expect a retail business to be a driver of social change on behalf of introverts. But at the same time, I don’t see anyone else in society fighting for that change, either. Needing time alone should not be considered an intrinsic personal failing, yet frequently, I felt as though it was being treated that way.

Being told, for your entire life, that you’re at fault for personality traits that aren’t intrinsically harmful to others is exhausting. Moore was a plain, clear writer, as even his suicide note shows, and it makes clear how thoroughly he’d internalized that blame:

Being a misfit manifested itself in two broad ways over the course of my life: (1) far too often I angered, insulted, offended and otherwise upset people, without expecting or intending to, and (2) I rarely felt that I was successful explaining my ideas, perceptions, understandings to others. […]

The best way for me to articulate why I valued honesty is that it hurt to lie. White lies (told to spare another’s feelings) hurt. As Holden Caufield puts it, being “phony” hurt. […]

Small talk is a hugely important social lubricant. Intellectually I came to understand that. But emotionally I could not deal. […]

Over the years I came to understand myself as adopting a tone that has been described to me by various women in my life as “that tone,” “obnoxious” or “condescending.”

Sometimes I recognized what they were referring to. But, and this is the difficult part, I very frequently did not. Indeed, my ex-wife had to put up with more than a decade of me responding very defensively when she would make that observation. […]

I was tired of pissing people off, especially when I did not expect to or mean to.

Given how lucid and clear his writing is, I have to conclude that the problem wasn’t with his verbal communication. It must have been with everything else.

Most of us, when we are children, are very, very different from other children. We quickly learn to start hiding those differences, because our society is very, very intolerant of outsiders. Children have to learn that somewhere. They don’t intrinsically know racism, misogyny, queerphobia, and other exclusionary attitudes. They learn them from adults. And I suspect their intolerance of children with behavioral differences comes from the same place.

So we learn to start hiding our differences – we act more like the rest of society in an attempt to fit in. There isn’t much choice; if we didn’t, we’d continue to be bullied, and no one wants that. Even we want human connections. (You can actually see this in Moonlight, though it’s not explicitly commented on: Chiron acts ‘off’, as a child, in a manner that isn’t purely explained by being gay in a homophobic society or black in a white supremacist one. As he ages, his behavior comes to resemble that of ‘normal’ adults more closely. It’s not clear if Chiron is intended to have a disorder of some kind: if he is, the film doesn’t comment on it. But this is a pattern with autistic children: many are quickly taught to hide their most obvious ‘tells’ from their peers.)

But this is a double-edged sword. Because outsiders can’t tell we’re autistic, they assume they’re reading our nonverbal communication correctly. This is not, in most cases, a correct assumption.

93% of communication is said to be nonverbal. We don’t intrinsically understand neurotypicals’ nonverbal communication, and most of us understand that we don’t intrinsically understand it. We know there’s something there that we’re not getting, and it’s frustrating. The most obvious tells, like body language and voice tone, can be taught to a certain extent, but there are intangibles. Neurotypicals appear to be able to look into each other’s eyes and glean information about each other’s emotions in that manner. If there’s a way this can be taught to us, I’m not yet aware of its existence.

Conversely, as I’ve said, neurotypicals don’t intrinsically understand our nonverbal communication, either, and often assume they do intrinsically understand it. 93% of what we intend to communicate is misunderstood by people who don’t even understand that they’re misunderstanding it, who then blame us for their failures to communicate with us. If fault should be attached to any party, it should be shared, since communication is two-way. You can’t blame a person who speaks only English for their inability to communicate with someone who speaks no English: the problem is one of ignorance. It’s as if, every time we said the word ‘horse’, the outside world heard it as ‘dog’, and blamed us for their reaction to the word ‘dog’. But it’s not our fault that others perceive our nonverbal communication incorrectly. The problem is again one of ignorance, and the ignorant parties are frequently so ignorant that they aren’t even aware they’re ignorant of something. I can’t even begin to describe how exhausting this is. It affects every single in-person interaction I have with others, and probably ultimately explains why most of my best friends now are also autistic: they’re the only people who consistently won’t assume they’re interpreting my nonverbal communication correctly.

Moore was exhausted by social conventions that personally bothered him, and he blamed himself for being bothered by them. The possibility that others could be even partially at fault does not appear to have even registered with him. If blunt honesty bothers society, but dishonesty bothers an individual, is the individual at fault, or is society at fault for considering dishonesty beneficial? And if an individual, without intending offense, is repeatedly misinterpreted as being offensive, but is never given a comprehensible explanation for what is causing the offense, then is the individual at fault for communicating in a manner others find offensive, or are others at fault for not being able to explain what has been causing offense and how to stop doing it?

The unspoken assumption here is that there’s a right way and a wrong way to communicate, and that a person who can’t learn the right way is to blame for it. The idea that one form of communication may not suffice for everyone in society and that others may need to learn to communicate with those who don’t share their preferences hasn’t even been considered. Those who don’t communicate in the preferred fashion are simply blamed for not doing so. It’s their fault for not learning how others communicate; it’s not others’ fault for not learning how they communicate.

And this is ultimately a large part of the reason it’s so painful to come to terms with one’s identity as autistic. Autism is an identity issue, just as ethnicity, sexuality, gender, and similar issues are. Yet it’s rarely seen as one in society. So much of what comes naturally to others in society isn’t natural to us. Body language, eye contact, verbal tone, facial expressions, unspoken social conventions: we’re blamed for not understanding them. Everyone else gets them naturally; what’s wrong with us? And yet most of this is simply a product of the fact that our brains work differently. Our body language is frequently different from others’. Yet we’re blamed for not understanding others’ body language, and we’re blamed when others don’t understand ours. We’re blamed when we misinterpret others’ words, and we’re blamed when others misinterpret ours.

And, in so many cases, this is a professional obstacle. The ADA is supposed to protect us, but there’s little understanding of the unspoken societal assumptions that hold us back. As I said, my first employer blamed me for their miscommunications with me; they don’t even seem to have considered that they simply weren’t communicating in a fashion I found comprehensible, or they considered its incomprehensibility my fault. This seems to be a recurrent pattern for autistic people; it certainly happened in Moore’s case. As I said, he’s clearly an incredibly fluent writer; an inability to understand language is not the issue here. It appears to be tone, body language, and all the other nonverbal aspects of communication that come naturally to others. We have to learn 93% of the parts of communication that others understand naturally. And it’s not taught in school. The resources for learning this aren’t widely available. I will openly confess to still not understanding much of it.

But at the same time, if we don’t understand others’ 93%, why do others assume they understand our 93%? The idea that the misunderstanding could be mutual never registers with others. As I’ve described in detail, our body language and even voice tones can be wildly different from neurotypicals’, who rarely ever consider that we may be communicating wildly different messages than they assume we are. Others simply assume they understand what we intend to communicate nonverbally without questioning whether they assume correctly. As I’ve said, this is exhausting.

By autistic people’s standards, Moore’s life was a resounding success. Most of us have trouble even maintaining employment. Moore was a highly respected political scientist whose work was widely recognized in academia and who made an incalculable impact on multiple generations of his students, many of whom considered him one of their most significant mentors. His work in studying the denial of human rights, repression, dissent, and similar topics was regarded as pioneering in the field. Yet he ultimately felt like he had nothing further to offer the world, and that his life had been a failure. Because he sometimes annoyed others, he seemed to think his entire existence was worthless.

And yet, apparently, no one was ever actually able to provide him a satisfactory explanation of why he annoyed others. What’s particularly ironic is that, if there had been more autistic people in his life, they may actually have had a better chance at picking up on it. One of us probably wouldn’t have assumed we were correctly reading either Moore’s or others’ nonverbal communication, and perhaps would have been able to identify factors common to the conflicts.

And I’m not sure I actually have words to express my condemnation of academia’s institutional treatment of mental health issues. The milder ones are insufficient to describe the outrage it causes me, and the stronger ones are either obscenities or might seem like hyperbole. I don’t intend either. According to a study by professors at Ohio State University and the University of Delaware, other professors cited, verbatim, the following reasons for not disclosing mental health issues to colleagues:

And there were many others like these. This isn’t merely a case of widespread violations of the ADA. This isn’t merely a case of widespread indifference. It’s a case of widespread malevolence: widespread, active hostility to people who are already marginalized in society and already face far more issues than the general populace. And it appears to be a product of institutional culture.

Society caused Will Moore’s suicide. And, as I’ve said, attempted suicide rates are indeed about four times higher among autistic people than they are among neurotypicals. It’s impossible for me not to conclude that society causes a lot of autistic people’s suicides. Until we improve our society, this will continue. I personally regard this as ethically a collective murder by indifference. I don’t expect this ever to be legally prosecuted, but I hold society collectively responsible, and changing these forms of behavior will be a central concern of my life going forward.

Again, I have few complaints with my own life right now. But society treats us horribly in general. This needs to improve.

Back to top · Table of contents · My portfolio · Contact me · Website index

Other marginalized groups

There are, of course, dozens of other groups who are marginalized in various ways and whose stories are rarely, if ever, told in popular media. A few examples, by no means intended to constitute an exhaustive list or to be construed as equally problematic, include:

People of non-binary gender

While queer representation in media has improved vastly, almost all media representations of transgender people still fit the gender binary. Trans people arguably remain America’s most marginalized population; they’re far likelier to suffer unemployment or mental disorders than cisgender people, and are murdered and physically attacked at startlingly high rates (for example, it’s estimated that up to two thirds of trans people are sexually assaulted at some point, compared to about one in six American cisgender women and 3% of cisgender men). Out non-binary people receive even worse treatment than trans men and trans women, and it’s not even exclusively from cisgender people – just as bisexuals have sometimes faced prejudice from gays and lesbians as well as from straight people, non-binary people have sometimes faced prejudice from trans men and trans women as well as from cisgender people. The almost complete lack of media representation of non-binary people has many consequences, starting with the fact that they may not even come to terms with their identities for decades purely due to unawareness that gender is a spectrum.

The mentally ill

Often, gun control opponents engaging in whataboutism point to the mentally ill, as though treating mental illness will reduce violence. (At the same time, Republican politicians rarely actually support programs to treat mental illness.) In point of fact, the mentally ill are overall vastly likelier to be victims than perpetrators of crimes, with only a few specific mental illnesses providing exceptions. Despite this, violent mental illness remains a common element in popular portrayals of the mentally ill, which almost certainly contributes to continued stigmas against mental illnesses. A large fraction of the American public will suffer at least one mental disorder at some point, and stigmas against mental disorders cause widespread harm in numerous ways, most notably by reducing the chance that people will receive or even seek treatment – thereby greatly increasing the disorders’ harm.

Additionally, unemployment rates amongst the mentally ill are terrible. It is estimated that up to 80% of people receiving public mental health services are unemployed.⁽³⁵⁾ Employers also hire the physically disabled more often than people with mental disorders, and some have openly admitted they’re less likely to hire people with histories of psychiatric diagnoses, casting doubt on disability quotas’ effectiveness at improving the mentally ill’s unemployment rates. Much of this behavior directly contravenes the Americans with Disabilities Act. I specifically addressed academia’s treatment of mental disorders above, which extends into the overtly malicious. I suspect that many other fields are not much better.

As indicated above, I’ve suffered depression before, often for what felt like interminable numbers of years. I would definitely consider it a mental illness in the most literal sense of the term: it is a case of the brain not functioning in a healthy manner, often caused by chemical imbalance. I was able to keep my job through it for a while, but I was underemployed for most of that time, and I ultimately got an ultimatum: quit or most likely be fired. Between autism and depression, I sort of feel like I had a double whammy there.

I’ve suffered other mental illnesses as well. I haven’t gone into detail about all of them, but I intend to explore one specific illness, depersonalization-derealization disorder, later. In brief, the disorder makes me feel as if I’m not a real person (depersonalization) and as if nothing else I’m experiencing is real either (derealization). It occasionally causes other perception issues as well, including desomatization (the sensation of leaving one’s body) and a severe distortion of one’s perception of time. These are not psychoses; we remain aware that the issue is with our perception rather than reality. While almost everyone experiences occasional episodes of depersonalization and derealization, only an estimated 1-2% of the populace experiences them severely enough to qualify as a disorder. As a result, the disorder is poorly understood and poorly represented in popular culture; the only popular representation I’m actually aware of is the rock band Counting Crows, whose lead songwriter and vocalist suffers from it.

The mentally and physically disabled

I’m not entirely sure what a society that didn’t marginalize the mentally or physically disabled would even look like, but it certainly wouldn’t resemble ours. There are, naturally, some tasks and forms of employment that require a given amount of mental or physical aptitude to perform, and no one seriously expects this ever to change. However, the president* infamously mocked a disabled reporter, falsely linking mental and physical disabilities and deriding people who possess either (or both). Several terms denigrating people possessing these disabilities are routinely used as casual insults in our culture – as, for that matter, are insults towards mental health; I myself once used many insults I now regret, and on occasion still have to stop myself from dismissing an idea as ‘crazy’ or ‘stupid’. (I don’t always even succeed.) Language shapes our thinking about the world, and careless usage can encourage sloppy or even outright malicious thought. People who believe these terms don’t negatively impact people who suffer from these disabilities are sadly mistaken. Even if the disabled people aren’t capable of understanding the insults, they’ll affect those who love them, and they’ll affect the people who say, hear, write, and read them.

My issue isn’t merely with the meanness of the insults. It’s the fact that they rarely even accurately describe what they’re being used to describe. For example, the concept of intelligence, insofar as it has any useful meaning at all, indicates the biological capacity of one’s brain to absorb new information. (Psychology has long understood that there are multiple intelligences, so IQ is a vast oversimplification.⁽³⁶⁾) However, this isn’t the only factor that inhibits learning. If a person believes that they already know everything there is to know about a subject, they’ll be ineducable. This is a consequence of willful ignorance and has nothing to do with intelligence. Indeed, many people who are experts in one field are guilty of this in others, such as Ben Carson, a gifted brain surgeon who separated conjoined twins yet is also a creationist. It is possible that expertise in one field may actually make a person dangerously susceptible to this problem in others: having mastered one field, the person may assume that others should be easy to master. This strikes me as a dangerously mistaken assumption.

The word stupid is often used to insult someone who has wrongheaded or foolish beliefs. Those have nothing to do with intelligence. They’re usually a case of ignorance and being unwilling to learn, which is much worse. People can’t control their intelligence, but if they’re capable of learning and refuse to do so, that’s arrogance, and it’s a personal failing. It’s a much worse ethical lapse than stupidity. As I write this, possibly smart, certainly willfully ignorant people are doing incalculable damage to humanity and the planet through their policies of environmental deregulation. Their intelligence or lack thereof isn’t the problem; their ignorance (or, less charitably interpreted, malice) is. We’re not even describing the problem correctly by dismissing these people as stupid, and we’re tarring a community with the problem by association that ultimately has nothing to do with it.

Tabletop role-playing games such as Dungeons & Dragons provide a more useful way to think of this: INT vs. WIS (i.e. intelligence vs. wisdom). To oversimplify substantially (and with the caveat that I have fairly limited experience with tabletop RPGs), we can think of INT as book smarts and WIS as street smarts or life experience. In short, many statements phrased as targeting others’ intelligence are meant to target their wisdom.

The ancient Greeks also had a better way to deal with this. I have several major goals for this writing, but a minor one is to help revive the Greek word amathia (ἀμᾰθίᾱ, amăthíā)⁽³⁷⁾, a useful concept with no exact English equivalent.

[The common translations lack of education, ignorance, and stupidity are not entirely accurate. The most accurate translation may also be the most literal: not learning. In some contexts, I’ve translated it as refusal to learn; other reasonable approximations I’ve seen are willful ignorance and intelligent stupidity. The latter’s apparent oxymoronicity is a product of our English terminology’s imprecision; “high INT, low WIS” is the obvious D&D metaphor.

Amathia is not the only ancient Greek word often translated as ignorance. The other is agnoia (ᾰ́̓γνοιᾰ, ắgnoiă). Again, translating it as ignorance is an oversimplification: it literally means not knowing.

The distinction is important. The English word ignorance usually implies a negative ethical judgement. The Greek words both convey more nuance and more directly address the root of the problem. Agnoia is an ‘innocent’ form of ignorance, since it is our natural state: we are all born without knowledge. Amathia can sometimes be caused by an inability to learn, but more frequently it is caused by a refusal to learn. Only the latter case merits a negative ethical judgement.

For the rest of this book, I will refer not to ignorance, but to amathia and agnoia. –Future Aaron]

Education is no panacea for amathia; the president* is a particularly relevant example. He has crowed before about how he attended “an Ivy League school” and Wharton, but his amathia about large areas of human knowledge stems from both his demonstrated unwillingness to learn and his belief that he knows better than experts in the fields (“I know more about [Daesh]⁽³⁸⁾ than the generals do; believe me”⁽³⁹⁾). He ascribes little value to others’ knowledge; he behaves as if he already knows everything of importance about anything. Furthermore, he behaves as though his amathia is universal (“Nobody knew healthcare could be so complicated,” “People don’t ask that question, but why was there the Civil War?”). Perhaps most damningly, despite having it repeatedly demonstrated that there are, in fact, things he doesn’t know (such as how complicated healthcare is), this doesn’t appear to have changed his behavior in the slightest; he still attaches little value to experts’ judgments.

The strong version of the Sapir-Whorf hypothesis may be considered discredited, but my experience is that the usage of words shapes how they are perceived. If stupidity is colloquially used to mean amathia, then people of lesser intelligence will come to be perceived as willfully ignorant by association, even if no one is consciously making the association. Many people of lesser intelligence nonetheless possess a desire to learn, and we should commend them for possessing this desire. So I suggest replacing stupid as a colloquial insult with foolish or willfully ignorant, depending upon which is most appropriate. And stupidity, in many cases, should be replaced with something more precise, like amathia.

Similarly, the word crazy is often used to insult extreme political beliefs with which a speaker disagrees. I’ve certainly known people with mental illnesses who had extreme political beliefs, and my own political beliefs would almost certainly be considered extreme by many with opposing political views, but such political beliefs are the exception rather than the norm amongst those with mental disorders. Some more fitting insults, depending upon the beliefs, include cruel, naïve, and, well, wrong. Crazy, again, isn’t correctly identifying the problem, and it’s tarring by association a community that deserves no such association.

I can’t even begin to fathom what mentally disabled people’s lives are like – I can at least imagine what other marginalized people’s lives are like, even if I almost certainly don’t fully understand all the degrees of marginalization they suffer – but I’ve at least accepted that I can’t comprehend what it’s like to be unable to comprehend what I comprehend. I could as soon comprehend nonexistence (which I also don’t and can’t comprehend).

Our society is profoundly callous to people suffering from mental and physical disabilities, and since many of them can’t even advocate for themselves, it’s up to others to do so. Given that public funding is under constant threat under the current administration*, it’s not even a given that those unable to care for themselves will be continually cared for by professionals going forward, and it’s crucial that we continue to make certain that they are.

The non-monogamous

It’s difficult to know how much of the public consensually engages in non-monogamy, as continuing stigmas naturally cause many such people to be secretive about doing so.⁽⁴⁰⁾ However, it’s certainly more of the public than one would expect from looking at our media. An ongoing debate amongst biologists is whether monogamy even exists in nature. Nearly all species once thought to practice monogamy have been observed engaging in infidelity, which has led more radical advocates of non-monogamy to argue that humans aren’t naturally monogamous at all. This isn’t a view I share; in all likelihood, some people truly are naturally monogamous. But we also share about 98% of our DNA with bonobos, who are among the most promiscuous members of the animal kingdom, so many probably aren’t.

I believe that some people, whom for ease of explanation I’ll call “natural monogamists”, will never be satisfied with non-monogamous relationships, as they’re too prone to jealousy; some people, whom I’ll call “natural non-monogamists”, will never be satisfied with monogamous relationships (perhaps their thirst for sexual or romantic variety is larger than one person can satisfy); and some people, who can be considered “flexible third parties”, can be satisfied with either. The healthiest outcomes in relationships generally occur when people are honest with themselves and their partners about their needs and desires. It follows that natural monogamists and natural non-monogamists are incompatible, since they want or even need different things from romantic relationships, and thus members of the two groups should under no circumstances form relationships with each other. Unfortunately, many people may have a difficult time coming to terms with this aspect of identity, since it may require considerable experience with relationships to understand, and society’s expectation remains that relationships will settle into comfortable patterns of monogamy. Natural non-monogamists won’t know not to date natural monogamists if they don’t realize that monogamy isn’t suited to them. This can, and very often will, result in serious emotional harm to one or both parties when those relationships fail spectacularly, as they probably will.

This is worsened by the continued rarity of sympathetic portrayals of non-monogamy in the media. A particular pet peeve of mine concerns fictional love triangles. In quite a few real-life love triangles, all individuals concerned would derive the greatest amount of happiness if the ‘hypotenuse’ of the triangle formed relationships with both partners. (If the orientations and genders of the participants are compatible, perhaps all three can even form relationships.) This is rarely even acknowledged as a possibility in fiction, much less used as a resolution.

Discrimination against non-monogamists does occur. For instance, children have been removed from parents’ custody for the latter’s non-monogamy, and employers can legally fire employees based on their relationship style. Non-monogamists have also experienced housing discrimination, been denied visitation rights in hospitals, and even been diagnosed as pathological by alleged mental health professionals on the basis of their lifestyles; moreover, the tenuous legal ground for non-monogamists often leaves them open to blackmail. (And now I’ve made monogamy and its derivatives stop sounding like words. I’m sorry.)

People on the asexual and aromantic spectra

Some people simply have little or no desire for sex and/or romance. This is rarely even addressed in fiction, and asexuality and related topics remain greatly misconstrued. Often, such people are simply told, “You haven’t met the right person yet”, or asexuality is assumed to be a cover for some other identity of which a person is embarrassed. Alternatively, people on the asexual/aromantic spectra who might enjoy sex or romance, but only under the right circumstances with the right people, are frequently told their standards are too high; others don’t accept that they simply don’t find sex or romance as important and can be happy without them. Treatment also gets far worse, up to and including housing and employment discrimination and, horrifyingly, ‘corrective’ sexual assault. Studies also suggest that asexuals are seen as less human in multiple ways, being perceived simultaneously as machine-like (emotionless, mechanical) and as animal-like (impulsive, unrestrained, unsophisticated) – two characterizations that flatly contradict each other.

I rather suspect I’m on both of these spectra. I’m certainly not completely asexual or aromantic; I greatly enjoy both romance and sex. I just don’t have as much interest in either as I suspect most people do (I’m also usually much more interested in sex as an abstract concept than in sex with a specific person), and I’m quite happy right now without either.

Sex workers

(Note that trans people, being disproportionately unemployed and underemployed, are disproportionately represented in this sector.) I doubt any other profession faces more societal prejudice than sex work, even though the euphemism “the oldest profession” suggests widespread awareness that it has always existed and is highly unlikely to disappear. Sex workers are assaulted and murdered at horrifying rates, and when they are victimized, their work’s legal status makes justice severely difficult to obtain. There’s also a horrifying amount of human trafficking related to sex work, no doubt worsened by the disinclination of clients and coworkers who may suspect it to report it (or, for that matter, any other crime they may witness or suspect, up to and including assault or murder) for fear of self-incrimination.

One particular oddity is that paying for sex is, outside of parts of Nevada, illegal… unless it’s filmed with the proper records kept; then, it’s legal pornography. I’ve never heard a sensible explanation for this. Some people cite adultery, but prohibiting sex work doesn’t stop this; it just creates a black market. Others argue that circumstances shouldn’t drive people into sex work, but prohibition doesn’t stop this, either; if anything, since victimized sex workers have fewer legal remedies, it exacerbates their problem. (It also ignores the many sex workers who enjoy their work.) Perhaps the fact that numerous workers in numerous professions hate their jobs should be considered an indictment of our economic system, but curiously, opponents of sex work rarely fit this into the larger equation, preferring to treat sex work as sui generis (while, in many cases, still doing little to actually help victimized sex workers).

We should listen to sex workers’ suggestions about solving their problems.⁽⁴¹⁾ Most of them want their work legalized and regulated; it has always been with us and always will be. Legalizing it isn’t a surefire way to eradicate human trafficking; many places with legal sex work, such as the Netherlands, still have serious trafficking problems. But the “Nordic model” (legalizing the sale of sex while leaving its purchase illegal) disincentivizes clients who suspect such problems from reporting them. Legalizing their profession would allow sex workers to report crimes without fear of self-incrimination, and, for the same reason, to find sympathetic witnesses willing to testify more easily. (Of course, unsympathetic law enforcement might still be a problem; we may have to change that through social pressure rather than legal reform.)

Current and former adult film stars

People who have ever starred in pornography can be legally discriminated against purely on that basis, up to and including being fired from jobs, denied loans, or denied housing. Similarly, stars who are outed are often horrifically abused, up to and including threats of rape and death, and their families are often not spared this treatment either.

We have a strange prudishness about pornography in this society: we’re the world’s largest production center for it several times over; almost all of us watch it; almost none of us admit to watching it; almost none of us talk about it. Our silence about pornography has a number of incredibly deleterious effects on our society, ranging from its role as teenagers’ primary source of practical information about how to have sex (a subject on which it is not always a reliable guide) to the often horrific treatment of those who make it. I will examine these effects in a later segment of this writing.

Drug users

I’m saying this as one who’s smoked marijuana probably twice and has no interest in trying hard drugs of any sort: recreational drugs’ illegal status benefits only private prison companies. Prohibition not only doesn’t stop drug abuse; it arguably makes it less safe. A number of seriously harmful drugs like heroin are actually more dangerous due to their illegality. A consequence of alcohol prohibition was that bootleg liquor contained hazardous chemicals that killed many people. The same thing occurs today when opiates are laced with fentanyl, which is cheaper, stronger, and far likelier to cause overdoses. (It killed Prince, for instance.) Black market heroin’s inconsistency further increases the risk of overdose, since no one is certain how much they are taking. I’m obviously personally unfamiliar with the subjective experience of opiate abuse, but it appears unlikely that anyone would intentionally overdose unless they were suicidal (another thing prohibition isn’t stopping); accidental overdoses are much rarer when users know the consistency of their drug of choice.

Many illegal drugs have damaging long-term effects, of course, but prohibition doesn’t solve these either. It simply makes users turn to the black market, which, again, is unregulated and is subject to all kinds of violence. Because dealers have no legal protection for their product, they have to defend it with often lethal force, which is unnecessary in legitimate markets. Drugs’ illegality also makes ex-cons who use drugs likelier to associate with criminal figures, as their criminal records make legitimate employment harder for them to obtain. This also makes escaping the cycle of drug abuse more difficult. (There’s also a racial element to drug prohibition, which Moonlight directly addresses. I further explore this in the next section.)

Since many addicts harm others close to them, addiction isn’t always a victimless crime, but prohibition doesn’t help those people, either. Putting people with medical problems into a criminal system actively reduces their chances of receiving the medical help they need. We now know that addiction is a disease. When it’s treated like one, more addicts recover.

An additional problem with drug prohibition is that it makes psychedelic therapy inaccessible to many populations that could benefit from it. The idea that psychedelics can be of therapeutic value is hardly novel, but recent scientific investigations have provided increasing evidence that it may be correct. Their use remains poorly understood, but tests suggest that if combined with therapy in the right settings, drugs such as MDMA, LSD, mescaline, psilocybin, and DMT can be of incomparable benefit in treating ailments such as PTSD, depression, anxiety, OCD, and even addiction. However, these drugs’ ambiguous legal status (they’re technically illegal, but law enforcement has so far mostly ignored their use in therapeutic settings) means that these therapies are largely unavailable to poor and minority communities.

Drug use is also often self-medication: addiction and mental disorders are commonly comorbid. In short, simply attempting to wean a person off of drugs may not solve the underlying problem; the mental disorder that drove them to drug use is still there. This is one of countless examples of how different forms of marginalization intersect (both also commonly intersect with homelessness).

Portugal decriminalized all recreational drugs in 2001. Instead of going into the criminal system, drug offenders go before a panel. Most cases are simply suspended; repeat offenders may be given treatment ranging from counseling to opiate substitution therapy. With problem users less afraid to seek care, all measurable addiction statistics have plummeted, as have new HIV/AIDS infections and other statistics linked to drug abuse. Decriminalization and regulation work, and they’re definitely a more sensible approach than the black market we have now.

Identity issues are class issues

This also relates to a key complaint of mine: the idea that “identity politics” are separate from “class politics”. In the same way that all art is implicitly political in that it reflects ways of thinking about the world,⁽⁴²⁾ all politics, ultimately, are both identity politics and class politics: a number of identities are strongly correlated with economic class and, indeed, were foundational in constructing our ideas of class to begin with. I’ve directly addressed how many of the above groups are marginalized in ways directly related to their economic status, and this goes far beyond them. Consider some further examples.

Certain misguided faux-leftists and faux-liberals say we should stop agitating on any issue that isn’t a “class issue”. What they are actually asking us to do is cease agitating on any issue that isn’t a cisgender, heterosexual, able-bodied, able-minded, Christian white man’s class issue. Even today, some feminists have been criticized for blind spots about working-class women and women of color; those women argue in reply that a feminism that doesn’t fight for all women is not their feminism. A similar argument can be made against a class consciousness that excludes the class experiences of queer people, the mentally ill, ethnic and religious minorities, women, and other marginalized groups. A left that doesn’t fight for the dispossessed of all stripes is a left with fatal blind spots.

Moonlight’s societal critique (Continued)

Not all of the above groups are represented in Moonlight, but one of its many points of beauty is its openness to interpretations that read its characters as members of some of them. For example, one can read Chiron as having a disorder that affects socialization, perhaps even autism. He gets better at hiding it as he gets older, but he’s awkward as a child in a manner that isn’t fully explained by being gay or an outsider. He can also be read as being on the asexual spectrum; there are several possible interpretations for why he wouldn’t have been intimate with anyone besides Kevin, including shame about his sexuality, but another is that he simply doesn’t develop sexual or romantic attractions frequently. The film doesn’t linger on any of these character elements, if the writers even consciously placed them there; they’re simply treated as normal, not as sources of shame.

Moonlight’s critique of society also relates race and sexuality to economics. The adult Chiron has few legitimate career options as a direct result of his teenage incarceration. And, again, we see a critique of our racially biased school and justice systems; it’s far less likely that a white teenager in a well-off suburban district would be tried as an adult, convicted, and incarcerated for Chiron’s actions. Ex-convicts leave correctional institutions with criminal records that make legitimate employment all but impossible to find. They still need money, and their only option may be the black market, where their criminal records may even be advantageous. This traps racial minorities, the poor, the mentally ill, and other marginalized groups in a revolving door of incarceration and crime, which further intensifies poverty’s gravitational field and makes it even harder to escape.

And there’s also a critique of toxic masculinity. Chiron’s breaking point is a direct consequence of bullying that directly enforces heteronormative ideals of masculinity and has been ongoing for years. Naturally, he responds decisively with violence; nonviolent resistance only works with cameras shining on the oppressed, and there are none to be found; moreover, Chiron’s life experiences have led him to expect the system to do nothing for him, and he may not even have been taught about nonviolent techniques of resistance. The social worker who offers to help him seems entirely well-meaning and earnest, and perhaps she truly would try to stop the bullying, but Chiron has no reason to expect this to be a permanent solution, and good reason as well to suspect the authorities’ involvement would only worsen things in due course.

For far too long, bullying wasn’t treated as the grave threat to children’s lives that it often is, and the number of queer teens who die by suicide after prolonged bullying is an acute national disgrace. (This is another reason I’m thankful for works like Moonlight: children need reassurance that their lives will improve.)⁽⁴³⁾ Moonlight lays bare what a breaking point bullying can be, and Chiron’s reaction, while incredibly destructive, is also a moment of catharsis for the audience. Unfortunately, it reflects a rather brutal truth about society: often, bullies only back down when subjected to greater force. Changing that may ultimately be our job; if we create a society where violence isn’t a valued trait of masculinity and people aren’t excluded due to their identities, perhaps scenes like this one will feel like remnants of a strange past, the way discussions of feudalism sound today.

In exploring this territory, Moonlight also addresses a topic recent fiction has often been accused of largely ignoring: the nature of power. Moonlight critiques economic power by examining poverty and its causes, condemning economic power disparities for directly causing untold amounts of human misery and highlighting the fundamental injustice of children growing up in the conditions it depicts. Kevin works hard for decades, but still has a low-wage job and a precarious financial status. He’s portrayed as ultimately happy, but is this a just outcome or an optimal distribution of social rewards? Surely he’d be even happier if fewer factors caused him anxiety. Moonlight also depicts Chiron’s economic disenfranchisement due to his conviction as a minor, reflecting the sentencing disparities that unduly affect minority communities and perpetuate poverty within them.

Moonlight’s critique of heteronormative social order encompasses disparities in social power: boys who conform to society’s idea of heterosexual masculinity have social capital over boys who don’t, and abuse it in horrifying, even life-threatening ways. Moreover, when the abused fight back, the system incarcerates them for doing so. The disparity in penalties may not even be an intentional enforcement of heteronormativity, but its consequences are no less harsh for that.

Moonlight addresses political power in its critique of the social system, seen above, and in its critique of disenfranchisement: ex-convicts are economically disenfranchised by being refused entry to many places of employment, and politically disenfranchised in many states, including Florida and Georgia – some don’t let even rehabilitated prisoners vote, while others impose elaborate requirements to regain the franchise that many will never be able to satisfy. Many people are disenfranchised by fundamentally unjust laws, and even just laws are often applied unjustly, with sentencing disparities one-sidedly applied to the poor and minorities; their more frequent incarceration results in less political representation, which further widens all the above power disparities.⁽⁴⁴⁾ Moonlight’s characters are dispossessed in myriad ways, and the fundamental injustice of their dispossession and of the power disparities that instigate and exacerbate it is (alongside its characters’ journeys towards self-acceptance) arguably the film’s central theme.

Moonlight’s handling of violence is almost unique among films that address the drug trade in that, apart from Chiron’s backlash against his bullies, almost no violence is actually depicted onscreen. This is not to say that violence is absent from the film; Juan is murdered offscreen, and the threat of violence lingers over the entire film (including the bullies’ treatment of Chiron). But another, more significant aspect in which it differs from other drug films is that it does not actually depict violence as a decisive solution. Indeed, while Chiron’s act of violence removed him from his bullies’ presence, its consequences changed his life in much longer-lasting terms, and not for the better.

This distinguishes Moonlight not merely from mainstream drug films but from mainstream cinema generally. I’ll return to this topic in much greater depth later in this book, but belief in violence’s efficacy is virtually a foundational myth of this country, and it can be found in most of our works of fiction from both the political left and the political right. Violence is depicted as a decisive solution throughout our culture. Ursula K. Le Guin is the only well-known American author I can think of who consistently depicts violence as negatively as Moonlight does. (For this and several other reasons, she’s also one of my favorites.) Outside the United States, violence is often treated far more ambivalently, and in this respect, Moonlight has more in common with foreign films: its few acts of violence don’t actually improve the lives of those who carry them out.

Ethics, sexuality, and shame

Another reason that Moonlight’s examinations of gender, sexuality, bullying, race, and similar issues resonate so strongly with me is that I believe much of our contemporary disillusionment comes from traditional sources of ethics proving themselves ethically compromised. Masses of people found their values in organized religion for millennia, but as many religious groups turn blind eyes to societal crises like sexual abuse, soaring poverty, and racial animus, many younger people are finding religions increasingly unreliable ethical guides.⁽⁴⁵⁾

Other sources of ethics may seem equally deficient. One might wish to look to the life of the mind, but with many public universities increasingly dependent upon private funding and the scandals of outright predatory for-profit schools, academia may no longer be a safe bet. Other sources out there are much, much shadier – pick-up artists, neoreactionaries, gun rights extremists, and other fringe groups online and elsewhere.

Right-wing critics have frequently accused Hollywood of not caring about ethics. This, to me, feels entirely off-base, particularly when films like Moonlight exist – or, to name an example of a film in a wildly disparate genre that takes a similarly uncompromising ethical stance, Logan. It’s not as though Hollywood has a single viewpoint to offer (the films of Clint Eastwood and the Wachowski sisters, to name two examples almost at random, feature wildly divergent worldviews), but insofar as this criticism is even coherent, it’s really saying Hollywood doesn’t share their ethics, which is an entirely different criticism.

I see Moonlight as part of an artistic movement that’s constructing a new code of ethics that may ultimately be more coherent than any we’ve known, based on the idea that causing unnecessary suffering is wrong, and reducing others’ suffering is good. I should note that many great world religions have been founded on this basis, including Buddhism and Jainism. Indeed, the Dalai Lama is possibly the only well-known religious leader I can name who still seems incorruptible (though I must acknowledge Pope Francis as well; while not perfect, he seems to be trying to drag Catholicism into the twenty-first century). [Thus far, Pope Leo seems to be cut from the same cloth. –Future Aaron] However, those lacking supernatural beliefs are unlikely to find solace in religions, which may also feature unpleasant cultural baggage.

There’s also, of course, the matter of sexuality, a source of particular friction between many religions and modern society. Many of the world’s largest religions came into being when human society was largely agricultural; Christianity in particular was founded when reliable contraceptives did not exist.⁽⁴⁶⁾ Religions that originated when sex was vastly likelier to cause pregnancy than it is now would naturally have far less permissive attitudes towards sexuality than modern society does, and the acrimonious culture war battles waged over sexuality as a result seem almost inevitable.

Casual sex is often ethically good

There are likely few issues about which we feel more shame than sexuality. Europeans often note that we allow extremely graphic violence on network television but panic over a woman’s exposed nipple. This absurd paradox in our society remains unresolved; we’re probably the most ambivalent Western populace when it comes to sex and sexuality. We’re one of the world’s production centers for pornography, and it’s widely viewed across the country; porn consumption is perhaps one of the few things red states and blue states still have in common. However, many people who watch it would prefer to pretend that it didn’t exist, or that sex in general didn’t exist. That’s unfortunate, since it leaves discussions of sex and sexuality to prudish culture scolds and hypocrites; there’s a reasonable critique to be leveled at them, but few people are willing to level it. The result is a general sense of shame and confusion that pervades national attitudes about sexuality.

The best example may still be Rush Limbaugh’s comments from a few years ago expressing bafflement that liberals viewed consent as the crucial component of sexual ethics. The view that baffled him is actually not wholly wrong, though other concerns are also important, like honesty. Infidelity isn’t wrong because it isn’t monogamous; it’s wrong because it violates a partner’s trust. Protection and contraceptives are also often essential to avert disease transmission or unwanted pregnancies. Certainly, some people are too young to consent to sex; involving them is wrong.⁽⁴⁷⁾ And using sex to hurt others emotionally is surely wrong. But apart from those, the crucial piece of sexual ethics to me is, indeed, the participants’ consent. And if these standards are met, I not only can’t see any cases in which private sex acts between adults are anyone else’s problem, but also think such acts are good.

This is, to be clear, a tremendous break from traditional Christian doctrine on sexuality, and I’m advocating it even though I’ve never had casual sex and am not looking to start. (I’m clearly not much of a hedonist.) An awful lot of authoritarian beliefs are rooted in the idea of controlling women’s sexuality in particular. I can see why casual sex was discouraged when reliable contraceptives didn’t exist, but they do now and have for decades.⁽⁴⁸⁾ Regardless, the idea that sex outside relationships is good has yet to find wide acceptance in our culture. Yet sex often makes people feel better. It’s a way to make oneself and someone else simultaneously feel better. Isn’t that a good thing overall? People shouldn’t be condemned for doing this; they should be praised for it.

This goes beyond merely casual sex. Religion is still deeply divided over homosexuality, even though lay society mostly accepts it now; other lifestyles, such as consensual non-monogamy or BDSM, have not even reached that point. This is surely a point of friction with secular society, because while society may not approve of casual sex or other divergences from traditional expressions of sexuality, it at least generally tolerates them: outside (understandably) infidelity, consenting adults’ private sex lives are mostly no longer considered others’ business.

I’ll go beyond that, however: if the above standards (honesty, contraceptives, protection, and so on) are met, consenting adults’ private sex acts are not merely no one else’s concern, but a net good to society. Further: anything that makes people happier without adverse consequences to oneself or others is a net good. Suffering isn’t intrinsically virtuous; if you’re an adult, don’t enjoy celibacy, have a willing partner or partners also of consenting age, and none of you are violating the above standards, then your celibacy won’t make anyone happier. You only have so many years in your life; enjoy them. If you voluntarily undergo suffering, at least undergo it for a cause that ultimately helps people, like feeding the hungry or curing the sick.

I can, in fact, take this a step further. We frequently tell people that the true test of their character is not how they treat those to whom they are close, but how they treat those to whom they have little to no connection. I cited feeding the hungry and curing the sick as examples of helpful acts, and it’s difficult to find people willing to denounce these acts of altruism. (You can find such people, to be clear; many of Ayn Rand’s followers, for example, will denounce them, which is a major reason that Objectivism is a fringe philosophy.) “The hungry” and “the sick”, in these examples, are generally strangers to those who are helping them, and we applaud the latter for improving their lives.

Yet, at the same time, we don’t extend this to every aspect of human existence. If we’re serious about the idea that treating strangers well should be a foundational underpinning of our society, then the idea that, in essence, giving a consenting stranger one or more orgasms through sex should be anything but a praiseworthy act is indefensible. Otherwise, we’re actually treating consensual sex differently than we’re treating other aspects of life. (Certainly, society treats consensual sex differently from other aspects of life all the time. But should we?)

Pornography is an artistic genre

“People having sex on camera”

I’ll follow this up with a second radical proposition: Pornography is a form – more precisely, a genre – of art.

This may confuse many people on several counts. Objection one is bound to go something like: “What? It’s just people having sex on camera.” But this is, of course, an oversimplification. You might as well say stand-up comedy is “just people telling jokes.” Such a reductionist view of stand-up is flawed not merely because the medium moved far beyond mere joke-telling long ago, but also because of the elements of craft – set-up, delivery, and comic timing – that distinguish artistically successful stand-up comics from their peers.

Similarly, there’s much more to pornography than “just people having sex on camera”. Porn depicts idealized sex, which often differs substantially from how people usually have sex off-camera. Most professional porn is heavily edited so that clips build in intensity; filming can, in some cases, take place over several days, with various lighting and camera tricks employed to make it appear as though clips from multiple shoots occurred consecutively. While the sex often appears spontaneous, performers usually meet ahead of time to discuss and agree upon boundaries, and performers who violate these agreements these days often become pariahs in the industry.

They’re called ‘performers’ for a reason: while they generally are enjoying themselves on camera, and while they may incorporate elements from their real lives into their videos, they’re still playing characters. A good comparison to mainstream television might be American Crime, which features most of the same actors playing different characters each season. Porn performers keep the same name from shoot to shoot, but aren’t necessarily playing the same character from video to video, nor do they always have the kind of sex on camera that they most enjoy in real life. A performer may be a housewife in one video, single in the next; a virgin in one video, sexually experienced in the next. Some videos have surprising amounts of setup and characterization, while others have almost none. Some porn films have intricate stories and multimillion-dollar budgets rivaling those of Hollywood productions (Pirates [2005] actually had a larger budget than Moonlight). All of this can be analyzed using the same kind of literary analysis that has increasingly been applied to other genres of film and television in recent years (including in this very writing). There can be social commentary, political commentary, humor, bathos.

And plenty of other artistic decisions can go into the crafting of a porn film. Deciding what to include in a camera frame is an artistic decision. Deciding what clips to feature in what order is an artistic decision. Deciding what camera angles to feature in what order is an artistic decision. Deciding where to film is an artistic decision. Many of these productions are shot in beautiful houses with beautiful furniture and beautiful artwork in the background. These are surely artistic decisions. These works’ cinematographers can, in many cases, be extremely gifted professionals; Greg Lansky’s cinematography for sites such as Blacked and Vixen has earned him awards and comparisons to cinematographers for mainstream films and television shows.

Back to top · Table of contents · My portfolio · Contact me · Website index

“Intended to cause sexual arousal”

So, clearly, it’s not “just people having sex on camera”. The second objection, then, is likely to be that pornography focuses on sex, and that, unlike other media, it’s intended to elicit a reaction of sexual arousal, which somehow differentiates it from other art forms.

But this objection rests on an unspoken assumption: that sexual arousal deserves to be treated as somehow abnormal within the human experience, or that works intended to elicit a specific emotional state are somehow lesser works. To hold either belief is to ignore a significant part of the human experience, or to relegate it to lesser consideration. Almost all living humans experience sexual arousal. The belief that sexual arousal is somehow unworthy of serious artistic consideration reflects an underlying belief: that there’s something unworthy about sexual arousal itself. And that belief has consequences throughout society, most of them unpleasant; I will discuss some shortly.

Moreover, there’s nothing unnatural about artistic works trying to elicit emotional responses. Comedies endeavor to elicit laughter. Tragedies endeavor to elicit heartbreak and catharsis. We don’t pretend these aren’t art. Romances are sometimes denigrated in a rather misogynistic fashion, but they too endeavor to elicit emotional responses (we might note that Moonlight is, at its core, a romance and a coming-of-age story). Emotions are a part of human nature; why wouldn’t they be a part of art?

“I know it when I see it”

The third likely objection is that legal battles have been waged over whether works are legally classified as pornography or art. Ulysses being classified as not obscene was considered a major triumph for artistic expression. Justice Potter Stewart famously wrote that he wouldn’t and possibly couldn’t intelligibly define “hardcore pornography,” “but I know it when I see it.”

Stewart’s quote has been praised for its candor, but apart from its wit, I’ve never much cared for it. Sloppy language encourages sloppy thinking, and if we haven’t defined a concept, we’re not thinking clearly about it. Hardcore pornography is easy to define clearly: it’s an artistic genre whose focus is on explicit depictions of sex. It baffles me that anyone has trouble clearly defining it.

As a legal matter, it’s well worth separating film shoots that feature explicit sexual activity from those that do not. The agreements I spoke of above are established to avoid sexual assault. Clearly, films that do not feature explicit, unsimulated sexual conduct between their actors are less likely to feature sexual assaults on camera,⁽⁴⁹⁾ and non-sexual film shoots remove one source of worry about underage participants. This is all well and good. But this is a concern for the producers, the directors, the actors, and the legal system, not a question of art.

And frankly, this isn’t the distinction the legal system has, historically, made over obscenity. Miller v. California, the standard in the United States since 1973, features a three-prong test for a work:⁽⁵⁰⁾

  1. The average person, applying contemporary community standards and viewing the work as a whole, must find that it appeals to the prurient interest.
  2. It must describe or depict, in a patently offensive way, sexual conduct specifically defined by applicable state law.
  3. Taken as a whole, it must lack “serious literary, artistic, political, or scientific value”.

#3 is the sticking point here. I’ve already explicitly attacked the idea that there’s nothing intrinsically artistic about depictions of sexual activity in and of themselves, but even if we set this argument aside, plenty of works that would be considered to ‘pass’ the Miller test, like Westworld and Nymphomaniac (2013), featured explicit sexual conduct in their productions. These certainly have ‘serious’ artistic concern behind their production, but they still feature unsimulated sex on camera.

I can, moreover, understand why the distinction would be of concern to distributors. I’m certainly not suggesting that the 18+ categorization for porn films be lifted. But then, we live with Game of Thrones and Westworld having TV-MA ratings and no one I’m aware of suggests that they aren’t art.

How we could do better

The Internet Movie Database (IMDb) takes an approach to porn that I’d like to see more mainstream sites adopt: it covers it and categorizes it with the genre ‘adult’. If you look up a given porn series on IMDb, you’ll be able to find information about its episodes, performers, release dates, and so on. It’s simply treated as another genre of film or television, like romance, comedy, fantasy, or sci-fi.

Disregarding pornography as art has led to a number of unpleasant results. An especially pernicious one concerns its performers: they often get treated as lesser people in society, and they have little to no legal recourse when this happens. As I stated above, performers can be subjected to horrific harassment; they suffer unconscionable discrimination, which makes it difficult for many to leave the industry even if they want to. In general, society does not treat them as human beings with agency, and very few people seem willing to defend them.

My personal view of artists and performers is that it takes both a bit of narcissism⁽⁵¹⁾ and a bit of altruism to want to make an artistic work. Narcissism because you want people to pay attention to you; altruism because you want them to derive pleasure and/or meaning from doing so. I’m not excluding myself from this view. I wrote this because I wanted people to read it. But at the same time, I also hoped they’d find it clarifying, insightful, meaningful, helpful, or otherwise beneficial. I don’t want personal fame. But I still want people to read it. That’s certainly a bit narcissistic.

I wouldn’t exclude porn performers from this categorization either. Wanting people to watch you have sex on camera presumably requires an exhibitionistic streak, and probably more than a touch of narcissism. And there are certainly financial incentives; it doesn’t happen for most performers, but the best-compensated pull in six figures a year. But I’d assume most porn performers also want people to enjoy watching them have sex on camera. In other words, they want to enrich people’s lives in some way. Most people aren’t willing to look at it this way, because American prudishness prevents us from looking at orgasms as being beneficial. But wanting to create a work that people will enjoy is, I contend, an act of altruism even if that work focuses on sex.

We more or less accept this mix of narcissism and altruism with mainstream actors. People may look at actors as a bit melodramatic, and there may be grumbling about “Hollywood liberals” or the like, but Americans overall love celebrities like George Clooney, Jennifer Lawrence, Chris Pratt, Emma Watson, and Dwayne Johnson. These people get respect. We watch them on talk shows; some of us follow their personal lives (I don’t really care); we read interviews with them.

We rarely afford porn performers the same respect. Liberal-leaning mainstream publications like the Huffington Post and Rolling Stone will occasionally feature interviews with performers like Belle Knox or Stoya, and sometimes performers like Asa Akira or Tera Patrick will write books that get some mainstream attention, but they’re rarely given the same kind of cultural cachet other actors are. And this is unfortunate on many levels: it makes life more difficult for them, and many of them have quite insightful things to say about their industry, about sexuality in general, and, for that matter, about life in general – things I suspect would benefit the country as a whole. If anything, I’d say porn performers tend to be a lot more politically aware than most of the country, though, given that politics can have a major influence on their industry, this may be a matter of necessity; in any case, one can certainly find plenty of socially aware commentary from adult film stars on social media.

Porn isn’t the first medium of expression to have its artistic value questioned. Roger Ebert, arguably the greatest film critic of the past fifty years, infamously stated that video games weren’t art (though he later walked back his statement somewhat). Comics and graphic novels had their legitimacy questioned for a while, but Watchmen ultimately made it onto Time’s list of the best English-language novels of the twentieth century.

There are certainly legitimate artistic and sociopolitical critiques to be made of how pornography is currently made. While I already consider it to have artistic value, I think it could be made to have more than it currently does; aspects of how it is produced are troubling even now, and how it is distributed and categorized often raises troubling implications about how we think of women, racial minorities, queer people, and others. There are people trying to challenge these tendencies, and they often face obstacles that stem specifically from how we think about (or, more frequently, don’t think about) pornography.

In any case, the ideas that sex is generally good and that pornography is an artistic medium are, as I’ve said, radical ideas. Many otherwise progressive people have yet to fully embrace them. Perhaps society would be healthier if they were more widely embraced, however. If there were less shame and conflict in society over an act that frequently makes its participants happier without causing anyone harm, society overall might also be happier. Shame should be reserved for acts that harm people, not for acts that improve people’s lives. And this brings us back to Moonlight.

Moonlight tells us, unambiguously, something that many traditional sources of ethics have struggled to say: there is no shame in any aspect of one’s identity, whether it’s tied with sexuality, gender, or anything else. An identity is something you are, and shame should be reserved for unpleasant things you do. This is something I struggled with for years, for reasons I directly discussed above and for reasons I haven’t even delved into. I’m honestly not even sure I fully accepted who I was until recently – perhaps the last year; perhaps the last month. I had to be brought around to realizing I don’t need the world’s approval of me to be happy. I had to be brought around to realizing I don’t need the world’s approval of my code of ethics to be happy. I had to be brought around to realizing I don’t need the world’s approval of my hopes for the future to be happy. I had to be brought around to realizing I don’t need the world’s forgiveness for my personal failings to be happy.

And still, seeing Moonlight earn as much praise as it has earned has been a truly inspiring experience, because it resonates with me on so many levels: philosophical, artistic, political, social – and the fact that I essentially grew up with one of the creators of the film makes its unprecedented historical achievements even more magical. It has somehow managed to prove to me, unassailably, something I had always hoped, but never quite fully believed: artistic expression can be a profound force for positive social change. It may even, in some cases, be more powerful than existing social institutions. If that’s true, then there’s hope for the future of a sort I never previously dreamed possible, because I know the kind of stories that the kind of people who want to take us back to the times when queer kids hid in the closet tell, and I know the kind of stories that the kind of people who want queer anxiety to be a thing of the past tell, and our stories are better. Moonlight is the proof.

The rarity (and value) of true conservatism (2017/2025)

Why the national GOP is reactionary (and local politics are odd)

[This section used to be a subsection of the following section, “Falsehoods in the News”, but I wrote so much new content that it no longer felt appropriate there. Moreover, I have revised it in so many ways that it no longer seems appropriate to note which parts are revisions and which are part of the original text, because otherwise, over half of this section would be in blue, italic font. I have not in any meaningful way changed any of my original points: I have simply expanded on them further. They apply, if you’ll pardon my stealing a famous Republican presidential campaign slogan, now more than ever. (Get it? Because Ford pardoned… never mind.)

Despite the addenda, the present tense will still refer to 2017. If I need to refer to events since 2017, I’ll add another blue italic passage, but in my present addenda, I haven’t yet done so. Also, note that the footnote numbering will be messed up here because I moved this section. I’ll fix it when I’m satisfied that I’m finished reördering sections in this book. –Future Aaron]

What better time to discuss what I find valuable about conservatism than immediately after a section that almost certainly enraged every reader that self-identifies as conservative?

I almost never use the word conservative to describe American politicians, and I haven’t for decades; truly conservative American politicians may be even rarer than truly honest American politicians. I make no claims for Republican voters, but at least at the national level, Republican politicians abandoned conservatism long ago. I now consider it more accurate to call them reactionaries, but I’m not even sure that’s fully accurate; reactionaries typically at least valued social stability. However, it’s an apt description of their behavior: they react. To paraphrase what’s known online as Cleek’s Law (after its coiner), what today’s GOP politicians support tends to be what Democrats oppose, updated daily.

I should clarify that most of my references to elected Republicans throughout this book refer only to national-level ones (i.e., House, Senate, Presidency); Republicans and Democrats often vary widely at the state level (state legislatures, governorships), because state politics are often quite strange. States like Rhode Island and New York have only one viable party, so many of their Democrats would probably run as Republicans elsewhere.

Meanwhile, a lot of politicians who’d be moderate Democrats in most of the country run as Republicans in Florida; Charlie Crist has by now switched parties, but even when he was our Republican governor, he had essentially nothing in common with the national party, which is probably why he lost 2010’s GOP Senate primary. (I voted for him as an independent [he was polling better than the Democratic candidate, and his term as governor had impressed me enough to vote for him], but he lost to Marco Rubiobot, the emptiest suit in Florida politics. I again voted for Crist when he ran for governor as a Democrat in 2014, and he again lost, narrowly and inexplicably, to Lex Luthor Rick Scott.)

Also, I personally know one of Sarasota’s erstwhile school board members. Despite being one of the most outspokenly liberal people I know, he ran (and won several times) as a Republican.

Lastly, former Florida Senate president Andy Gardiner did an incalculable amount for people with disabilities, including autism (e.g., he set aside tens of millions of dollars for scholarships for disabled children in the state budget). He also tried to do things like expand Medicaid under the ACA and waive driver license fees for the poor (both blocked by Gov. Skeletor Scott), and in general, he acted basically nothing like a national Republican except for his somewhat pro-gun stance (which still wasn’t pro-gun enough to avoid the gun lobby’s ire) and occasional curiosity towards private charter schools and school vouchers (though national Democrats, Obama included, have plenty of that too, so even here, he’s little different from them).

Gardiner was term-limited out in 2016. Legislative term limits get rid of people who know how the legislative process works (which takes years to learn) and use it to help people, thus giving lobbyists far more power. I have no faith his replacement will do half as much for the disabled in particular as he did.

“Nothing is security to any individual but the common interest of all”

When I think of someone who typifies what I find valuable about conservatism, Edmund Burke always comes to mind first. Burke knew where he was coming from: he was a staunch defender of human rights who argued against slavery long before it was commonplace to do so and who was appalled by the Reign of Terror’s cruelty.

Honesty compels me not to quote A Vindication of Natural Society as if Burke meant it sincerely; he almost certainly intended it as satire. Nonetheless, remember the name of the work; we’ll revisit it later. Here are three non-satirical quotes from Burke’s other writings⁽⁵⁹⁾ that I find fairly representative of his overall views, and that I find it difficult to imagine any reasonable person disagreeing with:

Liberty, if I understand it at all, is a general principle, and the clear right of all the subjects within the realm, or of none. Partial freedom seems to me a most invidious mode of slavery. […] But the true danger is, when liberty is nibbled away, for expedients, and by parts. […] Indeed nothing is security to any individual but the common interest of all.
Edmund Burke, letter to the Sheriffs of Bristol, 1777-04-03

To me, Burke is saying two things in this passage:

  1. Partial freedom is not true freedom, and the gradual erosion of liberties is dangerous.
  2. Improving overall standards of living is in everyone’s best interest – discontentment breeds instability, and, as the modern idiom says, a rising tide lifts all boats.

“Whenever a separation is made between liberty and justice, neither, in my opinion, is safe.”
Edmund Burke, letter to M. de Menonville, 1789-10

Or, put another way, justice without freedom is not truly justice, and freedom without justice is not truly free.

“Government is a contrivance of human wisdom to provide for human wants. Men have a right that these wants should be provided for by this wisdom.”
Edmund Burke, Reflections on the Revolution in France (1790)

That is, government exists to provide for people the necessities that they cannot provide for themselves, and they have a right to expect it to do so. (In this case, I read ‘wants’ to mean “things people need and lack”, though Burke could also mean “things people desire and lack”.)

In short, Burke may have been a conservative, but he was also a classical liberal, and to some extent even a social liberal. Common usage aside, it is entirely possible to qualify to some extent as all three, and indeed, Burke is considered one of liberal conservatism’s most important thinkers.

(Social liberalism equates roughly to modern American liberalism. Classical liberalism is more economically centrist and laissez-faire in opposing heavy economic intervention either on behalf of the poor or the rich. In some European countries, unprefixed ‘liberal’ typically means classical liberalism, but since I’m focusing primarily on the United States, I’ll continue using unprefixed ‘liberal’ to mean social liberalism.)

Milton Friedman supported basic income, and other little-known facts

Other historical conservatives can be placed well to Burke’s left, notably Benjamin Disraeli. A crucial component of Disraeli’s one-nation conservatism is the state’s obligation to help the jobless, elderly, sick, and others unable to care for themselves. Disraeli observed that rulers’ indifference to mass suffering could cause the collapse of social stability and even revolution; John Maynard Keynes made a similar observation some seventy years later. The Liberal-Labour MP Alexander Macdonald remarked that Disraeli had done more for workers in five years as prime minister than the Liberals had managed in fifty.

Similarly, Milton Friedman, who saw firsthand the horrors poverty created during the Great Depression, was well to the left of today’s Republicans: he stopped short of calling for reducing income inequality, but in Capitalism and Freedom’s twelfth and final chapter, he proposes a negative income tax (NIT) to alleviate poverty. The universal basic income (UBI) many leftists now propose in response to the ongoing, steadily worsening crisis of job loss to automation is “simply another way to introduce a negative income tax”, according to no less a source than Friedman himself. Richard Nixon later unsuccessfully tried to pass a variant of Friedman’s proposal.

Friedman argued that the NIT would streamline inefficiencies in the welfare system, lower administrative overhead, assist the poor more directly and immediately than means-tested welfare programs, and avoid the market distortions of tariffs or minimum wages. This may surprise some people to read, but I actually agree completely with his reasoning on this and only part ways with him on two details:

  1. I consider reducing income inequality a laudable goal in its own right.
  2. Healthcare is a sufficiently volatile expense that it should be covered universally for all citizens.

I’d also add that the NIT/UBI would necessitate much stronger rent control to prevent inflation. So long as those caveats were fulfilled, however, I’d actually consider it a major improvement if all non-healthcare-related welfare were replaced with the NIT or some close variant thereof. It would also provide one additional benefit that might actually outweigh all the others combined: since the NIT would taper off at a steady rate, it would completely eliminate “welfare traps”, which actually make it disadvantageous to pass certain income thresholds. I consider these one of our economy’s most diabolical features.
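To make the welfare-trap point concrete, here’s a toy sketch of how an NIT-style subsidy tapers. (The $12,000 income floor and 50% phase-out rate are hypothetical parameters chosen purely for illustration, not Friedman’s actual figures.)

```python
def nit_net_income(earned: float, floor: float = 12_000.0, phaseout: float = 0.5) -> float:
    """Net income under a simple negative income tax (NIT).

    Households earning below the break-even point (floor / phaseout)
    receive a subsidy equal to `phaseout` times the shortfall; above
    it, the subsidy is zero. Because the subsidy shrinks gradually
    rather than cutting off at a threshold, earning an extra dollar
    always raises net income: there is no welfare-trap cliff.
    """
    breakeven = floor / phaseout                     # $24,000 with these toy numbers
    subsidy = max(0.0, phaseout * (breakeven - earned))
    return earned + subsidy

# Earning nothing yields the floor; each additional dollar earned
# below the break-even point raises net income by 50 cents.
print(nit_net_income(0))        # 12000.0
print(nit_net_income(10_000))   # 17000.0
print(nit_net_income(30_000))   # 30000.0
```

Contrast this with a means-tested benefit that cuts off entirely at some earnings threshold: crossing that threshold can leave a household with less money than before, which is precisely the trap the steady taper eliminates.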

Friedrich von Hayek, another economist today’s Republicans often cite approvingly, wrote, “We shall again take for granted the availability of a system of public relief which provides a uniform minimum for all instances of proved need, so that no member of the community need be in want of food or shelter,” and of “the recognized duty of the public to provide for the extreme needs of old age, unemployment, sickness, etc.”, which extends beyond endorsing merely a basic income and into advocating universal healthcare.

As a final example, here are two quotes from Adam Smith, the alleged father of capitalism:

“All for ourselves, and nothing for other people, seems, in every age of the world, to have been the vile maxim of the masters of mankind.”
“Civil government, so far as it is instituted for the security of property, is in reality instituted for the defence of the rich against the poor, or of those who have some property against those who have none at all.”

I don’t intend to claim any of the above as leftists of any sort (though I could see a legitimate, if unconventional, case being made for Smith); rather, I mean to indicate the extremity of today’s GOP.

Defining ‘conservative’, and more reasons the national GOP isn’t

Modern American discourse has oversimplified political ideologies to the point of all but stripping their meaning. The Republicans are the right-wing party, so if you’re conservative, you must be a Republican, right? And by extension, if you’re progressive or liberal, you must be a Democrat.

In fact, it’s not that simple. It’s worth asking ourselves what these terms even mean. I am not primarily and have never primarily been a member of the Communist Party⁽⁶⁰⁾ – er, a conservative. However – and this too may surprise people, given everything else I write in this book – my thought has some conservative tendencies to it, and my behavior certainly does.

Conservatism, in the Burkean sense, argues to preserve the status quo when it is functioning and only abandon it where change is empirically likely to be beneficial. I would hold that, if any modern American party is conservative in this sense, it’s the Democrats (who, to be clear, are also liberals).

Today’s Republican politicians generally have little interest in preserving the status quo, and many flat-out reject empiricism; if they were conservatives, gains such as marriage equality and healthcare wouldn’t be under such grave threat, and they’d take scientists’ recommendations on issues like climate change and pollution seriously. (Miami, Moonlight’s setting, has suffered steadily worsening flooding over the last decade.)

No, most of today’s Republican politicians wish to take us back to an imagined past, but their vision of the 1950s omits such facts as the top marginal income tax rate being 91%, amongst other incongruities.⁽⁶¹⁾ Many prominent Republican leaders wish, in fact, to undo the Great Society and Affordable Care Act, to nullify the EPA and the FDA, in some cases even to overturn the entire New Deal. As I said: reactionary.

“Stand[ing] athwart history, yelling, ‘stop’”: The value of skepticism

The following two quotes sum up the importance and value I see in conservatism:

  1. “If it ain’t broke, don’t fix it,” as the common idiom goes. There’s obvious wisdom in this: tinkering with a perfectly functioning machine is a great way to break it.

  2. “The National Review […] stands athwart history, yelling, ‘stop’,” as stated by its founder, William F. Buckley, Jr. (whom, to be clear, I have plenty of disagreements with).

Now, to be clear, I don’t think change is inherently bad. But I also don’t think it’s inherently good. There’s genuine value in questioning the need for every proposed change before going ahead with it; indeed, my personal stance is that no change should be enacted without empirical evidence suggesting that it is likely to produce tangible benefits that outweigh its drawbacks.

Since autistic people often have particular difficulty adapting to change, my life experience provides a pertinent argument for being skeptical of change: if we have difficulty adjusting to change, that difficulty applies in both directions, because reversing a change is itself a change. The resulting lack of consistency can itself become a source of anxiety. Indeed, I’m suffering from it right now.

I have plenty of issues with the Catholic Church, but I keep thinking of the Promoter of the Faith (prōmōtōr fideī), popularly called the Devil’s Advocate (Latin: advocātus diabolī). In canonization hearings, the Promoter of the Faith once served as prosecutor, presenting the case against canonization; the Promoter of the Cause (prōmōtōr causae), popularly called God’s Advocate (advocātus Deī), served as defense. (The Promoter of Justice [prōmōtōr iūstitiae] now takes the Promoter of the Faith’s role; the latter still has other roles.) Regardless of one’s view of the entire concept of sainthood, there’s genuine value in taking a skeptical attitude and looking for holes in the evidence. It’s considered a form of the Socratic method, and for good reason.

Of course, there’s a caveat to this: studies have suggested that inauthentic dissent – that is, assigned devil’s advocates – can actually prove counterproductive in group decision-making. Dissent from a place of genuine belief can prove genuinely persuasive, but dissent done purely out of a sense of obligation generally entrenches people more in their original beliefs, which defeats the entire purpose of the exercise. (This may be why the Church called atheist and strident Mother Teresa critic Christopher Hitchens in her canonization hearings. The Devil’s Advocate, Hitchens wrote, was “the one officer of the Holy Mother Church whom everybody, sacred and profane, actually believed in,” but even though it was effectively eliminated, he still considered it “my pleasure and privilege to be the first ever to represent the Evil One pro bono”.)

This, in turn, suggests that there is value in cultivating an intrinsically skeptical attitude. Not a cynical attitude, in the colloquial sense, but a questioning one. (The historical Cynics were, ironically, not at all cynical in the colloquial sense; in fact, they rank among history’s most idealistic philosophical schools in rejecting conventional desires for power, glory, wealth, conformity, fame, and possessions.)

The job of conservatism is, effectively, to play authentic devil’s advocate: to maintain a healthy attitude of intrinsic skepticism and to continually challenge advocates of change to prove that it will be beneficial. It’s unfortunate, then, that no one is doing so today. How many people currently stand athwart history, yelling, ‘Stop’?

As I said, I would sooner call the Democrats Burkean than I would the national GOP, whose attitude is less ‘stop’ than “take me back to (what I imagine to have been) 1925.” However, while the Democrats do have aspects of his thinking (and, like him, are also liberal), I don’t consider them truly conservative, either.

I think a large part of why we’re now so divided as a culture boils down to precisely the fact that there is no truly conservative faction in American politics. Both sides are currently engaged in a tug-of-war where one side’s gain is the other side’s loss. I don’t know how we come back from that.

The term conservative needs to be reclaimed; it once referred to an insightful political philosophy with much to offer the world, and its present misusage does us a grave disservice.

[End 2025 expansions. Present tense will refer to 2017 until further notice.]

Falsehoods in the news (2017)

“But her emails”

I’d like to return now to 2016’s election and contrast media coverage in general, and of that election in particular, with the world we’ve seen in Moonlight. The concept of reality itself is under attack from the current administration* and many other sources. The presence of idioms like ‘post-fact’ and “alternative facts” in our discourse is alarming. Moonlight is an honest representation of reality, and honest representations of reality have become increasingly elusive. I wrote above that our perceptions of the world reflect stories we tell ourselves about the world, or that others tell us about the world. I shall now examine some sources of falsehood.

As is the case for many people my age, Hillary Clinton wouldn’t have been my top choice for president in 2015. She’s taken many stances that conflict with mine, from her Iraq war vote to her ’90s positions on matters like crime (though, far too frequently, her stances have been reflexively equated to her husband’s, as though they’re the same person). But she’s also exhibited a commendable responsiveness to popular opinion, seen in 2016’s Democratic Party platform: Bernie Sanders’ campaign, which ran to her left on several, though by no means all, major issues (and for which I voted in the primary, though he’s frankly gotten increasingly tone-deaf on minority and women’s issues), caused her to shift left. Public officials are supposed to respond to public opinion; that’s the entire premise of representative democracy. She’s been attacked for this, as if it signifies insincerity or ambition, but it shows she reëxamines her stances when presented with persuasive arguments, which is a sign of a thoughtful leader. Would it have been better if she hadn’t evolved with popular opinion? That hardly would’ve won her praise either. Many people who attacked her for this accepted Barack Obama’s similar evolutions on issues like marriage equality. I can’t help reading an air of misogyny into many criticisms of Clinton: I can’t see anything she could’ve done to please some of her critics.

While she wasn’t my first choice in the election, by Election Day I was fully convinced she was the best choice. I’m honestly not sure anyone who’s ever run for the office has been more qualified or, possibly excluding Adlai Stevenson, knowledgeable. Her debate performances, especially, revealed an intelligence, grace under pressure, and wide-ranging command of knowledge that would’ve served us fantastically as president. Which, of course, is exactly what we didn’t get. The president* is hot-headed, willfully ignorant, and foolish. (“Nobody knew health care could be so complicated”? He may be the only living adult, if such he can be called, who didn’t know that.)

A number of factors deserve blame: the FBI’s wildly irresponsible, historically unprecedented decision to release a wholly deceptive letter announcing a discovery of what amounted to literally nothing concerning Clinton eleven days before Election Day (they had no warrant and no reason to think they’d find any new messages, as indeed they didn’t); hacks and malicious leaks directed by a hostile foreign power; overconfident forecasts of the results; complacent voters who stayed home (I’ll exclude victims of voter suppression measures from culpability here) or cast useless protest votes (margins in several crucial swing states were razor-thin and exceeded by third-party votes, write-in Sanders votes, or both); and three decades of right-wing smears of Clinton. However, I ascribe the largest share of blame to the media: the FBI’s letter would not have been as decisive if media coverage had not already established a (fundamentally false) portrait of Clinton as unethical.

A Gallup study released six days before Election Day revealed that by far the main word voters associated with Clinton was emails. (Other Gallup studies in August 2015 and September 2016 found similar results.) By contrast, they associated no issue with the president* to any comparable degree. The Access Hollywood tape alone, on which he was recorded bragging about serially committing sexual assault, should have been immediately, prohibitively disqualifying, and the fact that it was almost forgotten by November 8 is an ethical stain that the media will likely never be able to cleanse.⁽⁵²⁾

There was, indeed, a reason for this; a similar study of network news media coverage by Tyndall Report reveals that ABC, NBC, and CBS’ primetime news broadcasts gave Clinton’s emails three times more coverage than they gave all policy issues combined. Let me repeat that, because it’s important, and it might not have fully sunk in the first time. Primetime network news broadcasts gave Clinton’s emails three times more coverage than they gave all policy issues combined. Indeed, the three networks barely covered issues at all; as of late October 2016, they’d discussed issues for a combined total of thirty-two minutes (a mean of roughly ten minutes per network). By contrast, their 2008 broadcasts discussed issues for 220 minutes. This undoubtedly helped the president*, as Clinton’s policy proposals were usually incredibly detailed, while his were often nearly nonexistent, and he frequently changed his stances from day to day. (And, as I noted, she was attacked for flip-flopping; the fact that he wasn’t similarly attacked suggests a lack of good faith among the people making these attacks or, again, misogyny.)

I’ve focused on network news because the figures here are the most shocking, but major newspapers were also mostly execrable. Front-page headlines often alleged Clinton ‘scandals’ on bases that were outright false, if not libelous; for instance, an Associated Press tweet claimed “more than half” the people she met with as Secretary had donated to the Clinton Foundation. This is false by every conceivable metric: AP only analyzed 154 meetings, and Clinton attended over 1,700 as Secretary; fewer than 5% were with Foundation donors, many of whom are notable for far more than their donations to her foundation. Muhammad Yunus, for example, is a world-renowned philanthropist and a pioneer of microfinance and microcredit whose awards include the Nobel Peace Prize, Presidential Medal of Freedom, and the Congressional Gold Medal. I don’t expect the general public to know who he is, but I became aware of him in my studies of politics and economics years before Clinton’s first run for the presidency. He has undoubtedly improved millions of lives incalculably. His Foundation donation is hardly why a Secretary of State would see him, and accusing her of a conflict of interest for doing so at least borders on reckless disregard for the truth – the standard for libel of a public figure. It’s also unfair to him; scurrilously implying he’s more concerned with gaining access to a politician than with the good the Foundation did worldwide is indefensible. AP focused more on Yunus in its article than on any other figure with whom Clinton met, which I feel is a suitable reflection of the article’s overwhelming fraudulence.

Likewise, the New York Times ran a story alleging “questions raised” about Foundation associates getting special access to State. What “raised questions”? A longtime Foundation employee sent an email seeking a diplomatic passport. Was it granted? No; State only gives such passports to department employees and others with diplomatic access. So questions weren’t ‘raised’; they were answered, and the answer was no. But the Times still ran a headline alleging misconduct, and only several paragraphs into the article (past the point where most readers stop reading, as any journalist knows) did it become clear there was no there there. Again, this borders on libel.

I’ve spent half a page discussing coverage of the Clinton Foundation, an entirely legitimate charity that won numerous awards for its philanthropy, was consistently ranked one of the most transparent, efficient charities in existence, and very likely prevented millions of deaths from HIV/AIDS alone. The president*’s foundation, by contrast, was ultimately shut down for being an illegitimate money funnel to its namesake. The latter received far less critical coverage; indeed, the Washington Post’s indispensable David Fahrenthold was single-handedly responsible for almost all of the investigation it received (he wasn’t the only person covering it, but most other coverage simply repeated his findings). An unfortunate side effect of this election’s outcome is that future candidates for office in this country are now indescribably likelier to avoid philanthropy as a career choice.

The president*, at one point, purchased a full-page Times ad advocating what amounted to lynching the Central Park Five, a group of African-American and Hispanic teenagers falsely accused of and imprisoned for sexual assault and attempted murder. They’d been coerced into confessing, and their accounts differed substantially from established facts and from one another on virtually every major aspect of the crime. The case completely unraveled after the actual perpetrator, already imprisoned for other crimes, confessed; DNA evidence confirmed his confession, and the Five’s sentences were vacated. Despite this, the president* was still calling for their execution as late as October 2016, and completely refused to apologize for doing so. A search by historian Rick Perlstein suggests that the Times covered the president’s* treatment of the Central Park Five once throughout the entire campaign, in an op-ed by Sarah Burns.

Tracking firm mediaQuant valued the free coverage the president* was given throughout the entire election cycle at $5.6 billion, more than Marco Rubio, Ted Cruz, Paul Ryan, Bernie Sanders, and Hillary Clinton received combined. News outlets repeatedly ran uninterrupted footage of the president*’s rallies; Clinton was given widespread, unfiltered media access arguably only during the Democratic National Convention and the presidential debates. Of all the media’s many grave disservices during the last campaign cycle, this was arguably the gravest.

The general perception is that both candidates’ press coverage was more negative than positive. This is admittedly true. However, if one factors in the entire election cycle from the primary through Election Day, Clinton’s coverage, astonishingly, was more negative than the president’s. The Harvard Kennedy School’s Shorenstein Center found that Clinton received 62% negative coverage to his 56%, and 38% positive coverage to his 44%. But even these numbers understate the level of negativity, because much of the president*’s ‘negative’ coverage was horserace-related, e.g., “he can’t win” (a false prophecy if one ever existed). The two candidates’ coverage on non-horserace topics like policy, leadership, ethics, and personality was identical: 87% negative to 13% positive.

Moreover, after Comey’s ‘emails’ letter, Clinton received far more negative coverage than the president*. Over a third of her coverage in the final two weeks was devoted entirely to ‘scandals’, and it became vastly more negative, going from 52% negative in the week of October 23 to an average of 73% negative over the last two weeks; meanwhile, the president*’s coverage became dramatically less negative, going from 91% to an average of 66%. This shift left far less room for coverage of the president*’s many disqualifications, such as his confession of serial sexual assault, his calls for lynching innocent black teenagers and for war crimes, and so on.

Back to top · Table of contents · My portfolio · Contact me · Website index

The pernicious myth of false equivalency

I’m not entirely sure why the media insisted upon fabricating Clinton scandals when there was so little evidence of them, but I think some of it stems from not wanting to be accused of bias. If you accuse one candidate of dishonesty, but don’t point out an example of dishonesty from their opponent, then a dishonest person may claim you’re favoring the latter. But it doesn’t work that way; not all politicians are equally honest, and treating them equally is biased in favor of less honest politicians. It’s arguably also dishonest, and certainly isn’t factual. False equivalency may be one of our society’s most pernicious narratives; it reduces stories to “he said, she said” sound bites in cases where all the evidence clearly favors one side, like climate change and vaccines.

Republicans, in their more honest moments, may also admit that they make ungrounded accusations of bias in an attempt to garner more favorable coverage. In 1992, Rich Bond, then-Chair of the Republican Party, admitted to the strategy behind Republicans’ constant use of the “liberal media” refrain: “If you watch any great coach, what they try to do is ‘work the refs.’ Maybe the ref will cut you a little slack on the next one.” Republican pundit Bill Kristol, similarly, has said, “I admit it. The liberal media were never that powerful, and the whole thing was often used as an excuse by conservatives for conservative failures,” while former Republican presidential candidate Pat Buchanan has admitted, “I’ve gotten balanced coverage and broad coverage – all we could have asked. For heaven’s sake, we kid about the ‘liberal media,’ but every Republican on earth does that.” Between talk radio, Fox News, and sites like Breitbart, the Republican media infrastructure has grown enormously in the past twenty-five years, and it’s entirely believable that journalists for mainstream publications are more afraid of accusations of “liberal bias” than they’ve ever been.

A particularly salient point is that PolitiFact, a fact-checking site which ranks politicians’ honesty,⁽⁵³⁾ ranked Clinton, overall, as one of the most honest American politicians: it ranked around one in four of her statements as varying degrees of false. That’s not great, but few politicians do better; Bernie Sanders, Barack Obama, or John Kasich may have ranked better at certain times, but no one else at their national level of prominence came close. By contrast, the president*’s allegations (calling them statements would taint the word by association) ranked that way sixty percent of the time or more (usually more; often much more). Despite this, the popular perception of the two candidates on Election Day was that Clinton was the less honest or trustworthy of the two, which would be inexplicable if we hadn’t already examined so much flat-out defamatory campaign coverage.

Another likely contributing factor was overconfidence: the media simply assumed there was no way voters would elect such a fundamentally foolish and repulsive man, and thus covered Clinton as if she were already president. Overconfident statistical modeling didn’t help either. Many have argued that polling failed this cycle, but national polls were within the margin of error, as they’ve been for decades; the failure was at the state level. States are polled less frequently and are harder to poll accurately, owing to their smaller populations. Unfortunately, many statistical models relied heavily on state polls from before Comey’s October Surprise. I can’t blame the modelers too harshly; many were well-meaning people with strong track records in past elections, and the Huffington Post and Sam Wang certainly didn’t intend to help the president* win. Wang even admitted before November 8 that his model was projecting too high a confidence level, but he’d settled on it earlier in the cycle and felt altering it mid-cycle would be unscientific. (He still considered a Clinton loss highly unlikely, to be clear.) But there’s a fundamental flaw in predicting human behavior: polls are only accurate if voters vote, and if they believe a victory is preordained, they may not bother, which can alter the results. There’s no way to know the probability of that with anything like the 90+% confidence levels many sites projected. In other words, it’s physics’ observer effect as applied to politics: the act of observation alters the phenomenon being observed. Virtually none of these sites took this into account. Most of the biggest failures of polling and analysis this cycle occurred when analysts drifted into punditry rather than simple statistical analysis; hopefully, many have learned their lessons.

There are likely a number of reasons for the terrible policy coverage; a cynical interpretation, which may or may not be correct, is that an unfortunate number of journalists simply don’t understand policy and prefer to avoid it. There was certainly no sign they understood the technical issues behind the alleged email scandal. The frequent conflations of the Podesta email ‘hacks’ with Clinton’s private email server indicate flabbergasting technological illiteracy.⁽⁵⁴⁾ Clinton’s server was never hacked. The official State Department server, by contrast, was hacked – after she left the department, mind you, but regardless, a legitimate case could be made that she improved national security in the short term by using a private server to host, it must be pointed out, exclusively unclassified material.⁽⁵⁵⁾

Some other criticisms focused on the deletion of around thirty thousand emails from Clinton’s server, which were allegedly private communiqués unrelated to her work.⁽⁵⁶⁾ Context is crucial here: Bush 43’s administration deleted around twenty-two million emails; the breathless reporting on Clinton rarely mentioned this. Perlstein only managed to find seven references to Bush 43’s deleted emails in any newspaper, two of which were foreign (by contrast, he found 785 references to Clinton’s deleted emails). It’s certainly possible that Clinton failed to deliver some occupational missives to State, but that would’ve been profoundly foolish on her part and incompetent on her legal team’s, since a Secretary of State’s work email is mostly to other government staff and thus duplicated on government servers. If a FOIA request uncovered work emails that weren’t on her server, that would be seen as incriminating. As far as I know, this still hasn’t occurred, and it certainly hadn’t by Election Day.⁽⁵⁷⁾

False equivalence doesn’t merely extend to coverage of politicians’ honesty; policy coverage, in the rare cases where it still actually occurs, often runs into the same problem, although I can’t blame this solely on the media. In 2012, political consultants for Democratic campaigns ran into a problem: if they reported Romney and Ryan’s tax plans (which would’ve cut Medicaid and Medicare, increased taxes on those making under $200,000 per year, and cut the rich’s taxes) honestly, focus groups flat-out refused to believe that they were the GOP’s actual proposals. The voters considered the description of the plan, in Jon Chait’s words, “so cartoonishly evil that they found [it] implausible,” despite its accuracy. Some of the blame for this has to be placed on the public.

That said, the fact that the media so rarely actually report such proposals matter-of-factly no doubt adds to voter incredulity over them. It may have only become clear to the public this year just how reactionary the GOP’s healthcare agenda truly is: when it became clear that they really did want to cut twenty-four million people’s healthcare coverage, widespread protests erupted, and constituents justifiably inundated their representatives’ offices with calls.⁽⁵⁸⁾ I wrote reactionary deliberately, for reasons outlined above. I wrote above, “Many prominent Republican leaders wish, in fact, to undo the Great Society and Affordable Care Act, to nullify the EPA and the FDA, in some cases even to overturn the entire New Deal.” Mainstream coverage rarely acknowledges any of that.


The media’s reactions to the election

One might hope that the media would have learned something from the disastrous outcome of their 2016 coverage, and perhaps they have, but I’m unconvinced. The New York Times devoted three above-the-fold stories to Comey’s October Surprise and only one to a January GOP decision to gut House ethics oversight. We’ll be lucky if the mainstream media devote a third of the energy to criticizing Republicans that they devoted to the Great Clinton Email Snipe Hunt.

Many influential media figures, including the Times’ generally respectable Lynn Vavreck, blamed the candidates for the lack of policy coverage, which is, as Paul Krugman points out, an unparalleled act of media gaslighting.⁽⁶²⁾ The president* may have discussed policy as little and as superficially as possible, but Hillary Clinton discussed issues more often than any other presidential candidate I’ve observed in my lifetime, and in greater depth. The media gave him billions of dollars’ worth of free coverage, while her discussions of issues received only the flimsiest attention.

Indeed, arguably the only times she received unfiltered media access during the entire election cycle were during the Democratic Convention and the presidential debates. It’s not remotely surprising that after each of these appearances, her poll numbers rose appreciably. These may have been the only times most voters ever actually saw the real Hillary Clinton. Those who’ve met her in person have almost unfailingly reported that she is genuine and personable, and rarely come away anything but spectacularly impressed. (She’s not, however, a particularly gifted public speaker.)

It’s similarly unsurprising that after every Comey ‘emails’ announcement, her poll numbers dropped. This is the manufactured “Hillary Clinton” that has existed in the right-wing fever swamps and far too much of the mainstream media for most of Generation Y’s existence. This is a recurring problem female candidates face in this country; they are almost always more popular in office than they are as candidates. Remember, as Secretary of State five years ago, Clinton was the most admired woman in the country, with a nearly seventy percent approval rating. It appears that much of our media (and perhaps the FBI) cannot abide a woman with ambition.

To me, the biggest sign of who the real Hillary Clinton is came in her concession speech. It displays a grace and fundamental decency that I honestly didn’t even believe anyone on the planet possessed, and that the president* (who, by contrast, is one of the most repulsive human beings I think I’ve ever been made aware of) didn’t deserve. I realize that she and Bill were once friends with the president*, but I’m not sure how you maintain a friendship after all the horrible things the latter said about both of them. Somehow, she still managed to display a poise and graciousness that are entirely beyond my ken.⁽⁶³⁾ For that matter, so did Obama, who may be third only to Jimmy Carter and Abraham Lincoln [fourth only to Jimmy Carter, John Quincy Adams, and Abraham Lincoln –Future Aaron] as the most fundamentally decent person ever to have occupied the office.⁽⁶⁴⁾ The ability of all of them to put the good of the country over their own personal feelings means that they are the kind of public servants I could never hope to be.


Possible solutions to the news crisis

I could go on for several additional pages (I might even be able to write a book on the topic), but I believe I’ve made my point about how generally awful mainstream news coverage has been lately. But I must be entirely fair here: the decline of newspapers is partially our fault. The Internet has accustomed us to not paying for content. This means that newspapers’ revenues have declined precipitously; Internet advertising simply isn’t as profitable as print ad space, particularly given the decline of classified pages (Craigslist has obviated the need for them). This means that newspapers have ever fewer resources to devote to real investigative reporting, which often costs serious money. This issue is already problematic at the national level, but it’s particularly awful at the local one, because it means local politics escape close scrutiny to an even larger extent, since there generally isn’t anyone but the local paper to report on these stories. It will be, as The Wire’s creator David Simon (a former journalist himself) has noted, a golden age of corruption.

This means that, if we want things to improve, we have to get back into the habit of paying for content. I can’t endorse paying for the New York Times after its horrible coverage last year (though Gail Collins, Charles Blow, and Paul Krugman remain especially valuable columnists); the Washington Post covers the same national beat and indulged less in false equivalency, apart from the egregious Chris Cillizza, who no longer works there. For me, the Post, though flawed, is now the newspaper of record.

Thankfully, many journalists are still doing excellent work. Foremost, I’d like to note that Fahrenthold completely deserved his Pulitzer; by himself he’s a major reason I recommend subscribing to the Post. David Cay Johnston, Kurt Eichenwald, Julia Ioffe, Ezra Klein, Matt Yglesias, Kevin Drum, and others whom I’ve no doubt omitted also did commendable reporting (I’ve indeed relied on some of them for parts of my critique).⁽⁶⁵⁾ Plenty of other sources also do excellent reporting on our country; publications like Mother Jones, The Guardian (despite being a British publication), and ProPublica have broken stories of national importance, while sites like Vox and Talking Points Memo contextualize developing stories in more intelligent and comprehensive fashions than many traditional newspapers have managed. And for that matter, teen and women’s magazines like Teen Vogue and Cosmopolitan have done fantastic political reporting lately.⁽⁶⁶⁾

Perhaps surprisingly, I’ve also found some blogs to be fantastic sources of news analysis, though you have to be discerning with these, because they’re subject to Sturgeon’s Law (“90% of everything is crap”). I’ve learned an unfathomable amount about military and espionage-related matters from Adam Silverman of Balloon Juice;⁽⁶⁷⁾ no other source I’ve read has done a better job contextualizing the president*’s Russian connections in particular. Scott Lemieux and Erik Loomis of Lawyers, Guns & Money,⁽⁶⁸⁾ university professors in political science and history respectively, have probably taught me more about legislative politics and labor history respectively than anyone else has since my college political science classes. Other notably informed or perceptive bloggers include Juan Cole of Informed Comment, whose commentary on the Middle East and Islam is peerless; Brad DeLong, probably the best economics blogger this side of Krugman; Dave Neiwert, whose sporadic updates on authoritarianism are consistently alarming and indispensable; Heather ‘Digby’ Parton of Hullabaloo; Charlie Pierce of Esquire; and probably a dozen others I’m forgetting.⁽⁶⁹⁾⁽ⁿⁱᶜᵉ⁾

Obviously, I don’t expect anyone to subscribe or donate to all these publications, because I’ve mentioned an awful lot of them, but I do recommend finding some you enjoy and throwing them some dough. You should subscribe or donate to at least one great national newspaper, the best local paper, and a few great magazines, websites, or blogs. If we don’t start paying for content, news coverage will just keep worsening, and we should reward sources who do good reporting where possible. After all, the public needs to be well informed, and the fact that it’s so misinformed is a large part of what’s gotten us into such a huge mess. Keeping reliable sources of news coverage afloat is vital.


Falsehoods in politics

“Reality-based community”

I’ve spent several pages discussing the mainstream media. The right wing, of course, has its own media. These sources are, if possible, less factual than mainstream ones, but discussing them feels superfluous. The splintering of news coverage, however, reflects a more general fragmenting of American society. It’s as though we live in two separate countries. In one, Barack Obama was a tyrannical socialist who ruined America’s economy, grabbed people’s guns, and forced changes like marriage equality and healthcare reform on an unwilling public. In the other, he was a well-meaning liberal who worked as well as he could with an often intractable Congress, with some successes and some failures. In reality, the economy recovered steadily under Obama; unemployment steadily decreased. He attempted to pass gun control, but gun sales skyrocketed under his administration, and many states loosened their gun restrictions. (There are also faux-leftists who believe he was somehow right-wing, which is equally off-base; classifying him as a moderate, a moderate-liberal, or a liberal is defensible, but he was not in any sense right-wing.) And this, I feel, is the actual problem.

People are entitled to their opinions. However, reality is indifferent to those opinions. The climate doesn’t care if you believe it’s warming; we’re getting consistently record-breaking temperatures regardless of anyone’s personal beliefs. The climate isn’t capable of caring; it’s inanimate. People are entitled to their own opinions, but not to their own facts. The idea that every aspect of reality is up for debate is simply mistaken. There’s such a thing as empirical reality. Humans may be, and often are, terrible at recognizing it. But it exists independently of our ability to recognize it, and it can and will have a major impact on our existence. The concept of “alternative facts” is one of the most ridiculous notions ever to make its way into American political discourse, and this is a discourse wherein a prominent political strategist once dismissively used the phrase “reality-based community”.⁽⁷⁰⁾

The idea that we can create our own reality is a pleasant one. Sometimes it may even be true. I genuinely believe creative works can create social change, and I address below how Moonlight may be an example of this. (Will and Grace was another.) But some forces are simply too powerful for us to ignore. Climate change will immensely impact our generation, and many people treat it as an honest political disagreement between two well-intentioned sides with differing opinions. In reality, it’s a scientific matter, and one side is simply wrong. Its adherents may genuinely believe they’re correct, but their belief doesn’t make them any less wrong. And being wrong can be deadly.


Falsehoods about healthcare

The problems are not merely limited to the climate. Healthcare is another case. Kenneth Arrow, who passed away a few months ago, received the Nobel Memorial Prize in Economics for a number of contributions to the field, but the most enduring may be his analysis of healthcare. He demonstrated conclusively that markets can’t provide an entire population full coverage. There are multiple reasons for this, starting with cost: surgery is prohibitively expensive and unpredictable. Workers can’t plan for these expenses and save accordingly, since the body doesn’t care about financial circumstances; it needs medical attention when it needs it. This is why health insurance exists. However, every claim paid out is a loss for insurers, so their incentive is to deny as many claims as possible. (These could be considered, in fact, literal “death panels”.) This process is resource-intensive and thus inefficient; private insurers have much higher administrative costs than single-payer systems.

It’s also impossible to comparison shop, which is a supposed advantage of capitalism and competition (and probably is a real advantage in many fields). One can’t shop around for the best deal on surgery, nor rely on repeated experience to judge quality; this is why we hold doctors to an ethical code instead. Health maintenance organizations perform cost management and make other hard choices, but they, too, are intrinsically profit-making organizations, with one’s treatment as the cost to be minimized, so people don’t trust them either.

It’s not as though government-run systems are necessary; single-payer is nice in theory, but I don’t expect it ever to become a reality here, since too many people are already attached to their existing providers. “Medicare for all” is a nice slogan, but people who like their current coverage won’t want to lose it. A more suitable solution is the public option: give everyone the option to buy into Medicare or Medicaid. The benefits would be twofold: it would hold private insurers to higher standards of efficiency by making them compete with government care, and it would ensure everyone has coverage.

However, the Republican Party officially opposes government involvement in healthcare entirely. There are a number of reasons the party’s recent attempt to ‘reform’ healthcare fell apart, starting with the fact that it simply doesn’t reflect reality. There isn’t a single extant example of an entirely private system providing full healthcare coverage. There are plenty of hybrid systems like Switzerland’s and Germany’s, where many or even all providers are private but the state subsidizes costs. This seems the likeliest outcome for our system, too. Germany’s system seems a particularly ideal model for ours, because it’s considered Europe’s most unrestrictive, consumer-oriented system: patients are free to seek almost any kind of care they want at any time. Most Americans like having choices.

There are a number of lessons to be drawn from the failure of the GOP’s American Health Care Act:

  1. Collective action is frequently effective. Protest and other forms of collective action are necessary but not always sufficient to enact political changes. The New Deal wasn’t handed down from on high; Roosevelt encouraged the public to agitate for reforms, and the political pressure to pass the program probably would’ve been insufficient without the resulting popular agitation. (Other factors also contributed to its political viability; for example, certain segments of business, most notably General Electric, supported it as well.) More recently, Republicans in swing districts were made keenly aware just how destructive the AHCA would be to their constituents – and hence their political careers. Without the widespread protests, there would’ve been no leftward pressure on Republicans, and it very likely would’ve passed in an even more draconian form, since the other opposition to the bill came from the right.
  2. Governing requires the ability to negotiate compromises. The AHCA failed in no small part due to the GOP’s inability to marshal its left and right flanks. Every major piece of American legislation, from the New Deal to the Great Society to the Affordable Care Act, was in some way a compromise between different ideological factions of one or two parties. The necessity of compromise means no piece of legislation is perfect or entirely satisfies anyone, but it’s the price of governing, and it’s worth paying if it benefits people. Thanks to the ACA, tens of millions of people gained healthcare coverage they wouldn’t have otherwise gotten. Thanks to the Republican coalition’s inability to unite in government, they still have it.
  3. Policy expertise matters. GOP politicians have disdained government bureaucrats and legislative research assistants since Gingrich. As a result, few elected Republicans still fully understand the legislative process. Getting elected to office requires an entirely separate skill set from writing legislation. Many officeholders have assistants for this, and there’s no shame in that; staying in office requires constant fundraising (which is shameful, but unlikely to change while Citizens United prevents campaign finance reform), which leaves little time to draft legislation oneself. However, GOP leaders have come to disdain expertise.

    Paul Ryan is often considered today’s GOP’s “intellectual leader”, but, as commentators like Paul Krugman have argued for decades, this reputation is entirely undeserved. The AHCA was largely Ryan’s bill, and his policy misunderstandings played a direct part in its failure. Ryan sincerely believes his political stances, but they’re largely not based on evidence. The AHCA would’ve destroyed the mechanisms by which the ACA functions, thereby ending coverage for tens of millions. Elected Republicans’ overall failure to understand this contributed heavily to their bill’s collapse. It’s not even fully clear that they understand how insurance works. As long as they lack this understanding, their attempts to reform healthcare are likely doomed to failure.

  4. It’s much harder to eliminate an already functioning government program than to block a proposed new one. This is, indeed, a major reason the GOP agenda in opposition to Obama was so thoroughly obstructionist. Rep. Tom Rooney (R-FL) recently commented, “I’ve been in this job eight years, and I’m wracking my brain to think of one thing our party has done that’s been something positive, that’s been something other than stopping something else from happening.” As I said: reactionary. The ACA was initially unpopular, but as more people received coverage under the act and saw how it benefited them, they became attached to it. Republicans will now likely attempt to undermine the ACA through administrative malfeasance. It remains to be seen how effective this will be, given lesson five:
  5. Voters tend to blame the president’s party for policy failures. The Democratic Party didn’t actually possess the unrestricted ability to enact legislation for most of Obama’s presidency. Indeed, due to Al Franken’s late seating in the Senate (his electoral victory was legally challenged for months) and Ted Kennedy’s illness and death, Democrats’ filibuster-proof Senate majority only lasted about a month. After 2010, Republicans controlled the House of Representatives, and after 2014, they also controlled the Senate (though their majority wasn’t filibuster-proof). The ACA even existing is little short of a miracle; in retrospect, it’s almost unbelievable that Obama also bailed out the auto industry and passed a major economic stimulus in his first two years. Despite a steadily improving economy and legislative veto points severely constraining Obama, voters blamed policy failures on him, resulting in poor Democratic showings in 2010, 2014, and 2016. However, Republicans now control both chambers of Congress and the presidency. Voters blamed the GOP for George W. Bush’s failures in 2006 and will likely blame them for policy failures once again.
  6. Elections have consequences. Democrats did awfully in 2016 overall, but nonetheless gained House seats (despite still not controlling it). Since they maintained a unanimous front of opposition, the GOP could afford fewer defections to pass legislation. Had Democrats not gained those seats, Republicans might’ve had enough votes, and the bill could’ve passed. If Democrats make additional gains in upcoming elections this year and next, they’ll be able to obstruct more of the president*’s agenda.

Back to top · Table of contents · My portfolio · Contact me · Website index

Falsehoods about the economy

The economy generally, too, is often subject to myths, starting with supply-side economics, or what George H. W. Bush called “voodoo economics” in one of his more candid moments. The idea is that cutting taxes on the wealthy leads to a ‘trickle-down’ effect of increased government revenue and greater wealth for the entire populace. In almost four decades of practice, this has never occurred. Income inequality has steadily increased since the Eighties. Wages have barely kept pace with inflation, despite steadily increasing worker productivity; meanwhile, the rich have become vastly richer. This is, overall, terrible for economic growth, because people can’t spend money they don’t have. If people can barely afford rent and food, they certainly can’t afford iPhones and PlayStations. Meanwhile, the wealthy often became wealthy by not spending. Much of the cash they receive from tax cuts goes into savings. The working class will never see a cent of this money.

To be perfectly clear, there’s a point beyond which raising taxes on the wealthy won’t increase government revenue. We’re nowhere near it. As mentioned above, the top marginal income tax rate under Eisenhower was 91%. If we were at 91%, there’d be legitimate cause to question whether tax rates were too high; most economists believe that the optimal tax rate to maximize government revenue is in the 65% to 70% range. The top rate now is less than 40%, and since the wealthy often earn much of their income from capital gains, which are mostly taxed at 15% (though the rate varies from 0% to 28% overall), the effective tax rate on our country’s wealthiest citizens is actually, in practice, much lower than it is for the middle class; moreover, when one accounts for local and state taxes, which are often quite regressive, it’s often comparable to the tax rate on the working class.⁽⁷¹⁾


Falsehoods about guns

Another prominent set of falsehoods I’d like to examine concerns guns. At no point do I expect this country’s individual right to bear arms to disappear. Even if I supported the idea in theory (to be clear: I don’t), it’d be unworkable in practice. Our country simply has too many guns for seizing them all to be a workable proposal.

But it’s entirely possible that no topic is subject to more American mythology than firearms, starting with the Second Amendment itself, which explicitly cites “a well-regulated militia” as its basis for private gun ownership. In other words, possession of firearms can’t be outlawed, but it can be regulated. Limits to amendments are hardly unusual; the First Amendment doesn’t protect defamation, violations of privacy, or falsely shouting fire in a crowded theatre.⁽⁷²⁾

When the Bill of Rights was drafted, the most potent guns widely available fired a single shot and took around a minute to reload. Today’s rapid-fire guns would be unthinkable to the Constitution’s Framers. I’m not saying they should necessarily be banned entirely, but I also doubt the Framers would recognize a right to own a nuclear arsenal. One must consider the historical context.

A “good guy with a gun” stopping a “bad guy with a gun” (why is it always a guy?) is one of gun advocates’ favorite tales, but armed bystanders rarely intervene successfully to stop violent crimes. Many people lack the training to intervene without harming innocent bystanders. Police and military personnel have extensive weapons and combat training, and even they sometimes make mistakes. All too often, armed people without proper training are disarmed and their guns are used against them. Indeed, firearm ownership can make homes less safe; ill-intentioned intruders often use improperly stored firearms against their owners. However, people who have proper training in handling weapons and avoiding disarmament, practice shooting regularly, follow proper storage procedures, and aren’t subject to psychological ailments may accrue security benefits from owning firearms. (Owning firearms greatly increases one’s risk of suicide; those subject to depression should avoid doing so.)

Often, firearms are also handled improperly. The number one rule of gun safety is: always treat guns as if they’re loaded, even if you’re sure they aren’t. The number of people who’ve forgotten to check the chamber, with fatal results, is probably uncountable. Other commonly neglected rules include: never aim guns at targets you don’t plan to shoot; keep your finger off the trigger unless you intend to fire; point the muzzle away from non-targets; be sure of your targets and what’s beyond them. These rules are violated appallingly often in both popular media and real life. Proper storage is also routinely ignored; there’s been a rash of minors, some as young as toddlers, obtaining firearms with fatal results after their parents or guardians didn’t store them securely.

One final note is the “government wants to grab your guns” myth. It’s wholly implausible that any Democratic politician would actually seek to commandeer the guns of law-abiding citizens. That battle is lost, and anyone who can get elected knows it. All recently proposed national gun control regulations have been widely supported requirements for purchases: background checks, training, and/or psychological screening. Preposterously, purchasing a gun in America is easier than boarding an airplane. All recent proposals for national-level regulation of gun sales have failed; state-level restrictions have generally been loosened. Gun sales increased sharply under Obama; they’ve fallen precipitously under the president*. I’m not saying the gun lobby utilizes fear to sell new weapons, but I’m also not not saying that. I invite readers to draw their own conclusions.


Blue lies

Why are so many of these falsehoods so common? I suspect some people sincerely believe them. I mentioned above Paul Ryan, who certainly has the air of a true believer. Others, like Mitch McConnell and the president*, don’t seem so ideologically motivated. For them, I think these may be blue lies.

White lies are a simple concept: lies told to spare others’ feelings. Children start telling these at around age seven. Black lies are also simple: selfish lies. Children start telling these at around age three. Blue lies are more complicated: lies told to benefit an in-group at an out-group’s expense. Children start telling these at around age eleven. Blue lies are commonly accepted in some contexts; hardly anyone questions the need for espionage. Their usage in politics may not initially make logical sense, but if one considers that some people may genuinely believe their political opponents are existential threats to the country, then their origin perhaps becomes more fathomable.

To be clear, I don’t endorse blue lies at all. I’m not going to pretend that I don’t believe the president*’s administration* is an existential threat to America, but I also don’t believe that making further attacks on the truth will save us; such attacks are a large part of the reason we’re in such a precarious state to begin with. (Inherently exclusionary attacks on the truth are particularly ill-considered; we’re already divided enough.) A recent Scientific American article delved into the science of blue lies as they apply to politics, the president*, and his supporters; I recommend tracking it down.


Misconceptions about political organization

[I think a lot of the above examples are subject to intentional distortions by people with ideological agendas. I want to be very clear that this last example is different: I think mostly well-intentioned people are just mistaken about some basic aspects of how our system functions (or doesn’t function, as the case may be). I’m nonetheless putting it here because an unintentional falsehood is just as false as an intentional one. –Future Aaron]

I’ve mentioned that many people blame the Democratic National Committee for Democrats’ substandard performances in 2010 and 2014. In truth, structural factors made Democratic victories during those years highly unlikely. Those elections’ outcomes were doubtless worse for Democrats than they should’ve been, but off-year elections structurally disfavor even relatively popular incumbent presidents’ parties. Due to poor civic education in the United States, many voters possess little understanding of which parts of government have which powers. Many aren’t even capable of naming their national legislators or which parties they belong to, much less any info about their state legislators; many are also incapable of naming any current Supreme Court justice. (To be fair, I’m not sure I could immediately list all nine off the top of my head; it would probably take a minute or two.)

Thus, many voters blame presidents for factors entirely beyond their control. If Republicans control Congress and the president is a Democrat, voters tend to blame Democrats more than Republicans, even though the president’s only direct influence over the legislative process is vetoing legislation (and even that can be overruled). Voters also often aren’t conscious of structural economic factors. The effects of 2007’s economic crash, the worst since the Great Depression, were guaranteed to be felt for years afterwards regardless of who controlled the presidency or what they did; however, many voters with little education on how the economy functions were inclined to blame Obama for the sluggish recovery, however much responsibility he truly bore.⁽⁷⁹⁾

As a result, structural conditions in America’s electoral system strongly disfavor a presidential incumbent’s political party. The incumbent’s opposition is usually galvanized by anger; the incumbent’s party is usually demoralized since not everything has been fixed. There have been exceptions to this: Republicans faced losses in 1998 because their impeachment of Bill Clinton backfired with voters, and they gained seats in 2002 in the aftermath of 9/11. These structural factors, moreover, haven’t always existed, or at least haven’t been as extreme; when civic education was in a better state, voters were less likely to blame the president for factors outside his control. Given, however, that many Americans would be unable to recite even a Schoolhouse Rock-level understanding of their political system, 2010 and 2014’s outcomes seem like foregone conclusions.

That said, both were likely worse than they had to be due to the sorry state of Democratic organization on the ground. Still, the blame for this has mostly been apportioned to entirely the wrong institutions: national bodies like the Democratic National Committee (DNC), the Democratic Congressional Campaign Committee (DCCC), and the Democratic Senatorial Campaign Committee (DSCC).

None of these organizations are responsible for local organization and campaign infrastructure, because operatives in Washington, D.C., cannot possibly possess this knowledge for 435 separate Congressional districts and 50 separate states. Doing so requires personal connections on the ground and detailed knowledge of local politics. A campaign for Congress is likely to require at least $1 million of funding even in regions with few residents like the Dakotas and Vermont, and competitive districts like Jon Ossoff’s in Georgia can require $10 million or more. The DCCC took in $206 million in donations last cycle, and when you factor in operating costs, it can’t possibly hope to cover the costs of 435 separate congressional races.

As a result, campaigns for office need competent campaign managers willing to work for little or no pay (most won’t get paid at all). They need contacts in their district. They need to know what the constituents want and what their culture is like; this varies from district to district. They need skilled speechwriters, publicists, and marketing firms who know the district well and can craft effective messages that appeal to the constituency. They need sympathetic media contacts. They need connections to local donors. They need volunteers who know how to capably train other volunteers to canvass, register voters, and phone bank effectively. They need staff who can handle cybersecurity and manage large databases of donors, volunteers, and likely voters. They need people who can input all that information accurately and quickly. The list goes on and on and on.

And, most importantly, they need a candidate with sufficient skills to run for office, with no skeletons in their closet to make them unelectable, and who, above all, actually wants to run. Running for office means giving up your privacy; it means giving up months or years of your life for no pay and no guarantee of any payoff; it could mean subjecting yourself to threats if your constituents are particularly hostile to your party.

All of the tasks I listed above are difficult; they require skills not everyone possesses; many can’t safely be performed by people of certain ethnicities in certain districts; none are particularly glamorous; and almost none will be rewarded with a cent of pay. In a red district, there’s no guarantee that finding any of these people will be easy. This is specialized knowledge for each district, and it will change when people move out of the area or become too busy to volunteer or lose interest in politics. A donor in Ft. Lauderdale may have no desire to give money to a campaign in Miami, much less Orlando. The desires of the constituency will also vary from region to region. What works in urban Texas may not work in rural Nevada. What works in rural Nevada may not work in urban Nevada. What works in Austin may not work in Dallas. What works in Dallas may not even work in Fort Worth. Constituents of each district may have highly idiosyncratic wishes that may not be readily apparent or understandable to people who haven’t lived there and experienced its culture.

This may even extend to institutional culture. A form of organization that’s palatable to volunteers in one part of the country may not be palatable to volunteers in another, because volunteers in the second part may simply be used to doing things a different way, or may chafe at being told what to do by outsiders, or may simply work less effectively because they have no experience doing things that way. A team’s interactions can change radically with the addition or subtraction of one particular person; without close knowledge of the team members, one can’t hope to manage them effectively.

A national party organization cannot possibly cultivate all this information for 435 separate districts, and it’s rather mind-boggling to see people who claim to detest top-down structure in America’s economy and political system behave as if our major left-leaning political party should be structured in a top-down manner. There’s no possible way to direct all this work from Washington, D.C., with existing technology and win elections. This is the purview of local and state parties, and most of them have been neglected by both volunteers and the media.

And then there are people who complain that any donation from a wealthy person corrupts a candidate for office or a particular nonprofit organization, and also complain that the DCCC and DSCC don’t do more for congressional and Senate races. Where exactly do these people think those donations come from? The unpleasant nature of this system is one major reason I’ve consistently advocated overturning Citizens United, but we have to work with the system we have, and these two complaints are self-contradictory. We should not let the perfect be the enemy of the good.

A substantial number of ‘leftists’ appear to spend all their time online criticizing the Democratic Party for many things it’s not actually responsible for. I suspect more than a few are paid Russian trolls, but many others are undoubtedly Americans with benign intentions. However, they’re entirely off base. If they first took the time to learn what the Democratic Party actually does (I haven’t outlined even half of it here) and how American elections actually work, then actually invested the time they spend online into improving it, then the Democratic Party would be far better off. But this would require listening to other people who may know more about certain topics, and thus requires the ability to acknowledge that one’s own knowledge is imperfect. Some of these people seem to lack that ability.


“Times of universal deceit”

It’s a sad commentary on our times that news coverage and political discourse so often no longer reflect reality. The concept of truth itself is currently under siege. The current administration* has engaged in an all-out war on truth with coinages like “alternative facts”, but I’m no longer certain mainstream news sources can be considered reliable either. Many narratives they’ve presented us are unreliable or simply flat-out false. As Orwell observed, in times of universal deceit, truth-telling is a radical act.

If the news fails, where do we turn? Maybe we abandon mainstream sources and seek ones that were dependable all along. I hope people do this. I hope people support these sources financially so that they can continue doing dependable work; journalism is indispensable to democracy, and it’s in an incredibly sorry state right now. But perhaps, too, truths can still be told in fiction that can no longer be told in the mainstream news. Moonlight certainly tells some of them. I hope to see more.


Attempting to anticipate tomorrow’s problems today

Automation and the future of work

A topic I particularly hope to see fiction examine that few news sources are covering today is the inevitable loss of jobs to automation and other new technologies. It’s already occurring: much of the “economic anxiety” in regions like West Virginia⁽⁷³⁾ is in response to the likelihood that, despite the president*’s promises, coal jobs won’t return; Robert Murray, CEO of America’s largest private coal extraction company, recently admitted this. Renewable energy is simply already too efficient for coal extraction to remain economically justifiable. Manufacturing, overall, is vanishing as the process becomes more automated. Transportation jobs will fade when autonomous driving comes of age. The service sector, food production, finance, possibly even education: these can increasingly be automated, and they will once it’s profitable (again, possibly excluding education). If we’re not prepared for the disappearance of these jobs, the world afterwards will be horrifying.⁽⁷⁴⁾

One solution is the government as employer of last resort. Keynes satirically proposed burying currency in disused mines, filling them with refuse, and providing permits to private enterprise to mine them as a solution to economic depressions. He compared this to gold mining and said it “would be better than nothing” because it would give people a source of funds and thus drive up demand. But his actual argument was that the government could create employment to more constructive ends: “build[ing] houses and the like”. This has been done before; programs like the Civilian Conservation Corps and the Works Progress Administration employed millions of workers during the Great Depression and constructed infrastructure all around the country that is still used today. The original Humphrey-Hawkins Full Employment Act proposed the government as a “reservoir of public employment” as a last resort for when the private sector proved unable to provide full employment, but while Jimmy Carter signed this into law in 1978, this provision of the Act has been neglected, as, indeed, has much of the rest of the Act. This seems an appropriate time to revive it.

There are many potential benefits to expanding government employment. Baby Boomers are aging. Elder care is a tremendously neglected field, but it currently isn’t that profitable. This means that, all too often, aging populations are also neglected. Government subsidies could increase the pay scale for this field, which would draw more and better qualified workers. While some well-qualified people are willing to make $10 an hour caring for the elderly, more will be willing to make $15 an hour doing so, and still more willing to make $20. Our infrastructure is currently crumbling. Bridges, roads, tunnels, train tracks, subways: all need repairs. Our national roads are still unprepared for the emerging technology of electric cars. There have even been proposals for roads that gather solar energy; the technology for this already exists. The list goes on; we can keep people occupied installing green energy sources for homes, performing maintenance, planting trees, or any number of other things that need to be done but simply aren’t profitable enough for anyone to do right now. Without government intervention, none of it seems likely to occur.

However, it remains unknown whether this solution alone can provide enough helpful employment to keep the whole populace employed, gainfully, indefinitely. And that leads to another question: Will we become like the Earth depicted in The Expanse, with half the population dependent upon government subsidies and miserable? One solution to the inevitable problem of too few jobs is universal basic income: government pays everyone a steady stipend, enough for food and shelter, regardless of work status. I should note that I fully support this, but I’m not convinced it will be a sufficient solution to the crisis of automation, and it’s not for economic reasons, but for more elementary ones: I’m simply not convinced a government stipend will be adequate for the human spirit.

It turns out that, when programs like this have been tried (e.g., the Mincome program in rural Manitoba), most people who can work keep working. People like to feel useful, and unemployed people rarely feel useful. The only people who quit working for long in any large numbers were new mothers and students who’d only been working to support their families. The remainder who quit jobs mostly used the income as a stopgap until they found new employment; they probably hated those jobs and would’ve quit them far sooner, but didn’t have the resources to do so independently. However, what does it do to our societal happiness if there simply aren’t enough jobs for everyone?

Maybe people can find additional fulfillment through creativity. Robert A. Heinlein, in one of his earliest novels, For Us, the Living, imagined a world like this. But is that realistic, or is the widespread discontentment of the population of The Expanse likelier? It’s difficult to know this in advance, but few people are even discussing it. It seems an important question to ask, and an appropriate one for fiction. I certainly find fulfillment in creativity. Maybe it’d be sufficient for me in a world where work for survival was no longer required. On the other hand, much of my fulfillment in the last two years has also come through work. I enjoy what I do, and I’m good at it, and I’m steadily improving. I’m certain this is all correlated. I’m not sure creativity would provide this much fulfillment unless I felt I was improving. I’m not even sure whether I would feel certain of that unless I had a substantial audience. If everyone’s a creator, can everyone have a substantial audience? I’m also extremely introverted and differ from the average person in many other ways. Will creation matter as much to others as it does to me? My experiences certainly aren’t universal by any stretch of the imagination.

Another option is strengthening labor laws. Institute a maximum number of hours people can work per week. There isn’t enough work to do? Then cut the maximum work week to well below forty hours. Mandate paid vacation time, family leave, and any number of other changes labor advocates have sought for ages. Some Europeans accuse us of working too hard as it is; maybe this isn’t the worst idea overall. But taking it too far runs the risk of government overreach, telling people what they can and can’t do. Too much of that could make people feel oppressed. Maybe there’s a compromise between the future of basic income, where everyone has nearly unlimited freedom but maybe not everyone feels useful, and the future of heavily regulated work, where people have less overall freedom but still feel useful because they still contribute directly to society’s continued existence.

Maybe, also, these three ideas – government as employer of last resort, universal basic income, and restrictions on hours worked – can all be combined productively. Perhaps there are even other solutions that I haven’t heard, or that haven’t even been imagined yet. These ideas remain largely unexplored. The news media certainly aren’t discussing them, but alongside climate change, they seem increasingly likely to define the remainder of the twenty-first century.


Security in information technology

Having looked at falsehoods and the future of work, I now turn to a topic that is likely to underpin a large amount of the remaining human labor of the future: security in the information technology sector. Current IT security practices are very, very bad; there have been a number of well-publicized leaks of data from large, well-known companies like Sony, from government departments like the State Department, and from virtually unknown actors like the data consulting firm Deep Root, a major contributor to the Republican Party’s vast data-mining operation. These problems are only going to get worse in the future; while the computer science field in general has not expanded as quickly as observers predicted it would, the security sector has continued to expand, and it only seems likely to expand further.

There are a number of reasons for this. Some of them are highly technical, and I won’t delve into specifics like the nature of cryptographic algorithms, but the basic concepts underpinning IT security are actually fairly simple, if widely misunderstood. The primary reason IT security will continue to grow, however, is that it requires a perfect track record to prevent breaches of information. A company or state actor has to defend against multiple points of intrusion, some of which the defending party may not even know exist; an attacker only has to penetrate one to be successful. The involvement of state actors like Russia and North Korea in cyberwarfare has raised the stakes to historically unprecedented levels, and we have been caught woefully unprepared by them.
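
The defender’s dilemma described above can be illustrated with a toy probability model. This is my own illustration, not drawn from any particular source, and the 99% figure is an arbitrary assumption:

```python
# Toy model (hypothetical numbers): even highly reliable defenses fail
# somewhere once an attacker has enough independent points of entry.
p_secure = 0.99  # assumed chance any single entry point resists attack

for n_points in (1, 10, 50, 200):
    # Probability that at least one of the n points is breached
    p_breach = 1 - p_secure ** n_points
    print(f"{n_points:3d} entry points -> {p_breach:.1%} chance of a breach")
```

With a couple hundred such points, a breach becomes far more likely than not, which is why the attacker’s job is structurally so much easier than the defender’s.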

And IT security has, thus far, been underemphasized, because there are tradeoffs between performance and security, and people haven’t wanted to make the tradeoffs. Some of this is because people simply aren’t fully aware of the risks involved; some of it is because no matter how good a security protocol is, it can’t defend against human error; and some of it is because even if it were possible to build a perfectly impenetrable algorithm, doing so would require sacrifices in terms of development cost, processing power, storage space, and other factors that developers, end users, or both have decided they are unwilling to make.

The basic security services expected of IT are fairly simple to explain:

  1. Confidentiality: Prevent unauthorized persons, processes, or devices from accessing private user data.
  2. Integrity: Protect systems and data from accidental or malicious modification or destruction.
  3. Availability: Ensure that authorized users have reliable, prompt access to data.
  4. Authentication: Verify a message sender’s identity or a user’s credentials to access requested data.
  5. Non-repudiation: Provide proof of identity to senders and recipients so that neither can later deny interactions (e.g., for electronic transactions).
  6. Access control: Ensure that users, programs, processes, or systems are authorized to access data or resources.
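To make a couple of these services concrete, here’s a minimal sketch of a message authentication code, which provides integrity and authentication, using Python’s standard-library hmac module. The key and messages are hypothetical placeholders; real systems derive keys through vetted protocols rather than hard-coding them.

```python
import hmac
import hashlib

# Shared secret key, known only to sender and recipient.
# (Hypothetical placeholder; real keys come from a vetted key exchange.)
KEY = b"hypothetical-shared-secret"

def sign(message):
    """Compute an authentication tag over a message (integrity + authentication)."""
    return hmac.new(KEY, message, hashlib.sha256).hexdigest()

def verify(message, tag):
    """Accept only if the tag matches; compare_digest resists timing attacks."""
    return hmac.compare_digest(sign(message), tag)

message = b"Transfer $100 to Alice"
tag = sign(message)

print(verify(message, tag))                    # True: message untampered
print(verify(b"Transfer $900 to Alice", tag))  # False: tampering detected
```

Anyone who alters the message without knowing the key can’t produce a matching tag, so the recipient detects tampering (integrity) and knows the message came from a key-holder (authentication).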

Most of these services are maintained through encryption. I’m not going to go into the specifics of how encryption algorithms work; encryption is a dry, technical, mathematically intensive process that many readers may not understand and that is not particularly relevant to the topics I’m addressing here. The important thing to know about encryption is that it’s based on mathematical algorithms that make it easy to decrypt messages if you have the correct key, but practically impossible if you don’t.

(Barring a stroke of extraordinary luck, the encryption keys used in today’s software couldn’t be decrypted by today’s best supercomputers using a brute-force attack – in other words, trying every possible key until one worked – before the heat death of the universe. If quantum computing ever takes off, asymmetric algorithms whose security depends on the difficulty of problems like prime factorization could be broken, but luckily, postquantum cryptography is likely to be well underway by the time that happens, if it ever does.)
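That claim sounds hyperbolic, but back-of-the-envelope arithmetic bears it out. This sketch assumes a hypothetical machine testing a quintillion keys per second – far beyond any real hardware I’m aware of – against a 128-bit keyspace:

```python
# Back-of-the-envelope estimate of brute-forcing a 128-bit key.
# The guess rate is an illustrative assumption (a generously fast
# hypothetical machine), not a benchmark of real hardware.
guesses_per_second = 10 ** 18
seconds_per_year = 60 * 60 * 24 * 365

keyspace = 2 ** 128             # number of possible 128-bit keys
expected_tries = keyspace // 2  # on average, half the keyspace suffices

years = expected_tries / (guesses_per_second * seconds_per_year)
print(f"{years:.2e} years")     # on the order of 10**12 years
```

That works out to trillions of years – hundreds of times the current age of the universe – and doubling the key length to 256 bits squares the keyspace rather than merely doubling it.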

As a result of this, attacks instead usually focus on ways to obtain keys through deception, exploitation of code flaws, malicious software (malware), or other vulnerabilities unrelated to encryption. One particularly popular form of attack is attempting to trick a user, device, or software into disabling encryption; 2G (GSM) wireless phone technology is in the process of being phased out in the United States because it is particularly susceptible to these kinds of attacks.

Why does this matter? For starters, so much of our data is already online, and as we move even more information into the cloud and carry more data with us on smartphones, this will only increase. Most people have no idea how much information private companies already know about them and would be alarmed if they did. Facebook is infamous among privacy advocates for compiling massive amounts of data about every human being on the planet; the company creates “shadow profiles” even for people who haven’t signed up for accounts. Facebook claims it keeps most of this data private and uses it mostly internally for advertising services. There actually isn’t much evidence that it’s lying about this specific claim, but this raises the important question of what happens if its data is subject to a breach of confidentiality.

There have been numerous alarming breaches of privacy lately. I mentioned the Deep Root hack above. It was fairly recent; Deep Root is a consulting firm for Republicans that had compiled massive amounts of data about voters’ party registration, addresses, contact info, and likely political opinions. Most of the information was extrapolated from data available online, and while it wasn’t compiled directly from the voters, experts who examined the data confirmed that it was scarily accurate. Before the security hole was closed, this data was available to anyone who had the correct URL; it wasn’t protected by so much as a login/password request. The company claims the information wasn’t accessed by malicious third parties, but we have only its word for it.

This is fairly typical of many companies’ approaches to security: it’s not a priority. Many companies will give their employees courses on IT security best practices that plainly aren’t followed within the company itself. Even obvious guidance like “don’t write your password down near your computer” isn’t followed in many workplaces; my own employer actually has our passwords on our monitors. And yes, they gave us an IT security course that tells us not to do this.

The problem, of course, is that there’s a tradeoff between security and usability, and any approach to IT security is going to have to take into account actual human behavior. Software engineers aren’t always good at this. Originality wasn’t the main reason Steve Jobs was considered a pioneer in the computer industry; Apple’s products, with the arguable exception of the iPhone, weren’t especially original.⁽⁷⁵⁾ Jobs was, rather, a pioneer because he took into account the end user experience; he wanted his products’ design to reflect how people actually used computers instead of how software designers wanted them to use computers. This is an important approach, and to this day it’s one many computer professionals seem absolutely oblivious to.

Any IT “best practices” course that doesn’t take into account actual human behavior is doomed to failure. You can have the best encryption algorithm in the world, but if you don’t safeguard the human element, it doesn’t matter how good your encryption is; attackers will get into your system by bypassing the algorithm and obtaining the key through trickery. This is, in fact, what happened to the Democrats in 2016, and it was an extraordinarily foolish mistake. The fault wasn’t with the staff; John Podesta, the Clinton campaign’s chairman, in fact did exactly what IT best practices say to do in case of an attempted attack. Suspecting a phishing attack, he had his aide email his IT security team about the offending email. The IT team responded, “This is a legitimate email. Podesta needs to change his password.” So he did. The problem is that the IT team’s response was evidently a typo. They meant to write, “This is an illegitimate email.” The email was a phishing attempt. In other words, the Podesta email dump and every single Wikileaks story that resulted from it were the result of a typo.

I’m in IT, and I’m autistic. There are a lot of autistic people in IT. Overall, we don’t have the best grasp of how neurotypicals communicate, and we’re not very good at predicting how neurotypicals behave. This is a problem we need to solve. Any IT security approach that doesn’t take into account communication differences and behavioral preferences will be undone by little things like typographical errors. I have an advantage: I’ve spent decades writing online posts for audiences largely consisting of neurotypical people. I don’t expect others in IT to have this background; it’s an unrealistic expectation. But communication is a serious problem that needs to be addressed. I’m not sure whether we solve it by requiring IT training to incorporate more communication and writing instruction, or by providing more classes on it at the corporate/campaign end. But if it’s not addressed, it’s going to lead to more problems.

And there are additional problems: a lot of IT security requirements that have been implemented have arguably been counterproductive. For years it’s been common wisdom that passwords should be easy to remember and difficult to guess. A simple password consisting of a single word is vulnerable to a dictionary attack (i.e., going through the entire dictionary one by one until a word works); a password that incorporates a pet’s or lover’s name, a birthday, or a mother’s maiden name can be cracked by anyone who knows the person. So we’ve thrown in a bunch of wrinkles. Passwords have to contain at least a certain number of characters: this is intrinsically sensible, since it increases the difficulty of guessing the password. Passwords need to contain a number, a symbol, and/or a combination of uppercase and lowercase letters: this prevents simple dictionary attacks. And so on.
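The arithmetic behind these rules is straightforward. This sketch compares the search spaces an attacker faces under different password policies; the dictionary size and character-set sizes are illustrative assumptions, not measurements:

```python
import math

# Rough comparison of the search spaces different password policies create.
# The dictionary size and character-set sizes are illustrative assumptions.
dictionary_words = 200_000  # a large English word list (assumed size)

def keyspace(charset_size, length):
    """Number of possible passwords of a given length over a character set."""
    return charset_size ** length

print(f"single dictionary word:  {dictionary_words:,} guesses")
print(f"8 lowercase letters:     {keyspace(26, 8):,} guesses")
print(f"8 chars, ~94 printable:  {keyspace(94, 8):,} guesses")
print(f"entropy of the last:     {math.log2(keyspace(94, 8)):.1f} bits")
```

A single dictionary word falls almost instantly; widening the character set multiplies the attacker’s work by roughly ten orders of magnitude – which is exactly why attackers pivot from brute force to phishing and keyloggers.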

But then we take these things too far. Not reusing passwords across sites is sensible, because if a hacker gets one password, they can’t access all your accounts. But that leaves you with dozens of complicated passwords to remember. And then a lot of sites require you to change your password every year or six months or three months, and how is anyone going to remember all those without writing them down somewhere or using a password manager? This is where you run into the limitations of basic human behavior: people simply aren’t going to be able to remember all those passwords.

There are some things that can be done to fix these problems. Two-factor authentication is a good one: it requires a separate confirmation besides a password, usually via a physical device that only the correct user should have access to. Of course, this runs into the problem of the device getting stolen, but a thief would have to steal the device and crack the password, which is much unlikelier than simply cracking the password. (Podesta was instructed to use two-factor authentication, but this was apparently in the same message that told him to change his password, so it came too late to prevent a hack.)
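The most common form of this today is the time-based one-time password (TOTP) algorithm standardized in RFC 6238, which is what authenticator apps compute. Here’s a minimal sketch using only Python’s standard library; the shared secret is a hypothetical placeholder, and real deployments should use audited libraries rather than hand-rolled code:

```python
import hmac
import hashlib
import struct
import time

def totp(secret, timestamp=None, step=30, digits=6):
    """Time-based one-time password (RFC 6238, HMAC-SHA1 variant)."""
    if timestamp is None:
        timestamp = int(time.time())
    counter = struct.pack(">Q", timestamp // step)  # 8-byte big-endian counter
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Phone and server share the secret once, at enrollment; afterward each
# independently derives the same short-lived code every 30 seconds.
secret = b"hypothetical-shared-secret"
print(totp(secret))

# Sanity check against a published RFC 6238 test vector:
assert totp(b"12345678901234567890", timestamp=59, digits=8) == "94287082"
```

A stolen code expires within a minute, so capturing one in transit gains an attacker almost nothing; obtaining codes on demand generally requires compromising the device itself.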

But there are other problems with passwords. The most obvious is when people simply tell others their passwords, either because they’re tricked into doing it or because they trust the people they tell and want to share the account. Another is when people use easy-to-guess passwords. “123456” was the most commonly hacked password of 2016; it’s just one digit off from a password used in a scene from Spaceballs that many IT security professionals love to show (“that’s amazing; I’ve got the same combination on my luggage”). In many other cases, users simply keep manufacturers’ default passwords, which are also easy for hackers to guess. Another is the danger of password entry being observed by onlookers in public places. Still another is a keylogger installed on a user’s device: a piece of malware that reports the user’s keystrokes on the assumption that some will be passwords.

Biometric identification is emerging as a potential alternative, although it has its own limitations; for instance, since biometric factors won’t match exactly, there is a danger of false positives or false negatives. Biometrics could also be spoofed; for instance, the technology to create a false retinal scan or a false fingerprint is likely not far away. The strongest approach to security will likely involve combining biometrics, passwords, and two-factor authentication.
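The false-positive/false-negative tradeoff can be made concrete: a biometric matcher outputs a similarity score rather than an exact match, and the operator must choose an acceptance threshold. The scores below are made-up illustrative data, not measurements from any real sensor:

```python
# A biometric matcher outputs a similarity score, not an exact match;
# the operator picks a threshold that trades false accepts for false rejects.
# These scores are made-up illustrative data, not from any real sensor.
genuine = [0.91, 0.88, 0.95, 0.79, 0.85]   # same person, re-scanned
impostor = [0.41, 0.66, 0.72, 0.35, 0.58]  # different people

def rates(threshold):
    """False-accept and false-reject rates at a given threshold."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

for t in (0.5, 0.7, 0.8):
    far, frr = rates(t)
    print(f"threshold {t}: false accepts {far:.0%}, false rejects {frr:.0%}")
```

Raising the threshold squeezes out impostors at the cost of locking out legitimate users, which is one reason biometrics work best as one factor among several rather than as a wholesale replacement for the others.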

Back to top · Table of contents · My portfolio · Contact me · Website index

Some consequences of the 2016 election

Back, again, to 2016. In early November, Wisconsin governor Scott Walker said Clinton would be Obama’s third term. He meant this critically, but I can’t really think of a better endorsement. Obama wasn’t perfect, and Clinton wouldn’t have been either, but it’s unlikely our system could elect a better president; Duverger’s Law, the observation that winner-take-all voting systems like ours result in two-party systems, is a harsh mistress, and if we want better presidents, we need a better voting system, like 3-2-1 voting or approval voting.⁽⁷⁶⁾ Had the Republican primary used a better voting system, the president* almost certainly wouldn’t even have won it.⁽⁷⁷⁾

The Obama years were wonderful years for America overall. Queer people made enormous gains; our economy recovered steadily from the worst crisis since the Great Depression, with consistent job growth for eight straight years; our broken healthcare system was reformed to an extent unprecedented in fully a century of attempts to do so; the world remained largely peaceful, despite horrifying developments in the Middle East; respect for America increased worldwide; the American auto industry was rescued from obliteration. Clinton would have continued this, and she had some fantastic new ideas that, if implemented, would have further improved our country. (I was particularly enamored with her absurdly comprehensive mental healthcare proposal. Most Americans never heard about it, because the media were too busy gossiping about Anthony Weiner to cover it.)

Moonlight reaches us at a particularly crucial time, since many of the precarious gains marginalized groups have made in society are now threatened by the current government. The danger is very real. Mike Pence is the worst kind of ideologue; he believes queer people don’t deserve equal legal protection and has ambiguously hinted they should forcibly be subjected to shock “conversion therapy”, a barbaric practice linked to anxiety, depression, drug addiction, homelessness, and even suicide; it has justifiably been outlawed in several states. His policies in Indiana caused an explosion in HIV/AIDS infections (over 150 in a county of 23,000) when they closed Scott County’s Planned Parenthood, its only HIV testing site; he dismisses condoms’ effectiveness, conflating “typical usage” statistics (i.e., inconsistent/inept usage) with “correct usage” statistics (i.e., consistent usage as intended).

The president*’s proposed policies amount to literal ethnic cleansing and severely endanger minority communities, particularly Hispanic and Muslim ones. (Mahershala Ali’s Best Supporting Actor Oscar, which made him the first Muslim to win an Oscar for acting, could hardly have come at a time when America’s Muslim community needed more reassurance, either.) The administration claims “sanctuary cities” are dangerous; in reality, the opposite is true. If undocumented immigrants are afraid the police will deport them, they’re also afraid to report crimes or testify. That puts everyone in greater danger, particularly if desperate immigrants respond to injustices with violent actions or appeals to organized crime. These cities also aren’t “refusing to enforce immigration laws.” They’re simply refusing to take additional measures (that aren’t legally required) to make federal immigration agents’ jobs easier. Beyond that, lacking the correct immigration status isn’t a criminal offense; it’s a civil one. Entering the country illegally is a criminal offense, but once a person has done so, they’re no longer committing it, and since it’s not a local offense, it’s not local law enforcement’s concern; plus, many undocumented immigrants entered the country legally and simply overstayed their visas. (This is a major reason the president*’s wall has always been so thoroughly preposterous.)

Back to top · Table of contents · My portfolio · Contact me · Website index

Personal reflections on Jewish heritage and resurgent anti-Semitism

I’ve mentioned my Ashkenazi background. I’ve never actually practiced Judaism; my Jewish heritage is from my father’s side of the family. I was raised Christian, and while a number of the religion’s teachings remain major concerns of mine (especially its concerns for the sick and the poor), I consider myself agnostic. But I also strongly identify with my Jewish heritage. My name is Jewish; I look Jewish; most people who meet me probably assume I’m Jewish in both ethnicity and religion, and since I don’t much care about agnosticism one way or the other, I have no incentive to correct them.

The president*’s final campaign ad contained so many anti-Semitic clichés that it may as well have consisted of readings from The Protocols of the Elders of Zion. Apologists have argued that his daughter and son-in-law are both Jewish and appear to be two of the few people he genuinely respects, but having a “black best friend” is no longer seen as an acceptable defense to the charge of racism, nor should it be. Liking one or two specific Jews similarly doesn’t absolve a person of anti-Semitism. Moreover, for a person who hypothetically isn’t anti-Semitic, the president* certainly has surrounded himself with a large number of neo-Nazis and white supremacists: Stephen Bannon, Sebastian Gorka, Jefferson Beauregard Sessions III, even Stephen Miller, whose own Jewish ancestry doesn’t make me any less suspicious of his authoritarian tendencies (particularly given his long association with Richard Spencer, the Nazi everyone loves to watch get punched in the face).

Threats to Jewish community centers and synagogues have skyrocketed since November 8. They’ve been vandalized with swastika graffiti and calls for genocide. Some of the threats were hoaxes, but many perpetrators remain at large. When an Orthodox Jewish journalist asked the president* about anti-Semitism, he condescended to the journalist, claimed nobody cared more about Jews than he did, and steadfastly refused to name anything he’d do to combat anti-Semitism. The administration and congressional GOP erased us from their Holocaust Remembrance Day statement; for Passover, press secretary Sean Spicer first claimed Hitler never used chemical weapons, then claimed he never gassed “his own people”, and finally that he didn’t target “innocent people” (I guess we didn’t count as either). Spicer ultimately apologized, and he appears to have sincerely meant it (and to have been so utterly oblivious that he didn’t realize his words could be interpreted as referring to the Holocaust), but the fact that he said these things at all is alarming. Otherizing a people is the first step to genocide. It doesn’t mean genocide will definitely follow, but it’s chilling to watch the first step being taken.

I’m certainly not in anything near the danger from my ethnicity that most of America’s marginalized communities are. Overall, in fact, I’m extremely safe. The president* isn’t Adolf Hitler. I don’t think he actually has a coherent enough ideology to turn into Hitler, although, to be clear, Hitler’s ideology was never at all coherent, either. The president* is too much of a narcissist to have anything resembling an ideology; he’s certainly a racist and a misogynist, but by contrast, Hitler’s hatred sprang from deep-seated beliefs about the world, and I don’t see any evidence that the president* believes anything with any sincere conviction. He is, in fact, almost entirely an empty shell: he says what he thinks his audience most wants to hear, regardless of its connection to reality, and it’s simply unfortunate for us that he’s so proficient at ascertaining what his audience wants to hear. He’s not a conventional liar, as a liar at least knows the truth and calculatedly contradicts it; he simply doesn’t care what the truth is, and he’s as happy to tell the truth as to lie if he believes it will benefit him.

But I’m aware of history. When racial hatred is incited, the ultimate results are rarely pleasant for us. And some of the president*’s supporters seem to see sizeable parallels to Hitler. After Amazon released “Resistance Radio” as a tie-in to The Man in the High Castle, some of the president*’s supporters read it as a criticism of him, even though it was clearly an in-universe attack on the series’ Hitler. Perhaps they’re simply unaware of the context; perhaps not. I offer this without further comment.

Anti-Semitism isn’t like other racisms. It disappears underground for thirty years, fifty years, a century, and people think it’s ‘solved’. Then it reappears, fully formed, as though out of nowhere, with all the old conspiracy theories and calls for genocide intact. The anti-Semitism of the past few months is the sort of thing I’d previously seen only in history books and films. I’d encountered modern anti-Semitism before, but it was nothing like this – nowhere near as virulent or outright genocidal.

I don’t entirely trust the left to fight it, either. I certainly remember that during WWII, a rather large number of Jewish refugees came to our shores and were turned away. Roosevelt, had he chosen, likely could’ve shamed the country into accepting more refugees, but he seems to have had his own prejudices. I can’t help thinking more of us would’ve survived the 1940s if he hadn’t had them.⁽⁷⁸⁾ And I certainly notice how many ‘leftists’ blame the Democrats’ performances in 2010 and 2014 on the DNC and Debbie Wasserman Schultz (many relevant factors are entirely outside the DNC’s purview, and Tim Kaine was DNC chair in 2010); use “Goldman Sachs” as a synonym for Wall Street’s ethical failures without mentioning similar repeat offenders like JP Morgan Chase and Wells Fargo; and blame Haim Saban and George Soros for seemingly everything they dislike about the modern Democratic Party. I have no great liking for Wasserman Schultz, Saban, Soros, or Goldman Sachs, but the pattern here is difficult to miss. Some modern ‘leftists’ seem to doubt that anti-Semitism still exists. I mistrust them as allies.

Recent articles in mainstream publications have asked if we’re white. For decades, American society’s answer to that question seems to have been, “Yes, conditionally.” It remains to be seen whether that answer will be rescinded. Since the election, I’ve wondered if I’ll ever feel I need to make aliyah. To be clear, I regard Israel as an apartheid state. However, if anti-Semitism here gets bad enough that I have to flee, I don’t want to find out later that it’s gotten equally bad in my adoptive home country. If I have to flee, I only want to do it once. And if I end up in Israel, perhaps I can do some good there that I can’t do from here.

Overall, though, for now, I’m extremely safe. I have a good family and a good job, I live in a good community, and while I’m a member of multiple marginalized groups, no one who meets me is likely to discern anything about me that would plausibly place me in personal danger. I intend to keep it that way. I’m far more worried about others’ safety and security. However, I’m no longer entirely confident that the state considers me a full citizen, nor do I take white privilege for granted.

It got worse (2025)

[2025 addendum. Present tense within this subsection refers to 2025.]

I’m only going to mention in passing the Tree of Life shooting and various other anti-Semitic hate crimes that have occurred since I wrote the above. I’m also not going to spend much time on a convicted felon claiming there were “very fine people on both sides”. Remember that? Neo-Nazis shouted “Jews will not replace us”, and a white supremacist murderer whose name I won’t dignify by printing shot and killed nine people at a black church, and the convicted felon whose name I also won’t dignify by printing wouldn’t even condemn them. I don’t want them to go down the memory hole entirely, but it’s too depressing to contemplate them in full.

It’s gotten so bad that Mike Godwin, coiner of the eponymous law stating that the longer an Internet discussion goes on, the closer the odds of someone making a comparison to Hitler or Nazis get to 100%, had to state that his law did not immediately mean whoever made the comparison lost the discussion. Godwin, who is Jewish, could get on board with frivolous comparisons losing the discussion, but he felt compelled to clarify that not all Nazi comparisons were frivolous. In fact, the precise factors compelling him to make that clarification were “Jews will not replace us” and “very fine people on both sides,” and he’s outright encouraged comparisons of the speakers of both the quotes under discussion to Nazis. As Godwin wrote, his law “still serves us as a tool to recognize specious comparisons to Nazism – but also, by contrast, to recognize comparisons that aren’t.”

I do feel compelled to cap this off with a statement that I wouldn’t feel entirely comfortable making if I weren’t of Jewish background.

I get incredibly annoyed at people conflating all criticism of Israel with anti-Semitism. There has been some anti-Semitic criticism of Israel, don’t get me wrong. But it’s not anti-Semitic to call Israel an apartheid state, nor is it anti-Semitic to claim Israel is committing war crimes in Gaza, especially considering that numerous Jews, Noam Chomsky and Norman Finkelstein among them, have themselves made those exact arguments. And I’m going to back them up on this: Israel is an apartheid state, and it’s committing war crimes in Gaza.

Engaging in this sort of histrionic knee-jerk reaction trivializes actual anti-Semitism and makes the people who point it out look frivolous. At this point, the Anti-Defamation League is almost doing our cause more harm than good. They have, in fact, lost by extension of Godwin’s Law, because calling someone an anti-Semite is really just calling them a Nazi by a different name.

[End 2025 addendum. Present tense will refer to 2017 until further notice.]

Back to top · Table of contents · My portfolio · Contact me · Website index

Sources of marginalization

Unarticulated biases

I’ve spent much of this book detailing problems in society. To solve them, we may first need to identify their sources, many of which I suspect remain commonly misunderstood.

A common myth is that all bigotry comes out of malice towards a particular out-group. This is probably true of the worst kinds of lizard-brain, knee-jerk bigotry like the Ku Klux Klan’s, but many forms of bigotry are subtler. They’re often rooted in benign-seeming assumptions about human behavior or biology – the sort that generate statements like “You’re so articulate” (implying that other members of the subject’s race aren’t) or positive-seeming stereotypes like Jews’ and Asians’ supposed intelligence (implying that other minority groups aren’t intelligent, and implicitly insulting Jews and Asians who aren’t considered intelligent). These often aren’t caused by any form of malice at all, and many of the people uttering them probably have no idea of their implications; they simply stem from unexamined assumptions that, unfortunately, have unpleasant consequences.

I’ve addressed how many aspects of human behavior that result from unarticulated biases (“common sense”) are normalized in everyday interactions. Since I’m autistic, this affects my communication with others. People commonly blame autistic people for their problems communicating with others, but communication is a two-way street. We wouldn’t solely blame someone who doesn’t know Spanish for inability to communicate with someone who speaks only Spanish. The problem is that the two don’t know any mutually comprehensible languages. If a neurotypical has trouble communicating with autistic people, the latter aren’t solely at fault. The neurotypical may never have learned how to communicate with them, or even been given cause to consider that this is something they may need to learn. This is a problem of agnoia, not of one person’s behavior.

I’ve also discussed my difficulties with eye contact. I’ve learned not to worry so much about it. I’m not so desperate for meaningful human interaction that I need to make friends with people who are offended that I don’t look them in the eye at first, and I’m not in a position where eye contact with strangers is likely to be a major professional hurdle. Should I come into some form of employment that requires it, I’ll likely reëvaluate my position, but for the same reason, I’ll also avoid sectors of employment that require interactions with the general public if possible, such as retail and service. I recognize how incredibly privileged I am to be able to avoid these sectors. Many autistic people don’t have these opportunities. And this relates to my broader point.

Most people simply never have cause to question how society normalizes their preferences and needs, nor how this normalization affects those who don’t share them. I’ve mentioned African-American Vernacular English as being considered a signifier of reduced intelligence. In an endnote, I mentioned someone who’s incapable of understanding others’ speech when they’re looking away. There are dozens of other examples, and I can’t hope to compile a comprehensive list. Most people rarely think about how issues like these affect others. I think this is largely because the foundational ethical principle we are taught from childhood is inherently self-centered.

The Golden Rule’s fundamental flaw

I’m talking, of course, about the Golden Rule. It’s intrinsically flawed; I’d even say self-centered. It establishes a position of understanding others’ desires by way of one’s own. The underlying assumption is that all humans are functionally equivalent. But this is fundamentally wrong. Not all people want or even need the same things. “Do unto others as you would have them do unto you” establishes your desires as your behavior’s intrinsic purpose: if you treat others how you want to be treated, it implies, they’ll treat you likewise.

But the Golden Rule doesn’t account for the fact that many people don’t want to be treated the same way. Some people want fame; others like anonymity. Some people like heavily regimented work; others like the freedom to use creativity. Some people love parties; others only want a few close friends and find crowds exhausting. Some people want strangers to look them in the eye; others feel uneasy looking strangers in the eye. The variation in human preferences is astounding, and it’s seldom even acknowledged. The normalization of some of these preferences and not others has enormous impacts on our lives that most of us are never given cause to consider.

I’ve already stated the correct principle, of course. It can be phrased a few ways, including: “Do unto others as they would like to be done unto,” or, “Treat others how they want to be treated.” If they want your help, help them. If they want to be left alone, leave them alone. If they like parties, invite them to parties. If they hate crowds, don’t force them to attend.

[This vastly better principle is called the Platinum Rule. The Golden Rule might be an appropriate moral compass to teach kindergarteners who haven’t moved beyond solipsism, but adults should be able to do better. If others don’t share your preferences, basing your treatment of them on what you want does neither of you any favors.

Put another way, the reciprocal of “Treat others how you want to be treated” isn’t “they’ll treat you how you want to be treated”: it’s “they’ll treat you how they want to be treated.” This is another reason that how they want to be treated is a much better guide to how to treat them; after all, the reciprocal of “Treat others how they want to be treated” is “they’ll treat you how you want to be treated.” –Future Aaron]

Of course, the Platinum Rule has its own underlying assumption: that you know how others want to be treated. This means that if you don’t know, it’s your duty to find out; you can’t just assume they want what you do. And the rephrasing shifts the emphasis entirely. It emphasizes what others want, and it removes the implication that doing so will result in equitable treatment. One of the harshest lessons I’ve learned as an adult has been how frequently the world doesn’t treat people equitably.

[Nonetheless, by actively going through the effort of discerning their preferences, you also convey to them that it would only be polite for them to return the favor – otherwise, they might never even have cause to consider that your preferences might differ. –Future Aaron]

Naturally, this has become a central concern of my life going forward. Numerous societal problems stem from power disparities. I’ve examined many already. Many aren’t even cases of power as the concept is commonly understood; one certainly wouldn’t classify social expectations as an example of political power, since no particular body enforces them, yet they still include some populations and exclude others all the same.

But we still have things in common. Autistic people are often assumed to be aloof, and that’s partially true, but I think we all still want friends, just like everyone else. I doubt there’s a person alive who doesn’t want or need friends, probably not even the most sadistic serial killer. The need for belonging may indeed connect every last human being in history. We are, ultimately, social animals.

Back to top · Table of contents · My portfolio · Contact me · Website index

Structural problems in society

One aspect of Moonlight that few reviews have noted is that it addresses white supremacy without including a single white character. The absence of white characters was clearly a conscious artistic decision, and I suspect it was intended to illustrate how structural problems can be perpetuated in society without any conscious intentions from the people perpetuating them. It’s highly unlikely that any of Moonlight’s characters have any conscious prejudices whatsoever against their own race; instead, the entire system is structurally prejudiced against them, which means that their actions within the system can perpetuate structural inequalities without their even being conscious that they’re doing so, much less intending to do so.

This is a widespread problem in society, and it’s not limited to minorities. Take, for instance, white parents who move to new school districts to secure better educations for their children. The vast majority of these parents and children have not a single malicious intent, and inconveniencing oneself for the betterment of one’s offspring is an entirely laudable impulse. However, since white parents are likelier to be able to afford to live in better school districts, this contributes to structural racism without a single racist intent by perpetuating a phenomenon known as “white flight”: minority children get left behind in substandard school districts.

I want to be as clear about this as I can be: I’m not calling the white parents racist at all. I am, in fact, saying the exact opposite: most of them aren’t racist. Structural racism doesn’t require anyone to be racist. It exists entirely independently of intent. It’s called structural racism for exactly that reason.

And that’s one reason it’s so insidious: it persists entirely outside of anyone’s specific behavioral intent, and many well-meaning people aren’t even conscious that it exists, because they haven’t been trained to see the societal patterns that perpetuate it. The academic definition of racism doesn’t encompass merely well-intended ‘articulate’ sympathies and Klan-style racism: it also denotes structural processes in society entirely independent of individuals’ specific intentions. Regarding individuals’ actions, it’s an ethically neutral judgment: we can’t reasonably condemn someone for a process of whose existence they’re entirely unaware. As with the communication example above, the root problem is of perception and agnoia; it’s not an ethical failing.

Unfortunately, every option open to these parents is problematic, even if they’re aware of the outcome: if they do move out of their district, they’re contributing to its worsening, but if they don’t, they’re risking their children receiving substandard educations. My parents moved to Sarasota specifically so I could attend Pine View, and if they hadn’t, I doubt I’d ever have ceased struggling in school. I certainly don’t think they were racist or wrong to have done so. Since the problem is structural, there are no good choices; the problem is society itself, and one person’s actions won’t change it, any more than one person buying an electric vehicle will stop climate change. Local school districts are largely financed by property taxes, and thus the revenues to fix substandard schools simply don’t exist in their districts. One family can’t hope to solve that problem by remaining in a district.

Dozens of such processes occur without malicious intent on anyone’s part. Listing and explaining them all would probably double this book’s size, but they perpetuate structural problems like white supremacy, patriarchy, and environmental degradation. That’s one reason I’ve called society Moonlight’s true villain: it’s an abstract concept that exists independently of any one well-intended person, and it’s structured in such a way that well-intended actions by well-intended people can still produce destructive results. Indeed, it often places well-intended people in positions where, no matter how they choose to act, their actions will harm others. The solution is to reform society, which is an impossible feat for one person, and at best a difficult one for millions.

The solution to white flight, then, is addressing the societal processes that cause minorities’ schools to be substandard in the first place. Much of this is due to conscious educational choices by Republicans and Democrats alike (and while Arne Duncan seems less overtly malicious than Betsy DeVos, his tenure at the Department of Education was still overall destructive). Constructive, evidence-based approaches to school reform exist; I particularly recommend the work of Diane Ravitch, who, were I somehow appointed president tomorrow (a position I neither want nor am even yet eligible for), would be my personal choice to head the department.

But structural problems also make it difficult to get an education secretary who recognizes these troubles into a position to fix them, especially since many other educational problems stem from well-intended but misguided laws that simply don’t work (No Child Left Behind, for example). Since our system has an unusually large number of legislative veto points, and any legislative compromise that addressed these problems would need to satisfy a large number of factions with competing interests, fixing this legislation may be difficult.

And since politicians elected to office generally only get so much political capital to spend, even those who may wish to fix some of these problems may not be in positions to do so. For example, Barack Obama has often been blamed for not going more heavily after Wall Street. Instead, he chose to spend political capital on providing healthcare coverage to tens of millions of people who’d never received it before. This proved divisive enough to cause a backlash to Democrats in subsequent elections, ending their filibuster-proof legislative majority and leaving Obama with vastly reduced political capital. Had he chosen to reform finance first, he might not have possessed the political capital to pass the Affordable Care Act, and tens of millions of people would never have received healthcare. Did that help greater numbers of people in more urgent ways than financial reform would have? Who can possibly possess the resources to answer that? These are the political calculations elected officials in our system need to make, which is one of many reasons I have little desire to seek office.

[I do know that, had John Roberts not gutted the ACA’s Medicaid expansion on the preposterous grounds that no state government would be foolish enough to refuse it (Chief Justice, you obnoxious Federalist Society hack, have you met Florida’s government?), I’d have saved tens of thousands of dollars in healthcare insurance over my lifetime. Even so, I have the ACA to thank for even having insurance over that time: without it, I’d have been dropped for “preëxisting conditions” decades ago. –Future Aaron]

One final conclusion is that even if every individual stopped holding racist beliefs, that alone might not be enough to eliminate structural racism within society. Many processes that perpetuate these structures occur entirely independently of anyone’s conscious intent. Not being racist is good. Actively fighting racism, however, is much better. If people aren’t conscious of the structures that perpetuate white supremacy, those structures may outlast actual racist beliefs, since no one will notice they’re still there. This applies equally to other forms of marginalization, such as patriarchy and the marginalization of those with mental or physical difficulties. To end marginalization completely, people must be aware of the structures in society that perpetuate it.

In short, our political system’s structure directly causes many forms of marginalization, often renders even well-meaning people powerless to fix problems, and sometimes even places them into situations where every choice available to them will harm someone.

Back to top · Table of contents · My portfolio · Contact me · Website index

“We Do What We’re Told (Milgram’s 37)”

Another factor at play, if properly recognized, can serve as a Rosetta Stone unlocking the root cause of many forms of marginalization: We are, by nature, unbelievably credulous regarding those we see as authority figures. Shaking a person’s trust that an authority is qualified and has benign intentions is a herculean task.

After the trial of Adolf Eichmann, the Jewish psychologist Stanley Milgram devised one of his field’s most famous experiments. He wanted to ascertain whether “just following orders” was enough to motivate people to commit atrocities. He hired actors who’d pretend to receive a series of shocks, then told volunteers to administer ‘shocks’ (which, to be clear, weren’t real) of increasing magnitude to the actors whenever they failed to recall a specific word pair. The actors eventually banged on the wall and complained about heart conditions, and later ceased responding entirely.

The vast majority of Milgram’s volunteers kept administering shocks despite the actors’ pleas. Some stopped and questioned the experiment’s purpose, but after being assured they wouldn’t be held responsible for its results, most continued. Volunteers who asked to stop were given four specific verbal prods, concluding with “You have no other choice; you must go on.” If they refused all four, the experiment stopped. Twenty-six of forty volunteers (65%) in Milgram’s first trial administered the final 450-volt shock, though many were uncomfortable doing so and every last one questioned the experiment’s purpose.

There are several caveats. A contributing factor to the Holocaust was the dehumanization of the Jewish people and others; by contrast, Milgram’s volunteers didn’t know the actors, and many displayed great anguish at their actions. They’d also been assured their actions wouldn’t cause the actors lasting harm, while the Holocaust’s perpetrators certainly knew they were killing their victims. The Holocaust also went on for years, while Milgram’s experiment took an hour. It’s also not clear that all of his volunteers believed they were truly administering shocks to the actors; subsequent researchers have suggested many may not have, though even these researchers have suggested that around a third of those who believed it was real obeyed all the orders.

[Milgram, in fact, had his own caveats: he ran several variants of the experiment to identify influences on the compliance rates of each experiment’s forty volunteers. The most dramatic differences occurred in the variants with additional ‘teachers’ (who were also paid actors). After two teachers refused to comply, only four volunteers (10%) completed Experiment 17. After one teacher complied fully, thirty-seven volunteers (92.5%) completed Experiment 18 (shouts-out to the three that didn’t). The implication that we are enormously influenced by the precedents of each other’s behavior is extremely difficult to overlook.

Peter Gabriel intended this section’s eponymous song title as a tribute to the 35% of volunteers who didn’t complete the primary experiment, but he was off by 2%. In this context, 37 of 40 volunteers completing Experiment 18 is an ironic coincidence: the title is meant as a tribute to those who resist unjust orders, but instead can be read as a condemnation of those who follow them. –Future Aaron]

Despite these caveats, similar results have been replicated in subsequent experiments in societies throughout the globe, though with somewhat differing percentages. (84% of Milgram’s volunteers later said they were glad to have participated; many wrote him letters expressing thanks.)

One interpretation of Milgram’s results is that people can’t be counted on to realize authority figures they assume are benign are in fact malicious, no matter how much evidence contradicts their initial assumption. Subsequent experiments support this interpretation; when volunteers are given ideological explanations for an experiment, their belief in the ideology’s benevolence overrides their empathic concern for others. Volunteers are less likely to continue when prods to continue resemble orders, and likelier to do so when said prods stress the experiment’s importance for science.

This doesn’t seem to apply merely to malice, either. Observations of religious groups that predicted apocalypses have revealed that, after the date of the predicted apocalypse passes, their faith generally isn’t weakened; if anything, it’s strengthened. Some might argue that anyone responsible for spreading religious beliefs has malicious intentions, but this is a level of cynicism I can’t possibly justify. I have no religious beliefs now, but once upon a time, I did. Any proselytizing I’d have done would have been out of a literal, genuine belief that someone’s soul was in eternal danger. It’s hard to believe that anyone can consider this a malicious motive. There are certainly religious leaders who give off every sign of having malicious intentions (hello, Scientology), but I’m not about to write off every religious leader or believer as having such intentions. Brain scans have revealed that people use the same parts of their brain to think about both religion and politics: thus, people are bad at reconciling reality with both religious and political beliefs.

This has widespread implications. For example, I named a number of factors that contributed to the 2016 election’s outcome, such as the FBI’s October Surprise; false media coverage; Russian-directed hacks and leaks; and overconfident projections of election results. Many people responsible for some of these factors (particularly the projections) may have been well-intentioned, but if people are bad at determining that authority figures are untrustworthy, then those figures’ oversized effect on the outcome becomes easier to explain. If media figures are telling untruths (whether intentionally or not), and people are simply bad at ascertaining the trustworthiness of those they regard as authorities, then they are poorly equipped to ascertain the truth of authorities’ statements.

And we must further combine this with blue lies, the lies people tell to benefit their in-group at an out-group’s expense. The people telling them may genuinely believe they’re doing the world a great service by lying because they believe their cause is righteous. And people who share their worldview and recognize these as lies may not even care that they’re lies, because they similarly believe they’re serving a higher cause. (Again, appeals to an ideology’s righteousness can incite people to perform actions they’d otherwise avoid.) Moreover, it appears that, in many cases, the only people who can shake the in-group’s beliefs in their leaders’ benevolence with any great success are others they recognize as fellow group members. In other words, people on the left can expose right-wing leaders’ dishonesty as much as they want without doing much to shake the faith of people on the right. The only people who may actually reliably shake these people’s faith are others they recognize as being on the right. As a result, much as I often disagree with them, I have to praise people like Glenn Beck, Tom Nichols, Jennifer Rubin, and David Frum who have criticized the dishonesty of the right-wing media. They may be accomplishing something that I, as a person unmistakably on the left, can’t hope to accomplish, and we need to see more of it. We need more engagement with reality, not less.

Back to top · Table of contents · My portfolio · Contact me · Website index

The nature of power

There are further weaknesses of hierarchies. Another particularly important one is that people are, to a large extent, products of their environment. This is another major theme of Moonlight; people are certainly shaped by their upbringing, but they’re also influenced, perhaps surprisingly, by their surroundings and positions within society. In short, placing a person into a position of authority over others can alter that person’s behavior. This is an old observation, often phrased in the form of Lord Acton’s dictum, “Power tends to corrupt, and absolute power corrupts absolutely.” But it comes in many other forms, and psychological experiments have increasingly revealed its truth.

Before I delve into this further, however, I must address the different forms of power. Intersectional feminists and anarchists use the term power-over to refer to authority over others, and I shall do so for the remainder of this book as well. The power over one’s own life to affect one’s surroundings is called power-to. Collectively wielded power amongst coöperative groups is called power-together.

These forms of power are, naturally, not equal. Most of the power disparities in society, many of which are analyzed in Moonlight, result from the simple fact that an increase in one person’s power-over often decreases another person’s power-to. By contrast, multiple people can accomplish things by coöperating that none of them could accomplish on their own, which is why we separate power-together from power-to. Power-over results from a top-down, hierarchical view of society; power-together results from a horizontal, coöperative view of society. Power-over results from a view that humanity needs to be forced into performing actions for society’s good; power-together results from a view that humanity accomplishes better things by coöperating.

I should note that nearly every film in history is in large part the result of power-together. Few individuals could direct a film, shoot it, write its script, perform its parts, write and perform its soundtrack, curate its sound effects, edit it, secure its budget, and perform all other necessary tasks for its existence entirely without assistance from others. To be fair, there are still hierarchies within a film’s production. Actors naturally follow the director’s guidance and the scriptwriter’s script. The musicians performing the soundtrack follow the composer’s score. And so on. But films could not be made if their creators did not coöperate.

This is where the distinction between necessary hierarchies and unnecessary ones comes in. Some hierarchies are absolutely required for the continued functioning of society. The most obvious one is parent and child. A parent can, of course, forfeit the right to raise children through cruelty or neglect. However, without some form of guardianship, minors would be wholly unable to fend for themselves, and due to their brains’ immaturity, we certainly can’t expect society to treat them like adults.

There are other necessary hierarchies as well, but they’re based on knowledge and expertise. Parents hold authority over children because they have knowledge and expertise about how the world works that children don’t. Teachers hold authority over students because they have knowledge and expertise about their subjects that students don’t. Film directors, we can assume, have knowledge about assembling films that may not be available to others within the production. And so on.

However, many aspects of our society are structured in hierarchical manners that have nothing to do with expertise or knowledge, starting with the economy. Do CEOs actually have hundreds of times more expertise or knowledge than their workers? They’re often given multimillion-dollar payouts even when their tenure was disastrous. Raising the company’s stock prices over the short term can make a CEO’s performance look particularly effective, and yet the strategies that increased the stock price may actually damage the company’s interests in the long term. Economists refer to such arrangements as perverse incentives, and they’re a consequence of the economy’s structure.

Many people defend the American economy as a meritocracy, but it’s actually full of perverse incentives. Our economy is structurally inefficient at distributing gains based on skill or merit and often places many people who’d otherwise be perfectly capable of learning skills into positions that will never allow them the opportunity to learn them (for example: Chiron’s difficulty obtaining legitimate employment due to his past incarceration). America’s low class mobility indicates how much one’s starting position in life can affect one’s ability to succeed. The characters of Moonlight are essentially trapped in poverty. They have little way to gain expertise or knowledge, and even if they had it, they’d have little way to benefit from it. They are, in short, in a situation in which others’ power-over has decreased their power-to. And powerlessness, the lack of power-to, also corrupts.

Back to top · Table of contents · My portfolio · Contact me · Website index

Authority’s effects on people

Philip Zimbardo’s Stanford prison experiment is another iconic, albeit unconventional, experiment with troubling implications about human nature and authority. Zimbardo himself participated in it and ultimately terminated it before its planned conclusion when he realized it had become grossly unethical. Zimbardo’s team selected twenty-four male participants, mostly white and middle class, who appeared the most psychologically stable and healthy; they excluded those with psychological impairments, criminal records, or medical issues from consideration. Half the participants were assigned the role of prisoners, the other half that of guards. Guards were instructed not to physically harm prisoners or withhold food or drink, but were also informed they could create a sense of boredom or fear, and that prisoners should feel they had no privacy or power.

The first day of the experiment was relatively uneventful, but conditions quickly deteriorated; guards forced prisoners to defecate and urinate in buckets, removed their mattresses, stripped them naked, forced them to repeat their prisoner numbers as a form of dehumanization, and attacked them with fire extinguishers. Experimenters suggested a third of the guards exhibited genuinely sadistic tendencies, and most were upset when the experiment concluded early. The prisoners internalized their roles and continued participating after forfeiting monetary compensation. Zimbardo himself says he also became absorbed in the experiment; fifty people observed it and only one, his then-girlfriend (later wife) Christina Maslach, objected to the prison’s conditions or to the experiment’s ethics generally. Her objections convinced him to terminate it after six days.

Again, the results have caveats; it’s been suggested that advertising it as a study on “prison life” rather than as a “psychological study” may have created a selection bias toward more aggressive and authoritarian participants, or that using a more socioeconomically diverse group of participants would likewise have produced different outcomes. Changing aspects of the experiment, in short, may change the results. However, these caveats nonetheless suggest troubling things about human nature: they imply that positions of authority may self-select for unusually cruel people; and that minute, poorly understood (yet easily manipulated in the wrong hands) differences may be deciding factors between horrendously abusive institutions and more benign ones.

Particularly troubling is the implication that people’s situations (situational attribution) influence their actions more than their personalities (dispositional attribution) – or, more succinctly, that Lord Acton’s dictum is correct.

Additionally, authority figures may create institutional cultures that encourage further abuse. One person in a position of authority on their own may not be easily corrupted. However, when placed alongside others in the same position, they may urge each other into things they otherwise wouldn’t have done, due either to peer pressure or to establishing precedents for each other where malicious actions come to be viewed as normal.

Police critics have charged that black officers can be as unpleasant as white officers to black civilians. It’s possible that authoritarian blacks are simply more drawn to the police, but that may not be the entire story: many people have claimed black officers they know changed after joining the police. In short, institutional police culture itself may have actually shaped officers’ attitudes without their conscious awareness. If institutional culture is likelier to treat blacks than whites as suspects, people within that institution may be conditioned to associate blacks more readily with crime than they do whites. A person with no conscious racist biases can, in short, be given racist impulses through a Pavlovian process without even being consciously aware of it.

This draws us, once again, back to Moonlight’s holistic critique of society. If people aren’t conscious of their surroundings’ effects on them, they may be completely impervious to how their actions affect others, since they haven’t been trained to perceive many of the forces that shaped their own lives. People who don’t understand how they are marginalized can behave in ways that perpetuate their own marginalization entirely without conscious intent, and indeed, this is a central theme of Moonlight.

People perceive, to a rather large extent, what they’ve been trained to perceive, and they’ve been taught to perceive their own desires and needs as the basis for how to treat others (the Golden Rule). Without being taught that their experiences and methods of perception aren’t universal, they have no reason to believe the narratives they’ve constructed about reality could be incomplete, to question their assumptions, or to believe that others might perceive the world differently.

As a result, they grow up believing in “common sense” (a collection of unconscious biases) that is a product of their surrounding culture. They believe that their experiences are universal. They live in bubbles of culture, and that culture influences their thought and perception in ways of which they aren’t even conscious. And a natural conclusion of this observation is that since people are malleable, so is culture itself. However, it also stands to reason that people must be made aware of how their surroundings affect them before they can fully understand how they can contribute to changing their culture.

It isn’t inevitable that everyone placed into a position of authority will be corrupted by it. However, even if we suppose that only half of such people will be corrupted, a half-corrupt system is entirely too vulnerable. And estimating the number at half may be quite naïve. If those who are likeliest to abuse positions of authority are also likeliest to seek them, we must conclude that positions of authority are intrinsically susceptible to abuse.

Back to top · Table of contents · My portfolio · Contact me · Website index

Problems with hierarchies

Inefficiencies of hierarchies

Another weakness of hierarchy is that it often simply produces inefficient outcomes. Management studies have suggested that projects are much likelier to succeed when managers cite expertise, the ability to learn, or a task’s sheer challenge as performance incentives than when they cite their own positions of authority, punishments, or rewards. This doesn’t merely mean managers’ expertise, to be clear: successful managers know they lack other team members’ complementary areas of expertise, thus enabling them to utilize and learn from their knowledge and, in turn, benefiting the entire project. They also listen empathically, which improves their ability to convey stakeholder needs and desires and project requirements and, in turn, work quality. By stressing projects as team efforts, they also increase team members’ investment in project outcomes; and by emphasizing synergy, where everyone benefits from team performance, they improve project members’ emotional investment, coöperation, and (again) work quality.

My professional experiences reflect this. As I’ve said, towards the end of my tenure with my first employer, they did an awful job communicating with me and made no discernible efforts to understand my needs. They appealed to company earnings as our paychecks’ source, but never particularly emphasized synergy or teamwork; they almost always entirely stressed management’s needs, not employees’. By contrast, my current employer continually highlights how individual employees’ effective performance benefits everyone: accurate performance now means less work fixing errors later and ensures higher satisfaction from advertisers and TV stations (project stakeholders). Even internal nomenclature reflects this attitude; employee groups are specifically called teams, and managers also repeatedly emphasize that the entire facility and, indeed, the entire company is a team. This might fall flat if employees didn’t actually feel like teams, but the difference is fairly clear.

There are further implications. Employees’ productivity plummets if they work more than forty hours a week for longer than a month. Many Americans work multiple jobs because each job’s paycheck is insufficient to pay their expenses. Thus, they routinely work over forty hours a week, so they’re getting paid for more than forty hours, but they’re still only providing around forty hours’ worth of output. In short, if employers paid employees enough to live on so they didn’t need multiple jobs, they’d actually be more productive, and the overall payroll would stay the same or even decrease.

At one point, employers realized this. Henry Ford, no particular altruist (and a horrid anti-Semite, no less), chose to pay his workers a then-unprecedented $5 per day, which far outstripped what his competitors were willing to offer. This decreased turnover and increased output, since experience increases employees’ skill. His employees’ reduced preoccupation with financial concerns and susceptibility to health issues further increased their productivity, and eventually, they were able to buy his company’s products, thus increasing demand.

But at some point, employers began to believe reducing payroll costs outweighed the benefits of properly compensating employees. To be clear, there’s a point beyond which increasing wages may not tangibly benefit companies’ bottom line, but most modern companies are nowhere near it. Employees who aren’t paid living wages are unlikely to work to their full potential, since they’re overstressed, fatigued, and disinvested in their work as a source of satisfaction. It’s a simple result of Maslow’s hierarchy of needs: without emotional security over their source of food and shelter, they simply can’t work to their full potential.

All of this reflects an overemphasis on power-over. Companies have been run increasingly along top-down lines in recent decades, but a horizontal, coöperative approach emphasizing power-together creates more productive and satisfying results for both employees and employers. Supply-side economics is amathia of human nature (and for that matter, economics). Demand is at least equally important, and probably far more: people can’t be sources of demand for what they can’t afford. When a benefit to one is a benefit to all, everyone benefits more.

Back to top · Table of contents · My portfolio · Contact me · Website index

Rethinking hierarchy and restructuring society

I’ve spent a lot of time discussing politics, but I haven’t actually laid out what kind of society I’d consider ideal. I don’t know that I have an ideal society, because humanity’s imperfection means all societies will ultimately be flawed, but my political ideals are ultimately most influenced by anarchism, quite likely the most misunderstood political philosophy in existence. I think this is largely a simple result of its name: people hear it and hear no state. It doesn’t mean that; it means no rulers. This is a crucial difference. Simply eliminating the state wouldn’t give us a society without rulers. The most likely result would be that corporations would quickly take over, and even if we somehow avoided that, all kinds of social pressures would still create de facto leaders.

[Given my recent study of Greek, I’d be remiss not to mention the etymology of anarchy. It descends from ᾰ̓νᾰρχῐ́ᾱ (ănărkhĭ́ā), a compound of ἀν- (an-), the alpha primitivum, used to denote a negation or absence; ᾰ̓ρχή (ărkhḗ), which in this sense means rulership; and -ῐ́ᾱ (-ĭ́ā), the abstract noun suffix. Thus, it means the lack of rulership.

(Ᾰ̓ρχή can also mean beginning or origin; our word arche is its direct descendant.) —Future Aaron]

The principal goal of anarchy is to create a society without unnecessary hierarchies. Parent-child will remain. Student-teacher will remain. Expertise-based hierarchies will remain. Others will vanish. Nor does anarchy mean no rules: those who harm others (a category likely including animal cruelty and harm to the environment) will be removed from positions that allow such harm until they’re no longer a threat. The central principle of anarchy, however, is coöperation.

Another term closely related to anarchy is libertarian socialism. This often reads to Americans as an oxymoron, but that reading comes from a misunderstanding of political history: anarchism is both the oldest form of socialism and the oldest form of libertarianism. The term libertarian was originally coined to refer to anarchist communism, in contrast with Pierre-Joseph Proudhon’s mutualism (which one could consider the social democracy to anarchist communism’s socialism); right-wing libertarians only adopted the term in the 1960s. Libertarian socialism isn’t always anarchist, but anarchy is one form of libertarian socialism: anarchy is largely based around the idea of collectively owned workplaces with coöperative decision-making, and these concepts are intrinsically libertarian socialist.

There’s a common misconception that anarchists want chaos, since anarchy is colloquially used as a synonym for chaos. But of course, this is the last thing any anarchist wants. The common Ⓐ (circle-A) symbol actually stands for “Anarchy Is Order”, a statement from the writings of Proudhon, the first person to use the term anarchist to describe himself. The world, anarchists suggest, is already chaotic. The inevitable response to the suggestion that the biggest gang would simply take over is that this has already happened; governments and corporations represent the biggest gangs.

But these are largely theoretical discussions, because nothing resembling anarchy currently exists on a large scale, nor is it likely to exist anytime soon.⁽⁸⁰⁾ There have been functional cases of anarchy in the past; a popular citation is Spanish Civil War-era Catalonia. George Orwell’s Homage to Catalonia describes Barcelona at this time, though he spends more of the book discussing the war itself. The Free Territory of Ukraine was another example, though a strange one. Nestor Makhno, the closest thing it had to a leader, strictly outlawed all political parties (which he felt were based on coercion) and considered himself solely a military leader; the society was based on voluntary coöperation, and Makhno organized defenses against outside threats, most notably the Soviet Union.

Despite the existence of the Free Territory and Anarchist Catalonia, there aren’t many completely reliable accounts of them, as both existed during politically tense situations with many inevitably politically motivated observers. There have been accusations against the anarchists of various forms of brutality, but many accusers may have had their own political or personal motivations.⁽⁸¹⁾

Outside threats are an inevitable problem of anarchy; both Catalonia and the Free Territory eventually succumbed to authoritarian leadership. Franco ultimately ruled Spain for several decades, but in-fighting between Stalinists and anarchists didn’t help the outcome. Ukraine, of course, was eventually absorbed into the Soviet Union. This is one reason I regard anarchy largely as an intellectual exercise, something to aspire to. Perhaps someday we can have it as a species, but I suspect we’d need it to be a worldwide endeavor. We’re surely not ready for that.⁽⁸²⁾

Another misconception about anarchists is that they’re wide-eyed idealists who believe humans are inherently good. On the contrary, humanity’s flaws are why anarchists find the state an insufficient solution. Anarchists’ primary goal right now isn’t establishing anarchy; it’s reducing society’s power disparities. I’ve already mentioned the three forms of power and Lord Acton’s dictum, which is essentially the guiding principle of anarchist thought: if power-over corrupts, then we should eliminate it from society as fully as possible, because its result is that others lack power-to. (Moonlight depicts cases of extreme powerlessness; its protagonists’ lack of autonomy is one of its central themes.)

The consequence of this is an ethics that rejects domination, either over others or by others, as inherently wrong. The goal is a society in which people are free to do anything that doesn’t harm others; in which people may voluntarily associate or not associate however they choose; in which people mutually assist each other when necessary and desired; in which decisions are made voluntarily, coöperatively, and with full participation from those affected; in which large-scale organization is done through confederations of smaller, directly democratic organizations; and in which people are represented by themselves rather than by elected officials. In short, a society with liberty, equality, and solidarity, in which everyone’s power acts as a check on everyone else’s.

This is, as I said, a wonderful ideal to aspire to. I don’t expect to see it in my lifetime. But we need to spend time thinking of how we can improve society, because if we don’t have a clear vision for what we want, then who, exactly, is going to listen to our complaints about existing problems in society?

Anarchists don’t stress the idea of eliminating the state because, as I have stated, simply eliminating the state wouldn’t result in anything resembling anarchy. The anarchist graphic novelist Alan Moore wrote that doing this would not result in “the Land of Do-as-You-Please” (a reference to Enid Blyton’s children’s novel The Magic Faraway Tree); it would result in “the Land of Take-What-You-Want,” or what Hobbes would have described as “the war of all against all.” This is effectively correct. Anarchists’ strategy for reducing power disparities is, instead, to find anarchist tendencies that already exist within society and encourage them to the fullest extent possible.

This doesn’t actually mean that everyone involved in such a society has to call themselves an anarchist. Indeed, most of them probably won’t. However, the anthropologist David Graeber has observed that many of today’s most prominent social movements, while not explicitly anarchist in intent, are nonetheless organized along largely anarchist lines, with mostly horizontal structures emphasizing coöperation among individuals. The essayist Rebecca Solnit has similarly observed tendencies toward coöperation and common purpose that often emerge in human nature after particularly great disasters; her book A Paradise Built in Hell explores the aftermaths of several disasters in depth as case studies. These tendencies’ mere existence suggests that under the right circumstances, they could extend much further into human society; Solnit writes, “What happens in disasters demonstrates everything an anarchist ever wanted to believe about the triumph of civil society and the failure of institutional authority.”

Encouraging the growth of horizontally organized social structures in society will strengthen their position and influence, and the influence of coöperation generally. The tactic behind syndicalism, a concept often associated with anarchism and labor unions, is to form coöperative workplaces and use them to supplant existing capitalism. This does not require a revolution and can, in theory, be accomplished peacefully. Many labor unions are not organized along syndicalist lines, however; the Industrial Workers of the World (nicknamed the Wobblies) are undoubtedly the most famous anarcho-syndicalist union.

This can be extended far beyond the workplace as well; Graeber and others have endorsed creating “gift economies” to supplant capitalism. Graeber has also suggested that gift economies may actually have been the norm in earlier human history, with land and resources communally owned, while barter tended only to occur between people who didn’t trust each other, possibly even enemies. This narrative suggests that people may gravitate towards commons-based economic structures when left to themselves, and money only becomes a concern with outside threats of violence.

There’s substantial contemporary evidence for Graeber’s and Solnit’s theses. A Paradise Built in Hell appeared in 2009; among other events, it discussed Hurricane Katrina (2005) and the 1989 Loma Prieta earthquake. More recent disasters have had similar results. In 2012, Occupy Wall Street activists set up Occupy Sandy in Hurricane Sandy’s aftermath to provide mutual aid, coöperation, and long-term rebuilding to devastated communities in the northeastern U.S. While many began as self-taught amateurs, their assistance has been singled out as timelier and more effective than that of larger, better-established charities and even government institutions like FEMA. They distributed food, supplies, and medical aid to affected communities and have since removed mold and conserved and repaired damaged structures. Their ability to muster support quickly through online channels has been identified as a particularly crucial component of their effectiveness, which suggests a further strength of horizontal organization: people can see an existing need, then quickly communicate it to others with better resources or training to alleviate it. They’re estimated to have provided at least $100,000 of assistance.

Occupy activists have also focused on buying up debt for cents on the dollar and forgiving it; as of 2015, they claimed to have forgiven $32 million of debt in total, including at least $19 million of student loan debt and at least $15 million of medical debt (John Oliver also did this for $15 million of medical debt). They have also published pamphlets on debt resistance and assisted the students who sued Corinthian Colleges, a for-profit chain that eventually shut down after the U.S. Department of Education sanctioned it for illegal, predatory activity.
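The leverage behind this tactic is simple arithmetic: distressed debt often trades on secondary markets for a small fraction of its face value, so a relatively small fund can abolish far more debt than it spends. The sketch below is purely illustrative – the discount rate and dollar figures are hypothetical, not Occupy’s actual purchase terms.

```python
# Illustrative sketch of debt-buying leverage. The 2-cents-on-the-dollar
# rate below is a hypothetical figure, not Occupy's actual purchase price;
# real secondary-market discounts vary widely by debt type and age.

def debt_abolished(funds: float, cents_on_dollar: float) -> float:
    """Face value of debt purchasable (and thus forgivable) with `funds`."""
    return funds / (cents_on_dollar / 100)

# At a hypothetical 2 cents on the dollar, $640,000 abolishes $32 million:
print(f"${debt_abolished(640_000, 2):,.0f}")  # → $32,000,000
```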

Occupy activists have additionally set up the Occupy Alternative Banking Group as an alternative to traditional banking; OABG has published a well-received book on the financial system outlining the causes of the 2007 crash and proposing a policy framework it calls “popular regulation.”

Horizontal organization can have limitations, however. People who have lived in communes have often observed that they function perfectly with small numbers of people, but as soon as one jerk moves in, they can cease functioning until structures are put in place to limit the amount of damage the jerk can cause. Organizations run along too rigidly horizontal lines can be similarly vulnerable to hijacking by one person’s pet issue; structure may need to be imposed to ensure that the organization maintains focus. This is also one reason anarchy generally focuses on federations of small coöperatives: too large a scale can make it difficult or even impossible to achieve consensus on important issues. It’s also quite likely that an anarchist economy would consist of worker-managed syndicates alongside independent artisans producing their own work; in short, it would likely end up resembling a mixture of the libertarian socialism described above and the mutualism imagined by theorists like Proudhon and, more recently, Kevin Carson.

A particularly detailed framework for economic organization, known as participatory economics and designed as an alternative to both the central planning of past communist states and the top-down organization of capitalism, has been developed by the political theorist Michael Albert and the economist Robin Hahnel; it has been accompanied by the political scientist Stephen Shalom’s work on a framework he calls participatory politics, designed to increase the average citizen’s direct political influence in society. These proposals are far too detailed to outline here, but their authors stress that they can’t be implemented on economic or political grounds alone; they’ll require parallel transformations in politics, culture, and kinship. They have also acknowledged polyculturalism and feminism as possible foundations for such a society. The International Organization for a Participatory Society has been created to promote these visions.

The defining principle of my political thought is another I’ve outlined above, and one that is central to most anarchist visions of society: the people who are most affected by a decision’s outcome should get the largest amount of influence over it. This is a tremendous shift from one person, one vote, the principle that has underpinned democracy throughout history. It’s a flawed principle: not every decision affects everyone equally, and many of history’s greatest injustices resulted from the tyranny of the majority (majorities violating minorities’ rights). Checks and balances have never been especially reliable; I’ve mentioned Japanese-American internment, Jim Crow, slavery, and the Trail of Tears. Checks and balances aren’t failsafe in other countries, either; as much as countries like Canada and Norway may look like paradises to American liberals or socialists, their governments haven’t been able to stop abuses of laborers or environments by their companies, and indeed have sometimes been directly responsible for some such abuses themselves. Institutions’ ultimate problem is that they’re at best as ethical as the people running them, and sometimes far less. (Corporations are intrinsically profit-seeking institutions; pursuing profit wherever possible is their explicit purpose. Their more charitable activities are only allowed so far as they help with PR.)

And this, I feel, is the ultimate problem: it’s widely acknowledged that power-over corrupts. So why do we keep giving it to people and expecting better results? Putting people into established institutions and suggesting that institutional norms or checks or balances will somehow prevent them from abusing their power simply hasn’t worked. Some do use it benignly, but too many don’t. We’ve been trying this for centuries, and we still have poverty, hunger, pollution, discrimination, police brutality, racism, sexism, queerphobia, ableism, all kinds of bigotries and injustices that decent people have wanted to eliminate throughout human existence. Society still operates on the Great Man theory of history: if we simply elect the right leader, they’ll fix everything, and we won’t have to do any of the work. Why do we still cling to this fantasy? Isn’t it possible that power itself is the problem?

To be clear, I’m not saying we shouldn’t vote. I voted for Clinton, and I feel that anyone who could have done so but didn’t shares culpability for the president*’s ‘election’. (Voter suppression victims are excluded, naturally.) Voting won’t fix our problems, but it can stop them from worsening. Voting is, in other words, necessary but insufficient. We need to engage with the system we have, not the system we wish we had. And we get certain points of input into the system, so we need to use them.

But it’s because voting is insufficient that I find the recent activism so inspirational, and it’s another reason I find films like Moonlight so inspirational. There’s no simplistic plan that will fix our problems. “Overthrow the state” won’t work any more than ‘Vote’ will. (Indeed, overthrowing the state is very likely the last step in any thoughtful anarchist’s political manifesto.)

The actual solution must occur within us. We must change our culture. That won’t happen because one person protests; it’ll happen because millions do. We’ve tried to fight systemic injustice by casting a ballot every two to four years, or maybe showing up on the streets occasionally when something especially horrible occurs. But there are signs of change: the ongoing protests from groups like Indivisible and Black Lives Matter are widespread and have been occurring continuously. This is an absolutely beautiful development that I never expected to see in my lifetime, and it suggests people may finally be realizing the Great Man theory of history is wrong. Great change can occur because millions of ‘ordinary’ people agitated for it just as much as because one powerful person dictated it. School history rarely tells this lesson, but it’s one of Zinn’s most powerful, and it’s hardly unique to him. Many anarchist and socialist creators and intellectuals, from Eric Flint to Naomi Klein to Noam Chomsky to Emma Goldman to Grant Morrison, have taught the same lesson. It can be rephrased as this: The masses can reshape society even if they hold no official power. And a related lesson is this: Reducing power disparities can result in a better world. Power-together is, ultimately, a far more benevolent force than power-over, and increasing everyone’s power-to can increase everyone’s power-together as well.

Another of Chomsky’s most important lessons is our need to work within the world we have, not the world we’d like. He’s often presented as a wild-eyed radical, but his work shows his recognition of existing institutional constraints. He endorses clear-eyed recognition of our system: vote for Democrats, but don’t expect them to fix everything, since they won’t and indeed can’t. He recognizes forms of activism like voting, writing or calling one’s representatives, protest, collective action, and labor agitation as all necessary but insufficient by themselves. The change we want will come when people band together and fight for each other; one person can rarely cause a significant change, but a million people can. This is quite likely one of the most urgent lessons Americans could learn in our present times.

Another urgent lesson, of course, is to reëxamine our values and assumptions about the world and to consider how they may affect others. We all like thinking of ourselves as fundamentally decent people – everyone’s the hero of their own story, after all – but simply lacking malicious intent isn’t a guarantee that our acts don’t inadvertently harm others. Many of the most insidious kinds of marginalization are perpetuated by unawareness that they’re forms of marginalization at all. Given today’s increased danger to marginalized communities, this topic requires special consideration. I’ve mentioned many marginalized communities in this book, but I’m sure I’m still unaware of plenty of others. I intend to keep reëvaluating my thought processes throughout my life to ensure I contribute as little as possible to others’ marginalization, and I encourage others to do likewise.

One of Moonlight’s most beautiful aspects, I feel, is its emphasis on just how much one’s surroundings can shape one’s life. In the “nature versus nurture” debate, Moonlight comes down emphatically on ‘nurture’. Its characters are unquestionably shaped by their environments. This observation has a corollary: create better environments, and you may end up with kinder people. People’s actions are often learned behaviors. Much of our society’s behavior is self-centered because that’s what we’ve learned, but one of the great anarchists, Pyotr Kropotkin, observed that altruism is as much a part of our nature as selfishness is. He referred to it as mutual aid, and biologists now regard it as a major part of evolution. In short, much of our behavior is malleable; it isn’t an inevitable result of our biology. If we stop emphasizing self-interest as our primary motivation and start emphasizing aid to others, we can change society wholesale. It hasn’t been attempted on such a grand scale before, but I still think it’s possible. Given our resurgent political activism and the success of artistic works like Moonlight, it may actually be more possible now than ever before in human history.

On A Vindication of Natural Society (2025)

[I added this subsection in 2025. Present tense within it refers to 2025.]

This seems like the best time to revisit Edmund Burke’s A Vindication of Natural Society. Remember, I mentioned above that it’s generally accepted that he meant it satirically. Bear that in mind when reading the following:

The most obvious division of society is into rich and poor; and it is no less obvious, that the number of the former bear a great disproportion to those of the latter. The whole business of the poor is to administer to the idleness, folly, and luxury of the rich; and that of the rich, in return, is to find the best methods of confirming the slavery and increasing the burdens of the poor. In a state of nature, it is an invariable law, that a man’s acquisitions are in proportion to his labours. In a state of artificial society, it is a law as constant and as invariable, that those who labour most enjoy the fewest things; and that those who labour not at all have the greatest number of enjoyments. A constitution of things this, strange and ridiculous beyond expression! We scarce believe a thing when we are told it, which we actually see before our eyes every day without being in the least surprised.

[…]

The rich in all societies may be thrown into two classes. The first is of those who are powerful as well as rich, and conduct the operations of the vast political machine. The other is of those who employ their riches wholly in the acquisition of pleasure. As to the first sort, their continual care and anxiety, their toilsome days and sleepless nights, are next to proverbial. These circumstances are sufficient almost to level their condition to that of the unhappy majority; but there are other circumstances which place them in a far lower condition. Not only their understandings labour continually, which is the severest labour, but their hearts are torn by the worst, most troublesome, and insatiable of all passions, by avarice, by ambition, by fear and jealousy. No part of the mind has rest. Power gradually extirpates from the mind every humane and gentle virtue. Pity, benevolence, friendship, are things almost unknown in high stations.

Edmund Burke, A Vindication of Natural Society (1756)

Burke’s satirical arguments were so convincing that when it was published, many reviewers missed the satire. In fact, since it was published anonymously, some reviewers sincerely thought it was an original work by Lord Bolingbroke, the exact target of Burke’s parody. (It should be noted that the reviews were uniformly positive, particularly for the quality of his writing.) The second edition, published a year later, identified Burke as the author and included a preface clarifying that it was meant as satire. Most modern scholars accept his explanation, but a few, such as historian Peter Marshall, have still argued Vindication is too persuasive to be a joke. My own view is closer to that of William Godwin (funnily enough, the subject of three books by Marshall, but, to my knowledge, no relation to Mike Godwin of the eponymous law), who calls Vindication:

a treatise in which the evils of the existing political institutions are displayed with incomparable force of reasoning and lustre of eloquence, while the intention of the author was to show that these evils were to be considered trivial.
William Godwin, Enquiry Concerning Political Justice: And Its Influence on Morals and Happiness

In short, Godwin recognized Burke’s satire, but also supported the arguments Burke meant satirically. Proudhon was the first person to call himself an anarchist, but Godwin is generally considered the first modern proponent of anarchism. No less a source than an Encyclopædia Britannica article by Pyotr Kropotkin called Godwin “the first to formulate the political and economical conceptions of anarchism, even though he did not give that name to the ideas developed in his remarkable work.” Godwin also influenced Wordsworth, Coleridge, and Shelley.

Vindication is a textbook demonstration of Poe’s Law, which, in Nathan Poe’s original formulation, read, “Without a winking smiley or other blatant display of humor, it is utterly impossible to parody a Creationist in such a way that someone won’t mistake for the genuine article.” This has since been extended far beyond Creationism; a case could be made that it applies to all forms of parody and satire. (A corollary is, “It is impossible for an act of fundamentalism to be made that someone won’t mistake for a parody.”) But oddly, Vindication isn’t even the ur-example of Poe’s Law; Burke wrote it with heavy influence from Jonathan Swift, who is.

In any event, a strong case can thus be made that, satire or not, A Vindication of Natural Society is the first modern explication of anarchist principles. Paradoxically, this suggests that anarchism and conservatism need not be entirely diametrically opposed. After all, even if Burke did not sincerely intend the arguments the work presents, he had to understand its perspective well enough to write it in the first place.

Oftentimes, people assume that all anarchists are revolutionaries, but this is as false as the assumption that anarchists support chaos. Godwin, for example, held violent revolution to be superfluous, as education was sufficient to produce change: “The proper method for hastening the decay of error is not by brute force, or by regulation which is one of the classes of force, to endeavour to reduce men to intellectual uniformity; but on the contrary by teaching every man to think for himself.” And “the task which, for the present, should occupy the first rank in the thoughts of the friend of man is enquiry, communication, discussion.” In short, it’s entirely possible to be both reformist and anarchist: indeed, reformist anarchism is the original form.

Noam Chomsky is primarily thought of as an anarchist – which he is – but he also identifies with some aspects of classical liberalism and what he calls “old-fashioned conservatism”. He has frequently cited Adam Smith approvingly, repeatedly referring to the quote “All for ourselves, and nothing for other people, seems, in every age of the world, to have been the vile maxim of the masters of mankind” (from The Wealth of Nations) as “Smith’s maxim”. In other respects, he advocates pragmatism:

The main currents of anarchist thought were derived from classical liberal ideas that emerged in the Enlightenment and the Romantic era. The central idea, Chomsky said, was that “institutions that constrain human development are illegitimate unless they can justify themselves.” Anarchists seek to challenge those institutions and dismantle the ones that cannot be justified, while creating new institutions from the ground up based on cooperation and benefits for the community. This tradition of libertarian socialism or anarcho-syndicalism was still alive, Chomsky claimed, despite challenges and suppression.

[…]

Chomsky also addressed some of the issues confronting anarchist activism, noting that while anarchists stand against the state, they often advocate for state coercion in order to protect people from “the savage beasts” of the capitalists, as he put it. Yet he saw this as not a contradiction, but a streak of pragmatism. “People live and suffer in this world, not one we imagine,” Chomsky explained. “It’s worth remembering that anarchists condemn really existing states instead of idealistic visions of governments ‘of, by and for the people.’”

Thus, he had harsh words for those who refused to vote for Biden in 2020:

That brings up some memories. In the early ’30s, in Germany, the Communist Party, following the Stalinist line at the time, took the position that everybody but us is a social fascist, so there’s no difference between the Social Democrats and the Nazis. So therefore, we’re not going to join with the Social Democrats to stop the Nazi plague. We know where that led. And there are many other cases like that. And I think we’re seeing a rerun of that.

So let’s take the position “Never Biden. I’m not going to vote for Biden.” There is a thing called arithmetic. You can debate a lot of things, but not arithmetic. Failure to vote for Biden in this election in a swing state amounts to voting for [the felon]. It takes one vote away from the opposition, the same as voting for [the felon]. So, if you decide you want to vote for the destruction of organized human life on earth, for a sharp increase in the threat of nuclear war, for stuffing the judiciary with young lawyers who will make it impossible to do anything for a generation, then do it openly; say, “Yeah, that’s what I want.” So that’s the meaning of “Never Biden”.

I’d dispute this only insofar as I’d say it’s half a vote for the opposition, not a full vote, for the same reason that:

In the game of thrones, you win or you die; in first-past-the-post, you either lead or you don’t. There is no middle ground. First-past-the-post’s many mathematical similarities to team sports may intrinsically reflect its innate foolishness. Regardless, Chomsky’s point that the mathematics supported the argument that not voting for Biden would aid the felon was spot-on, even if he oversimplified the reasoning behind it.
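The ‘half a vote’ arithmetic can be made concrete. In a two-candidate first-past-the-post race, relative to voting for your preferred candidate, abstaining shifts the margin by one vote, while voting for the opponent shifts it by two. The tallies below are hypothetical, chosen only to make the margins visible:

```python
# Margin arithmetic in a two-candidate first-past-the-post race.
# Relative to voting for A, abstaining costs A's margin one vote;
# voting for B costs it two – hence abstention is "half a vote" for B.

def margin(votes_a: int, votes_b: int) -> int:
    return votes_a - votes_b

base_a, base_b = 1_000, 999  # hypothetical tallies before your ballot

vote_a  = margin(base_a + 1, base_b)  # you vote for A: margin = 2
abstain = margin(base_a, base_b)      # you stay home:  margin = 1
vote_b  = margin(base_a, base_b + 1)  # you vote for B: margin = 0

print(vote_a - abstain, vote_a - vote_b)  # → 1 2
```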

Ultimately, I am skeptical of change, and yet I am equally skeptical of hierarchy. These principles are not as irreconcilable as they might first sound.

[End 2025 addenda. Present tense refers to 2017 until further notice.]


A note on politics and human fallibility

I have, of course, spoken at length and with conviction about many particular aspects of politics. But I, too, am human. And humans don’t think rationally. We make decisions irrationally and justify them with after-the-fact reasoning to feign rationality. This is especially true of religion and politics.

I’ve studied politics in depth, and I’ve cited facts and statistics to back up my assertions. But I can’t claim that I’ve arrived at my political opinions any differently than anyone else has. My convictions developed the same way everyone else’s did: as a result of my life experiences. I was an outsider throughout childhood and high school, so I leaned somewhat to the left and disliked Bush 43. The more direct my experience of marginalization became, the more thoroughly I identified with anyone else who was marginalized. I suspect it is, therefore, not coincidental that I began to side with socialism and anarchism after receiving my autism diagnosis. My ability to read and absorb others’ experiences has given me understanding of the many kinds of marginalization that exist, but my understanding of what marginalization itself feels like comes directly from my own experiences, and so my understanding of politics is, like everyone else’s, ultimately based in emotional reactions.

[Socrates may have put it best: “What I do not know, I do not think I know” (Attic Greek: «ᾰ̔́ μὴ οἶδᾰ οὐδέ οἴομαι εἰδέναι», «hắ mḗ oîdă oudé oíomai eidénai»; this has often been paraphrased as “I know that I know nothing”). Both Xenophon and Plato report the Pythia (Oracle at Delphi) saying Socrates was the wisest person in Athens. Socrates believed her, but was also convinced of his own agnoia. He thus concluded that nobody knew anything, and that what made him the wisest person in Athens was being its only resident to recognize their own agnoia. In short, wisdom requires acknowledging one’s own deficiencies of knowledge. The more I learn, the more I learn about how much I still have to learn. –Future Aaron]

I deeply encourage others to try to reach an understanding of the world through the knowledge that others’ perceptions and experiences are different – an understanding through empathy, in other words. But no one’s understanding can hope to be perfect. We can’t forget that. Our narratives are, at best, models. We are omitting facts. We are imperfect. And thus any society we create and any judgment we make will also be imperfect. Don’t forget that. Everyone is fallible. Everyone’s experience is subjective. Everyone has room to learn new things. I’m not pretending to be any different.

Back to top · Table of contents · My portfolio · Contact me · Website index

Where we go from here, and why Moonlight matters

I have to hope, and even believe, that if some of the recent progress we’ve made is rolled back, the rollback will be temporary. Even a temporary rollback, however, will threaten untold numbers of people’s livelihoods and even lives. Still, there’s reason to hope that the government may be unsuccessful in rolling this progress back. One particular reason for optimism is that the popular resistance to the current government, highlighted by the Women’s March and too many other effective protests and forms of resistance to list here, indicates an unwillingness to let reactionary attitudes govern American civic life. The public’s readiness to volunteer and donate has also been commendable: the ACLU received $24 million in donations within days of the president*’s first attempted Muslim ban; for context, it usually receives $4 million a year.⁽⁸³⁾ The public resistance has already caused the abandonment of several major policy proposals, most recently the AHCA, and if the public continues to maintain pressure on elected officials, it will undoubtedly be able to stop more of them.

But the popular resistance is only one form of possible resistance. The artistic community can also offer its own resistance, and while my lived experience had led me to expect plenty of resistance from artists, what they’ve come up with has already far exceeded anything I could’ve dreamed. And that takes us back to the original subject of this book. I cannot think of anything that better embodies the current spirit of resistance amongst the artistic community than the fact that the Academy just named Moonlight the best picture of the year. We live in a time of pervasive mendacity. That a film that so profoundly and eloquently reflects reality can win such widespread, overwhelming acclaim and success is an astonishing cause for hope, as is the mere fact of its existence.

I write my accounts of my own personal experiences not as a pitch for a film or any kind of related tale – I have no interest whatsoever in extracting any kind of fame or personal gain from my past suffering. In fact, being recognized on the street by strangers is close to my idea of personal hell; I much prefer anonymity. But I do feel the experiences I’ve recounted here could help other autistic people, or their loved ones who may lack hope. The diagnosis feels like the end of the world, both for autistic people and for those who love them. It isn’t. It heralds a world of difficulty that will require tremendous growth, but it isn’t insurmountable. I doubt I would even have been able to write this book if it were.

And for storytellers in general, my personal accounts could provide examples of a kind of experience, rarely addressed in popular fiction, that still contains the conflict and turmoil that are all compelling stories’ narrative backbone. I’m convinced this is one reason Moonlight has resonated as strongly as it has: it addresses fundamental truths that are common across all shades of human experience, in a way few previously told stories managed – because most of those stories left out those details.

As an upper-middle-class kid who went to a gifted school in a largely wealthy, largely white county, I seemingly have almost no background in common with Moonlight’s impoverished black characters. And yet Moonlight expresses truths about my life that no other film I’ve seen has come close to expressing – not just because Chiron is an outsider, but because of his and Kevin’s journey towards self-acceptance, which is depicted more unflinchingly and honestly than in any previous film I’ve seen. It is emotionally raw, even uncomfortable at times. And because of that, it’s also much truer to life. My journey towards self-acceptance, as you’ve seen, was excruciating and took almost half of my life. Moonlight may have come closer to reflecting my journey than any previous work of fiction I’ve absorbed, even though its self-acceptance concerns a completely dissimilar kind of marginalization. The fact that it manages to end on an optimistic note is the capstone to its achievement: it depicts reality honestly, without whitewashing society’s flaws, and still gives audiences a cause for optimism. Things can change, and Moonlight itself has served as proof. It provides a message of hope and, incredibly, it is not a false one: society’s reaction to the film has proved its exact thesis.

And that, to my mind, may be Moonlight’s greatest accomplishment: it has captured reality in an unprecedented way and, in doing so, has demonstrated that there’s a larger audience for the stories of the marginalized than most people, including many members of marginalized groups, ever thought possible. As I’ve mentioned several times, I work in TV ratings. For the end of February and most of March, I was presented, eight hours a night, five nights a week, with consistent, unassailable proof of Moonlight’s significance: each additional batch of Oscars viewing figures that came in – which, I might add, meant most of the returns from the fourth week of February – provided further evidence of the cultural impact of the film and its awards victory. At a time of such widespread despair, when audiences most need proof that progress is still possible, Moonlight has served as indisputable proof that it is. I hope its success leads to more people exploring these stories in thoughtful and inclusive ways, and telling them well and in mediums that will, in turn, reach larger and larger audiences. Hopefully, the telling of these stories will lead to better understanding among the general populace and, finally, greater compassion and empathy. And if we’re lucky, maybe even more love.

There is, of course, a dire need to continue telling marginalized groups’ stories, particularly in the current political climate. One point of encouragement for me is that Moonlight’s success is not only likely to give its creators more opportunities to tell such stories (which they’ve explicitly said they plan to keep doing), but also very likely to make other creators in Hollywood and elsewhere see that telling them offers room to break new artistic ground while still reaching large audiences. That said, it remains to be seen how many other creators can tell these stories as sympathetically, honestly, and artistically as Moonlight does – at minimum, telling these stories well requires listening to marginalized groups, though ideally, members of those groups would be directly involved in the telling, as was the case here. I ultimately can’t think of a more exciting time to care about the arts; this is truly a time when anything seems possible, and I look forward to seeing what not just Moonlight’s creators but creators across the globe do next.

Obviously, Moonlight’s Academy Awards don’t mean the end of racism, discrimination, prejudice, bullying, or any of the other horrendous cultural problems we still face. But they’re nonetheless an enormous cultural watershed. The comparisons that come most readily to mind are Jackie Robinson playing for the Dodgers and Muhammad Ali’s boxing titles. This must be something like what it felt like to witness those earlier cultural breakthroughs. I don’t think I ever comprehended that feeling before. I still don’t know if I fully comprehend it now, but I’m much closer than I ever was. Perhaps future generations will look back on this moment and declare that this film did, indeed, change the world.

Of course, if it is to change the world, that’ll require the actions of millions of audience members to create the world it envisions for us. However, I now have more faith than ever that this is possible. I’d read about how mass mobilization had enacted sweeping changes in society before – this is, indeed, quite likely the primary lesson of Howard Zinn’s work – but reading about it wasn’t the same as seeing it unfold in real time. Witnessing and, in some cases, participating in events like the widespread protests against the current administration*, the 89th Academy Awards, and the unraveling of the administration*’s detestable healthcare agenda, all in a short time, has been an incredibly powerful experience. My hope is that we can keep this up. Changes can come from the bottom up as well as the top down, and many of the ones that come from the bottom up are better. The artistic community can catalyze this kind of progress, and with works like Moonlight, I think it has.

I’ve identified numerous causes for pessimism: climate change, increasing unemployment, widespread mendacity, unspeakable injustices. For most of my life, I probably qualified as a pessimist about both human behavior and humanity’s future. As of the last few months, I’m at least no longer pessimistic about our future. We face terrible problems, and solving them will be among the most overwhelming challenges we’ve ever faced as a species. But for the first time I can remember, I believe we’re capable of overcoming them. This is for several reasons – the widespread, successful protests have certainly contributed – but perhaps above all, it’s because I’ve been presented with the most powerful example I’ve ever witnessed of the transformative power of art.

My primary purpose in writing this lengthy recollection/appreciation has mostly been to express the level of congratulations I should’ve given Adele and her entire cast and crew the first time around, to explain why I feel this level of congratulations is necessary and how important and revolutionary their picture is, and finally to thank all of them for making it in the first place. I think it’s easy, in the aftermath of an awards ceremony like this, to focus all the attention on the awards themselves, and not to reflect on what they actually mean. Telling stories like this makes an incalculable difference, and I don’t think enough people acknowledge that. The fact that people are telling them and winning widespread recognition for doing so sends a positive message to society that’s urgently needed right now: a group of passionate, dedicated people working together actually can make meaningful, positive changes in society. I desperately needed to be reminded of that. And in that light, I can say their film has irrevocably changed my life for the better: it has given me a source of hope that I genuinely do not think I would ever have possessed without it.

Aaron Freed
Sarasota, FL
March 11 to April 28, 2017 (last revised 2025-08-24)


Afterword (2025)

“Ignī ferrōque”

I am not, to my knowledge, bipolar, but when I reread a lot of this now, I find myself half-wondering if I wrote it in a manic phase. I expressed levels of optimism that now feel entirely foreign. Things not only haven’t improved to anything approaching the degree I’d hoped; they’ve gotten so much worse on so many fundamental levels that engaging with politics frequently causes me literal, not figurative, panic attacks.

These sorts of things are cyclical, of course, but even after 2020’s election, I felt extremely pessimistic. 2024 was a slowly unfolding horror film. I do not generally enjoy horror. I especially do not enjoy living through it.

(I do have a soft spot for the work of H. P. Lovecraft, despite his noted flaws as a person. In his defense, toward the end of his tragically short life, he quite eloquently recanted the virulent racism he’d held for most of his life and expressed a regret that bordered on self-loathing for having been so bigoted.⁽⁸⁴⁾ One of the terminals I cowrote for Eternal, a game mod for Bungie’s Marathon trilogy, is an affectionate pastiche of Lovecraft’s style and subject matter, inverting his usual perspective of humans’ fear of the alien by expressing an alien’s fear of humans in distinctively Lovecraftian language.)

The Democratic Party still seemed defensible to me in 2017. I certainly cannot defend them in 2025. I think an under-discussed factor is that the massive number of fundraising texts and emails they’ve been sending out has likely turned voters off. I’ve gotten countless texts from candidates in districts I’ve never lived in, whom I’ve never donated to, whom I’ve never expressed interest in donating to, purely because I have donated to candidates in the past. I have to imagine that gets on people’s nerves after a while. And it didn’t even help. Democrats outspent Republicans by several hundred million dollars in 2024, for all the good that did.

But of course, first-past-the-post voting is as awful as ever. In the 2010s, I closed every message I wrote, regardless of topic, with some variant of “Cēterum cēnseō partēs Repūblicānās dēlendās ignī ferrōque esse” (Latin for “Moreover, I opine that the Republican Party is to be destroyed with fire and sword”, paraphrasing Cato the Censor, who ended all his speeches to the Roman Senate, regardless of topic, with some variant of “Cēterum cēnseō Carthāginem dēlendam esse”, or “Moreover, I opine that Carthage is to be destroyed”). Today, “Prīmō, partēs Repūblicānae dēlendae sunt; deinde, cōnsilium Dēmocraticum nātiōnāle” (“First, the Republican Party is to be destroyed; then, the Democratic National Committee”) is my stance on a good day. On bad days, it’s “Systēma polīticum integrum nostrum dēlendum est” (“Our entire political system is to be destroyed”).

I can’t much blame Harris for 2024’s results. One of my closest friends spoke to several people in the wake of the election who openly admitted that they wouldn’t vote for Harris purely on account of her gender. It may well infuriate a lot of people for me to say this, but this country has been fundamentally misogynistic and racist since the day it was founded. Despite that – because of that – I also don’t want her to run again, precisely because admissions like those give me no faith that this country will ever elect a woman as president.

Beyond that, Biden should never have run for a second presidential term, and his decision to do so anyway was the biggest own goal in American political history. (Hot take: Thomas Eagleton did nothing wrong, and the reporters and political opponents who went after him for having the audacity to seek treatment for depression should have been publicly shamed every time they went out to dinner for the remainder of their miserable, petty lives.) His refusal to step down as president forced Harris to run without an incumbency advantage. She never managed to adequately define her positions to voters, and I can’t blame her for that, because there was a spike in queries along the lines of “did Biden drop out” in the last days before the election. People clearly weren’t paying attention, and on the whole, the media did an awful job actually convincing them to do so.

Biden stepping down would also have normalized the idea that a woman can be president in the first place. This might seem like a trivial difference to those who want to envision human behavior as rational, but the glass ceiling still exists. If Harris had already been an incumbent, the idea would have been normalized to people for whom the idea remains unthinkable: at least a few might have observed by the time of the election that the country hadn’t burned down.

Beyond being a bad political move, Biden not stepping down was also grossly irresponsible and bordered on being an abuse of power. If he was unable to run for office, he was unable to perform the duties of the presidency. That complaint certainly looks trivial in comparison to the abuses of power going on today, but norms are eroded one at a time.


“The wrong lizard might get in”

Mostly, though, I’m utterly sick of a particular rent-free occupant of the entire country’s collective unconscious who long ago ceased to be a source of humor. I’ve never personally disliked a politician in my lifetime more, and I couldn’t stand George W. Bush or Dick Cheney either. We never got a break, even when he was out of office. I continue to hold the media at fault for this.

In fact, it’s far worse now. I singled out The Washington Post for exemplary coverage in 2017. I can no longer do so now. I don’t know that I can single out any major newspaper for exemplary coverage. I’ve been trying to avoid doomscrolling, and honestly, to the extent that I pay attention to the news, it’s because I want to be well enough informed to get out in time before the camps go up.

To be clear, there have been several Republican presidential candidates I’ve respected in my lifetime. I might have had personal disagreements with the likes of John McCain, Mitt Romney, Jon Huntsman, and John Kasich, but I never thought they had anything other than the country’s best interests in mind. If one of them were in office, I might have complaints with their policies, but I wouldn’t be worried that democracy itself was under existential threat.

One unfortunate result of first-past-the-post voting is that it encourages extremism. The people who vote in primary elections tend to be the most fanatical, which makes even many of the party’s more moderate voters see it as even more essential to win the general election, because no matter how flawed their candidate might be, look at how extreme the opposition candidate is! And because there are only two viable candidates, compromise becomes almost impossible. I’m reminded of this old chestnut from Douglas Adams, in which Ford Prefect explains the greeting “Take me to your lizard” to a bewildered Arthur Dent:

“It comes from a very ancient democracy, you see...”

“You mean, it comes from a world of lizards?”

“No,” said Ford[…], “nothing so simple. Nothing anything like so straightforward. On its world, the people are people. The leaders are lizards. The people hate the lizards and the lizards rule the people.”

“Odd,” said Arthur, “I thought you said it was a democracy.”

“I did,” said Ford. “It is.”

“So,” said Arthur, hoping he wasn’t sounding ridiculously obtuse, “why don’t people get rid of the lizards?”

“It honestly doesn’t occur to them,” said Ford. “They’ve all got the vote, so they all pretty much assume that the government they’ve voted in more or less approximates to the government they want.”

“You mean they actually vote for the lizards?”

“Oh yes,” said Ford with a shrug, “of course.”

“But,” said Arthur, going for the big one again, “why?”

“Because if they didn’t vote for a lizard,” said Ford, “the wrong lizard might get in. Got any gin?”

Douglas Adams, So Long, and Thanks for All the Fish

In systems that use proportional representation or score voting, by contrast, compromise is often necessary just to form a government. We often treat ‘compromise’ as a dirty word, but the complete lack of compromise is a large part of what got us where we are today – horrendously divided, shouting at each other, unwilling to listen. We’ve put ourselves in the sort of place where it’s virtually impossible for both the left and the right to get what they want. That’s unsustainable, and I shudder to imagine what the long-term consequences will be.

As scathing as I was to the Republican Party establishment in 2017, I feel I was, if anything, too kind. A large part of what got us to this point is that its leaders long ago abandoned any semblance of engagement with reality. And for this, we can blame the likes of Rupert Murdoch and Roger Ailes – very few people have done more to divide this country, except perhaps Mitch McConnell, who, when the dust settles, may well emerge as the most despicable American political figure of this century’s first two and a half decades. I could go over each man’s long litany of offenses, but I don’t want to reward them with further attention.

Returning to 2024’s outcome, I think a lot of the problems were structural. As I remarked in 2017, the Democrats controlled a filibuster-proof majority for only about a month of Obama’s presidency. At no point in Biden’s presidency did they control one, and the Republicans were determined to sink his presidency by any means they could. Voters tend to blame the president’s party even for problems the president has little control over. The economic recovery from COVID was sluggish, and Democratic voters were demotivated to begin with. Harris’ economic proposals weren’t actually bad, but they also weren’t five-second sound bites, so they didn’t play well on TV news.

Contrast this with the convicted felon who played a smart businessman on the teevee saying things like “Only I can fix it.” They weren’t true, but that doesn’t seem to matter any longer. My only half-ironic hot take is that the Democrats should have convinced Oprah or George Clooney to run. Americans worship celebrity. Running another celebrity would have nullified one of the felon’s big advantages.

My proposals are not as ironic as they might sound on the surface. Unlike the felon, Oprah built her empire up from essentially nothing – in short, she’s the smart businessperson the convicted felon plays on TV. And Clooney has a ton of policy and administrative experience from his charity work. To be clear, I’m not happy that the presidency has been cheapened in such a way that celebrity has become a decisive factor in presidential elections, but I’m also a pragmatist.

Whatever issues I may have with the Democratic Party, I’m effectively a single-issue voter. I don’t want Federalist Society hacks appointed to the Supreme Court. Democrats won’t appoint them and Republicans will, so I remain a Democrat. It’s probably hard to overstate the damage the end of Roe will do to poor women in the U.S., and that’s just the tip of the iceberg.

I don’t think our political left fully appreciates the importance of judicial appointments. Say what you will about Republicans: they vote. Judicial appointments are a large part of why. They’ve played the long game for decades. Democrats, to paraphrase Terry Pratchett’s Lord Vetinari, don’t seem to have the knack. I get it: There’s something gauche about seeking power. The problem is, to quote Douglas Adams again:

The major problem—one of the major problems, for there are several—one of the many major problems with governing people is that of whom you get to do it; or rather of who manages to get people to let them do it to them.

To summarize: it is a well-known fact that those people who most want to rule people are, ipso facto, those least suited to do it.

To summarize the summary: anyone who is capable of getting themselves made President should on no account be allowed to do the job.

To summarize the summary of the summary: people are a problem.

Douglas Adams, The Restaurant at the End of the Universe

I now feel I was too charitable not merely to both political parties, but also to the entertainment industry as a whole, and to the adult film industry in particular. A lot of unpleasant revelations have come out in the past eight years that I wasn’t aware of at the time I wrote this. This section is already dark enough as it is, so I won’t go over all the sordid details. But whatever problems the industry has – and they are many – at least it’s not NoFap, which is linked to erectile dysfunction, belief in false conspiracy theories, bigotry, shame, feelings of helplessness, anxiety, depression, and suicidal ideation.


«Ἐν ὠχρῷ σελήνόφωτῐ́» (“In the pale moonlight”)

I never had biological kids. A few years ago, I had a vasectomy to make sure I never would. I’m increasingly glad I made that choice: by the time any kids of mine reached my age, half this state would probably be underwater. I turned forty-two while I was working on this afterword. I’d left open the possibility of adopting, but I’m approaching an age where I’d consider that inappropriate, too. If I raise children, I want to be able to be there for them for at least as long as my parents have been there for me. But in any case, I’m currently single, and I’m certainly not prepared to be a single parent. Nor would I have the resources to raise kids by myself.

I graduated with a degree in cybersecurity in 2021. I don’t know if I’ll ever work in the field. Honestly, the idea of being in charge of so many people’s privacy and security terrifies me. I currently do freelance writing on autism-related topics for my income. It’s the most rewarding work I’ve done, though I haven’t been consistently able to do it, owing to some rather severe mental health issues I’ve been working through. Luckily, my supervisor has been quite understanding. (Beyond the other awful things that happened earlier this year, I suffered the worst series of betrayals I’ve ever experienced. To be honest, those betrayals affected me more than anything else did, and I’m still reeling from them.)

Other than that, I’ve been keeping myself busy with creative output. I’ve done a lot of game modding over the past several years. Eternal, the project I mentioned above, is the work I’m proudest of. I’ve been working on it, off and on, since 2018; I’ve contributed level design, music, writing, scripting, and artwork, and I’ve been managing its development largely by myself for the last few years. I’m proud of my work on several other projects, as well. Apotheosis X (for which I did sound design and scripting) got featured in PC Gamer.

I think I actually kept myself too busy. Another game I worked on, Tempus Irae Redux, was finally released earlier this year, after almost five years of work. Unfortunately, a combination of stress and ill-timed trouble with medication refills and insurance at the start of the year caused me to decompensate; I ultimately decided to quit the project. I’m sure most people will find the version of the game that got released to be satisfying. I can’t say that I do. Several planned features that cost me a lot of time and stress ended up being cut.

Some other projects I’ve worked on remain unfinished. Some may never be finished, although I hold out hope for Where Monsters Are in Dreams, a project I worked on from 2019 to early this year. Someday I may also finally finish the project I started in high school, Marathon Chronicles.

I’ve already alluded to writing music. I buried the lede on the scope of it. Since late 2022, I’ve released over nine hours of music, much of it on four albums that are now mostly complete (I may revise some of the mixes and write new parts for a few of the songs). See You Starside is based on the Marathon soundtrack, but thirty-eight minutes of its seventy-eight-minute running time consist of new material I wrote myself. (It’s thus the length of a double-LP set.) The next three albums would each be double-CD or four-LP sets if I’d physically released them. I’m now working on another album. As of 2025-08-29, in chronological order:

My albums since 2023 (album · composition start & end · songs · length):

- See You Starside: The Marathon Soundtrack Reimagined (cowritten with Alex Seropian) · 2022-12-05 to 2023-01-21 · 16 songs · 1:18:17
- Compositions 2023-2024 · 2022-12-29 to 2025-01-06 · 15 songs · 2:35:27
- Mūsica ex tempore malōrum (Music from a Time of Disasters, or Music from After the Opportunity of the Disasters) · 2024-08-29 to 2025-01-30 · 23 songs · 2:37:55
- Κᾰτηγορῐκή ᾰ̓πολογῐ́ᾱ (Kătēgorĭkḗ ăpologĭ́ā; meaning is deliberately ambiguous) · 2025-01-22 to 2025-04-25 · 16 songs · 2:07:50
- Τὸν ὦνον πᾰ́ντων καί τὴν ἀξῐ́ᾱν οὐδένων (Tòn ônon pắntōn kaí tḕn axĭ́ān oudénōn; The Price of Everything and the Value of Nothing) · 2025-06-26 to TBD · 10 songs · 38:55
- Total: 80 songs · 9:18:24

Since that’s a lot, I anthologized my best work from January 2023 to April 2025 on two releases:

My anthologies since 2023 (album · composition start & end · songs · length):

- Selected Works 2023-2024 · 2022-12-05 to 2025-01-05 · 15 songs · 2:34:08
- Iānuāriō Aprīlem MMXXV (From January to April 2025) · 2025-01-10 to 2025-04-25 · 5 songs · 1:18:25
- Total: 20 songs · 2:52:33

Of course, those are themselves collectively almost four hours. Brevity is not and will never be my strong suit. But then, Selected Works anthologizes more than just my albums: it takes four songs (over fifty minutes) from Eternal’s soundtrack, which itself lasts more than seven hours.

I mentioned that I didn’t intend this to be a book when I began writing it. This year, it happened again. This month, I noticed that a page I started writing last year is now so long it qualifies as my second book – over 46,000 words and 200,000 characters (80,000 words and 380,000 characters if you factor in an appendix). It’s about the mathematical principles underpinning harmony, scales, modes, and music theory; it’s also a history of the diatonic major scale that Western music has used for at least 2,500 years. I suspect it will be of interest (or even comprehensible) to a much smaller group of people, but it’s also much less likely to infuriate any of them. But in any case, once I started working on it, I couldn’t stop.

I’m a somewhat competent programmer these days – a friend and I did 2024’s Advent of Code earlier this year in Rust, which is probably my favorite language I’ve worked in, just above Lua and C#, in that order. I never got on well with C++ – it’s too low-level to have gotten rid of the annoyances that higher-level languages like Rust and Lua solve, but it’s too high-level to be an interesting challenge like assembly. C++ evolved piecemeal; Rust was designed by computer scientists who had actually studied a large number of languages and their associated problems. That’s rare for a programming language. A lot of its features are designed to prevent common logic errors, which vastly decreases the annoyances programming used to cause me. Rust has a steep learning curve, but its iterators in particular are immensely powerful. A good chunk of my music theory book (including the appendix in its entirety) has employed a Rust program my friend and I have been working on for scale analysis. I doubt I could make a living programming, though. I’m only capable of doing it a few hours a day before I’m spent.
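Since I’ve just praised Rust’s iterators in the abstract, here’s a tiny, self-contained sketch of the style I mean. To be clear, this is an illustrative toy written for this page, not an excerpt from the actual (and far larger) scale-analysis program; it derives the step pattern of the 12-tone-equal-temperament major scale with no index arithmetic at all:

```rust
fn main() {
    // Semitone positions of the 12-TET major scale, with the octave
    // included so the final step back to the tonic is counted.
    let major = [0u8, 2, 4, 5, 7, 9, 11, 12];

    // `windows(2)` yields overlapping adjacent pairs; `map` turns each
    // pair into the interval between consecutive scale degrees.
    let steps: Vec<u8> = major.windows(2).map(|w| w[1] - w[0]).collect();
    assert_eq!(steps, [2, 2, 1, 2, 2, 2, 1]); // whole, whole, half, …

    // Adapters compose cleanly: one more chain confirms the steps
    // span exactly one octave.
    assert_eq!(steps.iter().sum::<u8>(), 12);
    println!("major scale steps: {steps:?}");
}
```

The same windows/map/collect shape scales straight up to real analysis tasks: swap in any scale’s pitch list and the surrounding code doesn’t change.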

I’ve also been learning Latin and Greek, partly because etymology has always fascinated me, and so much of our language descends from one or both of those languages; and partly because I’ve just wanted to learn them for a long time. (Most English words with at least three syllables came to us from Latin or Greek; exceptions like kamikaze are rare. The letters ph, meanwhile, almost always descend from φ, even in neologisms like phreaking: it’s derived from phone and freak, the former of which descends from τῆλε + φωνή [têle + phōnḗ, literally afar voice]. Of course, this has had drawbacks. The letters φθ might look cool in Greek, but they’ve given us such frequently mispronounced, frequently misspelled words as diphthong, ophthalmology, Nephthys, and phthalimide. Reminder: ph is pronounced f. Not that χθ is any picnic either – but then, without spellings such as chthonic, where would Lovecraft have gotten the idea for Cthulhu?)

I am by no means fluent in either of these languages, to the extent that anyone can be fluent in a language (or dialect) with no native speakers. But I understand them well enough to write in them, and both my writing and my Greek pronunciation have drawn praise. Not bad, considering I’m entirely self-taught in both. I wrote these lyrics ⟨aaronfreed.github.io/kategorikeapologia.html#vidireslyrics⟩ for Κᾰτηγορῐκή ᾰ̓πολογῐ́ᾱ’s final song, taking inspiration from Blade Runner and the Marathon trilogy:

Vīdī rēs vōs hominēs nē crēderētis
(I’ve Seen Things You People Wouldn’t Believe)
Lyrica latīna
Latin lyrics
Trānslātiō anglica
English translation
Viāvī omnis circum astra
Vidē nāvēs fervēntes in Ōrīōne
Trabēs C ad portam Tannhäuser
Mōmenta erunt perdita in tempore
Sed post omnia quae vidēbam
Ac post omnia quae faciēbam
Prīmae vēritātēs quās cōnstābunt
Omnēs viae ad sōlem dūcunt
I’ve voyaged all around the stars
I’ve seen ships on fire in Orion
C-beams by the Tannhäuser Gate
Moments will be lost in time
But after everything I’ve seen
And after everything I’ve done
The first truths will stay constant
All roads lead to the Sun
Post īnsidiās Strauss et MIDA
Ac rampans ignis ferrum cūdit
Nōs vīcerimus mangōnibus
Ac ballō Ūnā Obscurā
Ac vīsimus ad astrum vagum
Sciō nec quamdiū viābō nec quem fiām
Sōlum dēstinātiōnem meam
Omnēs viae retro sōlem eunt
After Strauss and MIDA’s machinations
After rampant fire forged a sword
After we’ve won the war with the slavers
After I’ve danced with the Dark One
And after we’ve visited the rogue star
I know not how long I’ll travel nor who I’ll become
I know only my destination
All roads lead back to the Sun
Ἀττικός Ἑλληνική
Attĭkós Hellēnĭkḗ
Attic Greek
Ῥωμᾰῐ̈σμένη
Rhṓmăĭ̈sméni
Romanized
Μετάφρασις
Metáphrasis
Translation
Ᾰ̓πᾰντήσομεν ἐν ὠχρῷ σελήνόφωτῐ́
Ἐν κήπῳ εἰς γένεσῐν κόσμου
Μᾰχώμεθᾰ ἤ ἔρωτευώμεθᾰ
Ᾰ̓πό νῷν αἰωνίου χοροῦ
Καί εἰ πᾰ́λῐν σύμβῐβᾰ́σοιμεν
Πᾰ́ντες τῠ́ρᾰννοι ὀφείλωσῐ δείσειν
Δῐκαιοσῠ́νη καί ἰσότης νῑκήσουσῐν
Πᾶσαι ὁδοί ἡγέονται εἰς ἥλῐον!
Ăpăntḗsomen en ōkhrôi selḗnófōtĭ́
En kḗpōi eis génesĭn kósmou
Măkhṓmethă ḗ érōteuṓmethă
Ăpó nôin aiōníou khorôu
Kaí ei pắlĭn súmbĭbắsoimen
Pắntes tŭ́rănnoi opheílōsĭ deísein
Dĭkaiosŭ́nē kaí isótēs nīkḗsousĭn
Pâsai hodoí hēgéontai eis hḗlĭon!
We’ll meet in the pale moonlight
In the garden at the world’s birth
We may fight, we may fall in love
From our eternal dance
And if once more we should unite
All tyrants should beware
Justice and equality will prevail
All roads lead to the Sun!

It’s no Homer or Virgil, but oddly, it’s better than I could do in English. I’m too wordy in English to write lyrics in it. How else would I manage to accidentally write not one, but two books?

(Note: rampans is not a real Latin word. Rampancy is so central to Marathon’s story that not using a cognate for it wouldn’t have felt appropriate. Unfortunately, there isn’t one.)

Back to top · Table of contents · My portfolio · Contact me · Website index

“Through myself and back again”

I never wrote the section on depersonalization and derealization I’d planned in 2017. I’ve suffered episodes of the disorder off and on since a bad car crash several years ago. After the crash, I stopped driving much of anywhere except the job I was working at, and once that job went online, I stopped driving there, too. I haven’t driven anywhere in ages, and frankly, I don’t really miss it. I never liked driving. I liked having driven.

PTSD generally produces no flashbacks, one flashback, or many. I got lucky: I only ever had one. It left no doubt that I had PTSD, so in that regard, I’m almost glad I had it. It came no more than a few weeks after the accident, and it was so surreal that I can’t even say I found it entirely unpleasant, but I’d have found flashbacks far beyond unpleasant had they recurred routinely. I can easily see how they can undo years of progress, and I’m therefore annoyed when people casually use ‘triggered’ to describe mere annoyances. They’re not on the same level.

I had been experiencing depersonalization and derealization for a while already when one night, while I was driving, I experienced an episode of desomatization, the feeling that I had physically left my body. As I wrote in 2017, desomatization is commonly linked with depersonalization and derealization. It felt as if I were physically floating above myself. I knew I wasn’t – it’s not a psychosis – but it spooked me to no end. I remain spooked about it. It is, honestly, as much a reason I don’t drive as the PTSD from the crash itself is.

I’ve never really found an effective coping strategy for depersonalization, other than time. It helps to be aware of one’s surroundings, to take them in, to remember to breathe, to focus on one’s breathing. But it’s not sufficient to overcome the disorder.

It wasn’t all bad. I experienced synesthesia for the first and only time during an episode of depersonalization. Normally, you’d say “this guitar solo sounds like a Neil Young solo” if that were the style it was performed in. But that wasn’t what I experienced. It was “this guitar solo feels Neil Young.” I realize that sounds nonsensical if you haven’t experienced anything like it, but I can assure you that it was entirely literal.

It wasn’t worth all the other problems I experienced from it, though. I mentioned earlier that Counting Crows’ frontman Adam Duritz suffers from the disorder. A lot of his lyrics sound metaphorical. I can assure you that lines like:

I walk in the air, between the rain
Through myself and back again
Where? I don’t know
Counting Crows, “Round Here” (August and Everything After)

are also entirely literal. That is exactly how the disorder feels. In fact, that’s how I learned I had it: his lyrics described exactly what I was experiencing. I found out he had the disorder, my therapist and I went down the checklist of symptoms, and something like 90% of them matched symptoms I’d experienced in the last month.

Undergoing a flashback, synesthesia, and desomatization within a few months of each other made me fully cognizant of the subjective basis of human perception, and ever since, I’ve been convinced that we’ll never be able to completely explain the entire spectrum of human experience through purely materialistic or empirical means. Near the start of this book, in “Moonlight’s storytelling innovations & societal critique”, I covered some objective reasons our perception is subjective, but subjectively, I doubt I’d fully understand many of the implications had I not had those experiences. No verbal account can truly suffice to convey how they felt, and I might even doubt my memories’ reliability had I not written them down, since they’re such massive outliers from my usual experiences that I might think I’d somehow given myself false memories.

I don’t know what else to say about it. I felt like I had a lot more answers in 2017 than I do now.

…Actually, that’s not even true. I still have answers. It’s just that far too many of those answers are now along the lines of the Japanese saying 「仕方がない。」 Shikata ga nai. There is no way. Nothing can be done about it. It can’t be helped. Which is the sort of answer that threatens to plunge a person into despair.

「物の哀れ」

Speaking of Japan, I find one of its concepts hauntingly beautiful: 「物の哀れ」 (mono no aware), literally the pathos of things. (The Japanese aware is unrelated to the English word.) Mono no aware signifies an awareness that all things are transient. It’s not really a happy realization; it’s often wistful, and some writers have compared it to Virgil’s lacrimae rērum (tears of things). But at times like this, it comforts me. Along the same lines is an aphorism that comes to us from Persian: این نیز بگذرد (īn nīz bogzarad). This too shall pass.

One of my closest friends from high school, who worked on several Drama League productions with Adele and me, ended his life about seven years ago. I wish I’d spoken to him more after high school. It’s one of the greatest regrets of my adult life that I didn’t. I don’t know if I could’ve done anything to prevent it. But whatever he was dealing with, I wish I could’ve made him aware that it was transient. One of the most important things that has kept me alive, even during my darkest moments, is being aware that suicidal impulses are just that, impulses, and that they too shall pass. But there’s no sense making it easy for myself, so I also don’t own a gun.

After writing this, I sat on it for over eight years. I wasn’t fully confident in it when I wrote it, and its flaws seem even more severe to me now. My thinking about works of art, especially after working on dozens of them myself, is that they’re never finished. Nearly all creators eventually come to feel revising a given work is no longer a productive use of their time, but only the most arrogant feel their work is perfect. I have a growing backlog of songs whose mixes I need to finalize, and no idea when I’ll do so. I’ve been working, off and on, on Eternal since 2018. I don’t feel it’s finished yet. Maybe I never will. I’ll probably never feel this is finished either.

But maybe my life story can still do some good for others. And maybe that’s still enough.

Aaron Freed
Tallahassee, FL
2025-08-08 (last revised 2025-08-22)

Idiolexicon (2017/2025)

Although I’m not the first person to come up with the pun idiolexicon, it’s not a real word. But it should be. It’s a combination of two Greek words, whose etymologies and definitions further break down as follows:

  1. idiolect (ιδιόλεκτος, idiólektos): personal manner of verbal expression
    1. ἴδιος, ĭ́dĭos: personal
    2. -λεκτος, -lektos: manner of verbal expression
      1. λέγω, légō: say, speak
      2. -τος, -tos: (adjective suffix added to nouns or verb stems)
  2. lexicon (λεξῐκόν βῐβλῐ́ον, lexĭkón bĭblĭ́on): book of words; dictionary; glossary; lexemic catalogue
    1. λέξῐς, léxĭs: saying, speech, word
      1. λέγω, légō: say, speak
      2. -σῐς, -sĭs: (noun suffix added to a verb stem, denoting an action, process, result, or concept)
    2. -ῐκόν, -ĭkón: neuter of -ῐκός (-ĭkós), of or pertaining to
    3. βῐβλῐ́ον, bĭblĭ́on: book

Thus, we can surmise, it’s an author’s collection of definitions, in their own words, of terms they use frequently in their work. Idioglossary would also fulfill the same purpose, but the pun on idioglossia (ιδιογλωσσία, from ἴδιος + γλῶσσα [tongue] + -ία [abstract noun suffix]) might confuse people.

In any case, I’d argue that providing such a glossary should become standard practice. It removes an avenue for authors to hide behind idiosyncratic jargon and claim they’re just being misinterpreted, and by extension, a potential source of disingenuousness. The more of those we can remove, the better.

[On that note, I wrote all the definitions below in my own words, including my formulations of the Platinum Rule and all eponymous laws. I partly derived the etymologies from Wiktionary, but translated many of them myself. (Not all entries have etymologies yet; I plan to add more.) Unless otherwise stated, etymologies are from Latin or Greek. If an entry’s relevance isn’t obvious yet, I likely plan to write about it in an upcoming addendum to this book. I only included meanings directly relevant to topics I discuss or plan to discuss: thus, I didn’t note mirror’s traditional definition, glass-covered reflective surface, because it’s only metonymically relevant. –Future Aaron]

Recommended Nonprofit Organizations

The following organizations are sufficiently reputable and have been around long enough for me to recommend donating to them if you can spare the cash and volunteering for them if you can spare the time. I don’t entirely agree with all of them on every issue (e.g., the ACLU’s stance on free speech is more extreme than mine), but our disagreements feel inconsequential in the present political context.

Acknowledgements

The actual process of writing is quite solitary; it generally consists of one person with a pen and paper, a typewriter, or (as in my case) a word processor. However, ultimately, no work is ever written alone; every writer in history has benefited in some way from their predecessors’ achievements.

To protect people’s privacy, I won’t be naming people connected with me directly. I’ve thanked many of them privately. I do, however, wish to publicly acknowledge several groups of people.

First: my friends and family throughout the years. If not for their support, I’d never have become healthy enough to complete a project of this magnitude. I didn’t even know I was capable of creating something this ambitious when I began working on it, and my ability to produce it has been in large part a testament to how invaluable their assistance has been.

Second: my teachers and professors. Many of them have been far more understanding of my personal issues than I’d ever have expected, and I’d certainly never have possessed the knowledge, research skills, or writing ability to complete a project of this scope without them.

Third: the many people with whom I’ve interacted online throughout the years. As I said earlier in this book, writing, in all probability, thousands of words every day and receiving feedback from others who perceive the world and read language in radically different ways helped hone my verbal expression in a manner I’d otherwise never have received. This, too, made me a much better writer. I suspect I’ve ultimately drawn upon many of their ideas to some degree in writing this book, as well.

Fourth: anyone who read drafts of this book and provided feedback, plus anyone who responded to my personal accounts in the past year. In all likelihood, without their encouragement, I’d never have grasped the full importance of this project to others with autism or to their families, and it would have been a far lesser book. I doubt I’d even have included so much personal detail without their feedback. I don’t think the book would say everything I’d wanted it to say without the personal details, so I’m not certain I can fully express how invaluable their assistance has been.

Fifth: Anyone upon whose research I’ve relied in writing this book. I’ve generally provided in-text acknowledgement where possible. I may attempt to provide endnotes in a final draft.

Sixth: The cast and crew of Moonlight. This book wouldn’t even exist without them, naturally.

Finally: Anyone who works to improve the lives of people with autism, as well as anyone who works to improve the lives of any marginalized population. Without your work, I’d have far less hope, and I’m not sure I’d even have seen the point of writing this book.

Aaron Freed
Sarasota, FL
2017-04-28

Endnotes

1. Our yearbook lists 110 Class of 2001 members, but I’m not sure if they all graduated with us. There may also have been people who graduated a year early, but the only one I remember is listed in our class.
2. I do clearly recall helping director Cameron Stuart, Adele’s male “most dramatic” counterpart, with sound design for Lizzie Loves Joe Loves Sharon, which won several awards and got us to State. My own performance in a small bit part in the play was very likely far less memorable; it didn’t help that I had trouble standing still.
3. Our middle school orchestra instructor was very good, but I wouldn’t classify those performances as art; besides, we weren’t conducting ourselves, while the Drama League’s plays were all student-directed. I’m not sure it’s possible to overstate how important that was; I’m pretty sure it wouldn’t have been half as fun otherwise, and we certainly wouldn’t have learned as much.
4. To be fair, skeptics may point out that I haven’t ever been personally acquainted with any other Hollywood producer, and that I may find it kind of cool to be able to tell people that a former classmate of mine with whom I worked, even if only tangentially, in my high school theatre troupe now has not just an Academy Award, but the Academy Award. Guilty as charged. On the other hand, no other film I’ve ever seen has resonated with me as strongly as Moonlight has, and the film’s content and artistic strength are why I’ve reacted this strongly to it. This film may never leave my head; it’s changed my perception of the world to a degree which I don’t believe any two-hour film has previously managed, and which few works in any medium have.
5. A Zinn quote that I find highly relevant: “The cry of the poor is not always just, but if you don’t listen to it, you will never know what justice is.” Swap ‘poor’ with ‘marginalized’ to get a broader observation that still applies.
6. I debated which term to use for this community for a while. Queer was once considered offensive, but is now considered reclaimed in the U.S. and, unlike LGBT and its derivatives, doesn’t exclude any queer identities; it serves as an umbrella term for all identities that don’t fall under traditional norms of gender and sexuality, probably including mine (I’ve chosen not to address how; I’d devoted enough of this book to my identity issues already). Other neologisms like QUILTBAG (charming though it is) are simply unwieldy. Queer still seems to be considered offensive elsewhere, but since I find it less exclusionary and I’m an American focusing on the U.S., I ultimately settled on it. I apologize to anyone still bothered by its usage; I could find no perfect solution. I’d like to stress, however, that I don’t intend its usage in any derogatory fashion; I intend to further its reclamation.
7. Naturally, the asterisk signifies the underlying absurdity of a 2.86 million-vote loser being declared victor. This usage is not unique to me; Esquire politics blogger Charlie Pierce also employs it, but I began using it independently. I suspect we were both inspired by the asterisk sometimes used in baseball record books.

[A newspaper reported a while back that he pays attention to stories only if they mention his name. For this and several other reasons, I’m refusing to print it. Even though I’m certain he’ll never want to read this, and may not even possess the ability to do so, I have no intention of rewarding a malignant narcissist with any sort of attention. –Future Aaron]

8. There is, in fact, a stage musical entitled Allegiance based on his experiences.
9. Rain Man, of course, is an exception, but the film presented autism as an incredibly unusual disorder, and its presentation of Raymond as a savant is not actually typical, or even close to typical, of most autistic people’s lived experiences, nor are his repeated freak-outs particularly characteristic of us. I’ve spoken to several autistic people about the film, and none considered it representative of their experiences.
10.

This means that no one during my time at Pine View even knew I was autistic. Our local paper even reported after my graduation that Pine View had accepted its first autistic student – who was, in fairness, probably its first student whom it knew to be autistic. My family and I only became aware of the diagnosis when I was nineteen, but I’d apparently been diagnosed at least a year earlier and possibly several; the professionals I’d seen felt I’d already been burdened with too many labels. I can see why they felt this, but in retrospect, knowing it earlier could’ve saved me an almost inconceivable amount of grief because, among all the conditions I’ve suffered, autism has had by far the largest impact on me, and learning how to cope with it has been the defining personal challenge of my entire adult life.

I will also, starting here, use ‘autistic’ to describe all people with autism-spectrum diagnoses and ‘autism’ to describe all such disorders. This may not be strictly accurate, but I quickly tired of using more specific nomenclature like “people on the autism spectrum” because it’s far too wordy (Orwell: “If it is possible to cut a word out, always cut it out”) and the disorder and its impact are a central focus of this writing.

11. Other cultures appear to consider this more acceptable; people who have spent time in the northern U.K. have observed that Americans are far more intrusive about personal matters, including eye contact. I’ve spent roughly two months in Scotland, and this seems accurate; the Scots are friendly but not intrusive.
12. I should note that I’ve had conversations with neurotypicals who have assured me they don’t learn as much from eye contact as popular culture suggests. Evidently, a person would need lengthy professional training to learn that much from a glance at someone’s eyes. This has lessened but not eliminated my insecurities about establishing eye contact with strangers, because even if they don’t learn as much from eye contact as I thought they did, they’re still probably learning more than I am.
13.

To clarify how wearisome I find eye contact with strangers, I’d unquestionably be more comfortable having strangers see me naked than I am making eye contact with them, and I might even be more comfortable having sex with a woman I’d just met than making eye contact with her, though I’ve never tested the latter hypothesis. (To provide additional context here, I may also be on the asexual spectrum, as I rarely develop interests in sex with specific people, but I at least find sex enjoyable, so the hypothesis seems plausible.)

Also, note that I’m not uncomfortable with all eye contact; if I both know and trust you, I will make eye contact with you, though it probably won’t be constantly (maintaining it for too long without pause feels awkward).

Note, as well, that my issues with eye contact don’t extend to everyone with autism. There are some autistic people with the exact opposite problem: if you don’t look at them, they literally can’t communicate with you. They apparently process the world in such a visual manner that it’s impossible for them to perceive communication based on words alone. This mode of thought is so completely alien to me that I’m not even certain I can fully comprehend what it’s like. It’s much like descriptions of schizophrenia: I understand the concept, but I certainly can’t imagine what actually living with it is like, or even what experiencing it is like.

Autistic people’s subjective perceptions usually vary widely from neurotypicals’, but since society normalizes neurotypicals’ preferences, neurotypicals may never have cause to consider how that affects those who don’t share them. Autistic people may have widely different communicative preferences, or even needs, that society doesn’t normalize, so they may have communication difficulties that are in no way related to intelligence.

14. I use the term ‘neurotypical’ throughout this writing to describe people who don’t fall on the autism spectrum. The term has been criticized as implying that those without autism can be treated as having fundamentally similar brain types, which is obviously a nonsensical assumption to make, but the phrase “non-autistic people” is (again) wordy and also sounds awkward to me, as do any other terms I could come up with to describe them, so I won’t be repeating any of them after this endnote.
15. The only reason I can’t write those years off as a complete waste is that I set aside part of nearly every day for writing, as I’ve been doing for at least twenty years. The best way to improve one’s writing ability is to write as much as possible, and I’ve written a lot.
16. Again, the fact that I kept writing nearly every day is a major reason they weren’t a complete loss. I also earned my first degree during this time, though this problem persisted for several years after that.
17.

In 2018, I obviously can’t mention Harmon without also acknowledging the allegations against him. Later in this writing, I will address this more fully, but the same individual can be responsible, on one hand, for unforgivably awful transgressions and, on the other, for creative works that inspire people in wonderful ways. At this point, I will simply quote Terry Pratchett and Neil Gaiman’s Good Omens: “And just when you’d think they were more malignant than ever Hell could be, they could occasionally show more grace than Heaven ever dreamed of. Often the same individual was involved. It was this free-will thing, of course. It was a bugger.”

[In 2025, this choice of quote reads as bitterly ironic, since Harmon was pretty much a model of grace and humility in dealing with past misbehavior, while Gaiman has now been accused of much worse than Harmon ever was. I’m certainly not going to defend his behavior, but the quote still seems accurate to me. –Future Aaron]

18. This isn’t a veiled request to work on a film. I’m not that self-centered or shameless, and whether I could even help would likely depend on production deadlines, my work/school schedules, whether I’d need/be able to travel, and probably other factors I haven’t even considered. I’m also not that underhanded; if I were desperate to work in film, I’d be far more direct about it (I would, for instance, have included a résumé and portfolio). I’d certainly cherish and be grateful for any such opportunity I were offered, though; I’ve enjoyed every collaborative artistic project I’ve participated in, and I suspect I have stories to tell that will be of great personal benefit to many people who are currently hopeless.
19. The other later gets its own section, and since they’re interconnected, I’ll obliquely address it before then as well, but I don’t wish to identify it explicitly until I’ve further explained my own perspective.
20. Remember, many of my memories of high school are quite vivid. My lack of recollection here is more a product of how horrible my experiences were than of the length of time that’s elapsed since.
21.

I abandoned it for several reasons, but the main one was that it was too ambitious; finishing it would’ve required far more work than it was worth. I also no longer identified with the narrative I’d planned for it, which wasn’t even finished, and making the levels I’d already built serve another narrative would also have required far more work than it was worth. Eventually I lost interest in games entirely. The toxicity of ‘gamer’ culture in recent years hasn’t helped, but my interest was waning even before it got overtly misogynistic.

[lol, so about that… well, I’ll go into it in the afterword. –Future Aaron]

22. We’re also probably not easy to have relationships with; our emotional needs aren’t always comparable to neurotypicals’, nor do we always communicate in the same ways, particularly when it comes to nonverbal communication. I discuss this further later, but we don’t possess intrinsic abilities to read their nonverbal cues, they don’t possess intrinsic abilities to read ours, and many of them don’t even realize it. For that matter, many of us don’t realize it either, and that may actually be the greater of the two problems: neurotypicals may assume we intend messages we don’t intend, and neither side may realize the misunderstanding.
23. I was going to get more specific here, since I thought it’d be a beneficial learning experience for others with autism, but it was uncomfortably revealing about her. I’m fine revealing uncomfortable personal details about myself, but there’s a line beyond which I’m not comfortable with revealing details about others, particularly when they’re not comfortable having them revealed, either (though, to be clear, even if they were OK with it, I wouldn’t be). The only information that either of us is comfortable having revealed here about our sex life together is that we had mutually enjoyable sex. I tried writing an appendix of sex advice for autistic people and their partners instead, but that also felt like the wrong approach, not least because I’m really not qualified to write it.
24. In retrospect, it would’ve been hugely out of character; her mother was always extremely kind to me and let my ex and me do basically whatever we wanted together, even though we were under her roof most of the time. But abusive people can evidently also have pretty huge mood swings. I now regard whether I overreacted as one of the biggest mysteries of my life. I suspect I’ll never know the answer for sure.
25.

This isn’t to say I’ve lost track of everyone I used to know online, but she’s one of only a handful of people I met online whom I’d still go out of my way to track down if it proved necessary. In the long run, only a handful of people I’ve known online have been truly important to me, and she’s by far the most important. As for the communities I cut ties with, I felt they’d become toxic; they were either too hostile or too laden with internal politics (or both) to be worth my time any longer. While I still spend a decent amount of time at a few Internet communities I find worthwhile, it’s much less than I used to.

I don’t like losing contact with people I’m on good terms with, but I felt cutting ties with everyone from those communities for the time being was the quickest way to get past some rather serious emotional disturbances the communities had caused me (it doesn’t help that I felt my experiences at one such community had rather closely mirrored my experiences with my first employer, which were also causing me rather serious emotional turmoil at the time; I’ll describe the latter later), and it wound up being effective. At this point, I can have direct contact with people from those communities without too many unpleasant memories flooding back, but I’m not sure I could have, say, in 2016.

I was also following advice I now see as terrible: social media generally reflects people’s experiences only at high and low points, so it creates unrealistic perceptions of their lives; thus, seeing them achieve success might induce jealousy or feelings of inadequacy. This is risible advice, based on an intrinsically and unjustifiably cynical view of human nature. My immediate reaction to Moonlight’s Academy Awards was, first, extreme jubilation that a classmate had won Best Picture, and second, anger at myself for having wasted my life. However, I quickly got over the anger, since within days, I started writing this book, which soon revealed that what I’d thought had been a waste had actually provided me with quite a lot of relevant experience and knowledge. This isn’t simply my opinion, and I wouldn’t be expressing it if it were (I’m not prone to braggadocio and, in fact, tend to be rather self-effacing overall; a particularly relevant example is that my best friend from my high school class only found out about a particular achievement of mine after our principal mentioned it in my graduation speech, and was shocked that I hadn’t told anyone about it); feedback on drafts from others, particularly from other autistic people and their parents, has been far more enthusiastic than I initially expected.

I’m not even sure I’d ever have achieved this level of confidence in my writing without Moonlight’s Best Picture win, nor would I ever have felt I had any relevant stories to tell that could find an audience, or that my life experiences had this much value. Moonlight’s mere existence, its honest depictions of multiple marginalized communities, its wide mainstream success, and its receipt of film’s highest honors show that I was completely wrong: there is an audience for stories like mine, and I may even be able to tell some of them. Moonlight’s awards indirectly gave me a confidence in myself, in my writing, in the progress I’ve made, in my ability to make further progress, and in society’s ability to make progress (remember, overt queerphobia was considered a winning political strategy in this country as recently as 2004) that I’m not sure I’d otherwise have gained.

The above factors are also a large part of why Moonlight’s Best Picture win came as such a shock, and why I knew very little about the film before the awards. I wasn’t checking Facebook at all, so I had no idea what any of my classmates were doing. I also didn’t pay much attention to the award nominees, so I only heard Adele was up for an award the morning of the ceremony. Along with not volunteering for and donating to Clinton’s campaign more than I did (I both volunteered and donated, to be clear; I just feel, in retrospect, that I could’ve done more), losing touch with classmates and others who were once close to me is one of my biggest regrets of the past few years, not merely because I didn’t get to see Adele’s career truly take off in real time, but also because there are probably dozens of people to whom I no longer regularly speak, and I wish I did.

26. To be fair, I still don’t want to run for office, partly because I’m extremely introverted, partly because I don’t want the degree of close personal scrutiny that would bring, and partly because I’m probably unelectable for a number of reasons, but I’m at least comfortable with the idea of working within the system now.
27.

I’ve had extreme difficulty getting interviews: I’ve applied for probably dozens of positions over the course of my life and gotten interviews for only two. Job interviews themselves also seem to pose severe problems for autistic people, but for some reason, that problem seems to have eluded me so far; despite being nervous at both interviews, I was hired for both positions.

This employer is a fantastic place for autistic people to work. Since I started here, one other person from our support group has also started; he also seems to be doing well. This work’s tasks play to our common natural strengths; its lack of interaction with the general public and our ability to wear headphones at work alleviate many issues we often face in the workplace. If more employers were this good for us and accommodating of our issues, we might not be so frequently unemployed. To be clear, things aren’t perfect here, but they don’t seem to be systematically flawed as they were at my previous employer.

I’ve given enough details about my current employer for anyone familiar with the industry to identify them; they are, in fact, all but synonymous with it. I’ve avoided using their name explicitly for several reasons, not the least of which is that I don’t want to provide the remotest implication that I speak for them. Obviously, the views expressed in this work are entirely my own.

28. If it were year-round, it might at least be enough for me to rent a small apartment, which would probably be all I’d need. As long as I have space for my LP records, some books, my computer, a TV, and a bed, I’ll be happy with a living space, and apart from LPs and restaurants, I spend almost nothing on extravagances. I doubt the income would also cover my health insurance, though, which is quite costly right now. I briefly qualified for Social Security between jobs, but apparently not for long enough to qualify for Medicare, which was unfortunate, as it would’ve put me vastly further along in my quest for self-sufficiency. We’re still appealing the claim, but navigating Social Security’s labyrinthine bureaucracy often takes years, and severance pay from my current employer after the TV diary program wraps up may eventually disqualify me entirely. It’s also quite unfortunate that many of the people who need Social Security most may be completely unable to navigate its complex bureaucracy by themselves; I certainly couldn’t have gotten even the roughly seven months I qualified for without my parents’ assistance.
29.

I suspect only about 5-10% of the population could even do this job. It requires strong analytical skills, competent computer skills, the ability to locate obscure data (hence the title of ‘research’), memory for an astounding number of processes and rules, strong attention to detail, the ability to disregard “common sense” when procedures call for it (they actually tell new editors that using common sense will get them in trouble), and probably detailed knowledge of TV stations, TV programming, and cable/satellite companies. There’s so much information to keep track of, in fact, that we have a reference notebook for facts we’re not expected to remember offhand; many other things aren’t written down, though, and we’re simply supposed to remember those. I’ve taken notes on some of those as well, because otherwise I might forget them from cycle to cycle. Also, if you don’t already have strong geographic knowledge of the U.S., you will after a few cycles in research (though you won’t learn as much about the more densely populated areas, which are already automated and thus not in our purview). At this point, I’ve probably forgotten more about upstate New York’s geography than most Florida natives have ever known.

Over-the-air station identification requires checking several factors, including but not limited to satellite stations (a number of stations are linked), low-power transmitters, broadcast channel numbers (which differ from the digital channel numbers most viewers see on their receivers), and stations’ signal strength. Reception in a given area isn’t as easy to measure as one might think, because simply locating a TV station is surprisingly difficult. Several databases provide information about TV station locations, and the one in our main database (which we refer to internally in research as CADC; I don’t know what it stands for) is frequently inaccurate. We have another database, Distributor Directory (DD), which is considered more reliable, but it shuts off at midnight; our shift extends to 12:45 am, so we essentially can’t work on broadcast stations for the last forty-five minutes. Other databases, such as the FCC’s, also aren’t consistently available, and the problem has worsened since 2017. (I suspect the FCC’s database is more reliable than DD, since it includes station transmitters’ exact longitudes and latitudes, which often differ by up to thirty miles from DD’s locations; evidently, though, it’s considered too complicated for some of the less technology-savvy researchers to use.) I suspect the federal hiring freeze instituted by the current presidency* has resulted in the FCC website being understaffed.

This description might make the job sound like hell to many readers, but I’ve never enjoyed a job more. I’ve certainly never felt more intellectually stimulated at work; overall, it’s probably the most intellectually stimulating experience I’ve had since calculus in high school (which astounded me with its sheer beauty; Euler’s identity in particular was one of the most beautiful things I’d ever seen). Doing this job requires a resourcefulness and creativity that none of my previous work involved, and the fact that I’m consistently improving at it and consistently learning new things from it makes it the most rewarding job I’ve ever had.

30. We’re not romantically or sexually involved; I get the impression she’s not interested in romance or sex at all, at least for now, but she hasn’t explicitly confirmed this, and I’m not inclined to pry. I did ask her if we were a couple earlier this year (after about half a dozen other people had asked me the same question, incidentally), but she confirmed that we’re not. If she wants to talk about it further, she will; if not, as far as I’m concerned, it’s none of my business. (I also get asked if we’re related a lot. Obviously, we’re not close family; we’re both of Ashkenazi Jewish descent, though, so there’s a remote possibility we’re tenth cousins or something.)
31. As for troubles with my everyday life: I still have difficulty with some everyday household tasks, though they’re far less excruciating to perform now than they were sixteen years ago, and I could still stand to learn a lot more about cooking (I can operate microwaves, stoves, and ovens, but complicated recipes generally remain beyond my ken). I also still procrastinate far more than is ideal.
32. I might even be comfortable enough to start giving dating a serious try again, but I have enough other priorities – such as school, class, work, writing, and my friends – that I’m not terribly bothered by the lack of romance in my life right now… which might actually mean I’d be less nervous on dates, too, come to think of it, so maybe that would make it the best possible time to start dating again.
33. To be clear, I have issues with aspects of the neurodiversity movement. Those who argue that we shouldn’t be considered to have disorders are simply wrong, as I think my life story makes clear. However, society should certainly be more accommodating to those with different thought patterns and modes of perception.
34. One study has suggested that dedicated workspaces and noise-free environments improved computer programmers’ productivity more than years of experience, salary, or even knowledge of programming languages.
35. To be perfectly clear, this is not the statistic for our entire mentally ill population; estimating the latter’s unemployment rate is difficult, but it’s probably around 40-60%. Unemployment rates vary with illnesses’ severity; for example, depression patients report 40-60% rates, while schizophrenia patients report 80-90%.
36. I’m not saying this out of dissatisfaction with my IQ, to be clear: Pine View required ≥130 IQ for admission, and my Stanford-Binet score is much closer to 200. I’m also not saying this to brag; I consider IQ only moderately more consequential than blood type. I don’t actually know my blood type, and I wish I didn’t know my IQ. My parents didn’t want me to know it, either, but a counselor once casually let it slip. I suspect that knowing my IQ led me to underestimate the obstacles I’d face in college and elsewhere.
37.

The modern Greek descendant of this term is αμάθεια (amátheia).

[This short piece I wrote earlier in 2025 elaborates on this in further detail. –Future Aaron]

38. The president* instead used ISIS here, a term whose usage I avoid, since it legitimizes the terrorist group. The term is an English acronym for Islamic State of Iraq and Syria, which is not a state and bases itself on a perversion of Islamic teachings. The Arabic-speaking world refers to them by the pejorative Daesh (داعش), an acronym for their full Arabic name (الدولة الإسلامية في العراق والشام, or ad-Dawlah al-Islāmiyah fīl-ʿIrāq wash-Shām), which resembles the Arabic words دائس (Daes) and داهس (Dahes), both of which describe someone who ‘tramples’ upon others. (Arabic Romanization is based on pronunciation rather than spelling, which is why it isn’t Daiish.) An Egyptian goddess, a ’70s rock band, a ’00s metal band, a Bob Dylan song, a fictional spy agency, several comic book characters, and numerous other aspects of society don’t deserve to be tarred by association with Daesh.
39. Glib as it is to say the president* is lying whenever he opens his mouth, he’s almost certain to be lying if he uses the sentence “Believe me.”
40. I have an admittedly entirely crackpot hypothesis that Bill and Hillary Clinton secretly have an open marriage. I have absolutely no direct evidence for this but, given that they work in politics, it’s certainly the sort of thing they’d want to keep secret from the public. On the other hand, I’m not sure it’s possible for people as prominent as they are to keep a secret that big, which is one of several reasons it’s a crackpot hypothesis.
41. Indeed, maybe this principle should apply for all marginalized populations: women should get more say in women’s rights issues than men; minorities should get more say in minority rights issues than majorities; and so on. Power disparities often cause problems in society, and if an issue affects a small minority of the population far more than the rest, but everyone gets equal input into it, then justice for the minority may be elusive.
42. Rush’s “Freewill” seems relevant: “If you choose not to decide, you still have made a choice.” In the same way, art that doesn’t take a conscious political stance is conservative in the old-fashioned sense (in contrast with the modern American sense): it implicitly suggests that the status quo is the natural order of things. There’s nothing intrinsically wrong with this in some cases (though there is in others), but it’s not apolitical.
43. Thanks are also due to the many performers who’ve challenged traditional perceptions of sexuality and gender throughout the years, like David Bowie, Prince, Annie Lennox, Patti Smith, and Little Richard; I doubt anyone can know for sure exactly how many people have been saved from suicide by knowing that ‘freaks’ who fell outside traditional gender boundaries could still find success and fame, but it has to be a lot.
44. To be perfectly clear, I’m not advocating universal reënfranchisement. People who haven’t been rehabilitated shouldn’t be reënfranchised, and I’m not convinced everyone is rehabilitatable. I’m particularly suspicious of domestic abusers and violent sex criminals, as well as serial killers and similar repeat offenders. However, people considered well enough to send back out into society should also be considered well enough to vote, and people not considered well enough to vote shouldn’t be sent back out into society. If people are still dangerous enough to others to be denied the franchise, then why are they even being released? (A cynical retort is that they’re not too dangerous to vote, and a certain party just doesn’t like minorities voting.)
45. Many people may also find an omnipotent, all-loving deity to be irreconcilable with the modern world’s injustice and suffering, and may even have trouble imagining an all-loving deity without omnipotence. Meanwhile, the idea of an omnipotent deity who isn’t benevolent is unlikely to have much appeal.
46. Silphium, the plant widely believed to have been used for this purpose for much of humanity’s history, is generally thought to have gone extinct around the first century CE.
47. I’m also fine with “Romeo and Juliet” laws excluding adolescents of sufficient age from culpability for consensual acts with each other, but there are also limits to how young is acceptable and how large an age gap is acceptable. Many minors may not even be prepared for sex; I certainly wasn’t in my adolescence. Many things that are acceptable for most adults may not be acceptable for many minors; ultimately, I suspect too many psychological factors are at play here for a single universal standard to provide an adequate summation of the ethics of teen sexuality. I will leave it at that and, from here on, focus on the conduct of adults.
48. They’re still not perfect and may never be, but unplanned pregnancy rates have been falling for decades and usually rise only when abstinence-only sex education causes reduced or inept contraceptive use.
49.

To be clear, some non-pornographic films have still featured sexual assaults during filming. There is little doubt that Last Tango in Paris (1972) and Crash (2004) featured unsimulated sexual assaults, and El topo (The Mole, 1970) may also have, depending upon how credible one considers actor/director Alejandro Jodorowsky’s initial account of filming. (Jodorowsky seems to be the only primary source suggesting he assaulted costar Mara Lorenzio; I have been unable to locate any record of Lorenzio ever commenting or pressing charges, and indeed, scant biographical information about her in any form. Jodorowsky later retracted his initial claim, stating that he and Lorenzio engaged in unsimulated but consensual sex; as he is a non-native English speaker with a history of using English imprecisely and making dubious, outlandish claims about his life and work, all of his commentary on the film is open to interpretation. It does appear almost certain that several dozen rabbits were killed during filming – this was before the days of “no animals were harmed,” which first appeared in 1972 – and even if the film doesn’t actually depict an on-camera assault, its attitude about the subject is still awful.)

For the record, I haven’t seen any of these films, and probably won’t ever. This endnote is more to note that the distinction between pornography and other forms of cinema is thinner than people usually think.

50. Before Miller, the standard was Roth v. United States (1957): as Tom Lehrer put it, “As the judge remarked the day that he acquitted my Aunt Hortense/‘To be smut it must be utterly without redeeming social importance.’” The last five of these words are a verbatim quote from Roth. Before Roth, any material that tended to “deprave and corrupt those whose minds are open to such immoral influences” could be considered obscene and banned, including works by authors like D. H. Lawrence, James Joyce, Honoré de Balzac, and Gustave Flaubert.
51. I’m using the term here in its colloquial rather than its clinical sense. I’m including this clarification because at other points in this writing I will refer to clinical narcissism.
52. It also resulted in a horrifying but entirely predictable epidemic of people using his ‘election’ as an ‘excuse’ to commit their own, similar sexual assaults; reported offenders have been as young as ten. The irresponsibility of 2016’s media coverage very likely can’t be overstated.
53. PolitiFact isn’t perfect either, but its rankings overall echoed reality much more closely than news stories did.
54. The Podesta ‘hacks’ (actually a case of social engineering) were one of the most ridiculous cases of miscommunication in recent memory: Podesta suspected a phishing attack and emailed his IT team, and they sent back an ambiguously worded reply that he interpreted as an instruction to click one link to change his password, when they meant to tell him to click another. It’s well known that poor communication causes problems, but this still goes beyond the extent of what most people think is possible. Given how close the election was in several decisive states (if you switch around 100,000 votes from the president* to Clinton in Pennsylvania, Wisconsin, and Michigan alone, Clinton wins the election), it’s entirely possible that a better worded response by Podesta’s IT team would’ve resulted in a Clinton victory.
55. Emails marked ‘classified’ were either sent erroneously to the server and without proper marking, in three cases, or marked as such after the fact, in literally thousands of others. While the media’s “Clinton Rules” (everything is worse if a Clinton did it) were a known quantity going into the election, the server’s incongruous impact on the election was nonetheless a black swan event I don’t believe anyone could have predicted. The email ‘scandal’ was based on effectively no substantive wrongdoing by a Clinton, just as Whitewater was.
56. A tech of the hosting company Platte River Networks appears to have deleted them in March 2015, in direct contravention of an order from Clinton’s legal team that the archive be preserved due to a pending subpoena; Clinton had already supplied her work-related email to State in December 2014.
57. Another side effect of this election is that I’m now extremely paranoid about email. I’m sure I’m not alone.
58. Similarly, “the Affordable Care Act” polled much better than ‘Obamacare’ for years, even though they’re the same thing; this likely can also be blamed partly on public agnoia and partly on poor media coverage.
59. The most famous quote attributed to Burke, “The only thing necessary for the triumph of evil is that good men should do nothing,” does not appear in his writings, though he did express a similar idea:

“When bad men combine, the good must associate; else they will fall one by one, an unpitied sacrifice in a contemptible struggle.”
Edmund Burke, Thoughts on the Cause of the Present Discontents (1770)

The first attestation of the modern form was in 1920 by the temperance crusader Sir R. Murray Hyslop, who attributed it to Burke. Quote Investigator has more.

60. In point of fact, a friend once enrolled me in the Communist Party as a joke, so that I couldn’t legitimately claim to have never been a member of it. However, I can at least claim to have never enrolled myself.
61. This only applied to the highest bracket, to be clear: $200,000 for single people; $400,000 for married couples. Adjusted for inflation, that’s $1.7 million in today’s money for individuals and $3.4 million for couples. Income below this was taxed at lower rates, even if one qualified for the highest rate.
62. Krugman used, and I’m repeating, the term gaslighting rather than merely lying because the media specifically blamed others for their own conduct in direct contravention of established facts, a tactic abusers deliberately use to make their victims question reality itself. I think it’s unquestionable that the concept of truth is currently under attack; the president*’s administration is a worse offender, but at this point I consider the media accomplices. Vavreck’s editorial also implicitly admits that the media consistently covered Clinton’s server even when there were no new details, which would be a stunning admission of guilt to someone more self-aware.
63.

While we’re on this topic, I should note that the producers of La La Land were unfathomably gracious at what must’ve been a truly horrible moment for them, though it probably helped soften the blow that they’d by all accounts genuinely become friends with Moonlight’s cast and crew.

Theirs was a very good film, but the right film definitely won. Somehow, it does seem historically appropriate that there was an announcement mix-up, as well. Given that Moonlight was responsible for so many other firsts in the history of the awards, it might as well also have been their first major screw-up.

64.

Though to be fair, the list of greatest presidents starts with Lincoln at #1 and Franklin D. Roosevelt at #2 (despite his flaws both politically and as a human being); whoever one ranks #3 is far, far below them in stature.

[Lyndon B. Johnson and George Washington are #3 and #4 and, despite major stains on both their records – slavery in Washington’s case, Vietnam in Johnson’s – far ahead of everyone else. Anyone else who realistically could’ve been president in 1964 likely would’ve been no meaningfully different on Vietnam; no one else who realistically could’ve been president in 1964 would’ve done as much for civil rights as effectively as Johnson did. Washington’s farewell address and decision not to run for a third term established numerous precedents without which our country’s history would likely have been much darker. –Future Aaron]

65. Comedians like John Oliver, Samantha Bee, Seth Meyers, Trevor Noah, Stephen Colbert, and Jim Jefferies also frequently do surprisingly good reporting (and their humor often makes horribly depressing stories more palatable). Oliver claims he isn’t a journalist, arguing that he relies on the investigation of reporters like Harry Esteve, but his show contextualizes stories far better than many papers do, exposes others’ superb reporting to much wider audiences than it would otherwise receive, and even unearths new facts. Oliver’s piece on Ivanka Trump and Jared Kushner is a particularly good example of this. Bee has also done a commendable amount of original journalism; her interview with paid Russian trolls is particularly invaluable.
66. I realize this may sound like a joke if you haven’t visited their websites lately, but I’m being completely sincere. Teen Vogue in particular has published an astonishing quantity of fantastic political coverage lately.
67. The same blog’s David Anderson (formerly known as Richard Mayhew, after the lead character of Neil Gaiman’s Neverwhere) appears to do an equally superb job with insurance analysis, but I often find his posts beyond my level of comprehension because I lack direct experience with many aspects of insurance coverage. Many others who understand the subject better than I do have also recommended his writing, though.
68. I also feel compelled to mention the late Scott Erik Kaufman, who contributed to this blog and many others. Thanks to him, I will forever look at the visual compositions of films and TV shows in terms of eye lasers.
69. Blogs can also provide some much-needed humor when it comes to the news, as an addendum to comedians like Oliver et al. Humor’s naturally pretty subjective, but my “funny bloggers” list includes Pierce; Lawyers, Guns & Money’s Shakezula; Alicublog’s Roy Edroso; the Rude Pundit; and several Wonkette writers. (New readers may initially find Wonkette’s many in-jokes esoteric, though, which makes it a rather polarizing site.)
70. Naturally, the political left immediately reclaimed this phrase; it still occasionally appears online even today. Amusingly, the same strategist (Karl Rove) was later derided for using “math you do as a Republican to make yourself feel better”, which I feel was one of Megyn Kelly’s finest moments.
71. While in theory our corporate tax rate is high, in practice it frequently isn’t. Several major companies, including General Electric, have infamously paid zero taxes in several separate fiscal years. There are a number of reasons for this, most of which are related to tax shelters and loopholes in our tax code.
72.

Oliver Wendell Holmes, Jr., who wrote Schenck v. United States, the Supreme Court decision that popularized the phrase “falsely shouting fire in a crowded theatre”, later came to regret the decision, and his subsequent opinions undermined it. His change of heart may have been influenced by legal scholar Zechariah Chafee, who wrote that the speech Schenck condemned was more comparable to a warning that a theatre doesn’t have enough fire exits. Schenck was ultimately largely overturned by Brandenburg v. Ohio (1969), which limits the scope of banned speech to that intended and likely to incite “imminent lawless action” – i.e., riots.

People have indeed falsely shouted fire in crowded locales, often with horrifying results. The 1913 Italian Hall disaster, for example, left seventy-three dead. The worst case wasn’t even intentional; the 1902 Shiloh Baptist Church stampede caused 115 deaths after the words fight and quiet were both misheard as fire.

73. The economy’s electoral impact has been inflated. It affected the election, but racial resentment, misogyny, and/or prejudice against Muslims were most strongly correlated with support for the president*. This does not mean all his supporters were bigots, but many were. Analysis also suggests Clinton won both voters who listed the economy as a top concern and voters of all races making under $50,000 per year, meaning the media’s extensive focus on the “white working class” is at least partly misplaced (to be fair, “working class” is sometimes used to refer to culture rather than income level, but since ‘class’ usually refers to economic status and people are often trained to think of it in terms of income, I find these uses misleading and mostly useless).
74.

To be clear: I’m not worried about my professional prospects in the approaching future, even though I’ve already mentioned that my own job will be automated eventually. I’ve also mentioned being ten classes away from my second bachelor’s degree; it’s in information technology. I’ll be well equipped to find employment designing, programming, and/or maintaining the robots and nanotechnology that increasingly will run humanity’s future, and hopefully skillful enough to ensure Skynet doesn’t kill us all. Automation concerned me long before I learned my job would not be permanent (it was announced late last year). I’m indescribably apprehensive about the populace as a whole, both because I value social stability and because I have a natural affinity for the marginalized: after all, I’m still marginalized on multiple levels, and won’t forget my experiences as a member of these marginalized groups even if they cease to be marginalized. Not everyone has the ability to work in fields like technology, and society should provide for those who can’t find employment. If we don’t, the poor will become much poorer than they already are, and the outcome could be widespread war or worse.

[I feel the succeeding eight years have completely vindicated me on this point. I also feel my stance was still hopelessly naïve. Furthermore, it frustrates me to no end that society in general has yet to connect the dots between large language models (or “artificial intelligence”, as they’re colloquially known) and broader trends of automation across industries. It’s all the same thing under different names. –Future Aaron]

75.

For example, many of the Mac’s most notable features, like the mouse and the what-you-see-is-what-you-get approach to computing, were invented by Xerox – yes, that Xerox. (I’m a notorious lifelong Mac fanboy, so it should say something that I’m openly admitting Apple stole Xerox’s ideas.) Xerox’s involvement in computer development is a now largely forgotten historical footnote; in the long run, they were unable to break into the computer market for several reasons, including manufacturing costs, competitors’ existing market shares, and their leaders completely failing to foresee the future of technology.

To be fair, the last of these factors was common in that era, even among otherwise capable professionals; at around the same time, my father was part of a group that visited Washington Post editors Benjamin Bradlee and Katharine Graham with a proposal for digital content distribution, and they looked at my father’s group like they were Martians. To be clear, this was before the Internet was even a thing, so the idea was clearly well ahead of its time, but newspapers struggle with digital content distribution even today.

Moreover, while Xerox invented many of the Mac’s most important design features, many others, like the drop-down menu, the trash can, and double-clicking to open a document or program, were Apple originals. Jobs’ emphasis on the end-user experience was unprecedented in the computer industry at that time, and most existing companies could stand to learn a lot from it, whether they’re directly in the IT sector or not (even most companies that aren’t “computer companies” or “software companies” still use computers).

[I would be remiss not to mention that, despite Apple’s problematic labor issues and stances on issues like right to repair over the years, I remain a Mac fanboy eight years later.

My first computer, which I began using at the tender age of two, was an Apple IIgs. I’ve since owned dozens of Macs over the years, with my all-time favorite being the clamshell iBook, objectively the most beautiful computer that anyone will ever make. Anyone who objects that aesthetics can’t be objectively evaluated is telling us they’ve never seen a clamshell iBook without telling us they’ve never seen a clamshell iBook.

For about fifteen years, I took a detour into doing most of my work on Windows. For about fifteen years, I swore at computers vastly more often. I mostly use macOS and Debian now, and I’m vastly happier for it.

That said, a major factor in my allegiance to Apple is that they’re effectively the only large tech company with an even slightly trustworthy record on privacy and security. When the FBI demanded that they program a backdoor into iOS, they fought the order in court until the FBI backed down (after finding a third party to unlock the phone at issue). Fighting this demand was objectively the right decision: there is no such thing as a safe backdoor. Its very existence would have jeopardized every iOS user’s privacy and security, which would’ve been no more secure than the backdoor itself – and, as we’ve seen, the FBI has hardly been a bastion of good infosec practice. I have no faith that any other large tech company would’ve fought such a court order as vigorously. –Future Aaron]

76.

Among other flaws, WTA is unusually prone to the spoiler effect: third-party votes aid one’s political opponents. As a result, it produces only two viable parties almost everywhere it is used. (Parliamentary WTA systems can sometimes sustain multiple regional parties, with different parties viable in different parts of the country – the Scottish National Party is a good example: Labour is its main opposition and the Tories are completely unviable in Scotland, so Scotland still falls into a two-party system of its own. But our system is presidential and our national party organizations are strongly federalist in structure, so even that doesn’t happen here.)

[It is not accurate to say that in a presidential WTA system, a vote for a non-viable third party is mathematically equivalent to a vote for the opposition; it is, however, mathematically equivalent to half a vote for the opposition (as is not voting at all). Why? For the same reason a team falls an entire game in the standings when it loses to its division leader, but only half a game when it loses to anyone else: in the former case, it lost and the division leader won; in the latter, it lost, but the division leader didn’t win. Likewise, voting for a third party doesn’t hand the opposition an entire vote; it merely declines to vote against them – which is why we can think of it as giving them half a vote. –Future Aaron]

The only well-known system that can sometimes deliver worse results is the Borda count, which entirely falls apart if people vote strategically but works almost perfectly if they don’t; Borda himself said his system “is only intended for honest men.” Given the many untruths I’ve examined in this book, I suspect he didn’t mean us.

No perfect voting system exists: every known system fails at least one criterion used to evaluate them, such as later-no-harm (adding a later preference to a ballot can’t hurt a candidate ranked earlier on it) or the Condorcet criterion (elect the Condorcet winner – the candidate who beats every other candidate in head-to-head majority matchups – whenever one exists; voting paradoxes [i.e., when majorities prefer A to B, B to C, and C to A] have no Condorcet winner). In simulations, 3-2-1 voting seems to satisfy the most voters most often, closely followed by approval voting. Both are simple to explain, although approval is simpler (indeed, alongside WTA, it’s the simplest overall). 3-2-1 voting has voters rate each candidate good, OK, or bad, and counts votes as follows:

  1. Select the three candidates with the most “good” ratings (the semifinalists);
  2. Of those, select the two with the fewest “bad” ratings (the finalists);
  3. Elect the finalist rated above the other on more ballots.

In approval voting, voters simply vote for every candidate they like, and whoever gets the most votes wins.
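As a toy illustration (with hypothetical ballots and candidate names, and none of the tie-breaking rules a real election would need), both counting procedures can be sketched in a few lines of Python:

```python
from collections import Counter

RANK = {"good": 2, "ok": 1, "bad": 0}

def three_two_one(ballots, candidates):
    """ballots: list of dicts mapping candidate -> 'good' / 'ok' / 'bad'.
    Unrated candidates count as 'bad' (a simplifying assumption)."""
    rating = lambda bal, c: bal.get(c, "bad")
    good = {c: sum(1 for bal in ballots if rating(bal, c) == "good")
            for c in candidates}
    bad = {c: sum(1 for bal in ballots if rating(bal, c) == "bad")
           for c in candidates}
    # Step 1: the three semifinalists with the most "good" ratings
    semifinalists = sorted(candidates, key=lambda c: -good[c])[:3]
    # Step 2: the two finalists with the fewest "bad" ratings
    a, b = sorted(semifinalists, key=lambda c: bad[c])[:2]
    # Step 3: elect the finalist rated above the other on more ballots
    a_wins = sum(1 for bal in ballots
                 if RANK[rating(bal, a)] > RANK[rating(bal, b)])
    b_wins = sum(1 for bal in ballots
                 if RANK[rating(bal, b)] > RANK[rating(bal, a)])
    return a if a_wins >= b_wins else b

def approval(ballots):
    """ballots: list of sets of approved candidates; most approvals wins."""
    tally = Counter(c for bal in ballots for c in bal)
    return tally.most_common(1)[0][0]
```

With three candidates, for instance, seven ballots rating A good / B OK / C bad (×3), B good / C OK / A bad (×2), and C good / B OK / A bad (×2) elect B: all three candidates make the semifinals, A’s four “bad” ratings knock it out of the finals, and B beats C head-to-head, five ballots to two.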

77.

As long as we’re reforming our elections, let’s go all out:

  • Consign the Electoral College and Citizens United v. FEC to the historical rubbish bin they deserve.
  • Uniformly replace first-past-the-post with approval voting.
  • Add new proportionally elected national and state representatives.
  • Strictly prohibit gerrymandering.
  • Universally register all legal adults to vote.
  • Universally reënfranchise all rehabilitated convicts.
  • Publicly finance elections.
  • Give all eligible citizens free national voter IDs.
  • Puppies for all. (This one is a joke… or is it?)

Only a few of these would require constitutional amendments; the National Popular Vote Interstate Compact, for instance, could neutralize the Electoral College without one.

[The shortest-split-line method is an excellent way to prevent gerrymandering:

  1. To divide a map into n districts, pick a + b = n, with a = b if n is even and a = b + 1 if n is odd.
  2. Draw the shortest possible straight line that splits the map’s population in the ratio a : b.
  3. Repeat the process on each resulting region (dividing one into a districts and the other into b) until every region is a single district.

This process is mathematically unbiased; the only way to manipulate it is to manipulate the census itself. Which, to be fair, the current administration is trying to do.
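As a toy sketch of the recursion (a real implementation works on a two-dimensional map and must search for the shortest dividing line; here the “map” is a hypothetical one-dimensional strip of census blocks, so the only decision is where to cut):

```python
def split_districts(blocks, n):
    """Recursively divide a strip of (position, population) census blocks
    into n districts using the a:b population ratio from the
    shortest-split-line method. In this 1-D simplification, the
    'shortest line' is just a single cut point between blocks."""
    if n == 1:
        return [blocks]
    a = (n + 1) // 2              # a = b if n is even, a = b + 1 if odd
    b = n - a
    total = sum(pop for _, pop in blocks)
    target = total * a / n        # population the first a districts should hold
    # Choose the cut whose left-side population is closest to the target
    best_cut, best_err, running = 1, float("inf"), 0
    for i in range(1, len(blocks)):
        running += blocks[i - 1][1]
        if abs(running - target) < best_err:
            best_cut, best_err = i, abs(running - target)
    return (split_districts(blocks[:best_cut], a)
            + split_districts(blocks[best_cut:], b))
```

For example, six equal-population blocks divided into three districts come out as three contiguous pairs, exactly as the unbiased ratio rule demands.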

Also, I have a hot take: If you pay income tax to the United States government and are not currently serving a prison sentence for a violent crime, you deserve a say in our elections whether you’re a citizen or not. I seem to recall a tale of colonists throwing tea into a harbor once upon a time to protest what they termed “taxation without representation.”

(I wanted to close this with a sarcastic “Sounds apocryphal.” Unfortunately, I ran into a corollary of Poe’s Law: it’s impossible to write sarcasm that someone won’t mistake for sincerity.) –Future Aaron]

78. Oddly, one of the few nations that accepted Jewish refugees during WWII was Rafael Trujillo’s Dominican Republic, apparently because Trujillo considered us white. He was still monstrous overall.
79. In reality, acts like the economic stimulus and the Detroit bailout likely alleviated the crisis substantially (compare to Europe, whose misguided ‘austerity’ policies led to a comparatively sluggish recovery), but voters blamed Obama anyway. The Great Depression lasted from 1929 to 1941; the New Deal alleviated it (a cutback in its policies caused a 1937-1938 recession), and World War II ultimately finished it off.
80. There are present-day societies that could be considered to operate along largely anarchist lines, such as the Zapatistas’ autonomous municipalities and Councils of Good Government in southern Mexico (regions seized by indigenous rebels in both rural and urban areas) and some communal assemblies in the Rojava region of northern Syria, also known as Western Kurdistan (inspired in part by Murray Bookchin’s political theories).
81. For example, one of Makhno’s accusers was a former disciple who’d had an extremely acrimonious falling out with him and had made no such accusations beforehand.
82. It’s also why I’m not addressing common questions like “how would anarchism handle violent crime?” in depth. There are detailed answers to these questions already; TV Tropes, of all websites, has a thorough, readable primer on anarchy in its Useful Notes section with links to other resources (I particularly recommend An Anarchist FAQ). However, we’re unlikely to see the day when such questions need addressing.
83. Other organizations that do fantastic work have also received donation spikes recently, like the SPLC, the Trevor Project, RAINN, and Planned Parenthood. John Oliver’s first post-election episode recommended donating to several worthy nonprofits, which all received influxes of new cash after it aired.
84.

Lovecraft wrote in a letter to Catherine L. Moore dated 1937-02-07:

Holy Hades—was I that much of a dub at 33 ... only 13 years ago? There was no getting out of it—I really had thrown all that haughty, complacent, snobbish, self-centered, intolerant bull, & at a mature age when anybody but a perfect damned fool would have known better! That earlier illness had kept me in seclusion, limited my knowledge of the world, & given me something of the fatuous effusiveness of a belated adolescent when I finally was able to get out more around 1920, is hardly much of an excuse.

Lovecraft expresses revulsion at his past actions (“anybody but a perfect damned fool would have known better”, “fatuous effusiveness of a belated adolescent”), states several reasons they were wrong (“haughty, complacent, snobbish, self-centered, intolerant bull”), and explicitly calls his own explanation of his behavior “hardly much of an excuse”. A lot of people today could stand to learn from this apology.

To my knowledge, this letter was only recently published and is not yet especially well known, so public perception of Lovecraft hasn’t yet caught up. His political shift from arch-reactionary to socialist is quite a marvel to behold; the Great Depression and the New Deal seem to have been its most important catalysts. It’s fascinating how his themes shift as his politics evolve: works from this period (e.g., At the Mountains of Madness, written 1931, published 1936) are open to several interpretations. I think this is one reason he’s one of my favorite horror writers.

Back to top · Table of contents · My portfolio · Contact me · Website index