The Lucas Effect

by Jemma Johnson-Shoucair | Diagnoses | Spring 2021

Image by Katie Frevert

Hubris, tragic boredom, and the groundbreaking digital effects technology behind one of the biggest letdowns in 21st-century cinema.


I frequently lack the confidence to fail. Twenty-one and a recovering perfectionist, I start any new project with anxiety close at hand. Am I good enough to be doing this? jabs at the back of my skull. If I might be terrible, why would I even try? flutters around with thoughts of the pandemic and remembering to feed my cat. But for some people it’s easy. Standing in the public eye, they swagger up to the metaphorical plate, put their dreams on the line, swing, and miss. Strike after strike. Unfazed. I want to learn this technique. To grin in the face of ridicule and trust my gut, no matter how misguided. Almost a year into quarantine, I found an unlikely teacher lurking deep in the Disney+ streaming options.

Normally, I like bad movies, but George Lucas’s writing could drive anyone to drink. It was November 2020, deep in the bowels of isolation, and my friends and I hoped to maintain our sanity by watching all nine Star Wars movies in order. Our last coping mechanism quickly turned into a painful drinking game when we reached the dreaded Episode II: Attack of the Clones. We looked up rules and made some of our own. Drink when a lightsaber is drawn; drink when someone is beheaded; and the one that dragged us through the movie: drink whenever there is an awkward moment between Anakin and Padmé. 

Anyone who has seen the prequels can guess how our night ended. And anyone who has seen the prequels might also have this image charred into their brain: Anakin and Padmé flirting in a lush, green hillside meadow. Even while inebriated, I knew something was off about this movie. Beyond the terrible dialogue, the computer animation appeared jarring, but I couldn’t place why. The meadow scene stands out as one of the only moments in the movie that seems unaltered—no green screens, no animation—but that assumption would be wrong. To its detriment, Attack of the Clones was one of the first movies made to have every frame of every shot carefully digitally enhanced.

Released in 2002, Attack of the Clones reflects the ideas of its time. New technology was everywhere—in music, phones, film—and people were just learning how to harness it. The consensus in pop culture was excitement: more is more. More computer-generated images (CGI), more over-synthesized pop music, more drama. The year 2000 marked a new era of entertainment with the creation of digital cameras designed to replace their film counterparts in the movie industry. Attack of the Clones hit the box office charts as one of the first movies ever filmed 100% digitally, pairing digital cameras with digital effects. The result? A revolutionary film exemplifying the importance of failure. It is a product of its time. A product of two years’ worth of computer animation, bewildered actors in plain blue rooms, optimistic fanaticism, and a couple of very confident white men.

***

Entranced by the original Star Wars, fans around the world stood in line to watch their favorite characters come alive again in the prequels. Many people left the theaters disappointed and confused, but still trekked, with little optimism, to sit through the next movie. Soni Gupta saw the original Star Wars as a child. “We had never seen anything like it,” she tells me from my computer screen. She describes the wonder she felt from watching the X-wings race through the Death Star trench, tense and exciting. And how she watches each new Star Wars movie, including Attack of the Clones 25 years later, “wanting to recreate that feeling… and it never does.” But even now, any time another sequel gets released on the big screen, Soni and her group of faithful Star Wars friends journey to see it, still hopeful.

Another Star Wars fan, Hal Sundt, was 12 years old when he saw Attack of the Clones in the theater. Thrilled at the prospect of the prequels, he entered the loud, dark room with high hopes. He returned devastated. Years later he tells me over Zoom, “I do distinctly remember walking out of Attack of the Clones being like, ‘what the hell was that?’” 

Every Star Wars fan I know chases the same feeling. With each new movie there is a moment of stillness when the lights dim, and words slide out across the screen. Your heart lifts as you give in to the reality of aliens and humans fighting for a peaceful future led by a Republic protected by magical monk cops. But the prequels disappoint, with Attack of the Clones the leading offender. Soni and Hal were at very different ages and stages of life when they watched the second Star Wars movie, and they both use the same word to describe it: “unmemorable.” 

***

Haunted by the startling computer animation, I needed to confirm the tugging feeling in my gut that Attack of the Clones held more odd production secrets than your standard bad action movie. Mark McGuiness speaks in a charming Northern Irish accent that’s subtle enough for some Americans to think he’s Canadian. He lives in Belfast and is in his eighth year working as a special effects technician in the film industry. It’s about 1:30 P.M. (6:30 P.M. in Ireland) on St. Patrick’s Day, and in between statements about Star Wars, he sips his Guinness to celebrate. Some projects Mark’s worked on include Game of Thrones and The Northman, and as a nerd and “film buff,” he excitedly agreed to talk to me about one of the worst Star Wars movies. “Somebody had to kind of do it first for everyone to realize that it’s just not how you really want to make a film,” Mark says, regarding the unique and unfortunate use of CGI in Attack of the Clones. I reached out to him about a month after I began my research, already deep into a rabbit hole, with everything I learned increasing my suspicions that there was something off about this movie. It only took about 10 minutes of geeking out with Mark to confirm my intuition.

Attack of the Clones was revolutionary in its use of computer animation, the process of digitally rendering moving images. In 2002, Hollywood animators were just beginning to refine the computer animation technology developed in the late ’60s. Looking back at Episode II now, its digital effects are laughable. But the first computer animation had been released just 35 years prior. Compared to the slow, shaky lines of the 1967 computer animation Hummingbird, a fully animated Yoda jumping around with a lightsaber rivaled witchcraft.

Unlike computer animation, green screen is an ancient film technique. Green screen, or matting, consists of filming against a single-color background so the foreground image can be extracted and a new background swapped in. Frank Williams patented a black-backing matte process in 1918; it was used in the ’30s horror movie The Invisible Man, while ’20s Disney cartoons relied on a white matte background. In the ’50s and ’60s, engineer and inventor Petro Vlahos invented the basis for all blue and green screen technology we have today. In the Star Wars prequels, Lucas used matting in a completely new way. He filmed actors inside plain blue rooms and relied on visual effects artists to fill in the rest, a process made easier by the creation of digital cameras.

Sony and Panavision built the HDW-F900 digital cinema camera for the filming of Attack of the Clones. Writer and director George Lucas thought that digital film, with cameras relying on sensors instead of celluloid, looked better, and he set out to create a film 100% digitally. Besides the debatably more attractive image from the still-developing cameras, this tech also offered new speed. Film cameras required scanning the developed film into a computer before digital effects could be rendered. The HDW-F900 spat out a cassette tape, and within 50 minutes the images could be edited on a computer. To a director pining for digital effects in every shot, these cameras were the future.

When George Lucas began the original Star Wars trilogy in 1977, he felt constrained by traditional film practices. Shooting on film requires a certain order for capturing and editing shots. Digital cameras paired with CGI allowed Lucas to put together each shot element by element, the way he preferred. Regarding the assembly-line-like process of traditional filmmaking, Lucas said, “I don’t work that way. I’d much rather kinda go around, put things together, look at them, then move them around again, then look at them until I get them the way I like them.” Lucas likened this process to painting and cooking. He worked to construct a whole image from various elements, sometimes taken years apart, rather than capture a scene in a single take.

When I tell Mark that the frantic “Droid Factory” scene with Anakin and Padmé, an example of quintessential Lucas filmmaking, took four and a half hours to shoot, his mouth hangs open for a second. “Wow. That’s inSANE. I remember spending 12 hours filming a bush in Game of Thrones… So the fact that they did that in four hours is…” He trails off. Lucas directed the Droid Factory scene as a fast-paced action sequence in which our heroes, along with C-3PO and R2-D2, fall onto a droid-making assembly line and have to frantically escape without being maimed by any of the deadly-looking construction machines. The scene wasn’t even in the original cut of the movie, but Lucas added it in reshoots because he wanted a fast-paced section to break up all the dialogue-heavy scenes. It was filmed so quickly in part because the set consisted of a single blue conveyor belt on a blue stage. The actors then ran around fighting, jumping, and interacting with their invisible environment. At one point, Natalie Portman, who plays Padmé, pauses from running on the conveyor belt, looks down at Lucas, and says, “This is ridiculous. This is just a mean joke. This isn’t part of the movie at all.”

Lucas responds with the confidence of someone financing his own movie franchise: “It will look good.”

***

Layering images in film is nothing new. Walt Disney composited animations over people in his first cartoons, and George Lucas used miniature models as set pieces in the original Star Wars. But the editors working on Attack of the Clones added a whole new, well… layer. When I started researching this film, I was under the impression that the movie was created with an insane amount of blue screen. This is true. But many of the indoor sets were also built as scale models. The Kamino set is an example of this. Lucas filmed most of the scenes set inside the buildings on the water planet Kamino in an entirely blue room, giving the actors general guidance as to where to walk and look. Then, the art department built a to-scale miniature of the set. 

For anyone who hasn’t seen the movie, imagine a long, bright glass hallway with sterile white flooring and crossbeams. Once the model was built, the crew shot its interior as if it were a regular set. Then, George Lucas-style, the shots were compiled in a computer, where the actors were virtually placed into the model and the blue screen outside the model was digitally replaced with a background that suited the set. Finally, everything in the scene was digitally enhanced to create the look Lucas wanted. To Lucas’s credit, this process does save time and money. But to his discredit, most of the characters in these scenes appear as if they were placed in a strangely lit sci-fi drawing.

Mark points out that when you watch these scenes, “It’s almost like you’re seeing them pass through [the set]… There’s no texture, there’s no weight.” Weightlessness has an incredible effect on our perception of reality, especially in an art form dedicated to illusion. Humans possess a talent for distinguishing virtual reality from our own, even with the most incredible CGI. So forming scenes with multiple mediums layered onto each other gives us the impression that something is off, even if we don’t know the exact issue. I ask Mark if, in all his eight years of working in cinema and 29 years of film buff-ery, he’s ever heard of another movie produced in this way. He offers up a validating and unsurprising response: “That seems to be very unique to that film.”

***

Much of post-production was dedicated to creating digital characters in a “real world.” Against the keen human eye, visual effects artists must work diligently for a digital character to blend seamlessly into a scene with real people. The film featured multiple fully animated characters, including the slender gray-blue Kaminoans, the infamous fully digitally rendered Yoda, and Dexter, an alien who is present for one scene. Dexter is a large tan creature with four arms, a saggy chin pouch, scaled head ridges, and a mustache of human hair. He only exists to tell Obi-Wan Kenobi what planet a poisonous dart came from. In a scene populated mainly by extras wearing alien costumes, Dex feels out of place. While the animators were excited to create a completely computer-animated creature, there isn’t much payoff in the film. Not only does Dex move in an unsettling way, but the animators faced a problem when Obi-Wan and Dex hug. During filming, Ewan McGregor, who played Obi-Wan, was instructed to hug the air. When the animators formed Dexter in post-production, they discovered an issue: McGregor’s arms didn’t line up with Dex’s body. Their solution? Animate McGregor’s arms to fit the shape of a digital character.

People were paid to animate McGregor’s body an unnerving number of times throughout the film. Teams of animators worked to recreate his entire body for certain action shots from the slippery fight against Jango and Boba Fett outside the cloning facility. There’s even a rumor that his beard was computer-animated in some scenes, because its consistency and texture change noticeably from shot to shot. While the beard could be chalked up to bad makeup, Lucas’s choice to fully animate Obi-Wan multiple times was highly unusual. At the time, digital stunt doubles were reserved for moments when a stunt performer couldn’t safely do the job, like the people falling from the deck of the Titanic as it sank. Even now, the incredibly CGI-heavy Avengers franchise opts for real-life actors and stunt doubles rather than digital ones. Lucas’s attitude towards this bizarre choice to repeatedly animate a main character can be summed up by his response when asked why he used only digital cameras for the movie: “People say why am I doing this? You know, the real question is why not?”

***

One of the first scenes I mention to Mark is Dex’s diner. He immediately agrees that it’s one of many unnerving moments in the film and points out the fake harsh sunlight dominating the characters’ faces throughout the scene. “The technology really wasn’t there… [CGI] should be used to help further a story as opposed to just building everything around it.” Mark laments that we loved the original Star Wars movies for the characters. Han Solo, Luke Skywalker, and Darth Vader uplifted our childhoods more than the environments of Endor, Hoth, and Tatooine. So in revolutionizing digital filmmaking, Lucas sacrificed our beloved characters for not-that-impressive backgrounds. We watch characters walking, talking, and sitting for most of the movie to allow for fantastical, unrealistic landscapes. The world grew to accommodate digital effects, instead of digital effects enhancing the world. 

Two years of animation development culminated in one of the most forgettable scenes in the movie. Lucas wrote the film excited to finally depict the hallowed “Clone Wars,” rooted in Star Wars lore and referenced in the original film, A New Hope. In the third act, for about eight minutes that feel like 20, we get to see the anticipated Battle of Geonosis. The beginning of the Clone Wars. Rob Coleman, the animation director, describes the scene, saying, “It had everything that we as teenagers of the ’70s and early ’80s saw in those original movies, and that’s what you do it for.” All the directors, from animation to animatics, were ecstatic to finally watch the epic battle play out. Except they were so excited to create the battle that they missed an incredibly important part of Star Wars: the characters. Instead of observing our heroes navigating a treacherous battlefield, we sit idly as CGI clones and droids destroy each other, only occasionally cutting to the reactions of people we care about. Once again, we witness the creators of the film sacrifice our connections to characters so they can use visual effects to play out their childhood space-war fantasy.

The fact that the men creating Attack of the Clones believed they were doing something incredible is more tragic than Anakin and Padmé’s forbidden love story. My heart goes out to the excited Star Wars fans who went to the premiere of this movie only to get dragging CGI battles with confusing context. I must’ve been around 10 years old when I saw Attack of the Clones for the first time, but that didn’t stop me from feeling disappointed. Now, 11 years later, I am incredibly frustrated that Lucas and company passed over an incredible story in favor of unconvincing visual effects. During our interview, Mark sums it up best: “If I grew up in the ’60s and was a child in the ’70s and watched Star Wars, and then was an adult watching [the prequels] I would feel so betrayed.” Unfortunately, the directors were so enthralled with manipulating their current technology that they lost sight of what made Star Wars more than just another action series.

To be fair, no one should be surprised. George Lucas got lucky with the original trilogy. We could look past the clunky writing in the ’70s and ’80s to enjoy, as Mark puts it, an “operatic space western” filled with tough, relatable, and entertaining characters. But Lucas had help making these films hits. He only directed the first film, and his now-ex-wife was responsible for editing the originals into the series we know today, including the Death Star trench battle and the decision to kill Obi-Wan Kenobi. Now we have the Lucas-run prequels. With the success of the originals behind him, Lucas himself acknowledged, “Very rarely do I not get what I want.” No one wanted to say no to Lucas, a white man who strides through life knowing that failure usually has fewer repercussions for him. Ben Snow, the visual effects supervisor, even admitted to feeling like part of a weird science experiment, given how the animators were pushed to further digital technology. From the start, digital effects were more important than any connection with the story. And so we lose our love for the ancestors of Luke, Princess Leia, and Han. We lose the excitement and tension of X-wings veering across the Death Star, and we lose the feeling of Star Wars. All for one man pushing to revolutionize a field years too soon.

***

As with most tragically boring films, Attack of the Clones hides heroes in the most unexpected places. The protagonists of this story sat behind computers, drawing tables, and workbenches. Under the misguided direction of George Lucas, a team of 60 animators and 340 artists and technicians labored tirelessly to draw this fantastic failure of a movie into existence. Ironically, thanks to the visual effects staff, some shots had just the right amount of CGI. For example, in the Droid Factory scene where our characters dodge invisible metal stamps and escape from cauldrons, you need to look closely to see that our favorite droids C-3PO and R2-D2 are computer-animated. In a movie saturated with unnecessary effects, there are about eight shots of subtle reprieve: it feels like our Star Wars again. Amid the smoldering pile of ashes that is Attack of the Clones, the visual effects staff laid a framework for future animators to adopt. It only took failure in approximately 1,992 other shots to get there.

I no longer want to strive for the George Lucas swagger, swing, and miss. White men have always dominated writing, directing, and starring in action movies. Their presence in this genre is the norm, and it’s an exception for anyone else to be allowed the same visions and mistakes. Lucas and his team of directors pushed the digital frontier bolstered by the prospect of fewer financial and social consequences if they failed. We have Attack of the Clones as a result. A mess of a movie, flaunting the hubris of its directors in our faces as we suffer their consequences. Sure, failure means less when this is your ballgame, your plate. Your fans will cheer you on no matter what. But you lose an important skill along the way. True growth comes from failure, but you can’t learn when the score is rigged for you. 

Instead of a white man kind of confidence, I hope to cultivate the animators’ quiet and passionate determination. It takes a lot of love for what you do to sit behind a desk for hours staring at your hand, wondering how Yoda would move his. Sitting with a problem and trying, trying, trying until something clicks. Advancing technology through something you love instead of acting out of ego and desire for fame. This method of failing is a whole new ballpark. I hold such appreciation for the people who made the harsh lighting in Dex’s restaurant, Yoda’s unnerving wobbly ears, and Ewan McGregor’s arms. They were people manipulating an art form that originated from slow, squiggly lines. We can look back now, laughing and ridiculing their work, only because others repeated their successes and created their own failures. Now we have faster-moving and better-lit squiggly lines, thanks to animators who spent hours tirelessly trying to create something new, innovative, and revolutionary.

(Re)Creating the Past

by Sam Schuman | Diagnoses | Spring 2021

Stella Mulroney, Undoing

The distinction between “correct” and “true,” or what we talk about when we talk about “history.”


In elementary school, I loved few events more than the Scholastic Book Fair. The Halloween costume parade and Field Day were a treat, but they paled in comparison to giving up a whole class period to venture down to the library (or sometimes a requisitioned art classroom) where I could revel in the glossy covers advertising the latest and greatest in kids’ lit. It strikes me now that this is a relatively wholesome way to transform children into consumers, but I digress.

I was a bookworm, always finishing my classwork early so I could head over to the library nook and bury my nose in How to Eat Fried Worms, or an installment of the Boxcar Children. The Book Fair was the logical next step: a whole room lined wall-to-wall with shelves and tables advertising all manner of material for the up-and-coming reader, from the Magic Tree House to Frindle.

Every year, I looked out for the 10 True Tales series, written by Allan Zullo. The conceit was self-explanatory: each book contained 10 nonfiction stories organized around a theme. Some topics were intense but unobjectionable: Young Survivors of the Holocaust, Surviving Sharks and Other Dangerous Creatures. But zoom out a bit, and a recurring focus emerges: Teens at War, Battle Heroes: Voices from Afghanistan (and a similar book for Iraq), D-Day Heroes. Many of these books are essentially nationalist military histories, recounting deeds of heroism committed by intrepid GIs as they fought for the American way at home and abroad. Reading the series as a kid, I hung on every word, picturing the battles that Zullo narrated at the pitch of fiction. Children don’t read books with a critical eye to ideological framing; Zullo called these men heroes, and I believed him.

Teens at War is typical of the series. The description from Zullo’s website starts out alright: “Ever since the American Revolution, teenagers have risked their lives to serve in every war this country has fought.” A paragraph later, though, some out-of-pocket framing emerges: “In warfare, most underage soldiers showed their zealous spirit and raw courage, but few were properly prepared for the horrors they would experience.” We’ve now entered what seems to be a pro/con list for letting children serve in the military, although we’re never quite told whose judgment is being applied. The next sentence describes these minors, as young as 12, as “warriors.” Scholastic’s publisher’s description labels their military service “patriotic” and their stories “inspiring.”

With the benefit of hindsight, I see 10 True Tales as pretty gross. But, ideological window-dressing aside, these books are, in an objective sense, accurate. The series boasts that it is “based on true events ripped from the headlines or taken from little-known moments in history.” And that’s the problem: these books are sold to kids as “history” because the events are “true,” which tacitly implies that their rhetorical framing as heroic, inspiring narratives is also somehow “true.” Zullo admits to dramatizing events and recreating dialogue (which sometimes includes racial slurs, “for realism”). But there is still a false consonance between factual veracity and narrative validity in how these “true” tales are presented. And while a kids’ author like Zullo might seem an unlikely point of entry for a screed on the blurry line between historical fact and truth, this is exactly where much of the trouble lies: to make the past accessible, works of popular history conceal the process by which masses of historical documents are converted into ideologically active stories. To understand this process, it’s important to ask: apart from telling a “true” story, what does history do, and what is it for?

***

The American historian Hayden White spent 10 years researching and writing in order to offer a possible answer in his 1973 book Metahistory: The Historical Imagination in Nineteenth-Century Europe. The book studies a number of influential historians and philosophers, tracing the development of the discipline and the idea of “history” across the 1800s, but its analytical framework is fundamentally atemporal. It is a study of history as a rhetorical practice of writing and storytelling, or a “verbal structure in the form of a narrative prose discourse,” in White’s own heady phrasing.

White is concerned with how history, as a linguistic construction of the past, is created through a specific mode of thinking which he terms the “historical consciousness,” or how past events are strung together to create a recognizable historical narrative. Historical consciousness manifests in the formal aspects of a historian’s narrative, evident in their choice of “emplotment” (what sort of dramatic plot arc does a historical narrative take? Comedy? Tragedy?) as well as their historical “grammar” and “syntax”—in other words, the process by which historians fit the past into a coherent story. This formal question of how histories are structured is fundamental to what White calls the “problem of historical knowledge”: what does it mean to think about something “historically,” and what is the point of doing so?

It’s here that we arrive at the distinction between a “fact” and a “truth”—or, to be snarky about it, the difference between something being “wrong” and “dumb.” Metahistory argues that a historical narrative always implies an ideological perspective by virtue of the way it is told. Any history will have characters, and some of those characters tend to emerge as heroes or villains, at least relatively. Certain historical entities are identified as problems or obstacles, and consequently more or less ideal. To take Zullo as an example: American child soldiers are heroes, and anyone trying to kill them is a villain. Other countries are the problem, and the American military is the solution. A historical story is told through facts, but its “truth” occurs at what White terms the “precritical” level. The historian must decide what kind of story to tell before telling it. 

The historical discipline differentiates between “history” and “historiography.” Catch-all definitions are unwieldy, but broadly, “history” is the study of the past, and “historiography” is the study of the historical discipline and its methodology. A “history” is one story, built on specific historical evidence and most often presented as a linear narrative. It attempts to explain why a particular thing happened in the way that it did. Historiography encompasses many histories, and often explains why many things happened the way that they did. It is, as White held, a fundamentally existential pursuit: a particular historiographical viewpoint amounts to an argument about the way the world works. And it’s at this level that history might be “correct” but also “wrong.” Historical whos, whats, whens, and wheres are often settled, and it’s fair to judge a history as more or less accurate on those grounds. But the historical why is virtually never a provable fact. It’s a product of interpretation and argument.

***

At my parents’ house, there is an entire shelf dedicated to housing a series of nonfiction books that my dad grew up reading in the ’50s called Landmark Books. Published between 1950 and 1970, the series employed well-known contemporary authors, some of them Pulitzer Prize winners and not one of them an academic, to cover a wide range of American historical topics, from Paul Revere and the Minute Men to The F.B.I. to a Shirley Jackson-penned telling of The Witchcraft of Salem Village. They’re packaged as factual histories, and their perspectives are exemplary of post-WWII American historiography, with all of its assumptions about American exceptionalism and a triumphalist notion of historical progress.

The 54th book in the series is Robert E. Lee and the Road of Honor, written by journalist Hodding Carter. According to one biography subtitled “The Reconstruction of a Racist” (note the implied redemption plot arc), Carter had been a white supremacist until graduating college, after which he fought for an end to Jim Crow. Authorial intrigue notwithstanding, the book is essentially a hagiography of Lee, chronicling his life from birth to death as a “great American who was guided by something he believed to be the most precious quality in life… a sense of honor.” The book is researched: it quotes primary-source letters at length and offers plenty of historical tidbits about Lee’s upbringing. It gets the “facts” right. The problem is that those facts are used to turn Lee into a hero. What is emphasized is not his role as the Confederacy’s military leader, but the admirable “sense of duty and honor” to his home state of Virginia which compelled him to side with the South. “Honor”—a term never explicitly defined—is used to separate Lee’s assumed motivations from his actions, trumpeting the former and downplaying the latter. The book’s penultimate page claims that “gallantry is our common inheritance, whether our ancestors lost with Lee or won with Grant.” As with Zullo, the facts are right, but the conclusions drawn from those facts are ideologically blinkered—relative and debatable.

Hayden White offers a solution to this nebulous problem of historical objectivity (or lack thereof) in accepting that historical meaning is ultimately subjective: it is formed, rather than found. History is not and can never be value-neutral. “The historian performs an essentially poetic act,” he writes, “in which he prefigures the historical field and constitutes it as a domain upon which to bring to bear specific theories he will use to explain ‘what was really happening’ in it.” Before explaining what a history means, the historian has to construct that history. The past is only—is always—a product of the present.

This idea was (and perhaps remains) controversial, and critics of White who decry the relativism inherent in his position have raised the polemical “Nazi question”: if historical meaning is constructed, imposed rather than essential, then on what historical grounds is one to challenge fascist historiography or even outright Holocaust deniers? White offers several responses, my favorite being his dry observation that “The Nazis were anything but relativists.” But a more instructive answer is that history is ultimately a moral and aesthetic pursuit rather than a scientific one, so fascist history can and should be dismissed precisely because it’s fascist. White cautions against treating historical revisionists “as if they were engaged in the same enterprise… instead of treating them with the contempt and derision they deserve.”

There is no objective position within history. It’s a destabilizing idea, one that denies history as a neutral proving ground for ideas. It’s impossible to argue, for example, that the collapse of the Soviet Union is proof that socialism is an unviable economic system, or, for that matter, that the Russian Revolution is proof that monarchy is an unviable political system. Those conclusions are theorized, not merely discovered. It may be true that, as The F.B.I. recounts, J. Edgar Hoover was nicknamed “Speed” in high school, and that he chose to put his own life at risk in New Orleans in 1936 when he was among the FBI agents who arrested prolific criminal Alvin Karpis. But those isolated facts only become meaningful or usable as historical “evidence” once assimilated into a broader narrative about the FBI that has its own subjective viewpoint. The book’s ultimate historical stance that “every American, young or old, can be proud of his F.B.I.” is a value judgment, not an objective conclusion.

***

Like 10 True Tales, the Landmark Books series is for kids, and it’s particularly easy to dunk on with the benefit of a half-century’s hindsight. But contemporary histories, even didactic ones, still position themselves as purely expository, containers for information sans angle or bias. My high school history textbook, the American Pageant, certainly did. Like many a history textbook, the Pageant purportedly offers an accurate history that walks a neutral line through historical debates—as if it were possible to find a stance that is not itself an implicit position. Its 16th edition starts with the “Founding of the New Nation,” and asks, “How did the colonists overcome the conflicts that divided them (assumption one), unite against Britain (assumption two), and declare themselves at great cost to be an ‘American’ people (assumption three—does this even mean anything)?” The answers: “reverence for individual liberty, self-government, religious tolerance, and economic opportunity.”

Along with this self-congratulatory telling is an acknowledgment of the dark side of the early American mentality (or at least the Pageant’s telling of it): “a willingness to subjugate outsiders,” including Indigenous Americans and enslaved people from Africa. A putative commitment to exploring both the good and bad of history obscures that the American Pageant has already made a litany of presuppositions about what constitutes “good,” “bad,” and “history.” At no point is the reader pushed to ask if there is another way to tell this story.

My high school history teachers were progressive. We read some Howard Zinn, and we were taught from the first day of our Civil Rights Movement unit that race is a construct intended to mitigate class conflict. Liberal critiques of American history were common, even encouraged. But dark historical facts never contradicted the fundamental historiographical truth of American progress, of the strength and wisdom of our institutions. Besides, even if they had wanted to (and I suspect they might have), my teachers couldn’t have strayed too far. The textbook was the textbook, and we had an AP exam to take at the end of the year. To my knowledge, only two teachers in the school assigned the Communist Manifesto while I attended. They both taught English.

I took AP United States History over two years, with a different focus each semester: social movements, war and conflict, economics, and finally a history of Revolution-era philosophy. This last focus, known as intellectual history, was particularly interesting to me at the time, and has since become my primary research interest. How did people think in the past? I learned about the Enlightenment philosophers: Locke’s and Hobbes’s theories about people in the state of nature, Rousseau’s social contract, Montesquieu’s separation of powers. We were taught that these were the seminal ideas that led to the American state, and, implicitly, that these ideas were superlatively good, if not flawless.

The buck stopped there. With few exceptions, our history of ideas began and ended in the 18th century. You’d think no one had had a worthwhile thought about government since the ink dried on the Constitution. Our philosophical history was strangely ahistorical, because it had been intensely “prefigured,” to use White’s term, intended to contextualize (and legitimize) American institutions more than to stimulate curiosity beyond the clear predetermined takeaways. Ideological questions were presented as done deals. I got As in history, and I believed that the study of history was important, but I graduated high school unable to articulate exactly why. What was the point of asking questions when the answer was the same as it ever was?

***

In my first semester at Oberlin, I started a history major, and things began to click. Professors could explain clearly why the study of history was important, why it was an urgent task. I learned that the “past” is often not really past, because historical memory is a building block of identity. I learned to look for historiographical slant: If this is the story, then what is its lesson? Hayden White is sometimes taught in Historical Methods, the major’s required methodology course. I finally figured out that the point of history isn’t to be “objective” or unbiased. Historical narratives imply a historical viewpoint, which implies a historical subject, which in turn implies subjectivity. 

Reading Metahistory for my own research this year, I learned that academic, source-based history dates back less than two centuries. Thucydides and Plutarch wrote “history” millennia ago, but their historical consciousnesses were drastically different from those of modern historians. For much of its existence, history has been a branch of politics or rhetoric. In the 19th century, the first recognizably modern historians gave the discipline its own autonomy by claiming that it could be purely rational and objective, scientific in the way that the natural sciences were. From there, history has alternately been defended as “science” insofar as historians deny any distortion of the facts, and as “art” insofar as it doesn’t have a unified formal method. Whatever it may be, our conception of “history” is itself historical. There’s no escape.

Which is all well and good. White wrote that the purpose of history is to educate people of “the fact that their own present world had once existed in the minds of men as an unknown and frightening future, but how, as a consequence of specific human decisions, this future had been transformed into a present.” In understanding how we created the present, we become better equipped to create our ideal (defined subjectively, of course) future. Such an understanding of history doesn’t foreclose upon the importance of getting the facts right. History is not fiction; its claims to reveal something about the real world only work if they attend to things that actually happened in that real world. But the facts are the beginning, not the end, of what makes history “true.” Historical narratives exist because someone wants you to see the past in a particular way, and by extension to feel a particular way about the present—and facts, at the end of it all, have very little to do with that.