Thursday, February 27, 2020

Will Trump's Supreme Court remake America?

How Will Trump’s Supreme Court Remake America?

On abortion, gun rights and more, the future could be determined by how fully the court’s new conservative majority embraces a rigid understanding of the Constitution.






In July 2013, Aimee Stephens wrote a letter to her co-workers and her employer at a funeral home in the Detroit area, where she had worked for six years. “What I must tell you is very difficult for me and is taking all the courage I can muster,” she told them. “With the support of my loving wife, I have decided to become the person that my mind already is.” After four years of counseling, Stephens explained that she was transitioning from being a man to being a woman, and so, at the end of an upcoming vacation, she would come back to work as her “true self,” wearing women’s business attire. Stephens’s boss told her that her self-presentation would harm his clients and business, and he fired her. 

In October, the Supreme Court heard a lawsuit from Stephens challenging her termination based on Title VII of the 1964 Civil Rights Act, which prohibits employers from discriminating on the basis of “sex.” When members of Congress passed the initial law, and when they later amended it, they didn’t say they were protecting gay or transgender people. The question in front of the court was whether the plain meaning of the word they chose — “sex” — did so anyway. The court also heard the claims of two gay men who were fired from their jobs. 

Stephens’s lawyer, David Cole, argued that she was fired because she didn’t conform to her employer’s ideas about gender. Stephens didn’t fulfill an “expectation that applies only to people assigned male sex at birth,” he said, “namely, that they live and identify as a man for their entire lives. That is disparate treatment on the basis of sex.” 

Justice Neil Gorsuch, who was appointed to the Supreme Court by President Trump in 2017, asked Cole, who is the national legal director for the A.C.L.U., how judges should now interpret an “old” law, written in a different era. This question is of particular importance to Gorsuch, who says he uses a method called textualism for deciding cases that involve a statute like Title VII. He believes that judges should focus only on the plain meaning of the text. When the court interprets the Constitution, Gorsuch subscribes to a similar (though not identical) theory, originalism, in which judges adhere to the meaning of the Constitution as people understood it when it was ratified. Because they are simply looking at the words before them, which they believe have a single, fixed meaning, judges like Gorsuch say their method allows their decision-making to be “value neutral” — in contrast to judges who consider a law’s purpose or consequences. 

Cole responded to Gorsuch: “We are not asking you to apply any meaning of ‘sex’ other than the one that everybody agrees on as of 1964, which is sex assigned at birth, or as — as they put it, biological sex.” He added, “We’re not asking you to rewrite it.” 

“I agree with that,” Gorsuch said. “Assume for the moment I’m with you on the textual evidence.” If all that mattered to Gorsuch was the text, Stephens and the other plaintiffs might have the fifth vote they needed — along with those of the four liberal justices, Ruth Bader Ginsburg, Elena Kagan, Sonia Sotomayor and Stephen Breyer — to win a huge victory for gay and transgender rights when the justices decide the case by the end of the court’s term this June. 

In one sense, that result would be a huge surprise. For the first time in generations, the court has a majority of five staunchly conservative justices — Gorsuch, who filled Antonin Scalia’s seat; Trump’s second appointee, Brett Kavanaugh, who replaced Anthony Kennedy in 2018; Samuel Alito; Clarence Thomas; and Chief Justice John Roberts. Expanding the rights of gay and transgender people would not appear to be on the menu. But if Gorsuch meant what he said about faithfully following the text and agreed with Cole about its meaning, it was hard to see how he could vote against Stephens. 

But then Gorsuch pivoted, with a startling question for a strict textualist. “At the end of the day,” he asked, should a judge “take into consideration the massive social upheaval that would be entailed in such a decision?” 

Cole answered that no evidence suggests an upheaval. A 2016 poll shows that 80 percent of Americans think it’s already illegal to fire or refuse to hire someone for being gay, and some lower courts have treated discrimination against transgender people as a violation of Title VII for 20 years. 

It was a telling exchange for assessing Gorsuch’s commitment to his method of deciding cases, which he has said judges should follow in every instance. Critics have long argued that originalism and textualism are riddled with inconsistencies and can be used to provide a fig leaf for results-oriented judging. In that moment with Cole, Gorsuch seemed caught between the plain meaning of “sex” and a worldview he shares — in other words, between principles and politics. 

The line between law and politics has always been blurry, and judges have often professed to sharpen it. Claims of unblinking fidelity to the text have increasingly become the crowning orthodoxy on the right in recent decades. Now Gorsuch and his conservative colleagues have a chance to harness that energy to transform the law. “The Trump vision of the judiciary can be summed up in two words: ‘originalism’ and ‘textualism,’” Donald F. McGahn II, the former White House counsel, who was instrumental in Gorsuch’s and Kavanaugh’s appointments, said in 2017 at an event for the Federalist Society, a group that has been a juggernaut for propelling the courts to the right. Placing judges on the courts is “the most important thing we’ve done for the country,” Senator Mitch McConnell, the majority leader, said last spring. He earlier promised that Trump judges (192 and counting) will “interpret the plain meaning of our laws and our Constitution according to how they are written.” 

Since the 1960s, conservatives have often derided liberal judges as “activists” who bend the law to make big changes. And until his departure in 2018, Justice Kennedy held the Supreme Court’s swing vote and (like Sandra Day O’Connor before him) restrained his fellow conservatives by forging a kind of national compromise on abortion rights, marriage equality, gun laws, the regulatory powers of federal agencies and the scope of the death penalty. 

But now Gorsuch, along with Thomas and Alito, has become “the leading edge of a second generation of conservatives who are not afraid of exercising judicial authority” — in other words, making decisions that can significantly change the law — said John Yoo, a law professor at the University of California, Berkeley, and a former Thomas clerk. (Yoo helped write memos in George W. Bush’s Justice Department that provided justification for the torture of suspects after Sept. 11.) These three justices have shown a “predisposition to swing for the fences,” Donald Verrilli, who served as a solicitor general for President Barack Obama, told me. 

The more that conservatives on the court want to overturn precedents and strike down laws, the more useful it is for them to claim a coherent philosophy that seems to merely follow the dictates of the Constitution or a statute. Gorsuch is positioning himself to push his colleagues in that direction as the public voice and salesman of originalism. Thomas, a fellow strict originalist, rarely speaks from the bench. Roberts said at his 2005 confirmation hearings that judges should “call balls and strikes,” but he didn’t explain how. Alito calls himself a “practical originalist,” picking and choosing when to apply the theory. Kavanaugh, whose record on the court is too short to reveal much, suggested in a 2017 lecture that he didn’t have one methodology by saying that “history and tradition” competed for “primacy of place” with factors like liberty and deference to the Legislature. Questions and battles over originalism and textualism will run through almost every major case the justices hear this year and beyond, and they are the key to understanding the Roberts Court. 

Originalism may sound like an old concept, but it’s actually a modern creation, one born of political exigencies. It dates to the aftermath of the Supreme Court’s 1973 decision in Roe v. Wade, which recognized a constitutional right to abortion. That ruling was not partisan: Roe was decided by a vote of 7 to 2, with five justices in the majority appointed by Republican presidents and one in dissent appointed by a Democrat. By 1980, however, the Republican Party had become more uniformly conservative, and its leaders determined that opposing abortion was a crucial way to win votes from evangelicals and Roman Catholics. The party promised in its platform that year to appoint judges who would protect “traditional family values and the sanctity of innocent human life.” 

But the pledge made it sound as if Republican-appointed judges would pursue a political agenda. Some conservative legal thinkers were uncomfortable with that overt mixing of politics and the law. In 1982, law students started the Federalist Society. The students who founded it instructed other students not to “use the adjective ‘conservative.’” Their purpose, they said, was the nonideological and nonpartisan promotion of limited government. 

To maintain their distance from politics, they needed another way to talk about abortion. Robert Bork, an early adviser to the Federalist Society and an appeals court judge, had an answer: a return to the framers’ original conception of the Constitution. Bork said Roe erred not because abortion was wrong but because it created a right to privacy that could not “be found in the Constitution by any standard method of interpretation.” In a 1985 speech to the American Bar Association, Attorney General Edwin Meese III gave Bork’s idea a wide audience, calling for judges to follow the “original intention” of the Constitution’s framers. 

Every judge begins with the text when interpreting a law, and in some ways, originalism seems like common sense. But used in isolation, it doesn’t reflect how the court has done its work for most of its history. At key moments since the country’s early days, the court has weighed the purpose and consequences of a ruling as much as or more than text. 

In a sense, the Constitution invited some license. The document gave specific instructions (“The House of Representatives shall be composed of members chosen every second year”) for the country to follow, but it also provided open-ended principles (“freedom of speech,” “property,” “liberty,” “due process”) and left questions unanswered. The Constitution itself provided no method for interpreting it — and at the Supreme Court, constitutional cases were rare, in any event. Almost all of the justices’ work has involved creating legal doctrines, case by case. Their opinions have been full of discussion of their own previous decisions much more than the Constitution. “The precedents shape the text, rather than the other way around,” the University of Chicago law professor David Strauss wrote in The Harvard Law Review in 2015. 

In the 1803 case Marbury v. Madison, the justices filled in a gap about which branch of government should be the final arbiter of the Constitution’s meaning by declaring that it was the courts’ job “to say what the law is.” In 1819, in McCulloch v. Maryland, the court had to decide whether Congress could charter a bank of the United States even though the Constitution did not explicitly say so. The justices took into account the pressing concern of the day — the need for a national army and a bank with branches across the states and territories to pay soldiers — and voted unanimously to broadly interpret Congress’s power to make “necessary” laws to allow for the bank. “Such is the character of human language, that no word conveys to the mind, in all situations, one single definite idea,” Chief Justice John Marshall wrote. 

Over the years, the justices continued to consider the demands of the moment and their own beliefs about the policies that served the country’s interests. They didn’t follow a single approach or subscribe to a particular theory, as originalists claim to do today. Early on, the court recognized a principle called stare decisis, meaning “to stand by the things decided,” which allowed it to maintain the law’s stability. If one ruling proved a mistake, later justices occasionally reversed it, or they more commonly stepped around a decision they didn’t like and gradually rerouted. 

In the 20th century, the justices continued to weigh the impact their decisions would have, increasingly taking into account science and social science. In 1954, the court made an unusual request for a different kind of briefing: The justices wanted to know about the original intentions of the framers of the 14th Amendment. In Brown v. Board of Education, the landmark challenge to school segregation, the court asked whether Congress and the state legislatures that enacted the amendment in 1868 contemplated an end to segregating public schools when they promised that the states would guarantee “equal protection of the laws,” as well as “due process” and “liberty.” The N.A.A.C.P., which was litigating the case, sent out an emergency telegram to its supporters asking for help responding to the court’s request. But in the end, the plaintiffs and their experts could not supply the original-meaning support for desegregation the court was looking for. At the time the 14th Amendment was ratified, its backers denied it would lead to desegregated schools. 

After an exhaustive discussion of the history at oral argument, Chief Justice Earl Warren did not pretend otherwise. In the end, his opinion, for a unanimous court, turned on present-day evidence about harm to black children. “We must look to the effect of segregation itself on public education,” Warren wrote. “In approaching this problem, we cannot turn the clock back to 1868.” 

Over the next decade, one justice who joined the unanimous majority in Brown, Hugo Black, spoke up for adhering to text and the Constitution’s original meaning. The way to change the Constitution, Black insisted, was to amend it, though in practice he continued to vote with the majority in many Warren Court decisions that expanded the concepts of equal protection and due process. Consistency, even for an early form of textual fidelity, proved difficult. 

In 1967, Thurgood Marshall, who led the N.A.A.C.P.’s team of lawyers in Brown, was asked during his confirmation hearings for the Supreme Court whether the court’s role should be “simply to ascertain and give effect to the intent of the framers.” Marshall said yes, “with the understanding that the Constitution was meant to be a living document.” 

The idea of an evolving Constitution, built from the language of the framers but not limited to their understanding of it, became a concept associated with liberals. Yet sometimes it has been conservatives who leave the text and the framers behind. In the 1970s, the conservatives on the court began to rule that the First Amendment protected commercial speech, like advertisements, even though it had never been understood that way before. The doctrine remains a tenet of the right. 

The repeated lesson, as Strauss argues, is that the Constitution isn’t just or even mainly its text. It’s the edifice the court has hammered together over the words, adding and renovating over the centuries. The court still spends most of its time and energy on its own precedents. In many important areas — free speech, civil rights, establishment of religion, criminal procedure and punishment — the doctrines the court has developed stray so far from an originalist reading of the text that to return to it would render American law unrecognizable. 

Antonin Scalia offered a more politically palatable version of originalism in a 1988 lecture, admitting the difficulty of applying the theory in every case and saying it could be superseded by the court’s precedents. 

In 1987, President Ronald Reagan nominated Bork to the Supreme Court, and his confirmation hearing proved to be the first test of originalism’s public acceptability. Bork argued that the Constitution provided not only no basis for the right to privacy in Roe but also no basis for banning literacy tests or poll taxes or for the standard of “one person, one vote.” Bork’s statements helped persuade Democratic senators to oppose him, sinking his nomination. 

A year after Bork’s defeat, Justice Antonin Scalia, from his safe perch on the court, offered a more politically palatable version of originalism in a lecture at the University of Cincinnati called “Originalism: The Lesser Evil.” The shadow of Bork hung over the conservative legal movement, and Scalia began by admitting that originalism was “not without its warts.” Its greatest defect, he said, was that it was so difficult to apply correctly. This required “immersing oneself in the political and intellectual atmosphere of the time.” To write an originalist opinion would take significant research “sometimes better suited to the historian than the lawyer.” 

Scalia was candid about the difficulty of applying originalism in every case. “In its undiluted form, at least,” he wrote, “it is medicine that seems too strong to swallow.” Almost every originalist, Scalia said, recognized that the theory could be superseded by the court’s general rule of respect for its own past decisions, or precedents. Scalia also acknowledged that there were things the Constitution permitted at the founding that he couldn’t imagine allowing today — for example, a law allowing for whipping or branding as a criminal punishment. “In a crunch,” he admitted, “I may prove a fainthearted originalist.” How would originalism stop judges from imposing their own values if they could be selective about applying it? Scalia simply promised such cases would rarely arise. 

Scalia had already altered the definition of originalism. He said in a speech two years earlier that originalism should not focus on the intent of the framers, who disagreed among themselves. Instead, the theory should rest on the Constitution’s original public meaning: the understanding of the text by ordinary citizens at the time, revealed through research into founding-era sources. In practice, originalism has slid between these definitions ever since. 

Despite Scalia’s efforts, originalism remained politically tainted by the memory of Bork. Clarence Thomas steered clear of the subject at his Supreme Court confirmation hearings in 1991. “Our notions of what liberty means evolves with the country,” he said in a discussion of the 14th Amendment. “I don’t think that they could have determined in 1866 what the term in its totality would mean for the future.” 

Clarence Thomas downplayed his originalism to the Senate at his confirmation hearing in 1991, saying that “Our notions of what liberty means evolves with the country.” On the court, he has used originalism as a rationale for upending whole areas of American law. 

It appears in retrospect that Thomas was obscuring his views in order to win Senate votes. On the court, he became the justice most determined to use originalism to rip up whole fields of American law, especially to reduce the scope of federal regulation. “When faced with a demonstrably erroneous precedent, my rule is simple,” he wrote last June in a solo concurrence — a separate opinion agreeing with a judgment — in Gamble v. United States. “We should not follow it.” He has written solo opinions at a higher rate than any other sitting justice. When Scalia was alive, he painted Thomas as an extremist. Comparing himself with Thomas at a talk at a synagogue 15 years ago, according to the New Yorker writer Jeffrey Toobin, Scalia cracked: “I am an originalist, but I am not a nut.” 

But over time, positions Thomas once floated from the margins of conservative thought have moved into its mainstream. “Justice Thomas has been throwing out revolutionary concepts for a long time now,” Yoo, his former clerk, said. “He was interested in being proved right by history, or by the court 20 or 40 years into the future. Now you could say his influence is reaching its height.” Trump officials have expressed their appreciation for Thomas, with McGahn calling his recent opinions a “driving intellectual force.” The administration has successfully nominated more than 10 of his former clerks, the highest total for any justice, to the federal bench. 

In 2012, what was once Thomas’s radical originalist rationale for curtailing Congress’s powers to pass laws based on the Constitution’s Commerce Clause almost became the basis for striking down the Affordable Care Act. He first argued this position in the 1995 case United States v. Lopez, saying that when the Constitution was written, “commerce” referred only to “selling, buying and bartering, as well as transporting for these purposes.” This led to the extraordinary suggestion that the Supreme Court had been wrong to uphold the entire social safety net of the New Deal, because those rulings rested not on commerce itself but on activities with “substantial effects” on commerce among the states. Seventeen years later, in the Affordable Care Act challenge, all five conservative justices embraced this thinking, finding that Congress had indeed exceeded its commerce powers. Only Roberts’s defection from the conservative majority, in concluding that the individual mandate was permitted by Congress’s power to tax, saved the health care law. 

Thomas, who declined to talk to me, moves back and forth between different forms of originalism, sometimes focusing on the intention of the framers and sometimes on the 18th-century meaning of the words, according to Ralph Rossum, a government professor at Claremont McKenna College, in his otherwise sympathetic book, “Understanding Clarence Thomas.” Sometimes, Thomas ignores originalism altogether. For example, he provided no evidence that the First Amendment’s original meaning supported his position in a 1996 concurrence in which he argued that limiting the political donations of corporations violated their free-speech rights. The conservative majority embraced that argument 14 years later in Citizens United v. F.E.C. to strike down limits on corporate campaign spending. 

In his 2019 book, “The Enigma of Clarence Thomas,” the political scientist Corey Robin traces Thomas’s version of originalism to his code of self-reliance. Thomas called his memoirs “My Grandfather’s Son,” writing with reverence about his grandfather’s achievement of lifting the family out of poverty by starting a fuel-delivery service in Georgia despite the barriers of Jim Crow. In college, Thomas was a black nationalist who followed Malcolm X, signing his letters “Power to the People.” But after law school, he became a free-market conservative. Criticizing President Franklin D. Roosevelt and other New Deal liberals in a 1987 speech for the Pacific Research Institute, a think tank, Thomas said: “These critics of ‘the rich’ really do mean to destroy people like my grandfather.” His opinions often align with his belief that the unfettered market, not government efforts to redistribute wealth or ameliorate discrimination, is “the guarantor of precisely the kind of freedom upon which the black community depended,” as Robin writes. 

Thomas’s main innovation has been to deploy originalism to loose the government’s reins over the market. One advantage of originalism is that it allows conservative judges to justify sweeping away American legal traditions, like the broad power of Congress to regulate. “You have to claim to be going back to first principles,” David Strauss says. “Otherwise, it’s just that you don’t like the legal order we have.” 

John Roberts said judges should “call balls and strikes” — without explaining how — when asked about his judicial philosophy at his 2005 confirmation hearing. Credit: Jason Reed/Reuters 

In December, the Supreme Court heard a challenge to a New York gun-control law based on the Second Amendment. It was the latest step in an originalist quest that Thomas helped start in the 1990s to use the Constitution to strike down gun laws. 

At the time, the Supreme Court’s last word on the Second Amendment dated from 1939, when the justices found unanimously that the right to bear arms applied only to weapons with a reasonable relationship to a militia. In the 1980s, originalists like Bork agreed that the Second Amendment didn’t give individuals a right to bear arms. But over the next decade, gun rights became a newly invigorating issue for Republicans, fueled by the National Rifle Association — and like abortion before it, that position, too, benefited from an originalist justification. After a few law professors — some of them liberals — began to argue that the Second Amendment had been misunderstood, Thomas referred to their work in a footnote in the unrelated 1997 case Printz v. United States. “A growing body of scholarly commentary indicates that the ‘right to keep and bear arms’ is, as the amendment’s text suggests, a personal right,” Thomas wrote, though he acknowledged that there was significant scholarship on the other side of the debate. 

Thomas’s footnote served as an invitation for lawsuits challenging local gun-control laws, and in 2007, the justices agreed to hear a Second Amendment challenge to a District of Columbia regulation that effectively barred the personal ownership of handguns. The plaintiff was Dick Heller, a police officer who couldn’t obtain a license from the District of Columbia to keep a weapon at home. 

With Heller on the court’s docket, historians saw a rare chance to influence one of the biggest cases of the decade. A group of founding-era scholars led by Jack Rakove, a Stanford historian, concluded that the law professors Thomas cited were wrong: The Second Amendment was not about self-defense. “The right to keep and bear arms became an issue in 1787-88 only because the Constitution proposed significant changes in the governance of the militia,” they wrote in an amicus brief. The Federalists wanted Congress to determine exactly what kind of militia the nation should have. A few Anti-Federalists did refer to the personal use of guns. But the debate always focused on the role of the militia, not a personal right of self-defense. The Federalists won the debate and wrote the Second Amendment. 

When the Supreme Court issued its decision in Heller in 2008, the conservative majority, including Thomas, ruled for the first time in the court’s history that the Second Amendment protected an “inherent right of self-defense.” Scalia wrote the opinion, relying heavily on the 18th- and early-19th-century dictionary definitions of “keep,” “bear” and “arms,” which could refer to the personal use of ordinary weapons. Scalia also picked out a few Anti-Federalist quotes that supported his position. For the most part, he bypassed the Federalist sources that Rakove and his colleagues believed held the key. 

To Rakove, Scalia’s analysis was indefensibly incomplete. The founding was “a period of great political creativity,” Rakove told me. As concepts shifted, words took on new shades of meaning. The context matters for accurately understanding the language. “Even if you have the best dictionaries from 1720 to 1790, you still want to think about what the specific nature of the revolutionary-era controversy and experience added.” 

Even some conservative scholars found Scalia’s treatment of the historical sources wanting. Steven Calabresi, a law professor at Northwestern who clerked for both Scalia and Bork and helped found the Federalist Society, looked at all the early state constitutions and found that Scalia had cited the ones that included a personal right to bear arms without acknowledging that a majority of the constitutions did not. “Scalia was better at arguing that people should do originalist history than actually doing it,” Calabresi says. 

In Heller, Scalia also dropped his originalist analysis in the crucial passage of the opinion that explained how the court’s decision would affect modern gun laws. Almost surely to win the vote of Justice Kennedy, which he needed for a majority, Scalia wrote that the court’s ruling did not “cast doubt” on prohibitions on the possession of firearms by felons and the mentally ill, laws that forbid carrying firearms in “sensitive places” and laws “imposing conditions and qualifications” on gun buyers. Many of these laws were modern-day; Scalia gave no historical support for letting them stand. 

Samuel Alito has called himself a “practical originalist,” picking and choosing when to apply the theory. 

After striking down one more handgun ban in Chicago in 2010, the court stopped taking Second Amendment cases. As long as Kennedy remained on the court, legislatures could respond to public outcry over gun violence with increasing restrictions on firearms. It was a compromise that Thomas rejected with mounting frustration, accusing his colleagues, in a solo dissent in 2015, of “relegating the Second Amendment to a second-class right.” 

The dynamic changed with Gorsuch’s arrival after Trump’s election. He moved into chambers near Thomas, who invited him over for barbecue and regularly pops into his office to talk, people who know them told me. Gorsuch soon joined Thomas in scolding the rest of the court for rejecting a challenge to California’s ban on carrying concealed weapons. Last year, with Kavanaugh installed in place of Kennedy, the court finally accepted a case about a New York City ban on transporting licensed handguns anywhere except to approved gun ranges. 

Before the justices heard the case, New York lifted the ban and asked the court to dismiss it. To the liberal justices, the grounds for a dismissal were clear. Roberts also seemed open to dismissing the case. But at the argument in December, Gorsuch took up the cause of trying to keep the case alive. The court will decide whether to rule on the merits in the next few months — and whether to make this case its vehicle for expanding gun rights. 

Whatever the court does with the New York case, it has surfaced a new challenge for Scalia’s originalist claims about the Second Amendment. Two years ago, Brigham Young University introduced a database of more than 120,000 texts from the late 18th century. Previously, originalists in search of the meaning of words during the founding era looked through newspaper archives and other old records. The B.Y.U. database made it possible to comprehensively assess how people at the time used the words “bear arms.” 

For originalists, the new tool is “a paradigm-shifting technology,” two members of the Federalist Society, the law professor Josh Blackman and the Stanford law fellow James C. Phillips, wrote in The Harvard Law Review’s blog in August 2018. It also means that cherry-picking the historical record to establish a dubious “original” meaning would be harder to conceal. “We can do empirics,” says Alison LaCroix, a historian and law professor at the University of Chicago. “There’s a data set.” 

Blackman and Phillips conducted a review of the database and found that the dominant use of “bear arms” at the time of the country’s founding related to the militia. (Even so, they didn’t conclude that Scalia got Heller wrong.) LaCroix and three linguists submitted a brief to the court last fall, in the New York case, with studies they had each done. One found that references in the database “to hunting or personal self-defense” for the phrase “bear arms” were “not just rare, they are almost nonexistent.” The phrase “keep arms,” the brief stated, was also used “almost exclusively in a military context.” 
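For a sense of what such a corpus study involves, here is a minimal sketch in Python. It assumes, hypothetically, that founding-era documents have been saved as plain-text files in a local folder; the folder name and the cue-word lists are invented for illustration, and the real B.Y.U. corpus is searched through its own query tools, with ambiguous hits hand-coded by researchers.

import re
from collections import Counter
from pathlib import Path

# Hypothetical layout: one plain-text founding-era document per file.
CORPUS_DIR = Path("founding_era_texts")

# Illustrative cue words only; the published studies used far richer coding schemes.
MILITARY = {"militia", "regiment", "soldier", "war", "defence", "defense"}
CIVILIAN = {"hunt", "hunting", "game", "self-defence", "self-defense"}

def classify_uses(phrase="bear arms", window=10):
    """Count occurrences of `phrase` and tally military vs. civilian words nearby."""
    tallies = Counter()
    pattern = re.compile(r"\b%s\b" % re.escape(phrase))
    for doc in sorted(CORPUS_DIR.glob("*.txt")):
        # Normalize whitespace and case so the phrase search is uniform.
        text = " ".join(doc.read_text(encoding="utf-8", errors="ignore").lower().split())
        for match in pattern.finditer(text):
            # Take a window of words on each side of the match, stripped of punctuation.
            left = text[:match.start()].split()[-window:]
            right = text[match.end():].split()[:window]
            context = {w.strip(".,;:!?()\"'") for w in left + right}
            tallies["total uses"] += 1
            if context & MILITARY:
                tallies["military context"] += 1
            if context & CIVILIAN:
                tallies["civilian context"] += 1
    return tallies

print(classify_uses())

Even a toy count like this makes the larger point: once the sources are machine-readable, claims about "original public meaning" become checkable against data rather than against a handful of chosen quotations.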

The findings confirm what Rakove and his fellow historians showed about the era’s political history. But this time, the analysis played by the rules of the game as Scalia defined them, by looking narrowly at the original public meaning of the text. “I don’t care how big a fan of Justice Scalia you are,” Phillips told me. “At some point, you run up against the data.” 

In previous decades, it was Scalia who sold originalism to the public with brash confidence. “You would have to be an idiot,” he said, to conceive of the Constitution as a “living organism.” Scalia died in 2016, and now Gorsuch is remaking the role in his own image. In a best-selling book published in September, “A Republic, If You Can Keep It,” Gorsuch lays out his judicial philosophy. He says judges should not “interpret legal texts to produce the best outcome for society,” because that’s the job of legislatures. 

On an evening that month, a few hundred people gathered at the Richard Nixon Presidential Library in Yorba Linda, Calif., to hear him speak about his ideas. Gorsuch was on a book tour that included an hourlong special for Fox News, interviews with print reporters (though he declined my request to speak to him) and a later appearance on “Fox & Friends.” 

At the library, the crowd, dressed in pastels, filed past elderly docents and into a replica of the East Room of the White House. The audience members settled into their seats and then burst into applause when the silver-haired justice strode into the room. Sitting near a portrait of George Washington, he warmed up the crowd. “It’s really nice being west of the Mississippi,” he said with a grin, winning a roar of anti-Washington approval. He told a story about his milkman making a delivery to his home outside Boulder, Colo., where he and his wife and two daughters lived before he joined the Supreme Court. In his book, Gorsuch describes it as “our home on the prairie” and includes pictures of horses, dogs and chicks. 

Neil Gorsuch considers himself a strict originalist, saying judges should apply the theory in every case, and should not consider a law’s underlying purpose or the consequences of a ruling. 

In fact, his house in Colorado was a gated estate that was sold for $1.5 million in 2017. When he lived there, he drove a gold Mercedes convertible to work at the federal courthouse in Denver. In the early 1990s, Gorsuch met his wife, Louise, at Oxford, where she was a champion equestrian and he studied legal philosophy after graduating from Harvard Law School. 

A former colleague says that Gorsuch urged his clerks to make money in the private sector before they went on the bench, the path he took himself as a corporate lawyer. In the 2000s, Gorsuch represented Philip Anschutz, the oil-and-gas mogul, who has invested in a vast array of businesses and conservative publications, including The Washington Examiner. Anschutz played a role in elevating Gorsuch’s legal career. In 2006, after George W. Bush was re-elected president, Anschutz lobbied for Gorsuch’s appointment to the U.S. Court of Appeals for the 10th Circuit. He then gave Gorsuch a speaking spot at an annual dove-hunting retreat he ran for prominent conservatives. 

At the Nixon library, Gorsuch advertised his support for diversity. Singling out three of his law clerks, Gorsuch described them as a descendant of Mexican immigrants and Holocaust survivors, a first-generation Chinese-American and the first Native American to ever clerk on the Supreme Court. He praised his appeals court, the 10th Circuit, for being “as diverse a court on any metric you wish to consider as any court in the country.” In fact, judges on the 10th Circuit are overwhelmingly white and male. Gorsuch went on to ask his audience if they had heard people say originalism “leads to conservative results.” The crowd murmured, and Gorsuch jutted his chin. “Rubbish,” he said. 

In his book, Gorsuch asks rhetorically if there’s any reason to “only sometimes adhere” to the original meaning of the Constitution, and he answers: “For my part, I can think of none.” This is a significant shift. In contrast to Scalia’s confession of fainthearted originalism (which Scalia himself repudiated in 2013), Gorsuch professes to be absolutist on the matter. He argues that to make an exception would be to fall into a trap: “The more leeway a judge is given, the more likely the judge will engage, consciously or not, in motivated reasoning or bias in reaching a result.” 

The challenge, then, is to stick with the theory, even if it leads to a result you don’t like. But rather than facing up to archaic and politically inconvenient results that originalism can dictate, Gorsuch tends to wave them away. In his book, he addresses the charge that an originalist reading of the Constitution could prevent a woman from becoming president. Article II of the Constitution, after all, calls the chief executive “he.” But Gorsuch says it’s “nonsense” to think the plain meaning of the text restricts the presidency to men, because “‘he’ served as a standard pronoun of indefinite gender” when the Constitution was written and ratified. 

Some scholars are skeptical of Gorsuch’s reading of Article II. In the revolutionary era and after, the plain meaning of “he,” in context, was understood to refer only to men. At the time, the use of “he” would have given states, if they wanted it, a basis for blocking women from appearing on the presidential ballot, Akhil Amar, a professor at Yale Law School, told me. “The framers’ Constitution allowed states to bar women (and many men) from voting and holding office — and originalism ties its meaning now to that world,” Reva Siegel, also a Yale law professor, says. 

Last June, Gorsuch issued his most significant originalist opinion to date, in Gundy v. United States, a case dealing with Congress’s power to broadly delegate policymaking authority to federal agencies. In a dissent, Gorsuch picked up on a solo concurrence Thomas wrote in 2015 and argued that the interpretation of the Constitution that has allowed Congress to do this — in regulating everything from air and water quality to banking and food safety — is “at war with its text and history.” 

Gorsuch said the problem mostly came from a line of cases in the 1940s, following the New Deal expansion of government. He presented his view, which is known as the “nondelegation doctrine,” as the proper original understanding of the constitutional separation of powers between the legislative and executive branches. 

But a body of scholarship discussed in an amicus brief in Gundy belies Gorsuch’s interpretation. For example, a 2017 review of every relevant court challenge before 1940 showed that Congress has delegated policymaking authority to the executive branch since the founding era. One of the review’s authors, the Princeton politics professor Keith Whittington, is a member of the Federalist Society. He and Jason Iuliano, a law professor at Villanova University, concluded that “the nondelegation doctrine never actually constrained expansive delegations of power.” 

Gorsuch ignored that research, citing only a minority of scholars who agree with him. “I admire Justice Gorsuch’s writing,” Cass Sunstein, a Harvard law professor and former Obama-administration official, told me. “But his discussion in Gundy isn’t close to historical standards. There’s a ton of terrific work on the nondelegation doctrine, and he cites none of it. Then there is some not-terrific material, which he does cite.” 

In February, Nicholas Bagley and Julian Mortenson, law professors at the University of Michigan, released a new review based on thousands of pages of documents from the early Congresses. “There was no free-standing nondelegation doctrine at the founding,” they concluded, “and the question isn’t close.” Nonetheless, the issue will probably arise again. The court was short a justice in Gundy, because Kavanaugh hadn’t been confirmed, and Gorsuch didn’t win a majority. But last November, Kavanaugh praised Gorsuch’s Gundy opinion, sending a signal to lawyers to bring a new case. 

Perhaps the most significant case on the court’s docket this year is about the subject that gave rise to originalism in the first place: abortion. On March 4, the court will hear June Medical Services v. Russo, a challenge to a Louisiana law requiring abortion providers to obtain admitting privileges to local hospitals. There are only three clinics left in the state, and if the law takes effect, two of them say they will close, because no local hospital will grant them admitting privileges. That would leave only one provider in a state with nearly one million women of reproductive age. 

In 2016, Kennedy and the court’s four liberals struck down an identical Texas provision, based on a scientific consensus that the requirement isn’t medically necessary and ultimately harms women by preventing them from accessing a safe procedure. The only thing that has changed in the four years since the Texas decision is the court’s composition. The new case could be a means for the conservatives to begin dismantling the constitutional protections for abortion that the court has built, brick by contested brick, over decades of decisions that began with Roe v. Wade. 

Roe was rooted in a 1965 precedent, Griswold v. Connecticut. In Griswold, the court derived a right to privacy for marital relations from what it confusingly called “penumbras, formed by emanations” in the Bill of Rights and the 14th Amendment, striking down a law that banned the use of birth control, including for married couples. Justice Hugo Black, the stickler for the text of the era, dissented. “I get nowhere in this case by talk about a constitutional ‘right of privacy’ as an emanation from one or more constitutional provisions,” Black wrote. 

The critique sank in. When Justice Harry Blackmun wrote the majority opinion in Roe, he refashioned a right to privacy “founded in the 14th Amendment’s concept of personal liberty and restrictions upon state action” that was “broad enough to encompass a woman’s decision whether or not to terminate her pregnancy.” 

Later, the Supreme Court established other underpinnings for the right to access abortion. In the 1992 case Planned Parenthood v. Casey, three justices appointed by Republican presidents — Kennedy, Sandra Day O’Connor and David Souter — devised a compromise that allowed the states to regulate abortion, but only if they did not impose an “undue burden” on women seeking abortions. Returning to the 14th Amendment, the justices wrote: “The controlling word in the cases before us is ‘liberty.’” The court invoked gender equality, saying that the right to decide whether and when to have a child is essential to a woman’s ability “to participate equally in the economic and social life of the nation.” As Linda Greenhouse and Reva Siegel wrote in the 2019 book “Reproductive Rights and Justice Stories,” “Respect for the equal citizenship of women appears centrally in the opinion.” It took 20 years, and perhaps a female justice, but the court saw a direct connection between reproductive freedom and equality. The current conservative majority, however, may undo it. 

Brett Kavanaugh suggested in a 2017 speech that he didn’t have one methodology and that “history and tradition” competed for “primacy of place” with factors like liberty and deference to the legislature. Credit: Melina Mara/The Washington Post, via Getty Images 

Some liberals have tried to find common ground with conservatives by blurring the boundaries between originalism and an evolving understanding of the Constitution’s open-ended principles. Justice Elena Kagan promoted this approach to constitutional interpretation at her Senate confirmation hearings in 2010. “Sometimes they laid down very specific rules,” she said of the framers. “Sometimes they laid down broad principles. Either way, we apply what they say, what they meant to do.” Kagan ended with a line that drained originalism of its standard meaning: “In that sense, we are all originalists.” 

Perhaps Kagan sought to disarm her partisan critics in the Senate. Gorsuch and Kavanaugh may have seen the same benefit when they followed her lead at their own confirmation hearings. “I am with Justice Kagan on this,” Gorsuch said at his 2017 hearing, when asked for his views on originalism. Kavanaugh repeated the refrain when it was his turn to testify: “As Justice Kagan said, we’re all originalists now.” 

But now that Gorsuch and Kavanaugh are on the court and the conservatives are firmly in control, it’s hard to see why they would go along with a liberal effort to co-opt originalism. Kennedy, hardly an originalist, floated a version of this in his 2015 majority opinion recognizing a right to marriage equality for gay couples in the case Obergefell v. Hodges. The framers “entrusted to future generations a charter protecting the right of all persons to enjoy liberty as we learn its meaning,” he wrote. The conservatives on the court at the time recoiled, arguing in dissent that the court’s decision had “nothing to do with” the Constitution. Outside the court, social conservatives warned that an ideologically neutral originalism would be useless. “One might say that originalism has become a Unitarian Church for the legal profession,” Michael Greve, a professor at the Antonin Scalia Law School at George Mason University, wrote in an essay last July. “Anybody is welcome, provided you believe there is one Constitution.” 

The left also has something to lose if it makes support for originalism, and textualism, sound like a single widely shared view, when in fact the conservatives’ conception of these theories remains very different from theirs. More than a decade ago, Justice Stephen Breyer debated Scalia in a hotel ballroom in Washington. He spoke on behalf of an approach to judging that went back to Marbury and McCulloch: reading laws and the Constitution in context to weigh their underlying purpose and the consequences of interpreting them one way or another. Breyer continues to argue urgently for this position. Last term, he wrote two solo dissents — not his usual practice — to warn against what he sees as a words-on-the-page capitulation. He is concerned that rather than challenging a method that produces constitutional law that few people would want, liberals are uncritically helping to normalize it and teach it to the next generation of the legal profession. “I don’t want textualism to take over the law schools, and I fear it is,” he told me this fall. “The purpose of the law is to work, to work for the people.”


Tuesday, February 25, 2020

‘Facebook: The Inside Story’



‘Facebook: The Inside Story’ Offers a Front-Row Seat on Voracious Ambition

Steven Levy

During his years reporting his book, Steven Levy developed a theory to account for Facebook’s recent woes: Mark Zuckerberg was overly ambitious.

Review by Natasha Singer, NY Times 

After a spate of seemingly nonstop scandals, Facebook has developed a strategy to explain the myriad problems enabled by its social network. 

Whether the issue is Russian election interference, metastasizing fraudulent news or incitements to genocide in Myanmar, the company essentially hews to the same talking points: Facebook’s mission to connect the world is well-intentioned. But when you connect billions of people, there are bound to be malefactors. 

Never mind that Facebook designed its system to promote just the kind of intriguing or provocative content that people can’t resist clicking on. Or that it constructed a marketing machine that allowed users to direct propaganda precisely toward the most receptive audiences. At Facebook, it seems, harm is a cost of doing business on a global scale. 

As Facebook’s chief, Mark Zuckerberg, put it last year, “When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism and extortion.” 

Steven Levy, a longtime chronicler of Silicon Valley, relies on a similar framing in “Facebook: The Inside Story.” In his introduction, he explains that his aim is to capture “the breadth of the company’s ambitions.” So he set out to catalog its quest for power — starting with Zuckerberg’s interest as a teenager in Civilization, an empire-building video game, and Caesar Augustus, a particularly authoritarian Roman emperor. 

Along the way, Levy developed a theory to account for the company’s recent woes: Zuckerberg, an inexperienced leader who started Facebook “at such a tender age,” was overly ambitious. In his drive to connect the entire world, Levy argues, Zuckerberg made some unfortunate decisions, like delegating key policy issues to subordinates and pushing too quickly for explosive growth. And those errors came at a human cost. 

“The company pursued its naïvely utopian — and undeniably self-serving — goal with a tragic disregard for consequences,” Levy writes. Even so, he says, in a comment reminiscent of Facebook’s party line, “there is still something to the company’s insistence that the good it does outweighs what it now admits is the bad it foments.” 

In 2011, Levy, now the editor at large at Wired, wrote an extensive history of Google. To report the book, he secured liberal access to executives at Google and was allowed to soak up company culture by wandering around its corporate campus. He employed much the same strategy for “Facebook.” Zuckerberg granted Levy numerous interviews over a three-year period, and gave him “unprecedented access” to company executives. 

The result is a work that recounts the company’s narrative mainly through the lens of its central figures. It is a largely sympathetic, and occasionally fawning, portrait of Facebook that seems at odds with the company’s recent emergence as an avatar for the risks of unchecked corporate power. 

Although the book raises questions about Facebook’s serial privacy violations and handling of foreign election interference on its site, sections addressing those issues often feel pro forma or tacked on. Levy seems much more at home narrating Zuckerberg’s high-speed upward trajectory from a rule-flouting Harvard student who capitalized on other people’s ideas to the Silicon Valley mogul who muscled the founders of Instagram and WhatsApp into selling him their start-ups. 

Not for nothing is the book subtitled “The Inside Story.” Levy, who first met Zuckerberg in 2006, takes readers inside his college dorm suite; inside the late-night coding and cavorting at the company’s first home base in Palo Alto; inside meetings with the tech moguls who were the start-up’s first major investors; inside design choices that fueled the social network’s popularity; and inside Zuckerberg’s head. 

During a 2016 trip to Lagos, Nigeria, to meet young entrepreneurs, Zuckerberg tells Levy about his personal “engineering mind-set,” an approach he’s also instilled at Facebook. The idea is to view everything — computer programming, company growth — as a system that can be broken down and improved step by step. “It may even be more a value set than a mind-set,” Zuckerberg says. Alas, this is one of many passages in the book that seem to take Silicon Valley’s self-mythology as gospel. 

The heroic, rational, problem-solving engineer is a near-religious icon in the tech industry. But another writer might have pointed out that the engineering mind-set led Facebook to develop a powerful surveillance system that tracks users to target them with ads, nudge them to stay online longer, prompt them to share more personal details and prod them to keep compulsively coming back. Another writer might also have suggested that all those evildoers — the dictators, the genocidal generals, the traffickers of political propaganda, the purveyors of false news — did not hijack Facebook. They simply used the platform as it was designed: to try to influence user behavior. 

But “Facebook” does not delve deeply into the company’s data-mining practices — like the medical marketing it once offered targeting 110,000 Facebook users with a “diagnosis of H.I.V./AIDS” and 76,000 with “bulimia awareness.” (The company has said it no longer offers these ad-targeting categories.) Nor does the book examine the company’s outsize role in the surveillance economy. That is partly because Levy accepts Zuckerberg’s narrow view of privacy as the control individuals have over the personal information they choose to share. “People think that we’ve eroded [privacy] or contributed to eroding it,” Zuckerberg tells him in their last interview. “I would actually argue we have done privacy innovations, which have given people new types of private or semiprivate spaces in which they can come together and express themselves.” 

Levy doesn’t question that assertion or ask Zuckerberg about the millions of non-Facebook sites and apps from which the company harvests details about people’s behavior. Unfortunately, the book’s cursory explanations of Facebook’s data operations, one of the linchpins of its success, will make it difficult for readers to fully grasp the many antitrust and privacy investigations with which the tech giant is now grappling. 

The story of how Facebook came to capture the attention of nearly one out of three people on earth, with profound repercussions for humanity, is truly astonishing. But “Facebook” tells only half of it. It is a tour de force of access journalism. It is not a tour de force of critical thinking. 

Natasha Singer is a technology reporter at The Times, where she covers privacy and industry accountability. She also teaches a tech ethics course for high school students at The School of The New York Times, a pre-college program. 

FACEBOOK 

The Inside Story 

By Steven Levy 

583 pp. Blue Rider Press. $30.


Friday, February 21, 2020

Lawrence Tesler, Who Made Personal Computing Easier, Dies at 74


Lawrence Tesler, Who Made Personal Computing Easier, Dies at 74


By John Markoff, NY Times


When you’re cutting and pasting, dragging the cursor over selected text and performing other common computer tasks, you can thank him.


Lawrence Tesler using an Alto personal computer sometime during his tenure at Xerox in the 1970s. As a young researcher at Xerox, he helped to develop today’s style of computer interaction. Credit: Xerox PARC 




Lawrence Tesler, a pioneering computer scientist who helped make it easier for users to interact with computers, whether cutting and pasting text or selecting text by dragging a cursor through it, died on Sunday at his home in Portola Valley, Calif. He was 74. 

The cause was not known, his wife, Colleen Barton, said, but in recent years he had suffered the effects of an earlier bicycle accident. 

Mr. Tesler worked at a number of Silicon Valley’s most important companies, including Apple under Steve Jobs. But it was as a young researcher for Xerox at its Palo Alto Research Center in the 1970s that he did his most significant work: helping to develop today’s style of computer interaction based on a graphical desktop metaphor and a mouse. 

Early in his Xerox career (he began there in 1973), Mr. Tesler and another researcher, Tim Mott, developed a program known as Gypsy, which did away with the restrictive modes that had made text editing complicated. For example, until Gypsy, most text-editing software had one mode for entering text and another for editing it. 

Mr. Tesler was passionate about simplifying interaction with computers. At Apple he was responsible for the idea that a computer mouse should have only one button. For many years the license plate on his car read, “NO MODES.” 

His first breakthrough at Xerox PARC came when he took a newly hired secretary, sat her in front of a blank computer monitor and took notes while she described how she would prefer to compose documents with a computer. She proceeded to describe a very simple system, which Mr. Tesler then implemented with Mr. Mott. 

The Gypsy program offered such innovations as the “cut and paste” analogy for moving blocks of text and the ability to select text by dragging the cursor through it while holding down a mouse button. It also shared with an earlier Xerox editor, Bravo, what became known as “what you see is what you get” printing (or WYSIWYG), a phrase Mr. Tesler used to describe a computer display that mirrored printed output. 

And Gypsy brought to fruition the idea of opening a computer file by simply clicking on a screen icon while pointing at it with the mouse cursor. Before that, files had to be opened by typing the file name into a command line. 

“At Xerox he pushed a lot for things to be simpler in ways that would broaden the base of users,” said David Liddle, a veteran Silicon Valley venture capitalist who worked with Mr. Tesler at Xerox PARC. “He was always quite focused on users who weren’t also Ph.D.s in computer science.” 

Mr. Tesler later joined a small team of researchers run by Alan Kay, a visionary computer scientist who had pioneered the idea of a so-called Dynabook, which would become the inspiration for today’s laptop computers. The group was developing a software environment called Smalltalk, and Mr. Tesler developed a system for searching for software components, which he named the browser. 

“He can be hailed as one of the true pioneers of many important aspects of personal computing,” Mr. Kay said. 

After attending a demonstration of the Altair, an early hobbyist personal computer, at a Palo Alto hotel in 1975, Mr. Tesler returned to PARC to alert his colleagues to the arrival of low-cost systems. His warnings were largely ignored. 

“He can be hailed,” a fellow scientist said, “as one of the true pioneers of many important aspects of personal computing.” Credit: via Tesler family

He continued to press for less costly computers. In 1978, with Adele Goldberg and Douglas Fairbairn, he designed a portable machine called NoteTaker, a forerunner of luggable computers like the Osborne, Kaypro and Compaq machines of the early 1980s. But Xerox declined to commercialize the NoteTaker; only a few prototypes were made. 

It was Mr. Tesler who gave Mr. Jobs the celebrated demonstration of the Xerox Alto computer and the Smalltalk software system that would come to influence the design of Apple’s Lisa personal computer and then its Macintosh. 

Mr. Tesler left Xerox to work for Mr. Jobs at Apple in 1980. 

“The questions the Apple people were asking totally blew me away,” Mr. Tesler was quoted as saying in a profile that appeared in IEEE Spectrum, the magazine of the Institute of Electrical and Electronics Engineers, in 2005. “They were the kind of questions Xerox executives should have been asking but didn’t.” 

In addition to helping develop the Lisa and Macintosh, Mr. Tesler founded and ran Apple’s Advanced Technology Group, after which he led the design of the Newton hand-held computer, although that proved unsuccessful. The group also created much of the technology that would become the Wi-Fi wireless standard, and Mr. Tesler led an Apple joint venture with two other companies that created Advanced RISC Machines (ARM), a partnership intended to provide a microprocessor for the Newton. 

Although Apple eventually sold off its holdings in that venture, the partnership it helped create would come to dominate the market for the chips that power today’s smartphones. Its chip architecture is today the most widely used microprocessor design in the world. 

Mr. Tesler left Apple in 1997 for a start-up and later went on to work for both Amazon and Yahoo. He left Yahoo in 2008 and spent a year as a product fellow at 23andMe, the genetics information company. He was most recently an independent consultant. 

Lawrence Gordon Tesler was born in the Bronx on April 24, 1945, to Isidore and Muriel (Krechman) Tesler. His father was an anesthesiologist. 

In 1960, while attending the Bronx High School of Science, Mr. Tesler developed a new method of generating prime numbers. He showed it to one of his teachers, who was impressed. As Mr. Tesler later recalled, he told the teacher that the method was a formula; the teacher responded, “No, it’s not really a formula, it’s an algorithm, and it can be implemented on a computer.” 

“Where do you find a computer?” Mr. Tesler asked. 

The teacher said he would first get him a programming manual and then figure out where to find a computer. 

One day Mr. Tesler was sitting in the school cafeteria reading his manual, which offered instructions on how to program an IBM 650 mainframe in its arcane, low-level machine language. 

A student walked up to Mr. Tesler and asked, “What are you doing with that?” 

“I’m learning about programming,” Mr. Tesler responded. 

The other student alerted Mr. Tesler to a program at Columbia University, which gave high school students programming time. He was able to use a university computer for a half-hour each week, teaching himself to program before he got to college. 
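
The article does not record what Mr. Tesler’s prime-number method actually was, so none is reconstructed here. But the teacher’s distinction is easy to illustrate: an algorithm is a step-by-step procedure a computer can carry out, as opposed to a closed-form formula. The classic example for primes is the Sieve of Eratosthenes, sketched below in Python.

    # The Sieve of Eratosthenes, shown only to illustrate the teacher's
    # formula-vs.-algorithm distinction -- not Mr. Tesler's actual method.

    def primes_up_to(n):
        """Return all primes <= n by repeatedly crossing out multiples."""
        is_prime = [True] * (n + 1)
        is_prime[0:2] = [False, False]          # 0 and 1 are not prime
        for p in range(2, int(n ** 0.5) + 1):
            if is_prime[p]:
                for multiple in range(p * p, n + 1, p):
                    is_prime[multiple] = False  # cross out each multiple of p
        return [i for i, prime in enumerate(is_prime) if prime]

    print(primes_up_to(30))   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

The list emerges not from evaluating a single expression but from a repeatable procedure, which is precisely what made the teacher say it belonged on a computer.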

He attended Stanford, graduating in 1965 with a degree in mathematics. While there, he became involved in a number of early projects that prefigured personal computing. 

Mr. Tesler had early access to a computer known as a LINC when he worked as a student programmer for the Nobel laureate Joshua Lederberg. The LINC, designed by the M.I.T. physicist Wesley A. Clark, is believed by many computer historians to have been the first true personal computer. 

Mr. Tesler’s first start-up venture was a programming consulting company located in a shopping mall next to the Stanford campus. He also used a mainframe computer to build a system to permit the student rooting section at Stanford football games to program elaborate card stunts. It was, Mr. Kay said, a forerunner to the ways in which modern graphical displays would be programmed. 

In 1969, with two other scientists at the Stanford Artificial Intelligence Laboratory, Mr. Tesler created a design for a small computer and proposed it to the calculator company Friden. Although intrigued, the company declined to pursue the idea. 

Mr. Tesler left computing for a short while after that and moved to an Oregon commune with his daughter from a short-lived marriage. Lack of work led him back to the Bay Area, where he would join Xerox PARC. 

In addition to Ms. Barton, a geophysicist, and his daughter, Lisa Tesler, he is survived by two brothers, Charles and Alan. 

At Stanford and afterward, Mr. Tesler was active in both the antiwar movement and the 1960s counterculture. He participated in an alternative school, the Mid-Peninsula Free University, where he taught classes, including one exclusively for people born under the sign of Taurus. In 1968 he taught a class titled “How to End the IBM Monopoly.” 

Years later, as a computer scientist at Xerox, he remembered his activist roots, his former colleague Ms. Goldberg said. The Central Intelligence Agency was a Xerox customer, and when agency employees arrived for a meeting, Mr. Tesler attended wearing a trench coat and a fedora.

Tuesday, February 18, 2020

The iPhone at the Deathbed


Families are photographing death at home. These photos may feel jarring on Facebook, but the practice itself has a long history. 

A kiss after dying: the late Robert Alexander and his youngest sister, Kary Manzanares. Credit: Tawnya Musser



By Penelope Green, NY Times

After Robert Alexander died at 51 during heart surgery in June 2018, and after he was taken from the hospital to the facility that would recover the tissue and bone he had donated, he was brought to his uncle’s farm in Hinton, Okla., where his six siblings, his mother and other family members and friends had gathered to give him a home funeral. 

They laid him out on a sturdy folding banquet table and dressed him in well-worn bluejeans, a Harley-Davidson bandanna, a long-sleeved Affliction T-shirt and his black leather vest painted with the American flag. On the wall behind him, they hung a blanket emblazoned with a flaming skull. 

A mechanic, Mr. Alexander had loved motorcycles, though his health and finances had kept him from being a regular rider. After he was properly adorned, and “looking pretty badass,” as his sister Tawnya Musser said, his siblings and their mother gathered around him, and a brother-in-law took a family photo using his smartphone. 

“We couldn’t think of a time when all of us had been together with Mom,” Ms. Musser, 34, said. “So we had the conversation. Did Mom want a photo with all seven of her children and was it morbid that one of them was dead?” 

There ended up being several photographs. They are startling and beautiful. Mr. Alexander looks peaceful and regal. The siblings have shared them among themselves, but the images don’t live on social media, as many contemporary death photos do. 

In a collision of technology and culture, of new habits and very old ones, we are beginning to photograph our dead again. 

For families like Mr. Alexander’s who are choosing home funerals and following natural death practices — D.I.Y. affairs that eschew the services of conventional funeral parlors — photography is an extension and celebration of that choice. 

Family members are sitting with kin in hospice, or taking them home from hospitals, and continuing to care for them after they die, often washing their bodies and then adorning them, as Mr. Alexander’s family did, with favorite clothes, flowers, cards, books and other totems. They are sending their dead off as their grandparents used to, and recording the event and its aftermath with their smartphones. 

“You can die in a way that has beauty attached,” said Amy Cunningham, 64, a funeral director in Brooklyn who specializes in “green” burials, without embalming or metal coffins, and assists families who are caring for their dead at home. 

“The photograph seals the emotion,” Ms. Cunningham said. “And with cellular phones ever-present, we’re going to be recording all kinds of things we never did previously. Death is just one of them. Though when you’re Facebook posting and the images are wedged between the latest Trump atrocity and cats who look like Hitler it can be jarring.” 

So, too, is the now common experience of seeing emoji applied to tragic events. Do you choose the weeping smiley face or just hit “like”? 

The End of the Timeline 

When Louise Rafkin posted a photo of her mother, Rhoda Rafkin, with her golden retriever at her side, on Facebook on the night of Rhoda’s death at 98 in September, it rattled some family members and friends. 

Ms. Rafkin, 61, an author and martial arts teacher in Oakland, Calif., who is also a contributor to The New York Times, described how she and others had carried Rhoda outside to the garden she had loved. They transported her on an improvised stretcher, a surfboard borrowed from neighbors, and with help from their college-age sons. 

Rhoda was dressed in a blue caftan and strewn with sunflowers, roses and gladioli. They tucked her into a sheet, lit candles and sat with her until it was dark. It is a lovely image, shot at the magic hour, as filmmakers like to say of the time just before dusk, but it shocks nonetheless. 

“I was crazy about my mom and I wasn’t fazed by her being dead,” Ms. Rafkin said, noting that Rhoda, an educator, had been in hospice for more than six months. “I’ve been through the AIDS epidemic. I’m used to death. There are ways you can make this meaningful. Although I’m not religious, I am a deep believer in ritual and how that can heal and provide context.” 

The Facebook post was a way to announce Rhoda’s death, Ms. Rafkin said, adding, “I’m pretty sure my mother would have disapproved, and that’s a tad unsettling. ‘No folderol,’ she said about the whole process.” 

Some family members had mixed reactions. “I think what she did in the garden was beautiful,” said Ashley Peterson, 31, of Ms. Rafkin, who is her aunt. “But I felt like posting the photos could make people uncomfortable and leave an image in their minds they did not want to see.” 

Susan Sontag wrote that photography has its own ethics: It tells us what we are allowed to see and what’s taboo. (In the age of TikTok, these rules have evolved beyond all imagining.) If we are more familiar with the deaths of strangers, their violent ends captured by photojournalists, maybe that’s because the deaths of our intimates have been at a remove for so long. 

There have been exceptions, of course, like the harrowing images that emerged during the AIDS epidemic from photographers like Therese Frare and artists like David Wojnarowicz, whose tender portraits of his friend and mentor Peter Hujar are holy-seeming and sacramental. 

“In one sense it’s surprising because we’ve been so disconnected from death in the last century or so,” said Bess Lovejoy, the author of “Rest in Pieces: The Curious Fates of Famous Corpses,” published in 2013, of the resurgence of home death photography. Ms. Lovejoy is also a member of the Order of the Good Death, an organization of funeral professionals, artists and scholars that works to prepare a culture largely in denial about death for its inevitability. 

“But we are returning to the older ways,” she went on, “a movement backward that some say began in the ’70s, with the back-to-nature movement and midwifery and natural births. The natural death movement is part of that. And these photos are unsurprising, too, because we carry our smartphones all the time, and it’s almost like if there isn’t a photo it didn’t happen. Now everyone is a photographer.” 

Modern photography was born in 1839, when Louis Daguerre refined a process for capturing an image on silver-plated copper. 

For decades, one of the most common uses of this new technology was the post-mortem photo: an artfully composed image, taken by a professional photographer, of dead family members in all manner of poses. Dead children in the laps of their parents, often with their eyes painted open; dead adults dressed in their finest clothes; even dead parents holding their living children; or entire families, wiped out by diseases like cholera, typhoid or diphtheria, nestled together in bed. 

These were prized mementos, most often the only photograph that was ever taken of the subject, said Stanley B. Burns, 81, the quirky ophthalmologist behind the Burns Archive, a collection of post-mortem and medical photos, among other intriguing photographic genres, stored in a chockablock townhouse in Midtown Manhattan. 

The photos in Dr. Burns’s “Sleeping Beauty” books (there are three) are both ghoulish and gorgeous. Dr. Burns pointed out that the subjects tended to look pretty good, because the plagues that felled them did so quickly. 

The images have provided inspiration and material for collectors and Victoriana enthusiasts like Joanna Ebenstein, 48, a writer and curator who was a founder of the idiosyncratic Morbid Anatomy Museum, now closed, in Brooklyn. “Post-mortem photographs can be seen as a Western form of ancestor veneration,” Ms. Ebenstein said. The practice began to decline when death was outsourced to the clinical environments of hospitals and funeral homes, “and it became taboo to talk about.” 

(In 1910, Ladies’ Home Journal rebranded the parlor, where Americans had been laying out their dead for nearly a century, as “the living room,” and the nascent funeral industry took the word “parlor” for its activities.)

But what really curtailed post-mortem photography and the elaborate mourning rituals behind it, according to Dr. Burns, was World War I. “There was so much death,” he said. “If everyone is mourning, you lose your fighting spirit. It’s not patriotic.” 

“What’s happening now is that people are taking back that process,” Dr. Burns continued. “But the impulse to photograph is the same as it was for the Victorians. They want to show they have seen their person through to the end. ‘I’ve done this work, I’ve loved her to the end.’ It’s your last bond, and you want to document that.” 

Finis-tagram 

As the funeral industry slowly evolves from Big Casket to include a cadre of overwhelmingly female and digitally native professionals with all manner of titles (end-of-life teachers, death doulas and others), these new practitioners are displaying their work, with humor and photographs, on social media. 

Their message: Get comfortable with death, it doesn’t have to be so scary, and here are photos to prove it. 

They share images of the dead attended by family members in their beds, or shrouded in natural fabrics cinched with rope at a grave site. They perform death themselves, as Melissa Unfred, 41, a natural mortician based in Austin, Texas, sometimes does, lying in shallow graves strewn with flowers and turf. Ms. Unfred, who sells “Cremate the Patriarchy” T-shirts on Etsy, is the Mod Mortician of Twitter and Instagram, one of many evangelists for the so-called Death Positive movement. 

Caitlin Doughty, 35, a funeral director who describes herself as a mortician activist and funeral industry rabble rouser, recently re-enacted a Victorian-style post-mortem photo shoot with a tintype photographer at the Merchant’s House Museum in Manhattan, and shared it on YouTube. 

Ms. Doughty is the founder of the Order of the Good Death and the author of “Will My Cat Eat My Eyeballs?” published last September, “Smoke Gets in Your Eyes: And Other Lessons from the Crematory” and other jauntily titled books designed to demystify death. With her Bettie Page crop, she is an avatar of the goth-inflected sub-tribe of death professionals. 

“It’s not like no one never took a photo of Mom in the coffin,” Ms. Doughty said. “I have pictures of my grandparents in their caskets fully embalmed. But the sense of ownership has changed. It’s not, ‘Mom is handed to the funeral parlor and they do something behind the scenes and sell the body back to you.’ Sure, you could take photos but it’s like a statue in a museum. The product of someone else’s art. My sense of why we are seeing more and more photos of these natural bodies is because the families have prepared them themselves, they’ve done a job together and they are proud of their work.” 

Ms. Doughty advises families on home death rituals and best practices, like how to keep the dead cool with packs of dry ice. “One family texted me photos as they worked, though not to say, ‘How are we doing?’ but, ‘Look how beautiful.’ I think people have this fear that Mom is going to be this otherworldly creepy thing, and then when that doesn’t happen, they want to capture it.” 

Ms. Cunningham, the funeral director in Brooklyn, recalled addressing a group of Unitarians in Albany a few years ago, and saying that she wasn’t sure she would want to be viewed, post-mortem, by her friends and family. That she would prefer to be looking her best. A nonagenarian yelled out, quite sharply, as she remembered, “‘You’ll get over that!’” 

“And that got me thinking,” Ms. Cunningham said. “Wouldn’t it be wonderful to die unfettered and free from worrying about how I look?” 

Remembrance Portraits 

Cancer patients and others with terminal illnesses have long used photos and videos, on blogs, Twitter and now TikTok, to bear witness to their suffering and make visible that which is considered off limits. They have encouraged family members and friends to do so on their behalf when they are no longer able to, pushing visual and emotional boundaries well beyond what may be considered comfortable. 

As in the Victorian era, post-mortem photographs of children have a terrible urgency and mission. Now I Lay Me Down to Sleep is an organization of volunteer photographers who make “remembrance portraits” of babies, often of the child in their parent’s arms, to assist in the grieving process. 

Oliver Wasow, a photographer, recalled the agonizing images a friend shared on Instagram and Facebook last summer of her son’s death from cancer at age 8, documenting her child’s devastating decline and then her own grief. 

It was shattering to see (“You couldn’t ‘like’ the photos,” Mr. Wasow said), but he recognized the value it had for his friend. Some people, he noted, say the difference between analog and digital photography is that digital photography is a kind of activity, while analog photographs are documents. 

“When you throw in social media, it becomes a record of a process rather than a record of a person. Yet the purpose remains the same whether it’s the 19th century or the 21st,” Mr. Wasow said. “It’s about documenting the transition from a physical body to a memory.” 

Ms. Williams’s hands, in her garden, showing the kind of detail photography she sometimes does with the dead. Credit: Audrey Kelly for The New York Times 

There are gentler ways to memorialize the process of dying than a portrait of a face with the life drained from it. Lashanna Williams, 40, a massage therapist and death doula in Seattle, has been making portraits of her dying clients, with their permission, to share with family members if they ask for them. 

She captures the area between a forefinger and thumb, or the calluses of someone’s hands. Wrinkles, she likes to say, are containers for memories and lived experience. She may take a photo of the crepey skin on an arm, or a scar, and sometimes she layers those images with photo collages made from leaves or flowers. The images are both abstract and intimate. 

The aesthetic and language of modern post-mortem photography is not all fabric shrouds and flower petals, however. Monica Torres, 42, is a desairologist (the term for hair and makeup stylists who work on the dead) and embalmer in Phoenix with a sassy Twitter handle, @Coldhandshosts. Her specialty is trauma, and she relies on conventional methods to make decedents look like themselves again. 

“I cannot create a positive, lasting memory for families without the chemicals and tools that I use,” Ms. Torres said. The families of her clients often ask her to take photos, or gather at a coffin for a selfie, she added. 

An educator, she also shares her work in vivid photos on her website. “Now that the death-positive movement is in full effect,” she said, “families are beginning to show interest, and documenting their journey through grief is a powerful tool to use toward acceptance. We want to empower families with education about what it is we actually do and how our dark art is valuable.” 

Bam Truesdale, 37, a hair and makeup stylist in Charlotte, N.C., has been preparing decedents for funeral parlors for 10 years. When his mother, Cynthia Cummings, died at 61 in 2016, he worked on her, too. As is his habit with all the people he prepares, he put earphones in his mother’s ears, and played her gospel music, though he worked in silence. 

After Mr. Truesdale had made his mother up and done her hair, pinning a white feather and rhinestone fascinator to her curls, he smoothed her dress, adjusted her stockings and picked her up, placing her gently in her coffin. 

He captured the entire process with his Android phone, though when he paused to kiss her face all over, as he used to do when she was alive, the colleague he’d brought from work to help him if he faltered took the phone from him and snapped those photos herself. Afterward, he uploaded the images to Google Drive and did not look at them again until the last week of January. 

“I started feeling emotional that day,” he said, “and something in my head told me, I think it was her, that I had never shared her like she asked me to.” When Ms. Cummings was dying, she made Mr. Truesdale promise that he would make sure no one would forget her. Mr. Truesdale said, “I was going back and forth, ‘Maybe I should? Maybe I shouldn’t? People are going to think I’m weird.’” 

It was evening when Mr. Truesdale posted his dead mother’s photos on Facebook. He awoke the next morning to find his phone lit up with thousands of comments and notifications. Many people asked if he could make the post public, so he did. By the end of the day, 25,000 people had “liked” the post, and it had been shared more than 15,000 times. 

Among the more than 4,000 comments, the most common were that Ms. Cummings looked beautiful, and that Mr. Truesdale had done a wonderful job caring for her. Strangers wrote that they wished they could have had a similar experience with their own family members. 

His three siblings thanked him, too. “They didn’t know they wanted to see the pictures,” he said. “But they did.”