A space for sharing impressions of the books and articles I've read
Articles
Podcasts
Essay Resources
Film + TV
Academic Journals
Pete Ryan
It is the job of government to prevent a tragedy of the commons. That includes the commons of shared values and norms on which democracy depends. The dominant digital platform companies, including Facebook and Google, make their profits using business models that erode this commons. They have created a haven for dangerous misinformation and hate speech that has undermined trust in democratic institutions. And it is troubling when so much information is controlled by so few companies.
What is the best way to protect and restore this public commons? Most of the proposals to change platform companies rely on either antitrust law or regulatory action. I propose a different solution. Instead of banning the current business model — in which platform companies harvest user information to sell targeted digital ads — new legislation could establish a tax that would encourage platform companies to shift toward a healthier, more traditional model.
The tax that I propose would be applied to revenue from sales of targeted digital ads, which are the key to the operation of Facebook, Google and the like. At the federal level, Congress could add it as a surcharge to the corporate income tax. At the state level, a legislature could adopt it as a type of sales tax on the revenue a company collects for displaying ads to residents of the state.
There are several advantages to using tax legislation, rather than antitrust law or regulation, as a strategy. Senator Elizabeth Warren, for example, has called for breaking up big tech companies. But the antitrust remedies that Ms. Warren and other policy experts are suggesting ask prosecutors and judges to make policy decisions best left to legislatures. Existing antitrust law in the United States addresses mainly the harm from price gouging, not the other kinds of harm caused by these platforms, such as stifling innovation and undermining the institutions of democracy.
Our digital platforms may not be too big to fail. But they are too big to trust and — despite the call by Mark Zuckerberg, Facebook’s chief executive, for new legislation and regulation — may already be too big to regulate. Powerful companies can capture or undermine a regulator. The Interstate Commerce Commission, for example, established in the 19th century, ended up serving the interests of the rail and trucking industry instead of the public. And the recent crashes of two Boeing airplanes have raised serious concern that the Federal Aviation Administration, which has a long history as an effective regulator, has been neutered by the aviation industry.
Of course, companies are incredibly clever about avoiding taxes. But in this case, that’s a good thing for all of us. This tax would spur their creativity. Ad-driven platform companies could avoid the tax entirely by switching to the business model that many digital companies already offer: an ad-free subscription. Under this model, consumers know what they give up, and the success of the business would not hinge on tracking customers with ever more sophisticated surveillance techniques. A company could succeed the old-fashioned way: by delivering a service that is worth more than it costs.
Some corporations will persist with the targeted ad model if it yields more profit, even after paying the tax. To limit the size of those businesses, the tax could be progressive, with higher rates for larger companies. This would have the added benefit of creating a corporate version of a marriage penalty. When two companies combine, their total tax bill would go up.
A progressive digital ad revenue tax would also make sure that dominant social media platforms bear the brunt of the tax. That’s important: It makes it easier for new companies to enter the market, so consumers will have more choices. A new entrant would also be less likely to be acquired if there’s a tax penalty. A large company might reduce its tax bill by breaking itself into several smaller companies. It would be up to Congress or state legislatures to decide where to place the thresholds at which higher tax rates kick in.
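(An illustrative aside: the merger-penalty mechanic is easiest to see with numbers. The short Python sketch below uses made-up brackets and rates, not figures from Romer's proposal, to show why two mid-sized ad businesses taxed separately would owe less than the same revenue taxed as one merged company.)

```python
# Illustrative sketch of a progressive tax on targeted-ad revenue.
# Brackets and rates are invented for this example, not taken from the op-ed.

BRACKETS = [          # (threshold in $ billions, marginal rate)
    (0, 0.00),        # first $5B of ad revenue untaxed
    (5, 0.10),        # 10% on ad revenue between $5B and $20B
    (20, 0.30),       # 30% on ad revenue above $20B
]

def ad_tax(revenue_bn: float) -> float:
    """Tax owed on targeted-ad revenue under the illustrative brackets above."""
    owed = 0.0
    for i, (lower, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if revenue_bn > lower:
            owed += (min(revenue_bn, upper) - lower) * rate
    return owed

# Two platforms with $15B in ad revenue each, taxed separately:
separate = ad_tax(15) + ad_tax(15)   # 2 x ($10B x 10%) = $2.0B
# The same revenue after a merger, taxed as a single $30B company:
merged = ad_tax(30)                  # $15B x 10% + $10B x 30% = $4.5B
print(f"separate: ${separate:.1f}B   merged: ${merged:.1f}B")
```

Under these assumed brackets, merging more than doubles the tax bill, which is precisely the deterrent the op-ed describes.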
If these measures aren’t enough, Congress has the power to create new laws that address specific problems. It could follow the Wall Street reforms of Dodd-Frank and define “systemically important social media platforms” that would be required to meet stringent transparency standards or be subject to a “fairness doctrine” for balanced reporting, similar to what broadcasters once faced.
From the very beginning, Americans have refused to tolerate unchecked power. We must now press our legislators to protect us from the unchecked power of dominant digital platforms. The bigger they get and the more they know, the greater the threat to our social and political way of life.
Paul Romer, a recipient of the 2018 Nobel Memorial Prize in Economic Science, advised the Department of Justice in its antitrust case against Microsoft.
A Tax That Could Fix Big Tech
Paul Romer
Illustration by Cristina Spanò
Do you think you are an above-average driver, as most people do? How do you compare with others as a parent? Are you better than most at dancing? Where do you rank in your capability to save humanity?
Many of you will answer these questions incorrectly. For some of these skills, you will think you are better than you actually are. For others, you will think you are worse.
We have long known that, for particular skills, people tend to rate themselves imperfectly. In a famous study from 1981, researchers asked people to rate their driving ability. More than 90 percent considered themselves above average.
Of course, some people who think they are above-average drivers really are. But the 90 percent statistic shows that many people inflate how they compare with others. By definition, no more than 50 percent of people can actually be above the median.
Studies like these led social scientists to conclude that people systematically exaggerate their own capabilities, that they have what researchers call “illusory superiority.”
But that’s not the whole story.
More recent studies have found examples in which people tend to underestimate their capabilities. One found that most people thought they would be worse than average at recovering from the death of a loved one. Another study reported that people thought they were worse than most at riding a unicycle. Here, they exhibit illusory inferiority.
So when are people likely to be overconfident in how they rank? And when are they underconfident?
One of us has conducted research on this topic. Spencer and his collaborators used Positly, a new platform he started that allows researchers to conduct a large number of studies rapidly. Instead of asking people where they rank on just a few skills, they asked where they ranked on 100 skills.
For each skill, participants were asked how they thought they compared with others on the platform who shared their age and gender, and lived in their area. If, on average, people thought they could outperform more than 50 percent of others at the task, that suggests systematic overconfidence. If, however, people thought they would outperform less than 50 percent, that’s evidence of underconfidence.
You Are Not as Good at Kissing as You Think. But You Are Better at Dancing.
Spencer Greenberg
⭐️⭐️⭐️⭐️
Ryan Peltier
Our modern existence depends on things we can take for granted. Cars run on gas from any gas station, the plugs for electrical devices fit into any socket, and smartphones connect to anything equipped with Bluetooth. All of these conveniences depend on technical standards, the silent and often forgotten foundations of technological societies.
The objects that surround us were designed to comply with standards. Consider the humble 8-by-8-by-16-inch concrete block, the specifications of which are defined in the Masonry Society’s “Building Code Requirements and Specification for Masonry Structures.”
This book distills centuries of knowledge about the size and thickness of blocks, seismic design requirements and the use of materials like concrete, glass and mortar. Professionals worked through committees organized by the American Concrete Institute, American Society of Civil Engineers and the Masonry Society from 1977 to 1989 to foster consensus around this single national standard.
The number of technical standards that go into some products is astonishing, and the complexity of the methods used to create these standards is perhaps even more remarkable. A 2010 study found that a laptop computer incorporates 251 standards. Companies such as I.B.M. and Microsoft created some of these standards — but only 20 percent of them. The other 80 percent of the laptop’s standards were developed by private or nongovernmental organizations that facilitate collaboration and cooperation among technical experts.
These facts should prompt some reflection about the exercise of power in a technological society: Amid concerns about the excesses of market power and government regulation, nobody seems to worry much about the private groups of experts who created 80 percent of the laptop’s standards. Standards created this way, known as the “voluntary consensus” process, are ubiquitous. They range from technologies like electrical plugs, lumber and concrete, to rules and certifications for food safety and environmental sustainability, to more personal matters such as definitions of health and disease.
The basic irony of standards is the simple fact that there is no standard way to create a standard, nor is there even a standard definition of “standard.” There are, however, longstanding ways that industries and nations coordinate standardization efforts. In the United States, the system of voluntary consensus standards is coordinated by ANSI, the American National Standards Institute.
The standards-development organizations accredited by ANSI follow a bottom-up process. It begins when someone proposes a draft standard, which then goes through a period of public comment. A panel of stakeholders and interested parties then seeks to resolve points of friction. Eventually this process, which often takes years, results in a final published standard.
ANSI was first known as the American Engineering Standards Committee, which was created to address rampant incompatibility throughout American industry. (It was eventually reconstituted in 1966 and took on the name ANSI in 1969.) Its founders came from engineering organizations and departments of the federal government that all published their own standards, which were of limited value because they varied from group to group. The consequences could be catastrophic, as with the 1904 fire that destroyed much of downtown Baltimore: Buildings could have been saved if fire departments from neighboring cities had hoses that fit Baltimore’s fire hydrants.
The Engineering Standards Committee’s solution to technical incompatibility was to get organized. At its first meeting, in 1918, it created a process where people could work out the details of technical specifications and agree to carry them out. The structure of the standardization panels balanced producers and consumers — that is, makers and users of technologies — so that no single company could dictate the outcome. This method incorporated advice from British engineers, who had created a similar organization a decade earlier, and reflected lessons from World War I, where cooperation among engineers led to technological and humanitarian accomplishments. During the war, Herbert Hoover, who was then head of the United States Food Administration, coordinated farming and business interests, as well as the automobile, railroad and shipping industries, to provide food for America and its allies.
The consensus-driven approach had two clear advantages over existing alternatives. First, a forum to bring “all interested parties” into alignment reduced duplication and wasted effort. Second, the absence of state coercion to enforce standards meant that engineers and executives had strong incentives to resolve conflicts before they published a standard, lest they face government intervention down the road. The success of their efforts would depend on the voluntary adoption of standards. The consensus process was well suited for a society where technological and economic progress was within reach — all it needed to do was to find ways to cooperate.
This was no easy task. The standards committee’s longtime secretary, Paul Gough Agnew, looked back wearily at the “endless discussions” that set the stage for the first meeting in 1918. One of the group’s founders, the electrical engineer Comfort Adams, observed, “Fear and jealousy, as well as ignorance, were the chief obstacles which had to be overcome.”
Over the past century, standardization has expanded immensely. Today, private transnational organizations create and revise thousands of consensus standards every year.
Although the voluntary consensus method has been effective, it has never been perfect. For example, “consensus” is often a euphemism. Nasty disagreements can derail the process. Companies that agree on standards one day turn around and sue one another the next. In some industries, companies can make fortunes by defying established standards — think of innovative products from Apple, or bold designs from the leaders of the fashion industry. Standardization also creates losers, and it can be very costly to invest in the losing side of “standards wars” like VHS versus Betamax, or Blu-ray versus HD DVD.
The Joy of Standards
Andrew Russell
Lee Vinsel
⭐️⭐️⭐️⭐️⭐️
The Netflix anthology’s copious use of slo-mo gets a lot of grief, but it is remarkably effective in telegraphing the experiences of taste and touch.
In his 1980 film, “Sauve Qui Peut (La Vie),” a masterful and melancholic study of sex and romance, Jean-Luc Godard fell in love with slow motion: reducing his film speed frame by frame, he extended moments of flickering, rapid movement—a character flying down the street on a bicycle, or leaping over a table to tackle a lover—to an almost abstracted stillness, turning actions into objects, verbs into nouns. The technique was so essential to “Sauve Qui Peut (La Vie)” that, when it was released in the United Kingdom, it was given the title “Slow Motion.” (The U.S. title, “Every Man for Himself,” is translated from the French.) “You use slow motion in a way I find unusual,” Dick Cavett told Godard when the auteur appeared on Cavett’s talk show to promote the film. “How would it have been any different if you had just decided to have that at the normal speed?” “This is precisely the point,” Godard replied, chain-smoking in a tweed sport coat and tinted glasses. He relied on slow motion, he added, “when, at the normal speed, it was not possible to see things—or at least to indicate possibility . . . that there is something different to be seen.”
I thought of this exchange recently while settling in to watch the recently released sixth season of “Chef’s Table,” the Netflix anthology series that, since its début, in 2015, has devoted thirty episodes to limning the lives and skills of the world’s most celebrated restaurant cooks. Each installment of the show is a self-contained portrait, tracing the arc of an individual’s life from hungry mortal to culinary icon: childhood influences, adolescent traumas, the triumphs and pitfalls of single-minded ambition. It would be wrong to call each episode a profile, or even a biography, as “Chef’s Table” is willfully uncritical of its subjects, treating each chef with a reverence that verges on the religious. The show’s creator-producer, David Gelb, achieves this hallowed mood by leaning heavily on a handful of cinematic techniques: symmetrically framed shots, soaring classical music, and so much slow motion that one suspects the hour-long episodes, cranked up to normal speed, might be over and done in fifteen minutes.
“Chef’s Table” ’s copious slow motion gets a lot of grief—“lazy emotional steroids,” the writer Joshua David Stein complained, recapping the second season for Eater. The show’s style lends itself, with almost obscene efficiency, to parody. In a comic interlude of the series “Ugly Delicious,” for instance, the chef David Chang is slicing eggplant in slow motion to a soundtrack of Vivaldi’s “Four Seasons” (the same piece used for “Chef’s Table” ’s opening sequence) when the knife slips, and the camera shifts its embrace from the artful placement of garnishes to balletic arcs of spraying blood. The comedy series “Documentary Now!” features a glorious episode called “Juan Likes Rice and Chicken,” a spoof on Gelb’s 2011 film, “Jiro Dreams of Sushi,” that features chickens being air-cannoned at walls and bananas being sliced lengthwise with agonizing eroticism. What’s mockable, in the end, is less the slow motion or the string-heavy adagios than the tendency to veer from eloquent to vainglorious and back again.
Even when one knows all of its tricks, “Chef’s Table” enraptures. Like “Jiro Dreams of Sushi,” where Gelb established this visual language, the show is an exercise in sensory translation, telegraphing the experiences of taste and touch within the limited visual transmission of film. The staccato actions of chopping or slicing, slowed down, turn into studies in texture and form: the contrast of metal against wood; grains of rice like tumbling leaves; dense threads of piscine muscle catching and dropping the light; sauces that drape like cloth and rest, glossy and opaque, before surface tension collapses. (A confession: I often watch “Chef’s Table” with the sound off, sans captions—no story, please, just the porn.)
I spoke with Gelb a few weeks ago, in the course of reporting a profile of Niki Nakayama, a Los Angeles chef who was featured on the first season of “Chef’s Table.” Gelb explained to me that the show makes use of a combination of shooting techniques, which involves not only increasing the number of frames shot per second (which, when played at a normal rate, creates the slow-motion effect) but also exposing each frame for an abbreviated length of time. The result is an uncommonly razor-crisp series of images, instead of—as in Godard’s film, which was slowed from a standard frame rate—a dreamlike blur. Closed-shutter shooting is often used to capture and aestheticize complex action, like live sports or extravagant violence. Gelb told me that he’d been inspired to apply the technique to food after seeing “Saving Private Ryan,” particularly the opening scene on Normandy Beach. “It had this incredible energy,” he said. “When a mortar would blow up on the beach, you’d see every grain of sand from the explosion—the violence was incredibly sharp and articulated, yet it wasn’t sped up at all.”
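(A rough sketch of the arithmetic behind this technique, using assumed numbers rather than anything reported about the show: shooting far more frames per second than the playback rate stretches time, while a narrow shutter angle keeps each frame's exposure brief and the image crisp.)

```python
# Back-of-the-envelope numbers for high-frame-rate, short-exposure shooting.
# The capture rate and shutter angle here are assumptions for illustration,
# not production details from "Chef's Table".

capture_fps = 120      # frames recorded per second (assumed)
playback_fps = 24      # standard playback rate
shutter_angle = 45     # narrow shutter angle -> brief exposure per frame (assumed)

slowdown = capture_fps / playback_fps                  # 120 / 24 = 5x slow motion
exposure_s = (shutter_angle / 360) / capture_fps       # (45/360) / 120 ≈ 1/960 s

print(f"slow-motion factor: {slowdown:.0f}x")
print(f"exposure per frame: about 1/{round(1 / exposure_s)} s (little motion blur)")
```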
Like a bloody battlefield, food filmed in slow motion is both aestheticized and unnervingly visceral. Flesh looks even more like flesh. A lobe of foie gras in a hot pan sweats out slow pearls of fat. Pink chickens trussed and hung over an open flame swing in the smoke like condemned prisoners. It’s strange and grotesque and riveting to look at food like this: how a knife moving through fruit creates a weeping wound, or the way pasta dough stretches and glistens like skin. “It can be a jab or a caress,” Godard explained to Cavett in their conversation, noting the limitations of observing at the speed of reality. “It would be too sentimental or too violent, and it had to be both together.”
Helen Rosner is The New Yorker’s roving food correspondent. In 2016, she won the James Beard award for personal-essay writing.
Chef's Table
⭐️⭐️⭐️⭐️⭐️
Pierre Mornet
No body of writing has engendered more other bodies of writing than the Bible, but the Brontë corpus comes alarmingly close. “Since 1857, when Elizabeth Gaskell published her famous Life of Charlotte Brontë, hardly a year has gone by without some form of biographical material on the Brontës appearing—from articles in newspapers to full-length lives, from images on tea towels to plays, films, and novelizations,” wrote Lucasta Miller in The Brontë Myth, her 2001 history of Brontëmania. This year the Brontë literary-industrial complex celebrates the bicentennial of Charlotte’s birth, and British and American publishers have been especially busy. In the U.S., there is a new Charlotte Brontë biography by Claire Harman; a Brontë-themed literary detective novel; a novelistic riff on Jane Eyre whose heroine is a serial killer; a collection of short stories inspired by that novel’s famous line, “Reader, I married him”; and a fan-fiction-style “autobiography” of Nelly Dean, the servant-narrator of Wuthering Heights. Last year’s highlights included a young-adult novelization of Emily’s adolescence and a book of insightful essays called The Brontë Cabinet: Three Lives in Nine Objects, which uses items belonging to Charlotte, Emily, and Anne as wormholes to the 19th century and the lost texture of their existence. Don’t ask me to list the monographs.
I see no reason not to consider the Brontë cult a religion. What are Peoples of the Book, after all, if not irrepressible embroiderers of fetishized texts? The Jews have a word for the feverish imaginings that run like bright threads through their Torah commentaries: midrash, the spinning of gloriously weird backstories or fairy tales prompted by gaps or contradictions in the narratives. Midrash isn’t just a Jewish hermeneutic, by the way. You could call the Gospels a midrash on the Hebrew Bible, the lives of the saints a midrash on the Christ story, the Koran a midrash on all of the above.
Some Brontë fans—reader, I’m one of them—would happily work through stacks of Brontë midrash in search of answers to the mysterium tremendum, the awesome mystery, of the Brontës’ improbable sainthood. How did a poor and socially awkward ex-governess named Charlotte and her even more awkward sister, Emily, who kept house for their father in a parsonage on a Yorkshire moor far from the literary circles of London, come to write novels and poems that outshone nearly every other 19th-century British novel and poem by dint of being more alive? In an essay on Jane Eyre and Wuthering Heights published in 1925, Virginia Woolf bears witness to this miracle:
As we open Jane Eyre once more we cannot stifle the suspicion that we shall find her world of imagination as antiquated, mid-Victorian, and out of date as the parsonage on the moor, a place only to be visited by the curious, only preserved by the pious. So we open Jane Eyre; and in two pages every doubt is swept clean from our minds.
If Charlotte’s novels keep up a stiff wind, Emily’s one novel, Wuthering Heights, is a thunderstorm. Her characters, even the ghosts, Woolf writes, have “such a gust of life that they transcend reality.” (Like most readers, Woolf ignores the youngest Brontë sister, Anne, a lesser novelist and poet, and the Brontë brother, Branwell, a failed poet and artist turned alcoholic.) And just think, Woolf went on to write in a more famous essay, A Room of One’s Own, what Charlotte might have produced had Victorian mores not corseted her potential.
Woolf seizes on a passage in Jane Eyre in which she believes she hears Charlotte breaking out of Jane’s voice to lecture the reader about women’s exclusion from the “busy world” and “practical experience,” and to lament the confinement of their talents “to making puddings and knitting stockings, to playing on the piano and embroidering bags.” According to Woolf, this shows that Charlotte’s imagination, however bold, is also constricted—that she “will never get her genius expressed whole and entire. Her books will be deformed and twisted. She will write in a rage where she should write calmly.” Charlotte’s writing would have been even better, Woolf says, had she “possessed say three hundred [pounds] a year.”
But Woolf gets it exactly wrong, thereby missing what makes the Brontë story so satisfying. The sisters’ social and economic disadvantages didn’t hold them back. Charlotte and Emily explored—and exploited—the prison-house of gender with unprecedented clear-sightedness. It so happens that the sisters had a good deal of “practical experience,” and they didn’t like it one bit. Pushed out into the world, they came home as fast as they could, and in their retreat from society found the autonomy to cultivate their altogether original voices. Those forays into the marketplace of female labor, though, gave them their best material.
The Brontë sisters were women of their class and time—educated, impoverished, likely destined to spinsterhood—although with a twist. Their childhood was sui generis. Motherless since they were very young, the Brontës enjoyed the benign neglect of their busy father and made the most of their freedom to develop elaborate fantasy worlds. They read everything they could; spent long afternoons on the moor that began at their back door; invented exotic kingdoms with voluminous histories and political intrigues; put on plays only they would see; issued magazines only they would read; and sewed novels and poems into miniature books written in script so tiny that no adult in the household could decipher them. Nonetheless, since their aging father occupied his parsonage on the sufferance of a quarrelsome congregation, they lacked security and had to find a profession. That could only mean, for the Brontës, becoming governesses or teachers of the children of the gentry.
The Brontës' Secret
Judith Shulevitz
⭐️⭐️⭐️
The actress Joan Fontaine as Jane Eyre in the 1943 film (20th Century Fox)
Consider the selfie. By now, it’s a fairly mundane artistic tradition, even after a profusion of thinkpieces have wrestled with its rise thanks to the so-called Me Generation’s “obsession” with social media. Anyone in possession of a cheap camera phone or laptop can take a picture of themselves, edit it (or not), and share it with the world in a matter of seconds.
But before the selfie came “the self,” or the fairly modern concept of the independent “individual.” The now-ubiquitous selfie expresses in miniature the seismic conceptual shift that came about centuries ago, spurred in part by advances in printing technology and new ways of thinking in philosophy. It’s not that the self didn’t exist in pre-modern cultures: Rather, the emphasis the Protestant Reformation in the 16th century placed on personal will, conscience, and understanding—rather than tradition and authority—in matters of faith spilled over the bounds of religious experience into all of life. Perhaps the first novel to best express the modern idea of the self was Jane Eyre, written in 1847 by Charlotte Brontë, born 200 years ago this year.
Those who remember Jane Eyre solely as required reading in high-school English class likely recall most vividly its over-the-top Gothic tropes: a childhood banishment to a death-haunted room, a mysterious presence in the attic, a Byronic hero, and a cold mansion going up in flames. It seems more like the stuff of Lifetime television than of revolutions. But as unbelievable as many of the events of the novel are, even today, Brontë’s biggest accomplishment wasn’t in plot devices. It was the narrative voice of Jane—who so openly expressed her desire for identity, definition, meaning, and agency—that rang powerfully true to its 19th-century audience. In fact, many early readers mistakenly believed Jane Eyre was a true account (in a clever marketing scheme, the novel was subtitled, “An Autobiography”), perhaps a validation of her character’s authenticity.
The way that novels paid attention to the particularities of human experience (rather than the universals of the older epics and romances) made them the ideal vehicle to shape how readers understood the modern individual. The rise of the literary form was made possible by the technology of the printing press, the print culture that followed, and the widening literacy that was cultivated for centuries until Jane Eyre’s publication. The novel seemed perfectly designed to tell Brontë’s first-person narrative of a destitute orphan girl searching for a secure identity—first among an unloving family, then an austere charity school, and finally with the wealthy but unattainable employer she loves. Unable to find her sense of self through others, Jane makes the surprising decision to turn inward.
The broader cultural implications of the story—its insistence on the value of conscience and will—were such that one critic fretted some years after its publication that the “most alarming revolution of modern times has followed the invasion of Jane Eyre.” Before the Reformation and the Enlightenment that followed, before René Descartes’s cogito ergo sum (“I think, therefore I am”), when the sources of authority were external and objective, the aspects of the self so central to today’s understanding mattered little because they didn’t really affect the course of an individual’s life. The Reformation empowered believers to read and interpret the scriptures for themselves, rather than relying on the help of clergy; by extension, this seemed to give people permission to read and interpret their own interior world.
To be sure, early novelists before Brontë such as Frances Burney, Daniel Defoe, Samuel Richardson, and Mary Shelley contributed to the form’s developing art of the first-person narrator. But these authors used the contrivances of edited letters or memoirs, devices that tended toward underdeveloped characters, episodic plots, and a general sense of artificiality—even when the stories were presented not as fiction but “histories.” No earlier novelist had provided a voice so seemingly pure, so fully belonging to the character, as Brontë. She developed her art alongside her sisters, the novelists Anne and Emily (all of them publishing under gender-neutral pseudonyms), but it was Charlotte whose work best captured the sense of the modern individual. Anne Brontë’s novels Agnes Grey and The Tenant of Wildfell Hall contributed to the novel’s ability to offer social commentary and criticism, while the Romantic sensibilities of Emily Brontë’s Wuthering Heights explored how the “other,” in the form of the dark, unpredictable Heathcliff, can threaten the integrity of the self.
One of the greatest testimonies to Brontë’s accomplishment came from Virginia Woolf, a modernist pioneer who represents a world far removed from that of Brontë’s Victorianism. “As we open Jane Eyre once more,” a doubting Woolf wrote in The Common Reader, “we cannot stifle the suspicion that we shall find her world of imagination as antiquated, mid-Victorian, and out of date as the parsonage on the moor, a place only to be visited by the curious, only preserved by the pious.” Woolf continues, “So we open Jane Eyre; and in two pages every doubt is swept clean from our minds.” There is nothing of the book, Woolf declares, “except Jane Eyre.” Jane’s voice is the source of the power the book has to absorb the reader completely into her world. Woolf explains how Brontë depicts:
… an overpowering personality, so that, as we say in real life, they have only to open the door to make themselves felt. There is in them some untamed ferocity perpetually at war with the accepted order of things which makes them desire to create instantly rather than to observe patiently.
It is exactly this willingness—desire, even—to be “at war with the accepted order of things” that characterizes the modern self. While we now take such a sense for granted, it was, as Brontë’s contemporaries rightly understood, radical in her day. More disturbing to Brontë’s Victorian readers than the sheer sensuality of the story and Jane’s deep passion was “the heroine’s refusal to submit to her social destiny,” as the literary critic Sandra M. Gilbert explains. Indeed, one contemporary review complained, “It is true Jane does right, and exerts great moral strength,” but the critic continues that “it is the strength of a mere heathen mind which is a law unto itself.” In presenting such a character, the reviewer worries, Brontë has “overthrown authority” and cultivated “rebellion.” And in a way they were right: “I resisted all the way,” Jane says as she is dragged by her cruel aunt toward banishment in the bedroom where her late uncle died. This sentence, Joyce Carol Oates argues, serves as the theme of Jane’s whole story.
But Jane’s resistance is not the empty rebellion of nihilism or self-absorption (consider how current practitioners of “selfie culture” frequently weather accusations of narcissism). Rather, her quest for her true self peels back the stiff layers of conventionality in order to discover genuine morality and faith. As Brontë explains in the preface to the novel’s second edition (a preface necessitated by the moral outrage that followed the novel’s publication),
Conventionality is not morality. Self-righteousness is not religion. To attack the first is not to assail the last … These things and deeds are diametrically opposed: they are as distinct as is vice from virtue. Men too often confound them: they should not be confounded: appearance should not be mistaken for truth; narrow human doctrines, that only tend to elate and magnify a few, should not be substituted for the world.
In a letter to a friend, Brontë responded to her critics’ objections by declaring, “Unless I have the courage to use the language of Truth in preference to the jargon of Conventionality, I ought to be silent ...”
The refusal of such a woman, who lived in such a time, to be silent created a new mold for the self—one apparent not only in today’s Instagram photos, but also more importantly in the collective modern sense that a person’s inner life can allow her to effect change from the inside out.
Jane Eyre and the Invention of Self
Karen Swallow Prior
⭐️⭐️
Authors: Kostas Metaxiotis, John Psarras
Abstract: Managing large amounts of information and efficiently using this information in improved decision making has become increasingly challenging as businesses collect terabytes of data. Intelligent solutions, based on neural networks (NNs) and genetic algorithms (GAs), to solve complicated practical problems in various sectors are becoming more and more widespread nowadays. The current study provides an overview for the operations researcher of the neural networks and genetic algorithms methodology, as well as their historical and current use in business. The main aim is to present and focus on the wide range of business areas of NN and GA applications, avoiding an in-depth analysis of all the applications – with varying success – recorded in the literature. This review reveals that, although still regarded as a novel methodology, NN and GA are shown to have matured to the point of offering real practical benefits in many of their applications.
Type: Literature review
Publisher: Emerald Group Publishing Limited
Citation: Kostas Metaxiotis and John Psarras (2004), "The contribution of neural networks and genetic algorithms to business decision support: Academic myth or practical solution?", Management Decision, Vol. 42, No. 2, pp. 229-242. https://doi.org/10.1108/00251740410518534
The contribution of neural networks and genetic algorithms to business decision support
⭐️⭐️⭐️⭐️