Emma Watson at He For She 2014.
As Gates was working his way through the series, he stumbled upon a set of DVDs titled “Big History” — an unusual college course taught by a jovial, gesticulating professor from Australia named David Christian. Unlike the previous DVDs, “Big History” did not confine itself to any particular topic, or even to a single academic discipline. Instead, it put forward a synthesis of history, biology, chemistry, astronomy and other disparate fields, which Christian wove together into nothing less than a unifying narrative of life on earth. Standing inside a small “Mr. Rogers”-style set, flanked by an imitation ivy-covered brick wall, Christian explained to the camera that he was influenced by the Annales School, a group of early-20th-century French historians who insisted that history be explored on multiple scales of time and space. Christian had subsequently divided the history of the world into eight separate “thresholds,” beginning with the Big Bang, 13 billion years ago (Threshold 1), moving through to the origin of Homo sapiens (Threshold 6), the appearance of agriculture (Threshold 7) and, finally, the forces that gave birth to our modern world (Threshold 8).
Christian’s aim was not to offer discrete accounts of each period so much as to integrate them all into vertiginous conceptual narratives, sweeping through billions of years in the span of a single semester. A lecture on the Big Bang, for instance, offered a complete history of cosmology, starting with the ancient God-centered view of the universe and proceeding through Ptolemy’s Earth-based model, through the heliocentric versions advanced by thinkers from Copernicus to Galileo and eventually arriving at Hubble’s idea of an expanding universe. In the worldview of “Big History,” a discussion about the formation of stars cannot help including Einstein and the hydrogen bomb; a lesson on the rise of life will find its way to Jane Goodall and Dian Fossey. “I hope by the end of this course, you will also have a much better sense of the underlying unity of modern knowledge,” Christian said at the close of the first lecture. “There is a unified account.”
Christian emailed to say that he thought it was a pretty good idea. The two men began tinkering, adapting Christian’s college course into a high-school curriculum, with modules flexible enough to teach to freshmen and seniors alike. Gates, who insisted that the course include a strong digital component, hired a team of engineers and designers to develop a website that would serve as an electronic textbook, brimming with interactive graphics and videos. Gates was particularly insistent on the idea of digital timelines, which may have been a vestige of an earlier passion project, Microsoft Encarta, the electronic encyclopedia that was eventually overtaken by the growth of Wikipedia. Now he wanted to offer a multifaceted historical account of any given subject through a friendly user interface. The site, which is open to the public, would also feature a password-protected forum for teachers to trade notes and update, and in some cases rewrite, lesson plans based on their experiences in the classroom.
Or take 1940s Russia, which lost some 20 million men and 7 million women to World War II. In order to replenish the population, the state instituted an aggressive pro-natalist policy to support single mothers. Mie Nakachi, a historian at Hokkaido University, in Japan, has outlined its components: mothers were given generous subsidies and often put up in special sanatoria during pregnancy and childbirth; the state day-care system expanded to cover most children from infancy; and penalties were brandished for anyone who perpetuated the stigma against conceiving out of wedlock. In 1944, a new Family Law was passed, which essentially freed men from responsibility for illegitimate children; in effect, the state took on the role of “husband.” As a result of this policy—and of the general dearth of males—men moved at will from house to house, where they were expected to do nothing and were treated like kings; a generation of children were raised without reliable fathers, and women became the “responsible” gender. The effects of this family pattern were felt for decades after the war.
Indeed, Siberia today is suffering such an acute “man shortage” (due in part to massive rates of alcoholism) that both men and women have lobbied the Russian parliament to legalize polygamy. In 2009, The Guardian cited Russian politicians’ claims that polygamy would provide husbands for “10 million lonely women.” In endorsing polygamy, these women, particularly those in remote rural areas without running water, may be less concerned with loneliness than with something more pragmatic: help with the chores. Caroline Humphrey, a Cambridge University anthropologist who has studied the region, said women supporters believed the legalization of polygamy would be a “godsend,” giving them “rights to a man’s financial and physical support, legitimacy for their children, and rights to state benefits.”
Gloria Steinem in the 1970s. And here we are in 2014…
Vivian Gornick, Sontag’s fellow critic and contemporary (Gornick was born just two years later), tried to puzzle out something that seems to have confused Sontag, and is genuinely confusing: the relationship between solitude and romantic love. In Gornick’s 1978 collection, Essays in Feminism, she argues that, at least for women, solitude is necessary because marriage, its apparent opposite, usually gets in the way of thinking, growth, self-knowledge. In fact marriage, per Gornick, is the original, distorting expectation imposed on a woman’s life — distorting because it has been viewed, by both men and women, as the “pivotal experience of [a woman’s] psychic development,” her crowning achievement. Gornick outlines the consequences of the idea in one long, electrifying sentence: “It is this conviction, primarily, that reduces and ultimately destroys in women that flow of psychic energy that is fed in men from birth by the anxious knowledge given them that one is alone in this world; that one is never taken care of; that life is a naked battle between fear and desire, and that fear is kept in abeyance only through the recurrent surge of desire; that desire is whetted only if it is reinforced by the capacity to experience oneself; that the capacity to experience oneself is everything.” The promise of marriage is the promise of togetherness, support, safety, and this prevents a woman from taking responsibility for her own life — and therefore ultimately from “experiencing” herself — by removing the motivation behind all important action, which is the terror of aloneness. In Sontag’s envy of those writers who knew how to be alone runs a current of precisely this motivating terror. Her fear of being too much by herself fuels her desire to join the club.
For all of her skepticism of marriage, Gornick, who married and divorced twice, didn’t exactly give up on love. In “On the Progress of Feminism” she describes a friend — not a feminist, she is quick to point out — who wearily pronounces love dead. Maybe it is love, this friend says, that is keeping us from self-realization. The proposition appalls Gornick. “No,” she protests “hotly” —we need to learn to love anew. If we can stop being “in love with the ritual of love,” its tired conventions and seductive abstractions, maybe we can achieve a “free, full-hearted, eminently proportionate way of loving.” Women aren’t the only ones who suffer in marriage, but because marriage is so “damnably central” to us, we are always the more comprehensively wounded party. And because we have more to lose, “it is incumbent on us to understand that we participate in these marriages because we have no strong sense of self with which to demand and give substantial love, it is incumbent on us to make marriages that will not curtail the free, full functioning of that self.”
Is romantic love the enemy of a necessary aloneness? Or is it only through learning to be truly alone that we become capable of romantic love? Put differently, is independence a necessary precondition for any relationship or, instead, an end in itself? In Gornick we feel this dilemma being lived but not quite framed. It would be heartening to find, in her oeuvre, a woman who’d been able to do something like what she envisioned, a woman unscathed by her romantic past, in full possession of the answers to her questions, a mature and sober literary hero. We can find a likeness of this image in her work, but we can also find something more provisional and trapped, a person battling the same impulses and limitations her whole life, winning some, losing some, never arriving definitively. In Fierce Attachments (a memoir of her relationship with her mother that is as much a memoir of her relationship to romantic love), she describes being questioned by a friend about her stoicism in the face of long singledom — “You seem never to think about it,” he says to her, meaning men, or rather being without one — and as he speaks she has a vision: “I saw myself lying on a bed in late afternoon, a man’s face buried in my neck, his hand moving slowly up my thigh over my hip …” Poor Vivian, transfixed by her internal picture, is so “stunned by loss” she can’t even speak.
So much has she believed in the necessity of being alone that she refuses, until very late, to see the toll solitude has taken. The trouble, as she dryly remarks in Approaching Eye Level, is that she is a “born ideologue.” Repeatedly trumpeted, her convictions about the necessity of independence ultimately leave behind their foundational truth. In a way, she has also mistaken her own early argument, imagining solitude itself as the goal, the necessary regulating ambition, when it was rather solitude as precondition — “the anxious knowledge … that one is alone in the world” — that she’d initially lit upon as useful. In that earlier formulation, aloneness was neither survival technique nor highest spiritual aim, it was the fact one started with, and this awareness would lead not to literal seclusion but to the communicative achievements of art and love. So long has Gornick insisted upon living alone, believing that she must “face down loneliness,” that she has failed to see that she was in fact facing nothing down: “I saw that I had not learned to live alone at all. What I had learned to do was strategize; to lie down until the pain passed, to evade, to get by. I wasn’t drowning, but I wasn’t swimming either. I was floating on my back, far from shore, waiting to be saved.”
Rather than existing in some “real,” the media overlay on reality means we exist in statistical models that purport to measure reality but in fact are tautological, capable only of grasping what they have already predicted and modeled. This makes me think of Facebook’s control of your Newsfeed, which attempts to shape your conception of your social reality, of what your friends are talking about and what sorts of political ideas are “important” to them, all while injecting advertisements determined through data analysis to be the least disruptive and most persuasive. Facebook promises to entertain you, but it turns out that promise is synonymous with manufacturing demand. (Being entertained becomes no different from learning how to desire; pleasure is no longer desire fulfilled, but desire itself, the condition of desiring.)
Within that model is where power is exercised, modulating behavioral outcomes at the level of populations. (Foucault writes about this as “governmentality.”) For Baudrillard, those deindividuated populations ruled over through monitoring, statistical modeling, and predictive analytics are supposed to be “the social” — i.e. the “reality” of what the data measures, the population on which power can be exercised by what he tends to call the “system” — but they instead are becoming “the masses,” an amorphous blob of individuals that eludes certain management by its sheer inertia, which proves uninterpretable even as the system throws more resources at trying to understand what it wants or where it is headed. In In the Shadow of the Silent Majorities, Baudrillard writes,
And it is this which today turns against it [the system, or “power”]: the inertia it has fostered becomes the sign of its own death. That is why it seeks to reverse its strategies: from passivity to participation, from silence to speech. But it is too late. The threshold of the “critical mass,” that of the involution of the social through inertia, is exceeded. Everywhere the masses are encouraged to speak, they are urged to live socially, electorally, organizationally, sexually, in participation, in festival, in free speech, etc. The spectre must be exorcised, it must pronounce its name. Nothing shows more dramatically that the only genuine problem today is the silence of the mass, the silence of the silent majority.
Hence, social media, which masquerades as communication between peers but primarily functions as individuals consuming the social as isolated atoms while compiling and generating data for the system. Social media are a huge effort to prevent the masses from being silent in Baudrillard’s subversive sense — if the masses are silent, they move beyond manipulation, beyond influence, beyond desire, beyond control, beyond comprehension by the forces attempting to exercise sovereignty over them. If the masses seem to speak, as they now do in social media (and through all the other means for surveilling their everyday activities with “smart” devices), they yield the data that appears to make them manageable. They become “social” again, in the sense of being amenable to the mechanisms of social control.
But data collection only raises more questions than it answers about the populations under surveillance; as Kate Crawford explains here, the more data you have, the more crises of interpretation you confront, leading to more data collection and deeper crises. Baudrillard puts it this way:
It is a contradictory process, for information and security, in all their forms, instead of intensifying or creating the “social relation,” are on the contrary entropic processes, modalities of the end of the social. It is thought that the masses may be structured by injecting them with information, their captive social energy is believed to be released by means of information and messages (today it is no longer the institutional grid as such, rather it is the quantity of information and the degree of media exposure which measures socialization). Quite the contrary. Instead of transforming the mass into energy, information produces even more mass.
The more information about the masses we have, the more we uncover that there is to know, which makes the masses recede even further into their massive inscrutability. It turns out that the ways the system can allow us to speak, in social-media platforms and in a stream of cell-phone metadata, amount only to so much more silence.
It’s hard to tell through his multiple layers of irony, but it seems that Baudrillard thought this “implosion” process had the potential to short-circuit the imposition of power as it made the social disappear. It manifests a refusal to participate in the flexible ways control is administered not through repression but through encouraging expression, and letting people build their own jails. The more you say and interact and connect, the better you can be modeled, and the more your reality can be seamlessly shaped around you, so that control is experienced as freedom within the circumscribed matrix. This is basically The Matrix, only now, much more plausibly, the Matrix is a simulation generated by data streams harvested from phones and social media. You get out of the matrix by disappearing into the mass, by going normcore. Baudrillard argues in “The Implosion of Meaning in the Media” that “the system’s current argument is the maximization of the word and the maximal production of meaning. Thus the strategic resistance is that of a refusal of meaning and a refusal of the word — or of the hyperconformist simulation of the very mechanisms of the system, which is a form of refusal and of nonreception.”
Maybe then, the way to resist the demand to make one’s subjectivity productive for capital is to use social media in a “hyperconformist” normcore way, emptying “self-expression” of its value for social-media companies and shifting the location of selfhood elsewhere by perpetually deferring its “genuine” expression.
But what would hyperconformist use of social media look like? And how is that any different from how we might use social media naively, without any subversive intent? At times Baudrillard makes this kind of resistance seem a deliberate strategy, requiring conscious intent, but mostly he suggests that intention doesn’t matter, in part because resistance is automatic and futile at the same time. (Power’s operation generates the “masses,” which automatically resist power by absorbing its blows and growing — all efforts to measure it extend its immeasurability.) Worrying about intentionality is like worrying about authenticity in a postauthentic age.
Instead of being “true” to yourself to evade the forces of control, one trusts to the “evil genius” generated perversely by the system itself in its efforts to function smoothly. “This is what one could call the evil genius of the object, the evil genius of the masses, the evil genius of the social itself, constantly producing failure in the truth of the social and in its analysis,” Baudrillard says in “The Masses: Implosion of the Social in Media.” He posits a “radical antimetaphysics whose secret is that the masses are deeply aware that they do not have to make a decision about themselves and the world, that they do not have to wish, that they do not have to know, that they do not have to desire.” They don’t have to do anything; they don’t have even to “be themselves,” which would be a form of production, manifesting a certain consumer demand.
Instead we have desire, subjectivity, selfhood served to us, which threatens to close a feedback loop — the self Big Data is trying to capture ends up just being the one which it has already reported to us. This, Baudrillard hopes, will eventually suffocate the system, while the masses enjoy the spectacle of themselves as a kind of consumer good. Simply liking what we are told or expected to like becomes deeply subversive to a system that depends on our innovating new desires, new demand. “The deepest desire,” he argues, “is perhaps to give the responsibility for one’s desire to someone else.” This “expulsion,” as he calls it, can now show up as a surrender to the self that social-media platforms serve us; it shows up in the ways compulsive social media use (“hyperconformity” to the expectations of our sharing things on it) can effect depersonalization. Even self-expression (as I try to argue in this post) can be a way of offloading the burden of self, dismantling identity as much as building it. Self-expression can become inertial, a form of noisy silence.
“Information overload,” too, can provoke depersonalization and escape from the responsibility for identity and all the risk management that comes with having a palpable, foregrounded “personal brand.” Having a deep personality merely compounds the risks of having a self exponentially — there are so many more things one would have to be strategic about presenting and managing. Becoming “the masses” alleviates the stress that neoliberalism’s intensifying emphasis on human capital and individual resilience and flexibility generates.
The effect of having an automatic identity generated for us as our lives progress is that we can, in theory, be more fully present in the moment, not as “ourselves,” worried about the continuity of our identity, but as a consciousness skating on the surface of sensual experience, liberated from any meaning. Likewise, virality promises a similar liberation. Baudrillard claims that “the present argument of the system is to maximize speech, to maximize the production of meaning, of participation.” Social media testify to the continued vehemence of that argument. “And so the strategic resistance,” Baudrillard continues, “is that of the refusal of meaning and the refusal of speech—or of the hyperconformist simulation of the very mechanisms of the system, which is another form of refusal by overacceptance.” Virality may be considered in general as a kind of “refusal by overacceptance,” a hyperconformity. It is a way of speaking without saying anything. When something goes viral it can no longer signify anything but its virality; its original content is negated. It becomes silent in its ubiquity.
If communication has emptied itself of meaning through the intensification of the means by which it is circulated, the self is probably next. In “The Ecstasy of Communication” Baudrillard describes the condition of viral selfhood, of identity that consists of circulation, of a subjectivity that finds itself in the way it has been already simulated in advance in data. He notes the “forced extroversion of all interiority, the forced injection of all exteriority that the categorical imperative of communication literally signifies” — the kind of inescapable connectivity that sometimes gets described now as “the end of privacy” — and then points out the consequences. We are no longer estranged from the real, the misplaced fear of “digital dualists” who worry that, say, the people taking pictures with their phone aren’t allowing themselves to take part in what’s “really” happening. (This Sherry Turkle op-ed is a quintessential example of this discourse, which Nathan Jurgenson critiques here.) Instead, we are characterized by “absolute proximity, the total instantaneity of things, the feeling of no defense, no retreat.” The “always on” self is not separated from the real but helplessly immersed in it, beyond the fiction of transcending it with a walk down a Cape Cod beach. The networked self “can no longer produce the limits of his own being, can no longer play nor stage himself, can no longer produce himself as a mirror. He is now a pure screen, a switching center for all the networks of influence.” At this point the self can only signify the fact of its being connected, of being able to establish network connections. The self is a modem.
Her smartphone game is extremely popular and extremely ridiculous. And totally genius.
Kim gives an age newly obsessed with self-documentation a perfectly vacant-stared mascot. Her preferred medium, the TV shows and the sex tape notwithstanding, is the photograph. And photos are extremely good at making their subjects seen and not heard.
But Kim! Kim steadfastly refuses to sell anything but herself. Or, to be more specific, she refuses to sell anything but the image of herself. She will feign no interest in baking or knitting or meditating. She seems to have no thoughts whatsoever about kale. She seems to have no thoughts, really, about much of anything. And therein lies her particular gift. She is a human, we can safely assume—she puts her Spanx on one leg at a time—but being human is, by definition, ordinary. And Kim is, by her own definition, extraordinary. She is a person who is also A Way of Life. She is her own ecosystem. She is her own value system.
That is the premise, at least, of Kim Kardashian: Hollywood, an app that is also a game that is also, now, a phenomenon.
The game is free to download and play; but it allows—and encourages—in-app purchases. You use real-world money to win at Kim World. Which has meant, among other things, that Kim Kardashian is becoming even more explicitly what a reality star always will be, underneath it all: an entrepreneur. While she has long ranked among the highest-paid of the reality (“reality”) stars—her estimated net worth, as of this June, was $45 million—the game is on track to earn $200 million, with Kim’s 45-percent cut coming in at $90 million. So you can accuse Kim Kardashian of vanity or vapidity or any manner of metaphor, but, if you do, she will laugh all the way to the bank. And then use the ATM mirror to reapply her mascara.
In very short order, your boutique—So Chic, it is called—is visited by … Kim Kardashian. Which is not just a cool thing, but a useful thing. It is An Opportunity, and you are meant to take advantage of it. (Taking advantage is a big theme of Kim Kardashian: Hollywood.) You are given two options: stay open late and help Kim get an outfit for a party she’s attending … or refuse her. (The game, however, will not actually let you refuse. Striving, in this moral universe, means always saying yes.)
So things continue apace. Your relationship with Kim grows. (Like everything else in the game, however, Kim is merely a means to an end, and that end is Fame.) Kim, being magnanimous, invites you to a photo shoot; you go to said photo shoot; you go, afterward, to a party. And to more parties. And to more parties, each of escalating awesomeness.
There is a brute logic to all of this. There is also an economic essentialism to all of this. Every possible move in the game that is Kim Kardashian: Hollywood represents a strategic violation of the categorical imperative: The whole point is to use anything available to you—goods, money, other people—as means to your own self-furtherment. This is capitalism, essentially, stripped of its remaining niceties and applied to the unique vapidity of Hollywood social life.
Kim Kardashian: Hollywood is the game that Ayn Rand might have written, had Ayn Rand lived in the age of the smartphone and been a fan of bodycon skirts. It is what happens when objectification gives way to objectivism. "This game is so freakin stupid," iTunes customer Dmon555 complained, before giving it a 5-star rating.
In the Android store, Kim Kardashian: Hollywood is sold under the category of “Adventure.” And this is where Kim really gets the last laugh. Because the adventure being undertaken is, essentially, to become Kim Kardashian. It is to mimic her life—the striving, the posing, the stubborn conviction that fame is its own reward. And it is also, outside of the game’s ecosystem, to help Kim Kardashian to realize her own quest to become Kim Kardashian. The game equates Kardashian with Hollywood itself; it might as well have thought more grandly. Kim Kardashian: Hollywood has made its way to Stephen Colbert. And to federal agencies, with the EPA mistakenly tweeting about the game. And to the House of Representatives, where John Dingell, D-MI, was recently compelled to declare, “I have no idea who/what a Kardashian is.”
The thing is: He does now.
I really didn’t want to read anything about Kim Kardashian…but then I did.
Mark Ronson: How sampling transformed music, simply amazing.
A cache of over 60 exceptionally high quality, high resolution scans.
Amazing collection. Love.
Hon Hai Precision Industry Co., Ltd., trading as Foxconn Technology Group, is a Taiwanese multinational electronics contract manufacturing company headquartered in Tucheng, New Taipei, Taiwan. It is the world’s largest electronics contract manufacturer, and the third-largest information technology company by revenue.
Foxconn is primarily an original design manufacturer and its clients include major American, European, and Japanese electronics and information technology companies. Notable products that the company manufactures include the BlackBerry, iPad, iPhone, Kindle, PlayStation 4, Xbox One, and Wii U.
Foxconn has been involved in several controversies relating to how it manages employees in China, where it is the largest private employer.
Foxconn has 13 factories in nine Chinese cities—more than in any other country.
Foxconn’s largest factory worldwide is in Longhua, Shenzhen, where hundreds of thousands of workers (varying counts include 230,000, 300,000, and 450,000) are employed at the Longhua Science & Technology Park, a walled campus sometimes referred to as “Foxconn City”. Covering about 1.16 square miles (3 square km), it includes 15 factories, worker dormitories, a swimming pool, a fire brigade, its own television network (Foxconn TV), and a city centre with a grocery store, bank, restaurants, bookstore, and hospital. While some workers live in surrounding towns and villages, others live and work inside the complex; a quarter of the employees live in the dormitories, and many of them work up to 12 hours a day for 6 days each week. Another of Foxconn’s factory “cities” is Zhengzhou Technology Park in Zhengzhou, Henan province, where it is reported 120,000 employees work.
Foxconn has been involved in several controversies, all relating to employee grievances or treatment. Foxconn has more than a million employees. In China, it employs more people than any other private company as of 2011.
Allegations of poor working conditions have been made on several occasions. News reports highlight the long working hours, discrimination against mainland Chinese workers by their Taiwanese co-workers, and lack of working relationships at the company. Although Foxconn was found to be compliant in the majority of areas when Apple Inc. audited the maker of its iPods and iPhones in 2007, the audit did substantiate a few of the allegations.
Concerns increased in early 2012 due to a US theatrical monologue purportedly based on factual accounts of working conditions at Foxconn, but a portion of the source material was later found to be fictional. However, a 2012 audit performed by the Fair Labor Association at the request of Apple Inc. found that workers routinely received insufficient overtime pay and suggested that workplace accidents may be common.
A Hong Kong non-profit organisation, Students and Scholars Against Corporate Misbehavior, has written numerous negative reports on Foxconn’s treatment of its employees. These typically find far worse conditions than the 2012 Fair Labor Association audit did but rely on a far smaller number of employee informants (100 to 170). The Fair Labor Association audit in 2012 used interviews with 35,000 Foxconn employees.
In October 2012, the company admitted that 14-year-old children had worked for a short time at a facility in Yantai, Shandong Province. Foxconn said that the workers involved were part of an internship program. Individuals as young as 16 can legally work in China.
Also in October 2012, Foxconn threatened to cancel the Hon Hai medical support of a young worker, Zhang Tingzhen, when doctors objected to moving him from the hospital in Shenzhen for treatment in Huizhou, the city where he had originally been hired. Zhang had suffered an electrical shock, and his injuries were severe enough that doctors needed to amputate half of his brain, leaving him in no condition to travel to Huizhou. The company stated that it was acting within labor laws.
Suicides among Foxconn workers have attracted media attention: one, the high-profile death of a worker after the loss of a prototype; the other, a series of suicides linked to low pay in 2010. Suicides of Foxconn workers continued into 2012, with one in June of that year, though the rate has fallen substantially since 2010.
In reaction to a spate of worker suicides in which 14 people died in 2010, a report from 20 Chinese universities described Foxconn factories as labor camps and detailed widespread worker abuse and illegal overtime. In response to the suicides, Foxconn installed suicide-prevention netting at some facilities, and it promised to offer substantially higher wages at its Shenzhen production bases. Workers were also forced to sign a legally binding document guaranteeing they and their descendants would not sue the company as a result of unexpected death, self-injury or suicide.