The Human Fabric of the Web

Sketching out the social structure of a large company such as Facebook is an important task, not only for understanding the impact of such a global internet phenomenon on society, on local and global economies, and on civil freedoms, but also for understanding how the development of high-end technology and communication infrastructures intertwines with the accumulation of capital and political power. Even though the world is at the point of post-global development (a point where the global has already been reached and the new local is what the market needs), it is the deep embeddedness of the company in the economic, political and social elite of one society that makes it strong enough to act globally – not, as is often thought, the cooperation of elites around the world. As our investigation shows, the real fabric of the web consists of the personal social networks of specific people in the higher strata of the company. This, if anything other than profit, is what keeps the whole structure together and safe from any change in the political establishment.

Neither intelligence nor education can stop you from forming prejudiced opinions – but an inquisitive attitude may help you make wiser judgements. – Tom Stafford

The political lens
There is now a mountain of evidence to show that politics doesn’t just help predict people’s views on some scientific issues; it also affects how they interpret new information. This is why it is a mistake to think that you can somehow ‘correct’ people’s views on an issue by giving them more facts, since study after study has shown that people have a tendency to selectively reject facts that don’t fit with their existing views.

Other research shows that people with the most education, the highest mathematical abilities, and the strongest tendencies to be reflective about their beliefs are the most likely to resist information that contradicts their prejudices. This undermines the simplistic assumption that prejudices are the result of too much gut instinct and not enough deep thought. Rather, people who have the facility for deeper thought about an issue can use those cognitive powers to justify what they already believe, and to find reasons to dismiss apparently contrary evidence.

It’s a messy picture, and at first looks like a depressing one for those who care about science and reason. A glimmer of hope can be found in new research from a collaborative team of philosophers, film-makers and psychologists led by Dan Kahan of Yale University. Kahan and his team were interested in politically biased information processing, but also in studying the audience for scientific documentaries and using this research to help film-makers. They developed two scales. The first measured a person’s scientific background: a fairly standard set of questions asking about knowledge of basic scientific facts and methods, as well as quantitative judgement and reasoning. The second scale was more innovative, aiming to measure something related but independent – a person’s curiosity about scientific issues, not how much they already knew. The way scientific curiosity was measured was also novel: as well as asking questions, the researchers gave people choices about what material to read as part of a survey about reactions to news. If an individual chose to read science stories rather than sport or politics, their science curiosity score was marked up.

Armed with their scales, the team then set out to see how they predicted people’s opinions on public issues which should be informed by science. With the scientific knowledge scale, the results were depressingly predictable: higher levels of scientific education produced greater polarisation between the groups, not less.

So much for scientific background; scientific curiosity, however, showed a different pattern. The team confirmed this using an experiment which gave participants a choice of science stories, either in line with their existing beliefs or surprising to them. Participants who were high in scientific curiosity defied the predictions and selected stories which contradicted their existing beliefs. The lesson for educators seeking to promote a greater understanding of public issues is that it is as important to convey the excitement of science and the pleasure of finding things out as it is to teach a basic curriculum of facts.

A recent study published in Science Advances suggests that “envious” is the most common personality type. A computer algorithm classified people based on their behavior in hundreds of social dilemma scenarios and found that the majority could be categorized into four basic personality types: optimistic, pessimistic, trustful and envious. Thirty percent of the people were rated as envious.

“These subjects seem to behave as driven by envy, status-seeking consideration, or lack of trust,” the researchers wrote in their paper. “These players prevent their counterparts from receiving more payoff than themselves even when, by doing so, they diminish their own potential payoff.”

Each of the other three personality types ― pessimistic, optimistic and trustful ― described about 20 percent of people. The last 10 percent behaved so erratically that the computer program failed to categorize them. The four categories describe the types of behaviors that people show in a social context, where they have to interact with others.

The new study analyzed the responses of more than 500 volunteers to hundreds of hypothetical dilemmas in which people could either cooperate with their teammate or act in their own self-interest. But as these new findings show, it’s not just rationality or cooperative spirit that determines what humans end up doing. Their own personalities, too, play a part.
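As an illustration only, the kind of data-driven grouping described above can be sketched with a minimal clustering routine. The behavioural profiles and the choice of a k-means approach below are assumptions for demonstration, not the study's actual data or method:

```python
import random
from statistics import mean

# Invented behavioural profiles: (cooperation rate, rate of "spiteful"
# defection, i.e. defecting even when cooperating would pay more).
# The real study clustered responses to hundreds of hypothetical dilemmas.
profiles = [
    (0.90, 0.05), (0.85, 0.10), (0.80, 0.10),  # broadly cooperative players
    (0.20, 0.80), (0.25, 0.75), (0.15, 0.85),  # broadly "envious" players
]

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: repeatedly assign each point to its nearest
    centroid, then move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
            clusters[nearest].append(p)
        # Recompute each centroid; keep the old one if its cluster is empty.
        centroids = [
            tuple(mean(dim) for dim in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans(profiles, 2)
print(sorted(len(cl) for cl in clusters))
```

On this toy data the two recovered clusters separate the cooperative from the spiteful profiles without any labels being supplied, which is the essence of letting an algorithm, rather than a predefined typology, decide how many behavioural types the data contains.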

“The results go against certain theories; the one which states that humans act purely rationally for example,” study co-author Yamir Moreno of the University of Zaragoza in Spain said in a press release.

The envious 30 percent failed to cooperate just because they couldn’t stand the thought of potentially being left with a lower payoff than their teammate received. “This points to the difficulty of making people understand when they face a nondilemmatic, win-win situation,” the researchers wrote.

Although the games in the study offer hypothetical scenarios, they resemble many real-life interactions. Imagine that you are partnered with a co-worker on a special project. To achieve spectacular results, you both need to work hard. But it’s not guaranteed that the two of you will be given equal credit for the success of the project. So you might choose to do just the minimum, which certainly prevents your partner from getting unearned praise, but also deprives you of the credit that could come with the best results.

At the heart of this seminal work is the revolutionary idea that human
consciousness did not begin far back in animal evolution but was a learned
process that emerged, through cataclysm and catastrophe, from a hallucinatory
mentality only three thousand years ago and that is still developing.

The implications of this scientific paradigm extend into virtually every
aspect of our psychology, our history, our culture, our religion — indeed
our future. In the words of one reviewer, it is “a humbling text, the kind
that reminds most of us who make our livings through thinking, how
much thinking there is left to do.”

“When Julian Jaynes . . . speculates that until late in the second millennium
B.C. men had no consciousness but were automatically obeying the voices
of gods, we are astounded but compelled to follow this remarkable thesis
through all the corroborative evidence.” — John Updike, The New Yorker

“This book and this man’s ideas may be the most influential, not to say
controversial, of the second half of the twentieth century. It renders whole
shelves of books obsolete.” — William Harrington, Columbus Dispatch

“Having just finished The Origin of Consciousness, I myself feel something
like Keats’ Cortez staring at the Pacific, or at least like the early reviewers of Darwin or Freud. I’m not quite sure what to make of this new territory; but its expanse lies before me and I am startled by its power.” — Edward Profitt, Commonweal

“He is as startling as Freud was in The Interpretation of Dreams, and Jaynes
is equally adept at forcing a new view of known human behavior.” — Raymond Headlee, American Journal of Psychiatry

“The weight of original thought in [this book] is so great that it makes
me uneasy for the author’s well-being: the human mind is not built to
support such a burden.” — D. C. Stove, Encounter

Meet the scientific prophets who claim we are on the verge of creating a new type of human - human v2.0.

It's predicted that by 2029 computer intelligence will equal the power of the human brain. Some believe this will revolutionise humanity - we will be able to download our minds to computers, extending our lives indefinitely. Others fear it will lead to oblivion, giving rise to destructive ultra-intelligent machines.

One thing they all agree on is that the coming of this moment - and whatever it brings - is inevitable.

This short film was directed by the French animation collective H5 (François Alaux, Hervé de Crécy and Ludovic Houplain). It was presented at the 2009 Cannes Film Festival, opened the 2010 Sundance Film Festival, and won the 2010 Academy Award for animated short film.
The film uses two pieces of licensed, pre-existing music, one at the beginning and one at the end: the opening track is Dean Martin's "Good Morning Life" and the closing track is The Ink Spots' "I Don't Want to Set the World on Fire". All other music and sound design are original, by Human (humanworldwide.com).

Background music in shops - disparagingly referred to as "muzak" - has been shown to have an effect on our buying habits.

Shops and restaurants can use music "to target those effects that are most likely to increase sales in a given business", says Adrian North, professor of music psychology at Australia's Curtin University in Perth.

His own research, carried out at Softley's restaurant in Market Bosworth, Leicestershire, suggested diners spent an average of £2 a head more when listening to classical rather than pop music.

A similar experiment suggested that perceptions of taste altered according to the music played: "mellow and soft" music made the wine taste "mellow and soft", while "powerful and heavy" sounds made the wine more likely to be described as having flavours that were "powerful and heavy".

What is muzak?
Muzak started in the 1920s, when General George Squier patented the process of transmitting music over electrical lines. The name is a combination of "music" and "Kodak", Squier's favourite hi-tech firm. It became known as "elevator music" because of its early use in skyscrapers to calm people's nerves (when elevators were still new and unfamiliar). In the 1940s, it was used as a musical way of relaxing workers, with the aim of improving productivity.

Muzak has a reputation as generic, light music, barely noticeable to customers - setting a calm ambience in which to buy things. But the Muzak company that gave rise to the term, taken over by Canadian firm Mood Media in 2011, tailors its playlists to target specific groups of customers, as do several large rivals.

In his BBC Reith Lectures 10 years ago, conductor and pianist Daniel Barenboim complained that background music was undermining "active" listening, creating a generation of people who could no longer concentrate properly on music.

Research at Rutgers University at about the same time suggested music played in shops had no discernible impact on customers' stated mood. But, while it did encourage higher levels of spending among impulse buyers, "contemplative" shoppers actually spent less.

Defenders of in-store sound say it is no more manipulative than other aspects of store design and management, such as layout, decoration and product presentation.

North says there's evidence that playing Edith Piaf's songs encourages people to buy French wine, rather than South African. "And we know that classical music can drive sales of more expensive products," he says, "whereas country [music] drives sales of utilitarian products."

A few years ago, music writer Paul Stokes attempted to log the songs he heard while shopping in London, but gave up because "it was all either faceless audio wallpaper, or the same pop hits of the day produced by Pharrell Williams".

"Experts say you should only notice one track in three," he says. "The rest of the time it was there to provide a sense of comfort and calm."

Tell someone they have to answer the following questions as quickly as possible:

What's one plus four?
What's five plus two?
What's seven take away three?
Name a vegetable.

Nine times out of 10 people answer the last question with “Carrot”.

What is happening is that, for most people, most of the time, in all sorts of circumstances, carrot is simply the first vegetable that comes to mind.

This seemingly banal fact reveals something about how our minds organise information. There are dozens of vegetables, and depending on your love of fresh food you might recognise a good proportion. If you had to list them you’d probably forget a few you know, easily reaching a dozen and then slowing down. And when you’re pressured to name just one as quickly as possible, you forget even more and just reach for the most obvious vegetable you can think of – and often that’s a carrot.

In cognitive science, we say the carrot is “prototypical” – for our idea of a vegetable, it occupies the centre of the web of associations which defines the concept. You can test prototypicality directly by timing how long it takes someone to answer whether the object in question belongs to a particular category. We take longer to answer “yes” if asked “is a penguin a bird?” than if asked “is a robin a bird?”, for instance. Even when we know penguins are birds, the idea of penguins takes longer to connect to the category “bird” than more typical species.
So, something about our experience of school dinners, being told they’ll help us see in the dark, the 37 million tons of carrots the world consumes each year, and cartoon characters from Bugs Bunny to Olaf the Snowman, has helped carrots work their way into our minds as the prime example of a vegetable.
The benefit to this system of mental organisation is that the ideas which are most likely to be associated are also the ones which spring to mind when you need them. Life would be impossible without them.
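The category-verification test described above lends itself to a simple scoring sketch. The stimuli and response times below are invented for illustration; a real experiment would record them from participants:

```python
from statistics import mean

# Invented verification times (seconds) for "is X a bird?" judgements.
# The prototypicality effect predicts faster "yes" responses for typical
# exemplars (robin, sparrow) than for atypical ones (penguin, ostrich).
trials = [
    ("robin", "typical", 0.61),
    ("sparrow", "typical", 0.64),
    ("penguin", "atypical", 0.88),
    ("ostrich", "atypical", 0.92),
]

def mean_rt(group):
    """Average response time for one typicality group."""
    return mean(rt for _, g, rt in trials if g == group)

typical_rt = mean_rt("typical")
atypical_rt = mean_rt("atypical")
print(f"typical: {typical_rt:.2f}s, atypical: {atypical_rt:.2f}s")
```

A gap between the two averages, with typical exemplars verified faster, is what marks the carrot-like items out as occupying the centre of the category's web of associations.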

Having a mind which supplies ready answers based on association is better than a mind which never supplies ready answers, but it can also produce blunders that are much more damaging than claiming cows drink milk. Every time we assume the doctor is a man and the nurse is a woman, we’re falling victim to the ready answers of our mental prototypes of those professions. Such prototypes, however mistaken, may also underlie our readiness to assume a man will be a better CEO, or that a philosophy professor won’t be a woman. If you let them dictate how the world should be, rather than what it might be, you get into trouble pretty quickly.

Advertisers know the power of prototypes too, of course, which is why so much advertising appears to be style over substance. Their job isn’t to deliver a persuasive message, as such. They don’t want you to actively believe anything about their product being provably fun, tasty or healthy. Instead, they just want fun, taste or health to spring to mind when you think of their product (and the reverse). Worming their way into our mental associations is worth billions of dollars to the advertising industry, and it is based on a principle no more complicated than a childhood game which tries to trick you into saying “carrots”.

The experiment, say the scholars, indicates that the more distracted we become, the less able we are to experience the subtlest, most distinctively human forms of empathy, compassion, and other emotions. “For some kinds of thoughts, especially moral decision-making about other people’s social and psychological situations, we need to allow for adequate time and reflection,” cautions Mary Helen Immordino-Yang, a member of the research team. “If things are happening too fast, you may not ever fully experience emotions about other people’s psychological states.” It would be rash to jump to the conclusion that the internet is undermining our moral sense. It would not be rash to suggest that as the net reroutes our vital paths and diminishes our capacity for contemplation, it is altering the depth of our emotions as well as our thoughts.

There are those who are heartened by the ease with which our minds are adapting to the web’s intellectual ethic. “Technological progress does not reverse,” writes a Wall Street Journal columnist, “so the trend toward multitasking and consuming many different types of information will only continue.” We need not worry, though, because our “human software” will in time “catch up to the machine technology that made the information abundance possible.” We’ll “evolve” to become more agile consumers of data. The writer of a cover story in New York magazine says that as we become used to “the 21st-century task” of “flitting” among bits of online information, “the wiring of the brain will inevitably change to deal more efficiently with more information.” We may lose our capacity “to concentrate on a complex task from beginning to end,” but in recompense we’ll gain new skills, such as the ability to “conduct 34 conversations simultaneously across six different media.” A prominent economist writes, cheerily, that “the web allows us to borrow cognitive strengths from autism and to be better infovores.” An Atlantic author suggests that our “technology-induced ADD” may be “a short-term problem,” stemming from our reliance on “cognitive habits evolved and perfected in an era of limited information flow.” Developing new cognitive habits is “the only viable approach to navigating the age of constant connectivity.”

These writers are certainly correct in arguing that we’re being molded by our new information environment. Our mental adaptability, built into the deepest workings of our brains, is a keynote of intellectual history. But if there’s comfort in their reassurances, it’s of a very cold sort. Adaptation leaves us better suited to our circumstances, but qualitatively it’s a neutral process. What matters in the end is not our becoming but what we become. In the 1950s, Martin Heidegger observed that the looming “tide of technological revolution” could “so captivate, bewitch, dazzle, and beguile man that calculative thinking may someday come to be accepted and practiced as the only way of thinking.” Our ability to engage in “meditative thinking,” which he saw as the very essence of our humanity, might become a victim of headlong progress. The tumultuous advance of technology could, like the arrival of the locomotive at the Concord station, drown out the refined perceptions, thoughts, and emotions that arise only through contemplation and reflection. The “frenziedness of technology,” Heidegger wrote, threatens to “entrench itself everywhere.”

It may be that we are now entering the final stage of that entrenchment. We are welcoming the frenziedness into our souls.

There is no Sleepy Hollow on the internet, no peaceful spot where contemplativeness can work its restorative magic. There is only the endless, mesmerizing buzz of the urban street. The stimulations of the web, like those of the city, can be invigorating and inspiring. We wouldn’t want to give them up. But they are, as well, exhausting and distracting. They can easily, as Hawthorne understood, overwhelm all quieter modes of thought. One of the greatest dangers we face as we automate the work of our minds, as we cede control over the flow of our thoughts and memories to a powerful electronic system, is the one that informs the fears of both the scientist Joseph Weizenbaum and the artist Richard Foreman: a slow erosion of our humanness and our humanity.

It’s not only deep thinking that requires a calm, attentive mind. It’s also empathy and compassion. Psychologists have long studied how people experience fear and react to physical threats, but it’s only recently that they’ve begun researching the sources of our nobler instincts. What they’re finding is that, as Antonio Damasio, the director of USC’s Brain and Creativity Institute, explains, the higher emotions emerge from neural processes that “are inherently slow.” In one recent experiment, Damasio and his colleagues had subjects listen to stories describing people experiencing physical or psychological pain. The subjects were then put into a magnetic resonance imaging machine and their brains were scanned as they were asked to remember the stories. The experiment revealed that while the human brain reacts very quickly to demonstrations of physical pain – when you see someone injured, the primitive pain centers in your own brain activate almost instantaneously – the more sophisticated mental process of empathizing with psychological suffering unfolds much more slowly. It takes time, the researchers discovered, for the brain “to transcend immediate involvement of the body” and begin to understand and to feel “the psychological and moral dimensions of a situation.”

A series of psychological studies over the past 20 years has revealed that after spending time in a quiet rural setting, close to nature, people exhibit greater attentiveness, stronger memory, and generally improved cognition. Their brains become both calmer and sharper. The reason, according to attention restoration theory, or ART, is that when people aren’t being bombarded by external stimuli, their brains can, in effect, relax. They no longer have to tax their working memories by processing a stream of bottom-up distractions. The resulting state of contemplativeness strengthens their ability to control their mind.

The results of the most recent such study were published in Psychological Science at the end of 2008. A team of University of Michigan researchers, led by psychologist Marc Berman, recruited some three dozen people and subjected them to a rigorous and mentally fatiguing series of tests designed to measure the capacity of their working memory and their ability to exert top-down control over their attention. The subjects were divided into two groups. Half of them spent about an hour walking through a secluded woodland park, and the other half spent an equal amount of time walking along busy downtown streets. Both groups then took the tests a second time. Spending time in the park, the researchers found, “significantly improved” people’s performance on the cognitive tests, indicating a substantial increase in attentiveness. Walking in the city, by contrast, led to no improvement in test results.

The researchers then conducted a similar experiment with another set of people. Rather than taking walks between the rounds of testing, these subjects simply looked at photographs of either calm rural scenes or busy urban ones. The results were the same. The people who looked at pictures of nature scenes were able to exert substantially stronger control over their attention, while those who looked at city scenes showed no improvement in their attentiveness. “In sum,” concluded the researchers, “simple and brief interactions with nature can produce marked increases in cognitive control.” Spending time in the natural world seems to be of “vital importance” to “effective cognitive functioning.”

EARTHLINGS is a 2005 American documentary film about humankind's total dependence on animals for economic purposes.

Presented in five chapters (pets, food, clothing, entertainment and scientific research) the film is narrated by Joaquin Phoenix, featuring music by Moby, and was written, produced and directed by Shaun Monson.

"Companies whose global profits total $125bn (£86.7bn) cannot credibly claim that they are unable to check where key minerals in their productions come from"

In a report into cobalt mining in the Democratic Republic of the Congo, Amnesty International found children as young as seven working in dangerous conditions. Cobalt is a vital component of lithium-ion batteries.

The firms said that they had a zero tolerance policy towards child labour.

The DRC produces at least 50% of the world's cobalt. Miners working in the area face long-term health problems and the risk of fatal accidents, according to Amnesty.

It claimed that at least 80 miners had died underground in southern DRC between September 2014 and December 2015.

It also collected the testimonies of children who allegedly work in the mines.

Paul, a 14-year-old orphan, started mining when he was 12 and told researchers: "I would spend 24 hours down in the tunnels. I arrived in the morning and would leave the following morning ... I had to relieve myself down in the tunnels … My foster mother planned to send me to school, but my foster father was against it, he exploited me by making me work in the mine."

UNICEF estimates that there are approximately 40,000 children working in mines across southern DRC.

In response to the report, Apple said: "Underage labour is never tolerated in our supply chain and we are proud to have led the industry in pioneering new safeguards."

It said that it conducts rigorous audits of its supply chain, and that any supplier found hiring underage workers is forced to:

fund the worker's safe return home
finance the worker's education at a school chosen by the worker or his/her family
continue to pay the worker's wages
offer him or her a job when he or she reaches legal working age

On cobalt specifically it added: "We are currently evaluating dozens of different materials, including cobalt, in order to identify labour and environmental risks as well as opportunities for Apple to bring about effective, scalable and sustainable change."

Samsung said that it had a "zero tolerance policy" towards child labour and that it, too, conducted regular and rigorous audits of its supply chain.

"If a violation of child labour is found, contracts with suppliers who use child labour will be immediately terminated," it said.

Sony commented: "We are working with the suppliers to address issues related to human rights and labour conditions at the production sites, as well as in the procurement of minerals and other raw materials."

This mask has become a worldwide symbol of anti-capitalist protest. Its copyright is held by the Time Warner Corporation. It is made in CHINA.

This is V’s mask. V is the fictional revolutionary anarchist in V for Vendetta, a 2006 film based on comic books by British writer Alan Moore. In 2008, hackers belonging to the group Anonymous wore the mask in public to conceal their identities offline, and in 2011, it suddenly became the predominant symbol of the global Occupy protests – a unifying face for a movement without leaders or heroes. More than 100,000 of the US$10 masks were sold in 2011, and for each one sold, royalties were sent to the makers of the movie, Time Warner. This is V’s mask, but it is owned by the world’s largest media conglomerate.

The demands of the 2011 Occupy protests varied: in New York, protestors demanded that bankers take responsibility for the financial crisis, while demonstrations in Moscow focused on allegations of electoral fraud. But each confronted social and economic inequality, and the Occupy slogan resonates everywhere – “We are the 99 percent” was originally directed at the richest one-percent of US citizens, who own almost 40 percent of the country’s wealth. Globally, the wealthiest 20 percent of mankind controls 83 percent of the world’s wealth, compared to the poorest 20 percent, who share one percent of it between them.