The iPhone is one of the world’s most iconic devices and, in the grand scheme of things, it isn’t very old. But when did it begin? Where did the idea come from?

The very first iPhone was unveiled in January 2007 at the Macworld convention. Steve Jobs revealed what Apple had been developing for nearly three years and, for its time, it represented the cutting edge of technology.

The device was introduced as an iPod with a wider screen, controlled by touch instead of physical buttons. In short, it was a mobile phone and a device to communicate with the internet. At the time, Jobs told the audience that this device would “reinvent the phone”.

While revealing the design of this new device, Jobs took time out to make fun of the smartphones then on the market, which relied on physical keyboards and were unwieldy to use. He showed how simple it was to control a phone using touch gestures on a screen, and the audience was hooked.

In the beginning, there was Dear Abby – an American institution since the 1950s. Write in, and get nice, sensible advice on your dating dilemma. But the catch was that it was PG-rated, written by a woman, and aimed generally at women. Young men continued to get a lot of their dating advice from their peers in the locker room, and if they were lucky, an older male mentor who knew the ropes.

Away from the safe, clean-cut mainstream arena, there were always the racy magazines like Playboy which gave a totally different view. After finishing the articles, the attentive reader could go to the back and find books advertised with titles like “How to Pick Up Girls”. This classic by Eric Weber appeared in 1970, and included advice such as wearing bell-bottoms and marching in peace marches to pick up the hot hippies. The 1950s were gone, and men’s dating advice had moved on from how long you should wait to kiss a girl on the cheek at her doorstep. This was red-blooded and unashamed pick-up artistry! The term itself became part of the language, and the 1970s was a time when pick-up artistry flourished, albeit still underground.

Things changed as the 1980s brought in a different atmosphere. Reagan was in the White House, concerned parents were clamping down on rock and roll lyrics, and, most importantly, the specter of AIDS changed the whole dating landscape. With the free-wheeling 1970s over, dating advice in magazines and on TV understandably focused on staying safe.

Moving into the 1990s, Oprah gave voice to women’s viewpoints on relations between the sexes, and ushered in the sensitive metrosexual. It was also a time of such bestselling books as “Men Are From Mars, Women Are From Venus” and “The Rules”. While the former was written by a man, it seemed to cater more to women and required men to “get with the program”. And the latter defined the 1990s dating scene with what was effectively game for women. It was the decade of women making the rules. It was a tough and confusing time to be a man, as men no longer knew whether to be a traditional macho male or a Sensitive New Age Guy.

Towards the end of the decade, new things started to stir. AIDS was no longer front-and-center in people’s minds, Oprah’s “you go girl!” brand of feminism was mainstream, and the time was right for men to start something of their own. Most importantly, the Web was starting to provide a new platform for men to give advice on the dating arena. The internet would well and truly shake things up.

One of the first gurus of the new era was Ross Jeffries. Selling books and CDs from his website, he told the would-be Casanova that no matter how nebbish you might be, you could learn to charm any woman into bed. Jeffries was big on NLP, effectively a rebranded form of hypnosis. In many ways, he was like a holdover from the 1970s with his unrepentant focus on getting women into bed. Love him or hate him, he showed that there was a huge market on the internet for male-focused dating tips.

Next on the scene was David De Angelo. More family-friendly than Ross Jeffries, who could come across as misogynistic, De Angelo promised you could “Double Your Dating” through attending his seminars or buying his DVD sets. De Angelo offered some key concepts which at the time were breakthroughs to many young men. His audience was the “nice guys” created by the feminism of the 80s and 90s who now found that the jerks seemed to be getting all the girls. He told them to be “cocky and funny”, skating just on the socially acceptable side of jerkdom. De Angelo also introduced the concept of the “neg”, one of the most notorious (and misunderstood) concepts of new-school “Game”.

Basically, a “neg” is giving a back-handed or ambiguous compliment to a girl, such as “I love your hair – is it real?” or telling an obviously glamorous and beautiful woman “you’re cute – like my bratty little sister”. Designed to get the attention of sought-after women used to getting fawning compliments, it was easily abused by novices, who would make these kinds of remarks to less attractive and more insecure women, or turn them into outright insults. This was the kind of dating advice that had its place, but could easily go wrong, and tended to get bad press. De Angelo was everywhere on the internet in the first few years of the 2000s, as he was essentially a marketer and businessman using the dating advice for men arena as his stepping stone to bigger things. Now under his real name, he has gone on to become a hugely successful and wealthy entrepreneur.

But even more important than De Angelo and the marketers was the ragtag group of men who started to congregate on various internet message boards in the late 1990s and early 2000s. This was something new – men learning and sharing their dating experiences pseudonymously in real time. This new era of sharing allowed them to develop complex theories on everything from how to approach women, to how to succeed on first dates, to even what women want in bed.

One of the more famous of these men was a young Canadian by the name of Eric von Markovic, better known as Mystery. Striding around the streets of Toronto in platform boots and a top hat, wearing a feather boa (“peacocking”), Mystery would perform magic tricks to the delight of young women, in the most flamboyant pick-up artist tradition. While perhaps not the greatest or most original of PUAs, he now had an online audience of eager acolytes who took his approach and went out on the streets to try to replicate it.

A movement was forming, and online forum posts were studded with enough jargon and acronyms to require a glossary. The online seduction community, or simply “The Community” as it was known, grew enormously as it bubbled up from the grassroots. Various gurus espoused their own approaches, such as “direct” or “indirect”. The most notorious would simply advise men to “get into sexual state” as they talked to women, and let their raging pheromones do the seducing for them. Others, building from Mystery’s more cerebral approach, built elaborate theoretical models which could be drawn up on a whiteboard like a physics problem.

The time was right for a breakout into the mainstream, and it occurred in 2005 with the publication of “The Game: Penetrating the Secret Society of Pickup Artists” by Neil Strauss. This book provided a window on “the Community” and revealed the techniques of Mystery and his followers, as well as their real-life stories. Suddenly, a whole new generation of young men were getting advice on women which wasn’t given by women or filtered through family values.

Within a couple of years, most young men were at least passingly familiar with this kind of approach to dating and women, especially after more media exposure like the VH1 show “The Pick-Up Artist” which starred none other than Mystery. Some people laughed it off, others drew out what they saw as valuable lessons. In the wake of this mainstream exposure, a number of companies and individuals sprang up, creating a small industry of dating and pickup advice through DVDs and coaching and “boot camps” where – for a fee – you could go out on the town with a seduction guru and learn the “crimson arts”. Many major cities have become accustomed to these groups of young men going out and practicing their techniques, with lesser or greater effectiveness.

This vast array of dating products and services flooding the market started to cause concern amongst large sections of the ‘community’, due to the complete lack of regulation. Any person could walk in off the street, proclaim themselves a ‘guru’, and start selling products and services, often for fees in excess of $4,000, to anyone who would purchase them. To combat the rise of the fake guru who was just looking to make a quick dollar, PUA accountability groups started to appear. PUAWatchDog started analysing the claims of different gurus to test their validity, and dating product review sites such as Seduction Review started giving customers a voice to share their opinions and experiences with other potential customers. These accountability groups helped ensure that customers weren’t getting ripped off by unscrupulous marketers.

With the level of information on pickup techniques reaching a new saturation point, and with every possible scenario for attracting a woman having its own book, DVD, and live coaching program, the seduction community’s focus turned to the effect the inner psychology of the individual had on the outcome of flirting and seduction attempts. This was known as ‘Inner Game’. Companies such as Authentic Man Program and The Attraction Institute developed structures and tactics for eliminating one’s inner roadblocks that were getting in the way of a successful seduction.

At this point, the PUA gurus and online communities are no longer new and novel, but they haven’t become exactly mainstream either. More recently, there has been less of a focus on “Game” and specific techniques to seduce women, and more emphasis on general self-improvement – fitness, finances, etc. – in order to increase not only success with women but overall life satisfaction. This definitely seems to be a more balanced approach than going out to hit on dozens of women every night.

There are other trends for men seeking to improve their dating success, such as an increased realization that American and Western women in general are not the only women in the world, and many men are reporting having better love lives from branching out to foreign countries. Whole online communities are devoted to the life of the budding international playboy, who enjoys not only the sights and food of other countries but also reports back on their success with the local women.

Meanwhile, there are totally different online communities, such as popular dating advice subreddits, where a group of peers give advice either to the opposite sex or to their own gender. It could be considered a kind of crowd-sourced dating advice, and unlike the “PUA” communities, gurus are shunned in favor of a more democratic and gender-equal approach. Men who identify as feminist or favor a chivalrous approach to women will find a friendlier place for dating advice on mainstream sites like Reddit. One thing is for sure: there are many options and approaches to dating in 2014, and the spread of information is ensuring that men have plenty of information and advice to help them navigate this aspect of their lives.

The slogan “Daite khleb – Give us bread!” echoed throughout Petrograd as 90,000 people gathered to strike against the tsar, Nicholas Romanov (“February Revolution”). The demonstration began on March 8th, 1917, when working-class women marched through the capital’s streets, angry over food scarcity, ever-lengthening breadlines, and the seemingly indifferent tsar.

They ardently demanded change – anything to at least put more food on the table. Evolving into a large-scale revolution, the insurgency lasted less than a week, but its influence forced Nicholas to abdicate the throne.

The events leading to the February Revolution had left the nation simmering, and Petrograd was the outlet. Nicholas’s rationing of bread infuriated his subjects.

On top of food scarcity, Russia was poorly equipped to fight in the Great War. The tsar’s command over the army was less than stellar, and while he was commanding troops, he left his German-born wife Alexandra in charge of the country. In addition to these problems, Nicholas’s repeated dissolving of the Duma, a “workers government” with the final say in the tsar’s laws, fueled the Russian people’s anger (“Why”).

The populace was suffering, and his subjects were ready to revolt.

The revolution began small, but within a few days it drew underground activists as well as men and women from all around the city.

The day the February Revolution began, Nicholas was on a train to Stavka, blissfully unaware of the upheaval taking place. The next day, March 10th, the mass of people in Petrograd had grown larger, and they were yelling “Down with the war!” and “Down with the tsar!” Incensed mobs of workers destroyed police stations; however, in Stavka, Nicholas paid little attention to the frantic reports streaming in about the riots in Petrograd (Siegalbaum).

He merely observed the quality of the refreshing air, and he wrote to Alexandra saying, “My brain is resting here. No ministers, no troubling questions, no demanding thought” (Fleming 161).

Rioters in Petrograd during the February Revolution (Russian Revolution). The foremost banner says, “Long Live the Council of Workmen’s and Soldiers’ Deputies” (Fleming 245).

By then, most city workers were on strike, bringing the entire city to a halt; there was no electricity and no water. They waved banners, chanted, and threw rocks and chunks of ice at the police.

Nicholas’s desperate ministers offered their resignations if only the tsar were to return, yet Nicholas could not grasp the seriousness of the situation, refusing to return. Instead, he called armed soldiers to quell the revolt.

On March 11th, demonstrators taking to the streets early were greeted with posters declaring it was forbidden to assemble, and if they did so, the strikers would be immediately and forcibly disbanded. Nevertheless, they surged through the streets.

In reaction, the soldiers fired.

Two hundred strikers lay dead and forty were wounded. The soldiers, many of whom were country boys recently taken from their villages, were sickened at the sight and sympathized with the demonstrators. They had had enough. Many troops emptied their rifles into the air and joined the revolution (Fleming).

One furious officer, commanding a company that refused to fire, ordered them to “aim for the heart.” The soldiers shot him instead. The Duma president, Mikhail Rodzianko, pleaded with the tsar in a telegram:

“Your majesty, save Russia; she is threatened with humiliation and disgrace… Urgently summon a person in whom the whole country can have faith and entrust him with the formation of the government that all people trust… In this terrible hour… there is no other way out and to delay is impossible.” (Fleming 163)

Nicholas ignored the telegram and continued his evening playing dominos declaring, “That fat Rodzianko has written all sorts of nonsense to me, to which I shall not even reply” (Fleming 163).

Monday, March 12, the uprising was still growing in numbers and strength.

The tsar’s own army joined the revolutionaries, and the whole city was in chaos. They raided the arsenal, set prisoners free, looted shops, and burned police stations and other government buildings.

Instead of putting out the fires, firemen cheered and watched the buildings burn. With Nicholas oblivious and absent, the people needed order and leadership, so the Duma stepped up and temporarily took charge to calm the revolt. Nevertheless, the Duma did little to abate the people’s anger (Fleming).

Nicholas Romanov on the imperial train, where he wrote the Abdication Manifesto (Emperor Nicholas II).

Days later, March 15th, Nicholas’s train arrived in Pskov, delayed by revolutionaries who had seized control of the tracks. Suddenly, telegrams began to flood in from Nicholas’s most valued generals.

In order to save the war effort, the country, and his dynasty, Nicholas would have to resign his autocratic power. Hours later, after heavy chain smoking and pondering the telegrams, he wrote his Abdication Manifesto, giving up the throne in favor of his brother Grand Duke Michael (Fleming).

In Petrograd, when the mob learned of the news, they exploded with anger. They wanted a republic, not a new tsar. They inundated the streets screaming “Down with the dynasty!”, toppling all tsarist symbols. In the Winter Palace, Nicholas’s picture was slashed with bayonets.

A new tsar would only incite more violence and possibly a civil war, so after carefully listening to reports of Petrograd, Michael declined the throne and 304 years of Romanov rule came to an end (Fleming).

After Nicholas’s abdication, revolutionaries dismantled every tsarist symbol, including the bronze statue of Alexander III, Nicholas’s father (Alexander III).

The nation reveled at the demise of the Romanov dynasty. Red flags were hung from roofs and balconies. Everyone was singing, dancing and marching in parades.

Cannons went off, and passionate orators rallied the crowds. Overnight, red ribbons appeared everywhere, and Russian soldiers fighting in the trenches shouted with delight. The public rejoiced over their newfound freedom of expression, resulting in trade unions, newspapers, and political organizations (Siegalbaum).

One villager remembers, “People kissed each other from joy and said that life from now on would be good” (Fleming 177). This one moment changed the course of Russia’s entire history. Thus, the Russian Revolution had officially begun.

Men’s sports have been around since ancient times, but what about women’s sports like women’s soccer? Although there have been rumors of women playing soccer much earlier, the major rise of women’s soccer started after 1863, when the English Football Association standardized the rules of the game.

This now safer game became very popular with women all over the United Kingdom, and soon after the rule change, it was almost as popular as men’s soccer (“History of”).

In 1920, two women’s soccer teams played each other in front of a massive crowd of 53,000 people in Liverpool, England.

Although that was a major achievement for women’s soccer, it had terrible consequences for the women’s league in the United Kingdom; the English Football Association was threatened by the size of women’s soccer, so they banned women from playing soccer on the same fields as men.

Due to this, women’s soccer declined in the U.K., which caused a decline in nearby places as well. It wasn’t until 1930, when Italy and France created women’s leagues, that women’s soccer started to rise again. Then, after World War II, countries all over Europe started women’s soccer leagues (“Women in”).

Even though most countries had women’s teams, it wasn’t until 1971 that the ban was lifted in England and women could play on the same fields as the men (“History of”).

A year after the ban was lifted, women’s soccer in America became more popular due to Title IX. Title IX required that equal funding be given to men’s and women’s sports in colleges.

The new law meant that more women could go to college with a sports scholarship, and as a result, it meant that women’s soccer was becoming a more common sport at colleges all over the United States (“Women’s Soccer in”).

Surprisingly, it wasn’t until the 1996 Olympics in Atlanta that women’s soccer became an Olympic event. At those Olympic Games there were only 40 events for women, and twice as many male participants as female (“American Women”).

One massive step forward for women’s soccer was the first Women’s World Cup, which is a soccer tournament that has teams from all over the world play each other. This first tournament was held in China on November 16-30, 1991.

Dr. João Havelange, the president of the Fédération Internationale de Football Association (FIFA) at the time, was the person who initiated the first Women’s World Cup, and because of that first World Cup, the United States created a name for itself in women’s soccer.

At that tournament, the U.S. won, beating Norway 2-1 in the final (above). The U.S. later won the third Women’s World Cup in 1999, beating China in a shootout; that tournament was held in the United States. In later World Cups, the United States didn’t win, but they always placed at least second or third (“FIFA”).

As women’s soccer grew more popular, magazines and newspapers started to publish pictures of women playing soccer. One of the first articles was from 1869 (right); it shows a group of women playing ball in their dresses.

Another article from 1895 shows the North Team after they had won a game against the South Team (below on left). The article states that women are unfit to play soccer and that women’s soccer is a type of entertainment frowned upon by society (“Antique Women’s”).

Over time, the articles and publicity of women’s soccer became more positive. Along with these positive articles, there were also some players who became legends. Some of the most legendary players are Mia Hamm, Marta, and Abby Wambach.

Mia Hamm, who played for the Women’s National Team in the U.S., has been titled FIFA’s World Player of the Year twice, and she led the U.S. to victory in two World Cups and the 1996 and 2004 Olympics. Many female soccer players consider her an inspiration due to her many skills and achievements.

Marta plays for Brazil, and she has been titled FIFA’s World Player of the Year five times. Although she has never won a World Cup, she is still very popular because of her wide array of tricks and skills. Abby Wambach plays for the United States.

She has been titled the U.S. Soccer Athlete of the Year five times, and she has scored a total of 134 goals in her professional career. She has yet to win a World Cup, but the U.S. Women’s National Team is in the 2015 World Cup in Canada (“10 Greatest”). With every year, more and more girls start to play soccer, so it will not be long before there are even more female players that everybody knows about.

So how did Mardi Gras reach this iconic status? The ironic thing is that one of the key reasons for its success was the huge opposition it faced when it began. The first march took place on Saturday 24th June 1978 at 10pm, and it was met with unexpected police violence. What began as a political demonstration for gay rights, and a desire for greater visibility in the community, has become, through a huge effort from the Sydney homosexual community, a tourist festival and queer culture celebration.

The Stonewall Riots of 1969

While many people are aware of the first march on the 24th of June in 1978, few people realise that this date was the 9th anniversary of the Stonewall riots in America. The Stonewall Riot took place in New York, on the 28th June 1969 when New York City police officers launched a ‘routine’ raid on the Stonewall Inn, a Greenwich Village Gay Bar. The bar’s customers took a stand and fought back, clashing with police and causing a riot. The Stonewall riot came to represent a new movement of open defiance of the heteronormative society.

At this time, in Sydney, homosexuals experienced cultural invisibility and legal discrimination. It is hard to believe that, at the time in Australia, consenting sexual relations were criminalised and policed. This first march in Sydney, 9 years later, was not only a means for the homosexual community to remember and commemorate the Stonewall riots; it was a time to stand up against harassment in Sydney and Australia.

Asylums. Electro-Shock Therapy. Skull Drills. Pills. Exorcisms. Isolation. Lobotomies. Many of the drastic procedures that have been put in place to relieve a person of mental illness are only successful in creating ‘vegetables’ out of patients, not curing their illness but making them ghosts of their previous selves.

Throughout history, there have been radical changes in how the mentally ill are treated and cared for; most of these occurred because of changing societal views and knowledge of mental illness. These changes have brought psychiatry out of a negative light and have given psychiatric studies a brighter, more positive outlook.

The history of treating mental illnesses dates as far back as 5000 B.C.E. with the evidence of “trephined skulls.”

In ancient cultures, a widespread belief was that mental illness was “the result of supernatural phenomena”; this included phenomena from “demonic possession” to “sorcery” and “the evil eye”. The most commonly believed cause, demonic possession, was treated by chipping a hole, or “trephine”, into the skull of the patient, by which “the evil spirits would be released,” thereby healing the patient.

Although ancient Persians also believed that the illnesses were caused by demons, they practiced precautionary measures such as personal hygiene and “purity of the mind and body” in order to “prevent and protect one from diseases”.

Similarly, the Egyptians recommended that those stricken with mental illness should participate in “recreational activities” in order to relieve symptoms which displayed that, as a civilization, the Egyptians were very advanced in their treatment of mental handicaps. (Foerschner)

During the 5th and 3rd centuries B.C.E., the Greeks changed the way that psychological disorders were viewed. The philosopher and physician, Hippocrates, discovered that illnesses come from “natural occurrences in the body” (Foerschner).

As Hippocrates was studying mental illness, he stepped away from the superstitious beliefs and towards the medical aspect of it. He studied the pathology of the brain and suggested that mental illness stemmed from imbalances in the body.

These imbalances were in the “four essential fluids”: blood, phlegm, bile, and black bile, which produce “unique personalities of individuals.” In order to restore the body’s balance, the Greeks used techniques such as phlebotomies, bloodletting, purging, and imposing diets on the afflicted (Foerschner). One treatment that Hippocrates advocated was changing the occupation and/or environment of the patient.

Although these treatments had gained popularity amongst most cultures, there were still large numbers of people who believed in the supernatural causes of mental illness and used treatments such as amulets, talismans, and sedatives to “ease the torment” of the afflicted (Foerschner).

Historically, those with mental illnesses had a “social stigma” attached to them. It was believed that “a mentally ill member implies a hereditary, disabling condition in the bloodline” threatening the family’s “identity as an honorable unit”.

In countries, or cultures, that had strong ties to family honor, such as China, the ill were hidden by their families so that the community or society that they were a part of wouldn’t believe the illness was “a result of immoral behavior by the individual and/or their relatives”.

As a result of this social stigma, many of the mentally ill were forced to either “live a life of confinement” or were abandoned and forced to live on the streets. Those abandoned to the streets who were rumored to be “dangerous and unmanageable” were put in jails or dungeons, out of the public eye (Foerschner, 1).

According to Dr. Eve Leeman of the New York- Presbyterian Hospital, the social views on the sexes also affected the treatment of patients, particularly women. In the early 20th century, women were “preferentially sterilized and lobotomized” and were sometimes even subjected to unnecessary procedures such as the five women in the Stockton State Hospital who were given a clitoridectomy. The justification for these procedures was that having a mental illness was “unladylike” and required “surgical intervention” (Leeman).

These negative perspectives of the mentally ill were maintained throughout history and into modern societies, as shown by Nurse Ratched’s treatment of the patients in One Flew Over the Cuckoo’s Nest (Kesey). Throughout the novel, Nurse Ratched abuses her position and uses her power to subject her patients to cruel treatment as punishment for misbehavior.

This is due to the fact that she doesn’t see her patients as human beings but as animals who need to be trained.

In the early 15th century, many of those afflicted with psychological disorders were placed in workhouses, madhouses, or asylums because it was too burdensome for their families to care for them. The state of these institutions was abhorrent.

Those admitted to madhouses were abused and often abandoned by their caregivers, who were not trained in the treatment of mental disorders. Private madhouses, however, were often run by clergymen and were significantly more humane.

The treatments instituted by the clergymen included regular church attendance and pilgrimages, as well as priests urging individuals to confess their sins and repent. Asylums, on the other hand, were incredibly inhumane in the treatment of their patients.

Many of those admitted were abused, abandoned, treated like animals, restrained with shackles and iron collars, cared for by untrained staff, and even put on display. An infamous example of the horrors of early asylums would be La Bicetre.

In this French asylum, patients were shackled to walls with very little room to move, were not adequately fed, and were visited only when brought food; their rooms were not cleaned, and they were therefore forced to sit in their own waste. Another example would be Saint Mary of Bethlehem, an asylum nicknamed “Bedlam” due to its horrific treatment of the mentally ill.

Their “violent” patients were put on display like “sideshow freaks” and their “gentler” patients were forced to beg on the streets. Patients who were allowed visits from family often begged to be released; however, since the stigma of mental handicaps was so negative at the time, their pleas were ignored.

Treatments in these asylums, as well as others, included purging, bloodletting, blistering, dousing patients in either boiling or ice-cold water to “shock” them, sedatives, and using physical restraints such as straitjackets (Foerschner).

Due to the obviously horrific treatment of patients in asylums, many reforms began to take place starting in the mid-to-late 1800s.

Two reformists greatly influenced the spread of what is known as the “Humanitarian Movement,” the first being Philippe Pinel, in Paris. Pinel believed that “mentally ill patients would improve if they were treated with kindness and consideration” instead of being kept in filthy, noisy, and abusive environments; he implemented his hypothesis when he took over La Bicetre.

Another major reformist, William Tuke, founded the York Retreat, where patients were treated with “respect and compassion” (Foerschner). After Tuke and Pinel came Dorothea Dix, who advocated the hospital movement and, over 40 years, got the U.S. government to fund the building of 32 state psychiatric hospitals, as well as organizing reforms in asylums across the world (Module 2).

The Hospital movement started in the 18th century and was justified by reasons such as: “to protect society and the insane from harm, to cure those amenable to treatment, to improve the lives of the incurable, and to fulfill the humanitarian duty of caring for the insane” (Dain).

Along with the creation of state psychiatric hospitals, various organizations and acts, such as Mental Health America (MHA) and the U.S. Community Mental Health Centers Act of 1963, were created to “improve the lives of the mentally ill in the United States” (Module 2). With the reforms came the increase in psychoanalysis.

Sigmund Freud, often referred to as the father of psychoanalysis, was essentially its creator. Freud wrote the Psychoanalytic Theory, in which he explains “the id, the ego, and the superego” as well as therapeutic techniques such as hypnosis, “free thinking”, and dream analysis (Foerschner). Freud believed that by allowing a patient to focus on repressed thoughts and feelings, he could cure the patient of his or her disorder.

One form of psychoanalysis aimed to help an individual “identify and achieve their own goals” and would keep patients occupied and “thus cure them from delusions and irrationalities” (Dain). Lastly, somatic treatments were introduced in asylums, including psychopharmacology, psychosurgery, electroconvulsive therapy, and electric shock therapy, among others.

The first non-sedative drug used in the treatment of patients was chlorpromazine, which “cured” many mental ailments; patients “became free of symptoms entirely and returned to functional lives” (Drake).

The introduction of pharmacology led to the deinstitutionalization reform, which shifted the model from institutionalized care to “community-oriented care” intended to improve patients’ “quality of life” (Module 2). According to Foerschner, this backfired, leaving roughly one-third of the homeless population mentally ill.

Many of the treatments enacted on mentally ill patients throughout history have been “pathological sciences” or “sensational scientific discoveries that later turned out to be nothing more than wishful thinking or subjective effects” and haven’t actually benefited those being treated.

As the social perspectives and knowledge have changed, so has the treatment of those afflicted with mental pathologies. These treatments will continue to change as the world expands on its knowledge of brain pathology.

As Leeman says, “mental illness is not accurately described as a disease of the mind or brain and… treatment must attend to the whole patient,” so as we continue forward in our knowledge of psychology, we must learn from “the foibles of earlier generations” (Leeman).

Module 2. “Module 2: A Brief History of Mental Illness and the U.S. Mental Health Care System.” Unite For Sight. Web. 15 Oct. 2014.

Across its 19 games, the Guitar Hero franchise was very successful, even though it lasted only six years. Guitar Hero is a video game in which one plays an instrument-shaped controller along to pre-made track lists as if part of a rock band. From its start in the United States in 2005, it was widely loved.

The major reason Guitar Hero was unable to continue was that the franchise had trouble keeping developers, getting a new one for almost every game. After Harmonix, its first developer, was bought by MTV to help make the Rock Band series, it was difficult to keep the same developers (“The History”).

Before the start of the Guitar Hero franchise, there was a video game called Guitar Freaks, a Japanese arcade game released in 1998. One plays by strumming a guitar-shaped controller and pushing the colored buttons on the controller’s fret that correspond to notes on the screen. This inspired the development of Guitar Hero, for many wanted to play such a game on a home console (“Guitar Freaks”).

Guitar Hero was born in 2005 with the release of the first game, simply called Guitar Hero. It became an instant hit; in fact, it made one billion dollars within a week of its premiere. The game was only available on PlayStation 2. It was developed by Harmonix, which is known for games such as Amplitude and Frequency, and published by RedOctane (Gies).

The next year they released Guitar Hero II, which became even more successful, reaching fifth among the best-selling games of 2006 (“The History”). This game featured better graphics than its predecessor and a different track list. It was also co-published by RedOctane and Activision, who improved the controller and made the game available on Xbox 360 (Gies).

In 2007, they released Guitar Hero Encore: Rocks the 80s. This game differed from the previous ones in that its track list consisted only of top rock songs from the 1980s.

The next game, Guitar Hero III: Legends of Rock, was released in 2007. Unlike previous games, it was developed by Neversoft, the company known for the Tony Hawk game series (“Guitar Hero”). This game also expanded availability: it was released not only on PlayStation 2 but also on PlayStation 3, Xbox 360, Wii, and PC.

In 2008, the next game, Guitar Hero: Aerosmith, was released. With a track list of only Aerosmith’s music, this game allows one to play as though a member of Aerosmith.

Also released in 2008, Guitar Hero: On Tour was the franchise’s first portable game, available only on the Nintendo DS. It has the same concept as the other games, but without the guitar-shaped controller.

The next game, Guitar Hero: World Tour, released in 2008, involved many more changes in gameplay than the previous ones. It introduced a drum-set controller and a microphone, allowing players to perform as an entire band. This was the company’s response to Rock Band, which was created by its ex-developer, Harmonix (“The History”). The pre-existing guitar controllers were also improved with “neck sliders,” touch-sensitive panels on the neck of the guitar that let players change the pitch of sustained notes.

In 2009, they released the sequel to their portable game, Guitar Hero: On Tour: Decades. Also that year they released Guitar Hero: Metallica, which had the same idea as Guitar Hero: Aerosmith: one plays as if a member of the rock band Metallica (Gies).

Their next game, Guitar Hero: On Tour: Modern Hits, was made by yet another new developer, Vicarious Visions. This portable title for the Nintendo DS was also released in 2009.

Also in 2009, they released Guitar Hero: Smash Hits, whose track list consists of the top Guitar Hero songs from all the previous games. It was available on PlayStation 2, PlayStation 3, Xbox 360, and Wii, and was made by another new developer, Beenox. That same year, Guitar Hero 5 was released, developed by Neversoft.

The next game was called Band Hero. Neversoft tried a new idea with this game: making it appeal to all audiences instead of just rockers (Gies). The track list therefore consisted mainly of Top 40 songs that could be played on guitar, bass, or drums, or sung into a microphone, rather than songs chosen for how well they played on guitar. This game was also released in 2009.

Another new idea came to Guitar Hero in 2009 with a game called DJ Hero. Its controller was an electronic turntable, which allowed one to mash up two songs and remix them.

In late 2009, prior to the release of Guitar Hero: Van Halen, Guitar Hero’s co-publisher, RedOctane, shut down (Gies). Guitar Hero: Van Halen was developed by Underground Development and published by Activision alone.

In 2010, Guitar Hero released a game for the iPhone. That year also saw the premiere of Guitar Hero: Warriors of Rock, developed by Neversoft, and DJ Hero 2, developed by Freestyle Games (Gies).

With its lack of stable developers and publishers, the Guitar Hero franchise shut down in 2011. There was no official announcement; the series simply ceased. “Rock Band is rumored to be making a comeback, and if it does, Guitar Hero might not be far behind” (Vincent).

Chocolate comes from the fruit of the cacao (also called cocoa) tree, known as Theobroma cacao. The tree grows on small farms in West Africa, Southeast Asia, and North and Central America. Credit for discovering the cacao bean goes to the Olmec, Maya, and Aztec civilizations.

Cocoa is said to have originated in the Amazon around 2000 BC. While different groups were originally trying to make beer, chocolate was the result.

Cacao seeds were used to make ceremonial beverages consumed by elites of the Aztecs and other civilizations.

Native to the tropical regions of Mesoamerica, the evergreen cacao tree grows small, white flowers throughout the year, which are pollinated by tiny flies.

The fruit, called a cacao pod, then grows. It is oval, weighs about a pound, and changes from a yellowish color to more of an orange when ripe. Each pod contains a white flesh with around 40-50 seeds, usually called beans. In some countries the white flesh is eaten or made into juice; the beans are the main ingredient of chocolate and cocoa powder.

The Maya, in the 6th century, started to use the cocoa bean to make chocolate. The word “chocolate” comes from the word xocoatl, which means “bitter water.” To the Maya, cocoa pods symbolized life and fertility, and stones from their palaces and temples reveal many carved pictures of cocoa pods.

The Maya also used chocolate in religious rituals; it sometimes took the place of blood. Chocolate was used in marriage ceremonies, where it was exchanged by the bride and groom, and in baptisms. They even had a cacao god.

The Maya prepared chocolate strictly for drinking; solid chocolate would not appear in chocolate history until the mid-1800s. Otherwise, the way the Maya prepared chocolate wasn’t too different from the way it’s prepared today.

First, the beans were harvested, fermented, and dried. The beans were then roasted and the shells removed, and the rest was ground into a paste.

The paste was mixed with hot water and spices, such as chili, vanilla, annatto, allspice, honey, and flowers. Then the mixture was frothed by pouring it back and forth between two containers. They thought the froth was one of the best parts. Chocolate was also mixed with corn and water to make a sort of gruel. It was similar to the chocolate and corn drink Pinole, still enjoyed in Latin America today.

In 1657, a Frenchman opened the first chocolate house in London, called The Coffee Mill and The Tobacco Roll, and because of the cost of the drink, only the upper class could afford it. Thanks to chocolate’s popularity, by 1674 it had become an ingredient in cakes and rolls.

In 1847, Joseph Fry of Bristol, England, made the next major leap, with the invention of a steam engine for grinding the beans. This allowed chocolate to be manufactured on a larger scale.

The steam engine was crucial in mechanizing the process of grinding cacao seeds to produce chocolate. Before the steam engine, cacao seeds were ground in mills driven by animal, wind, or water power, and before that they were ground by hand with stones. The power supplied by the steam engine enabled chocolate makers to streamline chocolate production in larger quantities.

Before Fry & Sons could create the chocolate bar, however, Dutchman Coenraad J. van Houten invented a hydraulic press in 1828 that was used to create cocoa powder. Today this process is known as “Dutching.” From there, chocolate took off. Richard Cadbury is said to have created the first known heart-shaped box for Valentine’s Day in 1861, and Daniel Peter of Switzerland produced the first milk chocolate bar in 1875, using the powdered milk that Henri Nestlé had invented a few years earlier.

Like chocolate itself, the chocolate chip cookie was also an accident. Ruth Wakefield, a dietitian-turned-innkeeper, was baking cookies for guests at her Toll House lodge in Massachusetts when she discovered she didn’t have baker’s chocolate. So she substituted a semi-sweet chocolate candy bar cut into little pieces. But, unlike baker’s chocolate, the candy bar didn’t melt completely. She had inadvertently created the world’s first chocolate chip cookie.

The resulting creation became popular at the inn, and soon Ruth’s recipe was printed in several New England newspapers. The cookie was a hit.

As the new chocolate chip cookie’s popularity soared, so did sales of the Nestle semi-sweet chocolate bar used in the cookies, and eventually Ruth Wakefield and Nestle reached an agreement that allowed the company to print the Toll House cookie recipe on the label of Nestle’s semi-sweet chocolate bar.

As part of the agreement, Ruth received a lifetime supply of chocolate for baking her famous cookies.

The raw beans of cacao are very good for you, full of vitamin C and magnesium, but they are bitter. The beans also contain a fair amount of caffeine, like coffee or tea, which acts as a mild stimulant.

Although chocolate came about by accident, it is now one of the most popular treats to enjoy at any time.

An electronic book, or e-book as they are universally known, is a text-based publication in digital form. While e-books may contain images and graphs, their formats make them primarily text-based.

E-books are designed to be read on an electronically compatible device: an iReader, a Kindle e-reader, a tablet, or a personal computer. While the e-book is the actual text and document being read, an e-reader is the device that makes this possible. E-books are stored as electronic files; they are small and easy to share and purchase.

They are convenient and light, and their huge storage capacity allows for easy travel reading, electronic notes, and character summaries. However, they were not always like this.

The First Automated Reader Is Invented

The world’s first automated reader, the precursor to today’s e-readers, was invented by Angela Ruiz Robles, a school teacher in Spain who had her innovative idea in 1949 after watching her students lug textbooks back and forth from school every day. The idea was that her reader would be far easier for school children to carry than a number of different textbooks.

In Angela’s first design, small amounts of text were printed onto spools operated by compressed air. She made her first prototype in 1949. While this reader was not electronic, it is still hailed as the first automated reader. Her project was never picked up for mass production and she was never able to get a viable patent on the design, but a 1949 photograph of her holding the prototype survives, so her claim stands.

The Internet and the First E-book Is Downloaded

The invention of the internet was the next huge step forward for e-books. Information sharing and file sharing were the birthplace of the electronic book.

In 1971, Michael Hart, a student at the University of Illinois, was given unlimited computer time on a huge Xerox mainframe computer in the Materials Research lab (probably because his brother’s best friend was one of its operators). What might seem like an unremarkable moment in Internet history, since there were not many people on the internet in 1971, Michael Hart turned into an incredible opportunity.

The machine was used primarily for data processing, but it was also connected to ARPAnet, part of what would later become the internet. Given the huge expense of buying and running such machines, Hart later calculated the value of this gift to be around $100,000,000.

When Hart was given a copy of the Declaration of Independence at a grocery store in the lead-up to the local July 4th fireworks, he found his inspiration and a good use for the computer time he had been given. He typed the text into the computer, all in capitals since there was no lower-case option at the time, and sent out a message on ARPAnet saying it was available to download. Six people took him up on the offer and downloaded the text. The world’s first e-book was born.

Hart then set about typing up more texts to make them electronically available, including the Bill of Rights, the American Constitution, and the Christian Bible. What he created was far more than a set of electronic text documents; it was an idea: using computers not just to crunch numbers and process data, but to share text and literature.

It was a long time before the next development came along, in 1987, from Eastgate Systems, which published the first hypertext fiction work: Afternoon, by Michael Joyce, available for purchase on floppy disk. The book was created as the first demonstration of a new program called Storyspace, software for personal computers for creating, editing, and reading hypertext fiction.

1993

BiblioBytes launched a website to sell e-books over the internet, becoming the first company to create a financial exchange system for the net.

1999

American publisher Simon & Schuster created a new imprint, ibooks, and became the first trade publisher to publish titles simultaneously in e-book and print format. Featured authors included Arthur C. Clarke, Irving Wallace, and Raymond Chandler. Oxford University Press offered a selection of its books over the internet through netLibrary.

The National Institute of Standards and Technology in America held its first e-book conference. Dick Brass of Microsoft declared that e-books were the future of reading: “We are embarking on a revolution that will change the world at least as much as Gutenberg did,” he said, predicting that by 2018, 90% of all books sold would be e-books.

This 90% figure fails to take into account the very stable and profitable gift-book market. About 40% of the paper book market consists of what are called “gift purchases”: people buy each other books, and they don’t buy each other e-books. Christmas is still a huge time for selling books: recipe books, design books, coffee-table books, and picture books for newborn babies. This part of the book market has not been affected by e-books, which still have not tapped into it.

E-books and how they change the way we talk about reading

Pages do not exist in e-books, and the reader’s orientation within the text can change with adjustments to font size and layout. Therefore, the reader’s location in the text is displayed as a percentage of the whole.

The rise of e-readers has prompted speculation about how the mind processes words on a screen compared to words in paper books, and concern that holding a physical book promotes understanding in a way that staring at a screen does not. The physicality of the book prompts the reader to see the text not only for its content but as an object as well.

A recent study by Sara Margolin suggests that e-readers do not hinder reading comprehension, at least in short passages of text. As research like this gains ground, the use of e-readers will only increase, and with it, new ways of conceiving of and talking about reading will surface in the language, and in turn, enter dictionaries.

Yet we still use the term “bookmark” for holding our place in the text.

A doll with a skinny frame, long blonde hair, blue eyes, perfectly done up makeup, swathes of clothing, and a companion named Ken. Sound familiar?

Most Americans will recognize this as the familiar Barbie doll, invented by Ruth Handler and marketed by the well-known toy company Mattel. Yet the description we so often associate with Barbie today was a bit different 50 years ago.

When Barbie was first created, she had a much different appearance, and since then, Barbie has evolved in her looks with the changing times.

There was a crucial set of events in Ruth Handler’s life prior to the creation of the doll that allowed this doll to become as popular as it is today.

It was in 1945 that Ruth Handler, her husband Elliot Handler, and their friend Harold Matson started a small workshop in their garage. This business, which originally made and sold picture frames, was called Mattel (a combination of Matson’s and Elliot’s names).

At first their business was small, but after a little while Elliot began making dollhouse furniture out of frame scraps and selling it alongside the frames. Surprisingly, the dollhouse furniture turned out to be a success, so the three decided to shift their business’s focus to toys. Out of this came their first product, the Uke-A-Doodle, which created a solid foundation for the business. Through the early and mid-50s, they created several toys like this that generated a fair amount of company revenue.

Using this revenue, the Handlers bought 52 weeks’ worth of advertising on the new show The Mickey Mouse Club. It was this move that paved the way for Mattel to rise through the ranks as a big-brand toy company and create the doll that would thrive in the toy industry (“Inventor”).

It was in 1959 that Ruth Handler came up with the idea for the Barbie doll. For some time, she had observed that her daughter Barbara loved to play with paper cut-out dolls.

She found that her daughter most commonly acted out scenarios where the doll was a teenager or a working woman, and came to the conclusion that girls most often used dolls to act out future, rather than current, roles. “When I conceived Barbie, I believed it was important for little girls’ self-esteem to play with a doll that has breasts” Ruth said (“Ruth”).

She also concluded that this acting out of future scenarios helped a girl’s self-confidence.

This gave her the idea to make a three-dimensional doll with which girls could act out their dreams. Hence the Barbie doll, named after Ruth’s own daughter, was born. Loosely based on the racy German comic character Lilli, Barbie was officially released and showcased at the 1959 toy fair in New York City (“Inventor”; Bellis).

In the period of only a few years, Barbie became a hit with the young girls of America.

By 1965 the Mattel Company was already making over $100 million in sales across its various toys. At that point, Barbie came in three original varieties: blonde, brunette, or red hair. Also during that time, in 1961, Barbie’s famous companion Ken was released and showcased at the American International Toy Fair.

Ruth Handler had named this doll after her son Ken, and it was a great success. Ken was marketed as Barbie’s boyfriend and came to have nine different hair-color choices. Just like Barbie, Ken evolved slightly in his looks to meet the changing styles and trends of the times (Bellis).

The Barbie toy line didn’t stop with the Barbie and Ken duo. In 1968, Mattel created a new doll, Christie. One of the first dolls added to the ethnically diverse Barbie family, Christie served as an African American friend of Barbie.

In 1980, Mattel released another African American Barbie, Francie. Some people consider Francie to be the first true black Barbie, arguing that Christie was just a “white doll, painted brown” (“African-American”).

In 1980, Mattel also released a Hispanic Barbie by the name of Teresa who was another ethnically diverse friend of Barbie (Bellis).

Many variants of the original blonde Barbie came out between the releases of these different ethnic Barbie dolls. Most of the new releases involved only slight changes in facial structure or body length, such as eyebrow or lip shape. But it was these small differences that eventually led Barbie to look as she does today.

Along the path of these small changes, one of the most popular Barbie editions was Totally Hair Barbie and Ken, released in the 1990s. In this version, Barbie’s hair extended all the way down to her feet, and Ken had actual hair that came with a gel for styling (Bellis).

The popularity of the Barbie Doll inspired hundreds of other similar creations. Mattel created variants of almost every one of their dolls, as well as creating a whole Barbie lifestyle complete with a Barbie car and Barbie Dream house.

Outside of Mattel, other companies tried to get their share of the fame by creating dolls similar to Barbie, such as the UK’s Sindy doll or the Ideal Toy Company’s Crissy and Velvet dolls. Barbie’s influence even extended beyond the toy industry, inspiring movies and TV shows about the doll, as well as the song “Barbie Girl,” released by the group Aqua in 1997 (“10”).

Because of all this culture centered on the Barbie doll, one could say that Barbie is an iconic American toy. Barbie thrives in the minds of the young, kindling the spark of imagination and creativity in the girls of America (Bellis).

With 1.2 million printings for the final book, a spot on the New York Times Bestsellers list, and a domestic gross of more than $1,156,413,241 at the box office over the course of three films, Suzanne Collins’ Hunger Games trilogy has shown itself to be a force to be reckoned with. Young adult literature like Collins’ work, along with that of J.K. Rowling, Cassandra Clare, and John Green, has flooded the market, but it’s not just teenagers who absorb the words; adults constitute a percentage of the consumers as well. Even with all of the current focus and hype, the exact date and place where this category came to fruition is unclear.

A comeback kid, young adult (YA) novels have stolen the limelight from other categories as some of the most popular books today. While there is some dissent on exactly how big the YA market has become, publishers made at least $1.3 billion from sales in 2009 and are adding new authors continually. But what does the niche include, how did it begin, and why is it so popular now?

Between Middle School and Adulthood: What is Young Adult Literature?

Recognized for its awards in the YA world, the Young Adult Library Services Association (YALSA), part of the American Library Association (ALA), categorizes a young adult as someone between the ages of twelve and eighteen; however, YA fans and authors generally consider YA literature to be aimed at readers aged sixteen to twenty-five.

Other similar categories include teen fiction, aimed at ages ten to fifteen. In general, though, the terms young-adult novel, juvenile novel, and young-adult book all refer to texts in the YA literature category.

In general, these fictional novels share one thing in common: a young protagonist who loses his or her innocence. Most, if not all, YA novels trace a character’s growing understanding of the world and first experiences of some of life’s emotional upheavals. As David Brown points out in “How Young Adult Fiction Came of Age,” many of the struggles the characters go through are poignant because it is the first time they face them.

The Beginning of Young Adult Literature

In 2004, Reading at Risk: A Survey of Literary Reading in America showed that young adult readership was dropping at an alarming rate, declining by as much as 20% between 1982 and 2002. The decline represented over 20 million potential readers, making it a “national crisis” according to Dana Gioia, chairman of the National Endowment for the Arts (NEA), the group that published the study.

Reading programs for young adults sprouted up all over the nation like flowers in spring, showering their incentives on all who would accept them. Coincidentally, as Hannah Wither and Lauren Ross point out in “Young People Are Reading More Than You,” J.K. Rowling’s hit Harry Potter and the Sorcerer’s Stone was released in the U.S. in 2000, possibly correlating with a 21% increase in young readership in 2002, one that continued through 2008. While Wither and Ross cite Rowling’s work as a boon to young adult literature’s recognition in the wider community, Rowling was not the first to publish books geared towards young adults.

Some argue that young adult literature as a category began long ago with what are now regarded as the classics, books written for the common young adult, ranging from Chaucer’s Canterbury Tales to The Fables of Aesop and Le Morte d’Arthur. Many of these stories were created with young adults in mind, but they had a very different style and purpose than young adult novels have today.

One year that stands out is 1967, the year that some say YA Literature unofficially began with the publication of Books and the Teenage Reader: A Guide for Teachers, Librarians, and Parents by G. Robert Carlsen. However, Michael Cart, author and former editor for the Young Adult Library Services Association, has stated that Maureen Daly’s Seventeenth Summer, printed first in 1942, was the first novel written and published specifically for teenagers.

Later, in the 1970s, America had its first “Golden Age” of YA novels, starting with Go Ask Alice by Anonymous, a short novel supposedly drawn from the journal of a teen struggling with drugs and with finding herself, and continuing with Robert Cormier’s The Chocolate War, an iconic text about high school and peer pressure that sparked some very real debates about YA literature in general.

The 1980s signaled a return to romances and series books, a tradition The Hardy Boys and Nancy Drew had begun in the early 1900s. From the 1990s to today, many YA books focus on realism, but recently fantasy and dystopian series like The Maze Runner, Twilight, and The Hunger Games have made a return.

What is YA Literature?

What is more, from literary critics to the common reader, almost everyone has a different definition of what exactly makes a book “young adult.” Some argue that every young adult text is a coming-of-age story, a bildungsroman focused on a loss of innocence, while others posit that “young adult” simply means that the emotions, ideas, and themes presented are geared towards the 14-to-early-20s crowd and given in a more simplistic manner.

Whatever the case, many would agree that this is a category, not a definitive literary genre, and as such, many books fit within it. Some, like Salinger’s famous Catcher in the Rye, did not ask to be placed in this category but were adopted into it because the audience demanded it, with a majority of teen readers connecting with the dilemmas and conflicts of its protagonist, Holden Caulfield.

Brewing Controversy Surrounding YA Lit.

The audience for YA literature is steadily increasing, with more and more options on bookshelves everywhere, but some literary critics are voicing an outcry that part of the audience for these books is adults in their 30s and 40s who have followed the hype from their children and begun reading these seemingly adolescent texts. With more and more YA books being published, some believe that quality adult novels get pushed aside in favor of their younger, more naive counterparts.

However, people like Tracy van Straaten, vice president at Scholastic, have a very different take on the situation: “The ‘cross-under’ trend is great. Some of the best books being published are in the Y.A. section, but many people didn’t know about them. The exciting thing with the cross-over phenomenon is that it’s not so much about where in the bookstore the books get shelved. It used to frustrate me that my friends would watch a teen movie, but a Y.A. book seemed remedial to them. Happily, that’s not the case now.”

Asbestos is a highly versatile, strong, cheap, non-flammable, malleable substance that has been used in building, textiles, and construction for the last 2,000 years. It is also a highly toxic airborne fibrous substance that causes a number of incurable cancers in the humans exposed to it. Asbestos is present in many homes around the world and is still being used.

Asbestos became popular in the building industry for its affordability and desirable physical properties: sound absorption, average tensile strength, and resistance to fire, heat, and electrical and chemical damage. When asbestos is used for its resistance to fire or heat, the fibers are often mixed with cement or woven into fabric or mats. These desirable properties made asbestos a very widely used material, and its use continued to grow throughout most of the 20th century, until the carcinogenic (cancer-causing) effects of asbestos dust caused its effective demise as a mainstream construction and fireproofing material in most countries.

So how did asbestos become so widespread? Where did it come from, and how do we rid ourselves of the asbestos that is estimated to be in more than a third of homes around the world?

Asbestos is a Naturally Occurring Mineral

Asbestos is mined straight from the ground. It is a naturally occurring mineral that can be dug out of the earth’s surface, with Russia as the greatest supplier of asbestos. There are six different types of asbestos, defined mostly by their colour.

Asbestos is mined from an open pit and looks a lot like wood in its raw form. After it is separated from the earth and other matter, the asbestos is processed and refined into fluffy fibres. These fibres are then mixed with a binding agent, such as cement. Sheets and pipes made from asbestos are not 100 percent asbestos but products that contain it.

Asbestos in Ancient Times

Asbestos has been mined and used for over 4,000 years; however, it was not mined on a large scale until the 19th century, when it started to be used in housing. Health issues related to asbestos exposure can be found in records dating back to Roman times.

The word asbestos comes from the ancient Greek, meaning “unquenchable” or “inextinguishable”. Pliny the Younger makes reference to clothes made of asbestinon in his earliest writings, stating that it is rare and impressive and sold for the same price as the finest pearls. He notes that people cleaned such napkins by setting them on fire, and he also makes note of a sickness among the asbestos miners, though he gives few details.

Toxicity

Pliny the Younger (AD 61–114) is said to have written that slaves who worked with the mineral asbestos became ill, but no exact reference can be found; the account survives only by word of mouth.

The damaging effects of asbestos fibres on people went unrecognised for a long time; it was not until 1924 that the very first case of asbestosis was diagnosed. Asbestos exposure was later also linked to mesothelioma, the cancer asbestos causes in the mesothelial cells.

Asbestos and the Industrial Revolution

Asbestos regained significant popularity as the world, specifically Great Britain, entered the Industrial Revolution. As powered machinery and steam power became more and more prevalent, so did the need for an efficient and effective way to control the heat needed to create and power the machines at the centre of this paradigm shift. Asbestos served as a perfect insulator for high-temperature products like steam pipes, turbines, ovens, and kilns; all things that helped facilitate the Industrial Revolution.

The increase in demand for asbestos sparked the first commercial asbestos mines, which opened in 1879 in the Quebec province of Canada. Mines opened shortly thereafter in Russia, Australia, and South Africa. By 1900, doctors started reporting lung sickness and pulmonary fibrosis in patients who had worked in asbestos textile factories and asbestos mines.

Despite the resurgence of health concerns, asbestos became very important in the United States as the railroad infrastructure was put into place. Whether the toxic risk of asbestos was underestimated, ignored or hidden, it became an important solution for preventing heat build-up and temperature fluctuation in steam-powered trains, and again when steam-powered trains shifted to diesel power. By WWII, asbestos was being used in the shipping industry (as insulation for components subjected to high heat), the automobile industry (as brake and clutch lining), and in the construction industry (in a wide variety of products including insulation, siding, and cement).

Mesothelioma and Asbestos

Mesothelioma is the cancer that affects the mesothelial cells. These cells cover almost every organ inside the body, forming a lubricating and protective coating over the organs called the mesothelium.

Almost everyone who is diagnosed with mesothelioma was exposed to asbestos, whether in the workplace, at home, or through airborne fibres in the environment.

James Hardie and Asbestos

James Hardie was one of the largest manufacturers and distributors of asbestos products in Australia, and like many companies over the last 50 years it has paid compensation to employees who fell victim to asbestos-related diseases and cancers. The history of asbestos is closely linked to its victims, but that history is too enormous to cover in this article.

Asbestos Removal in Homes

The removal of asbestos from buildings and homes will be a long and expensive process. Asbestos can only be disposed of at a registered disposal facility. These sites are registered with the Australian government and are the only places permitted to handle the disposal; it is illegal to leave asbestos anywhere else in Australia.

While it is legal to remove asbestos from your home yourself, it is advised that you do not undertake this process alone. Safety equipment, breathing apparatus and the proper means of cleaning up afterwards should all be factored into your asbestos removal.

While the toxic and carcinogenic qualities of asbestos are widely known, a number of countries are still mining huge amounts of asbestos for commercial use. Its use is now banned in Australia, but there is no such uniform ban on the substance throughout the world.

Hollywood: Perhaps no other place on earth evokes the same air of show-business magic and glamour. The legend of Hollywood began in the early 20th century and is an earmark of modern American society rich in history and innovation.

The origin of movies and motion pictures began in the late 1800’s, with the invention of “motion toys” designed to trick the eye into seeing an illusion of motion from a display of still frames in quick succession, such as the thaumatrope and the zoetrope. In 1878, Eadweard Muybridge created the first true “motion picture” by placing twelve cameras on a racetrack and rigging the cameras to capture shots in quick sequence as a horse crossed in front of their lenses.

The first film for motion photography was invented in 1885 by George Eastman and William H. Walker, which contributed to the advance of motion photography. Shortly thereafter, the brothers Auguste and Louis Lumiere created a hand-cranked machine called the cinematographe, which could both capture pictures and project still frames in quick succession.

The 1900’s were a time of great advancement for film and motion picture technology. Exploration into editing, backdrops, and visual flow motivated aspiring filmmakers to push into new creative territory. One of the earliest and most famous movies created during this time was The Great Train Robbery, created in 1903 by Edwin S. Porter.

Around 1905, “Nickelodeons”, or 5-cent movie theaters, began to offer an easy and inexpensive way for the public to watch movies. Nickelodeons helped the movie industry move into the 1920’s by increasing the public appeal of film and generating more money for filmmakers, alongside the widespread use of theaters to screen World War I propaganda. After World War I ended and ushered the United States into a cultural boom, a new industry center was on the rise: Hollywood, the home of motion pictures in America.

According to industry myth, the first movie made in Hollywood was Cecil B. DeMille’s The Squaw Man in 1914 when its director decided last-minute to shoot in Los Angeles, but In Old California, an earlier film by DW Griffith, had been filmed entirely in the village of Hollywood in 1910. By 1919, “Hollywood” had transformed into the face of American cinema and all the glamour it would come to embody.

The 1920’s were when the movie industry began to truly flourish, along with the birth of the “movie star”. With hundreds of movies being made each year, Hollywood was on the rise as an American force. Hollywood alone was considered a cultural icon set apart from the rest of Los Angeles, emphasizing leisure, luxury, and a growing “party scene”.

Hollywood was the birthplace of movie studios, which were of great importance to America’s public image in the movie industry. The earliest and most affluent film companies were Warner Brothers Pictures, Paramount, RKO, Metro-Goldwyn-Mayer, and 20th Century Fox, each of whom owned their own film production sets and studios. Universal, United Artists, and Columbia Pictures were also considered noteworthy, despite not owning their own theaters, while Disney, Monogram, and Republic were considered third-tier.

This age also saw the rise of two coveted roles in the movie industry: the director and the star. Directors began to receive greater recognition for using and trademarking personal styles in the creation of their films, which previously in history had not been possible due to limitations in filmmaking technology. Additionally, movie stars began to receive greater fame and notoriety due to increases in publicity and shifts in American trends to value faces from the big screen.

The 1930’s was considered the Golden Age of Hollywood. A new era in film history began in this decade with the introduction of sound into film, creating new genres such as action, musicals, documentaries, social statement films, comedies, westerns, and horror movies. The use of audio tracks in motion pictures created a new viewer dynamic and also initiated Hollywood’s leverage in the upcoming World War II.

The early 1940’s were a tough time for the American film industry, especially after the attack on Pearl Harbor by the Japanese. However, production saw a rebound due to advances in technology such as special effects, better sound recording quality, and the beginning of color film use, all of which made movies more modern and appealing.

Like all other American industries, the film industry responded to World War II with increased productivity, creating a new wave of wartime pictures. During the war, Hollywood was a major source of American patriotism by generating propaganda, documentaries, educational pictures, and general awareness of wartime need. The year 1946 saw an all-time high in theater attendance and total profits.

The 1950’s were a time of immense change in American culture and around the world. In the post-war United States, the average family grew in affluence, which created new societal trends, advances in music, and the rise of pop culture – particularly the introduction of television sets. By 1950, an estimated 10 million homes owned a television set.

A shift in demographics created a change in the film industry’s target market, which began creating material aimed at American youth. Instead of traditional, idealized portrayals of characters, filmmakers started creating tales of rebellion and rock n’ roll. This era saw the rise of films featuring darker plot lines and characters played by “edgier” stars like James Dean, Ava Gardner, and Marilyn Monroe.

The appeal and convenience of television caused a major decline in movie theater attendance, which resulted in many Hollywood studios losing money. To adapt to the times, Hollywood began producing film for TV in order to make the money it was losing in movie theaters. This marked the entrance of Hollywood into the television industry.

The 1960’s saw a great push for social change. Movies during this time focused on fun, fashion, rock n’ roll, societal shifts like the civil rights movements, and transitions in cultural values. It was also a time of change in the world’s perception of America and its culture, largely influenced by the Vietnam War and continuous shifts in governmental power.

1963 was the slowest year in film production; approximately 120 movies were released, which was fewer than any year to date since the 1920’s. This decline in production was caused by lower profits due to the pull of television. Film companies instead began to make money in other areas: music records, movies made for TV, and the invention of the TV series.

Additionally, the average film ticket price was lowered to only a dollar, in the hope of winning back former moviegoers. By 1970, the film industry had sunk into a depression that had been developing over the past 25 years. A few studios still struggled to survive and made money in new ways, such as theme parks like Florida’s Disney World. Because of financial struggles, national companies bought out many studios. The Golden Age of Hollywood was over.

With the Vietnam War in full swing, the 1970’s began with an essence of disenchantment and frustration within American culture. Although Hollywood had seen its lowest times, during the late 1960’s, the 1970’s saw a rush of creativity due to changes in restrictions on language, sex, violence, and other strong thematic content. American counterculture inspired Hollywood to take greater risks with new alternative filmmakers.

The rebirth of Hollywood during the 1970’s was based on making high-action and youth-oriented pictures, usually featuring new and dazzling special effects technology. Hollywood’s financial trouble was somewhat alleviated with the then-shocking success of movies like Jaws and Star Wars, which became the highest-grossing movies in film history (at that time).

This era also saw the advent of VHS video players, laser disc players, and films on videocassette tapes and discs, which greatly increased profits and revenue for studios. However, this new option to view movies at home once again caused a decrease in theater attendance.

In the 1980’s, the past creativity of the film industry became homogenized and overly marketable. Designed only for audience appeal, most 1980’s feature films were considered generic and few became classics. This decade is recognized as the introduction of high concept films that could be easily described in 25 words or less, which made the movies of this time more marketable, understandable, and culturally accessible.

By the end of the 1980’s, it was generally recognized that films of that time were intended for audiences who sought simple entertainment, as most pictures were unoriginal and formulaic. Many studios sought to capitalize on advancements in special effects technology, instead of taking risks on experimental or thought-provoking concepts. The future of film looked precarious as production costs increased and ticket prices continued to drop. But although the outlook was bleak, films such as Return of the Jedi, Terminator, and Batman were met with unexpected success.

Due to the use of special effects, the budget of film production increased and consequently launched the names of many actors into overblown stardom. International big business eventually took financial control over many movies, which allowed foreign interests to own properties in Hollywood. To save money, more and more films started to launch production in overseas locations. Multi-national industry conglomerates bought out many studios, including Columbia and 20th Century Fox.

The economic decline of the early 1990’s caused a major decrease in box office revenue, although overall theater attendance was up due to new multiscreen Cineplex complexes throughout the United States. Use of special effects for violent scenes such as car chases and gunfights in high-budget films was a primary appeal for many moviegoers.

Meanwhile, pressure on studio executives to make ends meet while creating hit movies was on the rise. In Hollywood, movies were becoming exorbitantly expensive to make due to higher costs for movie stars, agency fees, rising production costs, advertising campaigns, and crew threats to strike.

VCR’s were still popular at this time, and profits from video rentals were higher than the sales of movie tickets. In 1992, CD-ROM’s were created. These paved the way for movies on DVD, which hit stores by 1997. DVD’s featured a much better image quality as well as the capacity for interactive content, and videotapes became obsolete a few years later.

The turn of the millennium brought a new age in film history with rapid and remarkable advances in technology. The movie industry has already seen achievements and inventions in the 2000’s, such as the Blu-ray disc and IMAX theaters. Additionally, with the advent of streaming services such as Netflix, movies and TV shows can now be watched on smartphones, tablets, computers, and other personal devices almost anywhere in the world.

The 2000’s have been an era of immense change in the movie and technology industries, and more change is sure to come quickly. What new innovations will the future bring us? Only time will tell.

What is Crohn’s disease?

Crohn’s disease is a type of inflammatory bowel disease that may affect any part of the digestive tract, from the mouth through the stomach to the colon and anus. Crohn’s disease may affect its patients in many different ways, with symptoms including pain, cysts, fever, diarrhoea, bleeding from sores in the gut, infection and weight loss. Bowel obstructions and severe constipation are also complications of Crohn’s disease that may result in the patient needing surgery and/or a colostomy bag. Patients with Crohn’s disease are at greater risk of developing bowel cancer.

The exact cause of Crohn’s disease is unknown; however, it has been linked to a combination of environmental, immune and bacterial factors, as well as a patient’s genetic susceptibility to developing the disease.

From these symptoms patients incur a whole range of issues, such as tiredness, lifestyle disruptions, anaemia and nutritional deficiencies. Crohn’s disease may affect a patient’s ability to work, support themselves or even go about their normal lives. A recent survey by Crohn’s and Colitis UK found that patients may even be giving up sport and exercise due to their illness. Because of the wide variance of Crohn’s symptoms, there is no definitive cure or treatment for the disease. Everyone is different and must be treated according to their individual needs; however, as Crohn’s directly affects the digestive tract, there is a huge effort to treat and manage symptoms by moderating and altering diet.

Awareness of Crohn’s disease has increased a great deal in the last 40 years, as patients feel more comfortable discussing the disorder and sharing their experiences with others. What was once a very taboo topic is now common knowledge. This new awareness of Crohn’s disease may help to explain why the number of patients with diagnosed Crohn’s disease is increasing.
