
A straw is a simple thing. It’s a tube, a conveyance mechanism for liquid. The defining characteristic of the straw is the emptiness inside it. This is the stuff of tragedy, and America.

Over the last several months, plastic straws have come under fire from environmental activists who rightly point out that disposable plastics have created a swirling, centuries-long ecological disaster that is brutally difficult to clean up. Bags were first against the wall, but municipalities from Oakland, California, (yup) to Surfside, Florida, (huh!) have started to restrict the use of plastic straws. Of course, now there is a movement afoot among conservatives to keep those plastics flowing for freedom. Meanwhile, disability advocates have pointed out that plastic straws, in particular, are important for people with physical limitations. “To me, it’s just lame liberal activism that in the end is nothing,” one activist told The Toronto Star. “We’re really kind of vilifying people who need straws.” Other environmentalists aren’t sure that banning straws will do much, and point out that it is not an entirely rigorous approach to global systems change, considering that a widely cited estimate for the magnitude of the problem was, umm, created by a smart 9-year-old.

All this to say: The straw is officially part of the culture wars, and you might be thinking, “Gah, these contentious times we live in!” But the straw has always been dragged along by the currents of history, soaking up the era, shaping not its direction, but its texture.

The invention of American industrialism, the creation of urban life, changing gender relations, public-health reform, suburbia and its hamburger-loving teens, better living through plastics, and the financialization of the economy: The straw was there for all these things—rolled out of extrusion machines, dispensed, pushed through lids, bent, dropped into the abyss.

You can learn a lot about this country, and the dilemmas of contemporary capitalism, by taking a straw-eyed view.

People have probably been drinking things through cylindrical tubes for as long as Homo sapiens has been around, and maybe before. Scientists observed orangutans demonstrating a preference for a straw-like tool over similar, less functional things. Ancient versions existed, too.

But in 19th-century America, straws were straw, rye stalks, cut and dried. An alternative did not present itself widely until 1888. That year, Marvin Stone, a Washington, D.C., gentleman, was awarded a patent for an “artificial straw”—“a cheap, durable, and unobjectionable” substitute for natural straws, Stone wrote, “commonly used for the administration of medicines, beverages, etc.”

Workmen created these early artificial straws by winding paper around a thin cylindrical form, then covering them in paraffin. Often, they were “colored in imitation of the natural straw.” Within a decade, these straws appeared often in newspaper items and advertisements across the country.

A typical Stone straw ad from a newspaper in 1899 (Google Books)

Advertising for the Stone straw describes its virtues and emphasizes the faults of the natural straw. Stone’s straws were free from TASTE and ODOR (natural straws were not). Stone’s straws were SWEET, CLEAN, and PERFECT (natural straws could be cracked or musty). You only had to use one Stone straw per drink (not always the case with natural straws).

They worked. They were cheap. They were very popular, and they spawned many imitators: once an artificial straw had been conceived, it just wasn’t that hard for others to make their own, tinkering with the process just enough to route around Stone’s patent. This could be read as a story of individual genius. America likes this kind of story.

But in 1850, long before Stone, Abijah Fessenden patented a drinking tube with a filter attached to a vessel shaped like a spyglass. Disabled people were using drinking tubes in the mid-19th century, as attested to by a patent from 1870. These were artificial, high-value straws; rye was natural and disposable. But it wasn’t until the late 1880s that someone thought to create the disposable, artificial straw.

Why?

Americans were primarily a rural people in the early 19th century. Cities had few restaurants until the 1830s and 1840s. Most that did exist were for very rich people. It took the emergence of a new urban life to spark the creation of the kind of eating and drinking establishment that would enshrine the straw in American culture: the soda fountain.

Carbon dioxide had been isolated decades before, and soda water created with predictably palate-pleasing results, but the equipment to make it was expensive and unwieldy. It wasn’t until the gas was readily available and cheap that the soda fountain became prevalent. In the 1870s, the fountains’ technical refinement met a growing market of people who wanted a cold, sweet treat in the city.

At the same time, the Civil War had intensified American industrialization. More and more people lived in cities and worked outside the home. Cities had saloons, but they were gendered spaces. As urban women fought for greater independence, they, too, wanted places to go. Soda fountains provided a key alternative. Given the female leadership of the late-19th-century temperance movement, soda fountains were drafted onto its side. Sodas were safe and clean. They were soft drinks.

By 1911, an industry book proclaimed the soda fountain the very height of democratic propriety. “Today everybody, men, women and children, natives and foreigners, patronize the fountain,” said The Practical Soda Fountain Guide.

Temperance and public health grew up together in the disease-ridden cities of America, where despite the modern conveniences and excitements, mortality rates were higher than in the countryside. Straws became a key part of maintaining good hygiene and public health. They became, specifically, part of the answer to the scourge of unclean drinking glasses. Cities began requiring the use of straws in the late 1890s. A Wisconsin paper noted in 1896 that already in many cities “ordinances have been issued making the use of wrapped drinking straws essential in public eating places.”

But the laws that regulated health went further. A Kansas doctor campaigned against the widespread use of the “common cup,” which was ... a cup, that many people drank from. Bans began in Kansas and spread.

The Cup Campaigner

In many cases, this cup was eventually replaced by the water fountain (or paper cups). Some factories kept the common cup, but purchased straw dispensers that allowed all to partake individually. “The spectacle of groups of able-bodied men standing around drinking water through straws and out of a common, ordinary drinking cup, prompted no end of facetious comment,” read an item in the Shelbina Democrat of October 11, 1911.

Cup and straw both had to be clean to assure no germs would assail the children (or the able-bodied men). So even the method by which straws were dispensed became an important hygienic indicator. “In some stores, customers are permitted to choose their own straws, and this system would work very well if customers would not finger the straws,” The Practical Soda Fountain Guide lamented.

That led to the development of the straw dispenser, which has a deep lineage. Already, in 1911, the thing existed where you individually pop a straw into reach. That’s it, right below, with the rationale written in: “Protects straws from flies, dust, and microbes.”

The Practical Soda Fountain Guide

To people living through the early 20th century, the straw was a creation of the new public-health regime. “Due to the ‘Yankee mania for sanitation,’ the [American] output of artificial straws has increased from 165 million in 1901 to 4 billion a year at present,” the Battle Creek Enquirer wrote in May 1924. “A manufacturer pointed out yesterday that, laid end to end, these straws would build an ant’s subway 16 times around the world at the equator.”

Four billion straws! There were only 114 million Americans at the time, so that’s 35 straws per capita (though some were exported).

Of course, straw making was improving through all these decades—mechanizing, scaling up—but the straw itself basically stayed the same. According to Sidney Graham—who founded the National Soda Straw Company in 1931, and who competed against Stone and other early straw manufacturers—in a 1988 history of the straw:

Straws were uniform up until the 1930s ... They were tan in color, thin, and exactly 8.5 inches long. Then someone in the soda-bottling business started marketing eight-ounce bottles, and straws grew to 10.5 inches. Various soda fountains began mixing malted milks, and the old straws were too thin. So we started making them thicker. Still, they were all tan in color, like the original straws.

In the interwar years, however, major changes came to straws. In 1937, for example, Joseph Friedman invented the bendy straw at his brother’s soda shop in San Francisco, leading to the design that’s prevalent today.

But what happened to the straw industry is far more interesting than its (limited) technical advances. Three of the biggest names in the industry—Friedman’s Flexi-Straw Company; the Lily-Tulip Cup Corporation, which made popular white straws; and Maryland Cup Corporation—have bumped around the last 80 years like corporate Forrest Gumps.

As it turns out, all three companies’ histories intersect with each other, as well as with structural changes to the American economy. But first, we have to talk about McDonald’s.

Let’s start with Ray Kroc, who built the McDonald’s empire. For about 16 years, beginning in 1922, he sold cups for the Lily-Tulip Cup Corporation, rising to lead sales across the Midwest. “I don’t know what appealed to me so much about paper cups. Perhaps it was mostly because they were so innovative and upbeat,” Kroc recalled in his memoir, Grinding It Out. “But I sensed from the outset that paper cups were part of the way America was headed.”

At first, selling cups was a tough job. Straws were cheap—you could get 100 for nine cents in the 1930s—but cups were many times more expensive. And besides, people could just wash glasses. Why would they need a paper cup? But America was tilting toward speed and disposability. And throwaway products were the future (“innovative and upbeat”). Soda fountains and their fast-food descendants were continuing to grow, spurring more sales of cups and straws. In the end, Kroc called the years between 1927 and 1937 “a decade of destiny for the paper-cup industry.”

Selling all those cups brought Kroc into contact with soda fountains, and eventually he went into business selling milkshake mixers. This led him to Southern California, where he saw the first McDonald’s in operation. He bought his way into the small company and deposed the original owners. With Kroc growing the brand, McDonald’s added 90 franchises between 1955 and 1959. By 1961, Kroc was fully in control of the company, and by 1968, there were 1,000 McDonald’s restaurants.

The first McDonald’s that Ray Kroc opened in Des Plaines, Illinois, is now a museum dedicated to the burger chain. (Reuters/Frank Polich)

The restaurant chain became a key customer for Maryland Cup, which began as an ice-cream-cone bakery in Boston. Its first nonfood product launched under a brand that became nationally famous, Sweetheart. That product? The straw. The name derived from the original packaging, which showed “two children sharing a milkshake, each drinking from a straw and their heads forming the two curved arcs of a heart.”

After the war, the company went into cups, and later other kinds of packaging for the growing fast-food industry. It developed new products for McDonald’s, like those old foam clamshell packages that hamburgers used to come in. It also snatched up the Flexi-Straw Company—along with all its patents and rights—in 1969. Things were going great. The founder’s son-in-law was president of the company in Baltimore; one nephew of the founder ran the McDonald’s relationship; the other ran the plastics division.

Because the future, at that point, had become plastics! In 1950, the world produced 1.5 million tons of plastic. By the late 1960s, that production had grown more than tenfold. Every product was being tried as a plastic thing, and so naturally, the straw became a plastic thing, too. It didn’t happen overnight. It took years for paper straws to lose their cultural salience.

While functionally, paper and plastic straws might have seemed the same, to the keen observer who is the narrator of Nicholson Baker’s dazzling 1988 novel, The Mezzanine, the plastic and paper straw were not interchangeable. Paper did not float. Plastic did: “How could the straw engineers have made so elementary a mistake, designing a straw that weighed less than the sugar-water in which it was intended to stand? Madness!”

But there was a problem: lids, which had come into vogue. Plastic straws could push through the little + slits in the cap. Paper ones could not. The restaurant chains committed fully to plastic straws.

Baker goes on to imagine the ramifications, painting a miniature portrait of the process of path-dependent technological choice, which has helped shape everything from the width of railroad tracks to the layout of your keyboard. The power players went plastic, so everyone had to go plastic. “Suddenly the paper-goods distributor was offering the small restaurants floating plastic straws and only floating plastic straws, and was saying that this was the way all the big chains were going,” Baker writes. Sometimes it all works. Other times, a small pleasure is lost, or a tiny headache is created: “In this way the quality of life, through nobody’s fault, went down an eighth of a notch.”

I can’t prove that this was the precise series of events that took hold among straw engineers, cup distributors, and McDonald’s. Most corporate decision-making of this kind simply doesn’t stick in the nets of history. Yet these differences influence the texture of life every single day, and ever more so, as the owners of corporations become ever further removed from the products they sell. Let’s just say that the logic Baker describes, the way he imagines the development and consequences of these forgettable technologies, squares with the histories that we do know. The very straw engineers that Baker describes might well have been working in the plastics division of the Maryland Cup Corporation, owners of the Sweetheart brand.

Baker was writing in the 1980s, when straws of all kinds had begun to proliferate, and the American economic system entered a period of intense consolidation and financialization. A key component of this new form of capitalism was the “leveraged buyout,” in which private-equity firms descended on old companies, sliced them up, took out huge amounts of debt, and sold off the various components, “unlocking value” for their investors. You might remember this was how Mitt Romney made his fortune. Matt Taibbi described the model in acerbic but not inaccurate terms: “A man makes a $250 million fortune loading up companies with debt and then extracting million-dollar fees from those same companies, in exchange for the generous service of telling them who needs to be fired in order to finance the debt payments he saddled them with in the first place.”

Global competition and offshoring enabled by containerized trade were responsible for some of the trouble American manufacturing encountered in the 1970s and 1980s. But the wholesale restructuring of the economy by private-equity firms to narrow the beneficiaries of business operations contributed mightily to the resentments still resounding through the country today. The straw, like everything else, was swept along for the ride.

In the early 1980s, Maryland Cup’s family-linked executives were on the glide path to retirement. Eighty family members held about half the company’s stock. In 1983, the company had $656 million in revenue, $32 million in profits, and 10,000 employees. It was the biggest disposable-food-product manufacturer in the nation, an empire built on cups, straws, and plastic silverware. The family was ready to cash out.

The big paper and food companies circled Maryland Cup, but it was eventually sold for $534 million to Fort Howard, a paper company that had gone public in the early ’70s and had begun to expand aggressively beyond its Wisconsin base.

The sale was a boon for Maryland Cup’s shareholders, but the company did not fare well under the new management. Following the transaction, the Baltimore Sun relates, Maryland Cup executives flew to dinner with Fort Howard’s hard-charging CEO, Paul Schierl. He brought out a flip chart, on which he’d written the company’s “old” values—“service, quality, responding to customers.” He turned the page to show the company’s “new” values—“profits, profits, profits.” It’s like a scene out of Tommy Boy, or a socialist’s fever dream.

Fort Howard forced deep cuts on the company. Some longtime managers quit. The trappings of the family company went out the window: no more executives dressing up as Santa Claus, no more local charitable contributions. And while Fort Howard was cutting people, it invested in expanding the company’s factories. This was just business. Schierl literally appeared at a sales meeting in a devil’s mask.

Maryland Cup’s struggles intensified after the wave of departures that followed the acquisition. It needed customer volume to keep its new, bigger plants running, so Fort Howard snatched up the Lily-Tulip Cup Corporation in 1986 for another $332 million. Surely there would be synergies. More layoffs came.

Two years later, the private-equity guys struck. Morgan Stanley, which had helped broker Fort Howard’s deals, swooped in and snatched the company for $3.9 billion in one of those famed leveraged buyouts. The whole enterprise was swept off the public markets and into their hands.

One of their moves was to spin out the cup business as Sweetheart Holdings—along with a boatload of debt jettisoned out of Fort Howard. Just eight years inside Fort Howard and a turn through the private-equity wringer had transformed a profitable company into one that still made money on operations in 1991, but was $95 million in the red because it was so loaded up with debt.

The company made layoffs across the country. Retirement health-care benefits were cut, leaving older employees so livid they filed a class-action lawsuit. A huge Wilmington factory closed after McDonald’s got rid of its plastic clamshell packaging for hamburgers, citing environmental concerns over plastic.

In 1993, the company was sold again to a different investment group, American Industrial Partners. Eventually, it was sold yet again to the Solo Cup Company, makers of one-third of the materials necessary for beer pong. And finally, in 2012, Solo was itself sold to Dart Container, a family-owned packaging company that sells a vast array of straws under the Solo brand.

Fort Howard continued on, going back public in 1995, then merging with another paper company, James River, in 1997, to become Fort James. Just three years later, an even bigger paper company, Georgia Pacific, snatched up the combined entity. In 2005, Koch Industries bought all of the combined company’s shares, taking it private again. They still make straws.

While bulk capitalism pushes hundreds of millions of plain plastic straws through the American food system, there are also thousands of variations on the straw now, from the “krazy” whirling neon kind to a new natural straw made from rye stalks advertised on Kickstarter (the entrepreneur calls them “Straw Straws”). There are old-school paper straws and newfangled compostable plastic straws. Stone Straw, founded by the inventor of the artificial straw, even survives in some form as the straw-distributing subsidiary of a Canadian manufacturing concern. Basically, there’s never been a better time to be a straw consumer.

Meanwhile, the country has shed manufacturing jobs for decades, straws contribute their share to a dire global environmental disaster, the economy continues to concentrate wealth among the very richest, and the sodas that pass through the nation’s straws are contributing to an obesity epidemic that threatens to erase many of the public-health gains that were won in the 20th century. Local governments may legislate the use of the plastic straw, but they can’t do a thing about the vast system that’s attached to the straw, which created first disposable products, then companies, and finally people.

The straw is the opposite of special. History has flowed around and through it, like thousands of other bits of material culture. What’s happened to the straw might not even be worth comment, and certainly not an essay. But if it’s not clear by now, straws, in this story, are us, inevitable vessels of the times in which we live.

why DuPont first located the factory in St. John. “They looked at this community and did like they normally do,” said Taylor. “If we find a place where it’s just going to be Negroes, we can set up business there, we can set up shop there, because nobody cares about them.”

You hear ex-professors say it all the time and I’ll add to the chorus: despite nagging precariousness, there’s something profoundly liberating about leaving academe, whereupon you are no longer obliged to give a shit about fashionable thinkers, network at the planet’s most boring parties, or quantify self-worth for scurrilous committees (and whereupon you are free to ignore the latest same-old controversy), for even when you know at the time that the place is toxic, only after you exit (spiritually, not physically) and write an essay or read a novel or complete some other task without considering its relevance to the fascist gods of assessment, or its irrelevance to a gang of cynical senior colleagues, do you realize exactly how insidious and pervasive is the industry’s culture of social control.

There are tragic elements to this adventure, sure. A political symbolism informs my academic career. After months without work, my family suffered financial hardship. And I didn’t matriculate through 22nd grade in order to land a job that requires no college. Then again, neither did I attend so many years of college in order to be disabused of the notion that education is noble.

I pitched honest living to my parents when I told them about the new job. Despite being aware of academe’s ruthless memory, they hoped that I’d one day be a professor again. They probably still do. In a better world, my redemption would happen in the United States. I wanted to quell that expectation. “Even if Harvard offered me a job I’d say no,” I proclaimed with earnest hyperbole.

They feigned support but didn’t believe me. I understand why. It’s hard to imagine coming of age in reverse. Hollywood doesn’t make inspirational movies about struggling to overcome material comfort. We don’t aspire to the working class. Personal fulfillment occurs through economic uplift. We go from the outdoors to the office, from the ghetto to the high-rise, from the bar rail to the capital. That’s the dream, to become a celebrity or a tycoon or, in humbler fantasies, a bureaucrat. But forward progress as material comfort is cultivated through the ubiquitous lie that upward mobility equals righteousness. Honest living is a nice story we tell ourselves to rationalize privation, but in the real world money procures all the honesty we need.

For immigrants, these myths can be acute. I could see my parents struggling between a filial instinct to nurture and an abrupt recognition of their failure. My mom, a retired high school teacher, seemed interested in the logistics of transporting students, but my dad, the original professor, clenched his hands and stared across the table. It’s the only time I’ve seen him avoid eye contact.

If time is a river, the Histomap, created by John B. Sparks and first published by Rand McNally back in 1931, is a raging Mississippi. In that massive river of time, each of humanity’s great civilizations becomes a confluence that ebbs, wanes, and sometimes ebbs again, each a separate current in a river that inexorably rages down to the mouth of the present day.

Author Rand McNally and CompanyAuthor Sparks, John B.Date 1931Short Title The Histomap.Publisher Rand McNally and CompanyPublisher Location ChicagoType TimelineObj Height cm 158Obj Width cm 31Note Histomap is accompanied by a Foreword explaining the purpose and layout of the history. Map and Foreword slide into a green folder with title and relevant information, such as price, on the outside.World Area WorldSubject Pictorial mapSubject HistoricalSubject Data VisualizationFull Title The Histomap. Four Thousand Years Of World History. Relative Power Of Contemporary States, Nations And Empires. Copyright by John B. Sparks. Published by Histomap, Inc. Chicago, Ill. Printed and distributed in the U.S.A. by Rand McNally & Co., Chicago, Ill.List No 1810.001Series No 2Publication Author Sparks, John B.Publication Author Rand McNally and CompanyPub Date 1931Pub Title The Histomap. Four Thousand Years Of World History. Relative Power Of Contemporary States, Nations And Empires. Copyright by John B. Sparks. Published by Histomap, Inc. Chicago, Ill. Printed and distributed in the U.S.A. by Rand McNally & Co., Chicago, Ill.Pub Note See note field above.Pub List No 1810.000Pub Type TimelinePub Height cm 158Pub Width cm 31Image No 1810001Download 1 ►http://www.davidrumsey.com/rumsey/download.pl?image=/D5005/1810001.sid Full Image Download in MrSID FormatDownload 2 ▻https://www.extensis.com/support/geoviewer-9 GeoViewer for JP2 and SID filesAuthors Rand McNally and Company; Sparks, John B.

September 6, 2018 - by Daniel Hautzinger - Last year, a pair of Chicago aldermen proposed renaming a Chicago street to honor the journalist and anti-lynching activist Ida B. Wells, and in July of this year the proposal was approved for a stretch of Congress Parkway. But Congress wasn’t the street originally considered for renaming; rather, it was Balbo Drive.

7th Street became Balbo Drive in 1934, in recognition of Italo Balbo, a leading Italian Fascist under Benito Mussolini. There’s also Balbo Monument east of Soldier Field, a 2,000-year-old column donated by Mussolini to the city the same year. Why does Chicago have a street and monument honoring a Fascist?

In 1933, Balbo led twenty-four seaplanes on a pioneering sixteen-day transatlantic journey from Rome to Chicago, flying over the Century of Progress World’s Fair before landing in Lake Michigan near Navy Pier. Balbo and the pilots were celebrated by Chicago’s high society over the next three days. Chief Blackhorn of the Sioux, who was participating in the World’s Fair, granted Balbo a headdress and christened him “Chief Flying Eagle;” Balbo gave the Chief a Fascist medallion in return. He and his pilots then continued on to New York City. Balbo was featured on the cover of Time magazine and had lunch with President Franklin D. Roosevelt.

The following year, Mussolini sent the column to Chicago to commemorate Balbo’s flight, and it was installed in front of the Fair’s Italian Pavilion. 40,000 people attended its unveiling, and a speech by Balbo was broadcast by radio from Italy. After the defeat of the Fascists in World War II and the revelation of their crimes, Italy’s ambassador to the United States suggested that marks of respect on the column to Balbo and the Fascist government be removed. Despite those changes, the monument still stands, and Balbo Drive retains its name despite the proposal to change it, being a point of pride for many Italian Americans in Chicago.

The World’s Fair was also the site of a subtle protest against fascism in Europe, when a pageant dramatizing Jewish religious history took place in Soldier Field in July of 1933. According to the Chicago Daily News, the event drew 150,000 people of various faiths, and the “spiritual kinship” and “fine fellowship” between Christians and Jews there would “carry rebuke to those who oppress the Jew” in “Hitler’s Germany.”

Two years later, Soldier Field saw a different kind of demonstration that does not seem to have been explicitly anti-Semitic but did feature the Nazi swastika. In 1936, a “German Day” rally included a march with both the American flag and a flag bearing the swastika. But the German American community in Chicago mostly laid low during World War II, careful to conceal their ethnicity and avoid experiencing some of the anti-German sentiment they had already experienced during World War I. However, in 1939 a rally in Merrimac Park supporting the German-American Bund, an organization sympathetic to Nazism and Hitler, attracted several thousand people.

Decades later, a tiny flare-up of support for fascism in Chicagoland attracted outsized national attention. In 1977, a small neo-Nazi group called the National Socialist Party of America sought to hold a demonstration in the northern suburb of Skokie, which had a large population of Jewish people, including some 7,000 survivors of the Holocaust. The suburb originally planned on letting the demonstration happen and moving on, but was convinced by members of its Jewish community to prevent it. (In 1966, the head of the American Nazi Party came to Chicago to march against Martin Luther King, Jr. as Dr. King protested unfair housing practices in the city.)

After passing ordinances that would prevent the demonstration, Skokie was challenged in court by the neo-Nazis, who were supported by the legal backing of the American Civil Liberties Union. The ACLU did not support the views of the group, but rather sought to protect the First Amendment rights of freedom of speech and freedom of assembly. David Goldberger, the ACLU lawyer who led the case, was Jewish.

30,000 members of the ACLU resigned in protest, and financial support for the organization dropped precipitately. Yet the lawyers persevered, fearing that any denial of free speech was a slippery slope. Through various courts, injunctions, and proposed legislation, the neo-Nazis eventually won the case, which even made it to the Supreme Court.

But the neo-Nazis never demonstrated in Skokie. Instead, they staged two marches in Chicago, one downtown and one in Marquette Park. Counter-protesters vastly outnumbered the ten or twenty neo-Nazis in both cases. The leader who spearheaded the marches and garnered the media’s attention during the Skokie case was later convicted for child molestation. (The hapless National Socialist Party of America is famously satirized in the 1980 film Blues Brothers.)

In the wake of the Skokie case, Illinois became the first state to mandate Holocaust education in schools. And in 2009, Skokie became the site of the Illinois Holocaust Museum and Education Center, an implicit rebuke to the attempted Nazi demonstrations of three decades prior.

Interview with Kaggle GrandMaster, Data Scientist: Dr. Bojan TunguzPart 14 of The series where I interview my heroes.Index to “Interviews with ML Heroes”In this very interview, I’m super excited talking to another great kaggler: The Discussions grandmaster: (kaggle: @tunguz, ranked #3), Kernels (Ranked #10) and Competitions Master (Ranked #23): Dr. Bojan TunguzDr. Bojan Tunguz holds a Ph.D. in Applied Physics from the University of Illinois and a masters in Physics from Stanford University.He is currently working as a Data Scientist at H2o.ai, before H2o.ai he had worked at Figure as a Data Scientist and at ZestFinance as a Machine Learning Modeler.About the Series:I have very recently started making some progress with my Self-Taught Machine Learning Journey. But to be honest, it wouldn’t be (...)

They say a picture is worth a thousand words, unless it’s a shredded Banksy, obviously, which is worth around £1m. But how to put a value on the majestic artwork Donald Trump was revealed to have gracing the wall outside the Oval Office, as eagle-eyed viewers of 60 Minutes spotted?

So far, we know of two other “artworks” that Trump has: that Photoshopped picture of his inauguration crowd (dude, let it go), and the electoral college map. It is no wonder Trump wanted to spruce the place up in his own way, given that he referred to the White House as “a dump”. I still cackle at this, given its sheer, disparaging rudeness – like how when Location, Location, Location’s Phil shows a couple around a three-bedroom semi with a north-facing garden, Kirstie mugs to the camera and draws an imaginary knife across her throat.

As Americans try to come to terms with the astonishing prospect that Donald Trump is the new U.S. president, and before he moves into 1600 Pennsylvania Avenue, we were excited to find out which properties he will leave behind. Donald Trump is one of the most well-known real estate moguls in the world and has sold some of the most expensive properties in the U.S. The Trump Organization real estate portfolio includes properties in Virginia, Illinois, Florida, New Jersey, Nevada, California, New York, Connecticut and Hawaii.
https://www.arch2o.com/quick-tour-inside-donald-trump-house

ESCWA Launches Report on Israeli Practices Towards the Palestinian People and the Question of Apartheid

United Nations Under-Secretary-General and Executive Secretary of the UN Economic and Social Commission for Western Asia (ESCWA) Rima Khalaf pointed out today that it is not an easy matter for a United Nations entity to conclude that a State has established an apartheid regime. In recent years, some have labelled Israeli practices as racist, while others have warned that Israel risks becoming an apartheid State. A few have raised the question as to whether in fact it already has.

Khalaf’s remarks were given during a press conference held this afternoon at the UN House, in Beirut, when she launched a report by ESCWA on “Israeli Practices towards the Palestinian People and the Question of Apartheid.”

Khalaf noted that Israel, encouraged by the international community’s disregard for its continual violations of international law, has succeeded over the past decades in imposing and maintaining an apartheid regime that works on two levels. First, the political and geographic fragmentation of the Palestinian people, which enfeebles their capacity for resistance and makes it almost impossible for them to change the reality on the ground. Second, the oppression of all Palestinians through an array of laws, policies and practices that ensure their domination by a racial group and serve to maintain the regime.

The Executive Secretary stressed that the importance of this report is not limited to the fact that it is the first of its kind published by a United Nations body, clearly concluding that Israel is a racial State that has established an apartheid regime. It also provides fresh insight into the cause of the Palestinian people and into how to achieve peace.

Khalaf maintained that the report shows that there can be no solution, be it in the form of two States, or following any other regional or international approach, as long as the apartheid regime imposed by Israel on the Palestinian people as a whole has not been dismantled. Apartheid is a crime against humanity. Not only does international law prohibit that crime, it obliges States and international bodies, and even individuals and private institutions, to take measures to combat it wherever it is committed and to punish its perpetrators. The solution therefore lies in implementing international law, applying the principles of non-discrimination, upholding the right of peoples to self-determination and achieving justice.

Khalaf concluded that the report recognizes that only a ruling by an international tribunal would lend its conclusion that Israel is an apartheid State greater authority. It recommends the revival of the United Nations Centre against Apartheid and the Special Committee against Apartheid, the work of both of which came to an end by 1994, when the world believed that it had rid itself of apartheid with its demise in South Africa. It also calls on States, Governments and institutions to support boycott, divestment and sanctions initiatives and other activities aimed at ending the Israeli regime of apartheid.

The report was prepared, at the request of ESCWA, by two specialists renowned for their expertise in the field: Richard Falk, a former United Nations special rapporteur on the situation of human rights in the Palestinian territories occupied since 1967 and professor emeritus of international law at Princeton University; and Virginia Tilley, a researcher and professor of political science at Southern Illinois University with a wealth of experience in Israeli policy analysis.

Two former special rapporteurs on the situation of human rights in the occupied Palestinian territory, Falk and his predecessor, John Dugard, raised in their reports the issue of whether Israel has actually become an apartheid State and recommended that it be examined more closely. About two years ago, member States requested that the ESCWA secretariat prepare a study on the matter. At the Commission’s twenty-ninth session, held in Doha, Qatar in December 2016, member States adopted a resolution stressing the need to complete the study and disseminate it widely.

The report concludes, on the basis of scholarly enquiry and overwhelming evidence, that Israel has imposed a regime of apartheid on the Palestinian people as a whole, wherever they may be: a regime that affects Palestinians in Israel itself, in the territory occupied in 1967 and in the diaspora.

During the press conference, Khalaf gave the floor to Falk and Tilley, who participated by video conference. Falk said that this study concludes with clarity and conviction that Israel is guilty of the international crime of apartheid as a result of the manner in which it exerts control over the Palestinian people in their varying circumstances. It reached this important conclusion by rigorously applying the definition of apartheid under international law to the contentions of Israeli responsibility for the crime.

Falk added that the study calls, above all, on the various bodies of the United Nations to consider the analysis and conclusions of this study, and on that basis endorse the central finding of apartheid, and further explore what practical measures might be taken to uphold the purpose of the Convention on the Suppression and Punishment of the Crime of Apartheid. It should also be appreciated that apartheid is a crime of the greatest magnitude, treated by customary international law as a peremptory norm, that is, a legal standard that is unconditionally valid, applies universally, and cannot be disavowed by governments or international institutions.

For her part, Dr Tilley noted that it has become entirely clear that “we are no longer talking about risk of apartheid but practice of apartheid. There is an urgency for a response as Palestinians are currently suffering from this regime. There are many references to apartheid in polemics on the Israel-Palestine conflict.” She added that the reference point for a finding of apartheid in Israel-Palestine is not South Africa but international law. She concluded that the key finding is that Israel has designed its apartheid regime around a strategic fragmentation of the Palestinian people, geographically and legally.

James Morris Lawson, Jr. (born September 22, 1928) is an American activist and university professor. He was a leading theoretician and tactician of nonviolence within the Civil Rights Movement.[1] During the 1960s, he served as a mentor to the Nashville Student Movement and the Student Nonviolent Coordinating Committee.[2][3] He was expelled from Vanderbilt University for his Civil Rights activism in 1960, and later served as a pastor in Los Angeles, California, for 25 years.

The Fellowship of Reconciliation (FoR or FOR) is the name used by a number of religious nonviolent organizations, particularly in English-speaking countries. They are linked by affiliation to the International Fellowship of Reconciliation (IFOR).

The Nashville Student Movement was an organization that challenged racial segregation in Nashville, Tennessee during the Civil Rights Movement. It was created during workshops in nonviolence taught by James Lawson. The students from this organization initiated the Nashville sit-ins in 1960. They were regarded as the most disciplined and effective of the student movement participants during 1960.[1]

[...] Vox wrote: “Five experts discuss what a war on the Korean peninsula would look like, how close we are to conflict, and the terrifying consequences.”

Who are those five experts opining on the prospects of a new war?

Andrew C. Weber, a former US assistant secretary of defense; Jung Pak, a former CIA analyst on North Korea; Dave Maxwell, a retired US Army colonel; Tammy Duckworth, a US senator representing Illinois; and Bruce Bennett, a senior researcher at the RAND Corporation, which is bankrolled by the US government.

That is to say, four of the five experts cited by Vox directly worked for the government. The fifth expert works at a think tank that is substantially financed by the Pentagon and does research contract work for it.

In one way, the art selfie app might be seen as a fulfillment of Berger’s effort to demystify the art of the past. As an alternative to museums and other institutions that reinforce old hierarchies, Berger offered the pinboard hanging on the wall of an office or living room, where people stick images that appeal to them: paintings, postcards, newspaper clippings, and other visual detritus. “On each board all the images belong to the same language and all are more or less equal within it, because they have been chosen in a highly personal way to match and express the experience of the room’s inhabitant,” Berger writes. “Logically, these boards should replace museums.” (As the critic Ben Davis has noted, today’s equivalent of the pinboard collage might be Tumblr or Instagram.)

Yet in Berger’s story this flattening represents the people prying away power from “a cultural hierarchy of relic specialists.” Google Arts & Culture is overseen by a new cadre of specialists: the programmers and technology executives responsible for the coded gaze. Today the Google Cultural Institute, which released the Arts & Culture app, boasts more than forty-five thousand art works scanned in partnership with over sixty museums. What does it mean that our cultural history, like everything else, is increasingly under the watchful eye of a giant corporation whose business model rests on data mining? One dystopian possibility offered by critics in the wake of the Google selfie app was that Google was using all of the millions of unflattering photos to train its algorithms. Google has denied this. But the training goes both ways. As Google scans and processes more of the world’s cultural artifacts, it will be easier than ever to find ourselves in history, so long as we offer ourselves up to the computer’s gaze.

I have come to believe over and over again that what is most important to me must be spoken, made verbal and shared, even at the risk of having it bruised or misunderstood. That the speaking profits me, beyond any other effect.

I was forced to look upon myself and my living with a harsh and urgent clarity that has left me still shaken but much stronger. Some of what I experienced during that time has helped elucidate for me much of what I feel concerning the transformation of silence into language and action.

In becoming forcibly and essentially aware of my mortality, and of what I wished and wanted for my life, however short it might be, priorities and omissions became strongly etched in a merciless light, and what I most regretted were my silences. Of what had I ever been afraid? To question or to speak as I believed could have meant pain, or death. But we all hurt in so many different ways, all the time, and pain will either change or end. Death, on the other hand, is the final silence. And that might be coming quickly now, without regard for whether I had ever spoken what needed to be said, or had only betrayed myself into small silences, while I planned someday to speak, or waited for someone else’s words.

I was going to die, if not sooner then later, whether or not I had ever spoken myself. My silences had not protected me. Your silence will not protect you.

What are the words you do not yet have? What do you need to say? What are the tyrannies you swallow day by day and attempt to make your own, until you will sicken and die of them, still in silence? Perhaps for some of you here today, I am the face of one of your fears. Because I am a woman, because I am Black, because I am lesbian, because I am myself — a Black woman warrior poet doing my work — come to ask you, are you doing yours?

And of course I am afraid, because the transformation of silence into language and action is an act of self-revelation, and that always seems fraught with danger. But my daughter, when I told her of our topic and my difficulty with it, said, “Tell them about how you’re never really a whole person if you remain silent, because there’s always that one little piece inside you that wants to be spoken out, and if you keep ignoring it, it gets madder and madder and hotter and hotter, and if you don’t speak it out one day it will just up and punch you in the mouth from the inside.”

In the cause of silence, each of us draws the face of her own fear — fear of contempt, of censure, of some judgment, or recognition, of challenge, of annihilation. But most of all, I think, we fear the visibility without which we cannot truly live.

And that visibility which makes us most vulnerable is that which also is the source of our greatest strength. Because the machine will try to grind you into dust anyway, whether or not we speak. We can sit in our corners mute forever while our sisters and our selves are wasted, while our children are distorted and destroyed, while our earth is poisoned; we can sit in our safe corners mute as bottles, and we will still be no less afraid.

Each of us is here now because in one way or another we share a commitment to language and to the power of language, and to the reclaiming of that language which has been made to work against us. In the transformation of silence into language and action, it is vitally necessary for each one of us to establish or examine her function in that transformation and to recognize her role as vital within that transformation.

For those of us who write, it is necessary to scrutinize not only the truth of what we speak, but the truth of that language by which we speak it. For others, it is to share and spread also those words that are meaningful to us. But primarily for us all, it is necessary to teach by living and speaking those truths which we believe and know beyond understanding. Because in this way alone can we survive, by taking part in a process of life that is creative and continuing, that is growth.

And it is never without fear — of visibility, of the harsh light of scrutiny and perhaps judgment, of pain, of death. But we have lived through all of those already, in silence, except death. And I remind myself all the time now that if I were to have been born mute, or had maintained an oath of silence my whole life long for safety, I would still have suffered, and I would still die. It is very good for establishing perspective.

We can learn to work and speak when we are afraid in the same way we have learned to work and speak when we are tired. For we have been socialized to respect fear more than our own needs for language and definition, and while we wait in silence for that final luxury of fearlessness, the weight of that silence will choke us.

The fact that we are here and that I speak these words is an attempt to break that silence and bridge some of those differences between us, for it is not difference which immobilizes us, but silence. And there are so many silences to be broken.

(Originally delivered at the Modern Language Association’s “Lesbian and Literature Panel,” Chicago, Illinois, December 28, 1977. First published in Sinister Wisdom 6 (1978) and The Cancer Journals (Spinsters, Ink, San Francisco, 1980).)

In 1907, Indiana became the first state to pass a law permitting the sterilization of “confirmed criminals, idiots, imbeciles and rapists.” The Ishmaels were invoked in the drafting of the legislation, under which over 2,300 people were sterilized.

The 1933 Chicago World’s Fair was christened “A Century of Progress.” A lot had changed since Chicago’s first world’s fair in 1893: zeppelins soared, Ford’s assembly line had brought automobiles to the masses, and prefabricated homes were the wave of the future.

And eugenics was now considered by many to be a legitimate science. As such, it received its own exhibit at the fair. One panel in the eugenics exhibit showed the genealogy of the best family in America: the Roosevelts. Juxtaposed to that was another panel, showing the genealogy of the worst family in America: the Ishmaels.

“Among certain charity workers and eugenicists at the time,” says Nathaniel Deutsch, a history professor at UC Santa Cruz and author of Inventing America’s “Worst” Family, “any poor white Upland Southerner living in or around Indianapolis could just be called an Ishmaelite or a member of the Tribe of Ishmael as a way of stigmatizing them.”

By the 1930s, the term “Tribe of Ishmael” had come to designate thousands of people who were not all part of the same biological family — though eugenicists sought to prove hereditary connections between them. They were “a group of degenerates found in Indiana, Kentucky, Ohio, Illinois, Missouri and Iowa,” claimed a leaflet from roughly 1921, which tallied 10,000 so-called Ishmaelites. They were “paupers, beggars and thieves, criminals, prostitutes, wanderers.” The leaflet conceded that “some have become good citizens,” but “the great majority are still mating like to like and producing unsocial offspring.”

Can you guess which city is depicted above? If you look closely, there are plenty of clues: the big arterial river, the blocky buildings crowded onto the pier, the carefully plotted streets overlaid by the occasional loop of highway.

Do you have it? This is Chicago, Illinois, as viewed in Morphology, an exploratory cartographic tool on the Mapzen platform that eschews common communicative elements like color and symbols. Instead, it seeks to show the world, and all its constituent parts, as a series of carefully chosen lines.

As the holidays approach, there has been a heart-wrenching increase in fire deaths of children, highlighting the deplorable housing conditions and systemic poverty within the US. The US Fire Administration (USFA) collects information on civilian casualties due to fire and reports that, as of this writing, 2,152 people have lost their lives in fires. The prior year’s total was 2,290.

The three states most impacted in November were Texas, with 21 lives lost; Illinois, with 16; and California, with 14. Texas had the most fatalities for all of 2016, with 132. The state’s toll stands at 126 thus far in 2017.

The house fire crisis disproportionately impacts the working class, which faces substandard housing conditions, as well as declining living standards.

On Independence Day, 1910, race riots ignited across America. Jack Johnson, a black boxer, had defeated the white Jim Jeffries in a heavyweight fight in the middle of the Reno desert. Cities around the nation, including Houston, New York, St. Louis, Omaha, New Orleans, Little Rock, and Los Angeles, erupted with the anger and vindication of a racially divided country.

The day after, newspapers set about the difficult task of tallying the aftermath. “One man was shot in Arkansas, two negroes were killed at Lake Providence, La.; one negro was killed at Mounds, Ill., and a negro fatally wounded in Roundeye, Va.,” reported one local newspaper, explaining that “the tension that existed everywhere vented itself out chiefly in street shuffles.”

A report from Houston read, “Charles Williams, a negro fight enthusiast, had his throat slashed from ear to ear on a streetcar by a white man, having announced too vociferously his appreciation of Jack Johnson’s victory in Reno.”

In Manhattan’s San Juan Hill neighborhood, a mob set fire to a black tenement while blocking the doorway to prevent the occupants’ escape. In St. Louis, a black crowd marched the streets, pushing whites off the sidewalk and harassing them, before being clubbed and dispersed by police. No one knows how many died in the wake of the Johnson-Jeffries fight, but records show between 11 and 26 were killed. Likely hundreds were assaulted or beaten. To quell the disturbance, cities barred the fight video from being shown in theaters, and Congress tried to pass a bill to ban the screening of all boxing films.

William Pickens, president of the all-black Talladega College, was heartened by the symbolic victory, acknowledging it came at a great cost. “It was a good deal better for Johnson to win and a few negroes to be killed in body for it,” he said, “than for Johnson to have lost and negroes to have been killed in spirit by the preachments of inferiority from the combined white press.”

As Johnson biographer Geoffrey C. Ward pointed out, “No event yielded such widespread racial violence until the assassination of Dr. Martin Luther King, Jr., fifty-eight years later.”

In Washington, two white men were fatally stabbed by black men, with 236 people arrested in that city alone. And in Omaha, a black man was smothered to death in a barber’s chair, while in Wheeling, West Virginia, a black man driving an expensive car — just as the playboyish Jack Johnson was famous for — was beset by a mob and hanged.

Considered a founder of the Photorealist movement, Richard Estes is best known for his paintings of city scenes in New York. Compiling his compositions from multiple source photographs, Estes reconstructs reality in highly convincing renderings. He often incorporates reflective surfaces, such as shop windows and shiny cars, yielding mirrored imagery that serves to enhance what the naked eye is capable of perceiving. In Double Self-Portrait (1976), for example, the artist and an entire street scene behind him are reflected in meticulous detail against the glass façade of a diner.

The political practice of gerrymandering isn’t easy to explain. Attempts often include confusing maps and eye-roll-inducing descriptions of efforts to draw district lines that guarantee one party’s ability to maintain—or win—seats in Congress. But it’s important to how Americans vote. So to show gerrymandering’s effects, one runner decided to illustrate the issue in the run-loving town of Asheville, North Carolina—with a racecourse.

A panel of federal judges ruled on Friday that three of Texas’ congressional districts are illegal, violating the Constitution and the Voting Rights Act. The panel found that Republicans had used race as a motivating factor in redistricting.

Judges Xavier Rodriguez and Orlando Garcia wrote the court’s decision, which comes after a protracted and complex legal battle that began when the new districts were drawn in 2011, following the last census.

So what is one of the worst examples of gerrymandering in the country? What is the example that the national media uses when talking about gerrymandering? Illinois’ 4th congressional district, including parts of Chicago, represented by congressman Luis Gutierrez. Check out the map below and tell me if you don’t think this is the most ridiculous political game ever. Apparently they had to run the district through the middle of Interstate 294 so that they could maintain a contiguous area of homogeneous constituents.
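The trick those district maps exploit is plain arithmetic: under winner-take-all districts, how votes are grouped matters as much as how many votes there are. A minimal Python sketch with invented numbers (not real district data) shows how the same citywide 60/40 split can produce opposite seat outcomes depending on where the lines fall:

```python
# Illustrative only: invented vote counts, not real district data.

def seats_won(districts, party):
    """Count winner-take-all districts where `party` holds a majority.
    Each district is a dict mapping party name -> vote count."""
    return sum(1 for d in districts if d[party] > sum(d.values()) - d[party])

# 500 voters total: 300 for party A, 200 for party B.
# Evenly drawn map: every district mirrors the citywide 60/40 split,
# so A sweeps all five seats.
even_map = [{"A": 60, "B": 40} for _ in range(5)]

# Gerrymandered map: A's voters are "packed" into two lopsided districts,
# leaving B comfortable majorities in the other three. Same 300/200 totals,
# but now the minority party controls most of the delegation.
packed_map = [
    {"A": 100, "B": 0},
    {"A": 100, "B": 0},
    {"A": 34, "B": 66},
    {"A": 33, "B": 67},
    {"A": 33, "B": 67},
]

print(seats_won(even_map, "A"))    # 5 seats for A
print(seats_won(packed_map, "A"))  # 2 seats for A, despite 60% of the vote
```

Real redistricting adds contiguity and population-equality constraints, which is how you end up with a district threaded down the middle of Interstate 294, but the underlying packing-and-cracking arithmetic is exactly this simple.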

The company spent just $80,000 on lobbying in 2003. Today, its parent company, Alphabet, spends more on lobbying than any other corporation – $9.5m in the first half of 2017 alone and $15.4m the previous year. In 2013, the company signed a lease on a 55,000-square-foot office, roughly the same size as the White House, less than a mile away from the Capitol Building.

And it’s not just Google. Facebook, Amazon, Apple and Microsoft – which was hamstrung by its lacklustre early efforts to court policymakers – have been pouring money into Washington.

“They are overwhelming Washington with money and lobbyists on both sides of the aisle,” said Robert McChesney, communications professor at the University of Illinois. “The Silicon Valley billionaires and CEOs are libertarian, low-tax deregulation buddies of the Koch brothers when it comes to talking to Republicans, and dope-smoking, gay rights activist hipsters when they mix with the Democrats.”