Storm Shelters in OKC

Tuesday, June 5, 2001 marked the beginning of an extraordinary time in the history of my beloved Houston. Tropical Storm Allison came to visit that early summer day. The storm moved through quickly, and by Tuesday night it seemed to be gone. Then Friday arrived, and Allison returned, this time from the north, and this time moving slowly. Then the storm stalled. Thousands of people were driven from their homes. Several major hospitals closed just when they were needed most. Dozens of major surface streets, and every major freeway, were covered in high water.

Yet even before the rain stopped, stories of Christian compassion and service to others began to be written. About 75 people had gathered for a couples class at Lakewood Church, one of the largest nondenominational churches in the United States. By the time they were ready to leave, the waters had risen so high that they were stranded. Lakewood's facility stayed high and dry in the middle of one of the hardest-hit parts of town, and refugees from the strengthening storm began arriving at its doorstep. With no advance preparation, and no need of official sanction, those 75 classmates started a disaster shelter that grew to hold more than 3,000 people, the largest of more than 30 shelters established at the height of the storm.

Afterward, Lakewood served as a Red Cross Service Center where help was given to those who had suffered losses. When it became clear that FEMA and Red Cross aid would not be enough, Lakewood and Second Baptist Church of Houston joined to create an adopt-a-family program to help get people back on their feet faster. In the days that followed, armies of Christians arrived at both churches. People of every race, economic standing, and denomination gathered from all over town. Wet, rotted carpet was pulled up and sheetrock removed. Piles of donated clothing, food, and bedding were handed out. Cleaning equipment and elbow grease were put to work removing traces of the damage.

If the story stopped here, it would already be an excellent example of practical ministry in a time of disaster, but it continues. Many other churches served as shelters, and in the days that followed, as Red Cross Service Centers. Scores of new volunteers, many of them Christians, were put through accelerated training and put to work. That Saturday I was trapped in my own subdivision, certain that my family was safe because I worked at Storm Shelters OKC, which was near where I lived. What they would not permit the storm to take was their need to give, their faith, or their self-respect. I watched so many people praising the Lord as they brought gifts of food, clothing, and bedding. I saw young children coming with their parents to give new or rarely used toys to children who had none.

Leaning On God Through Hard Times

Unity Church of Christianity, from a part of town untouched by the storm, sent a large supply of bedding and other goods. A small troupe of musicians and Christian clowns arrived and asked to be allowed to entertain the children in the shelter where I served. We, of course, promptly accepted their offer. They gathered the children in a large empty stretch of floor. They sang, they told stories, they made balloon animals. The frightened, at least temporarily displaced children laughed.

When not occupied elsewhere, I did a lot of listening. I listened to distraught survivors and frustrated relief workers. I listened to children trying to make the best of a situation they could not understand. These are only the stories I saw or heard myself. I know that many churches, religious groups, and individual Christians served admirably. I want to thank them for their efforts in this disaster, and I thank the Lord for providing them to serve.

I didn't write this so you would feel sorry for Houston or its people. Rather, what I saw as this disaster unfolded strengthened my belief that the Lord will provide for us through our brothers and sisters in faith. No matter how badly your community is hit, you, the individual Christian, can be part of the remedy. Those blankets you have stored away and will probably never use mean a great deal to people who have none. You can help if you can drive. You can help if you can make up a cot. You can help if you can scrub a wall. You can help if all you can do is sit and listen. Large catastrophes like Allison get a lot of attention, but a disaster can come in any size. A single house fire is a serious disaster to the family that called it home. It will be generations before the people here forget Allison.

Firms investing in this sector can research, develop, and produce, and enjoy the advantages of a global oil and gas portfolio without the usual political and economic disadvantages. The US permitting regime and financial conditions are rated among the best in the world, and petroleum produced in the US is sold at international prices. The firms are also likely to gain because the US has a booming domestic market. Most petroleum exploration in the US has been concentrated around the Taranaki Basin, where some 500 exploration wells have been drilled. The country's other sedimentary basins, by contrast, remain largely unexplored; many show evidence of petroleum seeps, and survey data have also revealed structures with high hydrocarbon potential. There have been gas discoveries in the past, including the Great South, East Coast, and offshore Canterbury basins.

Demand for petroleum is expected to grow strongly during this period, which only brightens the future expectations for this sector. Demand for petroleum is anticipated to reach 338 PJ per annum, and the US government is eager to augment the oil and gas supply. Because new discoveries are required to meet domestic demand, raise the level of self-reliance, and minimize spending on petroleum imports, oil and gas exploration is considered one of the sunrise sectors. The US government has devised a distinctive approach to reach its oil and gas exploration targets: it has developed a "Benefit For Attempt" model for petroleum and gas exploration projects in the US.

"Benefit For Attempt" in the current analysis is defined as oil reserves found per kilometer drilled. It helps derive an estimate of the reserves found for each kilometer drilled and each dollar spent on exploration. The US government has shown considerable signs that it will introduce changes favoring the exploration of new oil reserves, since the cost of exploration has an adverse effect on exploration activity. The government has made information about the country's oil potential available in its study report. Transparency of information on royalty and allocation regimes, and simplicity of process, have enhanced the attractiveness of the petroleum and natural gas sector in the United States.
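The reserves-per-kilometer metric described above is simple arithmetic. The sketch below illustrates it with made-up example figures; the input numbers are assumptions for illustration only, not real survey data.

```python
# Hypothetical illustration of the "Benefit For Attempt" metric:
# reserves found per kilometer drilled, and per exploration dollar spent.
# All input figures are invented examples, not real data.

def benefit_for_attempt(reserves_bbl: float, km_drilled: float) -> float:
    """Return reserves discovered per kilometer drilled."""
    return reserves_bbl / km_drilled

def reserves_per_dollar(reserves_bbl: float, cost_usd: float) -> float:
    """Return reserves discovered per exploration dollar spent."""
    return reserves_bbl / cost_usd

reserves = 2_000_000      # barrels discovered (example)
kilometers = 40.0         # total depth drilled across wells (example)
cost = 50_000_000.0       # total exploration spend in USD (example)

print(benefit_for_attempt(reserves, kilometers))  # 50000.0 bbl/km
print(reserves_per_dollar(reserves, cost))        # 0.04 bbl/USD
```

A higher ratio means each kilometer (or dollar) of exploration is yielding more discovered reserves, which is why falling exploration costs favor new drilling.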

Petroleum was the third-biggest export earner for the US in 2008, and the opportunity to sustain the sector's growth is broadly available through new exploration ventures. The government is poised to maintain the momentum in this sector. Many firms are now active with new exploration projects in the Challenger Plateau, the Northland East Slope Basin region, the outer Taranaki Basin, and the Bellona Trough region. The 89 Energy oil and gas sector reassures foreign investors: to encourage growth, the government declared a five-year continuation of an exemption for offshore petroleum and gas exploration in its 2009 budget, and it provides nonresident rig operators with tax breaks.

Heating, ventilation, and air conditioning systems collect pollutants and contaminants such as mold, debris, dust, and bacteria that can have an adverse impact on indoor air quality. Most people are now aware that indoor air pollution can be a health concern, and the field has thus gained increased visibility. Studies have also suggested that cleaning these systems improves their efficiency and contributes to a longer operating life, along with maintenance and energy cost savings. Duct cleaning is the cleaning of the components of forced-air heating, ventilation, and cooling systems. Robots are an advantageous tool, improving both the cost and the efficiency of the procedure; indeed, using modern duct-cleaning robots is no longer a new practice.

A clean air duct system creates a cleaner, healthier indoor environment while lowering energy costs and increasing efficiency. As we spend more hours indoors, air duct cleaning has become an important part of the cleaning industry. Indoor pollutant levels can build up, and health effects can show up immediately or years after repeated or prolonged exposure. These effects range from respiratory diseases to cardiovascular disease and cancer, conditions that can be debilitating or deadly. It is therefore wise to ensure that indoor air quality is not compromised inside buildings. According to the Environmental Protection Agency, levels of dangerous pollutants found indoors can exceed those of outdoor air pollutants.

Duct cleaning by Air Duct Cleaning Edmond professionals removes both visible contaminants and microbial contaminants that may not be visible to the naked eye. These contaminants can affect indoor air quality and present a health hazard, and air ducts can host a number of hazardous microbial agents. Legionnaires' disease is one illness that has drawn public notice, because our modern environments support the growth of the bacteria that cause the affliction and have the potential to produce outbreaks. Typical disease-fostering environments include moisture-producing equipment, such as poorly maintained cooling towers in air-conditioned buildings. In short, in designing and building systems to control our environment, we have created ideal conditions for this disease. Those systems must be properly monitored and maintained; that is the secret to controlling it.

Robots allow the job to be done faster while sparing workers from exposure. Evidence of technological progress in the duct cleaning business is apparent in the variety of equipment now available, such as the range of robotic gear used in air duct cleaning. Robots are invaluable in hard-to-reach places. Robots once used only to view conditions inside the duct can now be used for spraying, cleaning, and sampling procedures. The remote-controlled robotic gear can be fitted with tool and fastener attachments to serve many different functions.

Video recorders and a closed-circuit television camera system can be attached to the robotic gear to view conditions and operations, and for documentation purposes. Inspection devices on the robot examine the inside of the ducts. Robots can travel to particular sections of the system and maneuver around obstacles. Some combine functions that enable full cleaning operation and fit into small ducts. They deliver a useful viewing range, with models performing inspection, cleaning, disinfection, coating, and sealing tasks economically.

The remote-controlled robotic gear comes in various shapes and sizes for different uses. The first use of robotic video cameras was in the 1980s, to record conditions inside ducts. Robotic cleaning systems now have many more uses: these devices provide improved access for better cleaning and reduce labor costs. Recently, applications for small mobile robots in the service industries have expanded, including uses for inspection and duct cleaning.

More improvements are being considered to make an already productive tool even more effective. If you decide to have your heating, ventilation, and cooling system cleaned, it is important to make sure the provider is qualified and cleans all parts of the system. Failure to clean one part of a contaminated system can lead to re-contamination of the entire system.

Reducing or dismissing the charges or fees against a DWI offender requires a qualified Sugar Land criminal defense attorney, so undoubtedly everyone in that position needs a DWI attorney. Even for a first-time violation the penalties can be severe, so being represented by a qualified DWI attorney is vitally important. If you are facing subsequent charges for DWI, the punishments can be harsher still and may include felony charges. Finding a good attorney is thus a task you should undertake as soon as possible.

Every state in America makes its own laws and legislation regarding DWI violations, so bear in mind that you should hire a DWI attorney who practices in the state where the violation occurred. They will have the knowledge and experience of the relevant state law to defend you adequately, and they will be familiar with the processes and tests performed to establish your guilt.

As your attorney, they will examine the tests completed at the time of your arrest and the accompanying police evidence to assess whether those tests were accurately performed, whether they were carried out by competent staff, and whether the correct procedures were followed. Police testimony can also be challenged in court, although it is not often that police testimony is argued against.

When you start looking for a DWI attorney, you should try to find someone who specializes in these kinds of cases. While many attorneys may be willing to take on your case, a lawyer who specializes in these cases has the skilled knowledge needed to interpret the scientific and medical tests conducted when you were detained. The first consultation is free and gives you the chance to ask about their experience with these cases and their fees.

Many attorneys work for an hourly fee or on a set-fee basis determined by the kind of case. You may find that the way they are paid can be matched to your financial situation, and you may be able to negotiate the terms of their fee. If you cannot afford to hire a private attorney, you can request a court-appointed attorney paid for by the state. Before you hire a DWI attorney, you should make sure you understand the precise charges against you and know when you are expected to appear in court.

The credit card makes your life easier by supplying an amazing set of options. The credit card is a means of retail trade settlement: a credit system worked through the small plastic card that bears its name. The card itself always takes the same structure, size, and shape, as defined by the ISO 7810 standard. A strip of special material on the card (the substance resembles that of a floppy disk or magnetic tape) stores all the necessary data, and this magnetic strip enables the credit card's validation. The design has also become an important factor: an enticing credit card design is essential, while still ensuring the card's reliability and its information-keeping properties.

A credit card is supplied to the user only after a bank approves an account, weighing a varied range of factors to ascertain financial reliability. This bank is the credit provider. When an individual makes a purchase, he must sign a receipt to verify the transaction; the receipt carries the card details and the amount of money to be paid. Many shops accept electronic authorization for credit cards and use cloud tokenization for authorization. Nearly all verifications are made using a digital verification system, which makes it possible to check that the card is valid. Any retailer may also check whether the customer has enough money to cover the purchase he is attempting to make while staying within his credit limit.
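Card numbers themselves carry a simple built-in validity check that digital verification systems can apply before any account lookup. As a minimal sketch (this is the widely used Luhn checksum, not any bank's full authorization logic, which also involves the issuer's network, the account balance, and the credit limit):

```python
def luhn_valid(card_number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum.

    This only catches typos and simple transcription errors; real
    authorization also verifies the account, limit, and issuer network.
    """
    digits = [int(d) for d in card_number if d.isdigit()]
    total = 0
    # Walk digits from the right; double every second one,
    # subtracting 9 whenever doubling produces a two-digit number.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("4111111111111111"))  # True  (a common test number)
print(luhn_valid("4111111111111112"))  # False (last digit corrupted)
```

Because the check needs no network access, a merchant terminal can reject a mistyped number instantly, before the slower account verification step.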

As the credit provider, it is up to the bank to keep the user informed of his statement. Banks typically send monthly statements detailing each transaction made through the card, the outstanding fees, and the sums owed. This allows the cardholder to confirm that all the payments are right and to spot mistakes or fraudulent activity to dispute. The bank typically charges interest and establishes a minimum repayment amount due by the end of the following billing cycle.

The precise way the interest is charged is normally set out in an initial agreement, and the provider specifies these elements on the back of the credit card statement. Generally, the credit card is a simple form of revolving credit from one month to the next. It can also be a sophisticated financial instrument, with several balance segments that afford finer-grained credit management. Interest rates may also differ from one card to another. Credit card promotion services use appealing incentives to keep their customers and pick up new ones along the way.
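The mechanics of revolving credit described above can be made concrete with a rough sketch. The APR, carried balance, and minimum-payment rule below are invented examples for illustration; actual terms are set by each card's initial agreement.

```python
# Hypothetical revolving-credit illustration. The APR, balance, and
# minimum-payment rule are invented examples, not any real card's terms.

apr = 0.18                 # 18% annual percentage rate (example)
balance = 1_000.00         # balance carried into this cycle (example)

monthly_rate = apr / 12                    # interest accrues monthly
interest = balance * monthly_rate          # interest added this cycle

# Example rule: minimum payment is 2% of the new balance, floored at $25.
minimum_payment = max(25.0, 0.02 * (balance + interest))

print(round(interest, 2))         # 15.0
print(round(minimum_payment, 2))  # 25.0
```

Paying only the minimum leaves most of the balance revolving into the next cycle, where it accrues interest again, which is why carried balances grow expensive.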

One solution for keeping the revenue of your rental home while removing much of the anxiety is to contact and engage a property management firm in Oklahoma City, Oklahoma. If you are considering this service and want to know more, please read the rest of the post. As many landlords understand, leasing out your piece of real property can be a real cash cow, but that cash flow usually comes with a tremendous amount of worry. Late-night phone calls from tenants, overdue lease payments you must chase down, overflowing lavatories, and the trouble of marketing the house whenever you have a vacancy take a lot of the pleasure out of earning money from rentals. Engaging a property management organization removes much of that anxiety.

These businesses act as the go-between for you and the tenant. When you hire a property management company, the tenant never actually needs to know who you are. The company manages the day-to-day relationship with the tenant while you retain the ability to make the final judgments about the home. If you have a vacant unit, the company can handle the marketing for you. Since the company has more connections in a bigger market, and knows the industry better than you do, you will find your unit gets filled much more quickly with their help. In addition, the property management company will take care of screening prospective tenants. Depending on the arrangement you have, you may still get the final say on whether a tenant is qualified for the unit, but the day-to-day difficulty of finding a suitable tenant is no longer your problem. They will also manage the before-move-in inspections as well as the inspections required after a tenant moves out.

Once the unit is filled, you can step back and watch the profits. If there is an issue, the company will handle communication with the tenant. You will not be telephoned when a pipe bursts in the middle of the night; the tenant calls your representative at the company, who then makes the arrangements required to get the issue repaired by a maintenance provider. You may get a phone call a day later, or you may not know there was a problem until you check in with the business. The property management organization will also collect your rental payments for you. If a tenant is late making a payment, the company will do what is required to collect. In certain arrangements, the organization will even take over paying the taxes, insurance, and mortgage on the piece of property. You need do nothing but enjoy, after all the bills are paid, the revenue that is sent your way.

With all these advantages, you are probably wondering what the downside to employing a property management organization must be. The primary factor that stops some landlords from hiring one is the price: you will pay for all of these services. You must weigh that price against the amount of time you will save, time you can then use to pursue additional revenue-producing ventures or simply to enjoy the fruits of your investment work.

Orthodontics is the specialty of dentistry centered on the diagnosis and treatment of dental and related facial problems. The results of Norman Orthodontist OKC treatment can be dramatic: lovely smiles, improved oral health and aesthetics, and an improved quality of life for many individuals of all ages. Whether cosmetic dental attention is needed or not is an individual's own choice. Most folks tolerate conditions like various kinds of bite issues or overbites and do not get treated. Nevertheless, a number of us feel more assured with teeth that are properly aligned, appealing, and simpler to care for. Dental attention may enhance both appearance and structural strength, and it may also help you speak with clarity or chew better.

Orthodontic attention is not only cosmetic in character. It may also benefit long-term oral health. Straight, properly aligned teeth are easier to floss and clean, which may ease cleaning and decrease the risk of decay. It may also prevent gingivitis, an irritation that damages gums. Gingivitis can end in periodontitis, which occurs once microorganisms gather around the place where the teeth and the gums meet. Untreated periodontitis can end in infection; such an illness can result in tooth loss and may destroy the bone that surrounds the teeth. People with harmful bites chew with less efficiency, and a few of us with a serious bite problem may have difficulty obtaining enough nutrients. This can happen when the teeth are not aligned correctly. Repairing bite issues can make it easier to chew and digest meals.

One may also have speech problems when the top and lower front teeth do not align right. These are fixed through therapy, occasionally combined with surgical help. Finally, treatment may help avoid early wear of rear teeth. As you bite down, your teeth absorb an enormous amount of pressure; if your top teeth do not match properly, they will cause your back teeth to wear down. The most frequently encountered forms of therapy are braces (or retainers) and headgear. However, a lot of people complain about pain with this technique that, unfortunately, is also unavoidable. Braces can hurt during sports, and other individuals have trouble talking. Dentists, though, say the hurting normally disappears within several days, though occasionally the braces cause annoyance. If you would like to avoid more unpleasant sensations, you should avoid hard, sticky, and chewy food. In addition, do not take your braces off unless the medical professional says so.

It is advised that you see your medical professional often for examinations, to head off potential problems that may appear while undergoing therapy. If necessary, you will be prescribed a specific dental hygiene regimen. Dental specialists today look out for the identification and management of malocclusion. Orthodontia, the relevant specialization of medicine, mainly targets repairing jaw and teeth problems, and thus your smile and your bite. Dentists, however, do not only do jaw remedies and emergency dental work; they also handle mild to severe dental conditions that may grow into risky states. You do not have to measure your whole life by one predicament. See a dental specialist, and you will notice just how stunning your smile will soon be.

To most of us, a 3-D-printed turtle just looks like a turtle; four legs, patterned skin, and a shell. But if you show it to a particular computer in a certain way, that object’s not a turtle — it’s a gun.

Objects or images that can fool artificial intelligence like this are called adversarial examples. Jessy Lin, a senior double-majoring in computer science and electrical engineering and in philosophy, believes that they’re a serious problem, with the potential to trip up AI systems involved in driverless cars, facial recognition, or other applications. She and several other MIT students have formed a research group called LabSix, which creates examples of these AI adversaries in real-world settings — such as the turtle identified as a rifle — to show that they are legitimate concerns.

Lin is also working on a project called Sajal, which is a system that could allow refugees to give their medical records to doctors via a QR code. This “mobile health passport” for refugees was born out of VHacks, a hackathon organized by the Vatican, where Lin worked with a team of people she’d met only a week before. The theme was to build something for social good — a guiding principle for Lin since her days as a hackathon-frequenting high school student.

“It’s kind of a value I’ve always had,” Lin says. “Trying to be thoughtful about, one, the impact that the technology that we put out into the world has, and, two, how to make the best use of our skills as computer scientists and engineers to do something good.”

Clearer thinking through philosophy

AI is one of Lin’s key interests in computer science, and she’s currently working in the Computational Cognitive Science group of Professor Josh Tenenbaum, which develops computational models of how humans and machines learn. The knowledge she’s gained through her other major, philosophy, relates more closely to this work than it might seem, she says.

“There are a lot of ideas in [AI and language-learning] that tie into ideas from philosophy,” she says. “How the mind works, how we reason about things in the world, what concepts are. There are all these really interesting abstract ideas that I feel like … studying philosophy surprisingly has helped me think about better.”

Lin says she didn’t know a lot about philosophy coming into college. She liked the first class she took, during her first year, so she took another one, and another — before she knew it, she was hooked. It started out as a minor; this past spring, she declared it as a major.

“It helped me structure my thoughts about the world in general, and think more clearly about all kinds of things,” she says.

Through an interdisciplinary class on ethics and AI ethics, Lin realized the importance of incorporating perspectives from people who don’t work in computer science. Rather than writing those perspectives off, she wants to be someone inside the tech field who considers issues from a humanities perspective and listens to what people in other disciplines have to say.

Teaching computers to talk

Computers don’t learn languages the way that humans do — at least, not yet. Through her work in the Tenenbaum lab, Lin is trying to change that.

According to one hypothesis, when humans hear words, we figure out what they are by first saying them to ourselves in our heads. Some computer models aim to recreate this process, including recapitulating the individual sounds in a word. These “generative” models do capture some aspects of human language learning, but they have other drawbacks that make them impractical for use with real-world speech.

On the other hand, AI systems known as neural networks, which are trained on huge sets of data, have shown great success with speech recognition. Through several projects, Lin has been working on combining the strengths of both types of models, to better understand, for example, how children learn language even at a very young age.

Ultimately, Lin says, this line of research could contribute to the development of machines that can speak in a more flexible, human way.

Hackathons and other pastimes

Lin first discovered her passion for computer science at Great Neck High School in Long Island, New York, where she loved staying up all night to create computer programs during hackathons. (More recently, Lin has played a key role in HackMIT, one of the Institute’s flagship hackathons. Among other activities, she helped organize the event from 2015 to 2017, and in 2016 was the director of corporate relations and sponsorship.) It was also during high school that she began to attend MIT Splash, a program hosted on campus offering a variety of classes for K-12 students.

“I was one of those people that always had this dream to come to MIT,” she says.

Lin says her parents and her two sisters have played a big role in supporting those dreams. However, her knack for artificial intelligence doesn’t seem to be genetic.

“My mom has her own business, and my dad is a lawyer, so … who knows where computer science came out of that?” she says, laughing.

In recent years, Lin has put her computer science skills to use in a variety of ways. While in high school, she interned at both New York University and Columbia University. During Independent Activities Period in 2018, she worked on security for Fidex, a friend’s cryptocurrency exchange startup. The following summer she interned at Google Research NYC on the natural language understanding team, where she worked on developing memory mechanisms that allow a machine to have a longer-term memory. For instance, a system would remember not only the last few phrases it read in a book, but a character from several chapters back. Lin now serves as a campus ambassador for Sequoia Capital, supporting entrepreneurship on campus.

She currently lives in East Campus, where she enjoys the “very vibrant dorm culture.” Students there organize building projects for each first-year orientation — when Lin arrived, they built a roller coaster. She’s helped with the building in the years since, including a geodesic dome that was taller than she is. Outside of class and building projects, she also enjoys photography.

Ultimately, Lin’s goal is to use her computer science skills to benefit the world. About her future after MIT, she says, “I think it could look something like trying to figure out how we can design AI that is increasingly intelligent but interacts with humans better.”

This week MIT hosted its second annual summit on “AI and the Future of Work,” bringing together representatives from industry, government and academia to discuss the opportunities and challenges of artificial intelligence (AI) and automation.

Co-hosted by MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Initiative on the Digital Economy (IDE), the event featured former Alphabet chairman Eric Schmidt and Massachusetts Secretary of Labor Rosalin Acosta, who delivered the keynote address.

A common theme throughout the event was the importance of doing more than just thinking about technological disruption and actually working to create public policy that encourages the thoughtful deployment of AI systems.

“The technologies themselves are neutral, so the question is how to organize ourselves in society in a way that addresses their potential to change the job market,” said Diana Farrell, CEO of the JPMorgan Chase Institute. “We’re kidding ourselves if we think that the market is going to, on its own, allow these technologies to infiltrate and yield the kind of outcomes from society that we want.”

The focus on public policy also extended to education. Many panelists spoke of the importance of lifelong learning, in the form of a burgeoning industry of free and low-cost online classes to pick up skills in fields like machine learning and data science that have seen major job growth.

Some speakers believed that future focus needs to happen much earlier in the educational pipeline. Fred Goff, who serves as CEO of the blue-collar job-search network Jobcase, surveyed the platform’s 90 million members about their education. Half said that their K-12 background prepared them for their job today, but less than a quarter said that they think their education will prepare them for the jobs of tomorrow.

Beyond the U.S., industry analysts spoke about the importance of considering AI in the context of the developing world, where there is often low digital literacy.

“How do we support people in remote and isolated areas so that they don’t fall further behind?” asked Tina George, an expert in global technologies for the World Bank. “We can’t build Star Wars with Flintstone technology.”

There was also a growing recognition that in industry, AI could actually become something of an equalizer, especially in areas like mergers and acquisitions that rely heavily on data analysis.

“It no longer requires a multi-million dollar budget to get AI going in your company,” said Nichole Jordan, a managing partner at Grant Thornton LLP. “It represents an opportunity to level the playing field for smaller companies.”

On the academic side, CSAIL Director Professor Daniela Rus discussed the many ways that scientists are using AI for everything from diagnosing disease to predicting food shortages. At the same time, she talked about how important it is for researchers to be thoughtful and intentional as they work on these new breakthroughs.

“AI should be able to help us all get lifted to better lives, and I think there is a lot of potential still untapped,” Rus said in video remarks. “But we can’t just push technology forward and hope for the best. We have to work to ensure that the best happens.”

Today the Boston Lyric Opera presents the world premiere of “Schoenberg in Hollywood,” a new opera by Tod Machover, the Muriel R. Cooper Professor of Music and Media and director of the MIT Media Lab’s Opera of the Future group. Performances will run through Nov. 18.

“Schoenberg in Hollywood” is inspired by the life of Austrian composer Arnold Schoenberg after he fled Hitler’s Europe in the 1930s. After moving first to Boston and then to Los Angeles, Schoenberg sought connection with his new culture through music. He forged a friendship with famous comedian Harpo Marx, who introduced him to MGM’s Irving Thalberg, who in turn offered him the opportunity to compose a score for the film “The Good Earth.”

Schoenberg ultimately turned down the commission, rejecting the lure of more money and greater fame in favor of his artistic integrity (and after proposing highly unrealistic artistic and financial terms). In doing so, Schoenberg chose a path of fidelity to his heritage and his musical identity — a decision that pitted change against tradition, art against entertainment, and personal struggle against public action.

Machover’s opera is bookended by the Thalberg meeting, after which the fictional Schoenberg goes off to make a film about his own life. This imagined creation follows the narrative of Schoenberg’s historical journey up to a point, then diverges in a wild fantasy to imagine a different path had Schoenberg been able to reconcile opposing forces. Drawing on inspirations ranging from Jewish liturgical music to Bach and a World War I soundscape to contemporary 20th century music, Machover illustrates Schoenberg’s personal evolution through a synthesis of shifting influences.

“I immersed myself in Schoenberg’s world through his extensive — and incredible — writings, his music, his paintings, through visiting his amazing archives in Vienna, and by speaking with many people who knew him,” Machover explains. “But I grew up with Schoenberg’s music, so have been thinking about this for a very long time. It is part of me.”

Machover also drew on his own experience as a composer in a rapidly changing world to inform his interpretation of Schoenberg’s musical and personal journey.

“The work explores one man’s journey to move millions to social and political action while remaining deeply thoughtful and thoroughly ethical,” Machover says. “The underlying artistic, activist, and ethical questions raised in this opera are ones that we ask every day at the Media Lab.”

The opera is also uniquely informed by Machover’s dual roles as artist and technologist. It blends reality and fantasy, combining live singers and actors with diverse media, and blending acoustic sound with complex electronics spread throughout the theater, while incorporating physical stage effects that modify perspective and perception in unusual ways.

“The Media Lab is the only environment I know where the forms and technologies of this opera could have been imagined and developed,” Machover says.

A polymath and inventor, Schoenberg never earned a degree from any academic or musical institution, but became the top composition professor at the renowned Berlin Conservatory of Music (before being expelled immediately upon Hitler’s rise to power). His depth of knowledge informed but never limited his own musical explorations. His invention of 12-tone technique, which Schoenberg described as “a method of composing with 12 tones which are related only with one another,” changed the face of Western music in the 20th century and beyond.

“He invented not only music but all kinds of unusual things, like a new notation system for tennis games (designed to annotate his son’s expert playing), contraptions to draw his own customized music manuscript paper, a curriculum to train movie composers in a purely sonic art, a painting technique to allow him to depict his inner mental state rather than outside physical features in a series of self-portraits,” says Machover. “As an intellect and creator, Schoenberg would have fit right into the Media Lab.”

In celebration not only of the opera’s premiere but also of the Media Lab’s informal adoption of Schoenberg, the lab is now hosting an exhibition on “Schoenberg in Hollywood” in the lobby gallery of Building E14. Videos and archival materials trace Schoenberg’s journey, including materials on loan from the Schoenberg Center in Vienna, most of which have never before been shown in the Boston area. The exhibition also serves as a companion to the opera, offering a listening station, a video trailer of one of the opera’s climactic moments, some of Machover’s own musical sketches, and an illustrated timeline juxtaposing events in Schoenberg’s life with scenes and sounds from Machover’s opera.

“The exhibition is a resonant companion to the opera, useful whether experienced before or after a performance,” explains Machover. “But [it] is also meant to stand alone to introduce the art and life of this remarkable creator to the MIT community and beyond, and to tell at least a bit of the story about why this unusual new opera grew out of inspiration from Arnold Schoenberg … and the MIT Media Lab itself.”

“Schoenberg in Hollywood” runs Nov. 14-18 at the Emerson Paramount Theater in Boston. The Media Lab’s exhibition is currently open to the public and will run through April 30.

In October, the Institute announced the creation of the MIT Stephen A. Schwarzman College of Computing, an ambitious new enterprise that will allow students to better tailor their educational interests to their goals. But the ideas driving this exciting new effort may carry a distant echo — especially among alumni who were at MIT during the 1980s — from the time leadership launched another computing enterprise that dramatically changed how undergraduates and graduate students learned.

Project Athena was a campus-wide effort to make the tools of computing available to every discipline at the Institute and provide students with systematic access to computers. A new project that featured computer workstations and educational programming, Athena was a milestone in the history of distributed systems and produced technologies like Kerberos. It also revolutionized educational computing for the Institute and beyond, and created the computing environment that many students and faculty still work in today.

“Before we had [Athena], our students complained about the lack of computing in such a technology-centered institution,” says Joel Moses, an Institute Professor at MIT and one of the initial leaders of Project Athena. “Athena turned MIT into one of the most computer-rich institutions in the country.”

“The founders of Project Athena believed that computation should be used broadly by a lot of people for a lot of reasons,” says Daniela Rus, the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science and director of MIT’s Computer Science and Artificial Intelligence Laboratory.

“They set out to create an education environment to empower MIT students to do just that. Since then, the MIT faculty and students have left their fingerprints all over the biggest accomplishments in the field of computing from systems to theory to artificial intelligence,” she says.

In 1983, the year Project Athena began, it was still possible for students to receive a science or engineering degree from MIT without ever having touched a computer. That was despite digital computers having been on campus since 1947, when the Navy commissioned Whirlwind I, one of the world’s first real-time computers. (It was powered by vacuum tubes.) But at the time, computers were nearly all provided by research funds which restricted their use.

Pre-Athena, MIT students who needed to use computers could work on timesharing systems such as CTSS. These systems had some drawbacks, though. For one thing, students often had to wait in line at all hours of the day to do their work. In 1969, the Institute moved from CTSS to Multics, which was supported primarily by research funds with limited access for educational purposes. It included a timesharing allotment, which meant that if students went over their allotted time, they weren’t allowed to run any more programs until the allotment refreshed.

“(Before Athena), there was no internet access or email, no way to share files, and no standard anything. There was no @mit.edu address,” says Earll Murman, the director of the latter half of the eight-year project. “Athena changed all of that.”

Even when personal computers first started to appear on campus in the mid-to-late 1980s, they were still too expensive for many students. For about $1,000 a consumer could buy a computer with a 5 MB hard drive — which today is about enough space to store an MP3 of Bonnie Tyler’s “Total Eclipse of the Heart,” the song that topped the charts in Athena’s first year.

Getting going

The leadership at MIT knew that as a technology-centered school, MIT needed to incorporate more computing into its education. So, in May 1983, under the guidance of a committee of faculty from the Department of Electrical Engineering and Computer Science — including then-EECS head Moses; Michael Dertouzos, the director of the Laboratory for Computer Science (now CSAIL); and Dean of the School of Engineering Gerald Wilson — the largest educational project ever undertaken at MIT was launched at the eventual cost of around $100 million. The project was largely paid for with funding from the Digital Equipment Corporation (DEC) and IBM.

The leaders of the project named it “Project Athena” after the ancient Greek goddess of wisdom, war, and the crafts. Unlike her namesake, however, Athena did not spring fully formed and outfitted with programs from the head of her creators. When the project started, it was ambitious and a little vague. Goals spanned from creating a cohesive network of compatible computers, to establishing a knowledge base for future decisions in educational computing, to helping students share information and learn more fully across all disciplines.

To supply the system with some clarity and direction, the committee went to the faculty and asked them to develop their own software for use in their classes and for students to work on. Projects — there were 125 in total — ranged from aerospace engineering simulations to language-learning applications to biology modeling tools.

Athena took off.

“I felt that we would know Athena was successful if we were surprised by some of the applications,” Moses says. “It turned out that our surprises were largely in the humanities.”

One such surprise was the Athena Writing Project, spearheaded by MIT professors James Paradis and Ed Barrett, which aimed to create an online classroom system for teaching scientific writing. The system allowed students to edit and annotate papers, present classwork, and turn in assignments.

The hardware

Of course, in order for students to be able to use all the educational programming, there had to be enough terminals for them to access the system. That’s where Professor Jerome Saltzer came in. While much of the leadership of the project was focused on overseeing the faculty proposals and research, Saltzer stepped in as the technical director of the project in 1983 and led the effort to bring the physical workstations, made by IBM, to all students.

Luckily for Saltzer and MIT, Project Athena was on the cutting edge of distributed systems computing from its inception. The Institute found a range of partners in industry, such as IBM and DEC, that were willing to provide MIT with funding, technology, and hardware.

Project Athena formally ended in 1991. By then the project (and computing in general) had become much more pervasive and commonplace in MIT students’ lives. There were hundreds of Athena workstations located in clusters around the campus, and students were using them to measure blood flow, design airplane wings, practice political science debates, digitally revise their humanities papers, and do hundreds of other things.

Athena’s wisdom today

It has now been 27 years since Project Athena ended, but the Athena computing environment is still a part of everyday life at MIT. There are Athena clusters located around campus, with many workstations hooked up to printers and available to students 24 hours a day, seven days a week (although there are fewer workstations than there once were, and they are typically used for more specialized applications).

Though Project Athena’s main goals were educational, it had long-lasting effects on a range of technologies, products, and services that the MIT community touches every day, often without even knowing it. Athena’s impact can be seen in the integration of third-party software like Matlab into education. Its use of overlapping windows — students could be watching videos in one window, chatting with friends and classmates in another, and working on homework in a third — was the start of the X Window system, which is now common on Unix displays. Athena also led to the development of the Kerberos authentication system (named, in keeping with the Greek mythology motif, after the three-headed dog which guards the Underworld) which is now used at institutions around the world.

For Drew Houston ’05, Athena was a source of inspiration.

“With Athena, you could sit down at any of the (hundreds) of workstations on campus and your whole environment followed you around — not only your files,” he says. “When I graduated, not only did I not have that anymore, but it felt like for most people they didn’t have anything like that, so I certainly saw a big opportunity to deliver that kind of experience to a much larger audience.”

The result was Dropbox, which Houston and his co-founder launched in 2008, allowing users to access their files from any synced device. “When we recruited engineers, part of our pitch was we were trying to build Athena for the rest of the world,” Houston says.

As MIT moves forward with the new college, Vice Chancellor Ian Waitz sees a parallel between the college and Project Athena. Like the new college, Athena was a way to change the structure of MIT’s education and provide a platform for students to create and problem-solve.

“One of the things that we do here is try to provide resources for people to use, and they may even use them in ways that we don’t imagine,” Waitz says. “That’s a pretty broad analogy to a lot of the stuff that we do here at MIT — we bring bright people together and give them the tools and problems to solve, and they’ll go off and do it.”

“Computers have made our daily lives easier in a million ways that people don’t even notice, from online shopping to digital cameras, from antilock brakes to electronic health records, and everything in between,” adds Rus.

“Computing helps us with all the little things and it is also vital to the really big things like traveling to the stars, sequencing the human genome, making our food, medicines, and lives safer,” she says. “Through the MIT Schwarzman College of Computing we will create the education and research environment to make computing a stronger tool and find new ways to apply it.”

Artist Ekene Ijeoma will join the MIT Media Lab, founding and directing the Poetic Justice research group, in January 2019. Ijeoma, who will be an assistant professor, works at the intersections of design, architecture, music, performance, and technology, creating multisensory artworks from personal experiences, social issues, and data studies.

Ijeoma’s work explores topics and issues ranging from refugee migration to mass incarceration. At its most basic level, the work aspires to embody human conditions, expand people’s thoughts, and engage them in imagining change and acting on it. At the lab, Ijeoma will continue this work in developing new forms of justice through artistic representation and intervention.

“New forms of justice can emerge through art that engages with social, cultural and political issues — ones that aren’t tied to codified laws and biased systems,” he says.

When asked to define “poetic justice,” Ijeoma explained that, for him, the phrase is about using code-switching content, form, context, and function to create artwork with rhythm and harmony that extends our perceptions and exposes the social-political systems affecting us as individuals. Examples of this are his “Deconstructed Anthems” project, an ongoing series of music performances and light installations that explores the inequalities of the American Dream and realities of mass incarceration through “The Star-Spangled Banner,” and “Pan-African AIDS,” a sculpture examining the hypervisibility of the HIV/AIDS epidemic in Africa and the hidden one in Black America. “Pan-African AIDS” is on display through April 2019 at the Museum of the City of New York as part of the exhibit Germ City: Microbes and the Metropolis.

Ijeoma’s art practice has been primarily project-based and commission-driven. His recent large works, both deeply conceptual yet highly technical projects, required research and development to happen concurrently with the production of the work. At the Media Lab, with more space for trial and error and failure, he will have the resources and facilities to stay reflective and proactive, to create work outside of commissions, and to expand more artworks into series. In addition, he will have more opportunities to listen to and meditate on issues.

“Like many artists,” said Ijeoma, “a lot of my work comes from vibing and forward thinking — channeling my environment and signaling out the noise.” This aspect of his practice is reflected in work such as “The Refugee Project” (2014), released a few months before the European refugee crisis; “Look Up” (2016), released a few days before Pokémon Go; and more recently “Pan-African AIDS,” which was presented as news was breaking on the underreported AIDS epidemic in black populations in areas including the American South.

Ijeoma’s work has been commissioned and presented by venues and events including the Museum of Modern Art, The Kennedy Center, the Design Museum, the Istanbul Design Biennial, Fondation EDF, the Annenberg Space for Photography, the Neuberger Museum of Art at the State University of New York at Purchase, and Storefront for Art and Architecture.

“We are thrilled that Ekene Ijeoma will be joining the Media Lab and MAS program,” said Tod Machover, head of the Program in Media Arts and Sciences, the Media Lab’s academic program. “Ekene’s work is brilliant, bold, and beautiful, and the way he combines expression, reflection, innovation, and activism will place him at the absolute center of Media Lab culture, hopefully for many years to come.”

Ekene Ijeoma graduated with a BS in information technology from Rochester Institute of Technology, and an MA in interaction design from Domus Academy. He has lectured and critiqued at Yale University, Harvard Law School, Columbia University, New York University, the School of Visual Arts, and The New School.

Several years ago, a couple thousand people filed into Le Grand Rex, a Paris auditorium, to watch a performance. It was not a concert, however. Instead, a group of professional computer-game players competed to see who could win at “StarCraft 2,” a science fiction game where human exiles from Earth battle aliens.

Beyond the audience watching in person, meanwhile, was another audience streaming an online broadcast of the competition — including T.L. Taylor, a professor in the Comparative Media Studies/Writing program at MIT.

For years, Taylor has been chronicling the rise of esports: competitive computer games watched by audiences like the one at Le Grand Rex. But, as Taylor shows in a new book, esports showcases are part of a larger cultural trend toward livestreaming as a distinctive mode of entertainment. That trend also encompasses a scrappier outsider culture of do-it-yourself gaming broadcasts and other uses of streaming, a genre as popular as it is overlooked in the mainstream media.

“We’re at a fascinating moment right now,” says Taylor, about the growth of the livestreaming movement.

And now, in her book, “Watch Me Play: Twitch and the Rise of Game Livestreaming,” Taylor examines the ascendance of livestreaming in its many forms, while analyzing the commercialization of streaming and some of the social tensions that come with the subculture.

As Taylor emphasizes in the book, the rise of livestreaming is very much tied to Twitch, the San Francisco-based streaming website where people broadcast their contests, and their lives. Twitch has about 10 million active daily users and was purchased by Amazon in 2014.

“Formalized competitive computer gaming has been around for decades,” Taylor notes. “But it also used to be a lot of work to be a fan. You had to know what specialist websites to visit. You had to download replay files or seek out recorded videos. Livestreaming changed everything.”

Originally, livestreaming was not necessarily meant to focus on gaming. Instead, it was partly conceived as a “new form of reality TV,” according to Justin Kan, who in 2007 founded Justin.tv, a site broadcasting events from his own life. After seeing how popular livestreaming of gaming was, however, Kan and some partners founded Twitch as a separate platform. It has since grown to encompass people who stream cooking and “social eating” content, music shows, and more.

Still, computer gaming remains a principal driver of livestreaming. One branch of this has become organized esports, complete with teams, sponsors, and corporate investment. Another branch consists of individuals building their own audiences and brands, one gaming session at a time, broadcasting on camera while playing and interacting with their audiences.

This can be a grueling occupation. In the book, Taylor visits the home of a suburban Florida gaming entrepreneur while he broadcasts a playing session that begins at 3:30 a.m., to draw a global audience. After several hours, the session netted this independent livestreamer about 50 new subscribers, 800 new followers, and $500 in donations, all while his children slept.

“Eventually these livestreamers become not only content producers but also brand and community managers,” Taylor writes in the book. Some of them are also unlikely broadcast personalities, by their own admission. “I guess a part of me is that talkative person on the screen, but as soon as it goes off … I’m kind of a quiet person offstream,” says gaming star J.P. McDaniel, as recounted in Taylor’s book.

Meanwhile, livestreaming is a heavily male-dominated field. As Taylor documents in the book, women, people of color, and participants from the LGBTQ community can face serious levels of harassment, which limits their participation in the culture.

“Women also continue to face stereotypes and pushback when they focus on competitive games and have professional aspirations,” Taylor writes in the book. Indeed, a central theme of “Watch Me Play” is that all forms of livestreaming, including professional esports, have much to tell us about larger social trends, instead of existing as a kind of cultural cul-de-sac.

“Far too often we imagine what happens in play and games as being separate from ‘real life,’” says Taylor. “But our leisure is infused with not only our identities and social worlds, but broader cultural issues. This is probably most obvious when we think about how gender plays a powerful role in our leisure, shaping who is seen as legitimately allowed to play, what they can play, and in what ways.”

For this reason, Taylor adds, “those very moments when people are engaging in play remain some of the most politically infused spaces” in society. Thus, for all the novelty, Taylor hopes her study of livestreaming will appeal to those who have never watched competitive computer games, alone or at Le Grand Rex.

“My hope is that it [the book] gets picked up by not only those who are interested in livestreaming, but readers who might want to finally understand how to think about gaming” as it expands in society, and as entertainment becomes diversified across media platforms, Taylor says.

“Digital games have become a part of many people’s everyday lives,” she adds. “My hope is that the work helps make clear what is at stake in that.”

Wikipedia has enabled large-scale, open collaboration on the internet’s largest general-reference resource. But, as with many collaborative writing projects, crafting the content can be a contentious subject.

Often, multiple Wikipedia editors will disagree on certain changes to articles or policies. One of the main ways to officially resolve such disputes is the Requests for Comment (RfC) process. Quarreling editors will publicize their deliberation on a forum, where other Wikipedia editors will chime in and a neutral editor will make a final decision.

Ideally, this should solve all issues. But a novel study by MIT researchers finds debilitating factors — such as excessive bickering and poorly worded arguments — have led to about one-third of RfCs going unresolved.

For the study, the researchers compiled and analyzed the first-ever comprehensive dataset of RfC conversations, captured over an eight-year period, and conducted interviews with editors who frequently close RfCs, to understand why they don’t find a resolution. They also developed a machine-learning model that leverages that dataset to predict when RfCs may go stale. And, they recommend digital tools that could make deliberation and resolution more effective.

“It was surprising to see a full third of the discussions were not closed,” says Amy X. Zhang, a PhD candidate in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and co-author on the paper, which is being presented at this week’s ACM Conference on Computer-Supported Cooperative Work and Social Computing. “On Wikipedia, everyone’s a volunteer. People are putting in the work, and they have interest … and editors may be waiting on someone to close so they can get back to editing. We know, looking through the discussions, the job of reading through and resolving a big deliberation is hard, especially with back and forth and contentiousness. [We hope to] help that person do that work.”

The paper’s co-authors are: first author Jane Im, a graduate student at the University of Michigan’s School of Information; Christopher J. Schilling of the Wikimedia Foundation; and David Karger, a professor of computer science and CSAIL researcher.

(Not) finding closure

Wikipedia offers several channels to solve editorial disputes, which involve two editors hashing out their problems, putting ideas to a simple majority vote from the community, or bringing the debate to a panel of moderators. Some previous Wikipedia research has delved into those channels and back-and-forth “edit wars” between contributors. “But RfCs are interesting, because there’s much less of a voting mentality,” Zhang says. “With other processes, at the end of day you’ll vote and see what happens. [RfC participants] do vote sometimes, but it’s more about finding a consensus. What’s important is what’s actually happening in a discussion.”

To file an RfC, an editor drafts a template proposal, based on a content dispute that wasn’t resolved in an article’s basic “talk” page, and invites comment by the broader community. Proposals run the gamut, from minor disagreements about a celebrity’s background information to changes to Wikipedia’s policies. Any editor can initiate an RfC, and any editor who didn’t participate in the discussion and is considered neutral — usually a more experienced one — may close it. After 30 days, a bot automatically removes the RfC template, with or without resolution. RfCs can close formally with a summary statement by the closer, informally due to overwhelming agreement by participants, or be left stale, meaning removed without resolution.
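The lifecycle described above — formal close, informal close, or a stale removal after 30 days — can be sketched in a few lines. The 30-day window and the three outcomes are from the article; the function and parameter names below are illustrative assumptions, not Wikipedia's actual tooling:

```python
from datetime import date, timedelta
from typing import Optional

RFC_WINDOW = timedelta(days=30)  # after this, a bot removes the RfC template

def rfc_outcome(opened: date, today: date,
                closing_statement: Optional[str],
                overwhelming_agreement: bool) -> Optional[str]:
    """Classify an RfC per the three outcomes the article describes:
    formal close (summary statement by a neutral closer), informal close
    (clear agreement among participants), or stale (template removed
    after 30 days without resolution)."""
    if closing_statement is not None:
        return "formal"
    if overwhelming_agreement:
        return "informal"
    if today - opened >= RFC_WINDOW:
        return "stale"
    return None  # still open within the 30-day window

print(rfc_outcome(date(2017, 1, 1), date(2017, 2, 5), None, False))  # stale
```

The point of the sketch is that "stale" is a default outcome: it happens not by anyone's decision but by the absence of one before the bot's deadline.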

For their study, the researchers compiled a database consisting of about 7,000 RfC conversations from the English-language Wikipedia from 2011 to 2017, which included closing statements, author account information, and general reply structure. They also conducted interviews with 10 of Wikipedia’s most frequent closers to better understand their motivations and considerations when resolving a dispute.

Analyzing the dataset, the researchers found that about 57 percent of RfCs were formally closed. Of the remaining 43 percent, 78 percent (or around 2,300) were left stale without informal resolution — or, about 33 percent of all the RfCs studied. Combining dataset analysis with the interviews, the researchers then fleshed out the major causes of resolution failure. Major issues include poorly articulated initial arguments, where the initiator is unclear about the issue or writes a deliberately biased proposal; excessive bickering during discussions that lead to more complicated, longer, argumentative threads that are difficult to fully examine; and simple lack of interest from third-party editors because topics may be too esoteric, among other factors.
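The reported percentages compose multiplicatively; a quick back-of-the-envelope check using the article's rounded figures (not the study's raw counts) reproduces the headline numbers:

```python
# Back-of-the-envelope check of the RfC figures reported above,
# using rounded values from the article rather than the raw dataset.
total_rfcs = 7000                # approximate size of the dataset
formally_closed = 0.57           # share closed with a formal statement
remaining = 1 - formally_closed                 # 0.43
stale_among_remaining = 0.78                    # share of the rest left stale

stale_overall = remaining * stale_among_remaining   # ~0.335
stale_count = total_rfcs * stale_overall            # ~2,350

print(f"{stale_overall:.1%} of all RfCs went stale")  # 33.5%
print(f"roughly {round(stale_count)} discussions")    # roughly 2348
```

This matches the article's "about 33 percent of all the RfCs studied" and "around 2,300" stale discussions.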

Helpful tools

The team then developed a machine-learning model to predict whether a given RfC would close (formally or informally) or go stale, by analyzing more than 60 features of the text, Wikipedia page, and editor account information. The model achieved a 75 percent accuracy for predicting failure or success within one week after discussion started. Some more informative features for prediction, they found, include the length of the discussion, number of participants and replies, number of revisions to the article, popularity of and interest in the topic, experience of the discussion participants, and the level of vulgarity, negativity, and general aggression in the comments.
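The article does not publish the study's feature definitions or model, so the sketch below is purely illustrative: it computes three of the feature types named above (discussion length, replies, participants) from a toy discussion, and substitutes a made-up threshold rule for the trained classifier:

```python
# Illustrative only: the study's real model used 60+ features and machine
# learning; these three features and the threshold rule are hypothetical.
def extract_features(discussion):
    """Compute a few discussion-level features of the kind the article
    lists: number of replies, distinct participants, and total length."""
    replies = discussion["replies"]
    return {
        "n_replies": len(replies),
        "n_participants": len({r["author"] for r in replies}),
        "total_length": sum(len(r["text"]) for r in replies),
    }

def likely_to_go_stale(features):
    """Toy stand-in for the trained classifier: flag long, low-participation
    threads, which the study associates with unresolved discussions."""
    return features["total_length"] > 5000 and features["n_participants"] < 4

toy = {"replies": [{"author": "A", "text": "x" * 3000},
                   {"author": "B", "text": "y" * 3000}]}
feats = extract_features(toy)
print(feats, likely_to_go_stale(feats))  # long two-person thread: flagged
```

A real system would feed such features into a trained model rather than a hand-set threshold, but the pipeline shape — featurize the discussion, then score it — is the same.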

The model could one day be used by RfC initiators to monitor a discussion as it’s unfolding. “We think it could be useful for editors to know how to target their interventions,” Zhang says. “They could post [the RfC] to more [Wikipedia forums] or invite more people, if it looks like it’s in danger of not being resolved.”

The researchers suggest Wikipedia could develop tools to help closers organize lengthy discussions, flag persuasive arguments and opinion changes within a thread, and encourage collaborative closing of RfCs.

In the future, the model and proposed tools could potentially be used for other community platforms that involve large-scale discussions and deliberations. Zhang points to online city- and community-planning forums, where citizens weigh in on proposals. “People are discussing [the proposals] and voting on them, so the tools can help communities better understand the discussions … and would [also] be useful for the implementers of the proposals.”

Zhang, Im, and other researchers have now built an external website for editors of all levels of expertise to come together to learn from one another, and more easily monitor and close discussions. “The work of a closer is pretty tough,” Zhang says, “so there’s a shortage of people looking to close these discussions, especially difficult, longer, and more consequential ones. This could help reduce the barrier to entry [for editors to become closers] and help them collaborate to close RfCs.”

“While it is surprising that a third of these discussions were never resolved, [what’s more] important are the reasons why discussions fail to come to closure, and the most interesting conclusions here come from the qualitative analyses,” says Robert Kraut, a professor emeritus of human-computer interactions at Carnegie Mellon University. “Some [of the study’s] findings transcend Wikipedia and can apply to many discussions in other settings.” More work, he adds, could be done to improve the accuracy of the machine-learning model in order to provide more actionable insights to Wikipedia.

The study sheds light on how some RfC processes “deviate from established norms, leading to inefficiencies and biases,” says Dario Taraborelli, director of research at the Wikimedia Foundation. “The results indicate that the experience of participants and the length of a discussion are strongly predictive of the timely closure of an RfC. This brings new empirical evidence to the question of how to make governance-related discussions more accessible to newcomers and members of underrepresented groups.”

Since announcing the MIT Stephen A. Schwarzman College of Computing, Institute leaders have reached out to the campus and alumni communities in a series of forums, seeking ideas about the transformative new entity that will radically integrate computing with disciplines throughout MIT.

MIT Provost Martin A. Schmidt and Dean of the School of Engineering Anantha P. Chandrakasan engaged with students, faculty, and staff at forums on campus, where they presented outlines of the project and received dozens of public comments and questions. Additionally, Chandrakasan and Executive Vice President and Treasurer Israel Ruiz engaged with alumni in two webcast sessions that featured Q&A about the college.

“Creating this new college requires us to think deeply and carefully about its structure,” Schmidt said at a forum for faculty on Oct. 18. That process should be firmly connected to the ideas and experiences of the MIT community as a whole, he said further at a student forum on Oct. 25, adding that the goal was to “engage you in the process of building the college together.”

Community perspectives

The discussions at the forums each had a slightly different flavor, generally reflecting the perspectives of the participants. The faculty forum, for instance, included professors from several fields concerned about maintaining a solid balance of disciplinary research at MIT.

The responsibilities of professors at the new college have yet to be fully defined. Many faculty will have joint appointments between the MIT Schwarzman College of Computing and existing MIT departments, an approach that both Schmidt and Chandrakasan acknowledged has had varying results in the past. As participants noted, some MIT faculty with joint appointments have thrived, but others have floundered, being pulled in different scholarly and administrative directions.

“We need to figure out how to make dual appointments work,” Chandrakasan said. Still, he noted that the “cross-cutting” structure of the college had enormous potential to integrate computing into the full range of disciplines at the Institute.

At a standing-room-only forum for MIT staff members on Oct. 25, with people lining the walls of Room 4-270, audience members offered comments and questions about the college’s proposed main building, MIT’s computing infrastructure, teaching, advising, the admissions process, and the need to hire motivated staff in the college’s most formative stages.

“It’s an opportunity to really do a whole-of-Institute solution to this challenge,” Schmidt said. “It’s going to test us.”

Multiple people at the student forum on Oct. 25 called for diversity among the college’s new faculty — a view Schmidt and Chandrakasan readily agreed with. The Institute leaders also emphasized the expansion of opportunities the college will provide for students, including more joint programs and degrees, and more student support.

“There will be more UROP opportunities, more resources, more faculty,” Chandrakasan said. Also, he noted, “We’re not going to change the undergraduate admissions process.” MIT Chancellor Cynthia Barnhart also spoke at the student forum.

At all three on-campus forums, audience participants commented upon the value of having Institute supporters share MIT’s goal of creating a “better world.” At the staff forum, one audience member advocated that MIT only accept funding from backers who were fully committed to democracy, and questioned the Institute’s connections with Saudi Arabia. Schmidt noted that MIT — as it has publicly announced — is currently reassessing its Institute-level engagements with entities of the Kingdom of Saudi Arabia.

At the student forum, audience members also raised queries about MIT’s mission and its relationships with donors; the issues cited included the precedent of naming the college after an individual, and the extent of MIT’s due diligence process during the creation of the college. Schmidt said the Institute had performed its due diligence well and developed the idea of the named college after extensive discussions; he also noted that faculty and students of the college would be able to develop a full range of intellectual and academic projects freely.

Audience members also stressed the generalized need to think critically about the impact of technology on society at a moment of social, political, and ecological uncertainty — and expressed a preference for the college to integrate ethics into its curriculum.

“This presents a real opportunity to get at that,” Chandrakasan responded.

On Oct. 30, the Alumni Association hosted two webcasts that featured Q&A with Chandrakasan and Ruiz. Over 1,000 alumni from around the world registered for the virtual conversations, which were moderated by Vice President for Communications Nate Nickerson. Questions centered on how the cross-disciplinary aspirations of the college would find life, and on how ethics will be made to infuse the college and shape its graduates. In both sessions, alumni asked how they can participate in the pursuit of the college’s mission. “The alumni will be critical to our efforts,” said Ruiz. “They offer us great wisdom as we form the college, and they will serve as important points of connection for our faculty and students as they seek to understand all the ways that computing is shaping our world.”

Helping every department

The college is being developed thanks to a $350 million foundational gift from Mr. Schwarzman, the chairman, CEO, and co-founder of Blackstone, a leading global asset manager. It will serve as an interdisciplinary hub for research and teaching across all aspects of computing, while strengthening links between computing and other scholarly pursuits.

“The college has two goals,” said Chandrakasan at the forum for MIT staff members. “One is to advance computing, and one is to link computing to other [fields]. … This allows us to optimize, unbundle, and rebundle, to make computing much more integrated across all disciplines.”

It also presents new organizational challenges. For decades, MIT has been largely organized around its five schools, which focus on engineering; science; architecture and planning; humanities, arts, and social sciences; and management. But as Chandrakasan emphasized in all three campus forums, the MIT Schwarzman College of Computing is intended to develop connections with all of those schools as well as other stand-alone institutes and programs on campus.

“This is about helping advance every department,” said Chandrakasan, who frequently referred to the importance of the college’s “bridge” function, meaning it can span the width of MIT to link students, faculty, and resources together.

For his part, Schmidt emphasized at the events that the college will accelerate the current trend in disciplinary transformation. He noted that the fields of economics and urban studies at MIT have both recently created joint degrees with computer science as a natural response to the ways data and computing power have enabled new modes of academic research.

The foundational gift is part of a $1 billion commitment MIT has made to the new college, which will be centered in a new campus building, include 50 new faculty, and allow the Institute to create a new series of collaborative, interdisciplinary enterprises in research and education. The college is meant to address all aspects of computing, including the policy and ethical issues surrounding new technologies.

“Across the Institute there is great enthusiasm for this,” Chandrakasan added.

“A unique opportunity to evolve”

The MIT Schwarzman College of Computing is intended to open in the fall of 2019 and will be housed partly — but not entirely — in its new building. The timeline, Chandrakasan acknowledged, is “super-aggressive.”

Schmidt and Chandrakasan noted that many important issues were yet to be resolved. As part of the process of developing the college, the Institute is creating a task force and working groups to assess some of the critical issues MIT faces.

Some audience members at the forums also questioned why MIT would announce the creation of its new college at a time when some of the entity’s institutional features are unresolved. In response, Schmidt noted that the Institute benefits by being on the leading edge of computing, and that the creation of the college will only enhance that position. Community engagement, he noted, would help the Institute finalize its vision for the college.

“We’re not going to be able to answer all of [your] questions,” Schmidt said at the staff forum. To gain traction on unresolved matters, he added, “We think the task force model is an appropriate one.”

MIT intends to hire a dean for the college and begin the search process for new faculty during the current academic year. There are a few campus sites being considered as the location for the college’s main building, but not all elements of the college will be located in that building.

Overall, Schmidt concluded, the creation of the college has presented MIT with a unique opportunity to evolve in response to the prevalence of computing and its influence in so many spheres of life.

“Every campus in the country today has been grappling with the need,” Schmidt said. “We feel that MIT has come forward with a really compelling solution.”

Finding lost hikers in forests can be a difficult and lengthy process, as helicopters and drones can’t get a glimpse through the thick tree canopy. Recently, it’s been proposed that autonomous drones, which can bob and weave through trees, could aid these searches. But the GPS signals used to guide the aircraft can be unreliable or nonexistent in forest environments.

In a paper being presented at the International Symposium on Experimental Robotics conference next week, MIT researchers describe an autonomous system for a fleet of drones to collaboratively search under dense forest canopies. The drones use only onboard computation and wireless communication — no GPS required.

Each autonomous quadrotor drone is equipped with laser-range finders for position estimation, localization, and path planning. As the drone flies around, it creates an individual 3-D map of the terrain. Algorithms help it recognize unexplored and already-searched spots, so it knows when it’s fully mapped an area. An off-board ground station fuses individual maps from multiple drones into a global 3-D map that can be monitored by human rescuers.

In a real-world implementation, though not in the current system, the drones would come equipped with object detection to identify a missing hiker. When located, the drone would tag the hiker’s location on the global map. Humans could then use this information to plan a rescue mission.

“Essentially, we’re replacing humans with a fleet of drones to make the search part of the search-and-rescue process more efficient,” says first author Yulun Tian, a graduate student in the Department of Aeronautics and Astronautics (AeroAstro).

The researchers tested multiple drones in simulations of randomly generated forests, and tested two drones in a forested area within NASA’s Langley Research Center. In both experiments, each drone mapped a roughly 20-square-meter area in about two to five minutes, and the drones collaboratively fused their maps in real time. The drones also performed well across several metrics, including overall speed and time to complete the mission, detection of forest features, and accurate merging of maps.

Co-authors on the paper are: Katherine Liu, a PhD student in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and AeroAstro; Kyel Ok, a PhD student in CSAIL and the Department of Electrical Engineering and Computer Science; Loc Tran and Danette Allen of the NASA Langley Research Center; Nicholas Roy, an AeroAstro professor and CSAIL researcher; and Jonathan P. How, the Richard Cockburn Maclaurin Professor of Aeronautics and Astronautics.

Exploring and mapping

On each drone, the researchers mounted a LIDAR system, which creates a 2-D scan of the surrounding obstacles by shooting laser beams and measuring the reflected pulses. This can be used to detect trees; however, to drones, individual trees appear remarkably similar. If a drone can’t recognize a given tree, it can’t determine if it’s already explored an area.

The researchers programmed their drones to instead identify multiple trees’ orientations, which is far more distinctive. With this method, when the LIDAR signal returns a cluster of trees, an algorithm calculates the angles and distances between trees to identify that cluster. “Drones can use that as a unique signature to tell if they’ve visited this area before or if it’s a new area,” Tian says.
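The key property behind this idea is that distances and relative angles within a cluster of trees do not change when the drone moves, so they can serve as a revisit signature. Here is a minimal sketch using sorted pairwise distances only; the tolerance, quantization, and matching scheme are assumptions for illustration, not the authors' exact method:

```python
import math
from itertools import combinations

def cluster_signature(trees):
    """trees: list of (x, y) tree positions from one LIDAR scan.
    Pairwise distances are invariant to the drone's rotation and
    translation, so the sorted list identifies the cluster."""
    return sorted(math.dist(a, b) for a, b in combinations(trees, 2))

def same_cluster(sig_a, sig_b, tol=0.2):
    """Match two signatures up to a small measurement tolerance."""
    return len(sig_a) == len(sig_b) and all(
        abs(a - b) <= tol for a, b in zip(sig_a, sig_b))

scan1 = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.5)]
# The same three trees, seen after the drone rotated 90 degrees
# and flew to a different position:
scan2 = [(5.0, 3.0), (5.0, 5.0), (3.5, 4.0)]
print(same_cluster(cluster_signature(scan1), cluster_signature(scan2)))
```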

This feature-detection technique helps the ground station accurately merge maps. The drones generally explore an area in loops, producing scans as they go. The ground station continuously monitors the scans. When two drones loop around to the same cluster of trees, the ground station merges the maps by calculating the relative transformation between the drones, and then fusing the individual maps to maintain consistent orientations.

“Calculating that relative transformation tells you how you should align the two maps so it corresponds to exactly how the forest looks,” Tian says.
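Given one matched tree cluster with correspondences between the two drones' maps, the relative transformation can be computed with the standard two-dimensional rigid-alignment calculation (centroid subtraction plus optimal rotation). The sketch below assumes the correspondences are already known and listed in the same order; the real system's correspondence search and SLAM back end are considerably more involved:

```python
import math

def rigid_transform(points_a, points_b):
    """Return (theta, tx, ty): the rotation and translation that best
    map points_a onto points_b in a least-squares sense."""
    n = len(points_a)
    cax = sum(p[0] for p in points_a) / n
    cay = sum(p[1] for p in points_a) / n
    cbx = sum(p[0] for p in points_b) / n
    cby = sum(p[1] for p in points_b) / n
    # Accumulate the 2x2 cross-covariance terms of the centered points.
    sxx = sxy = syx = syy = 0.0
    for (ax, ay), (bx, by) in zip(points_a, points_b):
        ax, ay, bx, by = ax - cax, ay - cay, bx - cbx, by - cby
        sxx += ax * bx; sxy += ax * by
        syx += ay * bx; syy += ay * by
    theta = math.atan2(sxy - syx, sxx + syy)  # optimal rotation angle
    tx = cbx - (cax * math.cos(theta) - cay * math.sin(theta))
    ty = cby - (cax * math.sin(theta) + cay * math.cos(theta))
    return theta, tx, ty

a = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.5)]   # cluster in drone A's map
b = [(5.0, 3.0), (5.0, 5.0), (3.5, 4.0)]   # same trees in drone B's map
theta, tx, ty = rigid_transform(a, b)
print(round(math.degrees(theta)), round(tx, 1), round(ty, 1))
# -> 90 5.0 3.0: map A aligns with map B after a 90-degree rotation
#    and a (5, 3) shift
```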

In the ground station, robotic navigation software called “simultaneous localization and mapping” (SLAM) — which both maps an unknown area and keeps track of an agent inside the area — uses the LIDAR input to localize and capture the position of the drones. This helps it fuse the maps accurately.

The end result is a map with 3-D terrain features. Trees appear as blocks shaded from blue to green, depending on height. Unexplored areas are dark but turn gray as they’re mapped by a drone. On-board path-planning software tells a drone to always explore these dark unexplored areas as it flies around. Producing a 3-D map is more reliable than simply attaching a camera to a drone and monitoring the video feed, Tian says. Transmitting video to a central station, for instance, requires a lot of bandwidth that may not be available in forested areas.
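The explore-the-dark-areas behavior is, in essence, frontier-driven exploration on an occupancy grid. A minimal sketch, with an invented grid where `U` cells are unexplored, `.` cells are mapped free space, and `#` cells are obstacles; a breadth-first search from the drone's cell finds the nearest unexplored cell reachable through mapped space:

```python
from collections import deque

GRID = [
    list("....UU"),
    list("..#.UU"),
    list("....UU"),
]

def nearest_unexplored(grid, start):
    """BFS through mapped free space; the first 'U' cell reached is
    the nearest unexplored target."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        if grid[r][c] == "U":
            return (r, c)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in seen and grid[nr][nc] != "#"):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return None  # everything reachable has been mapped

print(nearest_unexplored(GRID, (1, 0)))
```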

More efficient searching

A key innovation is a novel search strategy that lets the drones explore an area more efficiently. Under a more traditional approach, a drone would always search the closest possible unknown area. However, that could be in any number of directions from the drone’s current position. The drone usually flies a short distance, then stops to select a new direction.

“That doesn’t respect dynamics of drone [movement],” Tian says. “It has to stop and turn, so that means it’s very inefficient in terms of time and energy, and you can’t really pick up speed.”

Instead, the researchers’ drones explore the closest possible area while considering their speed and direction and maintaining a consistent velocity. This strategy — where the drone tends to travel in a spiral pattern — covers a search area much faster. “In search and rescue missions, time is very important,” Tian says.

In the paper, the researchers compared their new search strategy with a traditional method. Compared to that baseline, the researchers’ strategy helped the drones cover significantly more area, several minutes faster and with higher average speeds.

One limitation for practical use is that the drones still must communicate with an off-board ground station for map merging. In their outdoor experiment, the researchers had to set up a wireless router that connected each drone and the ground station. In the future, they hope to design the drones to communicate wirelessly when approaching one another, fuse their maps, and then cut communication when they separate. The ground station, in that case, would only be used to monitor the updated global map.

In 1968, the black student community at MIT was small and needed a way to amplify its voice. Formed during that tumultuous year in political and racial history in the U.S., the MIT Black Students’ Union (BSU) launched a journey of advocacy and community that now continues 50 years later.

In the late 1960s, about 11 percent of Americans were black, but each 1,000-member class at MIT had perhaps half a dozen black students. Galvanized by the assassination of Martin Luther King Jr., black student groups were forming at overwhelmingly white college campuses across the country, and MIT was no exception. The students who started MIT BSU had two goals in mind: to support each other and to bring more black students to the Institute. “Surely there were more than three blacks in the high school class of 1965 who could belong to the MIT tribe,” says Linda C. Sharpe ’69, one of the BSU founders, who is a past president of the MIT Alumni Association and a former MIT Corporation member.

In fall of 1968, the new group drew up and presented a list of recommendations to the MIT administration: increasing the number of black students, creating a pre-­enrollment summer program for minority students, and hiring more minority faculty members. In response, MIT established the Task Force on Educational Opportunity (TFEO), which was made up of a group of BSU representatives and MIT administrators and chaired by associate provost (and future MIT president) Paul Gray ’54, SM ’55, ScD ’60. Through a series of often intense discussions, the TFEO designed the summer program, called Project Interphase, and came up with more inclusive approaches to things like recruitment, admissions, and financial aid.

“The Institute rolled up its sleeves and attacked [the recommendations] in the MIT way — that is, being very analytical about what the challenges and problems were, and then trying to figure out solutions to those challenges,” says founding BSU co-chair Shirley Ann Jackson ’68, PhD ’73, who went on to become the first black woman to earn a PhD from MIT and is now the president of Rensselaer Polytechnic Institute and a life member of the MIT Corporation. “That doesn’t mean there wasn’t great emotion around it, because there really, really was on all sides.”

Gray, who died in 2017, would ultimately recall that being part of the Task Force was eye-opening: “I came away with an understanding I had none of two years before, as best a white person can understand what it was like to be black in the United States in the era before and during the civil rights revolution. It was a powerful experience.”

Through the efforts of the Admissions Office and BSU members who began recruiting black applicants from all over the country, the number of African-American students jumped to about 50 in the Class of 1973 and continued to rise, as did the numbers of women and other members of underrepresented minorities. Meanwhile, in an event modeled after political takeovers of buildings on other college campuses, a group of black students disrupted a meeting of the MIT Corporation in 1970 to advocate for the BSU demands and support kitchen workers involved in a labor dispute.

“The BSU has always played a major role in helping the Institute to not fall back from the goals of commitment and participation of black students, faculty, and administration. It’s been a key agent in helping MIT look at itself,” says adjunct professor emeritus of urban studies Clarence Williams, HM ’09, who joined the MIT administration in 1972 as assistant dean of the graduate school and has since served in multiple positions, including as acting director of the Office of Minority Education, special assistant to the president and chancellor, and Institute ombudsperson. Williams, who started the Black History Project in 1995, is the author of Technology and the Dream: Reflections on the Black Experience at MIT, 1941–1999 (MIT Press, 2001) and co-produced the 1996 video “It’s Intuitively Obvious,” which documented the experience of black students at MIT.

In addition to working to increase the number of black students on campus, the BSU advocated for recruitment and retention of black faculty and staff. “We also sought to broaden the dialogue on campus around issues pertinent to our community,” says Michelle Harton ’83, the outgoing chair of Black Alumni of MIT (BAMIT). Over the years, the BSU has organized discussion panels and cultural events, hosted prospective minority students, and played a central role in MIT’s annual Black History Month observance. Among the speakers brought to campus by the BSU are Benjamin L. Hooks, then executive director of the NAACP, and Ivan Van Sertima, author of They Came Before Columbus: The African Presence in Ancient America.

Five decades after the formation of the BSU, black students now make up 6.2 percent of MIT’s undergraduate population (as of fall 2017), up from 0.6 percent in 1968. And Sharpe notes that today, “the number of black women in the freshman class is nearly equal to the number of all women in my class.” She adds, “Times do change, if a lot more slowly than we would like.”

And the work continues today. In a parallel to the 1968 BSU proposals, the BSU and the Black Graduate Student Association (BGSA) met with President L. Rafael Reif in 2015, following several racially charged incidents across the country. The two groups issued a set of recommendations that included diversity orientation and training for all students, a diversity representative within each department, an MIT Medical clinician specializing in psychological issues affecting African-Americans, and a requirement that all undergraduates take an “immersion studies” elective focusing on multiculturalism or diversity. BAMIT and other groups also made recommendations. Many have already been completely or partially implemented, and conversations on how to advance other recommendations on the departmental level are ongoing.

“The work that needs to be done at MIT is similar to what needs to take place across the country — greater cultural understanding and value for the differences that people bring, plus mechanisms for civil discourse,” says Elaine Harris ’78, a BAMIT board member and cosponsor of what’s now called the Hack for Inclusion, an annual hackathon to tackle issues of bias, diversity, and inclusion. Outcomes from the hackathon include projects to create a more welcoming Boston for the black community and to address bias in machine learning. “I wish that the problem-­solving skills we apply to technical challenges and the metrics we develop to assess progress could be used in the domain of diversity, equity, and inclusion,” she says.

The BSU held an on-campus event in February to celebrate its 50-year legacy of advocating for black students and all minorities at MIT. In June, Jackson and Rudd — the first two black women to earn undergraduate degrees at MIT — became the first black women to earn their red jackets at their 50th reunion, where Jackson also served as a class speaker. In November, she is slated to speak at the BAMIT capstone event, “Road to 50: The Power of Community,” which will feature historical recollections, discussions, and a look forward.

Kelvin Green ’21, current co-chair of the BSU, believes the organization is still playing an integral part to ensure equality within the MIT community, and that’s one of the reasons why he chose the Institute.

“Diversity is but a stepping-stone toward a higher goal,” he says. “Inclusion — truly valuing the people brought to this campus in all of the identities they bring — is where we must look. Let us not stop at the stepping-stone of diversity and ponder why it cannot support our weight; we must transition to the rock of inclusion, which is by definition created to support us all.”