Story Highlights

Mark Zuckerberg says lawmakers tell him Facebook has too much power over speech. "Frankly I agree."

Civil rights groups say Facebook has not cut down on hate speech against African Americans.

It was spirit week, and Carolyn Wysinger, a high school teacher in Richmond, California, was cheerfully scrolling through Facebook on a break between classes. Her classroom, with its black-and-white images of Martin Luther King Jr. and Che Guevara and a "Resist Patriarchy" sign, was piled high with colorful rolls of poster paper, the whiteboard covered with plans for pep rallies.

A post from poet Shawn William caught her eye. "On the day that Trayvon would've turned 24, Liam Neeson is going on national talk shows trying to convince the world that he is not a racist." While promoting a revenge movie, the Hollywood actor confessed that decades earlier, after a female friend told him she'd been raped by a black man she could not identify, he'd roamed the streets hunting for black men to harm.

"White men are so fragile," she fired off, sharing William's post with her friends, "and the mere presence of a black person challenges every single thing in them."

It took just 15 minutes for Facebook to delete her post for violating its community standards for hate speech. And she was warned that if she posted it again, she'd be banned for 72 hours.

Wysinger glared at her phone, but wasn't surprised. She says black people can't talk about racism on Facebook without risking having their posts removed and being locked out of their accounts in a punishment commonly referred to as "Facebook jail." For Wysinger, the Neeson post was just another example of Facebook arbitrarily deciding that talking about racism is racist.

Many of these users now think twice before posting updates on Facebook or they limit how widely their posts are shared. Yet few can afford to leave the largest and most powerful social media platform for sharing information and creating community.

So to avoid being flagged, they use digital slang such as "wypipo," emojis or hashtags to elude Facebook's computer algorithms and content moderators. They operate under aliases and maintain back-up accounts to avoid losing content and access to their community. And they've developed a buddy system to alert friends and followers when a fellow black activist has been sent to Facebook jail, sharing the news of the suspension and the posts that put them there.

They call it getting "Zucked" and black activists say these bans have serious repercussions, not just cutting people off from their friends and family for hours, days or weeks at a time, but often from the Facebook pages they operate for their small businesses and nonprofits.

A couple of weeks ago, Black Lives Matter organizer Tanya Faison had one of her posts removed as hate speech. "Dear white people," she wrote in the post, "it is not my job to educate you or to donate my emotional labor to make sure you are informed. If you take advantage of that time and labor, you will definitely get the elbow when I see you." After being alerted by USA TODAY, Facebook apologized to Faison and reversed its decision.

Even former employees are not immune. In November, former Facebook partnerships manager Mark Luckie called out Facebook for how it treats black users and black employees. "One of the platform's most engaged demographics and an unmatched cultural trendsetter is having their community divided by the actions and inaction of the company," he wrote in a Facebook post. "This loss is a direct reflection of the staffing and treatment of many of its black employees."

Facebook deleted his post, then hours later said it “took another look” and restored it.

'Black people are punished on Facebook'

"Black people are punished on Facebook for speaking directly to the racism we have experienced," says Seattle black anti-racism consultant and conceptual artist Natasha Marin.

Marin says she's one of Facebook's biggest fans. She created a "reparations" fund that's aided a quarter million people with small donations to get elderly folks transportation to medical appointments or to pay for prescriptions, to help single moms afford groceries or the rent or to get supplies for struggling new parents. More recently, she started a social media project spreading "black joy" rather than black trauma.

She was also banned by Facebook for three days for posting a screenshot of a racist message she received.

"For me as a black woman, this platform has allowed me to say and do things I wouldn’t otherwise be able to do," she says. "Facebook is also a place that has allowed things like death threats against me and my children. And Facebook is responsible for the fact that I am completely desensitized to the N-word.”

Seven out of 10 black U.S. adults use Facebook and 43 percent use Instagram, according to the Pew Research Center. And black millennials are even more engaged on social media. More than half – 55 percent – of black millennials spend at least one hour a day on social media, 6 percentage points higher than all millennials, while 29 percent say they spend at least three hours a day, 9 percentage points higher, Nielsen surveys found.

"Maybe Mark Zuckerberg needs to sit down with a bunch of black women who use Facebook and just listen," says Seattle black anti-racism consultant and conceptual artist Natasha Marin.(Photo: Mary Dee Mateo)

Hashtag movements, coming against the backdrop of an upsurge in hate crimes, have helped put the deaths of unarmed African Americans at the hands of police officers on the public agenda, along with racial disparities in employment, health and other key areas.

"If I were to sit down with Mark Zuckerberg, the message I would want to get across to him is: You may not even realize how powerful a thing you have created. Entire revolutions could take place on this platform. Global change could happen. But that can’t happen if real people can’t take part," Marin says. "The challenge for these companies is to see black women as valuable resources. This is the wealth on the platform, the people pushing the platform forward. If anything, they should be supported. "There should be policies and community standards that overtly support that kind of work," Marin says. "Maybe Mark Zuckerberg needs to sit down with a bunch of black women who use Facebook and just listen."

How Facebook judges what speech is hateful

For years, Facebook was widely celebrated as a platform that empowered people to bypass mainstream media or oppressive governments to directly tell their story. Now, in the eyes of some, it has assumed the role of censor.

With more than a quarter of the world's population on Facebook, the social media giant says it's wrestling with its unprecedented power to judge what speech is hateful.

"Lawmakers often tell me we have too much power over speech," Facebook CEO Mark Zuckerberg said recently, "and frankly I agree."(Photo: Andrew Harnik/AP)

All across the political spectrum, from the far right to the far left, Facebook gets flak for its judgment calls. To help sort what's allowed and what's not, it relies on a 40-page list of rules called "Community Standards," which were made public for the first time last year. Facebook defines hate speech as an attack against a "protected characteristic," such as race, gender, sexuality or religion. And each individual or group is treated equally. The rules are enforced by a combination of algorithms and human moderators trained to scrub hate speech from Facebook. From July to September 2018, Facebook removed 2.9 million pieces of content that it said violated its hate speech rules, more than half of which was flagged by its technology.

The tag team of algorithms and moderators frequently makes mistakes when flagging and removing content, Facebook acknowledges. And it has taken steps to try to make its system more accountable. Last year, Facebook began allowing users to file an appeal when their individual posts are removed. This year, the company plans to introduce an independent body of experts to review some of those appeals.

In late 2017 and early 2018, Facebook explored whether certain groups should be afforded more protection than others. For now, the company has decided to maintain its policy of protecting all racial and ethnic groups equally, even if they do not face oppression or marginalization, says Neil Potts, public policy director at Facebook. Applying more "nuanced" rules to the daily tidal wave of content rushing through Facebook and its other apps would be very challenging, he says.

Potts acknowledges that Facebook doesn't always read the room correctly, confusing advocacy and commentary on racism and white complicity in anti-blackness with attacks on a protected group of people. Facebook is looking into ways to identify when oppressed or marginalized users are "speaking to power," Potts says. And it's conducting ongoing research into the experiences of the black community on its platform.

"That's, on its face, the type of speech we want to encourage, but words and people aren't perfect, so it doesn't always come across as that. We are exploring additional refinements to our hate speech policy that will perhaps help remedy some of these situations," he says.

"That's the biggest thing," says Neil Potts, public policy director with Facebook, "making sure we are in tune with this community and the way they actually speak about these topics, and making sure our policies are in line and in touch."(Photo: Facebook)

Facebook wants to make sure its policies "reflect how people speak about these topics." "That's the biggest thing," Potts says, "making sure we are in tune with this community and the way they actually speak about these topics, and making sure our policies are in line and in touch."

'Just another slap in the face'

Ayo Henry, a mother of four from Providence, Rhode Island, says Facebook's policies could not be more out of touch.

Last year, Henry was cut off by a kid on a bike wearing a Confederate sweatshirt when she pulled into the parking lot of a sandwich shop. She honked her horn. He responded twice with a racial slur.

She restrained herself in front of her children, but a few weeks later, after leaving roller derby practice, Henry spotted the kid again, wearing the same sweatshirt. The boy tried to pedal away. She pulled out her phone. "I don't know what came over me. It was an impulsive decision," she says. "But I wanted to let him know that it wasn't OK."

He apologized, explaining he "wasn't in a good mood that day." She realized how young he was as his body trembled and hands shook. She tried to offer him some motherly advice on why he should not use racial slurs.

Henry's video of the exchange was viewed more than 2 million times on Facebook. Within 48 hours, Facebook took the footage down, saying it ran afoul of its hate speech rules. Henry appealed the decision but Facebook refused to reverse it.

In the meantime, her Messenger inbox filled with hundreds of racial slurs, derogatory messages and threats that she would be raped or killed. Yet each time Henry tried to privately share the video with her friends on Messenger, Facebook blocked her.

Offers of support that poured in from around the country helped Henry develop a network of black activists. Starting last summer, she says they all began noticing that just typing the phrase "white people" into a Facebook post could get their post flagged and their accounts suspended. Henry says Facebook has suspended her several times, once for calling on white women to get on board with a broader, more intersectional, form of feminism.

"Social media is supposed to be a way that people can come together and be able to communicate relatively freely," black activist Ayo Hentry says. "For us, it has become just another slap in the face."(Photo: Ayo Henry)

It wasn't a Facebook post, but a comment on that post that triggered her longest suspension from Facebook.

Her Facebook post drew attention to an antique store in nearby Massachusetts that refused to remove racist memorabilia hanging on its walls, including a vintage advertisement for smoking tobacco with a caricature of two black men. "Still perpetuating white supremacy through 'nostalgia,'" she wrote.

One person challenged whether the image was in fact racist, so Henry replied with a similar image of a jigsaw puzzle box from the same era, which was labeled "Chopped Up N------."

That comment, which Facebook deleted, sent her to Facebook jail for a month. Henry appealed the suspension but Facebook would not relent. She says she knows of no one in the black community who has had a post reinstated by appealing.

"Black people in this country, and people of color in general, we endure a daily battle just to exist. Now we have to be careful not to complain too much about the various types of oppression publicly because, if you do, then you are going to be suppressed," she says. "It's difficult to navigate the framework of social media as a black person, just because racism is systemic and, when you realize it's ingrained so deeply in a system that is so influential on our country, then it's almost like a whole second burden to bear."

"Social media is supposed to be a way that people can come together and be able to communicate relatively freely," she says. "For us, it has become just another slap in the face."

Civil rights groups push for audit, accountability

Early on, the Black Lives Matter social justice movement turned to Facebook as an organizing tool. Yet its organizers say they were soon set upon by bands of white supremacists who targeted them with racial slurs and violent threats. In 2015, Color Of Change, which was formed after Hurricane Katrina to organize racial justice campaigns on the internet, began pressuring Facebook to stop the harassment of black activists by hate groups.

"What we continue to see time and time again is what's framed as race-neutral decision-making ends up being overtly hostile to the communities most in need of some of those free speech protections," says Brandi Collins-Dexter, senior campaign director at Color Of Change.(Photo: Michael Meadows)

Chanelle Helm, a Black Lives Matter organizer from Louisville, Kentucky, says the threats intensified in the form of doxxing – posting organizers' addresses, phone numbers and photos on the internet. Faison, founding member of the Sacramento chapter of Black Lives Matter, was stalked. "It got a lot more serious," Helm says. "They were threatening folks with doxxing of family members."

Facebook removed a group responsible for some of the harassment, but Color Of Change and other civil rights groups say they struggled to get the company to address other complaints. Late last year, The New York Times reported that Facebook had hired a Republican opposition research firm to discredit Color Of Change and other Facebook critics.

"What we continue to see time and time again is what's framed as race-neutral decision-making ends up being overtly hostile to the communities most in need of some of those free speech protections," says Brandi Collins-Dexter, senior campaign director at Color Of Change.

The Center for Media Justice began probing why content from people of color was being removed from Facebook in August 2016 when, at the request of law enforcement, Facebook shut down the video of a Baltimore woman, Korryn Gaines, who was live-streaming her standoff with police. Gaines was later shot and killed by a police officer in front of her 5-year-old son who was also struck twice by gunfire. At the same time, Black Lives Matter activists and Standing Rock pipeline protesters in North Dakota were reporting that their content was being removed, too.

Tanya Faison, of Black Lives Matter, addresses a demonstration outside the Sacramento Police Department to protest the decision to not prosecute the two officers involved in the 2018 fatal shooting of Stephon Clark, in Sacramento, Calif., Saturday, March 2, 2019. (Photo: Rich Pedroncelli, AP)

In 2016 and again in 2017, civil rights and other groups wrote letters urging Facebook to conduct an independent civil rights audit of its content moderation system and to create a task force to institute the recommendations.

Last May, Facebook agreed to an audit as it was trying to control the damage from revelations that a shadowy Russian organization posing as Americans had targeted unsuspecting users with divisive political messages to sow discord surrounding the 2016 presidential election. One of the main targets of the Internet Research Agency on Facebook was African Americans. The same day Facebook gave in to demands from civil rights groups, it announced a second audit into allegations of anti-conservative bias led by former Senator Jon Kyl, an Arizona Republican.

There are few signs of progress in how Facebook deals with racially motivated hate speech against the African American community or the erasure of black users' speech, says Steven Renderos, senior campaign manager at the Center for Media Justice.

"We, and a lot of organizations that we work with, are frankly tired of waiting for Facebook to decide what changes it's going to make for itself," says Steven Renderos of the Center for Media Justice.(Photo: Center for Media Justice)

Last summer, after Nia Wilson, a black teenage girl, was stabbed to death by a white man at an Oakland, California, train station, black women gathered on Instagram to mourn.

“As we see another one of us being murdered to bleed out in the streets we can’t help but think: that could be me, that could be my daughter, a sister, my best friend,” black activist Rachel Cargle wrote. “You okay sis? I get it if you’re not. At this moment I feel heavy and distant and numb. I feel angry and deflated and heartbroken.”

Cargle asked that only women of color respond. "I needed to give us this space to check in on each other." Comments from black women poured in. “I’m scared. For my family, for my friends, for myself and for all the other black women out there,” one wrote.

Some white women objected to being left out of the conversation. Soon Instagram removed Cargle’s post, saying it violated guidelines on hate speech.

"There were hundreds of comments of black women being seen and heard by their peers, being loved and cared for by their sisters, being consoled and loved exactly as they needed it," Cargle wrote at the time. "DO YOU SEE THIS? DO YOU SEE HOW NOT ONLY ARE WE KILLED IN THE STREETS WE ARE ALSO PUNISHED FOR GRIEVING."

Instagram later reversed the decision.

Between class periods, Carolyn Wysinger checks social media and emails on her phone. (Photo: Brittany Hosea-Small for USA TODAY)

Civil rights organizations say they've largely given up on Facebook voluntarily taking steps to protect black users, calling instead on Congress and the Federal Trade Commission to regulate the company. Color Of Change has asked Zuckerberg and COO Sheryl Sandberg to take part in a civil rights summit this spring, but the two have not agreed to attend. Color Of Change is also pushing a resolution at Facebook’s shareholder meeting in May to replace Zuckerberg as chairman of the board.

"At the end of the day, Facebook hasn't tackled one of the biggest issues of most interest to the civil rights community, which is how it deals with content moderation and how the platform will become a place that civil rights are protected," Renderos says. "We, and a lot of organizations that we work with, are frankly tired of waiting for Facebook to decide what changes it's going to make for itself."

'I don't think Facebook cares'

Shaun Saunders, who works in public relations, says he doesn't think Facebook cares. Saunders is someone tech companies seek out when they want to tell their story to the media. He uses Facebook to connect with journalists. He's also been suspended from Facebook three times for speaking his mind about racism.

"I am exhausted by the notion that a platform can arbitrarily cut you off. I am not only cut off from my family and friends, I am cut off from my job and my craft. That's the part that really gets me," Saunders says. "It's not OK that black people are dying. Black people should not shut their mouths for identifying these things that are always happening."

"I would love to ask Mark Zuckerberg: 'What the hell are you guys doing? You guys are a hot mess.' I would say it to him just like that."

It's not just black people who have their posts removed. Andy Marra, executive director of the Transgender Legal Defense & Education Fund, says allies of black people run into trouble, too.

Marra's Facebook post in late January calling on Asian Americans to protect "black and brown who face the brunt of white supremacy" was removed by Facebook.

Twice Marra appealed the decision to take down her Facebook post, which shared an article from a popular blog showing an Asian man throwing up "white power" signs to antagonize Black Lives Matter protesters. "This post is expressing condemnation to anti-black racism. The post also articulates critical feedback about how other people of color – specifically those in the Asian community, including myself as an Asian person – should oppose racism in all of its forms," she wrote in one appeal that Facebook denied.

It was only when friends reached out to Facebook to plead her case that Marra's Facebook post was reinstated. Critics say having those kinds of connections is the only way that Facebook corrects content moderation errors, but it's not a channel available to just anyone seeking redress.

Samreen "Sammie" Lewis and Erica Morales, two activists of color, say their Facebook page, Three Token Brown Girls, has been deleted three times, once just for the logo, and they each have been personally banned repeatedly.(Photo: Samreen "Sammie" Lewis)

"It basically says that we don't matter. The harm being caused to us perpetually, constantly is nothing to (Facebook)," Samreen "Sammie" Lewis says. "They'd much rather shut us up for their own comfort than acknowledge the harm they are causing."(Photo: Samreen "Sammie" Lewis)

Take Samreen "Sammie" Lewis and Erica Morales, two activists of color, who say their Facebook page, Three Token Brown Girls, has been deleted three times, once just for the logo.

They say they each have been personally banned repeatedly, sometimes barely serving out one suspension before being hit with another. Lewis estimates she's been suspended from Facebook for half of the past year. Their protests go unheard, and, each time they have to rebuild their Facebook page from scratch, they lose followers.

"It basically says that we don't matter. The harm being caused to us perpetually, constantly is nothing to (Facebook)," Lewis says. "They'd much rather shut us up for their own comfort than acknowledge the harm they are causing."

In 2017, DiDi Delgado, a poet and black liberation organizer and activist, captured the growing anger in the black community with a Medium post provocatively titled: "Mark Zuckerberg Hates Black People." At the time, Delgado was simultaneously serving two Facebook bans for alleged hate speech.

Asked what has changed since she published the viral post, Delgado says nothing. "Black, LGBT, non-male and women identified users are still disproportionately banned for speaking out against oppression," she says.

These days, Delgado spends less time and energy on Facebook and, at times, refrains from speaking her mind there. "Sometimes it’s more important to keep that direct line of communication open than to risk getting banned with a public post," she says.

In the end, Wysinger made that same calculation. In February, Wysinger decided not to risk being booted off Facebook by republishing her post about Neeson, the actor. Just days before her 40th birthday, she did not want to get thrown in Facebook jail and miss the chance to celebrate with family and friends. But, she says, she wants Facebook to know that, in silencing black people, the company is causing them harm.

"Facebook is not looking to protect me or any other person of color or any other marginalized citizen who are being attacked by hate speech," she says. "We get trolls all the time. People who troll your page and say hateful things. But nobody is looking to protect us from it. They are just looking to protect their bottom line."

"Anything that I share, I'm sharing because it's something personal that happened to me. That's what Facebook has always built its platform on," she says. "It used to ask: How are you feeling? Well, today, I am feeling targeted by CIS hetero white men."

