Stark, an assistant professor at Western University in Ontario, Canada, studies the social and ethical impacts of artificial intelligence. In late November, he applied for a Google Research Scholar award, a no-strings-attached research grant of up to $60,000 to support professors who are early in their careers.
He put in for the award, he said, "because of my sense at the time that Google was building a really strong, potentially industry-leading ethical AI team."
Soon after, that feeling began to dissipate. In early December, Timnit Gebru, the co-leader of Google's ethical AI team and a prominent Black woman in a mostly White, male field, abruptly left Google. On Wednesday, December 2, she tweeted that she had been "immediately fired" for an email she sent to an internal mailing list. In the email she expressed dismay over the ongoing lack of diversity at the company and frustration over an internal process related to the review of a then-unpublished research paper about the risks of building ever-larger AI language models — a buzzy kind of AI that is increasingly important to Google's enormous search business.
At the time, Gebru said Google AI leadership told her to retract the paper from consideration for presentation at a conference, or remove her name from it.
Google said it accepted Gebru's resignation, citing a list of demands she had sent via email that she said would need to be met for her to continue working at the company.
Gebru's ouster kicked off a months-long crisis for the company, including employee departures, a leadership shuffle, and an apology from Google's CEO for how the circumstances of Gebru's departure caused some employees to question their place there. Google conducted an internal investigation into the matter, results of which were announced on the same day the company fired Gebru's co-team leader, Margaret Mitchell, who had been consistently critical of the company on Twitter following Gebru's exit. (Google cited "multiple violations" of its code of conduct.) Meanwhile, researchers outside Google, particularly in AI, have become increasingly distrustful of the company's historically well-regarded scholarship and angry over its treatment of Gebru and Mitchell.
All of this came into sharp focus for Stark on Wednesday, March 10, when Google sent him a congratulatory note, offering him $60,000 for his proposal for a research project that would look at how companies are rolling out AI that is used to detect emotions. Stark said he immediately felt he needed to reject the award to show his support for Gebru and Mitchell, as well as those who remain on the ethical AI team at Google.
"My first thought was, 'I have to turn it down,'" Stark told CNN Business.
Stark is among a growing number of people in academia who cite the exits of Gebru and Mitchell as the reason for recent decisions to forfeit funding or opportunities provided by the company. Some AI conference organizers are rethinking having Google as a sponsor. And at least one academic who has received a big check from Google in the past has since declared he won't seek its financial support until changes are made at the company.
"In good conscience, I can no longer accept funding from a company that treats its employees in this manner," Vijay Chidambaram, an assistant professor at the University of Texas at Austin who studies storage systems, told CNN Business. Chidambaram previously received $30,000 from Google in 2018 for a research project.
The money involved is of little consequence to Google. But the widening fallout from Google's tensions with its ethical AI team now poses a risk to the company's reputation and stature in the AI community. This is crucial as Google battles for talent — both for employees at the company and for academic names connected to it.
"I think this is wider spread than even the company realizes," Stark said.
Despite his initial inclination, Stark didn't immediately refuse Google's award. He spoke to colleagues about what he planned to do — "People were supportive of whichever decision I made," he said — before sending Google his response the following Friday. He thanked the company for the "vote of confidence" in his research, but wrote that he was "declining this award in solidarity with Drs. Gebru and Mitchell, their teammates, and all those who've been in similar situations", according to emails viewed by CNN Business.
"I look forward to the possibility of collaborating with Google Research again, at such time as the organization and its leaders have reflected on their decision in this case, addressed the harms they've caused, and committed, in word and deed, to fostering critical research and products that support equity and justice," Stark wrote.
He tweeted about his decision to reject the award as well, to make it public, noting that many people can't afford to turn down such funding from Google or other companies. Stark is able to forgo the money because his department at Western University is sufficiently funded. The award from Google would have provided extra research money, he said.
"All we can do is what we can reasonably do — and this was something I felt I could," Stark tweeted.
Gebru said she appreciated Stark's action.
"It's a pretty huge deal for someone to decline Google sponsorship," she told CNN Business. "Especially someone who's early in their career."
A Google spokesperson said that, over the past 15 years, the company has furnished over 6,500 academic and research grants to those outside Google. Stark is the first person to turn one down, according to the spokesperson.
Yet Stark's decision is just the latest show of solidarity with Gebru and Mitchell.
The first obvious sign of anger came just after Gebru left Google. A Medium post decrying her departure and demanding transparency about Google's decision regarding the research paper quickly gained signatures of Google employees and supporters within the academic and AI fields; by late March, its number of supporters had swelled to nearly 2,700 Google employees and over 4,300 others.
In early March, the conference to which Gebru and her coauthors had submitted the paper, the ACM Conference on Fairness, Accountability, and Transparency, or FAccT, halted its sponsorship agreement with Google. Gebru is one of the conference's founders, and served as a member of FAccT's first executive committee. Google had been a sponsor each year since the annual conference began in 2018. Michael Ekstrand, co-chair of the ACM FAccT Network, confirmed to CNN Business that the sponsorship was halted, saying the move was determined to be "in the best interests of the community" and that the group will "revisit" its sponsorship policy for 2022. Ekstrand said Gebru was not involved in the decision.
Also in March, two academics protested Google's actions by tweeting that they had decided not to attend an invitation-only robotics research event that was being held online. Hadas Kress-Gazit, a Cornell robotics professor, was one of them; she said she was invited in January but grew more reluctant as the event drew closer.
"It was a real fiasco the way [Gebru and Mitchell] were treated. Nobody apologized to them yet even," she told CNN Business in a recent interview. "I don't want to interact with companies that behave that way toward top researchers."
Google is aware that its reputation as a research institution has been harmed in recent months, and the company has said it's intent on fixing it. In a recent Google town hall meeting, which Reuters first reported on and CNN Business has also obtained audio from, the company outlined changes it's making to its internal research and publication practices.
"I think the way to regain trust is to continue to publish cutting-edge work in many, many areas, including pushing the boundaries on responsible-AI-related topics, publishing things that are deeply interesting to the research community, I think is one of the best ways to continue to be a leader in the research field," Jeff Dean, Google's head of AI, said. He was responding to an employee question regarding outside researchers saying they will read papers from Google "with more skepticism now."
Gebru hopes that, like FAccT, more conferences will reevaluate their relationships with tech companies' research labs. Historically, much of the work in the development and study of AI has been performed within academic settings. But as companies have found more and more commercial uses for the technology, the lines between the academic and corporate worlds have blurred. Google is just one of many tech companies that wield a large amount of influence over the academic conferences that publish many of their researchers' papers; its employees sit on conference boards and it sponsors numerous conferences each year, sometimes to the tune of tens of thousands of dollars.
For instance, Google and some subsidiaries of its parent company, Alphabet, were listed as $20,000 "platinum" and $10,000 "gold" level sponsors at the International Conference on Machine Learning, or ICML, and the Conference on Neural Information Processing Systems, or NeurIPS, in 2020 — both key AI conferences. And some of the company's employees sit on their organizing committees.
ICML president John Langford said the conference is "presently open for sponsorship" by Google for its 2021 conference, which is set for July.
"There is quite a bit of discussion ongoing about how ICML as a conference should encourage good machine learning culture and practices with future sponsorship policy a part of that discussion," he added.
NeurIPS executive director Mary Ellen Perry said the conference hasn't yet made its annual call for sponsorships, but that requests "will be evaluated against a set of selection guidelines put in place by this year's sponsorship chairs"; NeurIPS is scheduled for December.
For Stark and others in the academic research community, however, their criteria for accepting funds from Google have already changed.
"Extra research money would be great," Stark said. "But it was something I felt like I just couldn't take."