Google researcher Timnit Gebru received an email in early December while on vacation. “We accept your resignation immediately, from today,” wrote a vice president of the company.
The problem was that Gebru had not resigned; she had only said she would do so in the future if certain conditions were not met. The decision had apparently been prompted by an internal email in which she criticized the censorship of an academic paper.
She immediately went to Twitter to tell the story: “They have cut off my access to the corporate account. They have fired me out of the blue.” “I feel bad for my colleagues, but for me it is better to know the beast than to pretend not to see it,” she added in another tweet.
Since that day, and as of last Friday, 2,351 Google employees and 3,729 academics from around the world have signed a letter in support of Gebru. Google CEO Sundar Pichai has written an email to all employees saying he wants to regain his workers’ trust, but without apologizing.
The global community of engineers and researchers working on artificial intelligence has been debating ever since about the limits of research funded by large companies and the role of Google, a company that has dropped all pretense since abandoning its original, naive motto of “don’t be evil.”
Gebru is a pioneering researcher in the field of ethics in artificial intelligence (AI). Her best-known academic achievement is a 2018 paper in which she and her co-authors found that facial recognition systems misidentified the gender of white men only 1% of the time, but misidentified black women 35% of the time.
After arriving from Ethiopia at the age of 16, she earned a degree in electrical engineering from Stanford University and was involved in creating the first iPad at Apple. She then spent time at Microsoft. After completing her doctorate in computer vision under the celebrated professor Fei-Fei Li, she joined Google at the end of 2018.
This type of dismissal – by email, while on vacation – is not usual for a figure of her stature. “Firing a person by email is the second-worst way, after doing it with a post-it,” wrote professor and engineer Ricardo Baeza-Yates in response to Gebru. Speculation about the real causes of the dismissal skyrocketed.
These are the most widely discussed theories of recent days. All of them reflect at least a certain insensitivity, which leaves Google in a bad light given its generous or delicate treatment of harassers and other high-profile cases.
1 / The email. The vice president’s email referred to a message Gebru had written to an internal mailing list of Google employees: “Some aspects of the email you sent last night to non-management employees reflect behavior that is inconsistent with the expectations of a Google executive.”
What did that email say? Gebru herself could not retrieve it, because she no longer had access to her account. But within a few hours it appeared in journalist Casey Newton’s newsletter. “After all the micro and macro aggressions and harassment I received after sharing my stories here, I had stopped writing,” Gebru wrote.
But that day she had something else to say. The entire text is a venting, a public complaint about how her superiors had treated her over an academic paper she was told to retract. “A week before you go on vacation, they call you to a meeting,” Gebru wrote. “Nobody tells you what is going on. There they tell you that ‘it has been decided’ that you must retract the paper within a week. You are not worthy of having conversations, since you are someone whose humanity (not to mention your recognized expertise) is not accepted or valued by this company.”
It was an email about the alleged censorship of an academic paper that Gebru and other researchers, inside and outside Google, had submitted to a conference. Suddenly, that paper did not seem to be of sufficient quality. What could the article contain that so worried Google?
2 / The academic paper. No one has published the full article, but some journalists have been able to read it. It contains nothing explosive for the ethical-AI field. It discusses the enormous energy cost of building AI models, which require vast computational power.
It also addresses the biases picked up by models that learn to write after training on trillions of words found on the Internet. “The structural injustices that exist in society permeate the data. It is very difficult to find unbiased data because society is biased,” explains Ariel Guersenzvaig, professor at the Elisava University School of Design and Engineering (Barcelona). With language this is obvious: if the models are trained on what we say, they will repeat our patterns forever.
“I won’t say that what has emerged from the paper, as published so far, is already well known, but anyone familiar with the academic literature will not be surprised by any of it,” says Guersenzvaig.
3 / “The angry black woman.” If neither the email nor the paper seems explosive, perhaps they were just the excuse. “The paper is probably an excuse to get rid of a person they found inconvenient,” says Mara Balestrini, who holds a PhD in computer science from University College London.
“Angry black woman” is Gebru’s own caricature of the role Google’s communications seem intent on assigning her: a troublemaker who earned her dismissal by being a nuisance.
“They paint me as the angry black woman because they put you in this terrible workplace, and if you talk about it, you become the problem,” says Gebru in the only interview she has given so far since the conflict.
One of the words Gebru has used most is gaslight, which comes from an early 20th-century play, later adapted for film, in which a man psychologically abuses his wife by making her believe she is going mad. Abuse someone, then pin the blame on them. That, according to Gebru, is what Google has done to her.
Those are the facts, but the stir the case has caused shows that it has struck a nerve. On paper, it is just one more dismissal at a company with 130,000 employees. Why has it caused such a scandal?
1 / “If they have done this to me.” The biggest problem Gebru sees in her situation concerns all the black women who are in a more precarious position than hers. Gebru founded the Black in AI group in 2016. “The biggest story for me is: if this is happening to me, what is happening to other people?” Gebru now says.
Black women make up 1.6% of Google’s overall workforce, only 0.7% of technical positions and 1.1% of managerial positions. Almost 50% of managerial positions are held by white men. “You have harassers who walk away with millions of dollars,” says Gebru.
“You have all these people with such toxic behavior that there are others saying ‘they are valuable to the company,’ ‘oh, they are just socially awkward’ or whatever. And then you have a young black woman who has to prove herself over and over again. I reached a point where my expertise is valued by people, but not within Google,” she adds.
2 / “If they have done this to her.” If a company treats its brilliant employees like this, how will it treat its billions of users? That is the question posed by Anna Jobin, a sociologist and researcher at the University of Lausanne (Switzerland).
“If a company like Google cannot tolerate what Timnit Gebru has to say, and listen and learn in order to innovate better, what about all the other ethical issues in its business? So it is no longer the case that personal matters [like Gebru’s problems at Google] don’t matter; it is no longer just personal. These things matter socially, politically and ethically,” she adds.
3 / Who will let us do the research? Big tech companies are perhaps the largest source of funding for researchers like Gebru. Universities are essential, but they have far less money. “Intel funded my doctorate. The difference from what a Spanish doctoral student could do was abysmal,” explains Balestrini. “There are very few employers of this kind. In ethical AI, if you are not at a university, there is no work. Getting into Google or Facebook means the possibility of accessing huge volumes of data,” she adds.
Gebru’s treatment could be a turning point in the hiring of AI engineers. If the more ethically minded see that the ethics department is disdained, they may look beyond Google for work. Balestrini is skeptical: “I don’t want to be pessimistic, but they will not stop having research groups for this. If these things change, it will be because there is greater awareness of technical problems and, at the same time, more job opportunities for researchers, as will now happen with the Horizon Europe program,” she says.
The controversy is a reminder that Google is no longer the heroic search company it once was. What happens inside its offices is now viewed in a different light.