Tuesday, June 12, 2018

On James Damore and Biased Silicon Valley Recruiting

Last night I committed myself to reading all ten pages of James Damore’s infamous 2017 memo – the one that hung excitedly on Google’s invisible biases and yet conjured messy, sexist prejudices of its own. The underrepresentation of women in the workplace, Damore insisted, stemmed partly from their biological differences from men, and it was here I felt his argument began to curdle.

To ascribe to biology, with cool indifference, a role in such systemic issues is an unsettling line of reasoning. Damore’s strategic distance from relevant gender studies research (here, here, and here) and his blithe handling of workplace injustices demonstrate a clear disregard for empathy and social theory. Though I fully respect Damore’s prerogative to air his grievances, I found his conclusions parochial and his thesis a sobering reminder of all that is wrong in Silicon Valley.

In particular, I find a heavy-handed imbalance in the process by which technology companies deem a candidate fit for hire. Testing how to traverse a linked list or design a marketing campaign serves to shore up a unilateral recruiting process, to ward off “unsuitable” candidates, and to keep these enterprises upright in their pursuit of changing the world. But is that all that’s necessary when the goal is to construct safe, objective, and accessible resources for everyone?

Should we not ask data scientists how they would handle biased training data? Why shouldn’t engineers be required to submit an ethical analysis of using race as a feature in a machine learning model? Are product managers really better equipped for their jobs if they know how long it would take to empty a bathtub with a drinking straw, rather than if they understand how historical context could affect their software release in different regions?
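To make the first of those questions concrete, here is the sort of exercise I have in mind. It is a minimal sketch of my own, not a question any company actually poses, and the file and column names (a hypothetical applications.csv with gender and hired fields) are invented for illustration: before any model is trained, compare outcome rates across a sensitive attribute and ask why the gaps exist.

    # Hypothetical pre-modeling audit: compare positive-outcome rates
    # across a sensitive attribute before the data trains anything.
    import pandas as pd

    def outcome_rate_gap(df: pd.DataFrame, group_col: str, label_col: str) -> pd.Series:
        """Each group's positive-outcome rate minus the overall rate."""
        overall = df[label_col].mean()
        return df.groupby(group_col)[label_col].mean() - overall

    applications = pd.read_csv("applications.csv")  # invented historical hiring data
    gaps = outcome_rate_gap(applications, group_col="gender", label_col="hired")
    print(gaps)  # large gaps suggest the labels already encode past bias

A candidate who can explain what those gaps mean – and what they do not mean – is showing exactly the judgment the current interview loop never measures.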

From the looks of it, I guess not; none of those questions makes it into the interview loop. All that matters is finding Pythagorean triplets in an array or knowing when to use L1 or L2 regularization.
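For contrast, the triplet exercise itself is easy to sketch. What follows is one common quadratic-time approach of my own writing, not any company’s actual interview question, and it is roughly the whole of what a candidate gets graded on:

    # Find Pythagorean triplets (a, b, c) with a^2 + b^2 == c^2 in an array.
    # Precomputing squares and checking membership gives an O(n^2) solution.
    def pythagorean_triplets(nums):
        squares = {x * x: x for x in nums}
        triplets = set()
        for i, a in enumerate(nums):
            for b in nums[i + 1:]:
                c = squares.get(a * a + b * b)
                if c is not None:
                    triplets.add(tuple(sorted((a, b, c))))
        return sorted(triplets)

    print(pythagorean_triplets([3, 1, 4, 6, 5, 10, 8]))  # [(3, 4, 5), (6, 8, 10)]

It is a fine warm-up, but it says nothing about whether the person writing it will notice when their training data, or their product, quietly harms someone.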

This larger valley ecosystem sources its progress from technical theory but demonstrates a persistent inattention to social and cultural awareness. The release and legitimization of algorithms that classify African Americans as gorillas or fail to recognize Asian Americans, then, perfectly typifies the technology bias I am suggesting is rampant. Within Damore’s frame of mind, where underrepresentation is seen as a result of biology and not entrenched prejudice, our digital society becomes ripe for misguided and premature creations.

I find no better analysis of Silicon Valley’s devaluation of social knowledge than in Safiya Umoja Noble’s 2018 book, Algorithms of Oppression:

The implications of such marginalization are profound. The insights about sexist and racist biases... are important because information organizations, from libraries to schools and universities to governmental agencies, are increasingly reliant on or being displaced by a variety of web-based "tools" as if there are no political, social, or economic consequences of doing so….

The notion that Google/Alphabet has the potential to be a democratizing force is certainly laudable, but the contradictions inherent in its projects must be contextualized in the historical conditions that both create and are created by it.

In other words, to technical skills we must conjoin social context and historical knowledge, lest we dispatch biased software that dons the fraudulent guise of neutrality. I don’t mean to sound cynical – I simply believe that, as purveyors of products used by billions of people, technology companies must embrace a critical sense of detachment, continuously questioning the limits of their perspective and the social insights that inform it. That is, if they are serious about their self-professed promise to make the world a better place, they should privilege all types of knowledge.

When employees such as Damore see initiatives like diversity recruiting as “discriminatory,” I think we are made aware of what Silicon Valley manifestly lacks. While I appreciate Google’s swift firing of Damore, it seems we have a penchant for forgetting, as if it were so hard to remember why such thinking is pernicious. It seems only a matter of time until another offensive algorithm is hastily deployed or an indecent boardroom remark is leaked to the public.

No one is perfect, and these companies will make mistakes; that much I will acknowledge. They wield a titanic responsibility that is hard to manage across a workforce spread so thin. As such, it must be their priority to inculcate habits of decency and cultural awareness within what is currently a rigid and insular recruiting framework. Only then, I feel, can we trust such enterprises to make products that better the world.