robotics that fit male surgeons’ hands better, for example, act as barriers for women, who are at a disadvantage when using the same technology. This ‘fit’ issue could mean women are perceived as being less effective at their jobs, which can damage their careers, prevent them from being accepted or respected in the field, and thus discourage other women from entering the profession.
The software embedded in devices can also make them harder for certain populations to use. The
most glaring example comes from the latest virtual
reality headsets. As first identified by Danah Boyd
—a Principal Researcher at Microsoft Research and
founder of Data & Society—people differ biologically in which depth-perception cues they rely on, and the cues most virtual reality headsets use to simulate depth leave women feeling more nauseated than men.
These embedded biases can affect women in a
range of personal and professional scenarios, from
job training, to learning how to fly a plane, to teaching new parents CPR for newborns.
HUMANIZATION OF TECHNOLOGY
Giving technology products human personas
in audio or physical form also risks perpetuating stereotypes and normalizing biases.
Virtual assistants like Siri and Alexa were all
launched with a female-only voice option, furthering
the stereotype that women should assist you with
tasks. After significant media attention, these
products now offer the option of a male voice
(though the default is still female).
When they first came out, these digital assistants
were also programmed to respond in a subservient
manner to sexual comments like “you’re sexy.” After pushback that tolerating the verbal sexualization of digital assistants would normalize the same behavior in real life, these assistants now call out inappropriate comments with responses like “I’m not going to respond to that.”
SO, WHO IS RESPONSIBLE HERE?
As rapid innovation with emerging technology
continues, I see a real opportunity for consultants to identify and eliminate biases. Do
companies need to take responsibility for the fact
that end users think the curvy robot should get them
coffee? As consultants, what is our role in educating the clients who build these products about bias identification and inclusive design?
Ultimately, the responsibility for leading the
charge to incorporate inclusive and thoughtful
design into products lies with our clients. My goal is that, with some education and insight, we can show them the benefits of eliminating bias early. By building bias assessment into the product development lifecycle, developers and data scientists will be better able to catch bias before it becomes legacy behavior that is difficult to change.
Companies are in a powerful position to champion designing products for everyone and to showcase the business value of doing so. Many have already acted in cases where the media brought attention to an issue, but that’s not enough; inclusive design needs to be intertwined with corporate values.
THANK YOU, PROFESSOR
Second, there’s also an opportunity to enable
academia. Increased funding for academics to identify more biased use cases will draw attention to the risks and can lead to solutions for reducing bias. We need to equip emerging creators—data scientists, product managers, developers—with the frameworks and tools to tackle issues such as identifying bias in datasets.
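As one minimal illustration of what such a tool might do, a first-pass check on a dataset could look something like the Python sketch below. The column names and toy data are hypothetical, and a real assessment would go much further, but even two lines of analysis can surface skews before they harden into a product.

import pandas as pd

# Toy stand-in for real training data; the "gender" and "outcome"
# columns here are hypothetical examples.
df = pd.DataFrame({
    "gender": ["F", "M", "M", "M", "F", "M", "M", "F"],
    "outcome": [0, 1, 1, 0, 0, 1, 1, 1],  # e.g., hired / approved
})

# Check 1: representation. Is any group badly underrepresented?
print(df["gender"].value_counts(normalize=True))

# Check 2: outcome rates. Do the labeled outcomes differ sharply by group?
print(df.groupby("gender")["outcome"].mean())

# Large gaps on either measure are a cue to investigate the data before
# a model trained on it bakes the skew into a product.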
Let’s not wait until biases have spread across too many new solutions before our profession addresses the issue. The time is now for emerging technology leaders to step up and enable their teams to be thoughtful and successful in building products for everyone.
Megha Mathur is a technology business strategy consultant
who is passionate about using technology and data to empower
organizations and people. In her client-facing work, she brings
deep insights into an array of areas for Fortune 500 companies
developing and using emerging technologies such as AI/ML.
She also heads Keystone Strategy’s Diversity Initiative.