
I spent a good deal of my educational and early career as an analyst doing research at scale. In fact, the way I got into the executive resources program at IBM was through one of the largest research projects my division had ever undertaken.

A recurring issue with those who attempt to address the diversity and inclusion problem is that, in the absence of an understanding of it, they focus on the symptoms. Like taking a decongestant for a cold, that might offer temporary relief, but the approach hasn’t been successful in fixing the problem.

At Dell Technologies World last week, I attended a session on diversity and inclusion hosted by Chief Diversity Officer Brian Reaves. It focused on issues with facial recognition technology, which has worked well for white men but not so well for any other group.

The speaker was Joy Buolamwini, who calls herself the “Poet of Code.” Had her presentation been a TED Talk (and no surprise — she has done one), it would have ranked as one of the best I’ve ever seen, and TED Talks set a very high bar for both delivery and message.

I’ll share my response to that important session and close with my product of the week.

Guru Sessions

Every Dell Technologies World includes sessions that cover topics of broad interest. You really should attend these sessions. While not all are great, they generally provide insights on critical issues developing in the market, and with a breadth you probably won’t get at any other event. Hell, you are there anyway — you might learn something. They tend to be far more interesting and relevant than most product pitches.

Past presentations seemed to suggest Dell should be going into emerging markets, like robotics and general-use artificial intelligence (that was last year), but Dell doesn’t seem to take those pitches seriously. What I found fascinating this year was that Brian Reaves was right there for the event, and he clearly planned to take it to heart. In fact, he appeared to be adopting Joy’s recommendations already.

Algorithmic Bias

Joy eloquently pointed out something that most of us in tech know — or should know — which is that a homogeneous culture of white male engineers is clueless about any other group. Her focus was on facial recognition, a technology that governments across the world increasingly have been using to identify people. Currently, 130 million people are in U.S. facial recognition programs, according to Joy. Many of them not only don’t know it, but also are being misidentified.

This is particularly onerous for women of color, who often are misidentified as male — or even as animals or something else entirely. Some, for instance, have been identified as wigs or mustaches on men. These programs are used to make decisions about services, about whether individuals should be allowed access, and about whether they are criminals.

The underlying problem isn’t a failure in the AI tools in terms of core technology. Although these misidentifications sometimes can result from low-quality cameras, they largely are the result of horribly biased data sets. Often the data sets are pulled from tech firms, staffed mostly by white males, or from the media, which in terms of volume seems to favor pictures of white males.

This is not just a problem for system accuracy — it is effectively abuse at scale. I mean, how offensive would it be if some brain-dead AI identified you as a gorilla? Used to determine criminal sentences in some areas, these programs often showcase people of color as repeat offenders, erroneously suggesting higher sentences for them.


These programs may be used to weed out potentially bad candidates for employment (one product is called “HireVue”) by analyzing facial expressions — even though it has been shown that they can’t even reliably identify gender, or whether the candidate is human at all.

IBM, Microsoft, Amazon

As part of her effort, Joy initially looked at facial recognition programs from Microsoft, IBM and Face++, a China-based firm I’d not heard of previously. These programs were almost perfect for white guys, but when you got to people of color, and particularly women of color, their accuracy was often little better than flipping a coin. After the firms became aware of the problems, they moved to fix them, and currently their programs are vastly more accurate with people of color — though far from perfect.

Joy then looked at Amazon and Kairos and found they were as bad as the others had been, having learned nothing from their peers. What is frightening, given how widely its technology is used, is that Amazon was by far the worst.

However, this showcased that with focus, a desire to fix the problem, and strong execution, you can reduce the problem by a massive degree, and that continuing to work on it eventually would make the number of misidentifications trivial.

Fixing the Problem

Joy argued that to fix this problem, it would be necessary to increase the diversity of the data sets being used to train AIs, so that they would better match the populations they are tasked with measuring. Today those data sets are only 17 percent women and only 4 percent women of color. (It is fascinating — and embarrassing — that Joy found she had to wear a white mask for some of these systems to see her as human.)

The process she suggested, which should be never-ending, is to highlight the bias first, so that the problem gets resources; then identify the causes of the bias, so those resources can be focused; then execute to mitigate it.
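The first step of that process — highlighting the bias — amounts to disaggregating a system's accuracy by demographic group, since an aggregate number can hide a near-coin-flip subgroup. Here is a minimal sketch of that audit step in Python; the group names and audit records are hypothetical, invented purely for illustration:

```python
from collections import defaultdict

def accuracy_by_group(records):
    """Compute classification accuracy per demographic group.

    Each record is a (group, predicted_label, true_label) tuple.
    Reporting accuracy per group, rather than overall, is what
    surfaces the kind of gap Joy's research highlighted.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Hypothetical audit data: overall accuracy is 80 percent,
# which looks acceptable until it is broken out by group.
records = (
    [("lighter_male", "male", "male")] * 95
    + [("lighter_male", "female", "male")] * 5
    + [("darker_female", "female", "female")] * 65
    + [("darker_female", "male", "female")] * 35
)

print(accuracy_by_group(records))
# {'lighter_male': 0.95, 'darker_female': 0.65}
```

The point of the sketch is the shape of the report, not the numbers: once the per-group gap is visible, it becomes something that can be resourced and tracked release over release.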

My own research training suggests that you can never eliminate bias because bias is inherent in people, but you can work to minimize it so that its impact is less significant over time.

Joy suggested three inclusion imperatives:

  1. Dare to ask uncomfortable, intersectional questions. If something doesn’t look right, then make the effort to investigate the issue. Don’t avoid it because it makes people uncomfortable. Change is uncomfortable, but the only way you can make progress is through change.
  2. Dare to listen to silenced voices. Often people who are disadvantaged are considered inconsequential, but if you don’t listen you won’t see or understand critical problems that need to be addressed — let alone be able to resource fixing them.
  3. Dare to dream. I think Joy clearly lives this. Dreams of a better world, when collectively shared, can result in a better world. If you just accept the status quo, then there is little chance you’ll ever achieve what otherwise might be possible.

Joy praised the Google employees who staged a bit of a revolt last week, protesting the retaliation against their peers — many of whom were demoted or fired for walking out to protest some of Google’s bad practices. I agree that those folks were heroes, and that this kind of thing often is necessary if we want to drive needed change.

Joy recommended that those who use analytical tools recognize that mitigation is a process, and that as you add elements to analytical tools, always question the assumptions, the accuracy of the data, and the models being used. Never assume.


She also praised the UK, which has deployed facial recognition at massive scale, for being public about the fact that it sucks. It takes transparency to target resources at fixing problems; covering them up clearly does not work.

Wrapping Up

Given that I’ve done research at a national scale and spent much of my own graduate work thinking about the elimination of bias, I found Joy’s talk incredibly interesting. We know that diverse groups, if properly created and managed, can result in better products, improved performance, and a more well-rounded and inclusive company.

If we can recognize that there are problems, identify the causes, and work to mitigate them, we not only can make our companies better places to work, but also can make the world a better place to live in.

Joy’s talk gave me hope that we are making progress. If more of us get engaged, then maybe we really can create the utopia that Joy and a number of old white guys dream about.

Rob Enderle's Product of the Week

We are moving toward a future when we will be surrounded by robots. I’ve seen projections that suggest the market for them will be larger than the one for smartphones in a few years. One of my greatest concerns is that the tech companies have been ignoring the rise of robots, much like those that came before them largely ignored other major disruptive trends — like the PC and smartphones.

Well, Briggo is a counterpoint, because it is a robotic coffee store that generates US$12K per foot and makes the best damn mocha I’ve ever had.


Briggo Coffee

Briggo Coffee Haus (basically a robotic Starbucks) was created using Dell’s original equipment manufacturing unit and Boomi, together with a ton of robotic development. (The team basically created a robotic clone of one of the world’s top baristas.)

Briggo Coffee Haus is one of the most impressive coffee vending machines I’ve ever seen. I live for my morning cup of coffee, and after I tasted my first Briggo cup I was hooked. Sadly, there aren’t a ton of these available yet, but they will be arriving in the San Francisco airport shortly and are already in the Austin Airport. Dell employees are lucky, because they have two of the things at Dell headquarters. Were I at any other large Dell facility that didn’t have one, I’d be asking WTF? Don’t I count?

They use a revenue-share model, so those who install them get a percentage of the revenue, while Briggo largely handles the cost of the hardware and the logistics of supporting the machine. The machine is a showcase of technology, with full predictive analytics and a software stack that ensures quality that would make a leading computer scientist proud.

It uses a cloud-based solution, which means that if you have the app, you can order your coffee when you land (you’ll get a single-use code), and it will be ready as you sprint past the machine, so even with a tight connection you can get your caffeine fix. (I just wish one were here in the Las Vegas airport, where I’m writing this.)

The end product is so much better than my typical Starbucks that it isn’t even funny. Given that I’m wishing I had a cup of the stuff right now, the Briggo Coffee machine, and especially the coffee, is my product of the week. If you see one, try it out; I’ll bet you’ll be impressed with the selection and the quality of the drink.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.
