AI is Not Society’s Savior

Tuesday, March 31st, 2020


The chatter about how AI will change the world, take your job, out-consult the consultants, displace management, perform reviews, identify potential criminals and reoffenders, diagnose illnesses, etc., especially etc., is never ending.

AI is supposed to bring true objectivity to its many applications, creating longed-for change.

Yet it’s been proven over and over that AI contains the same biases that created our unfair, prejudiced world, not just in the US but around the globe.

AI is good at increasing bias in the name of efficiency and objectivity.

It is even better at automating the loss of privacy and increasing surveillance in the name of safety.

Long before AI got hot, Lou Gerstner knew the solution.

Computers are magnificent tools for the realization of our dreams, but no machine can replace the human spark of spirit, compassion, love, and understanding.

Something tech has forgotten in its love affair with data and its warped view of progress.

And, of course, profit.

Image credit: safwat sayed

AI As Blunt Force Trauma

Wednesday, February 12th, 2020


While AI can do some things on its own, it’s a blunt force, ignorant of nuance, but embracing all the biases, prejudices, bigotry and downright stupidity of past generations, thanks to its training.

Using AI to make judgement calls that are implemented sans human involvement is like using a five-pound sledgehammer on a thumbtack.

Yesterday’s post looked at what AI can miss in hiring situations, but candidates at least have more choice than others do.

AI is being used extensively around the world by government and law enforcement where its bias is especially hard on people of color.

The algorithm is one of many making decisions about people’s lives in the United States and Europe. Local authorities use so-called predictive algorithms to set police patrols, prison sentences and probation rules. In the Netherlands, an algorithm flagged welfare fraud risks. A British city rates which teenagers are most likely to become criminals.

Human judgement may be flawed and it does have the same prejudices, but it’s not inflexible, whereas AI is.

As the practice spreads into new places and new parts of government, United Nations investigators, civil rights lawyers, labor unions and community organizers have been pushing back.

Now schools are jumping on the bandwagon claiming that facial recognition will make schools safer, but not everyone agrees.

“Subjecting 5-year-olds to this technology will not make anyone safer, and we can’t allow invasive surveillance to become the norm in our public spaces,” said Stefanie Coyle, deputy director of the Education Policy Center for the New York Civil Liberties Union. (…)

Critics of the technology, including Mr. Shultz and the New York Civil Liberties Union, point to the growing evidence of racial bias in facial recognition systems. In December, the federal government released a study, one of the largest of its kind, that found that most commercial facial recognition systems exhibited bias, falsely identifying African-American and Asian faces 10 to 100 times more than Caucasian faces. Another federal study found a higher rate of mistaken matches among children.

So what do the kids think?

Students 13 and older are invited to comment. All comments are moderated by the Learning Network staff…

Read the Q&A to find out.

Image credit: Mike MacKenzie

The Bias of AI

Tuesday, June 25th, 2019


I’ve written before that AI is biased for the same reason children grow up biased — they both learn from their parents.

In AI’s case its “parents” are the datasets used to train the algorithms.

The datasets are a collection of millions of bits of historical information focused on the particular subject being taught.

In other words, the AI learns to “think”, evaluate information and make judgments based on what has been done in the past.

And what was done in the past was heavily biased.
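To see how this works in miniature, here is a hypothetical sketch (the data and hire rates are invented for illustration): a naive “model” trained on past hiring decisions simply learns each group’s historical hire rate, so the favoritism baked into the records carries straight through to its predictions.

```python
# Hypothetical illustration: a model trained on biased historical
# decisions reproduces the bias, because the data IS its "parent."
from collections import defaultdict

# Toy historical data: (group, hired) pairs reflecting past bias.
# Group A was hired 80% of the time, group B only 20%.
history = ([("A", True)] * 80 + [("A", False)] * 20
           + [("B", True)] * 20 + [("B", False)] * 80)

def train(records):
    """Learn each group's historical hire rate from the records."""
    totals = defaultdict(lambda: [0, 0])  # group -> [hires, count]
    for group, hired in records:
        totals[group][0] += int(hired)
        totals[group][1] += 1
    return {g: hires / count for g, (hires, count) in totals.items()}

def predict(model, group):
    """Recommend 'hire' when the group's historical rate exceeds 50%."""
    return model[group] > 0.5

model = train(history)
print(predict(model, "A"))  # True  -- past favoritism carried forward
print(predict(model, "B"))  # False -- past exclusion carried forward
```

Nothing in the code is malicious; the skew comes entirely from the training data, which is exactly the point.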

What does that mean to us?

In healthcare, AI will downgrade complaints from women and people of color, as doctors have always done.

And AI will really trash you if you are also fat. Seriously.

“We all have cultural biases, and health care providers are people, too,” DeJoy says. Studies have indicated that doctors across all specialties are more likely to consider an overweight patient uncooperative, less compliant and even less intelligent than a thinner counterpart.

AI is contributing significantly to the racial bias common in the courts and law enforcement.

Modern-day risk assessment tools are often driven by algorithms trained on historical crime data. (…) Now populations that have historically been disproportionately targeted by law enforcement—especially low-income and minority communities—are at risk of being slapped with high recidivism scores. As a result, the algorithm could amplify and perpetuate embedded biases and generate even more bias-tainted data to feed a vicious cycle.

Facial recognition also runs on biased AI.

Nearly 35 percent of images for darker-skinned women faced errors on facial recognition software, according to a study by Massachusetts Institute of Technology. Comparatively lighter-skinned males only faced an error rate of around 1 percent.

While healthcare, law and policing are furthest along, bias is oozing out of every nook and cranny that AI penetrates.

As usual, the problem was recognized only after the genie was out of the bottle.

There’s a lot of talk about how to correct the problem, but how much will actually be done and when is questionable.

This is especially true because the bias in AI is the same as that of the people using it, so it’s unlikely they will consider it a problem.

Image credit: Mike MacKenzie
