Archive for the 'Ethics' Category

Tech with a Conscience

Tuesday, July 2nd, 2019

https://twitter.com/deepnudeapp

Sounds like an oxymoron.

The world knows about tech’s love affair with, and misuse of, personal data. The continual ignoring, minimizing and excusing of hate speech, revenge porn, fake news, bullying, etc.

Then there is its totally irrational attitude/belief that people will be kind and good to each other online no matter what they are like in the real world.

Given the prevailing attitude, would a hot tech startup have a conscience?

So would a founder, a self-described “technology enthusiast,” create an AI app that went viral and then shut it down because of the way it was being used?

DeepNude was built on Pix2Pix, an open-source algorithm used for “image-to-image translation.” The app can create a naked image from any picture of a woman with just a couple of clicks. Anti-revenge-porn activists called the app “absolutely terrifying.”

As to the above question, the answer is “yes.”

The DeepNude team was horrified, believing “the probability that people will misuse it is too high.”

“We don’t want to make money this way. Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it,” DeepNude wrote in a tweet. “The world is not yet ready for DeepNude.”

—deepnudeapp (@deepnudeapp) June 27, 2019

Pix2Pix was developed by a team of scientists, who now believe the industry needs to do better and not just release their work to the world at large.

“We have seen some wonderful uses of our work, by doctors, artists, cartographers, musicians, and more,” the MIT professor Phillip Isola, who helped create Pix2Pix, told Business Insider in an email. “We as a scientific community should engage in serious discussion on how best to move our field forward while putting reasonable safeguards in place to better ensure that we can benefit from the positive use-cases while mitigating abuse.”

One can only hope that the scientific community does, indeed, find a way to do good while avoiding the worst of the negative fallout from discoveries.

And hats off to the DeepNude team.

It’s really inspiring to see such a concrete example of doing the right thing, with no shilly-shallying or dancing around the decision.

But I do wonder what would have happened if either the developers or the scientists had been beholden to investors.

Image credit: deepnudeapp via Twitter

Say What?

Wednesday, June 26th, 2019

https://www.flickr.com/photos/m_kajo/10071501426/

Every day seems to bring more bad news from the AI front.

Google gives away tools for DIY AI, with no consideration for who uses them or for what.

One result is the proliferation of deepfakes.

Now scientists from Stanford University, the Max Planck Institute for Informatics, Princeton University, and Adobe Research are making faking it even simpler.

In the latest example of deepfake technology, researchers have shown off new software that uses machine learning to let users edit the text transcript of a video to add, delete, or change the words coming right out of somebody’s mouth.

The result is that almost anyone can make anyone say anything.

Just type in the new script.

Adobe, of course, plans to consumerize the tech, with a focus on how to generate the best revenue stream from it.

It’s not their problem how it will be used or by whom.

Yet another genie out of the box and out of control.

You can’t believe what you read; you can’t believe what you hear; it’s been ages since you could believe pictures, and now you won’t be able to believe the videos you see.

All thanks to totally amoral tech.

Werner Vogels, Amazon’s chief technology officer, spelled out tech’s attitude in no uncertain terms.

“It’s in society’s direction to actually decide which technology is applicable under which conditions.”

“It’s a societal discourse and decision – and policy-making – that needs to happen to decide where you can apply technologies.”

Decisions and policies that happen long after the tech is deployed — if at all.

Welcome to the future.

Image credit: Marion Paul Baylado

The Bias of AI

Tuesday, June 25th, 2019

https://www.flickr.com/photos/mikemacmarketing/30188200627/

I’ve written before that AI is biased for the same reason children grow up biased — they both learn from their parents.

In AI’s case, its “parents” are the datasets used to train the algorithms.

The datasets are a collection of millions of bits of historical information focused on the particular subject being taught.

In other words, the AI learns to “think,” evaluate information, and make judgments based on what has been done in the past.

And what was done in the past was heavily biased.
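The mechanism is easy to see in miniature. Here is a toy sketch (with entirely made-up numbers) of how a model trained on biased historical decisions simply learns to repeat them:

```python
# Hypothetical illustration: a "model" trained on biased historical
# approval records reproduces the bias verbatim.
from collections import Counter

# Made-up history: group A was approved 90% of the time, group B only 30%.
history = [("A", "approved")] * 90 + [("A", "denied")] * 10 + \
          [("B", "approved")] * 30 + [("B", "denied")] * 70

def train(records):
    """Naive learner: predict whatever outcome was most common for each group."""
    votes = {}
    for group, outcome in records:
        votes.setdefault(group, Counter())[outcome] += 1
    return {group: counts.most_common(1)[0][0] for group, counts in votes.items()}

model = train(history)
# The learned rule encodes the historical bias exactly:
# model == {"A": "approved", "B": "denied"}
```

Real systems are far more sophisticated, but the principle is the same: nothing in the training process distinguishes “what was done” from “what should have been done.”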

What does that mean to us?

In healthcare, AI will downgrade complaints from women and people of color, as doctors have always done.

And AI will really trash you if you are also fat. Seriously.

“We all have cultural biases, and health care providers are people, too,” DeJoy says. Studies have indicated that doctors across all specialties are more likely to consider an overweight patient uncooperative, less compliant and even less intelligent than a thinner counterpart.

AI is contributing significantly to the racial bias common in the courts and law enforcement.

Modern-day risk assessment tools are often driven by algorithms trained on historical crime data. (…) Now populations that have historically been disproportionately targeted by law enforcement—especially low-income and minority communities—are at risk of being slapped with high recidivism scores. As a result, the algorithm could amplify and perpetuate embedded biases and generate even more bias-tainted data to feed a vicious cycle.

Facial recognition also runs on biased AI.

Nearly 35 percent of images for darker-skinned women faced errors on facial recognition software, according to a study by Massachusetts Institute of Technology. Comparatively lighter-skinned males only faced an error rate of around 1 percent.

While healthcare, law and policing are furthest along, bias is oozing out of every nook and cranny that AI penetrates.

As usual, the problem was recognized after the genie was out of the box.

There’s a lot of talk about how to correct the problem, but how much will actually be done and when is questionable.

This is especially true since the bias in AI is the same as that of the people using it, so it’s unlikely they will consider it a problem.

Image credit: Mike MacKenzie

Amazon’s Terrifying Power

Wednesday, June 19th, 2019

https://apagraph.com/quote/6672

Every day when I look through the headlines there’s always another story about Facebook, Google, or another tech company abusing their users and offering the same old platitudes about how important user privacy is to them, or being investigated/fined by the Feds, the European Union, or some other country.

Ho-hum, business as usual.

There is still a certain amount of choice about using Facebook, Google-Android, various apps, and smart products, such as Samsung’s smart TV, all of which can be hacked. And while it takes effort, to some extent you can protect yourself and your privacy.

But even Facebook and Google’s efforts to dominate, like the power dreams of every despot, politico, religious zealot, and military organization, pale in comparison to the future Amazon sees for itself.

Amazon’s incredible, sophisticated systems are no longer being used just to serve up good deals, fast delivery times, or cheap web storage. Its big data capabilities are now the tool of police forces, and maybe soon the military. In the corporate world, Amazon is positioning itself to be the “brains” behind just about everything.

Add to that Amazon’s belief that it bears no responsibility for how its tech is used.

Rekognition, Amazon’s facial recognition software, is a good example.

Civil rights groups have called it “perhaps the most dangerous surveillance technology ever developed”, and called for Amazon to stop selling it to government agencies, particularly police forces. City supervisors in San Francisco banned its use, saying the software is not only intrusive, but biased – it’s better at recognising white people than black and Asian people. (…) Werner Vogels, Amazon’s CTO, doesn’t feel it’s Amazon’s responsibility to make sure Rekognition is used accurately or ethically.

In one form or another, “with great power comes great responsibility” has been a byword from the Bible down through the ages to Spider-Man.

When a company wields the power to bring the modern world to its knees, one can only hope it will take that to heart.

Image credit: judon / aparagraph.com

Tech is Full of Isht

Tuesday, June 18th, 2019

From Maciej Cegłowski’s (a SF white hat techie) blog:

Writing in the New York Times last month, Google CEO Sundar Pichai argued that it is “vital for companies to give people clear, individual choices around how their data is used.” Like all Times opinion pieces, his editorial included multiple Google tracking scripts served without the reader’s knowledge or consent. Had he wanted to, Mr. Pichai could have learned down to the second when a particular reader had read his assurance that Google “stayed focused on the products and features that make privacy a reality.”

Writing in a similar vein in the Washington Post this March, Facebook CEO Mark Zuckerberg called for Congress to pass privacy laws modeled on the European General Data Protection Regulation (GDPR). That editorial was served to readers with a similar bouquet of non-consensual tracking scripts that violated both the letter and spirit of the law Mr. Zuckerberg wants Congress to enact.

TOS for new apps aren’t improving (paywall). Consider this from Ovia, an app women use to track their pregnancy.

An Ovia spokeswoman said the company does not sell aggregate data for advertising purposes. But women who use Ovia must consent to its 6,000-word “terms of use,” which grant the company a “royalty-free, perpetual, and irrevocable license, throughout the universe” to “utilize and exploit” their de-identified personal information for scientific research and “external and internal marketing purposes.” Ovia may also “sell, lease or lend aggregated Personal Information to third parties,” the document adds.

Good grief. As any search will tell you, “de-identified” is a joke, since it’s no big deal to put a name to so-called anonymous data.
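The re-identification trick is simple: join the “anonymous” records against any public dataset that shares a few quasi-identifiers, such as ZIP code, birth year, and sex. A minimal sketch with entirely fabricated data:

```python
# Hypothetical illustration: "de-identified" records re-identified by
# joining them with a public record on shared quasi-identifiers.
health = [  # names removed, so nominally "anonymous"
    {"zip": "02138", "birth_year": 1945, "sex": "F", "diagnosis": "flu"},
    {"zip": "94107", "birth_year": 1982, "sex": "M", "diagnosis": "asthma"},
]
voters = [  # public voter roll, names included
    {"name": "Jane Doe", "zip": "02138", "birth_year": 1945, "sex": "F"},
    {"name": "John Roe", "zip": "94107", "birth_year": 1982, "sex": "M"},
]

def reidentify(anon, public, keys=("zip", "birth_year", "sex")):
    """Attach names back to 'anonymous' rows via a join on quasi-identifiers."""
    index = {tuple(p[k] for k in keys): p["name"] for p in public}
    return [{**row, "name": index.get(tuple(row[k] for k in keys))}
            for row in anon]

matched = reidentify(health, voters)
# Jane Doe's "anonymous" flu record now has her name on it again.
```

With just those three fields, a large fraction of people are unique, which is why stripping names alone buys essentially no privacy.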

By now you should know that tech talks privacy, but walks data collection.

That means it’s up to you to do what you can, starting with always adjusting all default privacy settings.

 

Golden Oldies: Entrepreneurs: Tech vs. Responsibility And Accountability

Monday, June 17th, 2019

Poking through 13+ years of posts I find information that’s as useful now as when it was written.

Golden Oldies is a collection of the most relevant and timeless posts during that time.

This post and the quote from the FTC date back to 2015. Nothing on the government side has changed; the Feds are still investigating and Congress is still talking. And as we saw in last week’s posts, the company executives are more arrogant and their actions are much worse. One can only hope that the US government will follow in the footsteps of European countries and rein them in.

Read other Golden Oldies here.

Entrepreneurs are notorious for ignoring security — black hat hackers are a myth — until something bad happens, which, sooner or later, always does.

They go their merry way, tying all manner of things to the internet, even contraceptives and cars, and inventing search engines like Shodan to find them, with nary a thought or worry about hacking.

Concerns are pooh-poohed by the digerati and those voicing them are considered Luddites, anti-progress or worse.

Now Edith Ramirez, chairwoman of the Federal Trade Commission, voiced those concerns at CES, the biggest Internet of Things showcase.

“Any device that is connected to the Internet is at risk of being hijacked,” said Ms. Ramirez, who added that the large number of Internet-connected devices would “increase the number of access points” for hackers.

Interesting when you think about the millions of baby monitors, fitness trackers, glucose monitors, thermostats and dozens of other common items available and the hundreds being dreamed up daily by both startups and enterprise.

She also confronted tech’s self-serving attitude (led by Google and Facebook) toward collecting and keeping huge amounts of personal data that was (supposedly) the basis of future innovation.

“I question the notion that we must put sensitive consumer data at risk on the off chance a company might someday discover a valuable use for the information.”

At least someone in a responsible position has finally voiced these concerns — but whether or not she can do anything against tech’s growing political clout/money/lobbying power remains to be seen.

Image credit: centralasian

Bad Boys Facebook and Google

Wednesday, June 12th, 2019

https://www.flickr.com/photos/mysign_ch/8527753874/

You’d have to be living on another planet not to be aware of the isht pulled by Facebook. Where do I start?

With the fact that Facebook is getting fined for storing millions of passwords in plain text or that they “unintentionally” uploaded a million and a half new member email contacts? Or that user data, such as friends, relationships and photos, was used to reward partners and fight rivals? Or might it bother you more to know that your posts, photos, updates, etc., whether public or private, are labeled and categorized by hand by outsourced workers in India? Nastier is Facebook sharing/selling your data to cell phone carriers.

Offered to select Facebook partners, the data includes not just technical information about Facebook members’ devices and use of Wi-Fi and cellular networks, but also their past locations, interests, and even their social groups. This data is sourced not just from the company’s main iOS and Android apps, but from Instagram and Messenger as well. The data has been used by Facebook partners to assess their standing against competitors, including customers lost to and won from them, but also for more controversial uses like racially targeted ads.

Facebook owns Instagram, so it should come as no surprise that the private phone numbers and email addresses of millions of celebrities and influencers were scraped by a partner company.

Then there is Google, which dumps location data from millions of devices, not just Android, into a database called Sensorvault and makes it available for search to law enforcement, among others. On May 7 Google claimed it had found privacy religion, but CNBC reported that Gmail tracks and saves every digital receipt, not just for goods, but for services and, of course, for purchases from Amazon. Enterprise G Suite customers don’t fare much better: their user passwords were kept unencrypted on an internal server for years. Not hacked, but still…

YouTube is in constant trouble for the way it interprets its constantly changing Terms Of Service.

The lists for all of them go on and on.

The European Union is far ahead of the US in terms of privacy, anticompetitive actions, etc., but US consumers are finally waking up. So-called Big Tech is no longer popular politically, and the Justice Department is opening an antitrust investigation of Google (Europe already fined it nearly $3 billion in 2017 for anticompetitive actions).

Can Facebook be far behind?

A bit more next week.

Image credit: MySign AG

The Doings of Amazon and Apple

Tuesday, June 11th, 2019

https://www.flickr.com/photos/mysign_ch/8527753874/

As promised yesterday, I’m updating the “don’t trust them, they lie” list (in mostly alphabetical order) with new links to the nefarious doings of your favorite “can’t live without ‘em” companies.

First up: Amazon. Anyone who has bought from Amazon is aware of how it uses your buying data to suggest additional purchases, as do all ecommerce sites. And there have been multiple stories about Alexa listening and responding even when it’s supposedly not on. But did you know that those supposedly anonymous recordings are discussed for amusement in Amazon employee chatrooms?

On a far more serious note, Ring, the video doorbell company Amazon acquired, is teaming up with police departments to offer free or discounted smart doorbells. And although it supposedly goes against Ring’s own policy, some of those PDs are adding to the terms of service the right to look at the saved video footage sans subpoena.

Sadly, Apple is on the nefarious list, in spite of its famous “What happens on your iPhone stays on your iPhone” philosophy. But, as with other companies, the facts are more complicated — the thieves are in the apps.

More tomorrow.

Image credit: MySign AG

Golden Oldies: You the Product

Monday, June 10th, 2019

https://www.flickr.com/photos/8693667@N05/4617735784/

Poking through 13+ years of posts I find information that’s as useful now as when it was written.

Golden Oldies is a collection of the most relevant and timeless posts during that time.

For years I’ve written about the lie/cheat/steal attitude of sites such as Facebook, Google, and Amazon; the list goes on and on. This post is only a year old, but I thought it could use some updating. What I can tell you today is that nothing has improved; in fact, it has gotten much worse, as you’ll see over the next two days.

Read other Golden Oldies here.

Have you ever been to a post-holiday potluck? As the name implies, it’s held within two days of any holiday that involves food, with a capital F, such as Thanksgiving, Christmas and, of course, Easter. Our group has only three rules: the food must be leftovers, the conversation must be interesting, and phones must be turned off. They are always great parties, with amazing food, and Monday’s was no exception.

The unexpected happened when a few of them came down on me for a recent post terming Mark Zuckerberg a hypocrite. They said that it wasn’t Facebook’s or Google’s fault that a few bad actors were abusing the sites and causing problems. They went on to say that the companies were doing their best and that I should cut them some slack.

Rather than arguing my personal opinions I said I would provide some third party info that I couldn’t quote off the top of my head and then whoever was interested could get together and argue the subject over a bottle or two of wine.

I did ask them to think about one item that stuck in my mind.

How quickly would they provide the location and routine of their kids to the world at large and the perverts who inhabit it? That’s exactly what GPS-tagged photos do.

I thought the info would be of interest to other readers, so I’m sharing it here.

Facebook actively facilitates scammers.

The Berlin conference was hosted by an online forum called Stack That Money, but a newcomer could be forgiven for wondering if it was somehow sponsored by Facebook Inc. Saleswomen from the company held court onstage, introducing speakers and moderating panel discussions. After the show, Facebook representatives flew to Ibiza on a plane rented by Stack That Money to party with some of the top affiliates.

Granted anonymity, affiliates were happy to detail their tricks. They told me that Facebook had revolutionized scamming. The company built tools with its trove of user data (…) Affiliates hijacked them. Facebook’s targeting algorithm is so powerful, they said, they don’t need to identify suckers themselves—Facebook does it automatically. And they boasted that Russia’s dezinformatsiya agents were using tactics their community had pioneered.

Scraping Android.

Android owners were displeased to discover that Facebook had been scraping their text-message and phone-call metadata, in some cases for years, an operation hidden in the fine print of a user agreement clause until Ars Technica reported it. Facebook was quick to defend the practice as entirely aboveboard—small comfort to those who are beginning to realize that, because Facebook is a free service, they and their data are by necessity the products.

I’m not just picking on Facebook; Amazon and Google are right there with it.

Digital eavesdropping

Amazon and Google, the leading sellers of such devices, say the assistants record and process audio only after users trigger them by pushing a button or uttering a phrase like “Hey, Alexa” or “O.K., Google.” But each company has filed patent applications, many of them still under consideration, that outline an array of possibilities for how devices like these could monitor more of what users say and do. That information could then be used to identify a person’s desires or interests, which could be mined for ads and product recommendations. (…) Facebook, in fact, had planned to unveil its new internet-connected home products at a developer conference in May, according to Bloomberg News, which reported that the company had scuttled that idea partly in response to the recent fallout.

Zuckerberg’s ego knows no bounds.

Zuckerberg, positioning himself as the benevolent ruler of a state-like entity, counters that everything is going to be fine—because ultimately he controls Facebook.

There are dozens more, but you can use search as well as I.

What can you do?

Thank Firefox for a simple containerized solution to Facebook’s tracking (stalking) you while surfing.

Facebook is (supposedly) making it easier to manage your privacy settings.

There are additional things you can do.

How to delete Facebook, but save your content.

The bad news is that even if you are willing to spend the effort, you can’t really delete yourself from social media.

All this has caused a rupture in techdom.

I could go on almost forever, but if you’re interested you’ll have no trouble finding more.

Image credit: weisunc

Progressive Walk Doesn’t Follow Talk

Tuesday, May 14th, 2019

I used the following quote in a post about ego taking over startup founders.

Star CEOs grow dangerous when they see their success as destiny, their place at the head of the pack as the only path possible, rendering all of their choices justified. —Zachary First, managing director of The Drucker Institute (from a 2013 Fortune article; link dead)

Obviously, it’s not only founders, but, just five years later, would you expect it to apply to so-called progressive managers?

It does, with a vengeance.

The (unfortunately) best (worst?) example comes from the Southern Poverty Law Center.

The most egregious recent example of this troubling type appears to exist in Morris Dees, 82, co-founder and the powerful former head of the Southern Poverty Law Center (SPLC) in Montgomery, Alabama. He was removed from that post in March, following allegations of workplace misconduct. Specifically, the leader of the SPLC, known for its doggedness identifying and winning court cases against vile hate groups, was accused of racism and multiple counts of sexual harassment.

Dees’ fall shocked everyone, except the people who had worked closely with him, according to a recent New Yorker essay by journalist Bob Moser, who worked at the SPLC for a few years in the early 2000s. The organization known as a “beacon of justice” as he writes, was in fact what another one of its former writers called a “virtual buffet of injustices.” Employees worked within a two-tiered system: People of color were hired for support roles, while the higher-paid leaders, lawyers, writers, and fundraisers were “almost exclusively white.”

None of us like our heroes to have feet of clay, but it is easy to slip into an “I’m doing good in my world, therefore I am good and can do no wrong” mindset.

In other words, if I’m fighting them, I’m not acting like them and shouldn’t be compared to them.

Years ago someone my crowd thought of as a good friend stole my credit card and jewelry and another guy’s car, etc. When he was caught he told the judge that, since he had done good for us, his stealing was no big deal.

Doing good is not a vaccine.

I recently wrote about the importance of objectivity; using it on yourself can help you avoid the “do as I say, not as I do” trap.

Weekly, take a hard look at your own actions and compare them objectively to those of someone on the philosophically opposite side.

Any similarities should serve as a warning.

Do something about them immediately.

Image credit: Anders Sandberg
