
Identification is not an ethical principle but a dirty trick!

Since I started working on the Laila project, an artificial intelligence ecosystem that supports a conversational agent dedicated to businesses, I have learned that AI offers immense opportunities for development and innovation, but most of the companies that claim to use it are unfortunately practicing little more than a marketing strategy.

The viral video of Google Duplex

As soon as it was introduced to the world, Google Duplex attracted the attention of the web. Presented at Google I/O 2018, this Artificial Intelligence-based technology impersonates a human being who, on behalf of its user, books an appointment at the hairdresser and a table at a restaurant, sustaining a rather fluent telephone conversation with both businesses.

The video demo looks real, even if for many it is a fake. The conversations, which appear quite complex, if not artfully assembled, are certainly the result of many attempts and a bit of luck: behind two successful phone calls we do not know how many other Google Duplex calls may have failed miserably.

The effect of the presentation reminds me of the buzz Amazon generated in 2013 when, with a video on YouTube, it presented Amazon Prime Air, its innovative drone-based delivery system, to the world; after many years, this revolutionary system is still in the experimental phase, and we mere mortals have gotten used to the idea that all these science-fiction gimmicks are, in the end, just a new way to make viral videos.

What struck me about Google Duplex is the little theater assembled around an alleged ethical question, raised just a few hours after the video was published and immediately amplified by the most important online newspapers. In short, according to some: AIs that simulate human behavior pose an ethical problem if they disguise their nature and pretend to be human.

A snapshot of Google’s response: “Duplex will be immediately identifiable to your interlocutor.”

Let’s take a step back

Google has always accustomed us to using its technologies without ever revealing how they are made or how they work. Its search engine is representative of this philosophy: the underlying algorithm is a trade secret of inestimable value, and no one claims to be able to unravel it. Precisely because of its unfathomable nature, we have become accustomed to not questioning its effectiveness, whether real or presumed. We just use it.

It does not matter if the number of results changes as we move from one page of a search to the next, or if something emerges from our search that we would never have thought to associate with our intent.

Yet these small failures appear to our eyes as mere imperfections, small flaws in a system so sophisticated that we sometimes wonder whether it was Google that was right and we who were wrong.

Take, for example, Google Search Suggest, the auto-completion system with which Google suggests what to search for while we are still typing. This system, which for me raises far more ethical problems than Google Duplex, appears useful and intelligent, yet behind its effectiveness it hides a real hacker’s trick: the Google search engine works on “keywords”, groups of words that represent the search intent of its users. Each new keyword is the expression of a need and requires specific processing for Google to be able to respond adequately. Google has immense computing power, yet even today it is impractical to imagine organizing its engine’s results for “any possible combination of words”.

Google Search Suggest raises far more ethical issues than Google Duplex.

For this reason, Google pays attention only to keywords that have been expressed enough times to represent a widespread need. For everything else, it improvises: similar words, similarities with other keywords, and the identification of loosely related texts are the systems applied to deal with situations that would otherwise have no way out.

The number of results, like the number of pages, changes as you move forward through the search pages, and you never manage to inspect the results beyond a certain page. Let’s be clear: the results on page 30 are of no use to anyone, but why claim that the keyword “printed stems” has 160,000 results if fewer than 300 can actually be viewed?

Google Search Suggest is the way Google tries to “anticipate” us: by suggesting search intents it already knows, Google tries to guide us toward a search it can answer without tricks. By convincing the user to adopt a known keyword, Google not only returns a useful result but also spares itself from having to add a new keyword to the list of those on which it will have to spend some of its computing power.
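To make the mechanism described above concrete, here is a minimal sketch in Python of how a frequency-thresholded suggestion system of this kind could work. The threshold, the query log and every name in the snippet are my own illustrative assumptions, not a description of Google’s actual implementation.

```python
# Minimal sketch of a frequency-thresholded autocomplete over a simple
# in-memory query log. Names and thresholds are illustrative assumptions.
from collections import Counter

POPULARITY_THRESHOLD = 1000  # hypothetical: searches needed before a keyword is "known"

class SuggestSketch:
    def __init__(self, query_log):
        # Count how many times each exact keyword has been searched.
        self.counts = Counter(query_log)
        # Only keywords searched often enough get dedicated, precomputed results.
        self.known_keywords = {
            q for q, n in self.counts.items() if n >= POPULARITY_THRESHOLD
        }

    def suggest(self, prefix):
        # Steer the user toward known keywords: suggest only queries that
        # start with the typed prefix AND are already popular enough.
        return sorted(
            (q for q in self.known_keywords if q.startswith(prefix)),
            key=lambda q: -self.counts[q],
        )[:5]

    def handle_query(self, query):
        if query in self.known_keywords:
            return f"precomputed results for '{query}'"
        # Fallback for rare keywords: improvise by reusing the known keyword
        # that shares the most words with the query.
        closest = max(
            self.known_keywords,
            key=lambda q: len(set(q.split()) & set(query.split())),
            default=None,
        )
        return f"improvised results, reusing '{closest}'" if closest else "no results"
```

Read this way, every suggestion the user accepts is one less “new” keyword to process from scratch: precisely the saving the article attributes to Search Suggest.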

Operational-functional indulgence

Returning to Google Duplex: it is 2022 and we have not heard about it for some time, but from this experience we have learned that identifying oneself as an artificial conversational system is not the answer to an ethical problem but a hacker’s trick: whoever converses with Duplex, knowing they are talking to an automated system, also knows they must behave accordingly, supporting it in the conversation and helping it understand the content.

For Google Duplex, identifying itself as an artificial system is not the answer to an ethical problem but a hacker’s trick.

When we talk to Cortana, Alexa or Siri, we automatically use the same expressions, the same formulas, because this is essential if we do not want to be misunderstood by our systems.

For Google Duplex, identification is a way to obtain people’s “understanding” of its limits, a form of operational-functional indulgence that all of us humans have learned to grant to a technology which, however hard it tries, does not deliver everything its creators promise.

BlogofInnovation.com
