Articles

Identification is not an ethical principle but a dirty trick!

Since I started working on Laila, an artificial-intelligence ecosystem supporting a conversational agent dedicated to businesses, I have learned that AI offers immense opportunities for development and innovation, but that most of the companies claiming to use it are, unfortunately, practicing little more than a marketing strategy.

The viral video of Google Duplex

As soon as it was introduced to the world, Google Duplex attracted the attention of the web. Presented at Google I/O 2018, this technology based on Artificial Intelligence manages to impersonate a human being on behalf of its user, booking an appointment at the hairdresser and a table at a restaurant while sustaining a rather fluid telephone conversation with both businesses.

The video demo looks real, even if many consider it fake. The conversations, which appear quite complex, are certainly the result of many attempts and a bit of luck, if they were not artfully assembled: behind two successful phone calls, we do not know how many other Duplex calls may have failed miserably.

The effect of the presentation reminds me of the splash Amazon made in 2013 when, with a video on YouTube, it presented Amazon Prime Air, its innovative drone-based delivery system, to the world. Many years later, that revolutionary system is still in the experimental phase, and we mere mortals have gotten used to the idea that "all" these science-fiction stunts are, in the end, just a new way to make viral videos.

What struck me about Google Duplex was the little theater staged around an alleged ethical question, raised just a few hours after the video was published and immediately relaunched by the most important online newspapers. In short, according to some: AIs that simulate human behavior present an ethical problem if they disguise their nature and pretend to be human.

Snapshot of Google’s response: “Duplex will be immediately identifiable to your interlocutor.”

Let’s take a step back

Google has always accustomed us to using its technologies without ever revealing how they are made or how they work. Its search engine embodies this philosophy: the underlying algorithm is an industrial secret of inestimable value, and no one claims to be able to uncover it. Precisely because of its unfathomable nature, we have become accustomed to not questioning its effectiveness, whether real or presumed. We just use it.

It does not matter if the number of results changes as we move from one page of a search to the next, or if our search surfaces something we never thought we would associate with our intent.

Yet these small failures appear to our eyes as mere imperfections, small flaws in a system so sophisticated that we sometimes suspect it was Google who was right and we who were wrong.

Take, for example, Google Search Suggest, the auto-completion system with which Google suggests what to search for while we are still typing. This system, which to me raises far more ethical problems than Google Duplex, appears useful and intelligent, yet behind its effectiveness hides a true hacker's trick. The Google search engine works on "keywords", groups of words that represent the search intent of its users. Each new keyword is the expression of a need and requires specific processing in order for Google to respond adequately. Google has immense computing power, yet even today it is impractical to imagine organizing the engine's results according to "any possible combination of words".

Google Search Suggest raises far more ethical issues than Google Duplex.

For this reason, Google pays attention only to keywords that have been expressed enough times to represent a widespread need. For everything else, it improvises: similar words, similarities with other keywords, and matching against random text are the systems it applies to handle situations that would otherwise have no way out.

The number of results, like the number of pages, changes as you move forward through the search pages, and you never manage to inspect the results beyond a certain point. Let's be clear: the results on page 30 are of no use to anyone, but why claim that the keyword "printed stems" has 160 results if fewer than 300 can be viewed?

Google Search Suggest is the way Google tries to "anticipate us": by suggesting search intents it already knows, Google guides us towards a search it can answer without tricks. By convincing the user to use a known keyword, Google not only returns a useful result but also spares itself from adding a new keyword to the list of those on which it will have to spend some of its computing power.
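The mechanism described above, answering only from a finite list of already-processed keywords, can be sketched as a frequency-ranked prefix lookup. This is purely an illustrative assumption (the keywords, counts, and function names below are invented), not Google's actual implementation:

```python
# Hypothetical sketch: autocomplete over a fixed list of known, popular
# keywords, ranked by how often each one has been searched.
popular_keywords = [
    ("google duplex demo", 9800),
    ("google duplex ethics", 4100),
    ("google io 2018", 12000),
    ("google search suggest", 7600),
]

def suggest(prefix, keywords, limit=3):
    """Return the most-searched known keywords starting with `prefix`."""
    # Keep only keywords the engine has already processed for this prefix.
    matches = [(kw, freq) for kw, freq in keywords if kw.startswith(prefix)]
    # Most frequently searched suggestions come first.
    matches.sort(key=lambda pair: -pair[1])
    return [kw for kw, _ in matches[:limit]]

print(suggest("google d", popular_keywords))
# ['google duplex demo', 'google duplex ethics']
```

A prefix that matches no known keyword simply returns nothing, which is exactly the situation where, as described above, the engine has to improvise.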

Operational-functional indulgence

Returning to Google Duplex: it is now 2022 and we have not talked about it for some time, but from this experience we have learned that identifying oneself as an artificial conversational system is not the answer to an ethical problem but a hacker's trick. If those who converse with Duplex know they are talking to an automated system, they also know they must behave accordingly, supporting it in the conversation and helping it understand the content.

For Google Duplex, identifying as an artificial system is not the answer to an ethical problem but a hacker trick.

When we talk to Cortana, Alexa, or Siri, we automatically use the same expressions and the same formulas, because this is essential if we want to be understood by our systems.

For Google Duplex, identification is a way to obtain people's "understanding" of its limits, a form of operational-functional indulgence that all of us humans have learned to extend towards technology that, for all its effort, does not deliver everything its creators promise.

BlogofInnovation.com



