
Worldcoins for Your Eyes: How Sam Altman is Saving us from the Robots

Sam Altman wants to save us from the AI-dominated world he is building. The trouble is, governments aren’t buying his plan, which involves an attempt to scan the eyeballs of every person on Earth and pay them with his own cryptocurrency, Worldcoin. But Worldcoin has come under assault from authorities over its mission. It has been raided in Hong Kong, blocked in Spain, fined in Argentina and criminally investigated in Kenya. A ruling looms on whether it can keep operating in the European Union. … Among the concerns: How does the Cayman Islands-registered Worldcoin Foundation handle user data, train its algorithms and avoid scanning children?

Worldcoin verifies “humanness” by scanning irises using a basketball-sized chrome device called the Orb. Worldcoin says irises, which are complex and relatively unchanging in adults, can better distinguish humans than fingerprints or faces. Users receive immutable codes held in an online “World ID” passport, to use on other platforms to prove they are human, plus payouts in Worldcoin’s WLD cryptocurrency. Worldcoin launched in 2023 and says it has verified more than six million people across almost 40 countries. Based on recent trading prices, the total pool of WLD is theoretically worth some $15 billion.

Altman says his technology is completely private: Orbs delete all images after verification, and iris codes contain no personal information—unless users permit Worldcoin to train its algorithms with their scans. Encrypted servers hold the anonymized codes and images. However, several authorities have accused Worldcoin of telling Orb operators, typically independent contractors, to encourage users to hand over iris images. Privacy advocates say these could be used to build a global biometric database with little oversight.
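The privacy claim above rests on a general idea: a code derived from biometric data can identify a person without containing recoverable personal information. Worldcoin's actual pipeline is not public, but the principle can be illustrated with a salted cryptographic hash. This is a minimal sketch, not Worldcoin's method, and every name and value in it is hypothetical:

```python
import hashlib

def anonymized_code(template: bytes, salt: bytes) -> str:
    """Derive a stable, irreversible identifier from a biometric template.

    A salted SHA-256 digest is deterministic (the same template always
    yields the same code) but cannot be inverted to recover the
    template bytes themselves.
    """
    return hashlib.sha256(salt + template).hexdigest()

# Hypothetical iris templates (real iris codes are long bit vectors).
alice = b"\x01\x02\x03\x04" * 8
bob = b"\x05\x06\x07\x08" * 8
salt = b"demo-salt"  # illustrative value only

code_a = anonymized_code(alice, salt)
code_b = anonymized_code(bob, salt)

assert code_a == anonymized_code(alice, salt)  # stable for one person
assert code_a != code_b                        # distinct across people
```

One caveat on the sketch: real iris recognition must tolerate scan-to-scan noise, so production systems compare iris codes by similarity (e.g., Hamming distance) rather than by exact hashes; the hash here only illustrates why a derived code need not expose the underlying biometric data.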

Excerpt from Angus Berwick, Sam Altman’s Worldcoin Is Battling With Governments Over Your Eyes, WSJ, Aug. 18, 2024

The Uses and Abuses of Alexa

Excerpts from an interview with Joseph Turow, author of “The Voice Catchers: How Marketers Listen In to Exploit Your Feelings, Your Privacy, and Your Wallet,” published in the Pennsylvania Gazette, July 2021

There is an emerging industry deploying immense resources and breakthrough technologies based on the idea that voice is a biometric: a part of your body that those in the industry believe can be used to identify and evaluate you instantly and permanently. Most of the focus in voice profiling technology today is on emotion, sentiment, and personality. But experts tell me it is scientifically possible to tell a person’s height, weight, and race from their voice, and even to detect some diseases. There are actually companies now trying to assess, for example, whether you have Alzheimer’s based upon your voice…

The issue is that this new voice intelligence industry—run by companies you know, such as Amazon and Google, and some you don’t, such as NICE and Verint—is sweeping across society, yet there is little public discussion about the implications. The need for this conversation becomes especially urgent when we consider the long-term harms that could result if voice profiling and surveillance technologies are used not only for commercial marketing purposes, but also by political marketers and governments, to say nothing of hackers stealing data.

There are hundreds of millions of smart speakers out there, and far more phones with assistants, listening to you and capturing your voice. Voice technology already permeates virtually every important area of personal interaction—as assistants on your phone and in your car, in smart speakers at home, in hotels, schools, even stores instead of salespeople. 

Amazon and Google hold several patents centering on voice profiling that describe a rich future for the practice…But consider the downside: we could be denied loans, have to pay much more for insurance, or be turned away from jobs, all on the basis of physiological characteristics and linguistic patterns that may not reflect what marketers believe they reflect.

The first thing to realize is that voice assistants are not our friends no matter how friendly they sound. I argue, in fact, that voice profiling marks a red line for society that shouldn’t be crossed.