Tag Archives: internet fingerprint

Who Owns Your Voice? Grabbing Biometric Data

Increasingly sophisticated technology that detects nuances in sound inaudible to humans is capturing clues about people’s likely locations, medical conditions and even physical features. Law-enforcement agencies are turning to those clues from the human voice to help sketch the faces of suspects. Banks are using them to catch scammers trying to imitate their customers on the phone, and doctors are using such data to detect the onset of dementia or depression. That has… raised fresh privacy concerns, as consumers’ biometric data is harnessed in novel ways.

“People have known that voice carries information for centuries,” said Rita Singh, a voice and machine-learning researcher at Carnegie Mellon University who receives funding from the Department of Homeland Security… Ms. Singh measures dozens of voice-quality features—such as raspiness or tremor—that relate to the inside of a person’s vocal tract and how an individual voice is produced. She detects so-called microvolumes of air that help create the sound waves that make up the human voice. The way they resonate in the vocal tract, along with other voice characteristics, provides clues to a person’s skull structure, height, weight and physical surroundings, she said.
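Singh’s actual models are not public, but the kind of voice-quality measurement described above can be illustrated with a toy sketch: estimate the fundamental frequency of short audio frames by autocorrelation, then compute jitter (cycle-to-cycle period variation), a standard rough proxy for tremor or raspiness. All signal values and thresholds below are invented for illustration.

```python
import numpy as np

def estimate_pitch(frame, sr, fmin=50.0, fmax=500.0):
    """Estimate a frame's fundamental frequency (Hz) via autocorrelation."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + np.argmax(ac[lo:hi])          # strongest periodicity in range
    return sr / lag

def jitter(frames, sr):
    """Cycle-to-cycle period variation: a rough tremor/raspiness proxy."""
    periods = 1.0 / np.array([estimate_pitch(f, sr) for f in frames])
    return float(np.mean(np.abs(np.diff(periods))) / np.mean(periods))

# Synthetic demo: a steady 200 Hz tone shows ~200 Hz pitch and near-zero jitter.
sr = 16000
t = np.arange(sr // 10) / sr                 # 100 ms frames
frames = [np.sin(2 * np.pi * 200.0 * t) for _ in range(5)]
print(round(estimate_pitch(frames[0], sr)))  # 200
print(jitter(frames, sr) < 0.01)             # True
```

A real system would measure many such features at once; this shows only the general shape of the computation.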

Nuance’s voice-biometric and recognition software is designed to detect the gender, age and linguistic background of callers and whether a voice is synthetic or recorded. It helped one bank determine that a single person was responsible for tens of millions of dollars of theft, or 18% of the fraud the firm encountered in a year, said Brett Beranek, general manager of Nuance’s security and biometrics business.
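Nuance’s pipeline is proprietary, but the linking step that catches one fraudster across many calls can be sketched generically: reduce each call to a fixed-length voiceprint embedding, then group calls whose embeddings are close under cosine similarity. The embeddings, dimensions, and the 0.8 threshold below are illustrative stand-ins, not Nuance’s values.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two voiceprint embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def link_calls(embeddings, threshold=0.8):
    """Greedy grouping: a call joins the first group whose anchor
    voiceprint it resembles; otherwise it starts a new group."""
    groups = []
    for emb in embeddings:
        for g in groups:
            if cosine(emb, g[0]) >= threshold:
                g.append(emb)
                break
        else:
            groups.append([emb])
    return groups

# Simulated voiceprints: three calls from speaker A (small per-call
# variation), one call from speaker B.
rng = np.random.default_rng(0)
speaker_a = rng.normal(size=64)
speaker_b = rng.normal(size=64)
calls = [speaker_a + 0.1 * rng.normal(size=64) for _ in range(3)]
calls.append(speaker_b)
groups = link_calls(calls)
print(len(groups), [len(g) for g in groups])  # 2 [3, 1]
```

Three calls collapse to one identity, which is how a bank could attribute many fraud attempts to a single person.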

Audio data from customer-service calls is also combined with information on how consumers typically interact with mobile apps and devices, said Howard Edelstein, chairman of behavioral biometric company Biocatch. The company can detect the cadence and pressure of swipes and taps on a smartphone. How a person holds a smartphone gives clues about their age, for example, allowing a financial firm to compare the age of the normal account user to the age of the caller…
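Biocatch’s methods are not disclosed; a minimal illustration of the idea — enroll a user’s habitual tap cadence, then score how far a new session deviates from it — might look like this. All timings and thresholds in the example are invented.

```python
import statistics

def cadence_profile(intervals):
    """Enroll a user's typical inter-tap timing (mean and spread, in ms)."""
    return statistics.mean(intervals), statistics.stdev(intervals)

def anomaly_score(profile, observed):
    """How many standard deviations the session's mean cadence sits
    from the enrolled user's norm."""
    mean, sd = profile
    return abs(statistics.mean(observed) - mean) / sd

# Enroll on past sessions; flag a session whose rhythm is far off.
enrolled = cadence_profile([180, 200, 190, 210, 195, 185])  # habitual taps
same_user = anomaly_score(enrolled, [188, 202, 197])
imposter = anomaly_score(enrolled, [420, 460, 440])         # much slower taps
print(same_user < 1.0, imposter > 3.0)  # True True
```

A production system would fuse many such signals (pressure, swipe arc, device tilt); the scoring logic is the same in spirit.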

If such data collected by a company were improperly sold or hacked, some fear recovering from identity theft could be even harder because physical features are innate and irreplaceable.

Sarah Krouse, What Your Voice Reveals About You, WSJ, Aug. 13, 2019

Why a Dumb Internet is Best

Functional splintering [of the internet] is already happening. When tech companies build “walled gardens”, they decide the rules for what happens inside the walls, and users outside the network are excluded…

Governments are playing catch-up but they will eventually reclaim the regulatory power that has slipped from their grasp. Dictatorships such as China retained control from the start; others, including Russia, are following Beijing. With democracies, too, asserting their jurisdiction over the digital economy, a fragmentation of the internet along national lines is more likely. …The prospect of a “splinternet” has not been lost on governments. To avoid it, Japan’s G20 presidency has pushed for a shared approach to internet governance. In January 2019, prime minister Shinzo Abe called for “data free flow with trust”. The 2019 Osaka summit pledged international co-operation to “encourage the interoperability of different frameworks”.

But Europe is most in the crosshairs of those who warn against fragmentation…US tech giants have not appreciated EU authorities challenging their business model through privacy laws or competition rulings. But more objective commentators, too, fear the EU may cut itself off from the global digital economy. The critics fail to recognise that fragmentation can be the best outcome if values and tastes fundamentally differ…

If Europeans collectively do not want micro-targeted advertising, or artificial intelligence-powered behaviour manipulation, or excessive data collection, then the absence on a European internet of services using such techniques is a gain, not a loss. The price could be to miss out on some services available elsewhere… More probably, non-EU providers will eventually find a way to charge EU users in lieu of monetising their data…Some fear EU rules make it hard to collect the big data sets needed for AI training. But the same point applies. EU consumers may not want AI trained to do intrusive things. In any case, Europe is a big enough market to generate stripped, non-personal data needed for dumber but more tolerable AI, though this may require more harmonised within-EU digital governance. Indeed, even if stricter EU rules splinter the global internet, they also create incentives for more investment into EU-tailored digital products. In the absence of global regulatory agreements, that is a good second best for Europe to aim for.

Excerpts from Martin Sandbu, Europe Should Not be Afraid of Splinternet, FT, July 2, 2019

The Internet Was Never Open

Rarely has a manifesto been so wrong. “A Declaration of the Independence of Cyberspace”, written 20 years ago by John Perry Barlow, a digital civil-libertarian, begins thus: “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.”

At the turn of the century, it seemed as though this techno-Utopian vision of the world could indeed be a reality. It didn’t last… Autocratic governments around the world…have invested in online-surveillance gear. Filtering systems restrict access: to porn in Britain, to Facebook and Google in China, to dissent in Russia.

Competing operating systems and networks offer inducements to keep their users within the fold, consolidating their power. Their algorithms personalise the web so that no two people get the same search results or social media feeds, betraying the idea of a digital commons. Five companies account for nearly two-thirds of revenue from advertising, the dominant business model of the web.

The open internet accounts for barely 20% of the entire web. The rest of it is hidden away in unsearchable “walled gardens” such as Facebook, whose algorithms are opaque, or on the “dark web”, a shady parallel world wide web. Data gathered from the activities of internet users are being concentrated in fewer hands. And big hands they are too. BCG, a consultancy, reckons that the internet will account for 5.3% of GDP of the world’s 20 big economies this year, or $4.2 trillion.

How did this come to pass? The simple reply is that the free, open, democratic internet dreamed up by the optimists of Silicon Valley was never more than a brief interlude. The more nuanced answer is that the open internet never really existed.

[T]he internet was developed “by the US military to serve US military purposes”… The decentralised, packet-based system of communication that forms the basis of the internet originated in America’s need to withstand a massive attack on its soil. Even the much-ballyhooed Silicon Valley model of venture capital as a way to place bets on risky new businesses has military origins.

In the 1980s the American military began to lose interest in the internet… The time had come for the hackers and geeks who had been experimenting with early computers and phone lines. Today they are the giants. Google, Apple, Facebook, Amazon and Microsoft—together with some telecoms operators—help set policy in Europe and America on everything from privacy rights and copyright law to child protection and national security. As these companies grow more powerful, the state is pushing back…

The other big risk is that the tension between states and companies resolves into a symbiotic relationship. A leaked e-mail shows a Google executive communicating with Hillary Clinton’s state department about an online tool that would be “important in encouraging more [Syrians] to defect and giving confidence to the opposition.”+++ If technology firms with global reach quietly promote the foreign-policy interests of one country, that can only increase suspicion and accelerate the fracturing of the web into regional internets….

Mr Malcomson describes the internet as a “global private marketplace built on a government platform, not unlike the global airport system”.

Excerpts from Evolution of the internet: Growing up, Economist, Mar. 26, 2016

+++The email said Google would be “partnering with Al Jazeera” who would take “primary ownership” of the tool, maintaining it and publicizing it in Syria.  It was eventually published by Al Jazeera in English and Arabic.

Who Controls Peoples’ Data?

The McKinsey Global Institute estimates that cross-border flows of goods, services and data added 10 per cent to global gross domestic product in the decade to 2015, with data providing a third of that increase. That share of the contribution seems likely to rise: conventional trade has slowed sharply, while digital flows have surged. Yet as the whole economy becomes more information-intensive — even heavy industries such as oil and gas are becoming data-driven — the cost of blocking those flows increases…

Yet that is precisely what is happening. Governments have sharply increased “data localisation” measures requiring information to be held in servers inside individual countries. The European Centre for International Political Economy, a think-tank, calculates that in the decade to 2016, the number of significant data localisation measures in the world’s large economies nearly tripled from 31 to 84.

Even in advanced economies, exporting data on individuals is heavily restricted because of privacy concerns, which have been highlighted by the Facebook/Cambridge Analytica scandal. Many EU countries have curbs on moving personal data even to other member states. Studies for the Global Commission on Internet Governance, an independent research project, estimate that current constraints — such as restrictions on moving data on banking, gambling and tax records — reduce EU GDP by half a per cent.

In China, the champion data localiser, restrictions are even more severe. As well as long-established controls over technology transfer and state surveillance of the population, such measures form part of its interventionist “Made in China 2025” industrial strategy, designed to make it a world leader in tech-heavy sectors such as artificial intelligence and robotics.

China’s Great Firewall has long blocked most foreign web applications, and a cyber security law passed in 2016 also imposed rules against exporting personal information, forcing companies including Apple and LinkedIn to hold information on Chinese users on local servers. Beijing has also given itself a variety of powers to block the export of “important data” on grounds of reducing vaguely defined economic, scientific or technological risks to national security or the public interest. “The likelihood that any company operating in China will find itself in a legal blind spot where it can freely transfer commercial or business data outside the country is less than 1 per cent,” says ECIPE director Hosuk Lee-Makiyama…

Other emerging markets, such as Russia, India, Indonesia and Vietnam, are also leading data localisers. Russia has blocked LinkedIn from operating there after it refused to transfer data on Russian users to local servers.

Business organisations including the US Chamber of Commerce want rules to restrain what they call “digital protectionism”. But data trade experts point to a serious hole in global governance, with a coherent approach prevented by different philosophies between the big trading powers. Susan Aaronson, a trade academic at George Washington University in Washington, DC, says: “There are currently three powers — the EU, the US and China — in the process of creating separate data realms.”

The most obvious way to protect international flows of data is in trade deals — whether multilateral, regional or bilateral. Yet the only World Trade Organization laws governing data flows predate the internet and have not been thoroughly tested through litigation. The WTO recently recruited Alibaba co-founder Jack Ma to front an ecommerce initiative, but officials involved admit it is unlikely to produce anything concrete for a long time. In any case, Prof Aaronson says: “While data has traditionally been addressed in trade deals as an ecommerce issue, it goes far wider than that.”

The internet has always been regarded by pioneers and campaigners as a decentralised, self-regulating community. Activists have tended to regard government intervention with suspicion, except for its role in protecting personal data, and many are wary of legislation to enable data flows. “While we support the approach of preventing data localisation, we need to balance that against other rights such as data protection, cyber security and consumer rights,” says Jeremy Malcolm, senior global policy analyst at the Electronic Frontier Foundation, a campaign group for internet freedom…

Europe has traditionally had a very different philosophy towards data and privacy from that of the US. In Germany, for instance, public opinion tends to support strict privacy laws — usually attributed to lingering memories of surveillance by the Stasi secret police in East Germany. The EU’s new General Data Protection Regulation (GDPR), which comes into force on May 25, 2018, imposes a long list of requirements on companies processing personal data, on pain of fines that could total as much as 4 per cent of annual turnover… But trade experts warn that the GDPR is very cautiously written, with a blanket exemption for measures claiming to protect privacy. Mr Lee-Makiyama says: “The EU text will essentially provide no meaningful restriction on countries wanting to practice data localisation.”

Against this political backdrop, the prospects for broad and binding international rules on data flow are dim… In the battle for dominance over setting rules for commerce, the EU and US often adopt contrasting approaches. While the US often tries to export its product standards in trade diplomacy, the EU tends to write rules for itself and let the gravity of its huge market pull other economies into its regulatory orbit. Businesses faced with multiple regulatory regimes will tend to work to the highest standard — an effect known widely as the “Brussels effect”. Companies such as Facebook have promised to follow GDPR throughout their global operations as the price of operating in Europe.

Excerpts from   Data protectionism: the growing menace to global business, Financial Times, May 13, 2018

Biometrics: Behavioral and Physical

From the DARPA PDF document available at FedBizOpps.gov: Enhanced Attribution,
Solicitation Number: DARPA-BAA-16-34

Malicious actors in cyberspace currently operate with little fear of being caught due to the fact that it is extremely difficult, in some cases perhaps even impossible, to reliably and confidently attribute actions in cyberspace to individuals. The reason cyber attribution is difficult stems at least in part from a lack of end-to-end accountability in the current Internet infrastructure… The identities of malicious cyber operators are largely obstructed by the use of multiple layers of indirection… The lack of detailed information about the actions and identities of the adversary cyber operators inhibits policymaker considerations and decisions for both cyber and non-cyber response options (e.g., economic sanctions under EO-13694).

The DARPA’s Enhanced Attribution program aims to make currently opaque malicious cyber adversary actions and individual cyber operator attribution transparent by providing high-fidelity visibility into all aspects of malicious cyber operator actions and to increase the Government’s ability to publicly reveal the actions of individual malicious cyber operators without damaging sources and methods….

The program seeks to develop:

–technologies to extract behavioral and physical biometrics from a range of devices and vantage points to consistently identify virtual personas and individual malicious cyber operators over time and across different endpoint devices and C2 infrastructures;

–techniques to decompose the software tools and actions of malicious cyber operators into semantically rich and compressed knowledge representations;

–scalable techniques to fuse, manage, and project such ground-truth information over time, toward developing a full historical and current picture of malicious activity;

–algorithms for developing predictive behavioral profiles within the context of cyber campaigns; and

–technologies for validating and perhaps enriching this knowledge base with other sources of data, including public and commercial sources of information.
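The solicitation names goals, not algorithms. One hedged sketch of the first two goals — reducing an operator’s observed actions to a compact behavioral representation that can be matched across sessions and devices — is a normalised command-usage histogram compared by cosine similarity. The command names below are purely illustrative.

```python
from collections import Counter
import math

def profile(commands):
    """Normalised command-usage histogram: a crude behavioral fingerprint."""
    counts = Counter(commands)
    total = sum(counts.values())
    return {cmd: n / total for cmd, n in counts.items()}

def similarity(p, q):
    """Cosine similarity between two usage profiles."""
    keys = set(p) | set(q)
    dot = sum(p.get(k, 0.0) * q.get(k, 0.0) for k in keys)
    na = math.sqrt(sum(v * v for v in p.values()))
    nb = math.sqrt(sum(v * v for v in q.values()))
    return dot / (na * nb)

session1 = profile(["nmap", "ssh", "curl", "nmap", "ssh", "tar"])
session2 = profile(["ssh", "nmap", "curl", "ssh", "nmap"])  # same habits
session3 = profile(["powershell", "certutil", "mshta"])     # different tooling
print(similarity(session1, session2) > similarity(session1, session3))  # True
```

Real attribution would fuse far richer signals (timing, typos, infrastructure reuse), but the matching step has this general form.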

Excerpts from Enhanced Attribution, Solicitation Number: DARPA-BAA-16-34, April 22, 2016

Wikipedia Lawsuit against U.S. NSA

Excerpts from the Lawsuit of Wikipedia against the NSA

UNITED STATES DISTRICT COURT DISTRICT OF MARYLAND, Case 1:15-cv-00662-RDB, Filed 03/10/15

The government conducts at least two kinds of surveillance under the FISA Amendments Act of 2008 (FAA). Under a program called “PRISM,” the government obtains stored and real-time communications directly from U.S. companies—such as Google, Yahoo, Facebook, and Microsoft—that provide communications services to targeted accounts.

This case concerns a second form of surveillance, called Upstream. Upstream surveillance involves the NSA’s seizing and searching the internet communications of U.S. citizens and residents en masse as those communications travel across the internet “backbone” in the United States. The internet backbone is the network of high-capacity cables, switches, and routers that facilitates both domestic and international communication via the internet. The NSA conducts Upstream surveillance by connecting surveillance devices to multiple major internet cables, switches, and routers inside the United States. These access points are controlled by the country’s largest telecommunications providers, including Verizon Communications, Inc. and AT&T, Inc. …

With the assistance of telecommunications providers, the NSA intercepts a wide variety of internet communications, including emails, instant messages, webpages, voice calls, and video chats. It copies and reviews substantially all international emails and other “text-based” communications—i.e., those whose content includes searchable text.

More specifically, Upstream surveillance encompasses the following processes, some of which are implemented by telecommunications providers acting at the NSA’s direction:

• Copying. Using surveillance devices installed at key access points, the NSA makes a copy of substantially all international text-based communications—and many domestic ones—flowing across certain high-capacity cables, switches, and routers. The copied traffic includes email, internet-messaging communications, web-browsing content, and search-engine queries.

• Filtering. The NSA attempts to filter out and discard some wholly domestic communications from the stream of internet data, while preserving international communications. The NSA’s filtering out of domestic communications is incomplete, however, for multiple reasons. Among them, the NSA does not eliminate bundles of domestic and international communications that transit the internet backbone together. Nor does it eliminate domestic communications that happen to be routed abroad.

• Content Review. The NSA reviews the copied communications—including their full content—for instances of its search terms. The search terms, called “selectors,” include email addresses, phone numbers, internet protocol (“IP”) addresses, and other identifiers that NSA analysts believe to be associated with foreign intelligence targets. Again, the NSA’s targets are not limited to suspected foreign agents and terrorists, nor are its selectors limited to individual email addresses. The NSA may monitor or “task” selectors used by large groups of people who are not suspected of any wrongdoing— such as the IP addresses of computer servers used by hundreds of different people.

• Retention and Use. The NSA retains all communications that contain selectors associated with its targets, as well as those that happened to be bundled with them in transit….

NSA analysts may read, query, data-mine, and analyze these communications with few restrictions, and they may share the results of those efforts with the FBI, including in aid of criminal investigations… In other words, the NSA copies and reviews the communications of millions of innocent people to determine whether they are discussing or reading anything containing the NSA’s search terms. The NSA’s practice of reviewing the content of communications for selectors is sometimes called “about” surveillance. This is because its purpose is to identify not just communications that are to or from the NSA’s targets but also those that are merely “about” its targets. Although it could do so, the government makes no meaningful effort to avoid the interception of communications that are merely “about” its targets; nor does it later purge those communications.
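The complaint’s description of selector-based review can be modeled in a few lines: retain any message in which a tasked selector appears anywhere — in the address fields, or merely mentioned in the body, the latter being the “about” collection at issue. The selectors and messages below are fabricated placeholders.

```python
# Hypothetical tasked selectors (illustrative values only).
SELECTORS = {"target@example.org", "203.0.113.7"}

def matches(msg, selectors):
    """Retain a message if any selector appears anywhere in it —
    in the to/from fields OR merely mentioned in the content."""
    text = " ".join((msg["from"], msg["to"], msg["content"]))
    return any(sel in text for sel in selectors)

stream = [
    {"from": "alice@x.com", "to": "bob@y.com",
     "content": "lunch tomorrow?"},                              # discarded
    {"from": "alice@x.com", "to": "bob@y.com",
     "content": "did you see the post by target@example.org?"},  # 'about' a target
    {"from": "target@example.org", "to": "bob@y.com",
     "content": "hello"},                                        # from a target
]
retained = [m for m in stream if matches(m, SELECTORS)]
print(len(retained))  # 2
```

Note that the second message is retained even though neither correspondent is a target — exactly the over-breadth the lawsuit objects to.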


Internet or Equinet?

“The Internet governance should be multilateral, transparent, democratic, and representative, with the participation of governments, private sector, civil society, and international organizations, in their respective roles. This should be one of the foundational principles of Internet governance,” the external affairs ministry says in its initial submission to the April 23-24 Global Multistakeholder Meeting on the Future of Internet Governance, also referred to as NETmundial, in Sao Paulo, Brazil. The proposal for a decentralised Internet is significant in view of Edward Snowden’s revelations of mass surveillance in recent months.

“The structures that manage and regulate the core Internet resources need to be internationalized, and made representative and democratic. The governance of the Internet should also be sensitive to the cultures and national interests of all nations. The mechanism for governance of the Internet should therefore be transparent and should address all related issues. The Internet must be owned by the global community for mutual benefit and be rendered impervious to possible manipulation or misuse by any particular stakeholder, whether state or non-state,” the ministry note says. NETmundial will see representatives from nearly 180 countries participating to debate the future of the Internet…

The US announced last month its intent to relinquish control of a vital part of the Internet Corporation for Assigned Names and Numbers (ICANN) – the Internet Assigned Numbers Authority (IANA). “Many nations still think that a multilateral role might be more suitable than a multistakeholder approach, and two years back India had proposed a 50-nation ‘Committee of Internet Related Policies’ (CIRP) for global internet governance,” Bhattacharjee added.

The concept of Equinet was first floated by Communications Minister Kapil Sibal in 2012 at the Internet Governance Forum in Baku, Azerbaijan. Dr. Govind, chief executive officer, National Internet Exchange of India, is hopeful that Equinet is achievable. “Equinet is a concept of the Internet as a powerful medium benefiting people across the spectrum. It is all the more significant for India as we have 220 million Internet users, standing third globally after China and the US. Moreover, by the year-end India’s number of Internet users is expected to surpass that of the US. The word Equinet means an equitable Internet which plays the role of an equaliser in society and is not limited only to the privileged people.”

He said the role of government in Internet management is important as far as policy, security and privacy of cyberspace are concerned, but the roles of the private sector, civil society and other stakeholders are no less. “Internet needs to be managed in a more collaborative, cooperative, consultative and consensual manner.” Talking about the global strategy of renaming the Internet as Equinet, he said: “Globally the US has the largest control over the management of the Internet, which is understandable since everything about the Internet started there. Developing countries still have not much say over the global management of the Internet. But it is important that Internet management be more decentralised and globalised so that the developing countries have more participation and a say in the management, where their consent is taken as well.”

The ministry note said: “A mechanism for accountability should be put in place in respect of crimes committed in cyberspace, such that the Internet is a free and secure space for universal benefaction. A ‘new cyber jurisprudence’ needs to be evolved to deal with cyber crime, without being limited by political boundaries, so that cyber-justice can be delivered in near real time.”

But other experts doubt the possibility of an Equinet, or of equalising the Internet globally. Sivasubramanian Muthusamy, president, Internet Society India, Chennai, who is also a participant in NETmundial, told IANS that the idea of Equinet is not achievable. “Totally wrong idea. Internet provides a level playing field already. It is designed and operated to be universally accessible, free and open. Internet as it is operated today offers the greatest hope for developing countries to access global markets and prosper.” “The idea of proposing to rename the Internet as Equinet has a political motive; it would pave the way for telecom companies to have a bigger role and bring in harmful commercial models that would destabilize the open architecture of the Internet. If India is considering such a proposal, it would be severely criticized. The proposal does not make any sense. It is wrong advice or misplaced input that must have prompted the government of India to think of such a strange idea,” he said.

Excerpt from India wants Internet to become Equinet, Business Standard, Apr. 20, 2014