Monthly Archives: December 2018

Amazon Turtles are Back! Thanks to Local Vigilantes

The historically over-exploited Giant South American Turtle is making a significant comeback on river beaches in the Brazilian Amazon thanks to local protection efforts, say researchers at the University of East Anglia. Their results, published in Nature Sustainability, show that Giant Turtle populations are well on their way to full recovery on beaches guarded by local vigilantes. There are now over nine times more turtles hatching on these beaches than there were in 1977, equivalent to an annual increase of over 70,000 hatchlings. The beach survey showed that, of over 2000 turtle nests monitored on protected beaches, only two per cent were attacked by poachers. In contrast, on unprotected beaches, poachers had harvested eggs from 99 per cent of the 202 nests surveyed.

The beach protection programme along the Juruá river is part of the largest community-based conservation programme in the Brazilian Amazon. Beaches are guarded on a shoestring budget by local communities carrying out round-the-clock beach surveillance throughout the five-month turtle breeding season.

Prof Carlos Peres, from UEA’s School of Environmental Sciences and a senior author on the study, said: “This study clearly demonstrates the effectiveness of empowering local management action by stakeholders who have the largest stake and a 24/7 presence at key conservation sites. The beaches protected by local communities represent noisy islands of high biodiversity, surrounded by lifeless unprotected beaches, which are invariably empty and silent.”

Excerpts from Amazon turtle populations recovering well thanks to local action, Nov. 3, 2018

Devil’s Idea for Tokyo’s End: Fukushima

By late March 2011… after the tsunami struck the Fukushima Daiichi plant—it was far from obvious that the accident was under control and the worst was over. Chief Cabinet Secretary Yukio Edano feared that radioactive material releases from the Fukushima Daiichi plant and its sister plant (Fukushima Daini) located some 12 km south could threaten the entire population of eastern Japan: “That was the devil’s scenario that was on my mind. Common sense dictated that, if that came to pass, then it was the end of Tokyo.”

Prime Minister Naoto Kan asked Dr. Shunsuke Kondo, then-chairman of the Japanese Atomic Energy Commission, to prepare a report on worst-case scenarios from the accident. Dr. Kondo led a 3-day study involving other Japanese experts and submitted his report (Kondo, 2011) to the prime minister on March 25, 2011. The existence of the report was initially kept secret because of the frightening nature of the scenarios it described. An article in the Japan Times quoted a senior government official as saying, “The content [of the report] was so shocking that we decided to treat it as if it didn’t exist.” …

One of the scenarios involved a self-sustaining zirconium cladding fire in the Unit 4 spent fuel pool. Radioactive material releases from the fire were estimated to cause extensive contamination of a 50- to 70-km region around the Fukushima Daiichi plant with hotspots significant enough to require evacuations up to 110 km from the plant. Voluntary evacuations were envisioned out to 200 km because of elevated dose levels. If release from other spent fuel pools occurred, then contamination could extend as far as Tokyo… There was particular concern that the zirconium cladding fire could produce enough heat to melt the stored fuel, allowing it to flow to the bottom of the pool, melt through the pool liner and concrete bottom, and flow into the reactor building.

Lessons Learned from the Fukushima Daiichi Accident for Spent Fuel Storage: The U.S. nuclear industry and its regulator should give additional attention to improving the ability of plant operators to measure real-time conditions in spent fuel pools and maintain adequate cooling of stored spent fuel during severe accidents and terrorist attacks. These improvements should include hardened and redundant physical surveillance systems (e.g., cameras), radiation monitors, pool temperature monitors, pool water-level monitors, and means to deliver pool makeup water or sprays even when physical access to the pools is limited by facility damage or high radiation levels….

[At nuclear power plants there must be…adequate separation of plant safety and  security systems so that security systems can continue to function independently if safety systems are damaged. In particular, security systems need to have independent, redundant, and protected power sources…]

Excerpts from Lessons Learned from the Fukushima Accident for Improving
Safety and Security of U.S. Nuclear Plants: Phase 2, US National Academies, 2016

Sequencing All Species: the Earth BioGenome Project

In the first attempt of its kind, researchers plan to sequence all known species of eukaryotic life—66,000 species of animals, plants, fungi, and protozoa—in a single country, the United Kingdom. The announcement was made here today at the official launch of an even grander $4.7 billion global effort, called the Earth BioGenome Project (EBP), to sequence the genomes of all of Earth’s known 1.5 million species of eukaryotes within a decade.

In terms of genomes sequenced, the eukaryotes—the branch of complex life consisting of organisms with cells that have a nucleus inside a membrane—lag far behind the bacteria and archaea. Researchers have sequenced just about 3500 eukaryotic genomes, and only 100 at high quality.
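For a sense of scale (the arithmetic below is ours, using only the figures quoted above), sequencing 1.5 million genomes within a decade implies a pace of roughly 150,000 genomes a year, against about 3,500 eukaryotic genomes sequenced in total so far, and an average budget of roughly $3,000 per genome under the $4.7 billion price tag.

```python
# Back-of-envelope arithmetic; all figures are taken from the excerpt above.
total_species = 1_500_000      # known eukaryotic species targeted by the EBP
budget_usd = 4.7e9             # quoted cost of the global project
years = 10                     # stated timescale
sequenced_so_far = 3_500       # eukaryotic genomes sequenced to date

genomes_per_year = total_species / years
cost_per_genome = budget_usd / total_species
scale_up = genomes_per_year / sequenced_so_far

print(f"Required pace: ~{genomes_per_year:,.0f} genomes per year")       # ~150,000
print(f"Implied budget: ~${cost_per_genome:,.0f} per genome")            # ~$3,133
print(f"Each year the project must sequence ~{scale_up:.0f}x everything done so far")
```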

The U.K. sequencing effort—dubbed The Darwin Tree of Life project—will now become part of the EBP mix…Also announced today was a memorandum of understanding for participating in EBP. It has been signed by 19 institutions, including BGI Shenzhen, China; the Royal Botanic Gardens, Kew; and Sanger. 

Excerpts from Erik Stokstad, Researchers launch plan to sequence 66,000 species in the United Kingdom. But that’s just a start, Science, Nov. 1, 2018

Eradicate Mosquitoes Forever: Gene Drives

The mosquitoes are being fitted with a piece of DNA called a gene drive. Unlike the genes introduced into run-of-the-mill genetically modified organisms, gene drives do not just sit still once inserted into a chromosome. They actively spread themselves, thereby reaching more and more of the population with each generation. If their effect is damaging, they could in principle wipe out whole species. If gene drives were to condemn to a similar fate the mosquitoes that spread malaria, a second of humankind’s great scourges might be consigned to history.

Gene drives can in principle be used against any creatures which reproduce sexually with short generations and aren’t too rooted to a single spot. The insects that spread leishmaniasis, Chagas disease, dengue fever, chikungunya, trypanosomiasis and Zika could all be potential targets. So could creatures which harm only humankind’s dominion, not people themselves. Biologists at the University of California, San Diego, have developed a gene-drive system for Drosophila suzukii, an Asian fruitfly which, as an invasive species, damages berry and fruit crops in America and Europe. Island Conservation, an international environmental NGO, thinks gene drives could offer a humane and effective way of reversing the damage done by invasive species such as rats and stoats to native ecosystems in New Zealand and Hawaii.

Critics fear that the laudable aim of vastly reducing deaths from malaria—which the World Health Organisation puts at 445,000 a year, most of them children—will open the door to the use of gene drives for far less clear-cut benefits in ways that will entrench some interests, such as those of industrial farmers, at the expense of others. They also point to possible military applications: gene drives could in principle make creatures that used not to spread disease more dangerous… The ability to remove species by fiat—in effect, to get them to remove themselves—is, like the prospect of making new species from scratch, a power that goes beyond the past ambit of humankind.

Gene drives based on CRISPR-Cas9 could easily be engineered to target specific bits of the chromosome and insert themselves seamlessly into the gap, thus ensuring that every gamete gets a copy. By 2016, gene drives had been created in yeast, fruitflies and two species of mosquito. In work published in the journal Nature Biotechnology in September, Andrea Crisanti, Mr Burt and colleagues at Imperial showed that one of their gene drives could drive a small, caged population of the mosquito Anopheles gambiae to extinction—the first time a gene drive had shown itself capable of doing this. The next step is to try this in a larger caged population.
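To make the difference concrete, here is a minimal sketch (ours, not the article's) of how a homing gene drive spreads compared with ordinary Mendelian inheritance. It assumes random mating, no fitness cost and 100% homing efficiency; the parameter values are illustrative only.

```python
# Minimal sketch: idealized spread of a homing gene drive vs. ordinary Mendelian
# inheritance. Assumptions (ours, not the article's): random mating, no fitness
# cost, 100% homing efficiency, discrete, non-overlapping generations.

def next_freq_mendelian(p: float) -> float:
    # Under normal inheritance the allele frequency does not change by itself.
    return p

def next_freq_drive(p: float) -> float:
    # Perfect homing: heterozygotes transmit the drive allele to every gamete,
    # so only matings between two non-carriers fail to pass it on: p' = 1-(1-p)^2.
    return 1.0 - (1.0 - p) ** 2

p_mendel = p_drive = 0.01          # release carriers at 1% of the population
for gen in range(1, 11):
    p_mendel = next_freq_mendelian(p_mendel)
    p_drive = next_freq_drive(p_drive)
    print(f"gen {gen:2d}  mendelian {p_mendel:.3f}  drive {p_drive:.3f}")

# The drive allele approaches fixation within roughly ten generations from a 1%
# release, while the ordinary allele stays at 1%; that is why a damaging drive
# could, in principle, take a whole population with it.
```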

There are also worries about how gene drives might be used to create a weapon. …The need to find ways to guard against such attacks is one of the reasons that the Pentagon’s Defense Advanced Research Projects Agency (DARPA) gives for its work on gene drives. Renee Wegrzyn, programme manager for DARPA’s “Safe Genes” project, says the work is to prevent “technological surprise”, whether in the form of an unintended consequence or nefarious use. One of the academic teams she funds has made progress in developing anti-CRISPR enzyme systems that one day might be able to inhibit a drive’s operation.

Oversight needs not just to bring together a range of government agencies; it requires co-operation between governments, too. The Cartagena Protocol on Biosafety, which entered into force under the UN Convention on Biological Diversity (CBD) in 2003, provides controls on the transfer of genetically modified organisms. But how it applies to gene drives is unclear—and besides, America has never ratified the convention. An attempt to ban gene-drive research through the CBD, which was backed by the ETC Group and other NGOs, failed at the convention’s biennial meeting in Cancún in 2016…Like the reintroduction of vanished species advocated by the rewilding movement, gene-drive technology will provide new arenas for the fight between those who wish to defend nature and those who wish to tame it.

Excerpts from Gene Drives: Extinction on Demand, Economist, Nov. 10, 2018, at 24

Meddling with Nature: Is it Right? Is it Fair?

Many envisioned environmental applications of newly developed gene-editing techniques such as CRISPR might provide profound benefits for ecosystems and society. But depending on the type and scale of the edit, gene-edited organisms intentionally released into the environment could also deliver off-target mutations, evolutionary resistance, ecological disturbance, and extinctions. Hence, there are ongoing conversations about the responsible application of CRISPR, especially relative to the limitations of current global governance structures to safeguard its use. Largely missing from these conversations is attention to local communities in decision-making. Most policy discussions are instead occurring at the national or international level even though local communities will be the first to feel the context-dependent impacts of any release…

CRISPR gene editing and other related genetic technologies are groundbreaking in their ability to precisely and inexpensively alter the genome of any species. CRISPR-based gene drives hold particular import because they are designed to rapidly spread genetic changes—including detrimental traits such as infertility—through populations of sexually reproducing organisms, to potentially reach every member of a species. Villages in Burkina Faso are weighing the release of gene drive–bearing mosquitoes that could suppress malaria. Nantucket Island residents in the United States are considering the release of genetically engineered white-footed mice to deplete Lyme disease reservoirs. New Zealand communities are discussing the possibility of using genetic methods to eliminate exotic predators.

But what if a gene drive designed to suppress an invasive species escaped its release site and spread to a native population? Or if a coral species gene edited to better adapt to environmental stressors dominated reef ecosystems at the expense of a diversity of naturally evolving coral species and the fish that depend on them? The gravity of these potential outcomes raises the question: Should humans even be meddling with the DNA of wild organisms? The absence of generally agreed on answers can be used to support calls for moratoria on developing and releasing genetically altered organisms, especially those with gene drives (6).

However, the promising benefits of environmental gene editing cannot be dismissed. Gene drives may provide a long-sought-after tool to control vectors of infectious disease and save millions of human lives. Projects to conserve ecosystems or promote species resilience are often intended to repair human-inflicted environmental damage. Put simply, either using this technology irresponsibly or not using it at all could prove damaging to humans, our welfare, and our planet.

At the international level, the Convention on Biological Diversity (CBD) has enlisted an expert technical panel to, in part, update its Cartagena Protocol (of which the United States is not a party) that oversees transboundary transport of living modified organisms to accommodate gene drive–bearing organisms. The International Union for the Conservation of Nature (IUCN) is also developing policy to address the release of gene-edited organisms. Although the CBD and the IUCN offer fora to engage diverse public feedback, a role largely fulfilled by civil society groups, neither body currently uses the broad and open deliberative process we advocate….

Different societal views about the human relationship to nature will therefore shape decision-making. Local community knowledge and perspectives must be engaged to address these context-dependent, value-based considerations. A special emphasis on local communities is also a matter of justice because the first and most closely affected individuals deserve a strong voice in the decision-making process…Compounding this challenge is that these decisions cannot be made in isolation. Organisms released into local environments may cross regional and even international borders. Hence, respect for and consideration of local knowledge and value systems are necessary, but insufficient, to anticipate the potentially ramifying global implications of environmental release of gene-edited organisms. What is needed is an approach that places great weight on local perspectives within a larger global vision…

The needs of ecosystems could also be given voice to inform deliberative outcomes through custodial human proxies. Inspired by legislative precedent set by New Zealand, in which the Whanganui River was granted legal “personhood,” human representatives, nominated by both an international body like the IUCN and the local community, would be responsible for upholding the health and interests of the ecosystems in question. Proposed gene-editing strategies would be placed in the larger context of alternative approaches to address the public health or environmental issue in question…

An online registry for all projects intending to release genetically engineered organisms into the environment must be created. Currently, no central database exists for environmental gene-editing applications or for decision-making outcomes associated with their deployment, and this potentially puts the global community at risk…A global coordination task force would be charged with coordinating multiple communities, nations, and regions to ensure successful deliberative outcomes. As a hypothetical example, genetic strategies to eliminate invasive possums from New Zealand must include representatives from Australia, the country likely to be affected should animals be transported outside the intended range. Similarly, the African Union is currently deliberating appropriate governance of gene drive–bearing mosquitoes to combat malaria on a regional scale. 

Excerpts from Natalie Kofler et al., Editing nature: Local roots of global governance, Science Magazine, Nov. 2, 2018

De-Extinction: Bring Back the Passenger Pigeon

The Crispr-Cas9 system consists of two main parts: an RNA guide, which scientists program to target specific locations on a genome, and the Cas9 protein, which acts as molecular scissors. The cuts trigger repairs, allowing scientists to edit DNA in the process. Think of Crispr as a cut-and-paste tool that can add or delete genetic information. Crispr can also edit the DNA of sperm, eggs and embryos—implementing changes that will be passed down to future generations. Proponents say it offers unprecedented power to direct the evolution of species.

The technology is widely used in animals. Crispr has produced disease-resistant chickens and hornless dairy cattle. Scientists around the world routinely edit the genes in mice for research, adding mutations for human diseases such as autism and Alzheimer’s in search of possible cures. Crispr-edited pigs have kidneys that scientists hope to test as transplants in humans. Crispr has been discussed as a de-extinction tool since its earliest days. In March 2013 the conservation group Revive & Restore co-organized the first TEDxDeExtinction conference in Washington, D.C. Revive & Restore was co-founded by Stewart Brand, the creator of the counterculture Whole Earth Catalog and a vocal advocate for a passenger pigeon revival.

The last known passenger pigeon—a bird named Martha—died in captivity at a Cincinnati zoo in 1914….The first step was to sequence the passenger pigeon genome…Sequencing an extinct species’ genome is no easy task. When an organism dies, the DNA in its cells begins to degrade, leaving scientists with what Shapiro describes as “a soup of trillions of tiny fragments” that require reassembly. For the passenger pigeon project, Shapiro and her team took tissue samples from the toe pads of stuffed birds in museum collections. DNA in the dead tissue left them with tantalizing clues but an incomplete picture. To fill in the gaps, they sequenced the genome of the band-tailed pigeon, the passenger pigeon’s closest living relative.
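As a rough illustration of that reference-guided approach (a toy sketch of ours, not the team's actual pipeline; the sequences below are invented), short fragments recovered from degraded tissue can be placed onto the genome of a close living relative, with the relative's sequence filling any positions no fragment covers.

```python
# Toy illustration of reference-guided assembly: short fragments from degraded
# DNA are placed on the genome of a close living relative, and positions no
# fragment covers fall back to that reference. Sequences here are invented.
reference = "ACGTTGCAACGTAGGCTA"            # stand-in for the band-tailed pigeon
fragments = ["ACGTTG", "AACGTA", "GGCTA"]   # stand-ins for passenger pigeon reads

assembled = list(reference)                  # start from the relative's sequence
covered = [False] * len(reference)

for frag in fragments:
    pos = reference.find(frag)               # naive placement by exact match
    if pos == -1:
        continue                             # real pipelines tolerate mismatches
    for i, base in enumerate(frag):
        assembled[pos + i] = base
        covered[pos + i] = True

consensus = "".join(assembled)
share = sum(covered) / len(covered)
print(consensus)
print(f"{share:.0%} of positions supported by ancient fragments; "
      "the rest are filled from the reference genome")
```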

By comparing the genomes of the two birds, researchers began to understand which traits distinguished the passenger pigeon. In a paper published last year in “Science,” they reported finding 32 genes that made the species unique. Some of these allowed the birds to withstand stress and disease, essential traits for a species that lived in large flocks. They found no genes that might have led to extinction. “Passenger pigeons went extinct because people hunted them to death,” Shapiro says.

Revived passenger pigeons could also face re-extinction. The species thrived in the years before European settlement of North America, when vast forests supported billions of birds. Those forests have since been replaced by cities and farmland. “The habitat the passenger pigeons need to survive is also extinct,” Shapiro says. But what does it mean to bring an extinct species back? Andre E.R. Soares, a scientist who helped sequence the passenger pigeon genome, says most people will accept a lookalike as proof of de-extinction. “If it looks like a passenger pigeon and flies like a passenger pigeon, if it has the same shape and color, they will consider it a passenger pigeon,” Soares says.

Shapiro says that’s not enough. Eventually, she says, gene-editing tools may be able to create a genetic copy of an extinct species, “but that doesn’t mean you are going to end up with an animal that behaves like a passenger pigeon or a woolly mammoth.” We can understand the nature of an extinct species through its genome, but nurture is another matter. 

After he determines how passenger pigeon DNA manifests in the rock pigeons, Novak hopes to edit the band-tailed pigeon, the passenger pigeon’s closest living relative, with as many of the extinct bird’s defining traits as possible. Eventually, he says, he’ll have a hybrid creature that looks and acts like a passenger pigeon (albeit with no parental training) but still contains band-tailed pigeon DNA. These new-old birds will need a name, which their human creator has already chosen: Patagioenas neoectopistes, or “new wandering pigeon of America.”

Excerpts from Amy Dockser Marcus, Meet the Scientists Bringing Extinct Species Back From the Dead, WSJ, the Future of Everything, Oct. 11, 2018

The Internet Was Never Open

Rarely has a manifesto been so wrong. “A Declaration of the Independence of Cyberspace”, written 20 years ago by John Perry Barlow, a digital civil-libertarian, begins thus: “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.”

At the turn of the century, it seemed as though this techno-Utopian vision of the world could indeed be a reality. It didn’t last… Autocratic governments around the world…have invested in online-surveillance gear. Filtering systems restrict access: to porn in Britain, to Facebook and Google in China, to dissent in Russia.

Competing operating systems and networks offer inducements to keep their users within the fold, consolidating their power. Their algorithms personalise the web so that no two people get the same search results or social media feeds, betraying the idea of a digital commons. Five companies account for nearly two-thirds of revenue from advertising, the dominant business model of the web.

The open internet accounts for barely 20% of the entire web. The rest of it is hidden away in unsearchable “walled gardens” such as Facebook, whose algorithms are opaque, or on the “dark web”, a shady parallel world wide web. Data gathered from the activities of internet users are being concentrated in fewer hands. And big hands they are too. BCG, a consultancy, reckons that the internet will account for 5.3% of GDP of the world’s 20 big economies this year, or $4.2 trillion.

How did this come to pass? The simple reply is that the free, open, democratic internet dreamed up by the optimists of Silicon Valley was never more than a brief interlude. The more nuanced answer is that the open internet never really existed.

[T]he internet was developed “by the US military to serve US military purposes”… The decentralised, packet-based system of communication that forms the basis of the internet originated in America’s need to withstand a massive attack on its soil. Even the much-ballyhooed Silicon Valley model of venture capital as a way to place bets on risky new businesses has military origins.

In the 1980s the American military began to lose interest in the internet…. The time had come for the hackers and geeks who had been experimenting with early computers and phone lines.  Today they are the giants. Google, Apple, Facebook, Amazon and Microsoft—together with some telecoms operators—help set policy in Europe and America on everything from privacy rights and copyright law to child protection and national security. As these companies grow more powerful, the state is pushing back…

The other big risk is that the tension between states and companies resolves into a symbiotic relationship. A leaked e-mail shows a Google executive communicating with Hillary Clinton’s state department about an online tool that would be “important in encouraging more [Syrians] to defect and giving confidence to the opposition.”+++ If technology firms with global reach quietly promote the foreign-policy interests of one country, that can only increase suspicion and accelerate the fracturing of the web into regional internets….

Mr Malcomson describes the internet as a “global private marketplace built on a government platform, not unlike the global airport system”.

Excerpts from Evolution of the internet: Growing up, Economist, Mar. 26, 2016

+++The email said Google would be “partnering with Al Jazeera” who would take “primary ownership” of the tool, maintaining it and publicizing it in Syria.  It was eventually published by Al Jazeera in English and Arabic.

Cleaning Up Dirty Shipping

Making shipping cleaner is made more urgent by the decision of the International Maritime Organisation (IMO), the United Nations body responsible for the world’s shipping, to reduce the amount of sulphur allowed in bunker fuel from 3.5% to 0.5% by 2020. Sulphur is nasty stuff. When burned, it forms sulphates, which cause acid rain and pollute the air. A paper published in February 2017 in Nature Communications, by Mikhail Sofiev of the Finnish Meteorological Institute, found that the IMO’s new rule could stop between 139,000 and 396,000 premature deaths a year.

The trouble is that sulphates also scatter sunlight and help to form and thicken clouds, which reflect solar radiation away from Earth. As a result, shipping is thought to reduce rather than increase man-made global warming—by 7% throughout the 20th century, according to one study. Dr Sofiev’s research showed that this cooling effect could fall by 80% after 2020, with the new low-sulphur standard in place…

The obvious way to offset the loss of sulphur-related cooling is by steep cuts to shipping’s planet-cooking carbon-dioxide emissions. The IMO wants these to fall by half, compared with 2008 levels, by 2050, regardless of how many vessels then ply the seas. But unlike desulphurisation, which is both imminent and legally binding, the CO2 target looks fuzzy and lacks any enforcement mechanism. An attempt to begin fleshing it out, at a meeting of IMO member states which concluded in London on October 26, 2018, foundered.

One way to cut fuel consumption is to reduce drag by redesigning hulls and propellers. This is happening. In the past five or so years many ships’ propellers have been fitted with tip fins analogous to the turbulence-reducing upturned winglets on aeroplanes. Further percentage points can be shaved away by smoothing hulls. This means, in particular, stopping barnacles and other creatures growing on them. Tin-based antifouling paints are now banned as toxic to sea life, so paintmakers are returning to an 18th-century solution to the fouling problem—copper. Hulls can be scraped smooth, too, but restrictions on littering waters with paint chips and species from foreign parts have made such cleaning problematic. This may change, though, thanks to an underwater drone described by its Norwegian maker, ECOsubsea, as “a cross between a vacuum cleaner and a lawnmower”. Rather than scour hulls with a metal brush, ECOsubsea’s robots blast water at an angle almost parallel with the hull’s surface, which mostly spares paint from abrasion but hits marine growth perpendicularly, and thus hard.

Many have hopes of returning to wind propulsion, and engineers have devised various modern versions of the sail. None has yet succeeded. A system developed by SkySails, a firm in Hamburg, for example, relied on kites to pull ships along. It was installed on five ships from 2008-11, but proved fiddly to use and maintain…

Some hope to cut marine emissions by employing batteries and electric motors. For transoceanic shipping this looks like a long shot. But local shipping might benefit. Norway, for instance, has started to introduce battery-powered ferries. And a Dutch company called Port-Liner is building electric canal barges for transporting shipping containers. The technology is expensive. Without taxpayer subsidy it would hardly be a runner—a fact also true of the Norwegian ferries.

The problem of shifting emissions around rather than eliminating them also applies to the idea of powering ocean-going vessels using fuel-cells. These generate electricity by reacting hydrogen and oxygen together. Given that electric propulsion more usually disguises emissions than eliminates them, some suggest the most practical approach to reducing shipping’s contribution to global warming is to switch to low-carbon fuel systems rather than conducting a futile search for no-carbon fuels. One alternative is diesel-electric propulsion. Liquefied natural gas (LNG) is another option.

Excerpts from Marine Technology of the Future: In Need of a Clean Up, Economist, Nov. 3, 2018, at 75

Sucking the Life out of the Deep Sea

Those involved in deep-sea mining hope it will turn into a multi-billion dollar industry. Seabed nodules are dominated by compounds of iron (which is commonplace) and manganese (which is rarer, but not in short supply from mines on dry land). However, the nodules also contain copper, nickel and cobalt, and sometimes other metals such as molybdenum and vanadium. These are in sufficient demand that visiting the bottom of the ocean to acquire them looks a worthwhile enterprise. Moreover, these metals seldom co-occur in terrestrial mines. So, as Kris Van Nijen, who runs deep-sea mining operations at Global Sea Mineral Resources (GSR), a company interested in exploiting the nodules, observes: “For the same amount of effort, you get the same metals as two or three mines on land.”

Though their location several kilometres beneath the ocean surface makes the nodules hard to get at in one sense, in another they are easily accessible, because they sit invitingly on the seabed, almost begging to be collected. Most are found on parts of the ocean floor like the Clarion Clipperton Zone (CCZ), outside the 200-nautical-mile exclusive economic zones of littoral countries. They thus fall under the purview of the International Seabed Authority (ISA), which has issued 17 exploration licences for such resources. All but one of these licences pertain to the CCZ, an area of about 6m square kilometres east-south-east of Hawaii.

The licensees include Belgium, Britain, China, France, Germany, India, Japan, Russia, Singapore and South Korea, as well as several small Pacific island states. America, which is not party to the United Nations Convention on the Law of the Sea that established the ISA, is not involved directly, but at least one American firm, Lockheed Martin, has an interest in the matter through a British subsidiary, UK Seabed Resources. And people are getting busy. Surveying expeditions have already visited the concessions. On land, the required mining machines are being built and tested. What worries biologists is that if all this busyness does lead to mining, it will wreck habitats before they can be properly catalogued, let alone understood.

Some of the CCZ’s creatures stretch the imagination. There is the bizarre, gelatinous, yellow “gummy squirrel”, a 50cm-long sea cucumber with a tall, wide tail that may operate like a sail. There are galloping sea urchins that can scurry across the sea floor on long spines, at speeds of several centimetres a second. There are giant red shrimps, measuring up to 40cm long. And there are “Dumbo” octopuses, which have earlike fins above their eyes, giving them an eerie resemblance to a well-known cartoon elephant…Of 154 species of bristle worms the surveyors found, 70% were previously unknown.

The whale fossils, sea cucumbers and shrimps are just the stuff that is visible to the naked eye. Adrian Glover, one of Dr Amon’s colleagues at the Natural History Museum, and his collaborators spent weeks peering down microscopes, inspecting every nook and cranny of the surfaces of some of the nodules themselves. They discovered a miniature ecosystem composed of things that look, at first sight, like flecks of colour—but are, in fact, tiny corals, sponges, fan-like worms and bryozoans, all just millimetres tall. In total, the team logged 77 species of such creatures, probably an underestimate.

Inevitably, much of this life will be damaged by nodule mining. The impacts are likely to be long-lasting. Deep-sea mining technology is still in development, but the general idea is that submersible craft equipped with giant vacuum cleaners will suck nodules from the seafloor. Those nodules will be carried up several kilometres of pipes back to the operations’ mother ships, to be washed and sent on their way.

The largest disturbance experiment so far was carried out in 1989 in the Peru Basin, a nodule field to the south of the Galapagos Islands. An eight-metre-wide metal frame fitted with ploughs and harrows was dragged back and forth repeatedly across the seabed, scouring it and wafting a plume of sediment into the water…. The big question was, 26 years after the event, would the sea floor have recovered? The answer was a resounding “no”. The robots brought back images of plough tracks that looked fresh, and of wildlife that had not recovered from the decades-old intrusion.

Conservation and seabed minerals: Mining the deep ocean will soon begin, Economist, Nov. 10, 2018

How to Stop the Exploitation of Internet Users

Data breaches at Facebook and Google—and along with Amazon, those firms’ online dominance—crest a growing wave of anxiety around the internet’s evolving structure and its impact on humanity…The runaway success of a few startups has created new, proprietized one-stop platforms. Many people are not really using the web at all, but rather flitting among a small handful of totalizing apps like Facebook and Google. And those application-layer providers have dabbled in providing physical-layer internet access. Facebook’s Free Basics program has been one of several experiments that use broadband data cap exceptions to promote some sites and services over others.

What to do? Columbia University law professor Tim Wu has called upon regulators to break up giants like Facebook, but more subtle interventions should be tried first…Firms that do leverage users’ data should be “information fiduciaries,” obliged to use what they learn in ways that reflect a loyalty to users’ interests…The internet was designed to be resilient and flexible, without need for drastic intervention. But its trends toward centralization, and exploitation of its users, call for action.

Excerpts from Jonathan Zittrain, Fixing the internet, Science, Nov. 23, 2018

The Underground Nuclear Tank Farms: Hanford

After spending billions of dollars over several decades to remove radioactive waste leaking from a plant where nuclear bombs were made, the U.S. Department of Energy has come up with a new plan: leave it in the ground.  The shuttered Hanford Nuclear Reservation, which produced plutonium for U.S. atomic weapons from World War II through the Cold War, is the nation’s largest nuclear cleanup site with about 56 million gallons of waste stored in leak-prone underground tanks in south-central Washington State.  The Energy Department has proposed to effectively reclassify the sludge left in 16 nearly empty underground tanks from “high-level” to “low-level” radioactive waste. The re-classification would allow the department to fill the tanks with grout, cover them with an unspecified “surface barrier,” and leave them in place.

But environmental groups and others say the plan amounts to a semantic sleight of hand that will leave as much as 70,000 gallons of remaining nuclear sludge — some of which could be radioactive for millions of years — in the ground…

The cleanup operations at Hanford are projected to cost more than $100 billion, and the Energy Department has already spent more than $19 billion, according to the Government Accountability Office. The reclassification could save the department billions of dollars. It would also open the door to doing the same for all 177 tanks on the sprawling 586-square-mile reservation.

The Columbia River borders the Hanford land for almost 50 miles and some of the tanks are as close as five miles (eight kilometers) to the river, the largest in the Pacific Northwest and the source of irrigation for agriculture and drinking water for downstream cities.

Opponents include the Yakama Nation, whose reservation is located 20 miles west of the Hanford site and that has treaty rights to the Chinook salmon that spawn in the Columbia River. The nation wrote in comments to the agency that leaving the waste in unstable shallow land is “simply bad policy.”

Excerpts from Ari Natter, Plan to Leave Buried Nuclear Bomb Waste Underground Draws Fire, Bloomberg, Nov. 29, 2018

American Oligarchs

Warren Buffett, the 21st century’s best-known investor, extols firms that have a “moat” around them—a barrier that offers stability and pricing power. One way American firms have improved their moats in recent times is through creeping consolidation. The Economist has divided the economy into 900-odd sectors covered by America’s five-yearly economic census. Two-thirds of them became more concentrated between 1997 and 2012 (see charts 2 and 3). The weighted average share of the top four firms in each sector has risen from 26% to 32%…

These data make it possible to distinguish between sectors of the economy that are fragmented, concentrated or oligopolistic, and to look at how revenues have fared in each case. Revenues in fragmented industries—those in which the biggest four firms together control less than a third of the market—dropped from 72% of the total in 1997 to 58% in 2012. Concentrated industries, in which the top four firms control between a third and two-thirds of the market, have seen their share of revenues rise from 24% to 33%. And just under a tenth of the activity takes place in industries in which the top four firms control two-thirds or more of sales. This oligopolistic corner of the economy includes niche concerns—dog food, batteries and coffins—but also telecoms, pharmacies and credit cards.
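The bucketing in that paragraph is just the four-firm concentration ratio (CR4): the combined market share of a sector's four largest firms, with thresholds at one-third and two-thirds. A small sketch (ours; the revenue figures are invented) of how a sector would be classified:

```python
# Sketch of the classification described above. CR4 is the combined market
# share of a sector's four largest firms; thresholds follow the article.
# The revenue figures are invented for illustration.

def cr4(firm_revenues: list[float]) -> float:
    top4 = sorted(firm_revenues, reverse=True)[:4]
    return sum(top4) / sum(firm_revenues)

def classify(share: float) -> str:
    if share < 1 / 3:
        return "fragmented"
    if share <= 2 / 3:
        return "concentrated"
    return "oligopolistic"

sectors = {
    "dog food":    [40, 25, 15, 10, 5, 5],       # hypothetical revenues
    "restaurants": [3, 3, 2, 2] + [1] * 90,
    "groceries":   [15, 12, 10, 8] + [1] * 55,
}
for name, revenues in sectors.items():
    share = cr4(revenues)
    print(f"{name:12s} CR4 = {share:.0%} -> {classify(share)}")
```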

The ability of big firms to influence and navigate an ever-expanding rule book may explain why the rate of small-company creation in America is close to its lowest mark since the 1970s … Small firms normally lack both the working capital needed to deal with red tape and long court cases, and the lobbying power that would bend rules to their purposes….

Another factor that may have made profits stickier is the growing clout of giant institutional shareholders such as BlackRock, State Street and Capital Group. Together they own 10-20% of most American companies, including ones that compete with each other. Claims that they rig things seem far-fetched, particularly since many of these funds are index trackers; their decisions as to what to buy and sell are made for them. But they may well set the tone, for example by demanding that chief executives remain disciplined about pricing and restraining investment in new capacity. The overall effect could mute competition.

The cable television industry has become more tightly controlled, and many Americans rely on a monopoly provider; prices have risen at twice the rate of inflation over the past five years. Consolidation in one of Mr Buffett’s favourite industries, railroads, has seen freight prices rise by 40% in real terms and returns on capital almost double since 2004. The proposed merger of Dow Chemical and DuPont, announced last December, illustrates the trend to concentration.

Roughly another quarter of abnormal profits comes from the health-care industry, where a cohort of pharmaceutical and medical-equipment firms make aggregate returns on capital of 20-50%. The industry is riddled with special interests and is governed by patent rules that allow firms temporary monopolies on innovative new drugs and inventions. Much of health-care purchasing in America is ultimately controlled by insurance firms. Four of the largest, Anthem, Cigna, Aetna and Humana, are planning to merge into two larger firms.

The rest of the abnormal profits are to be found in the technology sector, where firms such as Google and Facebook enjoy market shares of 40% or more.

But many of these arguments can be spun the other way. Alphabet, Facebook and Amazon are not being valued by investors as if they are high risk, but as if their market shares are sustainable and their network effects and accumulation of data will eventually allow them to reap monopoly-style profits. (Alphabet is now among the biggest lobbyists of any firm, spending $17m last year.)…

Perhaps antitrust regulators will act, forcing profits down. The relevant responsibilities are mostly divided between the Department of Justice (DoJ) and the Federal Trade Commission (FTC), although some… [But] lots of important subjects are beyond their purview. They cannot consider whether the length and security of patents is excessive in an age when intellectual property is so important. They may not dwell deeply on whether the business model of large technology platforms such as Google has a long-term dependence on the monopoly rents that could come from its vast and irreproducible stash of data. They can only touch upon whether outlandishly large institutional shareholders with positions in almost all firms can implicitly guide them not to compete head on; or on why small firms seem to be struggling. Their purpose is to police illegal conduct, not reimagine the world. They lack scope.

Nowhere has the alternative approach been articulated. It would aim to unleash a burst of competition to shake up the comfortable incumbents of America Inc. It would involve a serious effort to remove the red tape and occupational-licensing schemes that strangle small businesses and deter new entrants. It would examine a loosening of the rules that give too much protection to some intellectual-property rights. It would involve more active, albeit cruder, antitrust actions. It would start a more serious conversation about whether it makes sense to have most of the country’s data in the hands of a few very large firms. It would revisit the entire issue of corporate lobbying, which has become a key mechanism by which incumbent firms protect themselves.

Excerpts from Too Much of a Good Thing, Economist, Mar. 26, 2016, at 23

Scattered Nuclear Waste: 88,000 MT, 33 States, 75 Plants

The broad coalition of labor unions, state public service commissioners, clean energy organizations, and energy trade associations told U.S. House and Senate leaders in a December 4, 2018 letter: “It is time for the federal government to meet its statutory and contractual obligations. Utilities and their electricity customers have done their part.”

The letter notes that the Nuclear Waste Fund—a U.S. Treasury account collected via a fee charged to electric ratepayers over 30 years—today holds a balance of more than $40 billion. The fund is mostly unused, owing to paralysis of the Yucca Mountain project, and it continues to accumulate interest of about $1.7 billion a year from investments in Treasury securities.

About $7.4 billion in damages have now also been paid out from the Treasury’s Judgment Fund to utilities, which have filed lawsuits against the Department of Energy (DOE) since 2000, seeking compensation for defaulting on a standard contract and missing the deadline to begin disposing of highly radioactive spent nuclear fuel as required by the Nuclear Waste Policy Act of 1982. To date, 40 suits have been settled and an additional 57 cases have been resolved, a November 2018 special report from the DOE’s Office of Inspector General noted.

The coalition includes major industry trade groups the Nuclear Energy Institute (NEI), the American Public Power Association, the National Rural Electric Cooperative Association, and the Edison Electric Institute—along with the National Association of Regulatory Utility Commissioners, which is a group of state regulators….According to the NEI, the inventory of used fuel in temporary storage at 75 reactor sites scattered across 33 states has now grown to more than 80,000 metric tons.

Excerpts from Sonal Patel, Industry Groups to Congress: Inaction on Nuclear Waste Not an Option, Power Magazine, Dec. 6, 2018

Nuclear Robots

Robots have been used in nuclear facilities for a long time. Scenarios such as maintenance tasks in nuclear facilities or even disasters like radioactive leaks or search and rescue operations have proven to be quite successful. We are talking about robotic arms or remote-operated vehicles with some end effectors built in to handle dangerous situations.

1986: Chernobyl’s robot trouble–During the Chernobyl nuclear incident, the Soviet authorities in charge of cleaning up nuclear waste developed around 60 unique remote-controlled robots to spare human workers from radioactive exposure. The total cost of the clean-up operation was $2bn. Designs included the STR-1 robot, which resembles a moon buggy. It was placed on the roof of the nuclear plant and used to clean up parts of the destroyed reactor. Another design for the purpose of debris cleaning was the Mobot, developed by Moscow State University. It was a smaller version of a loader used in construction, with a front-end bucket used to scoop up debris.

The problem was that cleaning up nuclear waste required more skills than the robots could provide, eventually resulting in the authorities sending in soldiers to perform most of the decontamination works. Radiation was so high that each worker could only spend 40 seconds inside or near the facility; 31 died from exposure, while 237 suffered from acute radiation sickness.

2008: Cleaning up nuclear waste at Hanford Nuclear Reservation. The Hanford Nuclear Reservation in the US has been somewhat of a hub for nuclear waste innovation. This is because scientists, and their robot friends, are faced with the task of emptying nuclear and chemical waste tanks the size of around 150 basketball courts before the waste reaches the Columbia River. Exposure to the material would kill a human instantly.

Luckily, Hanford has developed a few automated machines that are specifically designed for different parts of the job. Take Foldtrack, for example, which can access the tanks through one-foot-wide pipes in the roof by splitting into a string of pieces, and then rebuilding itself like a Transformer once inside. The remote-controlled robot uses a 3,000 psi water cannon to blast nuclear sludge off the walls of the tank and pump it out. Upon completion, scientists are forced to leave the $500,000 robot in the tank due to the high levels of contamination.

Another robot, the Sand Mantis, looks like a fire hose on wheels. However, it comes packed with power, with the ability to blast tough toxic salts that build up in waste tanks with its 35,000 psi water cannon. For comparison, a regular firehose has around 300 psi of pressure. In order to support the huge power, the orifice of the hose is made of gems, such as sapphires, which can withstand the pressure….Finally, the Tandem Synthetic Aperture Focusing Technique, or Tank Crawler, locates cracks or corrosion in Hanford’s waste storage tanks using ultrasonic and electrical conductivity sensors.

2011: Fukushima—Previously designed robots failed to visually inspect the reactor, either breaking due to high radiation or by getting stuck in the confined spaces. That was until Toshiba’s senior scientist in its technology division, Kenji Matsuzaki, developed the Little Sunfish – an amphibious bread loaf-sized robot that could slip into the 5.5-inch reactor pipelines.

In 2017, at the Sellafield nuclear site in the UK, scientists were working on methods to clean up the vast amounts of nuclear sludge from its First-Generation Magnox Storage Pond, as part of decommissioning efforts said to cost around £1.9bn each year. The size of two Olympic swimming pools, the storage pond contains large amounts of nuclear sludge from decaying fuel rods stored below the surface. While robots have been designed to reach the depths of the pond and remove nuclear waste, none proved to be very successful, until Cthulhu – Collaborative Technology Hardened for Underwater and Littoral Hazardous Environment. Cthulhu is a tracked robot that can move along the bottom of the storage pond, using whisker-like sensors and sonar to identify and retrieve the nuclear rods.

2018: The South West Nuclear Hub at the University of Bristol in the UK is collaborating with Sellafield to develop a nuclear waste robotic suit for humans, taking inspiration from the comic book hero Iron Man.

Excerpts from Cherno-bots to Iron Man suits: the development of nuclear waste robotics, Power-Technology.com, Dec. 4, 2018

Where to Go? Plutonium from Nuclear Weapons

The lack of space at the federal government’s only underground nuclear waste repository is among several challenges identified by the National Academy of Sciences, which is looking at the viability of disposing of tons of weapons-grade plutonium. The National Academies of Sciences, Engineering, and Medicine released a preliminary report on the U.S. government’s plan, which calls for diluting 34 metric tons of plutonium and shipping it to the Waste Isolation Pilot Plant (WIPP) in southern New Mexico.

The disposal of plutonium has to do with a  pact signed between the United States and Russia. That pact was based on a proposal for turning the surplus plutonium into fuel that could be used for commercial nuclear reactors. That project, beset by years of delays and cost overruns, was cancelled in early 2018.

If the plan were to be approved, the Energy Department has estimated that it would take 31 years to dilute and dispose of all 34 metric tons. The work would involve four sites around the U.S. — the Pantex Plant in West Texas, the Savannah River Site in South Carolina, Los Alamos National Laboratory in northern New Mexico and the Waste Isolation Pilot Plant.

The panel of scientists found that the agency doesn’t have a well-developed plan for reaching out to those host sites and stressed that public trust would have to be developed and maintained over the life of the project.

Excerpts from Scientists: Capacity at US nuclear waste dump a challenge, Associated Press, Nov. 30, 2018

The Sanctions Busters: Germany and France

The steps by Europe’s most powerful countries are part of their campaign to salvage the 2015 Iran nuclear deal after President Trump withdrew the U.S. in May. Their goal is to help European companies continue some business activity with Iran despite sweeping new U.S. sanctions on the country and any company that does business with it.

France or Germany will host the corporation that would handle the payments channel, the diplomats said. If France hosts it, a German official will head the corporation and vice versa. Both countries will help fund the corporation.  The payments channel, known as a special purpose vehicle, or SPV, would use a system of credits to facilitate compensation for goods traded between Iran and Europe—allowing some trade to proceed without the need for European commercial banks to make or receive payments to Iran.
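To illustrate the idea in the simplest possible terms (this is our own sketch, not the SPV's actual design; all parties and amounts are invented), the vehicle would in effect act as a clearing house: European exporters to Iran are paid out of the credits owed by European importers of Iranian goods, so money changes hands only within Europe and only a residual balance is left unmatched.

```python
# Highly simplified illustration of the clearing idea behind the SPV (our own
# sketch, not the actual mechanism). All parties and amounts are invented.

trades = [
    # (european_party, direction, value_in_eur)
    ("EU_pharma_exporter",    "export_to_iran",   8_000_000),
    ("EU_machinery_exporter", "export_to_iran",   5_000_000),
    ("EU_oil_importer",       "import_from_iran", 9_000_000),
    ("EU_food_importer",      "import_from_iran", 3_000_000),
]

owed_to_exporters = sum(v for _, d, v in trades if d == "export_to_iran")
owed_by_importers = sum(v for _, d, v in trades if d == "import_from_iran")

# The SPV matches the two sides in credits; only the residual would need
# settling by some other route.
cleared = min(owed_to_exporters, owed_by_importers)
residual = owed_to_exporters - owed_by_importers

print(f"Cleared inside Europe: EUR {cleared:,}")
print(f"Unmatched balance:     EUR {residual:,}")
```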

U.S. pressure on Austria and Luxembourg recently prompted those countries to reject European Union requests to host it, raising the prospect that the initiative might collapse, the diplomats said.  The company would be owned directly by participating European governments—an arrangement intended to dissuade the U.S. from directly targeting it with sanctions, diplomats said.

Laurence Norman, France and Germany Step In to Circumvent Iran Sanctions, WSJ, Nov. 26, 2018

Preserving the Snow Leopard for Eternity

The breeding of the highly endangered snow leopard in the Himalayan nature park in Himachal Pradesh (India) is set to begin, with zoo authorities in Darjeeling agreeing to lend it a pair. “The Padmaja Naidu Himalayan Zoological Park in Darjeeling is providing us a pair of snow leopards for conserving bloodlines of the highly endangered species in the participatory zoos,” state Chief Wildlife Warden S.S. Negi told IANS….

In 2004, snow leopard Subhash and his sibling Sapna were brought to Kufri, 15 km from the state capital Shimla, from Darjeeling under an exchange programme. Officials said the breeding programme couldn’t be initiated as they belonged to the same bloodline. Sapna died of disease in 2007…

The Darjeeling zoo is internationally recognised for its 33-year-old conservation breeding programme for the snow leopard, with 56 births.

Forest Minister Thakur Singh Bharmouri said the central government-funded Snow Leopard Conservation Project of Rs.5.15 crore ($758,000) is under way in the Spiti Valley, which lies in the state’s northernmost part and runs parallel to Tibet. The programme would take care of restoring the snow leopard’s habitat, he said. Studies by the state wildlife department show the presence of seven to eight snow leopards per 100 sq km in the Spiti Valley. The department is already monitoring the habitat, range and behaviour of snow leopards in the Valley through camera traps (automatic cameras). As per the information gleaned from these devices, the snow leopard population is estimated to be 28 in Spiti and its nearby areas, and 29 in the rest of the state.

“We will soon start radio-collaring five to six snow leopards in Spiti and other areas to monitor their behaviour and, of course, habitat and range,” an official of the state’s wildlife wing told IANS.  Each radio collar costs around Rs.300,000 and can send signals for at least 18 months. “But the cost of procuring data sent through radio collars is quite expensive,” he said.

The problem in starting the radio-collar installations is the non-availability in India of the tranquillising drugs prescribed by our international partner, the Snow Leopard Trust.

Excerpt from Himachal to begin breeding the highly-endangered snow leopards,  India Live Today, June 28, 2016

The New Oil – Lithium

As demand heats up for lithium, a group of companies are hastening efforts to shine a light into the long-opaque market for the battery material that metal-industry cheerleaders call the “new oil.” … Auto makers, battery companies, and smartphone and laptop providers have been racing to lock down supplies of lithium from major producers such as Albemarle Corp. of the United States, the world’s biggest miner of lithium by volume, and Chilean company Sociedad Quimica y Minera de Chile, the No. 2 producer. Some of the world’s notable lithium users include Apple Inc., Samsung Electronics Co. and Tesla Inc.

The surge in demand has sparked efforts to bring transparency to prices for lithium. …Because lithium isn’t traded on any exchange—unlike gold or silver, for instance—buyers have long been at a disadvantage in negotiations with producers, according to market watchers. In opaque markets, producers often have greater access to information about fast-moving market dynamics, such as unintended mine outages or suddenly sagging demand. That is especially the case with lithium, a metal mined by a relatively small group of big suppliers in countries from Chile to Australia…Big lithium miners “may say they support transparency, but they really don’t,” said Chris Berry, founder of New York commodity consultant House Mountain Partners. “Keeping prices secret between themselves and their end users is good for them.”

Excerpts from Scott Patterson, Lithium Boom Raises Question: What Is Its Price?, WSJ, Nov. 27, 2018

Killing Machines: Tiny Spy Satellites

As long as we’ve been launching spy satellites into space, we’ve been trying to find ways to hide them from the enemy. Now, thanks to the small satellite revolution—and a growing amount of space junk—America has a new way to mask its spying in orbit…

The National Reconnaissance Office, the operator of many of the U.S.’s spy sats, refused to answer any questions about ways to hide small satellites in orbit. In 2014, Russia launched a trio of communications satellites. Like any other launch, spent stages and space debris were left behind in space. Air Force Space Command dutifully catalogued them, including a nondescript piece of debris called Object 2014-28E. Nondescript until it started to move around in space, that is. One thing about orbits: they are supposed to be predictable. When something moves in an unexpected way, the debris is not debris but a spacecraft. And this object was flying close to the spent stages, maneuvering to get closer. This fueled speculation that the object could be a prototype kamikaze-style sat killer. Other less frantic speculation postulated that it could be used to examine other sats in orbit, either Russia’s or those operated by geopolitical foes. Either way, the lesson was learned…

Modern tracking radar is supposed to map space junk better than ever before. But small spy satellites that will hide in the cloud of space debris may go undetected, even by the most sophisticated new radar or Earth-based electronic signals snooping.

Excerpts from Joe Pappalardo, Space Junk Could Provide a Perfect Hiding Spot for Tiny Spy Satellites, Popular Mechanics, Nov. 30, 2018

Killing US Enemies: Covert Operations

The U.S. has some of the best special operations units in the world, but they can’t do everything on their own. The American military relies on allied special operators from places like Britain, Iraq, and Israel to collect intelligence and kill enemy insurgents and soldiers. Here are 6 of those special operations commands.

1. SAS and SBS (UK)
These could obviously be two separate entries, but we’re combining them here because they’re both British units that often operate side-by-side with U.S. forces, just with different missions and pedigrees. The Special Air Service (SAS) pulls from the British Army and focuses on counter-terrorism and reconnaissance. The Special Boat Service (SBS) does maritime counter-terrorism and amphibious warfare (but will absolutely stack bodies on land, too).

2. Sayeret Matkal (Israel)
Israel’s Sayeret Matkal has generated rumors and conjecture for decades, and it’s easy to see why when you look at their few public successes…. The commandos in the unit are skilled in deception, direct action, and intelligence gathering…One of their most public recent successes came when they led a daring mission to install listening devices in ISIS buildings, learning of a plan to hide bombs in the battery wells of laptops.

3. French Special Operations Command
French special operations units are even more close-mouthed than the overall specops community…

4. Kommando Spezialkräfte (Germany)
The commandos have reportedly deployed to Syria in recent years to fight ISIS.

5. Iraqi Counter Terrorism Service

6. Afghan National Army Commando Corps
It’s even capable of the rapid nighttime raids that U.S. forces became famous for when they were in the lead in that country…Afghanistan also has the Ktah Khas, a counter-terrorism unit known for daring raids like their 2016 rescue of 59 prisoners in a Taliban hideout.

Logan Nye, We Are The Mighty: 6 foreign special operations units the US relies on to collect intelligence and kill enemy insurgents, Business Insider, Nov. 30, 2018

 

Undersea Drones: Military

Currently, manipulation operations on the seabed are conducted by Remotely Operated Vehicles (ROVs) tethered to a manned surface platform and tele-operated by a human pilot. Exclusive use of ROVs, tethered to manned ships and their operators, severely limits the potential utility of robots in the marine domain, due to the limitations of ROV tether length and the impracticality of wireless communications at the bandwidths necessary to tele-operate an underwater vehicle at such distances and depths. To address these limitations, the Angler program will develop and demonstrate an underwater robotic system capable of physically manipulating objects on the sea floor for long-duration missions in restricted environments, while deprived of both GPS and human intervention.
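The excerpt does not say how an Angler vehicle would navigate once deprived of GPS; one standard underwater fallback is dead reckoning, integrating position from onboard heading and speed sensors (for example a compass/IMU plus a Doppler velocity log). The sketch below is ours, with invented sensor readings, and only illustrates why such estimates drift over long missions.

```python
# Minimal dead-reckoning sketch (illustrative only; the excerpt does not
# describe Angler's navigation). Position is integrated from heading and speed,
# as a vehicle might do with a compass/IMU and a Doppler velocity log. Without
# external fixes (GPS, acoustic beacons), small sensor errors accumulate.
import math

def dead_reckon(x, y, heading_deg, speed_mps, dt_s):
    """Advance an (x, y) position estimate by one leg of constant heading/speed."""
    heading = math.radians(heading_deg)
    return (x + speed_mps * math.sin(heading) * dt_s,   # east component
            y + speed_mps * math.cos(heading) * dt_s)   # north component

# Invented mission log: (heading in degrees, speed in m/s, duration in s)
legs = [(90, 1.5, 600), (180, 1.0, 300), (45, 1.2, 900)]

x = y = 0.0
for heading, speed, dt in legs:
    x, y = dead_reckon(x, y, heading, speed, dt)
print(f"estimated position: {x:.0f} m east, {y:.0f} m north of the start")

# Over these ~2.3 km of travel, a 1% speed bias alone would already shift the
# estimate by roughly 20 m, which is why long seabed missions need other aids.
```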

The Angler program seeks to migrate advancements from terrestrial and space-based robotics, terrestrial autonomous manipulation, and underwater sensing technologies into the realm of undersea manipulation, with specific focus on long-distance, seabed-based missions. Specifically, the program aims to discover innovative autonomous robotic solutions capable of navigating unstructured ocean depths, surveying expansive underwater regions, and physically manipulating manmade objects of interest.

Excerpts from DARPA Angler Program, Nov. 2018