
Sustainable Solutions - Home of News, Events & Ideas for Inquisitive Green Minds
an essential roadmap for your ultimate eco-journey...
Recent Entries 
Chemist Rebecca Abergel and her colleagues have found a way to remove radioactive contaminants from the body. Now they are trying to put their solution in a pill.

After the US dropped atomic bombs on the cities of Hiroshima and Nagasaki at the end of World War Two, more than 100,000 people died — many from exposure to radiation. At high doses, radiation blasts through tissues, ruptures DNA strands and alters the rhythms of cell division. Disrupted cells cause nausea, diarrhea and fever, then dizziness, weakness and hair loss. Over time they may turn cancerous. For a person who has experienced high levels of radiation in a short period of time, from a nuclear weapon or a Chernobyl-scale nuclear power plant meltdown, treatment options are limited. Doctors can make a patient comfortable, treat burns and nausea, and try to keep the radiation from spreading. But an acute dose is usually fatal. The problem is, once radiation is in the body, it can be very hard to get out.

It may not have to be this way. Nuclear weapons and accidents commonly release actinides, a group of radioactive elements at the bottom of the periodic table. Actinides such as plutonium, uranium and curium easily lock into our bones and organs, where they can emit radiation into our bodies for decades. Chemist Rebecca Abergel and colleagues at Lawrence Berkeley National Laboratory, in California, have created molecules that bind to actinides to form large, stable complexes that are easier for the body to expel. She (TEDxAix Talk: Ambivalent radionuclides) shares more on the idea, not least that it’s been around for more than 60 years.

Hey, plutonium, you’re coming with me. Abergel’s team is developing chelators, naturally occurring molecules that can form multiple bonds to a single metal ion. Chelators are a long-established treatment for poisoning by heavy metals such as iron, arsenic and lead. (They have also been tried as risky “alternative” treatments for conditions like heart disease and autism; many doctors consider these uses of chelation to be controversial.) Abergel and her team at Berkeley have made a chelator that binds to actinides — without interfering with other metals that we need in our bodies, such as zinc or iron. The work is a continuation of research that started back in the 1950s, pretty much once people started realizing the potential havoc of nuclear warfare.

Building a chelator requires educated guessing and checking. To make their chelators, Abergel’s team worked with scaffolds inspired by the known molecular structures of iron-binding chelators in bacteria. They tinkered with parts of the scaffolds step by step, modifying properties such as the acidity and number of binding sites through a series of chemical reactions. They also tacked on additional arms. “Actinides are bigger than iron, so you need more coordinating atoms,” Abergel explains. Over the last 30 years, she and her colleagues have tested dozens of different structures. One removed 80 percent of contaminating plutonium from mice, in two days, with just a single dose.

Human trials for a radiation poisoning cure. Abergel’s team has already demonstrated the safety and efficacy of their chelators in human cell cultures and several animal models. Last year they received FDA approval for a clinical trial testing the safety of a chelator treatment in humans. The trial will give healthy volunteers a single dose of the chelator, starting at extremely low levels and escalating gradually, to establish the safety of the chelators and pinpoint any side effects. The researchers will never do a “controlled” clinical trial on people, because that would require contaminating humans with radioactive substances. Instead, they will compare their human trial data with their animal data to establish safe, effective doses in humans.

Now, to make an anti-radiation poisoning pill. It would be possible to deliver chelators via a pill or an injection, but Abergel much prefers the pill. “If there is a mass casualty where millions of people are contaminated, you don’t want to be handing out needles,” she says. “Pills are straightforward to distribute, and can easily be crushed into a powder and mixed into something like yogurt for children or elderly people.”

The imperfect science of determining dosage. Much of the work that lies ahead for Abergel is nailing down the details of how to administer such a pill. “In animal models we’ve been looking at different contamination and treatment scenarios,” she says. “If there’s a nuclear power plant accident, what happens if we give the treatment 24 hours after contamination? Two days after? Should people take one pill every day for two weeks or twice a week for one? When do we stop treatment?” Her team is also looking at uses outside large-scale catastrophes — for instance, how the drug might benefit people who are regularly exposed to small amounts of radioactivity, such as scientists, nuclear power plant workers and uranium miners. Such everyday use would be a first for Abergel: normally, the best-case scenario for her product is that it never be used at all. (Until now, her work has relied on funding from the US government, which would add any resulting decontamination drug to its stockpile for emergency use.)

Chelator as a legitimate cancer treatment? Abergel is now experimenting with using synthesized chelators as a way to introduce therapeutic actinides to the body to treat cancer. Her idea: to attach the actinide-chelator complex to another molecule, such as an antibody, that can recognize and attach to a cancer cell. To minimize damage to healthy cells, Abergel is choosing actinides that decay and exit the body quickly. Doctors could also add tracer compounds to the actinide-chelator complex, which would show as fluorescent under certain light. “So these new platforms could target the cancer cell, bring in the radionuclide that destroys everything around it, and we can also image it,” Abergel says. It is very, very early days here — don’t ask your doctor about it quite yet. But if it works, then that of course is one treatment that definitely would be used.

As pollution from burning fossil fuels continues to heat the atmosphere, the world’s glaciers are melting at an accelerating rate. Scientists widely agree that this meltwater has been a major factor in raising global sea levels about seven inches over the 20th century.

The movement of all that water is affecting the Earth’s rotation, according to a study published Friday in the journal Science.

“If you are melting glaciers from high latitudes—in Alaska, Greenland, or Iceland—you move mass away from the pole, toward the equator, which slows the Earth down,” said Jerry Mitrovica, the study’s lead author and a Harvard geophysicist who specializes in studying sea level change. “The change in the distribution of the mass from the poles to lower latitudes also causes the rotation to wobble slightly, because it’s being redistributed unequally.”

That change in rotation added roughly a millisecond to the course of a day over the 20th century.

The study showed that “glacier melt over the 20th century would have increased the duration of a day by maybe a millisecond,” said Mitrovica. While small, the change “can tell you something about how things are melting,” he said, and is a striking example of how human activities are altering Earth’s systems.
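The mechanism Mitrovica describes can be sketched with a back-of-envelope calculation: by conservation of angular momentum, moving mass from the poles toward the equator increases Earth's moment of inertia, so the spin rate drops and the day lengthens by roughly dT/T = dI/I. This is not the study's calculation, just a rough upper-bound check using public values for Earth's radius, moment of inertia, and the article's seven inches of sea level rise.

```python
# Rough upper-bound sketch (not the study's model): if all the meltwater were
# moved from the spin axis to the equator, how much would the day lengthen?
R_EARTH = 6.371e6   # Earth's mean radius, m
I_EARTH = 8.0e37    # Earth's moment of inertia, kg*m^2
DAY = 86400.0       # length of a day, s

# ~7 inches (0.18 m) of 20th-century sea level rise spread over the ocean
# surface (~3.6e14 m^2) corresponds to this much redistributed water mass:
water_mass = 0.18 * 3.6e14 * 1000.0   # kg, at ~1000 kg/m^3

# Each kilogram parked at the equator adds R^2 to the moment of inertia.
delta_I = water_mass * R_EARTH**2
delta_day_ms = (delta_I / I_EARTH) * DAY * 1000.0

print(f"day lengthens by at most ~{delta_day_ms:.1f} ms")
```

This upper bound comes out at a few milliseconds; since the meltwater actually spreads across all latitudes rather than piling up at the equator, the real effect is a fraction of that, consistent with Mitrovica's "maybe a millisecond."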

For the first time, researchers have mathematically detected glacier meltwater’s movement from high latitudes to the ocean basins in the speed of the Earth’s rotation.

The finding is yet more proof that greenhouse gas emissions are having an enormous impact on the Earth’s climate. “This very subtle effect, Earth’s rotation, is a way to monitor how much ice sheets are melting and adds to how we are monitoring sea level rise,” Mitrovica said.

“And more important, it shows that even these very subtle effects support the scientific consensus that we’re affecting the climate, and that effect is accelerating,” he added.

In the study, Mitrovica and his colleagues updated and corrected how scientists calculate the known impacts on the Earth’s spin and axis resulting from the end of the last ice age. In the process, they may have solved a 13-year-old conundrum in earth science called “Munk’s enigma,” which asked why the influx of water from melting glaciers wasn’t apparent in measurements of the Earth’s rotation.

“It is an important additional tool in understanding the mass balance of ice sheets, and we need as many of those as we can get,” Mitrovica said.


Related Link:
The Great Glacier Melt Spreads to Greenland’s North
Start-ups are behind the new push. Hydrogen, the universe’s most abundant element, is the fuel for any potential fusion reactor.

The machine lives in a white building in an Orange County office park so uninteresting-looking that not even the person who’s supposed to be taking me there can find it. We literally drive right past it and have to double back.

Though there are a few clues if you look closely. A towering silo of liquid nitrogen out back. A shed that turns out to be full of giant flywheels for storing energy. The machine, which is the size of a small house, draws so much juice that when they turn it on they have to disconnect from the public grid and run off their own power to keep from shorting out the whole county. If you had X-ray vision you might notice that all the iron rebar in the building’s foundations has been pulled out and replaced with stainless-steel rebar, because iron is too magnetic.

The machine is a prototype fusion reactor. It is the sole product of a small, secretive company called Tri Alpha Energy, and when it or one like it is up and running, it will transform the world as completely as any technology in the past century. This will happen sooner than you think.

It’s not the world’s only fusion reactor. There are several dozen scattered around the globe in various stages of completion. Most of them are being built by universities and large corporations and national governments, with all the blinding speed, sober parsimony and nimble risk taking that that implies. The biggest one, the International Thermonuclear Experimental Reactor, or ITER, is under construction by a massive international consortium in the south of France, with a price tag of $20 billion and a projected due date of 2027. Fusion research has a reputation for consuming time, money and careers in huge quantities while producing a lot of hype and not much in the way of actual fusion. It has earned that reputation many times over.

But over the past 10 years, a new front has opened up. The same engine of raging innovation that’s been powering the rest of the high-tech economy, the startup, has taken on the problem of fusion. There is now a stealth scene of virtually unknown companies working on it, doing the kind of highly practical rapid-iteration development you can do only in the private sector. They’re not funded by cumbersome grants; the money comes from heavy-hitting investors with an appetite for risk. These are companies most people have never heard of, like General Fusion, located outside Vancouver, and Helion Energy in Redmond, Wash. Tri Alpha is so low profile, it didn’t even have a website until a few months ago. But you’ve probably heard of the people who invest in them: Bezos Expeditions, Mithril Capital Management (a.k.a. PayPal co-founder Peter Thiel), Vulcan (a.k.a. Microsoft co-founder Paul Allen), Goldman Sachs.

The endgame for these companies isn’t acquisition by Google followed by a round of appletinis. It’s an energy source so cheap and clean and plentiful that it would create an inflection point in human history, an energy singularity that would leave no industry untouched. Fusion would mean the end of fossil fuels. It would be the greatest antidote to climate change that the human race could reasonably ask for. Saving the world: that is the endgame.

Michl (you say it like Michael) Binderbauer is one of the co-founders of Tri Alpha and its current chief technology officer. He has a Ph.D. in physics from U.C. Irvine. At 46, Binderbauer is charismatic and ultra-focused: he can talk about plasma physics, lucidly and without notes, apparently indefinitely. (We took a break after two hours.) The logical force of his arguments is enhanced by his radiant self-confidence, a trait that the fusion industry seems to select for, and by his Austrian accent–he grew up there–which inevitably reminds one of the Terminator.

Binderbauer’s confidence is infectious. Tri Alpha is probably the best-funded of the private fusion companies–to date it has raised hundreds of millions, according to a source close to the company, which is a lot of money but a tiny fraction of what’s being spent on the big government-funded projects.

One of the challenges for anybody working on fusion is that people have been talking about it way too much for way too long. The theoretical underpinnings go back to the 1920s, and serious attempts to produce fusion energy on Earth have been going on since the 1940s. Fusion was already supposed to save the world 50 years ago. “All of us fantasize about such things,” Binderbauer says. “It seems like it is the answer, so when someone says anything in that field, it usually very quickly exponentiates to a message of, Progress is already almost done. It gets hyped to a level I think is very dangerous.” (That’s one reason fusion scientists don’t love talking to journalists.)

Fusion also gets mixed up, for obvious reasons, with nuclear fission, which is the kind of nuclear power we have now, though in fact they’re very different animals. Nuclear fission involves splitting atoms, big ones like uranium-235, into smaller atoms. This releases a lot of energy, but it has a lot of drawbacks too. Uranium is a scarce and finite resource, and nuclear plants are expensive and hazardous–Three Mile Island, Chernobyl, Fukushima–and produce huge quantities of toxic waste that stays hazardously radioactive for centuries.

Nuclear fusion is the reverse of nuclear fission: instead of splitting atoms, you’re squashing small ones together to form bigger ones. This releases a huge burst of power too, as a fraction of the mass of the particles involved gets converted into energy (in obedience to Einstein’s famous E=mc²). Fusion has a vaguely science-fictional reputation, but in fact we watch it happen all day every day: it’s what makes the sun shine. The sun is a titanic fusion reactor, constantly smooshing hydrogen nuclei together into heavier elements and sending us the by-product in the form of sunlight.
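To see the scale of that mass-to-energy conversion, here is a minimal worked example of E=mc² for one commonly discussed reaction, deuterium + tritium → helium-4 + neutron. (The article doesn't say which fuel cycle any particular company uses; D-T is just the textbook case.) Masses are in atomic mass units, where 1 u is equivalent to about 931.5 MeV.

```python
# Mass defect of the D-T fusion reaction: the reaction products weigh
# slightly less than the fuel, and the difference comes out as energy.
m_deuterium = 2.014102   # atomic mass units (u)
m_tritium   = 3.016049
m_helium4   = 4.002602
m_neutron   = 1.008665

MEV_PER_U = 931.494      # energy equivalent of 1 u, in MeV

mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect * MEV_PER_U
fraction = mass_defect / (m_deuterium + m_tritium)

print(f"{energy_mev:.1f} MeV released, {fraction:.2%} of the fuel mass")
```

About 17.6 MeV per reaction, from converting less than half a percent of the fuel's mass: tiny per atom, enormous per gram.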

As an energy source, fusion is so perfect, it could have been made up by a child. It produces three to four times as much power as nuclear fission. Its fuel isn’t toxic, or fossil, or even particularly rare: fusion runs on common elements like hydrogen, which is in fact the most plentiful element in the universe. If something goes wrong, fusion reactors don’t melt down; they just stop. They produce little to no radioactive waste. They also produce no pollution: the by-product of fusion is helium, which we can use to inflate the balloons for the massive party we’re going to have if it ever works.

Daniel Clery puts the contrast with conventional power starkly in his excellent history of fusion, A Piece of the Sun: “A 1-GW coal-fired power station requires 10,000 tonnes of coal–100 rail wagon loads–every day. By contrast … the lithium from a single laptop battery and the deuterium from 45 liters of water could generate enough electricity using fusion to supply an average U.K. consumer’s energy needs for 30 years.”
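The deuterium half of Clery's comparison can be sanity-checked with simple arithmetic, under two stated assumptions: a D-T fuel cycle releasing about 17.6 MeV per deuterium nucleus consumed, and natural deuterium abundance of roughly 1 in 6,400 hydrogen atoms.

```python
# Rough check: how much fusion energy is locked in the deuterium
# in 45 liters of ordinary water? (Assumes a D-T cycle at ~17.6 MeV
# per deuteron and ~1 deuterium per 6,400 hydrogen atoms.)
AVOGADRO = 6.022e23
J_PER_MEV = 1.602e-13

liters = 45.0
water_moles = liters * 1000.0 / 18.0        # grams of water / molar mass
hydrogen_atoms = water_moles * 2 * AVOGADRO # two H per H2O
deuterium_atoms = hydrogen_atoms / 6400.0

energy_joules = deuterium_atoms * 17.6 * J_PER_MEV
energy_mwh = energy_joules / 3.6e9          # 1 MWh = 3.6e9 J

print(f"~{energy_mwh:.0f} MWh from the deuterium in 45 L of water")
```

That works out to a few hundred megawatt-hours, or roughly ten megawatt-hours per year over 30 years: the right order of magnitude for one consumer's energy needs, as Clery claims.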

The running joke about fusion energy is that it’s 30 years away and always will be. It’s not a very funny joke, but historically it’s always been true.

What makes fusion hard is that atomic nuclei don’t particularly want to fuse. Atomic nuclei are composed of protons (and usually neutrons), so they’re positively charged, and like charges repel each other. You have to force the atoms together, and to do that you have to heat them up to the point where they’re moving so fast that they shake off their electrons and become a weird cloud of free-range electrons and naked nuclei called a plasma. If you get the plasma really hot, and/or smoosh it hard enough, some of the nuclei bang into each other hard enough to fuse.

The heat and pressure necessary are extreme. Essentially you’re trying to replicate conditions in the heart of the sun, where its colossal mass–330,000 times that of Earth–creates crushing pressure, and where the temperature is about 15 million degrees Celsius. On Earth, without that crushing pressure to help, fusion only becomes feasible at around 100 million degrees Celsius.
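The figure of 100 million degrees can be translated into particle energies with the Boltzmann constant, which is how plasma physicists usually quote it. This is a rule-of-thumb conversion only; the detailed fusion-rate physics also involves quantum tunneling and the high-energy tail of the particle distribution.

```python
# Convert a plasma temperature into a characteristic thermal energy kT.
K_BOLTZMANN_EV = 8.617e-5   # Boltzmann constant, eV per kelvin

def thermal_energy_kev(temperature_k: float) -> float:
    """Characteristic thermal energy kT at a given temperature, in keV."""
    return K_BOLTZMANN_EV * temperature_k / 1000.0

print(f"sun's core (~15 million K): ~{thermal_energy_kev(15e6):.1f} keV")
print(f"fusion on Earth (100 million K): ~{thermal_energy_kev(1e8):.1f} keV")
```

At 100 million degrees the nuclei carry several kiloelectronvolts of thermal energy each, which is what it takes for enough of them to punch through their mutual electrical repulsion.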

That’s the first problem. The second problem is that your fuel is in the form of a plasma, and plasma, as mentioned above, is weird. It’s a fourth state of matter, neither liquid nor solid nor gas. When you torture plasma with temperatures and pressures like these, it becomes wildly unstable and writhes like a cat in a sack. So not only do you have to confine and control it, and heat it and squeeze it; you have to do all that without touching it, because at 100 million degrees, this is a cat that will instantly vaporize solid matter.

You see the difficulty. Essentially you’re trying to birth a tiny star on Earth. “It comes down to two challenges,” Binderbauer says. “Long enough and hot enough.” In other words: Can you keep your plasma stable while you’re getting it up to these crazy temperatures? The severity of the challenge has given rise to some of the most complex, most extreme technology humans have ever created.

Take for example the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory, outside San Francisco. A 10-story building with a footprint the size of three football fields, the NIF houses one of the most powerful laser systems in the world: 192 beams of ultraviolet light capable of delivering 500 trillion watts, which is about 1,000 times as much power as the entire U.S. is using at any given moment. All that energy is delivered in a single shot lasting 20 billionths of a second focused on a tiny gold cylinder full of hydrogen. The cylinder, understandably, simultaneously explodes and implodes, and the hydrogen inside it fuses. This technique is called inertial confinement fusion.

A more common method for creating fusion is by controlling the plasma magnetically. One of the few breaks physicists catch in the quest for fusion is that plasmas are extremely sensitive to electromagnetism, to the point where electromagnetic fields can actually be used to contain and compress them without physically touching them. It’s a feat most often performed using a device called a tokamak. (The word is a Russian acronym.) A tokamak is a big hollow metal doughnut wrapped in massively powerful electromagnetic coils. The coils create a magnetic field that contains and compresses the plasma inside the doughnut.

Since they were developed in the Soviet Union in the 1950s, tokamaks have come to dominate fusion research: in the 1980s enormous tokamaks were built at Princeton and in Japan and England, at a cost of hundreds of millions of dollars. Their successor, the colossus of all tokamaks, is being built in a small town in France outside Marseilles. ITER, the International Thermonuclear Experimental Reactor, will be 30 meters tall and weigh 23,000 tons. Its staff numbers in the thousands. It will hold 840 cubic meters of plasma. Its magnets alone will require 100,000 kilometers of niobium-tin wire. Its stupendous cost is being paid by a global consortium that includes the U.S., Russia, the European Union, China, Japan, South Korea and India.

Because of their extreme size and complexity, and the political vagaries associated with their funding, fusion projects are bedeviled by cost overruns and missed deadlines. The NIF was finished seven years late for $5 billion, almost double the original budget. ITER’s estimated date for full power operation has slipped from 2016 to 2027, and even that date is under re-evaluation. Its price tag has gone from $5 billion to $20 billion; for purposes of comparison, the Large Hadron Collider cost $4.75 billion.

The goal for all these machines is to pass the break-even point, where the reactor puts out more energy than it takes to run it. The big tokamaks came close in the 1990s, but nobody has quite done it yet, and some scientists find the pace frustrating. “Academics aren’t necessarily good at adhering to a schedule, promising something and delivering it, on budget and on time,” Binderbauer says. “The federal process doesn’t condition you to live in that mind-set.” And even when it does get up and running, ITER will never supply a watt of power to the grid. It’s a science experiment, not a power plant. Proof of concept only.

Fusion research is too slow, too cautious, too focused on lavishing too much money on too few solutions and too many tokamaks. “In a university lab the name of the game, the end product, is a paper,” says Michel Laberge, founder of General Fusion in Vancouver, who has a Ph.D. in physics. “You want to get to making energy, but it’s not the primary goal. The primary goal is to publish a lot of papers, to go to conferences and understand very thoroughly all the little details of what is going on.” Understanding is all well and good, in an ideal world, but the real world is getting less ideal all the time. The real world needs clean power and lots of it.

The driving force behind the founding of Tri Alpha was a physicist at U.C. Irvine named Norman Rostoker. Rostoker, who died in 2014, was a plasma physicist with both a deep understanding of mathematics and a flair for practical applications. He also had an indomitable will and a pronounced independent streak–anybody who talks about him ends up using the word maverick sooner or later. Binderbauer was one of his protégés.

Even in the early 1990s, Rostoker was skeptical of the tokamak hegemony. In a tokamak, the particles in the plasma move in tight spiral orbits around magnetic field lines. But it’s hard to keep those particles from being bumped out of their little orbits by electromagnetic turbulence, and when that happens the plasma becomes unstable and loses precious heat. One way scientists fight this instability is by building bigger and bigger tokamaks, but bigger means more complex, and more power-hungry, and more expensive. Rostoker thought there had to be a better way.

He found one in particle accelerators, those colossal rings, like the Large Hadron Collider, that crash subatomic particles into each other. In accelerators, particles travel on wide, conspicuously stable orbits. Rostoker and Binderbauer wondered if you could do something similar in a fusion reactor. They spent a couple of years thinking about it and decided, short answer, probably. “If you can bring accelerator physics into the realm of fusion, you can actually make a better-behaved plasma, one that can give you long timescales,” Binderbauer says. “Then you can invest energy and heat it.”

Rostoker’s other key insight had to do with the flow of people and money around the reactor: he thought the private sector would be a better place to get things done than a university lab. Essentially he recategorized fusion power from an object of lengthy, lofty scientific inquiry to just another product to be shipped. “Fusion is in the end an application, right?” Binderbauer says. “The problem with fusion typically is that it’s driven by science, which means you take the small steps. The most predictable next step, the one you’re comfortable with. So it doesn’t necessarily connect with what you want. Norm said, You’ve got to look at the end in mind. You’ve got to unravel it, reverse-engineer it. What would a utility want? What would make sense? And design something from there, and be agnostic as to how hard the physics might be.”

Raising money was a challenge: tokamaks were eating up all the grant money, and energy startups are expensive, risky long-term bets, especially to Silicon Valley investors spoiled from flipping web startups for quick paydays. Recruiting was tough too: building a fusion device requires a blended culture of physicists and engineers, two groups who don’t historically mix well. For the first few years, the company ran on the brink of insolvency. “You have money for a year or two to develop something, deliver, and go get the next chunk,” Binderbauer says. “It’s not the academic risk profile.”

To keep the pace up they freed themselves from the baggage of theory: as long as something worked, they didn’t analyze to death why. The idea was to stay pragmatic and iterate rapidly, spend as little as possible and not fear failure. “This is one of the failures of the governmental way of running it,” Binderbauer says. “It didn’t create enough diversity of ideas, and let those freely be pursued to failure. Say, this is where we ultimately want to go, what are the critical steps to get there, what are the risk elements of the path to get there, and can I test for some of these risks without spending a hundred million bucks?”

Some academics would disagree, but no one can deny that Tri Alpha has managed to build a prototype fusion reactor quickly on a tiny budget. The company has a panel of advisers–including Burton Richter, who won the Nobel Prize for Physics in 1976, and Ronald Davidson, past director of fusion labs at both MIT and Princeton–and Binderbauer has fond memories of unveiling his first prototype to them in 2008. “There were like jaws dropping. It was like, holy sh-t, these guys actually did this? On this time frame? This is not possible. Then we had world-record data by August. That was a year basically from seeing dust to seeing physics data taken that’s better than anyone else ever did.”

Davidson confirms that impression, though in less colorful language. “In the framework of a Department of Energy laboratory, and also in some universities, the level of regulations and restrictions you have on how you do things is somewhat different than in the industry,” he says. “The industry can be quite nimble, relatively speaking, in exploring ideas and testing them for the first time.”

Tri Alpha’s reactor is very different from the towering tokamaks that dominate the fusion skyline, or the supervillain lasers of the NIF. You could think of it as a massive cannon for firing smoke rings, except that the smoke rings are actually hot plasma rings, and the gunpowder is a sequence of 400 electric circuits, timed down to 10 billionths of a second, that accelerate that plasma ring to just under a million kilometers an hour.

And there are actually two cannons, arranged nose to nose, firing two plasmas straight at each other. The plasmas smash into each other and merge in a central chamber, and the violence of the collision heats the merged plasma up to 10 million degrees Celsius, leaving a single plasma 70 to 80 centimeters across, shaped more or less like a football with a hole through it the long way, quietly spinning in place.

But a fusion reactor’s work is never done. Positioned around that central chamber are six massive neutral beam injectors firing hydrogen atoms into the edges of the spinning cloud to stabilize it and keep it hot. Two more things about this cloud: one, the particles in it are moving in a much wider orbit than is typical in, say, a tokamak, and hence are much more stable in the face of turbulence. Two, the cloud is generating a magnetic field. Instead of applying a field from outside, Tri Alpha uses a phenomenon called a field-reversed configuration, or FRC, whereby the plasma itself generates the magnetic field that confines it. It’s an elegant piece of plasma-physics bootstrappery. “What you get within forty millionths of a second from the time you unleash your first little bit of gas,” Binderbauer says proudly, “is this FRC sitting in here, fully stagnant, no more moving axially, and rotating.”

The machine that orchestrates this plasma-on-plasma violence is something of a monster, 23 meters long and 11 meters wide, studded with dials and gauges and overgrown with steel piping and thick loose hanks of black spaghetti cable. Officially known as C-2U, it’s almost farcically complicated–it looks less like a fusion reactor than it does like a Hollywood fantasy of a fusion reactor. It sits inside a gigantic warehouse section of Tri Alpha’s Orange County office building surrounded by racks of computers that control it and more racks of computers that process the vast amounts of information that pour out of it–it has over 10,000 engineer control points that monitor the health of the machine, plus over 1,000 physics diagnostic channels pumping out experimental data. For every five millionths of a second it operates it generates about a gigabyte of data.

In August, Tri Alpha announced that its machine had generated some very interesting data. So far the company’s primary focus has been on the long-enough problem, rather than the hot-enough part; stabilizing the plasma is generally considered the tougher piece in this two-piece puzzle. Now Binderbauer believes that they’ve done it: in June the reactor proved able to hold its plasma stable for 5 milliseconds.

That’s not a very long time, but it’s an eternity in fusion time, long enough that if things were going to go pear-shaped, they would have. The reactor shut down only because it ran out of power–at lower power, and hence with slightly less stability, they’ve gone as long as 12 milliseconds. “We have totally mastered this topology,” Binderbauer says. “I can now hold this at will, 100% stable. This thing does not veer at all.” He didn’t live to see it, but Rostoker was right. The cat is in the sack. Tri Alpha has tamed the plasma.

Some other people may be right too. Where fusion is concerned, the private sector supports a robustly diverse range of methodologies. In 2002, Laberge, an intense redhead with a thick French-Canadian accent and a droll sense of humor, realized he’d spent enough of his life designing laser printers. “I decided to start a fusion company,” he says. “Which is pretty insane, but that’s what I went for. I guess, go big in life.”

Laberge too was skeptical of the monoculture that dominated fusion science. “The thing in fusion is, when they started they tried many different approaches, and then there’s one or two that had a bit of success and whatnot, and then everybody jumped on those approaches,” he says. “So it is a good hunting ground for new startup companies, to go and see those abandoned efforts.” The approach he hit on is called magnetized target fusion: crudely put, you create a spinning vortex of liquid metal, inject some plasma into its empty center, then squeeze the vortex, thereby squeezing the plasma inside it and causing it to heat up and fuse.

Laberge couldn’t get enough grant funding, so he took the idea to investors instead and founded General Fusion. Now General Fusion has 65 employees and is one of a small handful of companies racing Tri Alpha to the break-even point. To date it has raised $94 million and built prototypes of the reactor’s major subsystems, including a spherical chamber for the liquid metal vortex with 14 huge spikes projecting out at all angles–the spikes are massive hammers that do the squeezing. It looks, if possible, even more like Hollywood’s idea of a fusion reactor than Tri Alpha’s. “The tokamak people have a very long timeline, which I don’t like,” Laberge says, “so we’d like to speed that up, and we think we can move faster.” Predictions, like comparisons, are invidious, but when coerced he says, “About a decade to producing energy would be a good timeline to have.”

Helion Energy, another venture in Redmond, is already on its fourth-generation prototype. Its approach also has two plasmas colliding in a central chamber, but it will work in rapid pulses rather than sustaining a single static plasma. Helion is focused on developing a smaller-scale, truck-size reactor, and doing it as fast as possible. The company’s website states in no uncertain terms that it will have a commercial reactor operational within six years. (Helion told us it was too busy building fusion reactors right now to participate in this article.)

And there are others. Industrial Heat in Raleigh, N.C.; Lawrenceville Plasma Physics in New Jersey; Tokamak Energy outside Oxford, England. Lockheed Martin’s Skunk Works division is developing what it calls a compact fusion reactor, which it says will fit on the back of a truck. It also says it’ll have a working prototype within five years. (And it said that last year, so four to go.)

There’s a kind of cheeky underdog defiance in the attitude of the private sector to the public, but the attitude the other way is a bit more collegial. “They’re very interesting,” says Professor Stewart Prager, director of the Princeton Plasma Physics Laboratory. “Some more than others. There’s a range. It’s definitely good to see private investment in fusion.” Dennis Whyte, director of the Plasma Science and Fusion Center at MIT, understands the impatience that drives the startups. “Their argument is that if the science breaks go their way, they will be able to accelerate the pace of getting fusion energy on the grid, and I overall agree with that philosophy,” he says. “I’m part of the quote unquote Establishment that they’re railing against, but you can sense my frustration, because I’m not happy about the delays and so forth.” (He might well be frustrated: Congress has cut funding for MIT’s fusion reactor, which will cease operations next year. He’s currently focused on designing a smaller, modular reactor that takes advantage of recent advances in superconducting technology.)

Within the private sector, there’s a good deal of genial trash talk. The trash talk about Tri Alpha tends to focus on the question of fuel: When you’re doing fusion, which atomic nuclei do you fuse? By far the most popular answer is deuterium and tritium, two isotopes of hydrogen. This is fusion’s low-hanging fruit, because deuterium and tritium fuse at a lower temperature than any other option, a comparatively mild 100 million degrees Celsius. ITER uses D-T fusion (as it’s known), as do the NIF, the National Spherical Torus Experiment at Princeton, Lockheed Martin, General Fusion and almost everybody else.

But there are catches. One is that tritium is rare, so you have to make it. The other is that the reaction emits, along with an isotope of helium, a neutron, which is a problem because when you throw a lot of free neutrons at something it eventually becomes radioactive. That means you’re stuck regularly replacing parts of your reactor as they become too hot to handle. Binderbauer is scathing on the subject of D-T fusion. “Let’s say you have success on ITER,” he says. “You’ve still got another many decades of materials research to try to make something that lasts more than six to nine months, in the hellish bombardment of neutrons it is going to have to live in.”

But there are engineering solutions to the problem: that vortex of liquid metal in General Fusion’s reactor will be a mixture of lead and lithium, which will catch the neutrons. As a bonus, when you hit lithium with neutrons, you get tritium. So two birds, one stone.

Helion’s reactor will fuse deuterium and helium-3, which produces fewer neutrons, though it requires more heat and raises the problem of finding enough helium-3, which is also rare. Tri Alpha plans to fuse protons (otherwise known as hydrogen nuclei) with boron-11. This reaction produces no neutrons at all, and both elements are plentiful and naturally occurring. “We’re always saying, if you want to buy our plant,” Binderbauer says, “we’ll give you a lifetime supply of fuel for free.” The reason hardly anybody else is pursuing it is that proton-boron-11 fusion requires much higher temperatures, insanely much higher: 3 billion degrees Celsius.

No one really knows how plasma will behave at that temperature, and virtually everybody I talked to was skeptical about Tri Alpha’s making it work, and considered the engineering challenges of D-T fusion to be vastly preferable. “Fusion is hard already, even when it’s D-T, and you have to realize how much harder this is than D-T,” says Whyte. “It’s O.K. to take a physics leap, but you also don’t want it to be so big that you worry about its viability.” Laberge felt the same way: “It’s like learning to run before you can walk. Or somebody told me it’s like learning to fly before you can walk. You can argue that General Fusion is outrageously ambitious trying to do fusion, but Tri Alpha is outrageously outrageously ambitious.”

Binderbauer, who is not intimidated by anything, is not intimidated by this either. His next move will be to tear down Tri Alpha’s current reactor and build a new one that will scale up to the necessary temperatures. He points out that particle accelerators can create temperatures in the trillions. “Going to higher temperatures is not that hard,” he says. “It sounds terrible, because it’s billions of degrees, but it’s not. You use techniques much like what you use in a microwave. They’re very similar principles.” You have to imagine the Austrian accent to get the full effect.

Everybody in the fusion industry shares a worldview in which the transformation of the globe by fusion power is imminent. I asked Binderbauer how confident he was that he would see a practical fusion reactor in his lifetime, and his answer was “Very. Scientifically I’m very confident. Now that we have this, this is the foundation.” He thinks he understands theoretically what will happen as his machine claws its way up to 3 billion degrees, and the theory tells him it’s possible. “There should be no physics that says it won’t be. But you gotta test it. This is the field where nature’s the ultimate arbiter, so there’s some risk there.”

Binderbauer’s Austrian rigor restrains him, barely, from making brash predictions about when all this is going to happen. “People tell you they’ll have a reactor in five years–I know it’s impossible. And it’s not because I’m negative. I want this too, and we work as fast as we can, but I know it’s more than five years. It just is.” Try to pin him down on a specific timeline for Tri Alpha and he writhes like a superheated plasma. “It’s not true that it takes 30 years and will always take 30 years. It doesn’t. I’m not prepared to tell you, X is the number of years till we have a commercial reactor here. But I will tell you, we are truly about three to four years from the point where the risk changes from a science risk to an engineering risk. And I can certainly see that within a decade such things can mature to the point where you can have the first commercial steps.”

There may be a lot of those steps. The utilities will be the ones making the actual transition, and for fusion to be of any earthly use to anybody it will have to make business and engineering sense to them, because fusion plants will be expensive. Unlike solar or wind, fusion would provide energy constantly rather than intermittently, but there would have to be enough of it. The gain (the ratio of energy out to energy in) of a commercial fusion plant would have to be in the 15-to-20 range; ITER’s target gain is 10, and no fusion reactor has yet reached a ratio of 1, the break-even point. Then there’s the question of how exactly to extract that energy from the reactor in the form of heat, so that it can plug into the existing infrastructure.
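
That gain arithmetic is simple enough to sketch. In this minimal example, only the ratios come from the article (1 is break-even, 10 is ITER’s target, 15 to 20 is the commercial range); the per-pulse megajoule figures are hypothetical numbers of our own, chosen purely for illustration:

```python
def fusion_gain(energy_out_mj: float, energy_in_mj: float) -> float:
    """Fusion gain Q: energy the reaction produces divided by energy injected."""
    return energy_out_mj / energy_in_mj

# Hypothetical per-pulse figures; only the resulting ratios are from the article.
print(fusion_gain(50, 50))    # 1.0  -> break-even, which no reactor has reached
print(fusion_gain(500, 50))   # 10.0 -> ITER's target gain
print(fusion_gain(900, 50))   # 18.0 -> inside the 15-to-20 commercial range
```

A plant injecting 50 MJ per pulse would, on these assumptions, need to release 750 to 1,000 MJ per pulse before it made business sense to a utility.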

But those steps would be giant leaps for mankind. Bill Gates is currently on a global campaign trying to raise awareness about how badly our addiction to energy is destroying the environment. He’s putting $2 billion of his foundation’s money into it. “We need innovation that gives us energy that’s cheaper than today’s hydrocarbon energy, that has zero CO₂ emissions, and that’s as reliable as today’s overall energy system,” he says in the November issue of the Atlantic. “We need an energy miracle.” (He personally has invested in TerraPower, a maker of next-generation fission plants.)

To assess the precise probability that fusion will or won’t be that miracle is beyond the remit of a journalist without a Ph.D. in plasma physics, but as miracles go, it’s looking a lot more plausible than most. Even Prager, head of the Princeton Lab, who considers the claims of the private sector to be overconfident, still believes it’s a question of when not if. “I think it’s inevitable. And I don’t think I’m alone in that. You can’t get commercial fusion in 10 years, but I think we’ll have commercial fusion, fusion on the grid, in the 2040s. It may sound like a long way away, but in terms of mitigating climate change, fusion will play a very critical role.”

Fusion may just turn out to belong to that category of human achievement, like powered flight and moon landings, that appeared categorically impossible right up until the moment somebody did it. At the very least, a lot of very smart people are betting their money and their careers on it. As for the rest of us, we may already have bet the planet.

Read more:
Germany to switch on a revolutionary nuclear fusion machine by end of 2015
Obese children's health improves with less sugar

Calories are not created equal. Reducing consumption of added sugar, even without reducing calories or losing weight, has the power to reverse a cluster of chronic metabolic diseases in children in as little as 10 days, according to a study by researchers at UC San Francisco and Touro University California.

"This study shows that sugar is metabolically harmful not because of its calories or its effects on weight; rather sugar is metabolically harmful because it's sugar," says lead author Dr Robert Lustig, a paediatric endocrinologist.

In the study, 43 Latino and African-American children between the ages of nine and 18 who were obese and had at least one other chronic metabolic disorder - such as hypertension, high triglyceride levels or a marker of fatty liver - were given nine days of food that restricted sugar but substituted starch to maintain the same fat, protein, carbohydrate and calorie levels as their previous diets.

Total dietary sugar was cut from 28 per cent to 10 per cent of total calories, and fructose from 12 per cent to 4 per cent. The foods included turkey hot dogs, potato chips and pizza, all bought at local supermarkets.

At the end of the diet, virtually every aspect of the participants' metabolic health improved, without any change in weight. Diastolic blood pressure decreased by 5 mm Hg, triglycerides by 33 points, LDL cholesterol ("bad" cholesterol) by 10 points, and liver function tests improved. Fasting blood glucose went down by five points, and insulin levels were cut by one-third.

Green offices linked to higher cognitive function

People who work in well-ventilated offices with below-average levels of indoor pollutants and carbon dioxide have much higher cognitive functioning scores - in crucial areas such as responding to a crisis or developing strategy - than those who work in offices with typical levels, according to a new study led by scientists at the Harvard T.H. Chan School of Public Health.

The researchers looked at people's experiences in "green" versus "non-green" buildings, and both the participants and the analysts were blinded to test conditions to avoid biased results. The decision-making performance of 24 participants - including architects, designers, programmers, engineers, creative marketing professionals and managers - was analysed while they worked in a controlled office environment for six days. At the end of each day, the researchers administered cognitive tests to the participants.

They found that cognitive performance scores for the participants who worked in the green environments were up to double those of participants who worked in conventional environments. The largest improvements occurred in the areas of crisis response, strategy and information usage. The findings suggest that the indoor environments in which many people work daily could be adversely affecting cognitive function - and that, conversely, improved air quality could greatly increase the cognitive function performance of workers.

Peter Kammerer says the city lags others in embracing the trend to share underused assets and services, a sign that we lack a sense of community.

Some governments, but not Hong Kong's, have encouraged carpooling as it reduces journeys and, as a result, lessens road congestion and pollution.

There's only so much space in a Hong Kong flat. The longer we live in it, the more stuff we collect - and we only realise how much we've gathered when we move. Books, electrical appliances, DVDs, tools and on and on are unearthed, most of it destined to be thrown away along with the furniture and old clothes that we would rather not take with us. It's a waste and we feel bad about putting them into the garbage, but what can be done in a place where high rents make second-hand shops a rarity?

The answer lies in the sharing economy - the idea behind Airbnb, carpooling and tool banks. It's taken off in North America and Europe, but is still a novelty in Hong Kong. Poor understanding about sustainability may have something to do with it, or maybe it's a lack of will by the government to push recycling. I suspect much of it has to do with laziness, selfishness and an inability to trust.

Smartphones and social media have, after all, made it possible for us to easily connect with someone who may want what we no longer need or have limited use for. Those with particular expertise or skills can also share them, while people with spare time can offer to babysit or run errands. Goods, services and time can be given free or charged to make better use of what we have. There's a sense of hippyness about it, something entrepreneurial, but above all, it's about being less wasteful.

We're most familiar with the room-rental company Airbnb, but there are untold numbers of others, ranging from the online learning community for creators, Skillshare, through the web marketplace for medical equipment, Cohealo, to car-sharing and rental.

Rachel Botsman's 2010 book, What's mine is yours: How collaborative consumption is changing the way we live, drove the idea into popular consciousness. She defines the sharing economy as "an economic system based on sharing underused assets and services, for free or a fee, directly from individuals".

The concept is not new - libraries do it with books and poorer communities have always shared to make ends meet. Carpooling - where vehicle owners take turns to drive others to save costs - is as old as suburbs. Some governments, but alas, not Hong Kong's, have encouraged it as it reduces journeys and, as a result, lessens road congestion and pollution. Taxi companies don't like the lost business, just as they object to the competition of newcomers like Uber, with their app-based ease and efficiency.

But for sharing to be effective, communities have to work together. That requires selflessness and trust - commodities that are sometimes in short supply in Hong Kong, where consumerism is king and neighbours are more often than not avoided to prevent them from prying into lives.

Creating neighbourliness was one of the reasons Tai Po resident Albert Lui turned to Facebook a few months ago; he figured that by looking for people willing to share his drive to work in Central, he could make new friends while improving traffic. He had noticed many cars had one occupant - the driver.

Most private cars are used for only an hour a day and sit idle in parking spaces the rest of the time. Hourly rental services, like Zipcar in the US, would make sense. That also raises questions about the cost-effectiveness of the government's 6,430 vehicles, among them 1,475 sedans and 643 buses. Surely renting when needed saves public funds, as governments elsewhere have found?

The sharing economy will catch on in Hong Kong. More people like Lui will push it along. But for it to work effectively, we need to be less possessive and reticent about dealing with strangers.

Related link:

"One Mega-City, Many Systems": The Evolution of Hong Kong

We’re disappointed that big news in France hasn’t made its way to the top of U.S. headlines: the French National Assembly recently passed a law that will help to limit young children’s exposure to electromagnetic fields (EMFs) generated by wireless technologies. Two years in the works, the law encompasses various rules, including:

-Banning WiFi in any childcare facilities catering to children under the age of 3.
-Requiring cell phone manufacturers to recommend the use of hands-free kits to everyone.
-Banning any advertising that specifically targets youth under the age of 14.

The law, passed by a majority vote and adopted on January 29, 2015, is the first in France to establish that WiFi over-exposure may indeed be hazardous to young children, a controversial topic not just in France but around the world.

The law, entitled “An Act on Sobriety, Transparency, Information and Consultation for Exposure to Electromagnetic Waves,” comprises many sections, but most importantly it sets an example of taking a precautionary approach to the potential health risks of WiFi exposure. Research of varying rigour suggests that chronic exposure to WiFi may be harmful to youth: some current literature associates high-powered WiFi environments with attention problems, cardiac irregularities, seizures, fatigue and other health problems. Another recently published scientific report suggests that kids’ brains may absorb twice as much cellphone radiation as adults’ brains. Last, but certainly not least, screen addiction has become a very real health and well-being issue. Unfortunately, and contrary to the wishes of the law’s initial proponents, WiFi will still be permitted in primary schools. That said, we’re still very happy to hear that France is taking small steps to alleviate some of the problems caused by screens and WiFi, and hope to see similar actions in the United States and other places around the world.

There are an estimated 5 trillion pieces of plastic floating through the world’s oceans, enough to fill almost 600 jumbo jets. Cleaning up so much pollution might seem like a monumental task, but one 20-year-old is setting out to make a difference — in a big way.

Boyan Slat, the founder and CEO of The Ocean Cleanup, announced earlier this week that his organization will deploy the world’s first system to passively remove plastic waste from oceans around the world.

Ocean Array Could Clean Up 7,250,000 Tons Of Plastic
The system comprises a series of floating barriers spanning more than a mile, making it the longest floating structure in the ocean. The barriers trap floating plastic debris, which is then picked up via a conveyor belt; the organization says this is 7,900 times faster and 33 times cheaper than other methods:

“Taking care of the world’s ocean garbage problem is one of the largest environmental challenges mankind faces today. Not only will this first cleanup array contribute to cleaner waters and coasts but it simultaneously is an essential step towards our goal of cleaning up the Great Pacific Garbage Patch. This deployment will enable us to study the system’s efficiency and durability over time,” said Slat.

The Ocean Cleanup plans to deploy the structure off of the coast of Japan during the second quarter of 2016.
Cigarettes tossed into new Queen West recycling bins will be turned into plastic lumber and pallets.

Warren Hawke, manager of the Out of This World Café, says he’s fed up with seeing tarry mounds of cigarette stubs on his way to work. Hundreds of cigarette butts are scattered on the strip of Queen St. W. and lawn in front of the Centre for Addiction and Mental Health, where the café is based. After he heard over the radio that Vancouver was recycling butts, he thought: “Why not do the same here?”

“I read that cigarette butts are the worst littering problem in the world,” he said, looking at the stubs sprinkled on the grass and sidewalk near the café. “So maybe we can start with our little corner.”

He teamed up with his city councillor, Mike Layton, and U.S.-based recycling company TerraCycle to do something about the problem. CAMH and the West Queen West Business Improvement Area, a group representing businesses in the neighbourhood, are also backing the initiative.

After a year and a half, their pilot project is seeing the light of day. A CAMH maintenance crew fastened four sleek, stainless-steel cigarette recycling boxes Thursday to street poles near Queen St. W. and Ossington Ave.

Café staff will be in charge of emptying the boxes and shipping the butts to a TerraCycle plant in north Toronto, where they’ll be shredded and separated into organic and inorganic waste. The organic material will be turned into non-agricultural compost. The rest will be made into plastic lumber and shipping pallets, which could then be sold to home renovation stores and builders.

Layton said that’s a much better solution than letting tons of cigarette butts end up in a landfill or wash away into sewers and empty into Lake Ontario. The cigarette stubs “are made of plastic and they’re not breaking down — and what does break down is toxic,” he told the Star. “It’s poisoning our own water supply, which is pretty crazy.”

The pilot program has come at no cost to the city other than for the metal bands used to fasten the boxes to the poles.

Layton said he hopes litterbugs will stub out their cigarettes in the new boxes more than they do in the small and inconspicuous cigarette slots included in the city’s 8,000 garbage and recycling bins. Cigarettes tossed into the black or grey bins end up in a landfill. The city doesn’t know how many tons of cigarette butts end up in the trash, a city spokeswoman said. But according to last year’s litter audit, cigarette ends were the second-commonest small litter item, after chewing gum.

TerraCycle CEO Tom Szaky, who grew up near Eglinton Ave. W. and Bathurst, said he is glad street-side cigarette recycling bins have been installed in his hometown. In addition to polluting the water supply, discarded cigarettes also harm the ecosystem, he said.

“The (cigarette) filter was invented to trap as many of these cancer-causing agents or carcinogens as possible. Those are trapped in the butt and they become toxic pills for wildlife,” he said.

“The biggest message here is: If people don’t like seeing cigarette butts and all the damage they cause, don’t smoke. That is truly the answer. But if you do smoke — and about 20 per cent of Canadians do — then it’s really important not to litter.”

We’ve seen yoga, standing desks and vegetarian lunches turn troubled schools around, but we’ve never seen meditation adopted successfully within the school system. Until now. According to reports, several San Francisco middle and high schools, as well as scattered schools around the Bay Area, have adopted what they call “Quiet Time”, a stress-reduction meditation strategy that is doing wonders for students and teachers.

The first school to adopt the Quiet Time practice, in 2007, was Visitacion Valley Middle School, and it has reaped huge rewards. Formerly a school largely out of control, Visitacion Valley sits smack in the middle of a neighborhood where shootings are common. Students, likely deeply troubled by the violence surrounding them, got bad grades, skipped school and fought daily. The situation at Visitacion Valley was so dire that teachers even started calling in sick to avoid teaching these kids. The school tried everything from counseling to peer support to after-school tutoring and sports, but nothing seemed to work until Quiet Time entered the picture.

The SFGate reports that within the first year that Quiet Time was used, “The number of suspensions fell by 45 percent. Within four years, the suspension rate was among the lowest in the city. Daily attendance rates climbed to 98 percent, well above the citywide average. Grade point averages improved markedly.”

Most importantly, the SFGate reports that, “Remarkably, in the annual California Healthy Kids Survey, these middle school youngsters recorded the highest happiness levels in San Francisco.” Now, at least three other schools have adopted Quiet Time with similarly successful results. Burton High School notes that students involved in Quiet Time report significantly less stress and depression and greater self-esteem, and their academic results have risen dramatically.

On the California Achievement Test, which measures student performance across the state, twice as many students in Quiet Time schools have become proficient in English as students who don’t participate in Quiet Time, and the gap is even bigger in math. Teachers in the schools using Quiet Time are also faring better, describing themselves as “less emotionally exhausted and more resilient.” The Quiet Time website notes the following success rates of meditation in schools:

*10% improvement in test scores—and a narrowing of the achievement gap.
*Highly effective for increasing creativity.
*Improved teacher retention and reduced teacher burnout.
*Greater happiness, focus and self-confidence.
*Reduced ADHD symptoms and symptoms of other learning disorders.
*86% reduction in suspensions over two years.
*40% reduction in psychological distress, including stress, anxiety and depression.
*65% decrease in violent conflict over two years.

While, of course, we can’t yet know the very long-term effects of meditation in schools, it’s clear that schools are benefiting from the practice, making Quiet Time a stellar idea for other schools to try. For more information on the Quiet Time Program, contact Jamie Grant at Jamie@DavidLynchFoundation.org.

Related links:

Meditation transforms roughest San Francisco schools

Learn more about Quiet Time and see how to adopt it in your child’s school
Tuesday night, the House passed legislation aimed at reforming the Toxic Substance Control Act (TSCA), the 1976 law that for almost 40 years has dictated how chemicals are managed in the United States. Passed with broad bipartisan support — with the only no vote coming from Rep. Tom McClintock (R-CA) — the bill is a first step toward reforming TSCA, largely considered one of the most ineffective environmental laws in the country.

“Eighty-five thousand chemicals have been introduced into commerce in the United States, and what we know is that less than 1,000 have been well-tested for their human health and environmental effects,” Noah Sachs, professor of law at the University of Richmond and a scholar at the Center for Progressive Reform in Washington, D.C., told ThinkProgress. “I think there’s an assumption that the government must be watching out for these things, and if there were a dangerous chemical out there the government would remove it, but that’s not what is happening at all.”

Advocates for chemical regulation reform point to several shortcomings in the existing TSCA statute. When TSCA passed in 1976, some 64,000 chemicals then in use were exempted from testing; since then, another 22,000 have been introduced, but few have been designated as toxic. Crude MCHM, the chemical that spilled into West Virginia’s Elk River in January 2014, for instance, is unregulated. Between 1976 and 2007, the Environmental Protection Agency generated data on just 200 chemicals.

While TSCA gave the EPA the authority to review chemicals, it never provided the agency with a mandate on how it should go about doing it. Instead, it required high burdens of proof for deeming a chemical toxic, calling for its removal, or restricting its use. Under the current TSCA, any time the EPA wants to regulate a chemical, it has to provide a cost-benefit analysis showing that the agency’s alternative chemical is the least burdensome in terms of environmental and health impacts and cost. It has to provide that analysis not just for the proposed alternative, but for every other potential alternative as well.

I think there’s an assumption that … if there were a dangerous chemical out there the government would remove it, but that’s not what is happening at all

Asbestos — which is classified as a known human carcinogen and is banned in over 40 countries — is still legal in the United States due to the high burden of evidence required of the EPA under TSCA. In 1991, the Fifth Circuit found that the EPA, in trying to ban the substance, had failed to provide substantial evidence that a ban was the “least burdensome alternative” as required by TSCA, and rejected the EPA’s cost-benefit analysis. Since 1991, the EPA has not attempted to regulate an existing chemical.

The House bill removes the requirement that the EPA find the least burdensome alternative, and for the first time includes a mandate that the EPA begin testing chemicals for their safety, requiring that the EPA test 10 chemicals per year.

But some environmentalists worried that the bill still doesn’t go far enough in regulating dangerous substances.

“We commend the House for its focus on the need to overhaul chemical policy, but this piece of legislation will not do the job,” Ken Cook, president of the Environmental Working Group, said in a press statement. “It tips much too far in favor of an industry in serious need of regulation.”

Though the House bill removes the requirement that the EPA provide evidence of a less burdensome alternative — placing more emphasis on scientific evidence during safety assessments — it still requires the agency to “determine whether technically and economically feasible alternatives that benefit health or the environment…will be reasonably available as a substitute when the proposed prohibition or other restriction takes effect.”

It also requires that the EPA show proof of a chemical’s potential risk before testing can even begin, forcing the agency to amass a record of a chemical’s potential impacts before it can order more testing.

“I don’t see why that should be the agency’s task,” Sachs said. “I think it’s putting yet another procedural hurdle in the place of removing dangerous chemicals from the market.”

The House bill also allows chemical companies to request that the EPA test a given chemical — a provision that environmentalists worry will allow industry to dictate the EPA’s agenda.

“It’s a nice way for industry to drive the testing priorities,” Sachs said. “It’s pretty extraordinary that this bill allows industry to set the testing agenda for a government agency.”

The American Chemistry Council — the main trade association for the American chemical industry — was quick to praise the bill’s passage, calling it “a pivotal moment in the years-long effort to reform TSCA.”

Whatever gets passed may be with us for another generation

The Senate is expected to vote on a similar bill before the August recess. That bill is largely considered to be more comprehensive than the House version, as it creates standards for labeling chemicals as either high or low priority for testing. But the bill — which only requires the testing of 25 chemicals over five years and strips states of their right to create their own chemical regulations — has also been criticized by environmentalists and public health officials, who claim that industry interests played too large a role in its drafting.

In March, Hearst Newspapers obtained a copy of a final draft of the bill before it was seen by a Senate subcommittee. The draft was a Microsoft Word document, and checking the document’s “advanced properties” in Word revealed that its company of origin was the American Chemistry Council.

“It was clear from the computer coding that the final draft originated at the American Chemistry Council itself,” Sen. Barbara Boxer (D-CA) said the day before the Senate Environmental and Public Works Committee began discussing the bill. “Maybe I’m old fashioned, but I do not believe that a regulated industry should be so intimately involved in writing a bill that regulates them.”

After the House passed its bill on Tuesday, however, Boxer expressed hope that the Senate could pass an amended version of the bill.

“While the House bill could still be improved, I feel it is the appropriate bill to take up in the United States Senate where we can work on just a few amendments to make it better,” she said in a statement.

Sachs, however, hopes that Congress will build on the existing momentum to create a reform bill that addresses the gaps in the current TSCA.

“Whatever gets passed may be with us for another generation,” he said. “I would like to see a much more aggressive statute, and after 40 years of working under this very weak law of TSCA, I think Congress can do a lot better to pass something more ambitious.”
