
Sustainable Solutions - Home of News, Events & Ideas for Inquisitive Green Minds
an essential roadmap for your ultimate eco-journey...
With its ongoing recycling revolution, less than one per cent of Sweden’s household waste ends up in a rubbish dump. The rest is recycled in different ways.

Towards zero waste

Wouldn’t it be great if no household waste was wasted? If each and every item of refuse was turned into something else – new products, raw materials, gas or at least heat?

Sweden is almost there. More than 99 per cent of all household waste is recycled in one way or another. This means the country has gone through something of a recycling revolution in recent decades, considering that only 38 per cent of household waste was recycled in 1975 (see chart).

Today, recycling stations are as a rule no more than 300 metres from any residential area. Most Swedes separate all recyclable waste in their homes and deposit it in special containers in their block of flats or drop it off at a recycling station. Few other nations deposit less in rubbish dumps.

Stepping up recycling

Weine Wiqvist, CEO of the Swedish Waste Management and Recycling Association (Avfall Sverige), still thinks Swedes can do more, considering that about half of all household waste is burnt, that is, turned into energy. He explains that reusing materials or products consumes less energy than burning one item and making a new one from scratch.

‘We are trying to “move up the refuse ladder”, as we say, from burning to material recycling, by promoting recycling and working with authorities’, he says.

Meanwhile, Swedish households keep separating their newspapers, plastic, metal, glass, electric appliances, light bulbs and batteries. Many municipalities also encourage consumers to separate food waste. And all of this is reused, recycled or composted.

Newspapers are turned into paper pulp, bottles are reused or melted into new items, plastic containers become plastic raw material; food is composted and becomes soil or biogas through a complex chemical process. Rubbish trucks are often run on recycled electricity or biogas. Waste water is purified to the point of being potable. Special rubbish trucks go around cities and pick up electronics and hazardous waste such as chemicals. Pharmacists accept leftover medicine. Swedes take their larger waste, such as a used TV or broken furniture, to recycling centres on the outskirts of the cities.

At the Gärstadverken in Linköping, waste is turned into energy. The chart shows how much recycling has grown in Sweden over the last decades. (Photo: Åke E:son Lindman)

Waste to energy

Let’s take a closer look at the 50 per cent of the household waste that is burnt to produce energy at incineration plants. Waste is a relatively cheap fuel and Sweden has, over time, developed a large capacity and skill in efficient and profitable waste treatment. Sweden even imports 700,000 tonnes of waste from other countries.

The remaining ashes constitute 15 per cent of the weight before burning. From the ashes, metals are separated and recycled, and the rest, such as porcelain and tile, which do not burn, is sifted to extract gravel that is used in road construction. About one per cent still remains and is deposited in rubbish dumps.
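
The figures above imply a simple mass balance. As a rough illustration, using only the percentages quoted in this article (per tonne of waste sent to incineration):

```python
# Rough mass balance for waste incineration, using the percentages
# quoted in this article (illustrative only).
burnt = 1000.0                 # kg of household waste sent to incineration
ash = 0.15 * burnt             # ashes are ~15% of the pre-burning weight
# From the ash, metals are recycled and gravel is sifted out for roads;
# about 1% of the original weight still ends up in a rubbish dump.
landfilled = 0.01 * burnt
recovered_from_ash = ash - landfilled   # metals + gravel, by difference

print(f"ash: {ash:.0f} kg, landfilled: {landfilled:.0f} kg, "
      f"recovered from ash: {recovered_from_ash:.0f} kg")
```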

The smoke from incineration plants consists of 99.9 per cent non-toxic carbon dioxide and water, but is still filtered through dry filters and water. The dry filters are deposited. The sludge from the dirty filter water is used to refill abandoned mines.

In Sweden, burning waste to produce energy is uncontroversial, but in other countries – like the US – it is a much debated topic.

Doing better

Hans Wrådhe heads the section for waste and chemicals at the Swedish Environmental Protection Agency (Naturvårdsverket) and considers proposing a higher levy on waste collection.

‘That would increase everybody’s awareness of the problem’, he says.

Together with government agencies and corporations, Wrådhe has developed an action plan for waste prevention, including how to encourage producers to make products that last longer. The agency also considers proposing a tax deduction for some repairs.

‘Government-sponsored ads on how to avoid food waste might also help’, he says. ‘And less toxic substances used in production would mean fewer products that require expensive treatment.’

In this stationary vacuum system, users throw their waste into ordinary inlets, where the bags are stored temporarily. All full inlets are then emptied at regular intervals through a network of underground pipes. (Photo: Envac)

Companies joining the effort

Some Swedish companies have voluntarily joined in the struggle. For example, H&M has begun accepting used clothing from customers in exchange for rebate coupons in an initiative called Garment Collecting.

The Optibag company has developed a machine that can separate coloured waste bags from each other. People throw food in a green bag, paper in a red one, and glass or metal in another. Once at the recycling plant, Optibag sorts the bags automatically. This way, waste sorting stations could be eliminated.

The southern Swedish city of Helsingborg even fitted public waste bins with loudspeakers playing pleasant music – all in the name of recycling.

Back to Swedish Waste Management and Recycling Association CEO Wiqvist, who sees perfection in recycling as an idea worth striving for, whether or not it is ever fully attained.

‘“Zero waste” – that is our slogan’, he says. ‘We would prefer less waste being generated, and that all the waste that is generated is recycled in some way. Perfection may never happen, but it certainly is a fascinating idea.’


The concept of Vision Zero originated in Sweden in 1997, when the Swedish parliament adopted it as official road policy. Founded on the belief that loss of life is not an acceptable price to pay for mobility, Vision Zero takes a systems approach to enhancing safety. Rather than exclusively faulting drivers and other users of the transportation system, Vision Zero places the core responsibility for accidents on the overall system design, addressing infrastructure design, vehicle technology, and enforcement. The approach has resulted in noteworthy successes: Sweden has one of the lowest annual rates of road deaths in the world (3 per 100,000 people, compared with 12.3 in the United States). Not only that, but fatalities involving pedestrians have fallen almost 50% in the last five years.

According to Professor Claes Tingvall, one of the architects of Sweden’s Vision Zero policy, system design should be based on the premise that humans are fallible and will make mistakes. “If you take a nuclear power station, if you take aviation, if you take a rail system, all of them are based on [the idea that] they are operated by people who can make a mistake.” The same understanding should influence roadway design, where traffic calming, well-marked crosswalks and pedestrian zones, and separated bike lanes can help minimize the consequences of a mistake. According to Vision Zero philosophy, “In every situation a person might fail. The road system should not.”

Vision Zero policies have already been adopted in Norway and Denmark and are gaining traction across the U.S. Shortly after his inauguration, New York City Mayor Bill de Blasio announced a Vision Zero goal of eliminating traffic deaths and injuries in the city. The NYC action plan uses a multi-pronged approach that emphasizes enhanced enforcement, improved street design, and legislative proposals dealing with safety. The plan cites successes from several U.S. states that have implemented similar approaches with dramatic results, including a 43% reduction in traffic fatalities in Minnesota, a 48% reduction in Utah, and a 40% decrease in Washington State.

The public health imperative behind Vision Zero is clear: increasing the safety of our streets not only saves lives, but also makes it easier and more enticing for people to engage in daily physical activity by walking and biking.


Related link:
How to Design a Neighborhood for Happiness
March 21 @ 6:15 PM – Free Pre-Festival Screening: After the Last River, with Minister Glen Murray and Director Victoria Lean


In the shadow of a De Beers diamond mine, a remote indigenous community lurches from crisis to crisis as their homeland transforms into a modern frontier. Rosie Koostachin delivers donations to families who live in uninsulated sheds overgrown with toxic mould. She is determined to raise awareness, believing that if only Canadians knew, her hometown's dire situation would improve. Over five years, filmmaker Victoria Lean follows Attawapiskat's journey from obscurity into the international spotlight twice: first when the Red Cross intervenes, and again during the Idle No More protest movement. Weaving together great distances, intimate scenes and archive images, the documentary chronicles the First Nation's fight for justice in the face of hardened indifference.

Join us for a viewing of this thought-provoking and enlightening documentary that connects personal stories from the First Nation to entwined mining industry agendas and government policies, painting a complex portrait of a territory that is an imperiled homeland to some and a profitable new frontier for others.

Hosted by the Ontario Water Works Association (OWWA) – University of Toronto Student Chapter, in association with Ecologos, as part of Water Docs 2016, a documentary film festival about all things water.

Written by Jeromy Johnson

Wireless technology has become an integral part of our culture. It has connected us to people and information and has also brought us incredible convenience and economic benefits.

Just think how this technology has expanded in just the eight years since the iPhone was introduced. We have seen ubiquitous WiFi, tablet computers, wireless smart meters, the smart home, wearable tech, and now the Internet of Things. This latest development will connect everything we own to the internet via pulsed microwave radiation.

This may sound like an amazing technological future that will only provide us with benefits. However, what if there is a downside that is just now becoming apparent? In short, is this exponential rise in microwave radiation affecting our health? And, most importantly, is it affecting children who will be exposed to unprecedented levels of artificial electromagnetic radiation their entire lives?

This is the question I look at in this TEDx talk. I explore the problem and delve deeply into the long-term health consequences and why a growing number of people worldwide are starting to be injured by wireless technology. There are common symptoms that many people now experience when they are near a WiFi router, wireless smart meter, or cell phone tower. They include:

* Tinnitus/Ringing in the Ears
* Cognitive Disturbance

This TEDx talk is not only about the health effects and the science behind this growing problem. I also provide solutions – simple actions you can take right now to reduce your exposure. The five solutions I provide will cost you almost nothing at all, but will provide tremendous benefits.

I also discuss paths we can take to create safer technologies. This is the ultimate destiny of our culture. It is not about going backwards. I believe we can move toward a future with safe technology and that our society is evolving to the point where industry will have to implement safe technologies. Once this happens, our entire society will move into a healthier future.

My intention is that this talk helps you create a healthier relationship with technology and that it is something that will open up this important topic to those who are close to you.

**About Jeromy Johnson** – He is an expert on reducing Electromagnetic Field (EMF) pollution. He has a leading website on this topic and speaks around the world to provide solutions to this important issue. He has also written a book called "How to Find a Healthy Home" and has demonstrated that simple changes in our daily practices can go a long way to ensuring a healthier life. Jeromy has an advanced degree in Civil Engineering and has worked in Silicon Valley for 15 years.


Related Link:
This Is What WiFi, Cell Phones, iPads & More Are Doing To Your Child’s Brain – 100+ Scientists Are Now Petitioning The UN
Chemist Rebecca Abergel and her colleagues have found a way to remove radioactive contaminants from the body. Now they are trying to put their solution in a pill.

After the US dropped atomic bombs on the cities of Hiroshima and Nagasaki at the end of World War Two, more than 100,000 people died — many from exposure to radiation. At high doses, radiation blasts through tissues, ruptures DNA strands and alters the rhythms of cell division. Disrupted cells cause nausea, diarrhea and fever, then dizziness, weakness and hair loss. Over time they may turn cancerous. For a person who has experienced high levels of radiation in a short period of time, from a nuclear weapon or a Chernobyl-scale nuclear power plant meltdown, treatment options are limited. Doctors can make a patient comfortable, treat burns and nausea, and try to keep the radiation from spreading. But an acute dose is usually fatal. The problem is, once radiation is in the body, it can be very hard to get out.

It may not have to be this way. Nuclear weapons and accidents commonly release actinides, a group of radioactive elements at the bottom of the periodic table. Actinides such as plutonium, uranium and curium easily lock into our bones and organs, where they can emit radiation into our bodies for decades. Chemist Rebecca Abergel and colleagues at Lawrence Berkeley National Laboratory, in California, have created molecules that bind to actinides to form large, stable complexes that are easier for the body to expel. In her TEDxAix talk, “Ambivalent radionuclides,” she shares more on the idea, not least that it has been around for more than 60 years.

Hey, plutonium, you’re coming with me. Abergel’s team is developing chelators, naturally occurring molecules that can form multiple bonds to a single metal ion. Chelators are a long-established treatment for poisoning by heavy metals such as iron, arsenic and lead. (They have also been tried as risky “alternative” treatments for conditions like heart disease and autism; many doctors consider these uses of chelation to be controversial.) Abergel and her team at Berkeley have made a chelator that binds to actinides — without interfering with other metals that we need in our bodies, such as zinc or iron. The work is a continuation of research that started back in the 1950s, pretty much once people started realizing the potential havoc of nuclear warfare.

Building a chelator requires educated guessing and checking. To make their chelators, Abergel’s team worked with scaffolds inspired by the known molecular structures of iron-binding chelators in bacteria. They tinkered with parts of the scaffolds step by step, modifying properties such as the acidity and number of binding sites through a series of chemical reactions. They also tacked on additional arms. “Actinides are bigger than iron, so you need more coordinating atoms,” Abergel explains. Over the last 30 years, she and her colleagues have tested dozens of different structures. One removed 80 percent of contaminating plutonium from mice, in two days, with just a single dose.

Human trials for a radiation poisoning cure. Abergel’s team has already demonstrated the safety and efficacy of their chelators in human cell cultures and several animal models. Last year they received FDA approval for a clinical trial testing the safety of a chelator treatment in humans. The trial will give healthy volunteers one dose of the chelator, first at extremely low levels and then at increasingly higher ones, to establish the safety of the chelators and pinpoint any side effects. The researchers will never do a “controlled” clinical trial on people because that would require contaminating humans with radioactive substances. Instead, they will compare their human trial data with their animal data to establish safe, effective doses in humans.

Now, to make an anti-radiation poisoning pill. It would be possible to deliver chelators via a pill or an injection, but Abergel much prefers the pill. “If there is a mass casualty where millions of people are contaminated, you don’t want to be handing out needles,” she says. “Pills are straightforward to distribute, and can easily be crushed into a powder and mixed into something like yogurt for children or elderly people.”

The imperfect science of determining dosage. Much of the work that lies ahead for Abergel is nailing down the details of how to administer such a pill. “In animal models we’ve been looking at different contamination and treatment scenarios,” she says. “If there’s a nuclear power plant accident, what happens if we give the treatment 24 hours after contamination? Two days after? Should people take one pill every day for two weeks or twice a week for one? When do we stop treatment?” Her team is also looking at uses outside large-scale catastrophes — for instance, how the drug might benefit people who are regularly exposed to small amounts of radioactivity, such as scientists, nuclear power plant workers and uranium miners. Such everyday use would be a first for Abergel: normally, the best-case scenario for her product is that it never be used at all. (Until now, her work has relied on funding from the US government, which would add any resulting decontamination drug to its stockpile for emergency use.)

Chelator as a legitimate cancer treatment? Abergel is now experimenting with using synthesized chelators as a way to introduce therapeutic actinides to the body to treat cancer. Her idea: to attach the actinide-chelator complex to another molecule, such as an antibody, that can recognize and attach to a cancer cell. To minimize damage to healthy cells, Abergel is choosing actinides that decay and exit the body quickly. Doctors could also add tracer compounds to the actinide-chelator complex, which would show as fluorescent under certain light. “So these new platforms could target the cancer cell, bring in the radionuclide that destroys everything around it, and we can also image it,” Abergel says. It is very, very early days here — don’t ask your doctor about it quite yet. But if it works, then that of course is one treatment that definitely would be used.

As pollution from burning fossil fuels continues to heat the atmosphere, the world’s glaciers are melting at an accelerating rate. Scientists widely agree that this meltwater has been a major factor in raising global sea levels about seven inches over the 20th century.

The movement of all that water is affecting the Earth’s rotation, according to a study published Friday in the journal Science.

“If you are melting glaciers from high latitudes—in Alaska, Greenland, or Iceland—you move mass away from the pole, toward the equator, which slows the Earth down,” said Jerry Mitrovica, the study’s lead author and a Harvard geophysicist who specializes in studying sea level change. “The change in the distribution of the mass from the poles to lower latitudes also causes the rotation to wobble slightly, because it’s being redistributed unequally.”

That redistribution of mass measurably lengthened the day over the course of the 20th century.

The study showed that “glacier melt over the 20th century would have increased the duration of a day by maybe a millisecond,” said Mitrovica. While small, the change “can tell you something about how things are melting,” he said, and is a striking example of how human activities are altering Earth’s systems.
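
The physics Mitrovica describes can be sketched with conservation of angular momentum: slowing the spin by redistributing mass means the fractional change in day length roughly equals the fractional change in Earth's moment of inertia. This back-of-the-envelope estimate is my own illustration, not the study's calculation; it treats the roughly seven inches of 20th-century sea level rise as water moved all the way from the pole to the equator, which makes it an upper bound:

```python
# Back-of-the-envelope: how much would moving meltwater toward the
# equator lengthen the day? Angular momentum conservation gives
# dT/T ≈ dI/I. Standard values below; assuming all the water moves
# from the pole to the equator makes this an upper bound.
R_earth = 6.371e6        # m, Earth's radius
I_earth = 8.0e37         # kg·m², Earth's moment of inertia
ocean_area = 3.6e14      # m², area of the world's oceans
rise = 0.18              # m, ~7 inches of 20th-century sea level rise
rho_water = 1000.0       # kg/m³

mass = rise * ocean_area * rho_water   # kg of redistributed water
dI = mass * R_earth**2                 # worst case: pole -> equator
day_ms = 86400e3                       # length of a day in milliseconds
dT_ms = day_ms * dI / I_earth          # change in day length, ms

print(f"{dT_ms:.1f} ms")   # a few ms: same order as the quoted millisecond
```

The crude estimate lands within a factor of a few of the millisecond Mitrovica cites, which is what an order-of-magnitude check should do.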

For the first time, researchers have mathematically detected, in the speed of the Earth’s rotation, glacier meltwater’s movement from higher latitudes to the ocean basins.

The finding is yet more proof that greenhouse gas emissions are having an enormous impact on the Earth’s climate. “This very subtle effect, Earth’s rotation, is a way to monitor how much ice sheets are melting and adds to how we are monitoring sea level rise,” Mitrovica said.

“And more important, it shows that even these very subtle effects support the scientific consensus that we’re affecting the climate, and that effect is accelerating,” he added.

In the study, Mitrovica and his colleagues updated and corrected how scientists calculate the known impacts on the Earth’s spin and axis resulting from the end of the last ice age. In the process, they may have solved a 13-year-old conundrum in earth science called “Munk’s enigma,” which asked why the influx of water from melting glaciers wasn’t apparent in measurements of the Earth’s rotation.

“It is an important additional tool in understanding the mass balance of ice sheets, and we need as many of those as we can get,” Mitrovica said.


Related Link:
The Great Glacier Melt Spreads to Greenland’s North
Start-ups are behind the new push. Hydrogen, the universe’s most abundant element, is the fuel for any potential fusion reactor.

The machine lives in a white building in an Orange County office park so uninteresting-looking that not even the person who’s supposed to be taking me there can find it. We literally drive right past it and have to double back.

Though there are a few clues if you look closely. A towering silo of liquid nitrogen out back. A shed that turns out to be full of giant flywheels for storing energy. The machine, which is the size of a small house, draws so much juice that when they turn it on they have to disconnect from the public grid and run off their own power to keep from shorting out the whole county. If you had X-ray vision you might notice that all the iron rebar in the building’s foundations has been pulled out and replaced with stainless-steel rebar, because iron is too magnetic.

The machine is a prototype fusion reactor. It is the sole product of a small, secretive company called Tri Alpha Energy, and when it or one like it is up and running, it will transform the world as completely as any technology in the past century. This will happen sooner than you think.

It’s not the world’s only fusion reactor. There are several dozen scattered around the globe in various stages of completion. Most of them are being built by universities and large corporations and national governments, with all the blinding speed, sober parsimony and nimble risk taking that that implies. The biggest one, the International Thermonuclear Experimental Reactor, or ITER, is under construction by a massive international consortium in the south of France, with a price tag of $20 billion and a projected due date of 2027. Fusion research has a reputation for consuming time, money and careers in huge quantities while producing a lot of hype and not much in the way of actual fusion. It has earned that reputation many times over.

But over the past 10 years, a new front has opened up. The same engine of raging innovation that’s been powering the rest of the high-tech economy, the startup, has taken on the problem of fusion. There is now a stealth scene of virtually unknown companies working on it, doing the kind of highly practical rapid-iteration development you can do only in the private sector. They’re not funded by cumbersome grants; the money comes from heavy-hitting investors with an appetite for risk. These are companies most people have never heard of, like General Fusion, located outside Vancouver, and Helion Energy in Redmond, Wash. Tri Alpha is so low profile, it didn’t even have a website until a few months ago. But you’ve probably heard of the people who invest in them: Bezos Expeditions, Mithril Capital Management (a.k.a. PayPal co-founder Peter Thiel), Vulcan (a.k.a. Microsoft co-founder Paul Allen), Goldman Sachs.

The endgame for these companies isn’t acquisition by Google followed by a round of appletinis. It’s an energy source so cheap and clean and plentiful that it would create an inflection point in human history, an energy singularity that would leave no industry untouched. Fusion would mean the end of fossil fuels. It would be the greatest antidote to climate change that the human race could reasonably ask for. Saving the world: that is the endgame.

Michl (you say it like Michael) Binderbauer is one of the co-founders of Tri Alpha and its current chief technology officer. He has a Ph.D. in physics from U.C. Irvine. At 46, Binderbauer is charismatic and ultra-focused: he can talk about plasma physics, lucidly and without notes, apparently indefinitely. (We took a break after two hours.) The logical force of his arguments is enhanced by his radiant self-confidence, a trait that the fusion industry seems to select for, and by his Austrian accent–he grew up there–which inevitably reminds one of the Terminator.

Binderbauer’s confidence is infectious. Tri Alpha is probably the best-funded of the private fusion companies–to date it has raised hundreds of millions, according to a source close to the company, which is a lot of money but a tiny fraction of what’s being spent on the big government-funded projects.

One of the challenges for anybody working on fusion is that people have been talking about it way too much for way too long. The theoretical underpinnings go back to the 1920s, and serious attempts to produce fusion energy on Earth have been going on since the 1940s. Fusion was already supposed to save the world 50 years ago. “All of us fantasize about such things,” Binderbauer says. “It seems like it is the answer, so when someone says anything in that field, it usually very quickly exponentiates to a message of, Progress is already almost done. It gets hyped to a level I think is very dangerous.” (That’s one reason fusion scientists don’t love talking to journalists.)

Fusion also gets mixed up, for obvious reasons, with nuclear fission, which is the kind of nuclear power we have now, though in fact they’re very different animals. Nuclear fission involves splitting atoms, big ones like uranium-235, into smaller atoms. This releases a lot of energy, but it has a lot of drawbacks too. Uranium is a scarce and finite resource, and nuclear plants are expensive and hazardous–Three Mile Island, Chernobyl, Fukushima–and produce huge quantities of toxic waste that stays hazardously radioactive for centuries.

Nuclear fusion is the reverse of nuclear fission: instead of splitting atoms, you’re squashing small ones together to form bigger ones. This releases a huge burst of power too, as a fraction of the mass of the particles involved gets converted into energy (in obedience to Einstein’s famous E=mc²). Fusion has a vaguely science-fictional reputation, but in fact we watch it happen all day every day: it’s what makes the sun shine. The sun is a titanic fusion reactor, constantly smooshing hydrogen nuclei together into heavier elements and sending us the by-product in the form of sunlight.
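
That conversion of mass into energy can be made concrete. Here is a short sketch using the textbook deuterium-tritium reaction, the fuel mix most reactor designs target; the atomic masses are standard reference values, not figures from this article:

```python
# Energy released by one D-T fusion reaction: D + T -> He-4 + n.
# The mass that disappears (the mass defect) becomes energy via E = mc².
# Masses in atomic mass units (u); 1 u is equivalent to 931.494 MeV.
m_D, m_T = 2.014102, 3.016049     # deuterium, tritium
m_He, m_n = 4.002602, 1.008665    # helium-4, neutron
U_TO_MEV = 931.494

defect = (m_D + m_T) - (m_He + m_n)   # u of mass converted to energy
energy_mev = defect * U_TO_MEV
print(f"{energy_mev:.1f} MeV")        # prints 17.6 MeV
```

About 17.6 MeV per reaction: millions of times the few electron-volts released per atom in chemical combustion, which is why so little fuel goes so far.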

As an energy source, fusion is so perfect, it could have been made up by a child. It produces three to four times as much power as nuclear fission. Its fuel isn’t toxic, or fossil, or even particularly rare: fusion runs on common elements like hydrogen, which is in fact the most plentiful element in the universe. If something goes wrong, fusion reactors don’t melt down; they just stop. They produce little to no radioactive waste. They also produce no pollution: the by-product of fusion is helium, which we can use to inflate the balloons for the massive party we’re going to have if it ever works.

Daniel Clery puts the contrast with conventional power starkly in his excellent history of fusion, A Piece of the Sun: “A 1-GW coal-fired power station requires 10,000 tonnes of coal–100 rail wagon loads–every day. By contrast … the lithium from a single laptop battery and the deuterium from 45 liters of water could generate enough electricity using fusion to supply an average U.K. consumer’s energy needs for 30 years.”
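
Clery's coal figure is easy to sanity-check. A rough sketch, assuming typical bituminous coal at about 24 MJ/kg and about 35% plant efficiency (my assumptions, not Clery's):

```python
# Does 10,000 tonnes of coal a day really sustain a 1-GW power station?
coal_kg_per_day = 10_000 * 1000   # 10,000 tonnes in kg
energy_density = 24e6             # J/kg, typical bituminous coal (assumed)
efficiency = 0.35                 # typical thermal-plant efficiency (assumed)
seconds_per_day = 86_400

thermal_j = coal_kg_per_day * energy_density        # heat released per day
electric_watts = thermal_j * efficiency / seconds_per_day
print(f"{electric_watts / 1e9:.2f} GW")             # prints 0.97 GW
```

Just under a gigawatt of electrical output, so the quoted figure holds up.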

The running joke about fusion energy is that it’s 30 years away and always will be. It’s not a very funny joke, but historically it’s always been true.

What makes fusion hard is that atomic nuclei don’t particularly want to fuse. Atomic nuclei are composed of protons (and usually neutrons), so they’re positively charged, and as we know from magnets, things with the same charge repel each other. You have to force the atoms together, and to do that you have to heat them up to the point where they’re moving so fast that they shake off their electrons and become a weird cloud of free-range electrons and naked nuclei called a plasma. If you get the plasma really hot, and/or smoosh it hard enough, some of the nuclei bang into each other hard enough to fuse.

The heat and pressure necessary are extreme. Essentially you’re trying to replicate conditions in the heart of the sun, where its colossal mass–330,000 times that of Earth–creates crushing pressure, and where the temperature is 17 million degrees Celsius. In fact, because the amounts of fuel are so much smaller, the temperature at which fusion is feasible on Earth starts at around 100 million degrees Celsius.

That’s the first problem. The second problem is that your fuel is in the form of a plasma, and plasma, as mentioned above, is weird. It’s a fourth state of matter, neither liquid nor solid nor gas. When you torture plasma with temperatures and pressures like these, it becomes wildly unstable and writhes like a cat in a sack. So not only do you have to confine and control it, and heat it and squeeze it; you have to do all that without touching it, because at 100 million degrees, this is a cat that will instantly vaporize solid matter.

You see the difficulty. Essentially you’re trying to birth a tiny star on Earth. “It comes down to two challenges,” Binderbauer says. “Long enough and hot enough.” In other words: Can you keep your plasma stable while you’re getting it up to these crazy temperatures? The severity of the challenge has given rise to some of the most complex, most extreme technology humans have ever created.

Take for example the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory, outside San Francisco. A 10-story building with a footprint the size of three football fields, the NIF houses one of the most powerful laser systems in the world: 192 beams of ultraviolet light capable of delivering 500 trillion watts, which is about 1,000 times as much power as the entire U.S. is using at any given moment. All that energy is delivered in a single shot lasting 20 billionths of a second focused on a tiny gold cylinder full of hydrogen. The cylinder, understandably, simultaneously explodes and implodes, and the hydrogen inside it fuses. This technique is called inertial confinement fusion.
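
The scale of an NIF shot follows directly from power times time. Taking the article's figures at face value, and noting that the real pulse is shaped rather than flat at peak power for its full duration, so this is an upper bound:

```python
# Energy in one NIF laser shot, from the article's quoted figures.
power = 500e12              # W (500 trillion watts)
duration = 20e-9            # s (20 billionths of a second)
energy = power * duration   # J; upper bound, since the real pulse is
                            # shaped, not flat at peak power throughout
print(f"{energy / 1e6:.0f} MJ")   # prints 10 MJ
```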

A more common method for creating fusion is by controlling the plasma magnetically. One of the few breaks physicists catch in the quest for fusion is that plasmas are extremely sensitive to electromagnetism, to the point where electromagnetic fields can actually be used to contain and compress them without physically touching them. It’s a feat most often performed using a device called a tokamak. (The word is a Russian acronym.) A tokamak is a big hollow metal doughnut wrapped in massively powerful electromagnetic coils. The coils create a magnetic field that contains and compresses the plasma inside the doughnut.

Since they were developed in the Soviet Union in the 1950s, tokamaks have come to dominate fusion research: in the 1980s enormous tokamaks were built at Princeton and in Japan and England, at a cost of hundreds of millions of dollars. Their successor, the colossus of all tokamaks, is being built in a small town in France outside Marseilles. ITER, the International Thermonuclear Experimental Reactor, will be 30 meters tall and weigh 23,000 tons. Its staff numbers in the thousands. It will hold 840 cubic meters of plasma. Its magnets alone will require 100,000 kilometers of niobium-tin wire. Its stupendous cost is being paid by a global consortium that includes the U.S., Russia, the European Union, China, Japan, South Korea and India.

Because of their extreme size and complexity, and the political vagaries associated with their funding, fusion projects are bedeviled by cost overruns and missed deadlines. The NIF was finished seven years late for $5 billion, almost double the original budget. ITER’s estimated date for full power operation has slipped from 2016 to 2027, and even that date is under re-evaluation. Its price tag has gone from $5 billion to $20 billion; for purposes of comparison, the Large Hadron Collider cost $4.75 billion.

The goal for all these machines is to pass the break-even point, where the reactor puts out more energy than it takes to run it. The big tokamaks came close in the 1990s, but nobody has quite done it yet, and some scientists find the pace frustrating. “Academics aren’t necessarily good at adhering to a schedule, promising something and delivering it, on budget and on time,” Binderbauer says. “The federal process doesn’t condition you to live in that mind-set.” And even when it does get up and running, ITER will never supply a watt of power to the grid. It’s a science experiment, not a power plant. Proof of concept only.

Fusion research, its critics charge, is too slow, too cautious, too focused on lavishing too much money on too few solutions and too many tokamaks. “In a university lab the name of the game, the end product, is a paper,” says Michel Laberge, founder of General Fusion in Vancouver, who has a Ph.D. in physics. “You want to get to making energy, but it’s not the primary goal. The primary goal is to publish a lot of papers, to go to conferences and understand very thoroughly all the little details of what is going on.” Understanding is all well and good, in an ideal world, but the real world is getting less ideal all the time. The real world needs clean power and lots of it.

The driving force behind the founding of Tri Alpha was a physicist at U.C. Irvine named Norman Rostoker. Rostoker, who died in 2014, was a plasma physicist with both a deep understanding of mathematics and a flair for practical applications. He also had an indomitable will and a pronounced independent streak–anybody who talks about him ends up using the word maverick sooner or later. Binderbauer was one of his protégés.

Even in the early 1990s, Rostoker was skeptical of the tokamak hegemony. In a tokamak, the particles in the plasma move in tight spiral orbits around the magnetic field lines. But it’s hard to keep those particles from being bumped out of their little orbits by electromagnetic turbulence, and when that happens the plasma becomes unstable and loses precious heat. One way scientists fight this instability is by building bigger and bigger tokamaks, but bigger means more complex, and more power-hungry, and more expensive. Rostoker thought there had to be a better way.

He found one in particle accelerators, those colossal rings, like the Large Hadron Collider, that crash subatomic particles into each other. In accelerators, particles travel on wide, conspicuously stable orbits. Rostoker and Binderbauer wondered if you could do something similar in a fusion reactor. They spent a couple of years thinking about it and decided, short answer, probably. “If you can bring accelerator physics into the realm of fusion, you can actually make a better-behaved plasma, one that can give you long timescales,” Binderbauer says. “Then you can invest energy and heat it.”

Rostoker’s other key insight had to do with the flow of people and money around the reactor: he thought the private sector would be a better place to get things done than a university lab. Essentially he recategorized fusion power from an object of lengthy, lofty scientific inquiry to just another product to be shipped. “Fusion is in the end an application, right?” Binderbauer says. “The problem with fusion typically is that it’s driven by science, which means you take the small steps. The most predictable next step, the one you’re comfortable with. So it doesn’t necessarily connect with what you want. Norm said, You’ve got to look at the end in mind. You’ve got to unravel it, reverse-engineer it. What would a utility want? What would make sense? And design something from there, and be agnostic as to how hard the physics might be.”

Raising money was a challenge: tokamaks were eating up all the grant money, and energy startups are expensive, risky long-term bets, especially to Silicon Valley investors spoiled from flipping web startups for quick paydays. Recruiting was tough too: building a fusion device requires a blended culture of physicists and engineers, two groups who don’t historically mix well. For the first few years, the company ran on the brink of insolvency. “You have money for a year or two to develop something, deliver, and go get the next chunk,” Binderbauer says. “It’s not the academic risk profile.”

To keep the pace up they freed themselves from the baggage of theory: as long as something worked, they didn’t analyze to death why. The idea was to stay pragmatic and iterate rapidly, spend as little as possible and not fear failure. “This is one of the failures of the governmental way of running it,” Binderbauer says. “It didn’t create enough diversity of ideas, and let those freely be pursued to failure. Say, this is where we ultimately want to go, what are the critical steps to get there, what are the risk elements of the path to get there, and can I test for some of these risks without spending a hundred million bucks?”

Some academics would disagree, but no one can deny that Tri Alpha has managed to build a prototype fusion reactor quickly on a tiny budget. The company has a panel of advisers–including Burton Richter, who won the Nobel Prize for Physics in 1976, and Ronald Davidson, past director of fusion labs at both MIT and Princeton–and Binderbauer has fond memories of unveiling his first prototype to them in 2008. “There were like jaws dropping. It was like, holy sh-t, these guys actually did this? On this time frame? This is not possible. Then we had world-record data by August. That was a year basically from seeing dust to seeing physics data taken that’s better than anyone else ever did.”

Davidson confirms that impression, though in less colorful language. “In the framework of a Department of Energy laboratory, and also in some universities, the level of regulations and restrictions you have on how you do things is somewhat different than in the industry,” he says. “The industry can be quite nimble, relatively speaking, in exploring ideas and testing them for the first time.”

Tri Alpha’s reactor is very different from the towering tokamaks that dominate the fusion skyline, or the supervillain lasers of the NIF. You could think of it as a massive cannon for firing smoke rings, except that the smoke rings are actually hot plasma rings, and the gunpowder is a sequence of 400 electric circuits, timed down to 10 billionths of a second, that accelerate that plasma ring to just under a million kilometers an hour.

And there are actually two cannons, arranged nose to nose, firing two plasmas straight at each other. The plasmas smash into each other in a central chamber, and the violence of the collision heats them to 10 million degrees Celsius and merges them into a single plasma 70 to 80 centimeters across, shaped more or less like a football with a hole through it the long way, quietly spinning in place.

But a fusion reactor’s work is never done. Positioned around that central chamber are six massive neutral beam injectors firing hydrogen atoms into the edges of the spinning cloud to stabilize it and keep it hot. Two more things about this cloud: one, the particles in it are moving in a much wider orbit than is typical in, say, a tokamak, and hence are much more stable in the face of turbulence. Two, the cloud is generating a magnetic field. Instead of applying a field from outside, Tri Alpha uses a phenomenon called a field-reversed configuration, or FRC, whereby the plasma itself generates the magnetic field that confines it. It’s an elegant piece of plasma-physics bootstrappery. “What you get within forty millionths of a second from the time you unleash your first little bit of gas,” Binderbauer says proudly, “is this FRC sitting in here, fully stagnant, no more moving axially, and rotating.”

The machine that orchestrates this plasma-on-plasma violence is something of a monster, 23 meters long and 11 meters wide, studded with dials and gauges and overgrown with steel piping and thick loose hanks of black spaghetti cable. Officially known as C-2U, it’s almost farcically complicated–it looks less like a fusion reactor than it does like a Hollywood fantasy of a fusion reactor. It sits inside a gigantic warehouse section of Tri Alpha’s Orange County office building surrounded by racks of computers that control it and more racks of computers that process the vast amounts of information that pour out of it–it has over 10,000 engineer control points that monitor the health of the machine, plus over 1,000 physics diagnostic channels pumping out experimental data. For every five millionths of a second it operates it generates about a gigabyte of data.

In August, Tri Alpha announced that its machine had generated some very interesting data. So far the company’s primary focus has been on the long-enough problem, rather than the hot-enough part; stabilizing the plasma is generally considered the tougher piece in this two-piece puzzle. Now Binderbauer believes that they’ve done it: in June the reactor proved able to hold its plasma stable for 5 milliseconds.

That’s not a very long time, but it’s an eternity in fusion time, long enough that if things were going to go pear-shaped, they would have. The reactor shut down only because it ran out of power–at lower power, and hence with slightly less stability, they’ve gone as long as 12 milliseconds. “We have totally mastered this topology,” Binderbauer says. “I can now hold this at will, 100% stable. This thing does not veer at all.” He didn’t live to see it, but Rostoker was right. The cat is in the sack. Tri Alpha has tamed the plasma.

Some other people may be right too. Where fusion is concerned, the private sector supports a robustly diverse range of methodologies. In 2002, Laberge, an intense redhead with a thick French-Canadian accent and a droll sense of humor, realized he’d spent enough of his life designing laser printers. “I decided to start a fusion company,” he says. “Which is pretty insane, but that’s what I went for. I guess, go big in life.”

Laberge too was skeptical of the monoculture that dominated fusion science. “The thing in fusion is, when they started they tried many different approaches, and then there’s one or two that had a bit of success and whatnot, and then everybody jumped on those approaches,” he says. “So it is a good hunting ground for new startup companies, to go and see those abandoned efforts.” The approach he hit on is called magnetized target fusion: crudely put, you create a spinning vortex of liquid metal, inject some plasma into its empty center, then squeeze the vortex, thereby squeezing the plasma inside it and causing it to heat up and fuse.

Laberge couldn’t get enough grant funding, so he took the idea to investors instead and founded General Fusion. Now General Fusion has 65 employees and is one of a small handful of companies racing Tri Alpha to the break-even point. To date it has raised $94 million and built prototypes of the reactor’s major subsystems, including a spherical chamber for the liquid metal vortex with 14 huge spikes projecting out at all angles–the spikes are massive hammers that do the squeezing. It looks, if possible, even more like Hollywood’s idea of a fusion reactor than Tri Alpha’s. “The tokamak people have a very long timeline, which I don’t like,” Laberge says, “so we’d like to speed that up, and we think we can move faster.” Predictions, like comparisons, are invidious, but when coerced he says, “About a decade to producing energy would be a good timeline to have.”

Helion Energy, another venture, based in Redmond, Washington, is already on its fourth-generation prototype. Its approach also has two plasmas colliding in a central chamber, but it will work in rapid pulses rather than sustaining a single static plasma. Helion is focused on developing a smaller-scale, truck-size reactor, and doing it as fast as possible. The company’s website states in no uncertain terms that it will have a commercial reactor operational within six years. (Helion told us it was too busy building fusion reactors right now to participate in this article.)

And there are others. Industrial Heat in Raleigh, N.C.; Lawrenceville Plasma Physics in New Jersey; Tokamak Energy outside Oxford, England. Lockheed Martin’s Skunk Works division is developing what it calls a compact fusion reactor, which it says will fit on the back of a truck. It also says it’ll have a working prototype within five years. (And it said that last year, so four to go.)

There’s a kind of cheeky underdog defiance in the attitude of the private sector to the public, but the attitude the other way is a bit more collegial. “They’re very interesting,” says Professor Stewart Prager, director of the Princeton Plasma Physics Laboratory. “Some more than others. There’s a range. It’s definitely good to see private investment in fusion.” Dennis Whyte, director of the Plasma Science and Fusion Center at MIT, understands the impatience that drives the startups. “Their argument is that if the science breaks go their way, they will be able to accelerate the pace of getting fusion energy on the grid, and I overall agree with that philosophy,” he says. “I’m part of the quote unquote Establishment that they’re railing against, but you can sense my frustration, because I’m not happy about the delays and so forth.” (He might well be frustrated: Congress has cut funding for MIT’s fusion reactor, which will cease operations next year. He’s currently focused on designing a smaller, modular reactor that takes advantage of recent advances in superconducting technology.)

Within the private sector, there’s a good deal of genial trash talk. The trash talk about Tri Alpha tends to focus on the question of fuel: When you’re doing fusion, which atomic nuclei do you fuse? By far the most popular answer is deuterium and tritium, two isotopes of hydrogen. This is fusion’s low-hanging fruit, because deuterium and tritium fuse at a lower temperature than any other option, a comparatively mild 100 million degrees Celsius. ITER uses D-T fusion (as it’s known), as do the NIF, the National Spherical Torus Experiment at Princeton, Lockheed Martin, General Fusion and almost everybody else.

But there are catches. One is that tritium is rare, so you have to make it. The other is that the reaction emits, along with an isotope of helium, a neutron, which is a problem because when you throw a lot of free neutrons at something it eventually becomes radioactive. That means you’re stuck regularly replacing parts of your reactor as they become too hot to handle. Binderbauer is scathing on the subject of D-T fusion. “Let’s say you have success on ITER,” he says. “You’ve still got another many decades of materials research to try to make something that lasts more than six to nine months, in the hellish bombardment of neutrons it is going to have to live in.”

But there are engineering solutions to the problem: that vortex of liquid metal in General Fusion’s reactor will be a mixture of lead and lithium, which will catch the neutrons. As a bonus, when you hit lithium with neutrons, you get tritium. So two birds, one stone.

Helion’s reactor will fuse deuterium and helium-3, which produces fewer neutrons, though it requires more heat and raises the problem of finding enough helium-3, which is also rare. Tri Alpha plans to fuse protons (otherwise known as hydrogen nuclei) with boron-11. This reaction produces no neutrons at all, and both elements are plentiful and naturally occurring. “We’re always saying, if you want to buy our plant,” Binderbauer says, “we’ll give you a lifetime supply of fuel for free.” The reason hardly anybody else is pursuing it is that proton-boron-11 fusion requires far higher temperatures, insanely higher: 3 billion degrees Celsius.

No one really knows how plasma will behave at that temperature, and virtually everybody I talked to was skeptical about Tri Alpha’s making it work, and considered the engineering challenges of D-T fusion to be vastly preferable. “Fusion is hard already, even when it’s D-T, and you have to realize how much harder this is than D-T,” says Whyte. “It’s O.K. to take a physics leap, but you also don’t want it to be so big that you worry about its viability.” Laberge felt the same way: “It’s like learning to run before you can walk. Or somebody told me it’s like learning to fly before you can walk. You can argue that General Fusion is outrageously ambitious trying to do fusion, but Tri Alpha is outrageously outrageously ambitious.”

Binderbauer, who is not intimidated by anything, is not intimidated by this either. His next move will be to tear down Tri Alpha’s current reactor and build a new one that will scale up to the necessary temperatures. He points out that particle accelerators can create temperatures in the trillions. “Going to higher temperatures is not that hard,” he says. “It sounds terrible, because it’s billions of degrees, but it’s not. You use techniques much like what you use in a microwave. They’re very similar principles.” You have to imagine the Austrian accent to get the full effect.

Everybody in the fusion industry shares a worldview in which the transformation of the globe by fusion power is imminent. I asked Binderbauer how confident he was that he would see a practical fusion reactor in his lifetime, and his answer was “Very. Scientifically I’m very confident. Now that we have this, this is the foundation.” He thinks he understands theoretically what will happen as his machine claws its way up to 3 billion degrees, and the theory tells him it’s possible. “There should be no physics that says it won’t be. But you gotta test it. This is the field where nature’s the ultimate arbiter, so there’s some risk there.”

Binderbauer’s Austrian rigor restrains him, barely, from making brash predictions about when all this is going to happen. “People tell you they’ll have a reactor in five years–I know it’s impossible. And it’s not because I’m negative. I want this too, and we work as fast as we can, but I know it’s more than five years. It just is.” Try to pin him down on a specific timeline for Tri Alpha and he writhes like a superheated plasma. “It’s not true that it takes 30 years and will always take 30 years. It doesn’t. I’m not prepared to tell you, X is the number of years till we have a commercial reactor here. But I will tell you, we are truly about three to four years from the point where the risk changes from a science risk to an engineering risk. And I can certainly see that within a decade such things can mature to the point where you can have the first commercial steps.”

There may be a lot of those steps. The utilities will be the ones making the actual transition, and for fusion to be of any earthly use to anybody it will have to make business and engineering sense to them, because fusion plants will be expensive. Unlike solar or wind, fusion would provide energy constantly, not intermittently, but there would have to be enough of it. The gain (the ratio of energy-out to energy-in) of a commercial fusion plant would have to be in the 15-to-20 range; right now ITER’s target gain is 10; to date no fusion reactor has yet reached a ratio of 1, the break-even point. Then there’s the question of how exactly to extract that energy from the reactor in the form of heat, so that it can plug into the existing infrastructure.
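The gain arithmetic described above is simple enough to sketch (the 500 MW out for 50 MW of heating example below is ITER's widely stated target, added here as an assumption for illustration; it is not a figure from this article):

```python
# Fusion "gain" is just the ratio of energy out to energy in;
# the thresholds below are the ones quoted in the text above.
def gain(energy_out: float, energy_in: float) -> float:
    """Ratio of fusion energy produced to energy spent running the reactor."""
    return energy_out / energy_in

BREAK_EVEN = 1            # output merely equals input; not yet achieved
ITER_TARGET = 10          # ITER's current target gain
COMMERCIAL = (15, 20)     # rough range a commercial plant would need

# ITER's stated goal: 500 MW of fusion power from 50 MW of heating power.
assert gain(500, 50) == ITER_TARGET
```

Note that even a gain of 10 only counts the plasma heating power; converting the output heat to electricity, and running the rest of the plant, is why commercial operation demands the higher range.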

But those steps would be giant leaps for mankind. Bill Gates is currently on a global campaign trying to raise awareness about how badly our addiction to energy is destroying the environment. He’s putting $2 billion of his foundation’s money into it. “We need innovation that gives us energy that’s cheaper than today’s hydrocarbon energy, that has zero CO₂ emissions, and that’s as reliable as today’s overall energy system,” he says in the November issue of the Atlantic. “We need an energy miracle.” (He personally has invested in TerraPower, a maker of next-generation fission plants.)

To assess the precise probability that fusion will or won’t be that miracle is beyond the remit of a journalist without a Ph.D. in plasma physics, but as miracles go, it’s looking a lot more plausible than most. Even Prager, head of the Princeton Lab, who considers the claims of the private sector to be overconfident, still believes it’s a question of when not if. “I think it’s inevitable. And I don’t think I’m alone in that. You can’t get commercial fusion in 10 years, but I think we’ll have commercial fusion, fusion on the grid, in the 2040s. It may sound like a long way away, but in terms of mitigating climate change, fusion will play a very critical role.”

Fusion may just turn out to belong to that category of human achievement, like powered flight and moon landings, that appeared categorically impossible right up until the moment somebody did it. At the very least, a lot of very smart people are betting their money and their careers on it. As for the rest of us, we may already have bet the planet.

Read more:
Germany to switch on a revolutionary nuclear fusion machine by end of 2015
Obese children's health improves with less sugar

Calories are not created equal. Reducing consumption of added sugar, even without reducing calories or losing weight, has the power to reverse a cluster of chronic metabolic diseases in children in as little as 10 days, according to a study by researchers at UC San Francisco and Touro University California.

"This study shows that sugar is metabolically harmful not because of its calories or its effects on weight; rather sugar is metabolically harmful because it's sugar," says lead author Dr Robert Lustig, a paediatric endocrinologist.

In the study, 43 Latino and African-American children between the ages of nine and 18 who were obese and had at least one other chronic metabolic disorder - such as hypertension, high triglyceride levels or a marker of fatty liver - were given nine days of food that restricted sugar but substituted starch to maintain the same fat, protein, carbohydrate and calorie levels as their previous diets.

Total dietary sugar was cut from 28 per cent to 10 per cent of total calories, and fructose from 12 per cent to 4 per cent. The foods included turkey hot dogs, potato chips, and pizza, all bought at local supermarkets.

At the end of the diet, virtually every aspect of the participants' metabolic health improved, without change in weight. Diastolic blood pressure decreased by 5 mm Hg, triglycerides by 33 points, LDL-cholesterol ("bad" cholesterol) by 10 points, and liver function tests improved. Fasting blood glucose went down by five points and insulin levels were cut by one-third.

Green offices linked to higher cognitive function

People who work in well-ventilated offices with below-average levels of indoor pollutants and carbon dioxide have much higher cognitive functioning scores - in crucial areas such as responding to a crisis or developing strategy - than those who work in offices with typical levels, according to a new study led by scientists at the Harvard T.H. Chan School of Public Health.

The researchers looked at people's experiences in "green" versus "non-green" buildings, and both the participants and the analysts were blinded to test conditions to avoid biased results. The decision-making performance of 24 participants - including architects, designers, programmers, engineers, creative marketing professionals and managers - was analysed while they worked in a controlled office environment for six days. At the end of each day, the researchers conducted cognitive testing on the participants.

They found that cognitive performance scores for the participants who worked in the green environments were up to double those of participants who worked in conventional environments. The largest improvements occurred in the areas of crisis response, strategy and information usage. The findings suggest that the indoor environments in which many people work daily could be adversely affecting cognitive function - and that, conversely, improved air quality could greatly increase the cognitive function performance of workers.

Peter Kammerer says the city lags others in embracing the trend to share underused assets and services, a sign that we lack a sense of community.

Some governments, but not Hong Kong's, have encouraged carpooling as it reduces journeys and, as a result, lessens road congestion and pollution.

There's only so much space in a Hong Kong flat. The longer we live in it, the more stuff we collect - and we only realise how much we've gathered when we move. Books, electrical appliances, DVDs, tools and on and on are unearthed, most of it destined to be thrown away along with the furniture and old clothes that we would rather not take with us. It's a waste and we feel bad about putting them into the garbage, but what can be done in a place where high rents make second-hand shops a rarity?

The answer lies in the sharing economy - the idea behind Airbnb, carpooling and tool banks. It's taken off in North America and Europe, but is still a novelty in Hong Kong. Poor understanding about sustainability may have something to do with it, or maybe it's a lack of will by the government to push recycling. I suspect much of it has to do with laziness, selfishness and an inability to trust.

Smartphones and social media have, after all, made it possible for us to easily connect with someone who may want what we no longer need or have limited use of. Those with particular expertise or skills can also share them, while people with spare time can offer to babysit or run errands. Goods, services and time can be given free or charged to make better use of what we have. There's a sense of hippyness about it, something entrepreneurial, but above all, it's about being less wasteful.

We're most familiar with the room-rental company Airbnb, but there are untold numbers of others, ranging from the online learning community for creators, Skillshare, through the web marketplace for medical equipment, Cohealo, to car-sharing and rental.

Rachel Botsman's 2010 book, What's Mine Is Yours: How Collaborative Consumption Is Changing the Way We Live, drove the idea into popular consciousness. She defines the sharing economy as "an economic system based on sharing underused assets and services, for free or a fee, directly from individuals".

The concept is not new - libraries do it with books and poorer communities have always shared to make ends meet. Carpooling - where vehicle owners take turns to drive others to save costs - is as old as suburbs. Some governments, but alas, not Hong Kong's, have encouraged it as it reduces journeys and, as a result, lessens road congestion and pollution. Taxi companies don't like the lost business, just as they object to the competition of newcomers like Uber, with their app-based ease and efficiency.

But for sharing to be effective, communities have to work together. That requires selflessness and trust - commodities that are sometimes in short supply in Hong Kong, where consumerism is king and neighbours are more often than not avoided to prevent them from prying into lives.

Creating neighbourliness was one of the reasons Tai Po resident Albert Lui turned to Facebook a few months ago; he figured that by looking for people willing to share his drive to work in Central, he could make new friends while improving traffic. He had noticed many cars had one occupant - the driver.

Most private cars are used only an hour a day and sit idle in parking spaces the rest of the time. Hourly rental services, like Zipcar in the US, would make sense. That also raises questions about the cost-effectiveness of the government's 6,430 vehicles, among them 1,475 sedans and 643 buses. Surely renting when needed saves public funds, as governments elsewhere have found?

The sharing economy will catch on in Hong Kong. More people like Lui will push it along. But for it to work effectively, we need to be less possessive and reticent about dealing with strangers.

Related link:

"One Mega-City, Many Systems": The Evolution of Hong Kong

We’re disappointed that big news in France hasn’t made its way to the top of U.S. headlines: the French National Assembly recently passed a law that will help to limit young children’s exposure to electromagnetic fields (EMFs) generated by wireless technologies. Two years in the works, the law encompasses various rules, including:

-Banning WiFi in any childcare facilities catering to children under the age of 3.
-Requiring cell phone manufacturers to recommend the use of hands-free kits to everyone.
-Banning any advertising that specifically targets youth under the age of 14.

The law, passed by a majority vote and adopted on January 29, 2015, is the first in France to establish that WiFi over-exposure may indeed be hazardous to young children — a controversial topic, not just in France, but around the world.

The law, entitled “An Act on Sobriety, Transparency, Information and Consultation for Exposure to Electromagnetic Waves,” comprises many sections, but most importantly it sets a good example of taking a precautionary approach to the potential health risks of WiFi exposure. Research ranging from informal to formal suggests that chronic exposure to WiFi may be harmful to young people: some current literature reports that the effects of exposure to high-powered WiFi environments may include attention problems, cardiac irregularities, seizures, fatigue and other health problems. Another recent scientific report shows that kids’ brains may absorb twice as much cellphone radiation as adults’ brains. Last, but certainly not least, screen addiction has become a very real health and well-being issue. Unfortunately, and contrary to the wishes of the law’s initial proponents, WiFi will still be permitted in primary schools. That said, we’re still very happy to hear that France is taking small steps to alleviate some of the problems caused by screens and WiFi, and we hope to see similar actions in the United States and other places around the world.
