Cameron Abadi
Last month in California, a nuclear fusion experiment produced 3.15 megajoules of energy from only 2.05 megajoules of energy input. That surplus has been treated as a major breakthrough in the future of energy because it was produced through the process of nuclear fusion. Experts have talked for decades about fusion’s potential as a carbon-neutral source of energy without any of conventional nuclear power’s toxic waste.
What were the economics behind this breakthrough technology? Might it provide a status boost to old-fashioned engineering relative to computer engineering? And what’s the path from laboratory success to industrial use? Those are a few of the questions that came up in my recent conversation with FP economics columnist Adam Tooze on the podcast we co-host, Ones and Tooze. What follows is an excerpt, edited for length and clarity.
Cameron Abadi: This breakthrough was achieved by the National Ignition Facility (NIF). Could you describe the history of the NIF and its relationship to the U.S. government?
Adam Tooze: It’s a project that goes back originally to some really far-out thinking in the 1950s about uses that could be made of atomic bombs for the purposes of power generation. And the original idea was literally to organize a continuous stream of atomic explosions underground—you know, find some suitably stable caves, and explode several atomic bombs a day to keep a huge mass of water boiling to generate lots of steam. Anyway, that’s where it started.
But out of all of this, from the late 1960s onward, came more serious programs in fusion energy, which essentially focused on lasers. And that’s what this National Ignition Facility is—it is the ultimate fire lighter, right? Basically it’s a gigantic torch or something—the sort of effect that you generate as a Boy Scout or a Cub Scout or whatever, when you start a fire by concentrating the heat of the sun with a magnifying glass. So that’s essentially what we’re doing. And the stunning success of the current round of experiments, announced to the public by the U.S. Department of Energy a few weeks ago, is that for the first time ever, the amount of energy generated by the fusion reaction is larger than the amount of energy fired at it by the laser.
Of course, the amount of energy necessary to generate the laser beam is multiples larger—in the case of this laser, somewhere in the region of 150 times larger than the amount that actually reaches the fuel materials. So this is still a powerfully net-negative reaction that we have going on here; it uses more energy than it produces. But at least hypothetically, if you could increase the yield and reduce the energy demands of the lasers, you could end up with a process that actually generates power on a large scale. The total cost to date is on the order of, I think, about $3.5 billion.
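As a rough back-of-the-envelope check, using only the numbers cited in this conversation and treating the roughly 150-fold wall-plug factor as an approximation rather than an exact measurement, the two relevant gain ratios work out as follows:

\[
\text{target gain} \approx \frac{3.15\ \text{MJ}}{2.05\ \text{MJ}} \approx 1.5,
\qquad
\text{facility-level gain} \approx \frac{3.15\ \text{MJ}}{150 \times 2.05\ \text{MJ}} \approx \frac{3.15}{307.5} \approx 0.01.
\]

In other words, the fuel gave back about one and a half times the energy the laser delivered to it, but only about 1 percent of the energy the facility as a whole consumed to fire the shot.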
CA: How long are we still from having fusion as a workable source of energy? What is the path generally from basic research to industrial use?
AT: I think the only honest answer is that we do not know. You know, there was somebody talking to the New York Times, and it really took me aback because this expert assumed that the answer was half a century away. And, you know, the optimism engendered by this extraordinary news from the lab suggested otherwise, but it could easily be many decades.
CA: The idea of fusion energy goes back almost a century, and it seems like the promise has always been that the big breakthrough is just one decade away. What accounts for this consistent obsession with this particular source of energy? What makes this still-undeveloped energy source so much more attractive than, say, the renewable energy sources we already have?
AT: I think, fundamentally, because it’s gee-whiz, final-frontier, extraordinary stuff. And the physics involved is mind-blowing; the engineering is crazy and so much more exciting than a beat-up solar panel sitting in a field somewhere or on a roof, or a windmill slowly turning.
And I think commitments to this kind of technology—like fusion, like the development of atomic power—are essential. I don’t think we should back away from them at all. We should absolutely have strong, viable, energetic, practical, fundamental energy research programs in all of these areas. We cannot afford to rule out any technology at this point, given the desperation of our situation in the face of the climate crisis. And in fact, we should recognize that the overwhelming majority of fundamental energy research funding in the last half century has gone to atomic power rather than renewables. I mean, if you look at the national funding levels from the 1970s onward, it’s overwhelmingly tilted toward high-energy physics because it’s super sexy for physicists. It’s directly related to the military-industrial complex, and so the synergies are there. It’s also very expensive; it requires a lot of capital investment, so the engineering companies like getting in on this. You know, as much as this National Ignition Facility is a public project, the $3.5 billion was mainly not spent on scientists; it was mainly spent on the extremely complex raw materials and labor necessary to build the facilities, and much of that goes to the private sector. So there is a huge private-sector stake in these kinds of projects.
But having said all of that, our experience with this particular set of technologies—those to do with nuclear power, both fission and fusion—at the level of economics and at the level of politics over the last 50 years has been sobering. On the whole, they appear at this point to be massively unpopular technologies, in some cases hugely politicized ones, and incredibly expensive in terms of capital costs—not the cost of operating them but the cost of building them. And so, a realistic energy strategy that addresses a crisis where we need to make huge strides in the next 20 to 30 years should not rule those technologies out, but it should realistically gauge how much of a contribution they can make. And in both Europe and the United States, there is evidently a case for maintaining the existing capacity, but it’s pretty difficult to see what the case is for investing in new capacity when the costs are as explosively uneconomic as they are.
So that is why I find it difficult to make the case for either conventional atomic power or fusion power as an immediately practical or relevant answer to the issues facing Western countries as they search for a solution to the energy transition and decarbonization. And we are lucky, extraordinarily lucky, that renewable technologies have come on as quickly as they have. We should double down on this. We should invest even more.
CA: Can this kind of breakthrough serve to raise the status of materials engineering relative to computer engineering or even financial engineering? Is the old-fashioned kind of engineering in need of that kind of status boost in our society?
AT: I mean, my instinct was to immediately say, yes, of course it is. You know, poor downtrodden engineers, they need a pat on the head. But then I actually looked at the data from the National Science Foundation, and it turns out that among Ph.D.s of all types, it’s the humanities and the social sciences that we need to worry about, because the share of doctorates in engineering—and this is distinct from computer science—has in fact been rising dramatically over the last 20 years. The share of doctorates going to engineering in the broadest sense has risen from 14 percent to 20 percent. So, hundreds of thousands of brilliant young people are easily smart enough to recognize just how exciting this is and flock into these very tough, very demanding disciplines in very large numbers—and not just, of course, in the United States but all over the world. And thank God for that.