The Future is Fusion

in all experiments of which i am aware, plasma has not been produced at a temperature that could sustain the reaction

Already explained three times. We can't contain that temperature, but we can create it. So all experiments to date have had to be aborted before they reach that critical self-sustaining temperature threshold.

if it is sustainable without external input

Have you ever heard of the Sun? Do you doubt that the Sun is a self-sustaining hydrogen fusion reactor? Or any of the trillions of similar stars in the universe?
 
no, cannon, there are tokamaks that can handle fusion temperatures and containment. again, containment is not the fundamental issue; the issue is net return of power. you've claimed that there is a reactor which netted energy! this is great news. where did that happen, and when?

please don't say the center of the solar system.
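(a quick way to frame "net return of power" is the fusion gain factor Q: fusion power out over heating power in. a minimal sketch in python; the JET figures are the commonly cited 1997 D-T numbers, so treat them as approximate:)

# fusion gain factor: Q = P_fusion / P_heating.
# Q = 1 is scientific breakeven; a power plant needs Q well above 1
# because converting the heat back to electricity is lossy.
def fusion_gain(p_fusion_mw, p_heating_mw):
    return p_fusion_mw / p_heating_mw

# JET's 1997 deuterium-tritium record is commonly cited as
# ~16 MW of fusion power against ~24 MW of plasma heating:
print(fusion_gain(16.0, 24.0))  # ~0.67, still short of breakeven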
 
OK, I found this. It contradicts my previous claims and it appears to be current:

In an operating fusion reactor, part of the energy generated will serve to maintain the plasma temperature as fresh deuterium and tritium are introduced. However, in the startup of a reactor, either initially or after a temporary shutdown, the plasma will have to be heated to its operating temperature of greater than 10 keV (over 100 million degrees Celsius). In current tokamak (and other) magnetic fusion experiments, insufficient fusion energy is produced to maintain the plasma temperature.

I have no idea whether or not the experiment I am recalling utilized magnetic containment of the plasma.
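As a sanity check on the "greater than 10 keV (over 100 million degrees Celsius)" figure in that quote, temperature and mean particle energy are related through the Boltzmann constant, E = k_B * T; a quick conversion sketch:

# convert a plasma temperature quoted in keV to kelvin via E = k_B * T
K_B = 1.380649e-23         # Boltzmann constant, J/K
EV_TO_J = 1.602176634e-19  # one electronvolt in joules
def kev_to_kelvin(t_kev):
    return t_kev * 1e3 * EV_TO_J / K_B
print(kev_to_kelvin(10.0))  # ~1.16e8 K, i.e. over 100 million degrees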
 
you've not answered which experiment, in which reactor, you are referring to.

the part of your explanation which challenges my understanding is the gross inaccuracy. in all experiments of which i am aware, plasma has not been produced at a temperature that could sustain the reaction, if it is sustainable without external input. this is why there is a laser race over fusion technology. there's no way that could be mistaken for exothermic to the degree you've claimed. to my knowledge, not a single trial has ever given off more energy than was applied to it.

certainly your claims constitute ground-breaking proof of concept. provide a link.

The only experiments that I have ever heard of that led to anything like he described (enough energy to light the Eastern seaboard in a very short period of time) all resulted in massive explosions and used fission reactions to trigger the fusion. As far as I know, no one is even trying to use that technology to build a power supply.
 
this is where i was coming from with the laser race and fusion tech. the cutting edge of plasma containment is magnetic, i think.
 
COME ON!!! The link has been published repeatedly

ITER - the way to new energy
 
I know how to find the example I was referring to; I posted it on another board with a primitive search function. It will take me 20-30 minutes tonight to find the source article, but I will.

What I recall is that the experiment used a finite supply of hydrogen, held in place in one small spherical space, with 500 lasers targeting that space. They didn't intend to produce a sustained reaction, just to reach the threshold where it could sustain itself if they fed in more hydrogen.
 
Were you thinking of the Shiva Project?

"Two experimental laser fusion devices have been developed at Lawrence Livermore Laboratory, called Shiva and Nova. They deliver high power bursts of lase light from multiple lasers onto a small deuterium-tritium target. These lasers are neodymium glass lasers which are capable of extremely high power pulses."
See Laser Fusion
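For scale on these laser-fusion devices, the usual figure of merit is target gain: the fusion energy released divided by the laser energy delivered to the capsule. A minimal sketch with purely illustrative numbers, not measured Shiva or Nova results:

# inertial-confinement figure of merit: target gain G = E_fusion / E_laser.
# G > 1 means the capsule released more energy than the laser delivered,
# which still ignores the far larger wall-plug energy behind the laser.
def target_gain(e_fusion_kj, e_laser_kj):
    return e_fusion_kj / e_laser_kj
# hypothetical shot: 10 kJ of laser light on target, 0.1 kJ fusion yield
print(target_gain(0.1, 10.0))  # 0.01, far below unity gain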
 
Huh. Fusion. In maybe 30-50 years we'll have the ability to get an industrial prototype online. And then it will probably be protested and shut down as too dangerous by the econazis.

So this is a solution now, how?
 
ITER has not been constructed yet, buddy. COME ON!!! :doubt:
 
that setup sounds like the national ignition facility or HiPER (in the works). they've not done anything like what you described earlier, but they promise to input enough heat to cross the breakeven threshold. even so, i don't see sustainable heat coming from the reaction without fission involved as well. much of the energy is in neutrons, which could be converted to heat if they interact with plutonium or uranium or something.

your earlier testimony sounds like a number of bomb-type experiments run to measure the potential of different fusion cycles. there was no connection between that approach and an energy source beyond that. the idea of a dynamite-powered reactor is also implausible.
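to put a number on "much of the energy is in neutrons": in D + T -> He-4 + n, momentum conservation splits the 17.6 MeV yield in inverse proportion to the product masses, so the neutron carries about 80% of it. a quick check:

# D + T -> He-4 (3.5 MeV) + n (14.1 MeV); momentum conservation gives
# each product a share inversely proportional to its mass.
Q_DT = 17.6              # MeV released per reaction
M_N, M_ALPHA = 1.0, 4.0  # approximate masses in atomic mass units
e_neutron = Q_DT * M_ALPHA / (M_N + M_ALPHA)  # 14.08 MeV, ~80%
e_alpha = Q_DT * M_N / (M_N + M_ALPHA)        # 3.52 MeV, ~20%
print(e_neutron, e_alpha)
# only the charged alpha stays in a magnetically confined plasma;
# the neutral neutron flies straight out.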
 
"i don't see sustainable heat coming from the reaction without fission involved as well. much of the energy is neutrons which could be converted to heat if they interact with plutonium or uranium or something."

The links I posted describe the inefficiency of capturing and converting the heat and energy. I can't understand the inclusion of fission in the arrangement. But I know it will be a great challenge to A) contain the reaction within a magnetic enclosure and B) capture and convert the heat to useful energy.
 
ITER has not produced anywhere near the energy you are claiming.
 
"i don't see sustainable heat coming from the reaction without fission involved as well. much of the energy is neutrons which could be converted to heat if they interact with plutonium or uranium or something."

The links I posted describe the inefficiency of capturing and converting the heat and energy. I can't understand the inclusion of fission in the arrangement. But I know it will be a great challenge to A) contain the reaction within a magnetic enclosure and B) capture and convert the heat to useful energy.

There have been some models run on computers that produce results like you described. The problem is that they are all computer models, and transferring that into the real world is beyond anything we know how to do yet. Producing the initial temperatures is not that much of a challenge, but containing and sustaining them is.

Even if they do build one, I do not think 100% capture of the energy is possible, even in theory. Even if we achieve 99.99% efficiency we will still have a lot of waste heat to deal with, which is not going to help the overall climate problem we already have, even if we stick all the plants in the middle of the Sahara.
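For what it's worth, the waste-heat point can be made concrete: any plant that extracts fusion energy as heat is bound by the Carnot limit, eta = 1 - T_cold/T_hot, so figures like 99.99% are out of reach for the heat-to-electricity step anyway. A back-of-envelope sketch with assumed temperatures:

# Carnot ceiling on converting heat to electricity: eta = 1 - Tc/Th.
def carnot_efficiency(t_hot_k, t_cold_k):
    return 1.0 - t_cold_k / t_hot_k
# assumed steam loop at ~600 C (873 K) rejecting heat at ~30 C (303 K):
eta = carnot_efficiency(873.0, 303.0)   # ~0.65 theoretical maximum
p_thermal_mw = 1000.0                   # hypothetical 1 GW(thermal) plant
print(eta, p_thermal_mw * (1.0 - eta))  # ~350 MW of unavoidable waste heat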
 
"i don't see sustainable heat coming from the reaction without fission involved as well. much of the energy is neutrons which could be converted to heat if they interact with plutonium or uranium or something."

The links I posted describe the inefficiency of capturing and converting the heat and energy. I can't understand the inclusion of fission in the arrangement. But I know it will be a great challenge to A) contain the reaction within a magnetic enclosure and B) capture and convert the heat to useful energy.

fusion, particularly the more effective cycle which uses tritium for fuel, will create a storm of neutrons, which are not extremely excited (hot). additionally, some energy is released which could contribute to the energy in the plasma; the neutrons won't. they will induce radioactivity in any material around the plasma, an issue for the sustainability of the process. what neutrons are also indicated to do is essentially accelerate the decay of radioactive substances by orders of magnitude. so introducing minute amounts of fissile matter to the plasma could negate the tritium and neutron-activated waste, along with recovering the lost potential of the cold neutrons as nuclear heat.

following fission bombs like hiroshima, scientists incorporated fusion into the mix. the fission was hot enough to power a fusion cycle, and the fusion created enough neutrons to multiply the effect of a second fission event. these bombs were smaller, but would yield a vastly greater release of energy than a fission bomb alone.

i'd add c) contain the neutron-activated waste and d) derive sufficient energy from the process to make it competitive with other power sources.
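rough numbers for that fissile-blanket idea: a D-T neutron carries ~14.1 MeV, while a fission it triggers in uranium releases roughly 200 MeV, so even partial neutron capture multiplies the energy per neutron several times over. a sketch, with the capture probability as a made-up free parameter:

# energy multiplication of a fission blanket around a D-T plasma.
E_NEUTRON = 14.1   # MeV carried by each D-T neutron
E_FISSION = 200.0  # MeV per induced fission (approximate)
def blanket_multiplier(capture_fraction):
    # capture_fraction = assumed probability a neutron induces fission
    return (E_NEUTRON + capture_fraction * E_FISSION) / E_NEUTRON
for f in (0.1, 0.5, 0.9):
    print(f, blanket_multiplier(f))  # x2.4, x8.1, x13.8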
 
is it so wacky to point out that the earth has a hot, molten center? what is the prognosis on digging a deep hole, 10+ miles into the crust and introducing water to it? cap it with a turbine.

The GHK Co. 1–27 Bertha Rogers hole or well was an oil-exploratory hole drilled in Washita County, Oklahoma, and was formerly the world's deepest hole until surpassed by the Kola Superdeep Borehole, dug by the former USSR.

It took GHK two years to reach 31,441 feet (9,583 m), a depth of almost six miles. During drilling, the well encountered enormous pressure – almost 25,000 psi (172,369 kPa). No commercial hydrocarbons were found before drilling hit a molten sulfur deposit (which melted the drill bit), and the well was plugged and abandoned.

Bertha Rogers - Wikipedia, the free encyclopedia

... six miles

However, due to higher-than-expected temperatures at this depth and location, 180 °C (356 °F) instead of the expected 100 °C (212 °F), drilling deeper was deemed unfeasible and the drilling was stopped in 1992.[3] With the expected further increase in temperature with increasing depth, drilling to 15,000 m (49,000 ft) would have meant working at a projected 300 °C (570 °F), at which the drill bit would no longer work.

http://en.wikipedia.org/wiki/Kola_Superdeep_Borehole

... 10 miles
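the two Kola figures above pin down how steep the thermal gradient gets at depth, which is what killed the drilling; a quick calculation:

# implied thermal gradient from the Kola figures quoted above:
# ~180 C measured at ~12 km, ~300 C projected at 15 km.
t1, d1 = 180.0, 12.0   # measured temperature (C) and depth (km)
t2, d2 = 300.0, 15.0   # projected temperature (C) and depth (km)
deep_gradient = (t2 - t1) / (d2 - d1)
print(deep_gradient)   # ~40 C per km over that last stretch,
# versus the ~25-30 C/km usually cited as the continental average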
 