Time for Global Zero’s Nukeout

20 years after the Cold War, NATO still has 200 US nukes in Europe. It’s time we got the world back on track to zero nuclear weapons.

As a founding member of the board of Global Zero, an international movement aiming for a world without nuclear weapons, I have done a lot of research into the nuclear weapons issue. The evidence is overwhelming that nukes are dangerous, useless and expensive. What’s more, by removing nukes from Europe we could clear the road for the US and Russia to agree on unprecedented cuts to their arsenals, which together represent more than 90% of the nuclear weapons on the planet.

President Obama, who endorses Global Zero, has declared abolition of these weapons to be a long-term policy goal for the US. Some progress has been made so far, but, as a former commander of the United States’ nuclear forces said in the New York Times this week, the US must push ahead with further cuts. Fixed-silo US ICBMs, in particular, are increasingly seen as a security liability: stationary targets that would be sitting ducks in a first strike.

On May 20th, world leaders will meet at the NATO Summit in Chicago, where the future of hundreds of US nuclear weapons deployed in Europe is on the agenda.

It’s time to get these relics of a bygone age out of Europe, and we can all help make it happen. Sign Global Zero’s NukesOut petition to urge the leaders of NATO countries to get nukes out of Europe.
By Richard Branson, founder of Virgin Group


Google’s Knowledge Graph


Google has just launched Knowledge Graph, the latest refinement to its search engine, which seeks to provide users with more relevant and in-depth responses to search queries. The company actually started testing this new interface last week, but now it’s ready to take the wraps off its new method for connecting search queries to information-rich topics on people, places, or things. Along with the standard search results you’re used to seeing, Google’s search results page now displays instant results related to your queries: a search for Taj Mahal immediately brings up a list of facts, photos, and a map of the famous landmark, as well as quick links to other popular uses of the search term (like the musician or the casino in New Jersey).

There are a multitude of sources behind this data: Google cites Freebase, Wikipedia, and the CIA World Factbook, but also notes that “it’s augmented at a much larger scale” and tuned based on what the average user searches for. Google’s goal is to get you to the information you’re looking for in fewer clicks, while also increasing the relevancy of what you see when searching, and there are three main innovations the company is highlighting.

Knowledge Graph results seek to remove the ambiguity from typical search results by presenting different segments of results with one click: if you’re looking for Taj Mahal the musician (and not the landmark), you can click over instantly to tell Google which segment of search results you want. There’s also new summary info, which might keep a lot of people from having to click through to Wikipedia. Google gave the example of Marie Curie: a search for the scientist brings up a photo, birth and death dates, and a list of her major discoveries and education. It doesn’t have the in-depth information Wikipedia has, but it may save a few clicks when you’re just searching for a quick bit of information.
Lastly, Knowledge Graph seeks to present information that users commonly look for after making a search, letting users easily drill down into a subject to find deeper information. For example, Google notes that the information Knowledge Graph shows when searching for Tom Cruise answers 37 percent of the next search queries made about him. Google also shows related searches that other people made when searching for your term, much like Amazon’s “other people who shopped for X also liked…” feature. Google is calling this the first “baby step” toward a “Star Trek computer” future, but it sounds like one of the bigger innovations the company has rolled out to its core product in some time (though it reminds us a lot of what Wolfram Alpha is trying to do, as well). For those who search on the go, Google is also gradually rolling this out to Android 2.2 and higher and iOS 4, both in the browser and in dedicated search apps. Unfortunately, we’re not yet seeing Knowledge Graph live; Google says the new feature is gradually rolling out to US English users, with no word on when exactly it will make its way to other countries. This article originally appeared on theverge.com as “Google launches Knowledge Graph, a new intelligent search platform.”

MetaData RDF/XML/XMP? I need a proper standard for my 3D search engine

Rather than integrate RDF into the architecture of a tool from the ground up, as occurred with the previous applications discussed in this chapter, other companies are incorporating RDF and RDF/XML into their existing applications. Adobe, a major player in the publications and graphics business, is one such company. Its RDF/XML strategy is known as XMP (eXtensible Metadata Platform). According to the Adobe XMP web site, other major players have agreed to support the XMP framework, including companies such as Microsoft.

XMP focuses on providing a metadata label that can be embedded directly into applications, files, and databases, including binary data, using what Adobe calls XMP packets: XML fragments that can be embedded regardless of recipient format. Regardless of where the material is moved or located, the data contained in the embedded material moves with it and can be accessed by external tools using the XMP Toolkit. Adobe has added support for XMP to Photoshop 7.0, Acrobat 5.0, FrameMaker 7.0, GoLive 6.0, InCopy 2.0, InDesign 2.0, Illustrator 10, and LiveMotion 2.0.

The information included within the embedded labels can be from any schema as long as it’s recorded in valid RDF/XML. The XMP source code is freely available for download, use, and modification under an open source license.

Unlike much of the RDF/XML technology, which emphasizes Java or Python, the XMP Toolkit provides support only for C++. Specifically, the toolkit works with Microsoft’s Visual C++ in Windows (or a compatible compiler) and Metrowerks CodeWarrior C++ for the Mac.

Within the SDK is a subdirectory of C++ code that allows a person to read and write XMP metadata. Included in the SDK is a good set of documentation that provides samples and instructions on embedding XMP metadata into TIFF, HTML, JPEG, PNG, PDF, SVG/XML, Illustrator (.ai), Photoshop (.psd), and PostScript and EPS formats.

The SDK is a bit out of date in regard to recent activities with RDF and RDF/XML. For instance, when discussing embedding RDF/XML into HTML documents, it references a W3C note that was favorable to the idea. However, as you read in Chapter 3, recent decisions discourage the embedding of metadata into (X)HTML documents, though it isn’t expressly forbidden.

The SDK contains some documentation, but be forewarned, it assumes significant experience with the different data types, as well as experience working with C++. The document of most interest is the Metadata Framework PDF file, specifically the section discussing how XMP works with RDF, as well as the section on extending XMP with external RDF/XML Schemas. This involves nothing more than defining data in valid RDF and using a namespace for data not from the core schemas used by XMP. The section titled “XMP Schemas” lists all elements of XMP’s built-in schemas.

The SDK also includes C++ and the necessary support files for the Metadata Library, as well as some other utilities and samples. I dusted off my rarely used Visual C++ 6.0 to open the Windows project for the Metadata Toolkit and was able to build the library without any problems just by loading the project file, XAPToolkit.dsw. The other C++ applications also compiled cleanly as long as I remembered to add the paths for the included header files and libraries.

One of the samples included with the SDK was XAPDumper, an application that scans for embedded RDF/XML within an application or file and then prints it out. I compiled it and ran it against the SDKOverview.pdf document. An excerpt of the embedded data found in this file is:

<rdf:Description rdf:about=''
 xmlns:pdf='http://ns.adobe.com/pdf/1.3/'>
 <pdf:Producer>Acrobat Distiller 5.0.5 for Macintosh</pdf:Producer>
</rdf:Description>
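What a dumper like XAPDumper does can be approximated in a few lines: XMP packets are bracketed by "<?xpacket begin=...?>" and "<?xpacket end=...?>" processing instructions, so embedded RDF/XML can be located with a raw byte scan. The following is a hypothetical sketch, not Adobe's toolkit API; the sample bytes and packet id are illustrative.

```python
def extract_xmp_packets(data: bytes):
    """Return every XMP packet found in a byte stream."""
    packets = []
    start_marker, end_marker = b"<?xpacket begin=", b"<?xpacket end="
    pos = 0
    while True:
        start = data.find(start_marker, pos)
        if start == -1:
            break
        end = data.find(end_marker, start)
        if end == -1:
            break
        end = data.find(b"?>", end) + 2          # include the closing PI
        packets.append(data[start:end].decode("utf-8", errors="replace"))
        pos = end
    return packets

# Illustrative input: a packet surrounded by unrelated binary content.
sample = (b'junk<?xpacket begin="\xef\xbb\xbf" id="W5M0"?>'
          b"<rdf:RDF>...</rdf:RDF>"
          b'<?xpacket end="w"?>more junk')
print(len(extract_xmp_packets(sample)))  # finds the single embedded packet
```

In practice the same scan works on PDF, JPEG, or TIFF files, since the packet markers survive inside binary formats by design.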
Embedding RDF/XML isn’t much different from attaching a bar code to physical objects. Both uniquely identify important information about the object in case it becomes separated from its initial package. In addition, within a publications environment, if all of the files are marked with this embedded RDF/XML, automated processes could access the information and use it to determine how to connect the different files together, such as embedding a JPEG file into an HTML page, and so on.

I can see the advantage of embedded RDF/XML for any source that’s loaded to the Web. Eventually, web bots could access and use this information to provide more intelligent information about the resources that they touch. Instead of a few keywords and a title as well as document type, these bots could provide an entire history of a document or picture, as well as every particular about it.

Other applications can also build in support for working with XMP. For instance, RDF Gateway, mentioned earlier, has the capability of reading in Adobe XMP. An example of how this application would access data from an Adobe PDF would be:

var monsters = new DataSource("inet?url=http://burningbird.net/articles/monsters3.pdf&parsetype=xmp");

An important consideration with these embedding techniques is that they have no adverse impact on the file: nothing affects the visibility of a JPEG or PNG graphic or prevents an HTML file from loading in a browser. In fact, if you’ve read any PDF files from Adobe or other sites that use the newer Adobe products, you’ve probably been working with XMP documents containing embedded RDF/XML and didn’t even know it.

Nuclear Fusion and Helium 3

What are the barriers to making fusion reactors work?

For one thing, materials will be needed that can withstand the assaults from products of the fusion reaction. Deuterium-fusion reactions produce helium, which can provide some of the energy to keep the plasma heated. But the main source of energy to be extracted from the reaction comes from neutrons, which are also produced in the fusion reaction. The fast-flying neutrons will pass through the reactor chamber wall into a blanket of material surrounding the reactor, depositing their energy as heat that can then be used to produce power. (In advanced reactor designs, the neutrons would also be used to initiate reactions converting lithium to tritium.)
Not only will the neutrons deposit energy in the blanket material, but their impact will convert atoms in the wall and blanket into radioactive forms. Materials will be needed that can extract heat effectively while surviving the neutron-induced structural weakening for extended periods of time.
Methods also will be needed for confining the radioactivity induced by neutrons as well as preventing releases of the radioactive tritium fuel. In addition, interaction of the plasma with reactor materials will produce radioactive dust that needs to be removed.
Building full-scale fusion-generating facilities will require engineering advances to meet all of these challenges, including better superconducting magnets and advanced vacuum systems. The European Union and Japan are designing the International Fusion Materials Irradiation Facility, where possible materials for fusion plant purposes will be developed and tested. Robotic methods for maintenance and repair will also have to be developed.
While these engineering challenges are considerable, fusion provides many advantages beyond the prospect of its almost limitless supply of fuel.

Helium 3

A second possibility, fusing ³He with itself (³He + ³He), requires even higher temperatures (since now both reactants have a +2 charge), and thus is even more difficult than the D–³He reaction. However, it does offer a possible reaction that produces no neutrons; the protons it produces possess charge and can be contained using electric and magnetic fields, which in turn allows direct electricity generation. ³He + ³He fusion has been demonstrated in the laboratory and is thus theoretically feasible and would have immense advantages, but commercial viability is many years in the future.[13]
The amounts of helium-3 needed as a replacement for conventional fuels are substantial by comparison to the amounts currently available. The total amount of energy produced in the ²H + ³He reaction is 18.4 MeV, which corresponds to some 493 megawatt-hours (4.93×10⁸ W·h) per three grams (one mole) of ³He. Even if that total amount of energy could be converted to electrical power with 100% efficiency (a physical impossibility), it would correspond to about 30 minutes of output of a gigawatt electrical plant per mole of fuel; a year’s production by the same plant would require some 52.5 kilograms of helium-3.
The amount of fuel needed for large-scale applications can also be put in terms of total consumption: According to the US Energy Information Administration, “Electricity consumption by 107 million U.S. households in 2001 totaled 1,140 billion kW·h” (1.14×1015 W·h). Again assuming 100% conversion efficiency, 6.7 tonnes of helium-3 would be required for that segment of the energy demand of the United States, 15 to 20 tonnes given a more realistic end-to-end conversion efficiency.[citation needed]
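The figures above can be checked with a short back-of-the-envelope script. This is an illustration only: it assumes the quoted 18.4 MeV yield per reaction and 100% conversion efficiency, and lands close to the quoted numbers, with small differences from rounding.

```python
# Back-of-the-envelope check of the helium-3 energy figures quoted above.
MEV_TO_J = 1.602e-13      # joules per MeV
AVOGADRO = 6.022e23       # reactions per mole
MOLAR_MASS_HE3 = 3.016    # grams per mole of helium-3

def energy_per_mole_mwh(mev_per_reaction=18.4):
    """Energy released per mole of He-3 consumed, in megawatt-hours."""
    joules = mev_per_reaction * MEV_TO_J * AVOGADRO
    return joules / 3.6e9  # 1 MWh = 3.6e9 J

def he3_kg_for(demand_mwh):
    """Kilograms of He-3 needed to supply demand_mwh at 100% conversion."""
    moles = demand_mwh / energy_per_mole_mwh()
    return moles * MOLAR_MASS_HE3 / 1000.0

print(round(energy_per_mole_mwh()))           # ~493 MWh per mole
print(round(he3_kg_for(1e3 * 24 * 365), 1))   # ~53 kg per gigawatt-year
print(round(he3_kg_for(1.14e9) / 1000, 1))    # ~7 tonnes for the 2001 US-household figure
```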

In zero magnetic field, there are two distinct superfluid phases of 3He, the A-phase and the B-phase. The B-phase is the low-temperature, low-pressure phase which has an isotropic energy gap. The A-phase is the higher temperature, higher pressure phase that is further stabilized by a magnetic field and has two point nodes in its gap. The presence of two phases is a clear indication that 3He is an unconventional superfluid (superconductor), since the presence of two phases requires an additional symmetry, other than gauge symmetry, to be broken. In fact, it is a p-wave superfluid, with spin one, S=1, and angular momentum one, L=1. The ground state corresponds to total angular momentum zero, J=S+L=0 (vector addition). Excited states are possible with non-zero total angular momentum, J>0, which are excited pair collective modes. Because of the extreme purity of superfluid 3He (since all materials except 4He have solidified and sunk to the bottom of the liquid 3He and any 4He has phase separated entirely, this is the most pure condensed matter state), these collective modes have been studied with much greater precision than in any other unconventional pairing system.

Production of helium-3 from tritium at a rate sufficient to meet world demand will require significant investment, as tritium must be produced at the same rate as helium-3, and approximately eighteen times as much tritium must be maintained in storage as the amount of helium-3 produced annually by decay (the production rate from an amount N of tritium is dN/dt = N ln 2/t½, where t½/ln 2 is about 18 years; see radioactive decay). If commercial fusion reactors were to use helium-3 as a fuel, they would require tens of tonnes of helium-3 each year to produce a fraction of the world’s power, requiring substantial expansion of facilities for tritium production and storage.[30]
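The storage arithmetic in the parenthetical above can be sketched in a few lines. This is a minimal illustration; the half-life value is the standard figure for tritium, not taken from the source.

```python
# Helium-3 is produced by tritium beta decay at rate dN/dt = N * ln(2) / t_half,
# so sustaining a given He-3 output requires holding t_half/ln(2), roughly 18
# years' worth of production, in storage at all times.
import math

T_HALF_TRITIUM = 12.32  # years (standard value)

def tritium_stock_for(he3_kg_per_year):
    """Kilograms of tritium to hold so decay yields he3_kg_per_year of He-3
    (tritium and helium-3 have nearly equal molar mass)."""
    mean_lifetime = T_HALF_TRITIUM / math.log(2)   # ~17.8 years
    return he3_kg_per_year * mean_lifetime

# To harvest 1 kg of He-3 per year, ~17.8 kg of tritium must be maintained:
print(round(tritium_stock_for(1.0), 1))
```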

There have been many claims about the capabilities of helium-3 power plants. According to proponents, fusion power plants operating on deuterium and helium-3 would offer lower capital and operating costs than their competitors due to less technical complexity, higher conversion efficiency, smaller size, the absence of radioactive fuel, no air or water pollution, and only low-level radioactive waste disposal requirements. Recent estimates suggest that about $6 billion in investment capital will be required to develop and construct the first helium-3 fusion power plant. Financial breakeven at today’s wholesale electricity prices (5 US cents per kilowatt-hour) would occur after five 1-gigawatt plants were on line, replacing old conventional plants or meeting new demand.[53]
 
The reality is not so clear-cut. The most advanced fusion programs in the world are inertial confinement fusion (such as the National Ignition Facility) and magnetic confinement fusion (such as ITER and other tokamaks). In the case of the former, there is no solid roadmap to power generation. In the case of the latter, commercial power generation is not expected until around 2050.[54] In both cases, the type of fusion discussed is the simplest: D-T fusion. The reason for this is the very low Coulomb barrier for this reaction; for D–³He, the barrier is much higher, and it is higher still for ³He–³He. The immense cost of reactors like ITER and the National Ignition Facility is largely due to their sheer size, yet scaling up to higher plasma temperatures would require reactors far larger still. The 14.7 MeV proton and 3.6 MeV alpha particle from D–³He fusion, plus the higher conversion efficiency, mean that more electricity is obtained per kilogram than with D-T fusion (17.6 MeV), but not that much more. A further downside is that the reaction rates for helium-3 fusion are not particularly high, requiring a reactor that is larger still, or more reactors, to produce the same amount of electricity.
To work around this problem of massively large power plants, which may not even be economical with D-T fusion, let alone the far more challenging D–³He fusion, a number of other reactors have been proposed: the Fusor, Polywell, Focus fusion, and many more. Many of these concepts have fundamental problems with achieving a net energy gain, and generally attempt to achieve fusion in thermal disequilibrium, something that could potentially prove impossible;[55] consequently, these long-shot programs tend to have trouble garnering funding despite their low budgets. Unlike the “big”, “hot” fusion systems, however, if such systems were to work, they could scale to the higher-barrier “aneutronic” fuels. At the same time, these systems would scale well enough that their proponents tend to promote p–B fusion, which requires no exotic fuels like helium-3.

Aneutronic fusion reactions produce the overwhelming bulk of their energy in the form of charged particles instead of neutrons. This means that energy could be converted directly into electricity by various techniques. Many proposed direct conversion techniques are based on mature technology derived from other fields, such as microwave technology, and some involve equipment that is more compact and potentially cheaper than that involved in conventional thermal production of electricity.
In contrast, fusion fuels like deuterium-tritium (DT), which produce most of their energy in the form of neutrons, require a standard thermal cycle, in which the neutrons are used to boil water, and the resulting steam drives a large turbine and generator. This equipment is sufficiently expensive that about 80% of the capital cost of a typical fossil-fuel electric power generating station is in the thermal conversion equipment.[citation needed]
Thus, fusion with DT fuels could not significantly reduce the capital costs of electric power generation even if the fusion reactor that produces the neutrons were cost-free. (Fuel costs would, however, be greatly reduced.) But according to proponents, aneutronic fusion with direct electric conversion could, in theory, produce electricity with reduced capital costs.
Direct conversion techniques can either be inductive, based on changes in magnetic fields, or electrostatic, based on making charged particles work against an electric field.[32] If the fusion reactor worked in a pulsed mode, inductive techniques could be used.
A sizable fraction of the energy released by aneutronic fusion would not remain in the charged fusion products but would instead be radiated as X-rays.[33] Some of this energy could also be converted directly to electricity. Because of the photoelectric effect, X-rays passing through an array of conducting foils would transfer some of their energy to electrons, which can then be captured electrostatically. Since X-rays can go through far greater thickness of material than electrons can, many hundreds or even thousands of layers would be needed to absorb most of the X-rays.
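The "hundreds or even thousands of layers" claim follows from simple geometric attenuation: if each foil absorbs only a small fraction of the incident X-ray energy, the number of layers needed to capture most of it grows roughly as the reciprocal of that fraction. The script below is illustrative only; the per-layer absorption fraction is a made-up parameter, not a figure from the source.

```python
# If each thin conducting foil absorbs a fraction p of the incident X-ray
# energy, a stack of N foils absorbs 1 - (1 - p)**N, so the layers needed to
# capture a target fraction F are N = ln(1 - F) / ln(1 - p).
import math

def layers_needed(per_layer_fraction, target_fraction=0.95):
    """Foil layers required so the stack absorbs target_fraction of X-rays."""
    return math.ceil(math.log(1 - target_fraction) /
                     math.log(1 - per_layer_fraction))

# With a hypothetical 0.5% absorption per foil, capturing 95% takes ~600 layers:
print(layers_needed(0.005))
```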

Current research

  • The Z-machine at Sandia National Laboratory, a z-pinch device, can produce ion energies of interest to hydrogen–boron reactions, up to 300 keV.[26] Non-equilibrium plasmas usually have an electron temperature higher than their ion temperature, but the plasma in the Z-machine has a special, reversed non-equilibrium state, in which the ion temperature is 100 times higher than the electron temperature. These data represent a new research field and indicate that bremsstrahlung losses could in fact be lower than previously expected in such a design.

None of these efforts has yet tested its device with hydrogen–boron fuel, so the anticipated performance is based on extrapolating from theory, experimental results with other fuels and from simulations.

  • A picosecond laser produced hydrogen–boron aneutronic fusion for a Russian team in 2005.[27] However, the number of resulting α particles (around 10³ per laser pulse) was extremely low.

———————————————————————————
Although helium-3 is the closest fuel we have to theoretically clean nuclear fusion energy, there is still no complete solution; even the ITER project faces technical challenges with no total solution in sight. You need to test all the fundamental paths to find a feasible solution, modelling and applying the physics of every reaction, before putting investment into building the reactor. You cannot put the cart before the horse, can you?
– Contributed by Oogle.