Friday, November 27, 2009

Copenhagen


In 2012 the Kyoto Protocol to prevent climate change and global warming expires. To keep the process on track, there is an urgent need for a new climate protocol. At the 2009 conference in Copenhagen, the parties to the UNFCCC meet at the governmental level for the last time before the climate agreement needs to be renewed.

The Climate Conference in Copenhagen is therefore essential for the world's climate, and the Danish government and the UNFCCC are working hard to make the meeting a success, one that ends with a Copenhagen Protocol to prevent global warming and climate change.

Governmental representatives from 170 countries are expected in Copenhagen during the conference, accompanied by NGOs, journalists, and others. In total, 8,000 people are expected in Copenhagen during the climate meeting.


The conference in Copenhagen is the 15th Conference of the Parties (COP15) under the United Nations Framework Convention on Climate Change. An earlier session, COP13, was held in December 2007 in Bali.

Tuesday, November 24, 2009

Data Confirms Water on the Moon


The argument that the moon is a dry, desolate place no longer holds water. Secrets the moon has been holding for perhaps billions of years are now being revealed, to the delight of scientists and space enthusiasts alike.

NASA opened a new chapter in our understanding of the moon. Preliminary data from the Lunar CRater Observation and Sensing Satellite, or LCROSS, indicates that the mission successfully uncovered water during the Oct. 9, 2009 impacts into the permanently shadowed region of Cabeus crater near the moon’s south pole.

The impact of the LCROSS Centaur upper-stage rocket created a two-part plume of material from the bottom of the crater. The first part was a high-angle plume of vapor and fine dust; the second, a lower-angle ejecta curtain of heavier material. This material has not seen sunlight in billions of years.

Permanently shadowed regions could hold a key to the history and evolution of the solar system, much as an ice core sample taken on Earth reveals ancient data. In addition, water, and other compounds represent potential resources that could sustain future lunar exploration.

Since the impacts, the LCROSS science team has been working almost nonstop analyzing the huge amount of data the spacecraft collected. The team concentrated on data from the satellite's spectrometers, which provide the most definitive information about the presence of water.

A spectrometer examines light emitted or absorbed by materials, which helps identify their composition.

"Multiple lines of evidence show water was present in both the high angle vapor plume and the ejecta curtain created by the LCROSS Centaur impact. The concentration and distribution of water and other substances requires further analysis, but it is safe to say Cabeus holds water." The team took the known near infrared spectral signatures of water and other materials and compared them to the spectra collected by the LCROSS near infrared spectrometer of the impact.

The possibility of contamination from the Centaur was also ruled out. Additional confirmation came from an emission in the ultraviolet spectrum that was attributed to hydroxyl, one product of the break-up of water by sunlight. When atoms and molecules are excited, they release energy at specific wavelengths that are detected by the spectrometers.

A similar process is used in neon signs. When electrified, a specific gas will produce a distinct color. The ultraviolet-visible spectrometer detected hydroxyl signatures just after impact that are consistent with a water-vapor cloud in sunlight. Data from the other LCROSS instruments are being analyzed for additional clues about the state and distribution of the material at the impact site.

"The full understanding of the LCROSS data may take some time.

" LCROSS was launched June 18, 2009 as a companion mission to the Lunar Reconnaissance Orbiter, or LRO, from NASA's Kennedy Space Center in Florida. After separating from LRO, the LCROSS spacecraft held onto the spent Centaur upper stage rocket of the launch vehicle, executed a lunar swingby and entered into a series of long looping orbits around the Earth.

After traveling approximately 113 days and nearly 5.6 million miles (9 million km), the Centaur and LCROSS separated on final approach to the moon. Traveling as fast as a speeding bullet, the Centaur impacted the lunar surface shortly after 4:31 a.m. PDT on Oct. 9, with LCROSS observing with its onboard instruments.
Approximately four minutes of data were collected before LCROSS itself impacted the lunar surface. Working closely with scientists from LRO and other observatories that viewed the impact, the LCROSS team is trying to understand the full scope of the data. LRO continues to make passes over the impact site to give the LCROSS team additional insight into the mechanics of the impact and its resulting craters.
What other secrets will the moon reveal? The analysis continues!

Thursday, November 12, 2009

Intel or not Intel


When corporate giants wield their pricing power as a weapon to smash pygmies, consumers win, but only for a while. That is why regulators must be vigilant. This week the New York attorney general filed an antitrust complaint against Intel, following a case that has already led to regulatory action in Europe and Asia.

The pricing discounts that suppliers like Intel give to large-volume customers are welcome, even when those discounts reinforce the inherent advantages of a market leader in an industry that, like semiconductors, relies heavily on economies of scale. But if price discounts are used as a predatory weapon, prompt action is essential.

It is for the courts to draw that line, based on the facts of the case. But there are some obvious pointers. If volume discounts are made dependent on conditions that extend beyond the pure volume that a customer buys, that should raise a red flag. In Intel's case, discounts were tied to the condition that PC makers did not buy chips from its smaller rival, AMD.

The court should examine the alleged all-or-nothing nature of Intel’s volume discounts; economic logic suggests that smaller discounts should be available on smaller volumes. Intel cut off all discounts to those who turned to AMD. Intel has said it will defend itself against the allegations.

Other factors will help to decide the case, such as whether Intel offered the same discounts to all customers or used them selectively. If a court determines that Intel has crossed the line, heavy fines are the best deterrent. Persistent wrongdoing should result in closer regulatory supervision, for instance by requiring more transparency through the publication of volume discounts. With Intel, matters have not yet reached that point.

Some have argued for tighter restrictions, such as absolute limits to the discounts that a dominant company like Intel can apply. Price regulation, however, is a last resort justified only in cases of total market failure.

If anything, the AMD example argues the opposite. It lacked its rival’s massive financial strength, but beat Intel in the race to the first generation of multi-core chips. Indeed, it was the desire of Dell and others to buy these more advanced chips that led Intel to wield its power in a way that watchdogs argue was too extreme. AMD’s experience is a reminder that tech innovators can flourish even against dominant rivals – but only if regulators remain alert.

Wednesday, November 4, 2009

Economic or Cooking Growth?


Economic growth occurs whenever people take resources and rearrange them in ways that are more valuable. A useful metaphor for production in an economy comes from the kitchen. To create valuable final products, we mix inexpensive ingredients together according to a recipe. The cooking one can do is limited by the supply of ingredients, and most cooking in the economy produces undesirable side effects. If economic growth could be achieved only by doing more and more of the same kind of cooking, we would eventually run out of raw materials and suffer from unacceptable levels of pollution and nuisance. Human history teaches us, however, that economic growth springs from better recipes, not just from more cooking. New recipes generally produce fewer unpleasant side effects and generate more economic value per unit of raw material.

Take one small example. In most coffee shops, you can now use the same size lid for small, medium, and large cups of coffee. That wasn’t true as recently as 1995. That small change in the geometry of the cups means that a coffee shop can serve customers at lower cost. Store owners need to manage the inventory for only one type of lid. Employees can replenish supplies more quickly throughout the day. Customers can get their coffee just a bit faster. Such big discoveries as the transistor, antibiotics, and the electric motor attract most of the attention, but it takes millions of little discoveries like the new design for the cup and lid to double average income in a nation.

Every generation has perceived the limits to growth that finite resources and undesirable side effects would pose if no new recipes or ideas were discovered. And every generation has underestimated the potential for finding new recipes and ideas. We consistently fail to grasp how many ideas remain to be discovered. The difficulty is the same one we have with compounding: possibilities do not merely add up; they multiply.

In a branch of physical chemistry known as exploratory synthesis, chemists try mixing selected elements together at different temperatures and pressures to see what comes out. About a decade ago, one of the hundreds of compounds discovered this way—a mixture of copper, yttrium, barium, and oxygen—was found to be a superconductor at temperatures far higher than anyone had previously thought possible. This discovery may ultimately have far-reaching implications for the storage and transmission of electrical energy.

To get some sense of how much scope there is for more such discoveries, we can calculate as follows. The periodic table contains about a hundred different types of atoms, which means that the number of ordered combinations made up of four different elements is about 100 × 99 × 98 × 97 = 94,000,000. A list of numbers like 6, 2, 1, 7 can represent the proportions for using the four elements in a recipe. To keep things simple, assume that the numbers in the list must lie between 1 and 10, that no fractions are allowed, and that the smallest number must always be 1. Then there are about 3,500 different sets of proportions for each choice of four elements, and 3,500 × 94,000,000 (or about 330 billion) different recipes in total. If laboratories around the world evaluated 1,000 recipes each day, it would take nearly a million years to go through them all.
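
As a sanity check on this arithmetic, here is a short Python sketch that reproduces the counts under exactly the simplifying assumptions above (illustrative only):

```python
from math import perm

# Ordered choices of 4 distinct elements from ~100 atom types
element_choices = perm(100, 4)  # 100 * 99 * 98 * 97
print(f"four-element choices: {element_choices:,}")  # 94,109,400

# Proportion lists: integers 1..10 for each of the 4 elements,
# with the smallest entry required to be exactly 1
proportions = sum(
    1
    for a in range(1, 11) for b in range(1, 11)
    for c in range(1, 11) for d in range(1, 11)
    if min(a, b, c, d) == 1
)
print(f"proportion lists per choice: {proportions:,}")  # 3,439 (~3,500)

recipes = element_choices * proportions
print(f"total recipes: {recipes:,}")  # ~3.2e11, roughly 330 billion

# At 1,000 recipes evaluated per day, worldwide
years = recipes / 1000 / 365
print(f"years to try them all: {years:,.0f}")  # close to a million
```

The exact count of proportion lists (3,439) sits a bit below the rounded 3,500 used in the text, which is why the precise total lands near 320 billion rather than exactly 330 billion; either way, the million-year conclusion stands.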

Monday, November 2, 2009

Where We Stand


At present, the standard model of particle physics stands triumphant. It has survived testing far beyond the range of energies for which it was crafted, and to far greater precision.

Even the “ugly” parts look good. Unlike the gauge part of the standard model, whose parameters are few in number (namely, three) and have a beautiful geometric interpretation, the part dealing with fermion masses and mixings contains many parameters (about two dozen in the minimal model) that appear merely as abstract numbers describing the magnitudes of Yukawa-type couplings of one or more hypothetical Higgs fields. In the present state of the theory, all these numbers must be taken from experiment. Nevertheless, the framework is very significantly constrained and predictive.

From the underlying hypotheses of renormalizable local quantum field theory and three-generation structure, we derive that a 3 × 3 unitary matrix, the CKM (Cabibbo-Kobayashi-Maskawa) matrix, must describe a multitude of a priori independent decay rates and mixing phenomena, including several manifestations of CP violation.

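For readers who want the standard notation: the CKM matrix collects the quark-mixing amplitudes, and its unitarity is what ties all those rates together. This is the conventional textbook form, not anything specific to this post:

```latex
V_{\mathrm{CKM}} =
\begin{pmatrix}
V_{ud} & V_{us} & V_{ub} \\
V_{cd} & V_{cs} & V_{cb} \\
V_{td} & V_{ts} & V_{tb}
\end{pmatrix},
\qquad
V_{\mathrm{CKM}}^{\dagger} V_{\mathrm{CKM}} = \mathbf{1} .
```

Unitarity leaves only four physical parameters in the three-generation case: three mixing angles and one CP-violating phase, the single source of CP violation in the quark sector of the minimal model.
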
Phenomena associated with neutrino masses, and with gravity, are commonly regarded as beyond, or at least outside, the standard model. Of course, where one draws the boundary of the standard model is largely a matter of taste. But it’s appropriate to emphasize that our working descriptions both of neutrino masses and of gravity fit smoothly and naturally into the conceptual framework associated with the “core” standard model of strong and electroweak interactions. Specifically, neutrino masses can be accommodated using dimension-5 operators, and gravity through the Einstein-Hilbert curvature term and minimal coupling to matter (we can also include a cosmological term). The deep guiding principles that underlie the standard model, to wit, local quantum field theory based on operators of the lowest available mass dimension, also work to give theories of neutrino masses and of gravity that describe all existing observations in terms of a small number of parameters.

Altogether, the standard model supplies an economical, precise, and (we now know) extraordinarily accurate description of an enormous range of phenomena. It supplies, in particular, firm and adequate foundations for chemistry (including biochemistry), materials science, and most of astrophysics. We should be very proud of what we, as a community stretching across continents and generations, have accomplished.