Thursday, December 24, 2009

200,000 Degree Star Found at Center of NGC 6302


Astronomers at The University of Manchester's Jodrell Bank Centre for Astrophysics have discovered one of the hottest stars in the Galaxy. With a surface temperature of around 200,000 degrees, it is 35 times hotter than the Sun. Despite numerous attempts by astronomers across the world, the mysterious dying star at the heart of NGC 6302, the Butterfly Nebula - one of the brightest and most beautiful of the planetary nebulae - had never been seen before. NGC 6302 lies within our Milky Way galaxy, roughly 3,800 light-years away in the constellation Scorpius. The glowing gas is the star's outer layers, expelled over about 2,200 years. The "butterfly" stretches for more than two light-years, which is about half the distance from the Sun to the nearest star, Alpha Centauri. The central star, which until now could not be seen because it is hidden within a doughnut-shaped ring of dust, appears as a dark band pinching the nebula in the center. The thick dust belt constricts the star's outflow, creating the classic "bipolar" or hourglass shape displayed by some planetary nebulae.

The star's surface temperature is estimated to be about 200,000 degrees Celsius, making it one of the hottest known stars in our galaxy. Spectroscopic observations made with ground-based telescopes show that even the gas surrounding the star is unusually hot compared to that of typical planetary nebulae, at roughly 36,000 degrees Fahrenheit. The star was so hard to find because it is hidden behind a cloud of dust and ice in the middle of the nebula.

Using the recently refurbished Hubble Space Telescope (HST), a team of astronomers has shed new light on the nebula with a set of spectacular images, taken to show off the newly improved HST after it resumed work in September this year. The Manchester astronomers were amazed to find that the images unexpectedly revealed the missing central star.

The astronomers said: "It's extremely important to understand planetary nebulae such as the Bug Nebula, as they are crucial to understanding our own existence on Earth. The elements necessary for life, especially carbon, are created inside stars and ejected into space as part of these planetary nebulae. Planets such as the Earth form from small dust particles, which also form within planetary nebulae. The cloud of dust and ice in the Bug Nebula contains the seeds of a future generation of planets." Finding the star was made possible by the Space Shuttle's final servicing mission to the HST earlier this year, during which astronauts installed the new Wide Field Camera 3 used to take these images. "How a star ejects a nebula like this is still a mystery. It seems most stars, including the Sun, will eject as much as 80 per cent of their mass when they finally run out of nuclear fuel at the end of their lives. That material then goes on to help form the next generation of stars and planets."

These observations have shown that the star at the heart of the Bug Nebula is only about 2/3 as heavy as the Sun, but was several times heavier before it threw off its outer layers to form the nebula which had previously hidden it from our view.

Thursday, December 17, 2009

Something Big is Lurking Beyond the Visible Edge of Our Universe


Using data from NASA's Wilkinson Microwave Anisotropy Probe (WMAP), scientists have identified an unexpected motion in distant galaxy clusters. The cause, they suggest, is the gravitational attraction of matter that lies beyond the observable universe.

"The clusters show a small but measurable velocity that is independent of the universe's expansion and does not change as distances increase, We never expected to find anything like this."

Hot X-ray-emitting gas in a galaxy cluster scatters photons from the cosmic microwave background. Clusters don't precisely follow the expansion of space, so the wavelengths of scattered photons change in a way that reflects each cluster's individual motion.

This results in a minute shift of the microwave background's temperature in the cluster's direction. Astronomers refer to this change as the kinematic Sunyaev-Zel'dovich (SZ) effect.
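For readers who want the quantitative form: to first order, the kinematic SZ shift is proportional to the Thomson optical depth of the cluster gas and the cluster's line-of-sight peculiar velocity. In standard notation (this is textbook material, not a formula quoted by the researchers):

\frac{\Delta T}{T_{\mathrm{CMB}}} = -\,\tau_e \, \frac{v_r}{c}

Here \tau_e is the optical depth of the hot gas to electron scattering, v_r is the radial peculiar velocity (positive for a receding cluster), and c is the speed of light. With \tau_e of order 0.005 and the roughly 900 km/s bulk motion reported below, the shift is only a few tens of microkelvin against a 2.7 K background, which is why averaging over hundreds of clusters is essential.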

A related distortion, known as the thermal SZ effect, has been observed in galaxy clusters since the 1980s. But the kinematic version is less than one-tenth as strong and had never been detected in an individual cluster.

In 2000, Alexander Kashlinsky of NASA's Goddard Space Flight Center and Fernando Atrio-Barandela of the University of Salamanca, Spain, showed that astronomers could, in essence, amplify the effect by isolating the kinematic SZ term. The trick, they found, is to study large numbers of clusters.

The astronomers teamed up to identify some 700 X-ray clusters that could be used to search for the subtle spectral shift. This sample includes objects up to 6 billion light-years away -- nearly halfway across the observable universe.

Using the cluster catalog and WMAP's three-year view of the microwave background, the astronomers detected bulk cluster motions of nearly 2 million miles per hour. The clusters are heading toward a 20-degree patch of sky between the constellations of Centaurus and Vela.

What's more, this motion is constant out to at least a billion light-years. "Because the dark flow already extends so far, it likely extends across the visible universe," the researchers conclude.

The finding flies in the face of predictions from standard cosmological models, which describe such motions as decreasing at ever greater distances.

Cosmologists view the microwave background - a flash of light emitted 380,000 years after the big bang - as the universe's ultimate reference frame. Relative to it, all large-scale motion should show no preferred direction.

Big-bang models that include a feature called inflation offer a possible explanation for the flow. Inflation is a brief hyper-expansion early in the universe's history. If inflation did occur, then the universe we can see is only a small portion of the whole cosmos.

WMAP data released in 2006 support the idea that our universe experienced inflation. Kashlinsky and his team suggest that their clusters are responding to the gravitational attraction of matter that was pushed far beyond the observable universe by inflation.

The next step is to narrow down uncertainties in the measurements.

Friday, November 27, 2009

Copenhagen


In 2012 the Kyoto Protocol, which aims to prevent climate change and global warming, runs out. To keep the process on track, there is an urgent need for a new climate protocol. At the conference in Copenhagen in 2009, the parties of the UNFCCC meet for the last time at government level before the climate agreement needs to be renewed.

The Climate Conference in Copenhagen is therefore essential for the world's climate, and the Danish government and the UNFCCC are working hard to make the meeting a success, ending with a Copenhagen Protocol to prevent global warming and climate change.

Governmental representatives from 170 countries are expected in Copenhagen during the conference, accompanied by other governmental representatives, NGOs, journalists and others. In total, 8,000 people are expected in Copenhagen during the climate meeting.


The conference in Copenhagen is the 15th Conference of the Parties (COP15) under the Framework Convention on Climate Change. The most recent United Nations Climate Change Conference was held in December 2007 in Bali.

Tuesday, November 24, 2009

Data Confirms Water on the Moon


The argument that the moon is a dry, desolate place no longer holds water. Secrets the moon has been holding, for perhaps billions of years, are now being revealed to the delight of scientists and space enthusiasts alike.

NASA opened a new chapter in our understanding of the moon. Preliminary data from the Lunar CRater Observation and Sensing Satellite, or LCROSS, indicates that the mission successfully uncovered water during the Oct. 9, 2009 impacts into the permanently shadowed region of Cabeus crater near the moon’s south pole.

The impact of the LCROSS Centaur upper stage rocket created a two-part plume of material from the bottom of the crater. The first part was a high-angle plume of vapor and fine dust; the second, a lower-angle ejecta curtain of heavier material. This material had not seen sunlight in billions of years.

Permanently shadowed regions could hold a key to the history and evolution of the solar system, much as an ice core sample taken on Earth reveals ancient data. In addition, water, and other compounds represent potential resources that could sustain future lunar exploration.

Since the impacts, the LCROSS science team has been working almost nonstop analyzing the huge amount of data the spacecraft collected. The team concentrated on data from the satellite's spectrometers, which provide the most definitive information about the presence of water.

A spectrometer examines light emitted or absorbed by materials, which helps identify their composition.

"Multiple lines of evidence show water was present in both the high angle vapor plume and the ejecta curtain created by the LCROSS Centaur impact. The concentration and distribution of water and other substances requires further analysis, but it is safe to say Cabeus holds water." The team took the known near infrared spectral signatures of water and other materials and compared them to the spectra collected by the LCROSS near infrared spectrometer of the impact.

The possibility of contamination from the Centaur was also ruled out. Additional confirmation came from an emission in the ultraviolet spectrum that was attributed to hydroxyl, one product of the break-up of water by sunlight. When atoms and molecules are excited, they release energy at specific wavelengths that are detected by the spectrometers.

A similar process is used in neon signs. When electrified, a specific gas will produce a distinct color. The ultraviolet visible spectrometer detected hydroxyl signatures just after impact that are consistent with a water vapor cloud in sunlight. Data from the other LCROSS instruments are being analyzed for additional clues about the state and distribution of the material at the impact site.

"The full understanding of the LCROSS data may take some time.

" LCROSS was launched June 18, 2009 as a companion mission to the Lunar Reconnaissance Orbiter, or LRO, from NASA's Kennedy Space Center in Florida. After separating from LRO, the LCROSS spacecraft held onto the spent Centaur upper stage rocket of the launch vehicle, executed a lunar swingby and entered into a series of long looping orbits around the Earth.

After traveling approximately 113 days and nearly 5.6 million miles (9 million km), the Centaur and LCROSS separated on final approach to the moon. Traveling as fast as a speeding bullet, the Centaur impacted the lunar surface shortly after 4:31 a.m. PDT Oct. 9 with LCROSS watching with its onboard instruments.
Approximately four minutes of data were collected before LCROSS itself impacted the lunar surface. Working closely with scientists from LRO and other observatories that viewed the impact, the LCROSS team is working to understand the full scope of the data. LRO continues to make passes over the impact site to give the LCROSS team additional insight into the mechanics of the impact and its resulting craters.
What other secrets will the moon reveal? The analysis continues!

Thursday, November 12, 2009

Intel or not Intel


When corporate giants wield their pricing power as a weapon to smash pygmies, consumers win - but only for a while. That is why regulators must be vigilant. This week the New York attorney general filed an antitrust complaint against Intel, following a case that has already led to regulatory action in Europe and Asia.

The pricing discounts that suppliers like Intel give to large volume customers are welcome, even when those discounts reinforce the inherent advantages of a market leader in an industry which, like semiconductors, relies heavily on scale economies. But if price discounts are used as a predatory weapon, prompt action is essential.

It is for the courts to draw that line, based on the facts of the case. But there are some obvious pointers. If volume discounts are made dependent on conditions that extend beyond the pure volume that a customer buys, that should raise a red flag. In Intel’s case, discounts were tied to the condition that PC makers did not buy chips from its smaller rival AMD.

The court should examine the alleged all-or-nothing nature of Intel’s volume discounts; economic logic suggests that smaller discounts should be available on smaller volumes. Intel cut off all discounts to those who turned to AMD. Intel has said it will defend itself against the allegations.

Other factors will help to decide the case, such as whether Intel offered the same discounts to all customers or used them selectively. If a court determines that Intel crossed the line, heavy fines are the best deterrent. Persistent wrongdoing should result in closer regulatory supervision - for instance, requiring more transparency through the publication of volume discounts. With Intel, matters have not reached that point yet.

Some have argued for tighter restrictions, such as absolute limits to the discounts that a dominant company like Intel can apply. Price regulation, however, is a last resort justified only in cases of total market failure.

If anything, the AMD example argues the opposite. It lacked its rival’s massive financial strength, but beat Intel in the race to the first generation of multi-core chips. Indeed, it was the desire of Dell and others to buy these more advanced chips that led Intel to wield its power in a way that watchdogs argue was too extreme. AMD’s experience is a reminder that tech innovators can flourish even against dominant rivals – but only if regulators remain alert.

Wednesday, November 4, 2009

Economic or Cooking Growth?


Economic growth occurs whenever people take resources and rearrange them in ways that are more valuable. A useful metaphor for production in an economy comes from the kitchen. To create valuable final products, we mix inexpensive ingredients together according to a recipe. The cooking one can do is limited by the supply of ingredients, and most cooking in the economy produces undesirable side effects. If economic growth could be achieved only by doing more and more of the same kind of cooking, we would eventually run out of raw materials and suffer from unacceptable levels of pollution and nuisance. Human history teaches us, however, that economic growth springs from better recipes, not just from more cooking. New recipes generally produce fewer unpleasant side effects and generate more economic value per unit of raw material.

Take one small example. In most coffee shops, you can now use the same size lid for small, medium, and large cups of coffee. That wasn't true as recently as 1995. That small change in the geometry of the cups means that a coffee shop can serve customers at lower cost. Store owners need to manage the inventory for only one type of lid. Employees can replenish supplies more quickly throughout the day. Customers can get their coffee just a bit faster. Big discoveries such as the transistor, antibiotics, and the electric motor attract most of the attention, but it takes millions of little discoveries like the new design for the cup and lid to double average income in a nation.

Every generation has perceived the limits to growth that finite resources and undesirable side effects would pose if no new recipes or ideas were discovered. And every generation has underestimated the potential for finding new recipes and ideas. We consistently fail to grasp how many ideas remain to be discovered. The difficulty is the same one we have with compounding: possibilities do not merely add up; they multiply.

In a branch of physical chemistry known as exploratory synthesis, chemists try mixing selected elements together at different temperatures and pressures to see what comes out. About a decade ago, one of the hundreds of compounds discovered this way - a mixture of copper, yttrium, barium, and oxygen - was found to be a superconductor at temperatures far higher than anyone had previously thought possible. This discovery may ultimately have far-reaching implications for the storage and transmission of electrical energy.

To get some sense of how much scope there is for more such discoveries, we can calculate as follows. The periodic table contains about a hundred different types of atoms, which means that the number of ordered combinations of four different elements is about 100 × 99 × 98 × 97 ≈ 94,000,000. A list of numbers like 6, 2, 1, 7 can represent the proportions for using the four elements in a recipe. To keep things simple, assume that the numbers in the list must lie between 1 and 10, that no fractions are allowed, and that the smallest number must always be 1. Then there are about 3,500 different sets of proportions for each choice of four elements, and therefore about 3,500 × 94,000,000, or roughly 330 billion, different recipes in total. If laboratories around the world evaluated 1,000 recipes each day, it would take nearly a million years to go through them all.
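The counting argument is easy to check directly. The short Python sketch below enumerates the proportion lists by brute force (four entries in 1 to 10, at least one equal to 1) and reproduces the figures quoted above; the variable names are mine, not the author's.

# Verify the "recipes" arithmetic from the text.
from itertools import product

# Ordered choices of 4 distinct elements from ~100 atom types.
element_choices = 100 * 99 * 98 * 97          # = 94,109,400 (~94 million)

# Proportion lists: four integers in 1..10, smallest must be 1.
proportions = sum(1 for p in product(range(1, 11), repeat=4) if min(p) == 1)
# Equivalently 10**4 - 9**4 = 3,439 (~3,500).

recipes = element_choices * proportions       # ~3.2e11 (~330 billion)
years = recipes / 1000 / 365                  # at 1,000 recipes per day
print(f"{element_choices:,} element choices")
print(f"{proportions:,} proportion lists")
print(f"{recipes:,} recipes, ~{years:,.0f} years at 1,000/day")

Running it gives about 3.2 × 10^11 recipes and roughly 900,000 years, matching the "nearly a million years" in the text.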

Monday, November 2, 2009

Where We Stand


At present, the standard model of particle physics stands triumphant. It has survived testing far beyond the range of energies for which it was crafted, and to far greater precision.
Even the “ugly” parts look good. Unlike the gauge part of the standard model, whose parameters are few in number (namely, three) and have a beautiful geometric interpretation, the part dealing with fermion masses and mixings contains many parameters (about two dozen in the minimal model) that appear merely as abstract numbers describing the magnitudes of Yukawa-type couplings of one or more hypothetical Higgs fields. In the present state of theory, all these numbers must be taken from experiment. Nevertheless, the framework is very significantly constrained and predictive.
From the underlying hypotheses of renormalizable local quantum field theory and three-generation structure, we derive that a 3 × 3 unitary matrix, the CKM (Cabibbo-Kobayashi-Maskawa) matrix, must describe a multitude of a priori independent decay rates and mixing phenomena, including several manifestations of CP violation.
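For concreteness, the standard parametrization of this unitary matrix (textbook material, not specific to this essay) writes it in terms of three mixing angles and one CP-violating phase \delta, with s_{ij} = \sin\theta_{ij} and c_{ij} = \cos\theta_{ij}:

V_{\mathrm{CKM}} =
\begin{pmatrix}
 c_{12} c_{13} & s_{12} c_{13} & s_{13} e^{-i\delta} \\
 -s_{12} c_{23} - c_{12} s_{23} s_{13} e^{i\delta} & c_{12} c_{23} - s_{12} s_{23} s_{13} e^{i\delta} & s_{23} c_{13} \\
 s_{12} s_{23} - c_{12} c_{23} s_{13} e^{i\delta} & -c_{12} s_{23} - s_{12} c_{23} s_{13} e^{i\delta} & c_{23} c_{13}
\end{pmatrix}

Just four physical parameters thus control the multitude of decay and mixing observables mentioned above, and unitarity ties them all together; that is what makes the framework so constrained and predictive.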
Phenomena associated with neutrino masses, and with gravity, are commonly regarded as beyond, or at least outside, the standard model. Of course, where one draws the boundary of the standard model is largely a matter of taste. But it's appropriate to emphasize that our working descriptions both of neutrino masses and of gravity fit smoothly and naturally into the conceptual framework associated with the “core” standard model of strong and electroweak interactions. Specifically, neutrino masses can be accommodated using dimension-5 operators, and gravity through the Einstein-Hilbert curvature term and minimal coupling to matter (we can also include a cosmological term). The deep guiding principles that underlie the standard model, to wit local quantum field theory based on operators of the lowest available mass dimension, also work to give theories of neutrino masses and of gravity that describe all existing observations in terms of a small number of parameters.
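The dimension-5 accommodation of neutrino masses referred to here is the Weinberg operator (again standard material, supplied for concreteness): with lepton doublets L_i, the Higgs doublet H, dimensionless coefficients c_{ij} and a heavy mass scale \Lambda,

\mathcal{L}_5 = \frac{c_{ij}}{\Lambda}\,(L_i H)(L_j H) + \mathrm{h.c.} \qquad \Longrightarrow \qquad m_\nu \sim \frac{c\, v^2}{\Lambda}

After electroweak symmetry breaking, with v \approx 246 GeV, a scale \Lambda of order 10^{14}-10^{15} GeV yields sub-eV neutrino masses, which is why this lowest-dimension operator fits the observations so economically.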
Altogether, the standard model supplies an economical, precise and (we now know) extraordinarily accurate description of an enormous range of phenomena. It supplies, in particular, firm and adequate foundations for chemistry (including biochemistry), materials science, and most of astrophysics. We should be very proud of what we, as a community stretching across continents and generations, have accomplished.

Friday, October 30, 2009

Photovoltaics market prospects


The global photovoltaic (PV) market's robust 40% growth is likely to slow in 2009, and the market may even decline. This is partly due to the worldwide recession, which has slowed planned projects and forced down prices, but an even bigger factor is that Spain's huge boom of 2008 won't be repeated this year. The 2.5GW of PV installations in Spain last year were close to half of the world's total of 5.5GW, according to the European PV Industry Association (EPIA), but that surge was something of a fluke.

While Spain did want to spur PV last year, officials miscalculated the feed-in tariffs on offer, so that new installations could be paid off in as little as a year. This led to explosive growth that won't be repeated in 2009, because Spain has capped the installations it will support at 0.5GW. That means the rest of the world market would somehow have to make up that 2GW shortfall to match last year, which industry observers believe is unlikely.

Italy is a wild card in this year's market, however. GSE, the state-run power agency, wants to jump-start its PV industry, so it is offering feed-in tariffs similar to Spain's. The agency's goal is to more than double the number of installations this year, reaching a total of 900MW to 1GW. GSE expects to support some 70,000 new projects, mostly roof-mounted PV panels in northern Italy. GSE is projecting its market to grow to 1,500MW in 2010, an increase of 135% from 2006 to 2010, according to Johan Trip of Solarplaza. This would move Italy up to #2 in Europe, behind only Germany. The German market is somewhat similar to Italy's in that about 40% of its installed PV is in small systems of less than 10kW, rather than in ground-based utility installations of more than 1MW.

The surge in Spain enabled the global PV market to reach $37.1B in 2008, more than double the 2007 total of $17.2B. The 5.5GW installed last year brought the cumulative global total of PV capacity to 15GW. Spain led the world with 2.5GW, followed by Germany with 1.5GW; the top five also included the US (324MW), Italy (260MW), and South Korea (247MW). By 2013, with appropriate policies and feed-in tariffs, total installed PV capacity could reach 22GW, according to the association's projections.

A European Renewable Energy Directive requires that 20% of the total energy output in Europe in 2020 come from renewable sources. Each member state of the Common Market will have to specify how it intends to reach the 20% goal in its own country.

In Italy, with attractive feed-in tariffs and module costs about half what they were last year, according to Solarplaza, a market boom is developing, with more than 600 companies participating. Developers in southern Italy are aiming for more large-scale power plants rather than the smaller rooftop panels sprouting all over northern Italy. One player is Premier Power Renewable Energy, of El Dorado Hills, CA, which in May reported that it had acquired privately held Arco Energy, an Italian solar project developer. Initially the partners say they will concentrate on large-scale green-field solar farms, but later they intend to move into rooftop and then building-integrated photovoltaic (BIPV) installations all over Italy. By 2011, some believe, grid electricity may become so costly that incentives will not be needed to make solar installations profitable. Premier indicates that it believes Italy may reach 3GW of installed capacity by 2016.

But reaching this year's target of 900MW will be a stretch, some in the industry believe, because of the lack of infrastructure. More likely is about 500MW, but even that would double last year's installations.

While the bulk of PV installations use crystalline silicon modules, thin film is about 10% of the market. This could grow to 30% by 2013, according to Andy London, global manager of Heraeus' photovoltaic business unit in West Conshohocken, PA -- but with a caveat. Commercial thin-film installations use cadmium telluride (CdTe) cells, although a bright future is also expected for copper indium gallium selenide (CIGS) cells because of the better match of their response to the solar spectrum. There could be a hitch, however, London pointed out -- due to the toxicity of cadmium, it may be banned in the future in the European Community.

While the global PV market may stall this year, analysts expect it to resume its strong growth in 2010 and beyond as prices rise for fossil fuels and growing world economies compete for a limited supply.

Thursday, October 29, 2009

NASA's Spitzer Space Telescope has discovered an enormous ring around Saturn


NASA's Spitzer Space Telescope has discovered an enormous ring around Saturn -- by far the largest of the giant planet's many rings.

The new belt lies at the far reaches of the Saturnian system, with an orbit tilted 27 degrees from the main ring plane. The bulk of its material starts about six million kilometers (3.7 million miles) away from the planet and extends outward roughly another 12 million kilometers (7.4 million miles). One of Saturn's farthest moons, Phoebe, circles within the newfound ring, and is likely the source of its material.

Saturn's newest halo is thick, too -- its vertical height is about 20 times the diameter of the planet. It would take about one billion Earths stacked together to fill the ring.
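The "one billion Earths" figure is easy to sanity-check from the dimensions given above. The rough Python estimate below treats the ring as a cylindrical shell (inner edge 6 million km, outer edge 18 million km, height about 20 Saturn diameters); the geometry is my simplification for illustration, not the authors' calculation.

# Rough sanity check of "about one billion Earths" for the ring volume.
# Cylindrical-shell geometry is an assumption for illustration only.
import math

r_inner = 6.0e6                       # km, inner edge (from the article)
r_outer = 6.0e6 + 12.0e6              # km, extends ~12 million km further
height = 20 * 120_536                 # km, ~20 Saturn diameters thick

ring_volume = math.pi * (r_outer**2 - r_inner**2) * height
earth_volume = (4 / 3) * math.pi * 6_371.0**3   # Earth radius ~6,371 km

print(f"ring volume  ~ {ring_volume:.2e} km^3")
print(f"earth volume ~ {earth_volume:.2e} km^3")
print(f"ratio        ~ {ring_volume / earth_volume:.1e} Earths")

The crude estimate comes out around 2 × 10^9 Earths, within a factor of a few of one billion, which is as close as an order-of-magnitude check can hope to get.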

"This is one supersized ring," said Anne Verbiscer, an astronomer at the University of Virginia, Charlottesville. "If you could see the ring, it would span the width of two full moons' worth of sky, one on either side of Saturn." Verbiscer; Douglas Hamilton of the University of Maryland, College Park; and Michael Skrutskie, of the University of Virginia, Charlottesville, are authors of a paper about the discovery to be published by the journal Nature.

The ring itself is tenuous, made up of a thin array of ice and dust particles. Spitzer's infrared eyes were able to spot the glow of the band's cool dust. The telescope, launched in 2003, is currently 107 million kilometers (66 million miles) from Earth in orbit around the sun.

The discovery may help solve an age-old riddle of one of Saturn's moons. Iapetus has a strange appearance -- one side is bright and the other is really dark, in a pattern that resembles the yin-yang symbol. The astronomer Giovanni Cassini first spotted the moon in 1671, and years later figured out it has a dark side, now named Cassini Regio in his honor.

Saturn's newest addition could explain how Cassini Regio came to be. The ring is circling in the same direction as Phoebe, while Iapetus, the other rings and most of Saturn's moons are all going the opposite way. According to the scientists, some of the dark and dusty material from the outer ring moves inward toward Iapetus, slamming the icy moon like bugs on a windshield.

"Astronomers have long suspected that there is a connection between Saturn's outer moon Phoebe and the dark material on Iapetus," said Hamilton. "This new ring provides convincing evidence of that relationship."

Verbiscer and her colleagues used Spitzer's longer-wavelength infrared camera, called the multiband imaging photometer, to scan through a patch of sky far from Saturn and a bit inside Phoebe's orbit. The astronomers had a hunch that Phoebe might be circling around in a belt of dust kicked up from its minor collisions with comets -- a process similar to that around stars with dusty disks of planetary debris. Sure enough, when the scientists took a first look at their Spitzer data, a band of dust jumped out.

The ring would be difficult to see with visible-light telescopes. Its particles are diffuse and may even extend beyond the bulk of the ring material all the way in to Saturn and all the way out to interplanetary space. The relatively small numbers of particles in the ring wouldn't reflect much visible light, especially out at Saturn where sunlight is weak.

"The particles are so far apart that if you were to stand in the ring, you wouldn't even know it," said Verbiscer.

Spitzer was able to sense the glow of the cool dust, which is only about 80 Kelvin (minus 316 degrees Fahrenheit). Cool objects shine with infrared, or thermal radiation; for example, even a cup of ice cream is blazing with infrared light. "By focusing on the glow of the ring's cool dust, Spitzer made it easy to find," said Verbiscer.
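Wien's displacement law makes clear why such cold dust is an infrared target: blackbody emission at 80 K peaks near 36 microns, squarely in the range Spitzer's long-wavelength instruments probe. A quick check (standard physics, not a calculation from the article):

# Wien's displacement law: peak wavelength of blackbody emission.
WIEN_B = 2.898e-3          # m*K (Wien's displacement constant)

def peak_wavelength_microns(temperature_kelvin: float) -> float:
    """Wavelength at which a blackbody at the given temperature peaks."""
    return WIEN_B / temperature_kelvin * 1e6

print(peak_wavelength_microns(80.0))    # ring dust: ~36 microns (infrared)
print(peak_wavelength_microns(5778.0))  # the Sun: ~0.5 microns (visible)

Spitzer's multiband imaging photometer observes at 24, 70 and 160 microns, bracketing that 36-micron peak, which is why the cool ring jumped out of its data.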

These observations were made before Spitzer ran out of coolant in May and began its "warm" mission.


Wednesday, October 28, 2009

Tamiflu boosts Roche sales figures


Roche, the Swiss pharmaceuticals group best known for its Tamiflu influenza treatment and its powerful cancer drugs, raised guidance for full-year sales in 2009 after turnover in the first nine months came in significantly higher than expected.

The sales figures – Roche publishes no profit figures at the nine-month stage – were boosted by bumper demand for Tamiflu amid persistent fears about a global flu pandemic. Tamiflu sales of SFr2bn ($1.9bn) in the first nine months were more than four times higher than in the same period last year. In the third quarter, sales reached SFr994m – nearly 10 times more than in the same period last year.

The increase in orders prompted Roche to raise its full-year sales target for the anti-viral treatment from SFr2bn to SFr2.7bn and almost double its 2010 estimate from SFr400m to SFr700m.

The group, which successfully completed the $47bn acquisition of its majority owned Genentech operation in the US and is integrating the two companies, said sales in both drugs and diagnostics – its two core divisions – had risen strongly, and well above market rates.

As a result, Roche raised its forecast for sales growth in pharmaceuticals – its dominant division – to “at least high single-digit growth” for the year. The group sales estimate was also tweaked upwards, with Roche now predicting sales to rise “well ahead” of the market, compared with just “ahead” previously.

Analysts welcomed the figures and revised targets. The Tamiflu sales figures were more than 50 per cent above expectations, and easily outweighed slight disappointment about sales growth for some of the group’s top cancer drugs. Diagnostics also performed more strongly than expected.

In late morning trading, Roche shares were down just over 1.5 per cent at SFr166.90 on profit taking.

Group sales in the first nine months rose by 9 per cent to SFr36.4bn. Drug sales climbed by 11 per cent to SFr29bn, while diagnostics were 4 per cent higher at SFr7.4bn.

“The Roche group continued to perform very strongly in the third quarter ... Based on this performance, we expect another very good full-year result,” noted Severin Schwan, chief executive.

Roche said integration with Genentech was now achieving “substantial productivity gains” and should be largely completed by the end of the year. The group noted that, by 2011, it expected annual pre-tax synergies of about SFr1bn from the deal.

Strong operating cash flow meant that Roche expected to reduce the significant debt it assumed for the Genentech purchase, and return to a position of positive net cash by 2015.

Tuesday, October 27, 2009

2009 Nobel Prize in Physics


This year's Nobel Prize in Physics is awarded for two scientific achievements that have helped to shape the foundations of today’s networked societies. They have created many practical innovations for everyday life and provided new tools for scientific exploration.

In 1966, Charles K. Kao, a Director of Engineering at Standard Telecommunication Laboratories, Harlow, UK and later Vice-Chancellor at the Chinese University of Hong Kong, made a discovery that led to a breakthrough in fiber optics. He carefully calculated how to transmit light over long distances via optical glass fibers. With a fiber of purest glass it would be possible to transmit light signals over 100 kilometers, compared to only 20 meters for the fibers available in the 1960s. Kao's enthusiasm inspired other researchers to share his vision of the future potential of fiber optics. The first ultrapure fiber was successfully fabricated just four years later, in 1970.
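The jump from 20 meters to 100 kilometers comes down to attenuation, which is exponential in distance: transmitted power falls as 10^(-αL/10) for loss α in dB/km. The Python sketch below compares 1960s glass with Kao's famous 20 dB/km target and with modern fiber; the specific loss figures are commonly quoted round numbers I am supplying for illustration, not values from this article.

# How far can a signal travel before dropping to 1% of its power?
# Transmitted fraction after L km at alpha dB/km: 10 ** (-alpha * L / 10)
def distance_to_one_percent(alpha_db_per_km: float) -> float:
    """Distance at which power falls to 1% (a 20 dB total loss)."""
    return 20.0 / alpha_db_per_km   # km

fibers = {
    "1960s glass (~1000 dB/km)": 1000.0,
    "Kao's 1966 target (20 dB/km)": 20.0,
    "modern fiber (~0.2 dB/km)": 0.2,
}
for name, alpha in fibers.items():
    print(f"{name}: 1% power at ~{distance_to_one_percent(alpha):g} km")

Note how the article's own numbers fall out: roughly 20 meters for 1960s glass, and about 100 kilometers for glass near modern purity, with amplifiers extending the reach indefinitely.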

Today optical fibers make up the circulatory system that nourishes our communication society. These low-loss glass fibers facilitate global broadband communication such as the Internet. Light flows in thin threads of glass, and it carries almost all of the telephony and data traffic in each and every direction. Text, music, images and video can be transferred around the globe in a split second.

If we were to unravel all of the glass fibers that wind around the globe, we would get a single thread over one billion kilometers long – which is enough to encircle the globe more than 25,000 times – and is increasing by thousands of kilometers every hour.

A large share of the traffic is made up of digital images, which constitute the second part of the award. In 1969 Willard S. Boyle and George E. Smith invented the first successful imaging technology using a digital sensor, a CCD (Charge-Coupled Device). The CCD technology makes use of the photoelectric effect, as theorized by Albert Einstein and for which he was awarded the 1921 Nobel Prize. Through this effect, light is transformed into electric signals. The challenge when designing an image sensor was to gather and read out the signals in a large number of image points, or pixels, in a short time.
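The readout scheme that solved that challenge is often described as a "bucket brigade": accumulated charge is shifted row by row into a serial register, which is then shifted pixel by pixel past a single output amplifier. The toy Python sketch below mimics the process on a small array; it is a conceptual illustration of charge-coupled readout, not a model of any real sensor.

# Toy "bucket brigade" readout of a CCD: shift rows into a serial
# register, then shift the register out one pixel at a time.
import numpy as np

sensor = np.arange(12).reshape(3, 4)     # pretend accumulated charge
rows, cols = sensor.shape
readout = []

for _ in range(rows):
    serial_register = sensor[-1].copy()  # bottom row moves into register
    sensor[1:] = sensor[:-1].copy()      # every other row shifts down one
    sensor[0] = 0                        # top row is now empty
    for _ in range(cols):
        readout.append(int(serial_register[-1]))  # one pixel to the amp
        serial_register[1:] = serial_register[:-1].copy()
        serial_register[0] = 0

print(readout)   # [11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]

The key design point is that every pixel's charge reaches one shared, carefully characterized amplifier, which is part of why CCDs achieved such uniform, low-noise images.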

The CCD is the digital camera's electronic eye. It revolutionized photography, as light could now be captured electronically instead of on film. The digital form facilitates the processing and distribution of these images. CCD technology is also used in many medical applications, e.g. imaging the inside of the human body, both for diagnostics and for microsurgery.

CCD technology, which transforms patterns of light into useful digital information, is the basis for many forms of modern imaging.

Leveraging pioneering foundational work in both the transistor and solar cell technologies, both of which were invented at Bell Labs, Doctors Boyle and Smith designed and developed the first CCD in 1969. By 1970, the Bell Labs researchers had built the CCD into the world’s first solid-state video camera. In 1975, they demonstrated the first CCD camera with image quality sharp enough for broadcast television.

Since its invention, the CCD has spawned significant new industries and markets by enabling a wide range of products including digital cameras, camcorders, HDTV, security monitoring, medical endoscopy, modern astronomy, and video conferencing to name a few. The insights behind CCDs also played a crucial role in the emergence of optical networking, which is the underlying transport technology for both the Internet and all other core communication networks today.

Beginning in 1983, telescopes were first outfitted with solid-state CCD cameras, which enabled astronomers to study objects thousands of times fainter than the most sensitive photographic plates could record, and to image in seconds what would previously have taken hours. Today, most optical observatories, including the Hubble Space Telescope, rely on digital imaging systems built around “mosaics” of ultra-sensitive CCD chips. CCD-equipped cameras are also used in satellite observations of the Earth for environmental monitoring, surveying, and surveillance.