
Tuesday, October 30, 2007

New technology improves the reliability of wind turbines




The world's first commercial Brushless Doubly-Fed Generator (BDFG) is to be installed on a 20kW turbine at or close to the University of Cambridge Engineering Department's Electrical Engineering Division Building on the West Cambridge site by early 2008. This will help the University meet its obligations under new legislation, which requires a new building to obtain ten percent of its electricity from renewable sources.


The research team, led by Dr Richard McMahon, has developed a new generator technology for the wind turbine industry to the point of commercial exploitation. This type of generator can be used in a wide spectrum of wind turbines, ranging from multi-megawatt systems for wind farms down to micro turbines used for domestic power generation.


Research in Cambridge on this type of generator was started by Professor Williamson in the 1990s and has been carried forward since 1999 by Dr McMahon and his team, in collaboration with Durham's Head of Engineering, Professor Peter Tavner. The research has recently matured, enabling practical and complete designs to be made with confidence, and Wind Technologies Ltd has recently been founded to exploit the technology.


"We are very excited about the new installation. This will be the first time BDFG is to be used commercially. The benefits to the wind power industry are clear: higher reliability, lower maintenance and lower production costs" recalls co-researcher and Managing Director of Wind Technologies, Dr Ehsan Abdi. "The West Cambridge medium size turbine should successfully demonstrate the applicability of the new generator, and we hope it will encourage the developers of other new construction projects to consider local wind-powered electricity generation to meet their obligations", adds Ehsan.


On a larger scale, a 600kW generator built by Wind Technologies is to be tested on a DeWind turbine in Germany, starting next spring. Its planned one-year test should demonstrate the improved performance of the BDFG technology to the key players in this industry. "This will put Wind Technologies in a position of strength in subsequent discussions on a technology trade sale, licensing or partnering with large generator manufacturers, which is the strategy of choice for tackling this concentrated marketplace," says Ehsan.


A contemporary Brushless Doubly-Fed Machine (BDFM) is a single frame induction machine with two 3-phase stator windings of different pole numbers, and a special rotor design. Typically one stator winding is connected to the mains or grid, and hence has a fixed frequency, and the other is supplied with variable voltage at variable frequency from a converter.
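
To make the frequency relationship concrete: in its synchronous mode, the shaft speed of a BDFM is set by the sum of the two stator supply frequencies and the sum of the two pole-pair numbers. The following is a minimal illustrative sketch, not code from the research team; the pole-pair numbers of 2 and 4 are assumed purely for illustration.

    # Illustrative BDFM shaft-speed calculation (assumed example values).
    # In the synchronous mode, shaft speed depends on the sum of the two
    # stator frequencies and the sum of the two pole-pair numbers:
    #     n_rpm = 60 * (f_grid + f_converter) / (p1 + p2)

    def bdfm_speed_rpm(f_grid_hz, f_converter_hz, p1, p2):
        """Shaft speed (rpm) of a BDFM operating in its synchronous mode."""
        return 60.0 * (f_grid_hz + f_converter_hz) / (p1 + p2)

    # Assumed configuration: 50 Hz grid winding, pole pairs p1 = 2 and p2 = 4.
    # Negative converter frequency denotes a reversed phase sequence.
    for f_conv in (-10.0, 0.0, 10.0):
        print(f_conv, bdfm_speed_rpm(50.0, f_conv, 2, 4))
    # Prints 400, 500 and 600 rpm: the converter shifts the speed around the
    # "natural" speed of 60 * 50 / 6 = 500 rpm, which is how variable-speed
    # operation is achieved without brushes.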


In the majority (more than 90%) of newly-installed wind turbines in the world, generation is from a slip-ring generator. There are drawbacks to the use of slip-ring generators, particularly the additional cost and bulk of a machine which incorporates slip-rings and the need to maintain brush-gears including replacement of the brushes on a regular basis. Studies have shown that problems with brush-gear are a significant issue in wind turbine operation and reliability, and that the problem will be more severe in machines deployed offshore where there are stronger winds and accessibility is impaired.


The project received the Scientific Instrument Makers Award and the Cambridge University Entrepreneurs Business Idea Award in 2004. In 2005, the Institution of Electrical Engineers, now the Institution of Engineering and Technology, added its Innovation in Engineering Award. The company has recently received grants from Cambridge University Challenge Fund and East of England Development Agency to carry out market assessment, file patents and complete the pre-production prototype.


Harnessing wind power for electricity generation is becoming ever more common, both by large-scale wind farms, and increasingly by small domestic installations, with the UK the world's leading market for micro wind generation. "We hope that our generator, through offering high reliability and low maintenance, will significantly contribute to the wider adoption of wind power generation, particularly in offshore developments. This will lead to significant reductions in CO2 emission and further reduce our dependency on fossil fuels" says Ehsan.





String Theory's Next Top Model




Ernest Rutherford used to tell his physics students that if they couldn't explain a concept to a barmaid, they didn't really understand the concept. With regard to the cosmological implications of string theory, the barmaids and physicists are both struggling-a predicament that SLAC string theorist Shamit Kachru hopes to soon resolve.
String theory is currently the most popular candidate for a unified theory of the fundamental forces, but it is not completely understood-and experimental evidence is notoriously elusive. Physicists can, however, gain crucial insight into the theory by evaluating how accurately its models can predict the observed universe.


Using this indirect approach, Kachru, in collaboration with theorists at Rutgers University and the Massachusetts Institute of Technology, sought models that could reproduce inflation-the prevailing cosmological paradigm in which the nascent universe experienced a fleeting period of exponential expansion.


Although there is already a substantial body of literature presenting such models-spawned in part by publications of Kachru and his Stanford and SLAC colleagues Renata Kallosh, Andrei Linde and Eva Silverstein in 2003-the complexity of the models leaves room for doubt.


"They incorporate inflation, and they're the most realistic models of string theory," Kachru said, "but they're complicated. They're fancy. They have a lot of 'moving parts,' and we need to fine-tune all of them, so we can't verify anything to a high degree of accuracy. It forces us to ask-are we confident that we really understand what's going on?"


To achieve a comprehensive understanding of how inflation can be embedded in string theory, Kachru and his collaborators employed a pedagogical tactic. "What we wanted was an explicit 'toy' model," Kachru explained. "The goal wasn't to have something realistic, but to allow us to understand everything to every detail."
"There are deep conceptual questions about how inflation is supposed to work," Kachru continued. "In order to understand these issues, it's best to have a simple model. There's so much clutter in the complicated examples, you can't disentangle the conceptual issues from the clutter."


The group investigated three versions of the simplest formulation of string theory, and found that they were incompatible with inflation. "This means we're going to have to consider slightly more complicated scenarios," said Kachru. "There are a lot of levels between this and the fancier working models, so we'll find one eventually."
"There are deep conceptual questions about how inflation is supposed to work," Kachru continued. "In order to understand these issues, it's best to have a simple model. There's so much clutter in the complicated examples, you can't disentangle the conceptual issues from the clutter."


The group investigated three versions of the simplest formulation of string theory, and found that they were incompatible with inflation. "This means we're going to have to consider slightly more complicated scenarios," said Kachru. "There are a lot of levels between this and the fancier working models, so we'll find one eventually."


Kachru and his colleagues published their work in Physical Review D, providing a framework for others in search of simple inflationary models of string theory. "There are so many successful models out there that incorporate string theory and inflation, so we'll undoubtedly find a simpler version."





Laser Surgery Can Cut Flesh With Micro-explosions





Laser Surgery Can Cut Flesh With Micro-explosions Or With Burning


Lasers are at the cutting edge of surgery. From cosmetic to brain surgery, intense beams of coherent light are gradually replacing the steel scalpel for many procedures.



Despite this increasing popularity, there is still a lot that scientists do not know about the ways in which laser light interacts with living tissue. Now, some of these basic questions have been answered in the first investigation of how ultraviolet lasers -- similar to those used in LASIK eye surgery -- cut living tissues.


The effect that powerful lasers have on actual flesh varies both with the wavelength, or color, of the light and the duration of the pulses that they produce. The specific wavelengths of light that are absorbed by, reflected from or pass through different types of tissue can vary substantially. Therefore, different types of lasers work best in different medical procedures.


For lasers with pulse lengths of a millionth of a second or less, there are two basic cutting regimes:


Mid-infrared lasers with long wavelengths cut by burning. That is, they heat up the tissue to the point where the chemical bonds holding it together break down. Because they automatically cauterize the cuts that they make, infrared lasers are used frequently for surgery in areas where there is a lot of bleeding.
Shorter wavelength lasers in the near-infrared, visible and ultraviolet range cut by an entirely different mechanism. They create a series of micro-explosions that break the molecules apart. During each laser pulse, high-intensity light at the laser focus creates an electrically-charged gas known as a plasma. At the end of each laser pulse, the plasma collapses and the energy released produces the micro-explosions. As a result, these lasers -- particularly the ultraviolet ones -- can cut more precisely and produce less collateral damage than mid-infrared lasers. That is why they are being used for eye surgery, delicate brain surgery and microsurgery.
"This is the first study that looks at the plasma dynamics of ultraviolet lasers in living tissue," says Shane Hutson, assistant professor of physics at Vanderbilt University who conducted the research with post-doctoral student Xiaoyan Ma. "The subject has been extensively studied in water and, because biological systems are overwhelmingly water by weight, you would expect it to behave in the same fashion. However, we found a surprising number of differences."


One such difference involves the elasticity, or stretchiness, of tissue. By stretching and absorbing energy, the biological matrix constrains the growth of the micro-explosions. As a result, the explosions tend to be considerably smaller than they are in water. This reduces the damage that the laser beam causes while cutting flesh. This effect had been predicted, but the researchers found that it is considerably larger than expected.


Another surprising difference involves the origination of the individual plasma "bubbles." All it takes to seed such a bubble is a few free electrons. These electrons pick up energy from the laser beam and start a cascade process that produces a bubble that grows until it contains millions of quadrillions of free electrons. Subsequent collapse of this plasma bubble causes a micro-explosion. In pure water, it is very difficult to get those first few electrons. Water molecules have to absorb several light photons at once before they will release any electrons. So a high-powered beam is required.


"But in a biological system there is a ubiquitous molecule, called NADH, that cells use to donate and absorb electrons. It turns out that this molecule absorbs photons at near ultraviolet wavelengths. So it produces seed electrons when exposed to ultraviolet laser light at very low intensities," says Hutson. This means that in tissue containing significant amounts of NADH, ultraviolet lasers don't need as much power to cut effectively as people have thought.


The cornea in the eye is an example of tissue that has very little NADH. As a result, it responds to an ultraviolet laser beam more like water than skin or other kinds of tissue, according to the researcher.


"Now that we have a better sense of how tissue properties affect the laser ablation process, we can do a better job of predicting how the laser will work with new types of tissue,"






New Superlensing Technique Brings Everything into Focus




Ultratight focusing over very short distances beats the best lenses; the discovery could bring the nanoworld up close and into focus


Light cannot be focused on anything smaller than its wavelength-or so says more than a century of physics wisdom. But a new study now shows that it is possible, if light is focused extremely close to a very special kind of lens.


The traditional limit on the resolution of light microscopes, which depends on the sharpness of focusing, is a typical wavelength of visible light (around 500 nanometers). This limitation inspired the invention of the electron microscope for viewing smaller objects like viruses that are only 10 to 300 nanometers in size. But scientists have discovered that placing a special pattern of circles in front of a laser enables them to focus its beam down to 50 nanometers, tiny enough to illuminate viruses and nanoparticles.
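
For reference, the conventional diffraction limit on how tightly a lens can focus light is the standard textbook relation (not specific to this study):

    d_{\min} \;\approx\; \frac{\lambda}{2\,\mathrm{NA}}

where \lambda is the wavelength and NA is the numerical aperture of the lens. Since NA is at most about 1 in air, the smallest conventional spot is of order half a wavelength, broadly consistent with the few-hundred-nanometer figure quoted above.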



To achieve this, scientists draw opaque concentric circles on a transparent plate with much shorter spacing than the wavelength of light, and vary the line thickness so that the circles are far apart at the center but practically overlap near the edge. This design ensures that light transmitted through the plate is brightest at the core and dimmer around the edges.


"This construction is a way to convert traveling waves into evanescent waves," says Roberto Merlin, a physicist at the University of Michigan at Ann Arbor and author of the study published today in the online journal Science Express. Unlike ordinary light waves (such as sunlight), which can travel forever, evanescent waves traverse only very short distances before dying out. Whereas most of the light shining on such a plate is reflected back, a portion of the light leaks out the other side in the form of evanescent waves. If these waves, which have slipped through the different slits between the circles, can blend before disappearing, they form a single bright spot much smaller than the wavelength. The plate effectively acts like a "superlens", and the focal length or distance between the lens and the spot is nearly the same as that between the plate's bright center and dim edge; the size of the spot is fixed by the spacing between the circles.


With current nanofabrication technology, scientists say, it is not unreasonable to imagine circles spaced 50 nanometers apart giving a comparable spot size, which is about 10 times smaller than what conventional lenses can achieve. The rub, however, is that the smaller the spot, the faster it fades away from the plate. For instance, the intensity of the spot from a circle spacing of 50 nanometers would halve every 5.5 nanometers away from the plate, so anything that needs lighting would have to be extremely close to it. Positioning with such nanometer-scale precision is well within present technological acumen, and is routinely used in other microscopy techniques like scanning tunneling microscopy.
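
The quoted 5.5-nanometer halving distance can be reproduced with a simple estimate. Assume, for illustration, that the dominant evanescent component produced by circles spaced d = 50 nanometers apart has a transverse wavenumber of roughly pi/d; because this greatly exceeds the free-space wavenumber, the field decay constant is approximately pi/d as well.

    # Illustrative decay-length estimate for an evanescent focal spot.
    import math

    d = 50e-9                       # assumed circle spacing, m
    kappa = math.pi / d             # approximate field decay constant, 1/m
    # Intensity ~ exp(-2 * kappa * z); it halves when 2 * kappa * z = ln 2.
    z_half = math.log(2) / (2 * kappa)
    print(round(z_half * 1e9, 1))   # ~5.5 nm, matching the figure quoted above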


There are currently other ongoing investigations of so-called superlensing schemes, but researchers say this technique may be more tolerant to variations in the color of light used. This is important because scientists would want to use the brightest lasers available (usually pulsed lasers spanning a range of colors) due to the attenuation in intensity of the focused spot.


Merlin and his U.M. collaborator Anthony Grbic, an electrical engineer, are wrapping up the construction of a focusing system for microwaves based on the above theory; they say they're confident it will focus 30-centimeter wavelength microwaves onto a spot 1.5 centimeters wide. If a similar system could be built for light, it would enable the study of viral and nanoparticle structures by focusing light on them and detecting their scattered light. Other potential applications include larger capacity CDs and DVDs, which are currently limited by the size of the laser dot used to encode individual bits.






Astronauts Enter Harmony For First Time




ESA astronaut Paolo Nespoli and Expedition 16 Commander Peggy Whitson were the first astronauts to enter the newest addition to the International Space Station after opening Harmony's hatch at 14:24 CEST (12:24 UT). "It's a pleasure to be here in this very beautiful piece of hardware," said Nespoli, during a short ceremony to mark the occasion. "I would like to thank everybody who worked hard in making this possible and allowing the Space Station to be built even further."
Harmony is the second of three nodes, or connecting modules, for the Station; its arrival paves the way for the addition of the European Columbus laboratory and the Japanese Kibo laboratory during upcoming Shuttle missions.


Carried into space inside Space Shuttle Discovery's cargo bay, the Italian-built Harmony is the first addition to the Space Station's pressurized volume for six years. The module adds an extra 34 m3 of living and working space to the orbital outpost.



First European module


The Harmony module, also known as Node 2, was installed in its temporary location on the port side of the Node 1 module during a six and a half hour spacewalk yesterday.


Once Discovery has undocked at the end of the STS-120 mission, the Expedition 16 crew will relocate Harmony to its permanent location at the forward end of the US Destiny lab.


Built in Europe with Thales Alenia Space as prime contractor, under a barter agreement between NASA and ESA that includes the Shuttle launch of the European Columbus laboratory, Harmony is the first European-built module to be permanently attached to the ISS.







Public Health Risks from Climate Change are Key Concern - Comments by Professor Jonathan A. Patz


Message not to be Lost in Debate: Public Health Risks from Climate Change are Key Concern - Oct 26, 2007


Comment by Professor Jonathan A. Patz, University of Wisconsin, Madison


With all the attention over the Bush Administration's mishandling of Senate testimony by CDC Director Julie Gerberding, I fear that the central, clear message is being overshadowed by the (albeit errant) procedural aspects of the situation. Having served as a Lead Author for the United Nations Intergovernmental Panel on Climate Change (IPCC) reports of 1995, 1998, 2001, and 2007 (and Health Co-Chair for the US National Assessment on Climate Change), I can reaffirm that the original CDC testimony was scientifically accurate and consistent with IPCC findings.


But more than enough press has focused on the handling of the testimony, and not enough on the important messages that Congress and the American public need to know about Global Warming. These are:


1) Our public's health is indeed at risk from the effects of climate change acting via numerous hazardous exposure pathways, including: more intense and frequent heat waves and storms; ozone smog pollution and increased pollen allergens; insect-borne and water-borne infectious diseases; and disease risks from outside the US - after all, we live in a globalized world. Some benefits from reduced cold and some decline in certain diseases can be expected; however, the scientific assessments have consistently found that, on balance, the health risks outweigh the benefits.


2) The Department of Health and Human Services, which includes the CDC and NIH, is responsible for protecting the health of the American public. To the extent that extremes of climate can have broad population-wide impacts, neither the CDC nor the NIH has directed adequate resources to address climate change, and to date funding has been minimal compared to the size of the health threat.


3) There are potentially large opportunities and co-benefits in addressing the health risks of global warming. Certainly, our public health infrastructure must be strengthened, e.g., by fortifying water supply systems, expanding heat and storm early warning and response programs, and enhancing disease modeling and surveillance. However, energy policy now becomes one and the same as public health policy. Reducing fossil fuel burning will: (a) further reduce air pollution, (b) improve our fitness (e.g., if urban transportation planning allows more Americans to travel by foot or bike rather than by car), and (c) lessen potential greenhouse warming.


In short, the challenges posed by climate change urgently demand improving public health infrastructure AND energy conservation / urban planning policies - as such, climate change can present both enormous health risks and opportunities. But without funding from Congress to address climate change, CDC has its hands tied.


Web Resources:
Global & Sustainable Environmental Health at the University of Wisconsin, Madison
http://www.sage.wisc.edu/pages/health.html


Website for Middle School teachers and students, and the general public
http://www.ecohealth101.org





Space Challenger: Astronomers Simulate Life And Death In The Universe


Stars always evolve in the universe in large groups, known as clusters. Astronomers distinguish these formations by their age and size. The question of how star clusters are created from interstellar gas clouds and why they then develop in different ways has now been answered by researchers at the Argelander Institute for Astronomy at the University of Bonn with the aid of computer simulations. The scientists have solved -- at least at a theoretical level -- one of the oldest astronomical puzzles, namely the question of whether star clusters differ in their internal structure.


This image is of the spidery filaments and newborn stars of the Tarantula Nebula, a rich star-forming region also known as 30 Doradus. This cloud of glowing dust and gas is located in the Large Magellanic Cloud, the nearest galaxy to our own Milky Way, and is visible primarily from the Southern Hemisphere. This image of an interstellar cauldron provides a snapshot of the complex physical processes and chemistry that govern the birth - and death - of stars. (Credit: NASA Jet Propulsion Laboratory (NASA-JPL))



Astronomical observations have shown that all stars are formed in star clusters. Astronomers distinguish between, on the one hand, small and, by astronomical standards, young star clusters ranging in number from several hundred to several thousand stars and, on the other, large high-density globular star clusters consisting of as many as ten million tightly packed stars which are as old as the universe. No one knows how many star clusters there might be of each type, because scientists have not previously managed to fully compute the physical processes behind their genesis.


Stars and star clusters are formed as interstellar gas clouds collapse. Within these increasingly dense clouds, individual "lumps" emerge which, under their own gravitational pull, draw ever closer together and finally become stars. Similar to our "solar wind", the stars send out strong streams of charged particles. These "winds" literally sweep out the remaining gas from the cloud. What remains is a cluster that gradually disintegrates until its component stars can move freely in the interstellar space of the Milky Way.


Scientists believe that our own sun arose within a small star cluster which disintegrated in the course of its development. "Otherwise our planetary system would probably have been destroyed by a star moving close by," says Professor Dr. Pavel Kroupa of the Argelander Institute for Astronomy at Bonn University. In order to achieve a better understanding of the birth and death of stellar aggregations Professor Kroupa and Dr. Holger Baumgardt have developed a computer programme that simulates the influence of the gases remaining in a cluster on the paths taken by stars.


Heavy star clusters live longer


The main focus of this research has been on the question of what the initial conditions must look like if a new-born star cluster is to survive for a long time. The Bonn astronomers discovered that clusters below a certain size are very easily destroyed by the radiation of their component stars. Heavy star clusters, on the other hand, enjoy significantly better "survival chances".


For astronomers, another important insight from this work is that both light and heavy star clusters do have the same origins. As Professor Kroupa explains, "It seems that when the universe was born there were not only globular clusters but also countless mini star clusters. A challenge now for astrophysics is to find their remains." The computations in Bonn have paved the way for this search by providing some valuable theoretical pointers.


The Argelander Institute has recently been equipped with five "GRAPE Computers", which operate at speeds 1,000 times higher than normal PCs. They are being deployed not only in research but also for research-related teaching: "Thanks to the GRAPE facilities, our students and junior academics are learning to exploit the power of supercomputers and the software developed specially for them." The Argelander Institute is regarded world-wide as a Mecca for the computation of stellar processes. Despite their enormous calculating capacity, the machines require several weeks to complete the simulation.


The findings have now been published in the science journal "Monthly Notices of the Royal Astronomical Society" (MNRAS 380, 1589).





Quantum Cascade Laser Nanoantenna Created: with a wide range of potential applications


In a major feat of nanotechnology engineering, researchers from Harvard University have demonstrated a laser with a wide range of potential applications in chemistry, biology and medicine. Called a quantum cascade (QC) laser nanoantenna, the device is capable of resolving the chemical composition of samples, such as the interior of a cell, with unprecedented detail.


Spearheaded by graduate students Nanfang Yu and Ertugrul Cubukcu and by Federico Capasso, the Robert L. Wallace Professor of Applied Physics, all of Harvard's School of Engineering and Applied Sciences, the findings will be published as a cover feature of the October 22 issue of Applied Physics Letters. The researchers have also filed for U.S. patents covering this new class of photonic devices.


The laser's design consists of two gold rods separated by a nanometer-scale gap (a device known as an optical antenna) built on the facet of a quantum cascade laser, which emits invisible light in the region of the spectrum where most molecules have their telltale absorption fingerprints. The nanoantenna creates a light spot of nanometric size, about fifty to a hundred times smaller than the laser wavelength; the spot can be scanned across a specimen to provide chemical images of the surface with superior spatial resolution.



The device consists of an optical antenna fabricated on the facet of a quantum cascade laser emitting infrared light with a wavelength of 7 microns. The Harvard team used nanofabrication techniques to form the optical antenna, which consists of two gold rectangles, each 1.2 microns long, separated by a narrow gap (100 nm). Light from the laser illuminates the antenna, resulting in an intense spot of light in the gap roughly seventy times smaller than the wavelength. This is far smaller than what would be possible with the conventional approach of forming a spot of light by focusing with a lens; due to the wave nature of light, such a spot would have a diameter of more than 7 microns. The figure is an electron microscope micrograph of the facet of the QC laser with the built-in nanoantenna. Also shown are an atomic force microscope topographic image of the antenna and an optical image obtained with a near-field scanning optical microscope, showing the highly localized light spot in the antenna gap. (Credit: Nanfang Yu, Ertugrul Cubukcu, and Federico Capasso)
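
The numbers in the caption are easy to check: a spot seventy times smaller than the 7-micron wavelength is about 100 nanometers, matching the antenna gap, whereas a conventional lens is limited to a spot comparable to the wavelength itself. A quick illustrative check:

    # Quick check of the spot sizes quoted above.
    wavelength = 7e-6                 # QC laser wavelength, m
    gap = 100e-9                      # antenna gap, m
    antenna_spot = wavelength / 70    # "seventy times smaller than the wavelength"
    print(antenna_spot)               # 1e-07 m = 100 nm, i.e. roughly the gap size
    print(wavelength / antenna_spot)  # ~70x smaller than a diffraction-limited spot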



"There's currently a major push to develop powerful tabletop microscopes with spatial resolution much smaller than the wavelength that can provide images of materials, and in particular biological specimens, with chemical information on a nanometric scale," says Federico Capasso.


While infrared microscopes, based on the detection of molecular absorption fingerprints, are commercially available and widely used to map the chemical composition of materials, their spatial resolution is limited by the range of available light sources and optics to well above the wavelength. Likewise, so-called near-field infrared microscopes, which rely on an ultra-sharp metallic tip scanned across the sample surface at nanometric distances, can provide ultrahigh spatial resolution, but their applications are so far strongly limited by the use of bulky lasers with very limited tunability and wavelength coverage.


"By combining Quantum Cascade Lasers with optical antenna nanotechnology we have created for the first time an extremely compact device that will enable the realization of new ultrahigh spatial resolution microscopes for chemical imaging on a nanometric scale of a wide range of materials and biological specimens," says Capasso.


Quantum cascade (QC) lasers were invented and first demonstrated by Capasso and his group at Bell Labs in 1994. These compact millimeter length semiconductor lasers, which are now commercially available, are made by stacking nanometer thick layers of semiconductor materials on top of each other. By varying the thickness of the layers one can select the wavelength of the QC laser across essentially the entire infrared spectrum where molecules absorb, thus custom designing it for a specific application.


In addition by suitable design the wavelength of a particular QCL can be made widely tunable. The range of applications of QC laser based chemical sensors is very broad, including pollution monitoring, chemical sensing, medical diagnostics such as breath analysis, and homeland security.


The team's co-authors are Kenneth Crozier, Assistant Professor of Electrical Engineering, and research associates Mikhail Belkin and Laurent Diehl, all of Harvard's School of Engineering and Applied Sciences; and David Bour, Scott Corzine, and Gloria Höfler, all formerly with Agilent Technologies. The research was supported by the Air Force Office of Scientific Research and the National Science Foundation. The authors also acknowledge the support of two Harvard-based centers, the Nanoscale Science and Engineering Center and the Center for Nanoscale Systems, a member of the National Nanotechnology Infrastructure Network.





Brain Acts Differently For Creative And Noncreative Thinkers


How does the brain act during creativity?


Why do some people solve problems more creatively than others? Are people who think creatively different from those who tend to think in a more methodical fashion?


These questions are part of a long-standing debate, with some researchers arguing that what we call "creative thought" and "noncreative thought" are not basically different. If this is the case, then people who are thought of as creative do not really think in a fundamentally different way from those who are thought of as noncreative. On the other side of this debate, some researchers have argued that creative thought is fundamentally different from other forms of thought. If this is true, then those who tend to think creatively really are somehow different.


A new study led by John Kounios, professor of Psychology at Drexel University, and Mark Jung-Beeman of Northwestern University answers these questions by comparing the brain activity of creative and noncreative problem solvers. The study, published in the journal Neuropsychologia, reveals a distinct pattern of brain activity, even at rest, in people who tend to solve problems with a sudden creative insight - an "Aha! moment" - compared to people who tend to solve problems more methodically.


At the beginning of the study, participants relaxed quietly for seven minutes while their electroencephalograms (EEGs) were recorded to show their brain activity. The participants were not given any task to perform and were told they could think about whatever they wanted to think about. Later, they were asked to solve a series of anagrams - scrambled letters that can be rearranged to form words [MPXAELE = EXAMPLE]. These can be solved by deliberately and methodically trying out different letter combinations, or they can be solved with a sudden insight or "Aha!" in which the solution pops into awareness. After each successful solution, participants indicated in which way the solution had come to them.
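
To make the "methodical" strategy concrete, here is a toy sketch of how an anagram can be solved by systematically trying letter orderings against a word list. It only illustrates the deliberate search strategy described above; the word list and function names are invented for the example.

    # Toy "methodical" anagram solver: try permutations against a word list.
    from itertools import permutations

    WORDS = {"example", "apple", "planet"}   # hypothetical mini dictionary

    def solve_anagram(scrambled, words=WORDS):
        """Return the first dictionary word formed by some ordering of the letters."""
        for perm in permutations(scrambled.lower()):
            candidate = "".join(perm)
            if candidate in words:
                return candidate
        return None

    print(solve_anagram("MPXAELE"))   # -> "example"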


The participants were then divided into two groups - those who reported solving the problems mostly by sudden insight, and those who reported solving the problems more methodically - and resting-state brain activity for these groups was compared. As predicted, the two groups displayed strikingly different patterns of brain activity during the resting period at the beginning of the experiment - before they knew that they would have to solve problems or even knew what the study was about.


One difference was that the creative solvers exhibited greater activity in several regions of the right hemisphere. Previous research has suggested that the right hemisphere of the brain plays a special role in solving problems with creative insight, likely due to right-hemisphere involvement in the processing of loose or "remote" associations between the elements of a problem, which is understood to be an important component of creative thought. The current study shows that greater right-hemisphere activity occurs even during a "resting" state in those with a tendency to solve problems by creative insight. This finding suggests that even the spontaneous thought of creative individuals, such as in their daydreams, contains more remote associations.


Second, creative and methodical solvers exhibited different activity in areas of the brain that process visual information. The pattern of "alpha" and "beta" brainwaves in creative solvers was consistent with diffuse rather than focused visual attention. This may allow creative individuals to broadly sample the environment for experiences that can trigger remote associations to produce an Aha! Moment.


For example, a glimpse of an advertisement on a billboard or a word spoken in an overheard conversation could spark an association that leads to a solution. In contrast, the more focused attention of methodical solvers reduces their distractibility, allowing them to effectively solve problems for which the solution strategy is already known, as would be the case for balancing a checkbook or baking a cake using a known recipe.
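
For readers curious how the resting-state "alpha" and "beta" activity mentioned above is typically quantified, the sketch below estimates band power from an EEG trace with a standard spectral method. It is a generic, hypothetical illustration using conventional band limits (8-12 Hz for alpha, 13-30 Hz for beta), not the analysis pipeline used in the study.

    # Generic resting-state band-power estimate for one EEG channel.
    import numpy as np
    from scipy.signal import welch

    def band_power(signal, fs, low, high):
        """Integrate the power spectral density between low and high (Hz)."""
        freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
        mask = (freqs >= low) & (freqs <= high)
        return np.trapz(psd[mask], freqs[mask])

    fs = 256                                  # assumed sampling rate, Hz
    eeg = np.random.randn(fs * 60)            # placeholder for 60 s of resting EEG
    alpha = band_power(eeg, fs, 8, 12)        # conventional alpha band
    beta = band_power(eeg, fs, 13, 30)        # conventional beta band
    print(alpha, beta, alpha / beta)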


Thus, the new study shows that basic differences in brain activity between creative and methodical problem solvers exist and are evident even when these individuals are not working on a problem. According to Kounios, "Problem solving, whether creative or methodical, doesn't begin from scratch when a person starts to work on a problem. His or her pre-existing brain-state biases a person to use a creative or a methodical strategy."


In addition to contributing to current knowledge about the neural basis of creativity, this study suggests the possible development of new brain imaging techniques for assessing potential for creative thought, and for assessing the effectiveness of methods for training individuals to think creatively.


Journal reference: Kounios, J., Fleck, J.I., Green, D.L., Payne, L., Stevenson, J.L., Bowden, E.M., & Jung-Beeman, M. The origins of insight in resting-state brain activity, Neuropsychologia (2007), doi:10.1016/j.neuropsychologia.2007.07.013


See also:


Jung-Beeman, M., Bowden, E.M., Haberman, J., Frymiare, J.L., Arambel-Liu, S., Greenblatt, R., Reber, P.J., & Kounios, J. (2004). Neural activity when people solve verbal problems with insight. PLoS Biology, 2, 500-510.


Kounios, J., Frymiare, J.L., Bowden, E.M., Fleck, J.I., Subramaniam, K., Parrish, T.B., & Jung-Beeman, M.J. (2006). The prepared mind: Neural activity prior to problem presentation predicts subsequent solution by sudden insight. Psychological Science, 17, 882-890.


Source: Drexel University, via www.sciencedaily.com





Alternative fuel: If Corn Is Biofuels King!


Researchers are finding new sources of energy - alternative fuels.


When University of Illinois crop scientist Fred Below began growing tropical maize, the form of corn grown in the tropics, he was looking for novel genes for the utilization of nitrogen fertilizer and was hoping to discover information that could be useful to American corn producers.


Now, however, it appears that maize itself may prove to be the ultimate U.S. biofuels crop.


Early research results show that tropical maize, when grown in the Midwest, requires few crop inputs such as nitrogen fertilizer, chiefly because it does not produce any ears. It also is easier for farmers to integrate into their current operations than some other dedicated energy crops because it can be easily rotated with corn or soybeans, and can be planted, cultivated and harvested with the same equipment U.S. farmers already have. Finally, tropical maize stalks are believed to require less processing than corn grain, corn stover, switchgrass, Miscanthus giganteus and the scores of other plants now being studied for biofuel production.


What it does produce, straight from the field with no processing, is 25 percent or more sugar -- mostly sucrose, fructose and glucose.


"Corn is a short-day plant, so when we grow tropical maize here in the Midwest the long summer days delay flowering, which causes the plant to grow very tall and produce few or no ears," says Below. Without ears, these plants concentrate sugars in their stalks, he adds. Those sugars could have a dramatic affect on Midwestern production of ethanol and other biofuels.


According to Below, Midwestern-grown tropical maize easily grows 14 or 15 feet tall, compared to the 7-1/2 foot height that is average for conventional hybrid corn. "It is all in these tall stalks," Below explains. "In our early trials, we are finding that these plants build up to a level of 25 percent or higher of sugar in their stalks."


This differs from conventional corn and other crops being grown for biofuels in that the starch found in corn grain and the cellulose in switchgrass, corn stover and other biofuel crops must be treated with enzymes to convert them into sugars that can then be fermented into alcohols such as ethanol.
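
As a rough illustration of what direct fermentation of those sugars could mean, the sketch below applies standard fermentation stoichiometry (one glucose molecule yields two ethanol molecules, a theoretical maximum of about 0.51 kg of ethanol per kg of hexose sugar) to one tonne of stalks at the 25 percent sugar level quoted above. The figures are illustrative arithmetic, not yields reported by the researchers.

    # Illustrative theoretical ethanol yield from sugar-rich stalks.
    stalk_mass_kg = 1000.0    # one metric ton of stalks (illustrative)
    sugar_fraction = 0.25     # "25 percent or more sugar" from the article
    theoretical_yield = 0.511 # kg ethanol per kg hexose (C6H12O6 -> 2 C2H5OH + 2 CO2)
    ethanol_density = 0.789   # kg per liter

    sugar_kg = stalk_mass_kg * sugar_fraction
    ethanol_kg = sugar_kg * theoretical_yield
    ethanol_liters = ethanol_kg / ethanol_density
    print(round(ethanol_kg), round(ethanol_liters))   # roughly 128 kg, about 160 liters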


Storing simple sugars also is more cost-effective for the plant, because it takes a lot of energy to make the complex starches, proteins, and oils present in corn grain. This energy savings per plant could result in more total energy per acre with tropical maize, since it produces no grain.


"In terms of biofuel production, tropical maize could be considered the 'Sugarcane of the Midwest',"Below said. "The tropical maize we're growing here at the University of Illinois is very lush, very tall, and very full of sugar."


He added that his early trials also show that tropical maize requires much less nitrogen fertilizer than conventional corn, and that the stalks actually accumulate more sugar when less nitrogen is available. Nitrogen fertilizer is one of the major costs of growing corn.


He explained that sugarcane used in Brazil to make ethanol is desirable for the same reason: it produces lots of sugar without a high requirement for nitrogen fertilizer, and this sugar can be fermented to alcohol without the middle steps required by high-starch and cellulosic crops. But sugarcane can't be grown in the Midwest.


The tall stalks of tropical maize are so full of sugar that producers growing it for biofuel production will be able to supply a raw material at least one step closer to being turned into fuel than are ears of corn.


"And growing tropical maize doesn't break the farmers' rotation. You can grow tropical maize for one year and then go back to conventional corn or soybeans in subsequent years," Below said. "Miscanthus, on the other hand, is thought to need a three-year growth cycle between initial planting and harvest and then your land is in Miscanthus. To return to planting corn or soybean necessitates removing the Miscanthus rhizomes.


Below is studying tropical maize along with doctoral candidate Mike Vincent and postdoctoral research associate Matias Ruffo, and in conjunction with U of I Associate Professor Stephen Moose. This latest discovery of high sugar yields from tropical maize became apparent through cooperative work between Below and Moose to characterize genetic variation in response to nitrogen fertilizers.


Currently supported by the National Science Foundation, these studies are a key element to developing maize hybrids with improved nitrogen use efficiency. Both Below and Moose are members of Illinois Maize Breeding and Genetics Laboratory, which has a long history of conducting research that identifies new uses for the maize crop.


Moose now directs one of the longest-running plant genetics experiments in the world, in which more than a century of selective breeding has been applied to alter carbon and nitrogen accumulation in the maize plant. Continued collaboration between Below and Moose will investigate whether materials from these long term selection experiments will further enhance sugar yields from tropical maize.





Research: Dark Matter


We believe that most of the matter in the universe is dark, i.e. cannot be detected from the light which it emits (or fails to emit). This is "stuff" which cannot be seen directly -- so what makes us think that it exists at all? Its presence is inferred indirectly from the motions of astronomical objects, specifically stellar, galactic, and galaxy cluster/supercluster observations. It is also required in order to enable gravity to amplify the small fluctuations in the Cosmic Microwave Background enough to form the large-scale structures that we see in the universe today.


For each of the stellar, galactic, and galaxy cluster/supercluster observations the basic principle is that if we measure velocities in some region, then there has to be enough mass there for gravity to stop all the objects flying apart. When such velocity measurements are done on large scales, it turns out that the amount of inferred mass is much more than can be explained by the luminous stuff. Hence we infer that there is dark matter in the Universe.
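
A simple worked example shows how mass is inferred from measured velocities. For an object on a roughly circular orbit, gravity must supply the centripetal force, so the mass enclosed within the orbit is M = v^2 r / G. The values below (a 220 km/s orbital speed at a radius of 8 kiloparsecs, roughly the Sun's orbit in the Milky Way) are illustrative, not drawn from this article.

    # Enclosed mass inferred from a circular orbital velocity: M = v^2 * r / G.
    G = 6.674e-11                # gravitational constant, m^3 kg^-1 s^-2
    M_sun = 1.989e30             # solar mass, kg
    kpc = 3.086e19               # meters per kiloparsec

    v = 220e3                    # illustrative orbital speed, m/s
    r = 8 * kpc                  # illustrative orbital radius, m
    M_enclosed = v**2 * r / G
    print(M_enclosed / M_sun)    # ~9e10 solar masses; applying the same reasoning at
                                 # larger radii, where rotation speeds stay flat, yields
                                 # far more mass than the visible stars and gas supply.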


Dark matter has important consequences for the evolution of the Universe and the structure within it. According to general relativity, the Universe must conform to one of three possible types: open, flat, or closed. The total amount of mass and energy in the universe determines which of the three possibilities applies to the Universe. In the case of an open Universe, the total mass and energy density (denoted by the greek letter Omega) is less than unity. If the Universe is closed, Omega is greater than unity. For the case where Omega is exactly equal to one the Universe is "flat".


Note that the dynamics of the Universe are not determined entirely by the geometry (open, closed or flat) unless the Universe contains only matter. In our Universe, where most of Omega comes from dark energy, the relation between mass density, spatial curvature and the future of the universe no longer holds; it is no longer true that "geometry (spatial curvature) is destiny." Instead, one needs to calculate the evolution of the expansion factor of the universe for the specific combination of matter density, spatial curvature and "funny energy" to find out what will happen.
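
In equation form (standard Friedmann-Lemaitre cosmology, included only to make the Omega argument above explicit), the expansion rate H, the total density and the spatial curvature k are related by

    H^2 = \frac{8\pi G}{3}\,\rho_{\mathrm{tot}} - \frac{k c^2}{a^2},
    \qquad
    \Omega \equiv \frac{\rho_{\mathrm{tot}}}{\rho_{\mathrm{crit}}},
    \qquad
    \rho_{\mathrm{crit}} = \frac{3 H^2}{8\pi G},

so that \Omega - 1 = k c^2 / (a^2 H^2): \Omega > 1 gives a closed universe, \Omega = 1 a flat one, and \Omega < 1 an open one. The acceleration equation, \ddot{a}/a = -\frac{4\pi G}{3}\left(\rho + 3p/c^2\right), is what breaks the link between geometry and fate: a dark-energy component with strongly negative pressure can make even a flat or open universe accelerate, so curvature alone no longer determines the future.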



Dark matter was invoked to explain how galaxies stick together. The visible matter alone in galaxies - stars, gas and dust - is nowhere near enough to hold them together, so scientists reasoned there must be something invisible that exerts gravity and is central to all galaxies.


Last August, an astronomer at the University of Arizona at Tucson and his colleagues reported that a collision between two huge clusters of galaxies 3 billion light-years away, known as the Bullet Cluster, had caused clouds of dark matter to separate from normal matter.


Many scientists said the observations were proof of dark matter's existence and a serious blow for alternative explanations aiming to do away with dark matter with modified theories of gravity.





Find in Virtualization: Virtualization Delivers IT and Business Benefits for SMBs

Virtualization is a technology that can benefit anyone who uses a computer, from IT professionals to commercial businesses and government organizations. Join the millions of people around the world who use virtualization to save time, money and energy while achieving more with the computer hardware they already own.


Virtualization Delivers IT and Business Benefits for SMBs.


Virtualization leader VMware has been a pioneer in both enterprise and SMB virtualization deployments, particularly in server virtualization. In June 2006, VMware launched its VMware Infrastructure 3 bundled solutions, packaging together the company's older and newer product functionality into several easily digestible bundles. This initiative included some entry-level pricing designed to put virtualization in reach of a wider array of enterprises and, notably, SMB customers grappling with various IT management and disaster recovery issues. This paper profiles the virtualization experience of two VMware SMB-size customers and provides a window into the "real-life" impact of virtualization for IT managers juggling the dual priorities of remaining competitive and keeping costs in check.


Guide to Virtual Infrastructure Implementation.


The decision to implement a virtual infrastructure within your enterprise is a smart one that will provide numerous financial and operational benefits to your organization. There are many options and directions that the roadmap to virtualization can take, depending on the unique needs of your organization. This paper provides a practical implementation strategy with a phased approach to virtualization that will enable you to experience success and provide a solid foundation from which to expand.





