
Sunday, November 11, 2007

Heart Health Affected By Past Discrimination And Personality, Study Shows


People who previously experienced discrimination -- especially optimistic and trusting people -- suffer larger jumps in blood pressure when performing a stressful task such as talking about a situation that made them angry, according to a new study.


The results were similar for African American and white participants in the study, although blacks who had experienced discrimination had a larger surge in blood pressure during the stressful task. This may help explain why blacks have higher rates of cardiovascular diseases such as heart attacks, hypertension and strokes, said lead investigator Laura Smart Richman, an assistant research professor in Duke University's Department of Psychology and Neuroscience.


"These results are consistent with discrimination being a chronic stressor that is related to acute stress responses, particularly for blacks," Richman said. "It also may help to explain why people who experience more discrimination in their lives tend to have worse health outcomes.


"It's being understood more and more that discrimination may be an important contributor to racial health disparities," said Richman, one of a growing number of researchers bridging the fields of social psychology and health psychology to better understand the mind-body connection.


The findings are in the November issue of Health Psychology, a publication of the American Psychological Association. The study was funded by a grant from the National Heart, Lung and Blood Institute.


The study's co-authors are Gary G. Bennett of the Harvard School of Public Health, Jolynn Pek of the University of North Carolina at Chapel Hill, and Ilene Siegler and Redford B. Williams of Duke.


Discrimination and stress


In order to determine what long-term effects, if any, past discrimination might have on cardiovascular reactions to new, stressful experiences, Richman's team had participants report on the frequency of their past exposure to discrimination.


Participants were then exposed to a new, unrelated stressor: they were asked to recall a time when they were angry. Heart rate and blood pressure were monitored before, during and after the recall task. Researchers also calculated the time it took heart rate and blood pressure to return to normal, resting rates.
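As a rough illustration of how such reactivity and recovery measures can be computed from a monitored blood-pressure series, here is a minimal sketch. It is not the study's analysis code; the readings, sampling interval and recovery tolerance below are invented for the example.

```python
# Illustrative sketch (not the study's actual analysis): compute the rise in
# blood pressure during a stressor and the time it takes to return to baseline.

def reactivity_and_recovery(times, bp, task_start, task_end, tolerance=2.0):
    """times: seconds; bp: systolic readings; the task window marks the recall task.
    tolerance: how close (mmHg) a reading must be to baseline to count as recovered."""
    pre_task = [v for t, v in zip(times, bp) if t < task_start]
    baseline = sum(pre_task) / len(pre_task)
    peak = max(v for t, v in zip(times, bp) if task_start <= t <= task_end)
    reactivity = peak - baseline                      # blood-pressure surge
    recovery_time = None
    for t, v in zip(times, bp):
        if t > task_end and abs(v - baseline) <= tolerance:
            recovery_time = t - task_end              # seconds back to baseline
            break
    return reactivity, recovery_time

# Example: readings every 30 s; the anger-recall task runs from 120 s to 300 s.
times = list(range(0, 601, 30))
bp = [118, 119, 117, 118, 132, 138, 140, 137, 134, 128, 124,
      121, 119, 118, 118, 117, 118, 118, 117, 118, 118]
print(reactivity_and_recovery(times, bp, task_start=120, task_end=300))
```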


Participants were Durham, N.C., residents, and included blacks, whites, men and women of high and low socioeconomic status in roughly equal proportions. The sample included 71 whites (39 men, 32 women) and 94 blacks (52 men, 42 women).


The data showed broadly similar cardiovascular reactions to stress for all participants in the study, regardless of race, gender or socioeconomic status, but the blood pressure of blacks rose more than that of whites. "Their blood pressure rose more during the stressor and took longer to get back to their pre-stress baseline," the study said.


The study also found that stress reactions were larger, and recovery times longer, in whites and blacks who had experienced high levels of past discrimination and had either high optimism or low cynicism. A high level of cynicism, on the other hand, was seen to be protective.


Previous research has demonstrated the benefits of optimism to overall health -- "optimism has long been found to serve as a buffer against the negative effects of stressful events and to generally be protective for health," Richman said -- but this study found that optimism may not be protective of cardiovascular health when combined with a previous history of discrimination.


"It could be the result of defying expectations," Richman said. "Optimistic people expect others to treat them well, and when this is not the case it results in more distress and larger blood-pressure surges when recalling a situation that made them angry."


Researchers believe this finding reflects how personality and individual views about the world shape our expectations.


"Those who are highly cynical and thus already have negative, untrusting attitudes and beliefs may expect life to be unfair to some extent. Thus, when discriminatory events do occur, it may not be as stressful an experience as for those who were relatively less cynical," the study said.


Most previous research on cardiovascular outcomes has sampled only white males of middle to high socioeconomic status, Richman said.


"That's what's unique. It was intentionally designed to have blacks and whites of both low and high socioeconomic status," Richman said. "Usually you don't see samples like that."


Richman said the study reinforces the view that discrimination has real effects on health.


"Societal discrimination can affect differences in housing and education as well as differences in treatment in the health care system and other things," she said. "But taking it on a more micro level, we are seeing that everyday chronic experiences can ultimately have a negative effect on cardiovascular outcomes because of the acute blood-pressure surges during stress."


Richman cautioned that though the findings suggest that blacks who are more trusting and optimistic may be more adversely affected by experiences of discrimination, "these findings do not imply that intervention strategies should be designed to reduce blacks' optimism or to increase their cynicism."


Instead, there is a need to better understand the underlying mechanisms that explain how discrimination has an impact on health outcomes.





Large Hadron Collider Ready To Go



Interconnections for the LHC's cryogenic system include more than 40 000 leak-tight welds.


CERN Director General Robert Aymar sealed the last interconnect in the world's largest cryogenic system, the Large Hadron Collider (LHC). This is the latest milestone in commissioning the LHC, the world's most powerful particle accelerator.



The LHC's cryogenic system has the task of cooling some 36 800 tonnes of material to a temperature of just 1.9 degrees above absolute zero (-271.3°C), colder than outer space. To do this, over 10 000 tonnes of liquid nitrogen and 130 tonnes of liquid helium will be deployed through a cryogenic system including over 40 000 leak-tight welds. Today's ceremony marks the end of a two-year programme of work to connect all the main dipole and quadrupole magnets in the LHC. This complex task included both electrical and fluid connections.
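As a quick check on the temperature quoted above (a throwaway sketch, not anything from CERN), 1.9 kelvin above absolute zero does indeed come out near the -271.3°C figure:

```python
# Consistency check on the quoted operating temperature of the LHC magnets.
absolute_zero_c = -273.15        # absolute zero in degrees Celsius
operating_temp_k = 1.9           # kelvin, i.e. 1.9 degrees above absolute zero
print(f"{absolute_zero_c + operating_temp_k:.2f} C")   # -271.25 C, quoted as -271.3 C
```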


"This is a huge accomplishment," said Lyn Evans, LHC project leader. "Now that it is done, we can concentrate on getting the machine cold and ready for physics."


The LHC is a circular machine, 27 kilometres around and divided into eight sectors, each of which can be cooled down to its operating temperature of 1.9 degrees above absolute zero and powered up individually. One sector was cooled down, powered and warmed up in the first half of 2007. This was an important learning process, allowing subsequent sectors to be tested more quickly.


"Over the coming months, we'll be cooling down the remaining sectors," said Evans. "Five sectors will be cooling by the end of 2007, with the remaining three joining them early next year."


If all goes well, the first beams could be injected into the LHC in May 2008, and circulating beams established by June or July. With a project of this scale and complexity, however, the transition from construction to operation is a lengthy process.


"There is no big red button, and there are inevitably hurdles to be overcome as we bring the LHC into operation," said Aymar, "Every part of the system has to be brought on stream carefully, with each sub-system and component tested and repaired if necessary."


"There have been no show-stoppers so far," added Evans. "For a machine of this complexity, things are going remarkably smoothly and we're all looking forward to doing physics with the LHC next summer. If for any reason we have to warm up a sector, though," he cautioned, "we'll be looking at the end of summer rather than the beginning."





Space Mission Xeus Probes Origins Of The Universe


A new mission seeks to study the origins of the universe. Professor Martin Turner of the Department of Physics and Astronomy is Co-Principal Investigator on XEUS - a next-generation X-ray space observatory.


XEUS, which stands for X-ray Evolving Universe Spectroscopy, aims to study the fundamental laws of the Universe. With unprecedented sensitivity to the hot, million-degree universe, XEUS will explore key areas of contemporary astrophysics: growth of supermassive black holes, cosmic feedback and galaxy evolution, evolution of large-scale structures, extreme gravity and matter under extreme conditions, the dynamical evolution of cosmic plasmas and cosmic chemistry.


Professor Turner is also Chair of the XEUS International Steering committee. He said: "XEUS is an X-ray observatory 30-50 times more sensitive than XMM-Newton, which will be placed 1.5 million km from Earth, beyond the Moon, at the second Lagrangian point, a quiet stable location where the instruments can observe the universe undisturbed.


"Because it is so large, the observatory has two spacecraft. The five-metre diameter X-ray lens is in one, and the instruments in another. The two spacecraft fly together, 35 metres apart, to keep the instruments at the focus of the lens.


"XEUS has been selected for study by ESA as part of its Cosmic Vision programme. If the study outcome is successful it will be launched on Ariane 5 from Kourou in 2018.


"We have been developing the XEUS concept for an advanced X-ray observatory, for many years. This acceptance by ESA is a major step forward for X-ray astronomers all over the world."


"The million degree universe, where gravity is the main source of energy, is the finest physics laboratory we have. XEUS will help us find out about the behaviour of matter under extreme conditions of temperature, pressure, and gravity. It will also let us study the influence of black holes on the formation of galaxies and stars; and ultimately planets and ourselves."


Dr Richard Willingale of the University of Leicester, chairman of the XEUS telescope working group, said:


"XEUS will use new lightweight silicon optics to make the lens, the same material used to make silicon chips; one of the instruments has sensors cooled to within a tiny fraction of absolute zero to study the chemistry and physics of matter surrounding black holes."


Various international space agencies have expressed interest in cooperating on XEUS, and discussions will start by the end of the year to ensure their earliest involvement in the study work.


All the candidate missions are now competing in an assessment cycle which ends in 2011. Before the end of the cycle, there will be an important selection foreseen in 2009. At the end of this process, two missions will be proposed for implementation to ESA's Science Programme Committee, with launches planned for 2017 and 2018 respectively.


The selected missions fit well within the themes of ESA's Cosmic Vision 2015-2025 plan. The themes range from the conditions for life and planetary formation, to the origin and formation of the Solar System, the fundamental laws of our cosmos and the origin, structure and evolution of the Universe.


"The maturity of most of the proposals received demonstrates the excellence of the scientific community in Europe. This made the task of the SSAC very difficult but we believe that the set of selected missions will shape the future of European space science," said Tilman Spohn, chairperson of the SSAC (German Aerospace Center, Berlin). "The next decade will indeed be very exciting for the scientific exploration of space."


According to the chair of the Astronomy Working Group (AWG), Tommaso Maccacaro, (INAF - Osservatorio Astronomico di Brera) "The chosen candidates for astronomy missions show very promising and broad scientific return and have received excellent recommendations also from external referees."


"Technical feasibility and potential for successful cooperation with other agencies are two factors which are clearly evident in the Solar System missions that have been chosen," added Nick Thomas at the Physikalisches Institut, Universität Bern, chair of the Solar System Working Group.


In 2004, Professor Turner was honoured with a CBE for services to X-ray astronomy. Paying tribute to his colleague, Professor George Fraser, Director of the Space Research Centre, said at the time: "The award of a CBE to Martin Turner is very well-deserved recognition of a tremendous contribution to the field of X-ray Astronomy in a career of over thirty years here at Leicester. Martin has, perhaps uniquely, led the development of three major instruments in the field - launched on the EXOSAT (1983), Ginga (1987) and XMM-Newton (1999) satellites - of which he is Principal Investigator. The last of these - the EPIC camera - has now performed flawlessly in orbit for four years. Martin, nothing daunted, is also heavily involved in the initial design stages of the successor to XMM, a giant European observatory called XEUS."







Internet control by U.S. promises to be hot topic at U.N. forum



When hundreds of technology experts from around the world gather in Brazil this week to hammer out the future of the Internet, the hottest issue won't be spam, phishing or any of the other phenomena that bedevil users everywhere.


Instead, ending U.S. control over what's become a global network will be at the top of the agenda for many of the more than 2,000 participants expected at the United Nations Internet Governance Forum, which begins Monday.


With the Internet now dominating nearly every aspect of modern life, U.S. control of the medium has become a sensitive topic worldwide. In nations that try to control what people can see and hear, the Internet often is the only source of uncensored news and opinion.


U.S. officials say that keeping Internet functions under their control has protected that free flow of information.


Issue of control


Yet to many foreign-government officials and technology gurus, the United States has too much control over a tool that's used by more than 1.4 billion people worldwide. Brazil, China and other countries have proposed transferring oversight to an international body.


"The Internet has become an everyday instrument of particular importance for the entire world, yet it's still under the control of one country," said Rogerio Santanna, Brazil's secretary of logistics and information technology.


Others worry, however, that transferring the administration of the Internet to the United Nations or another international body would make it vulnerable to censorship, especially by powerful countries such as China.


The most dramatic example of Internet censorship happened recently in Myanmar, when the ruling military junta cut Internet connections to stop dissident blogs and other sites that had distributed information about government repression during the pro-democracy protests crushed in September.


China is routinely criticized for its censorship policies and its use of information gleaned from Internet providers to crack down on dissidents.


Even Brazil has inspired Internet privacy debates by demanding that U.S. technology giant Google hand over information about users who are suspected of posting child pornography and other offensive material on its social-networking site Orkut.


"Should the U.N. gain control of the Internet," the conservative U.S. research center the Heritage Foundation wrote on its Web site, "it would give meddlesome governments the opportunity to censor and regulate the medium until its usefulness as a vehicle for freedom of expression and international competition is crippled."


Debate for years


Such debates have dominated technology circles for years and are the spark for this week's meeting, the second of five such global forums organized by the United Nations.


The Brazil forum will feature panels on other key issues such as blocking online child pornography, expanding Internet access in less developed countries and an array of technical matters.


Yet the fight over U.S. control promises to take center stage.


The forum, which was organized partly as a response to international debate about the issue, can't make binding decisions, but it can lay the foundation for policy changes.


At the heart of the controversy is a nonprofit company based in Marina del Rey, Calif., called the Internet Corporation for Assigned Names and Numbers (ICANN), which the U.S. government has contracted to help it manage key Internet functions.


Those include regulating Web sites with popular domain names such as .com and .org and creating new top-level domain names. Critics say the arrangement makes internationally popular Web sites subject to U.S. policy on everything from user privacy to obscenity.


Through ICANN, the United States also assigns Internet protocol addresses, which identify computers, routers and other electronic devices worldwide.
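For readers unfamiliar with what these "names and numbers" are, the sketch below shows an ordinary lookup from a domain name to the Internet protocol addresses behind it, which is the kind of mapping whose top level ICANN coordinates. The hostname is just an example, not one drawn from the article.

```python
# A name-to-address lookup: the domain name system that ICANN helps coordinate
# ultimately resolves a human-readable name to IP addresses like this.
import socket

hostname = "www.example.org"                      # illustrative hostname
addresses = {info[4][0] for info in socket.getaddrinfo(hostname, 80)}
print(hostname, "resolves to", addresses)
```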


Best for users?


U.S. government officials also argue that keeping the Internet under centralized control is best for users.


But for many critics of the ICANN system, the main problem is the organization's perceived lack of transparency. They say ICANN shuts out the public when it makes key decisions, such as when it nominates board members, and lacks accountability to the Internet users it serves.


"We feel there would be a very healthy check and balances if there was something independent of the United States and ICANN to oversee the system," said Syracuse University professor Milton Mueller, who's part of the academic policy group the Internet Governance Project.


"As long as the United States holds on to its control, there will always be questions about the system's transparency."


For Santanna, however, the need for change has been clear in the dispute with Google, which could make news while the U.N. forum is under way.


Google has said it can't meet Brazil's demands for user information because its servers are in the United States and are subject to U.S. privacy laws.


After months of dispute, both sides will meet this week in an attempt to reach agreement.


Santanna said the long dispute could have been avoided if there'd been an international body that, in addition to managing the system's technical functions, could resolve such cross-border controversies.






'Electronic Nose' Could Detect Hazards





Professor Harry Tuller, left, leads a team that has found a way to print useful devices, like gas sensors, from inkjet printers. At far right is Woochul Jung, a graduate student in materials science and engineering, and at center is Amy Leung, a sophomore in chemical engineering; between them is the printing device.

A tiny "electronic nose" that MIT researchers have engineered with a novel inkjet printing method could be used to detect hazards including carbon monoxide, harmful industrial solvents and explosives.


Led by MIT professor Harry Tuller, the researchers have devised a way to print thin sensor films onto a microchip, a process that could eventually allow for mass production of highly sensitive gas detectors.


"Mass production would be an enormous breakthrough for this kind of gas sensing technology," said Tuller, a professor in the Department of Materials Science and Engineering (MSE), who is presenting the research Oct. 30 at the Composites at Lake Louise Conference in Alberta, Canada.


The prototype sensor, created by Tuller, postdoctoral fellow Kathy Sahner and graduate student Woo Chul Jung, members of MIT's Electroceramics Group in MSE, consists of thin layers of hollow spheres made of the ceramic material barium carbonate, which can detect a range of gases. Using a specialized inkjet print head, tiny droplets of barium carbonate or other gas-sensitive materials can be rapidly deposited onto a surface, in any pattern the researchers design.


The miniature, low-cost detector could be used in a variety of settings, from an industrial workplace to an air-conditioning system to a car's exhaust system, according to Tuller. "There are many reasons why it's important to monitor our chemical environment," he said.


For a sensor to be useful, it must be able to distinguish between gases. For example, a sensor at an airport would need to know the difference between a toxic chemical and perfume, Tuller said. To achieve this, sensors should have an array of films that each respond differently to different gases. This is similar to the way the human sense of smell works, Tuller explained.


"The way we distinguish between coffee's and fish's odor is not that we have one sensor designed to detect coffee and one designed to detect fish, but our nose contains arrays of sensors sensitive to various chemicals. Over time, we train ourselves to know that a certain distribution of vapors corresponds to coffee," he said.


In previous work designed to detect nitrogen oxide (NOx) emissions from diesel exhaust, the researchers created sensors consisting of flat, thin layers of barium carbonate deposited on quartz chips. However, the films were not sensitive enough, and the team decided they needed more porous films with a larger surface area.


To create more texture, they applied the barium carbonate to a layer of microspheres, hollow balls less than a micrometer in diameter made of a plastic polymer. When the microspheres are burned away, a textured, highly porous layer of gas-sensitive film is left behind.


The resulting film, tens of nanometers (billionths of a meter) thick, is much more sensitive than flat films because it allows the gas to readily permeate through the film and interact with a much larger active surface area.


At first, the researchers used a pipette to deposit the barium carbonate and microspheres. However, this process proved time-consuming and difficult to control.


To improve production efficiency, the researchers took advantage of a programmable Hewlett-Packard inkjet print head located in the MIT Laboratory of Organic Optics and Electronics. The inkjet print head, like that in a regular inkjet printer, can deposit materials very quickly and controllably. The special gas-sensitive "inks" used in this work were optimized for printing by Amy Leung, an MIT sophomore in chemical engineering.


This allows the researchers to rapidly produce many small, identical chips containing geometrically well-defined gas-sensing films with micrometer dimensions. Patterns of different gas-sensitive inks, just as in a color printer, can be easily generated to form arrays with very little ink required per sensor.


In future studies, the team hopes to create large arrays of gas-sensitive films with controlled three-dimensional shapes and morphologies.


Electronic nose


An electronic nose (e-nose) is a device that identifies the specific components of an odor and analyzes its chemical makeup to identify it. An electronic nose consists of a mechanism for chemical detection, such as an array of electronic sensors, and a mechanism for pattern recognition, such as a neural network. Electronic noses have been around for several years but have typically been large and expensive. Current research is focused on making the devices smaller, less expensive, and more sensitive. The smallest version, a nose-on-a-chip is a single computer chip containing both the sensors and the processing components.
An odor is composed of molecules, each of which has a specific size and shape. Each of these molecules has a correspondingly sized and shaped receptor in the human nose. When a specific receptor receives a molecule, it sends a signal to the brain and the brain identifies the smell associated with that particular molecule. Electronic noses based on the biological model work in a similar manner, albeit substituting sensors for the receptors, and transmitting the signal to a program for processing, rather than to the brain. Electronic noses are one example of a growing research area called biomimetics, or biomimicry, which involves human-made applications patterned on natural phenomena.
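To make the array-plus-pattern-recognition idea concrete, here is a toy sketch in the spirit described above. It is not the MIT group's software; real systems typically train a neural network rather than matching against hand-written fingerprints, and every number and odour name below is invented.

```python
# Toy e-nose classifier: each odour leaves a characteristic pattern across an
# array of gas-sensitive films; a new reading is matched to the closest stored
# fingerprint. Values and odour names are made up for illustration.
import math

fingerprints = {
    "coffee":  [0.9, 0.2, 0.5, 0.1],   # responses of four hypothetical films
    "fish":    [0.1, 0.8, 0.3, 0.6],
    "solvent": [0.4, 0.4, 0.9, 0.7],
}

def classify(reading):
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(fingerprints, key=lambda odour: distance(reading, fingerprints[odour]))

print(classify([0.85, 0.25, 0.45, 0.15]))   # -> "coffee"
```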


Electronic noses were originally used for quality control applications in the food, beverage and cosmetics industries. Current applications include detection of odors specific to diseases for medical diagnosis, and detection of pollutants and gas leaks for environmental protection.








Seaweed Transformed Into Stem Cell Technology


The new stem cell scaffold. Circled in black: a microbead degrades to release neural stem cells. Circled in white: a separate microbead releases alginate lyase that will break down the outer layer of the scaffold, releasing stem cells into the body.


Engineers at Rensselaer Polytechnic Institute have transformed a polymer found in common brown seaweed into a device that can support the growth and release of stem cells at the site of a bodily injury or at the source of a disease.


The findings mark an important step in efforts to develop new medical therapies using stem cells.


"We have developed a scaffold for stem cell culture that can degrade in the body at a controlled rate," said lead researcher Ravi Kane, professor of chemical and biological engineering. "With this level of control we can foster the growth of stem cells in the scaffold and direct how, when, and where we want them to be released in the body."


Kane and his collaborators, who include former Rensselaer graduate student Randolph Ashton, an author of the paper, created the device from a material known as alginate. Alginate is a complex carbohydrate found naturally in brown seaweed. When mixed with calcium, alginate gels into a rigid, three-dimensional mesh.


The device could have wide-ranging potential for use in regenerative medicine, Kane explains. For example, the scaffolds could one day be used in the human body to release stem cells directly into injured tissue. Kane and his colleagues hope that the scaffold could eventually be used for medical therapies such as releasing healthy bone stem cells right at the site of a broken bone, or releasing neural stem cells in the brain where cells have been killed by diseases such as Alzheimer's.


Kane and his team encapsulated healthy neural stem cells in the alginate mesh, producing a three-dimensional scaffold that degrades at a tunable, controlled rate. Once the scaffold is implanted in the body, the researchers use an enzyme called alginate lyase, which eats away at alginate, to release the stem cells. Alginate lyase is naturally produced in some marine animals and bacterial strains, but not in humans.


In order to control the degradation of the alginate scaffold, the researchers encapsulated varying amounts of alginate lyase into microscale beads, called microspheres. The microspheres containing the alginate lyase were then encapsulated into the larger alginate scaffolds along with the stem cells. As the microspheres degraded, the alginate lyase enzyme was released into the larger alginate scaffold and slowly began to eat away at its surface, releasing the healthy stem cells in a controlled fashion.
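The two-stage release described above can be pictured with a very simple rate model. The sketch below is purely illustrative: the rate constants, time scale and linear kinetics are assumptions for the example, not the Rensselaer team's model.

```python
# Toy model of the two-stage release described above: microspheres release
# alginate lyase, and the free enzyme then erodes the outer alginate scaffold,
# freeing the encapsulated stem cells. Rate constants are illustrative only.
k_release = 0.05    # fraction of remaining microsphere enzyme released per hour
k_erosion = 0.02    # scaffold fraction degraded per hour per unit of free enzyme

enzyme_in_spheres, free_enzyme, scaffold = 1.0, 0.0, 1.0
for hour in range(0, 201):
    if hour % 50 == 0:
        print(f"t={hour:3d} h  scaffold intact: {scaffold:.2f}")
    released = k_release * enzyme_in_spheres
    enzyme_in_spheres -= released
    free_enzyme += released
    scaffold = max(0.0, scaffold - k_erosion * free_enzyme * scaffold)
```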


The microspheres also can be filled with more than just alginate lyase. "We can add drug molecules or proteins to the microspheres along with the alginate lyase that, when released into the larger alginate scaffold, could influence the fate of the encapsulated stem cells," Kane said. "By adding these materials to the larger scaffold, we can direct the stem cells to become the type of mature, differentiated cell that we desire, such as a neuron. This will prove very valuable for applications of stem cells in regenerative medicine."









New Prosthetic Devices Will Convert Brain Signals Into Action


Lakshminarayan Srinivasan is part of a team developing a standardized mathematical framework to help neural prostheses work better.


MIT researchers have developed a new algorithm to help create prosthetic devices that convert brain signals into action in patients who have been paralyzed or had limbs amputated.


The technique, described in the October edition of the Journal of Neurophysiology, unifies seemingly disparate approaches taken by experimental groups that prototype these neural prosthetic devices in animals or humans.


"The work represents an important advance in our understanding of how to construct algorithms in neural prosthetic devices for people who cannot move to act or speak," said Lakshminarayan "Ram" Srinivasan, lead author of the paper.


Srinivasan, currently a postdoctoral researcher at the Center for Nervous System Repair at Massachusetts General Hospital and a medical student in the Harvard-MIT Division of Health Sciences and Technology, began working on the algorithm while a graduate student in MIT's Department of Electrical Engineering and Computer Science.


Trauma and disease can lead to paralysis or amputation, reducing the ability to move or talk despite the capacity to think and form intentions. In spinal cord injuries, strokes, and diseases such as amyotrophic lateral sclerosis (Lou Gehrig's disease), the neurons that carry commands from the brain to muscle can be injured. In amputation, both nerves and muscle are lost.


Neural prosthetic devices represent an engineer's approach to treating paralysis and amputation. Here, electronics are used to monitor the neural signals that reflect an individual's intentions for the prosthesis or computer they are trying to use. Algorithms form the link between neural signals that are recorded and the user's intentions that are decoded to drive the prosthetic device.


Over the past decade, efforts at prototyping these devices have divided along various boundaries related to brain regions, recording modalities, and applications. The MIT technique provides a common framework that underlies all these various efforts.


The research uses a method called graphical models that has been widely applied to problems in computer science like speech-to-text or automated video analysis. The graphical models used by the MIT team are diagrams composed of circles and arrows that represent how neural activity results from a person's intentions for the prosthetic device they are using.


The diagrams represent the mathematical relationship between the person's intentions and the neural manifestation of that intention, whether the intention is measured by electroencephalography (EEG), intracranial electrode arrays or optical imaging. These signals could come from a number of brain regions, including cortical or subcortical structures.


Until now, researchers working on brain prosthetics have used different algorithms depending on what method they were using to measure brain activity. The new model is applicable no matter what measurement technique is used, according to Srinivasan. "We don't need to reinvent a new paradigm for each modality or brain region," he said.
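One familiar member of the graphical-model family described above is a linear-Gaussian state-space model decoded with a Kalman filter. The sketch below shows what a single decoding step of that generic kind looks like; it is not the algorithm from the Journal of Neurophysiology paper, and every matrix, dimension and number is invented for illustration.

```python
# Minimal Kalman-filter decoder: neural activity y_t is modelled as a noisy
# linear function of the user's intended 2-D cursor velocity x_t, and each new
# observation updates the estimate of that intention. Matrices are illustrative.
import numpy as np

A = np.eye(2) * 0.95            # intention dynamics (assumed smooth)
Q = np.eye(2) * 0.01            # process noise
C = np.array([[1.0, 0.2],       # tuning: how 3 recorded channels reflect intention
              [0.1, 0.9],
              [0.5, 0.5]])
R = np.eye(3) * 0.1             # observation noise

x, P = np.zeros(2), np.eye(2)   # initial intention estimate and covariance

def decode_step(y):
    """One predict/update cycle given a new vector of neural observations y."""
    global x, P
    x_pred, P_pred = A @ x, A @ P @ A.T + Q
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)         # Kalman gain
    x = x_pred + K @ (y - C @ x_pred)
    P = (np.eye(2) - K @ C) @ P_pred
    return x

print(decode_step(np.array([0.6, 0.3, 0.4])))   # decoded 2-D intention estimate
```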


Srinivasan is quick to underscore that many challenges remain in designing neural prosthetic algorithms before they are available for people to use. While the algorithm is unifying, it is not universal: the algorithm consolidates multiple avenues of development in prostheses, but it isn't the final and only approach these researchers expect to see in the years to come. Moreover, energy efficiency and robustness are key considerations for any portable, implantable bio-electronic device.


Through a better quantitative understanding of how the brain normally controls movement and the mechanisms of disease, he hopes these devices could one day allow for a level of dexterity depicted in movies, such as actor Will Smith's mechanical arm in the movie, "I, Robot."


The gap between existing prototypes and that final goal is wide. Translating an algorithm into a fully functioning clinical device will require a great deal of work, but also represents an intriguing road of scientific and engineering development for the years to come.


Other authors on the paper are Uri Eden, Ph.D., assistant professor in mathematics and statistics at Boston University; Sanjoy Mitter, professor in EECS and MIT's Engineering Systems Division; and Emery Brown, professor in brain and cognitive sciences, HST, and anesthesia and critical care at Massachusetts General Hospital.


This work was sponsored by the National Institutes of Health and the National Science Foundation.






Supercomputers Make Safer Nuclear Reactors


Rensselaer Polytechnic Institute is leading a $3 million research project that will pair two of the world's most powerful supercomputers to boost the safety and reliability of next-generation nuclear power reactors.



The three-year project, funded by the U.S. Department of Energy, will call upon a diverse team of researchers and institutions to create highly detailed computer models of a new proposed type of nuclear reactor. These models could play a key role for the future development of the new reactors, which meet stringent safety and nonproliferation criteria, can burn long-lived and highly radioactive materials, and can operate over a long time without using new fuel.




Running simulations of such a vast virtual model, where scientists can watch the reactor system perform as a whole or zoom in to focus on the interaction of individual molecules, requires unprecedented computing power. To undertake such a task, researchers will use both Rensselaer's Computational Center for Nanotechnology Innovations, or CCNI - the world's seventh most powerful supercomputer - and Brookhaven National Laboratory's New York Blue - the world's fifth most powerful supercomputer.


The research program, titled "Deployment of a Suite of High Performance Computational Tools for Multiscale Multiphysics Simulation of Generation-IV Reactors," is unique in scale as well as its geographic concentration. Along with Rensselaer and Brookhaven, the partnership includes researchers from Columbia University and the State University of New York at Stony Brook, all New York state-based institutions. Another Empire State connection is computer giant IBM, headquartered in New York and the maker of Blue Gene supercomputers. The company developed, designed, and built both CCNI and New York Blue.


Rensselaer nuclear engineering and engineering physics professor Michael Podowski, a world-renowned nuclear engineering and multiphase science and technology expert who also heads Rensselaer's Interdisciplinary Center for Multiphase Research, is project director and principal investigator of the new study.


Podowski said nuclear power is likely to gain traction and become more widespread in the coming decades, as nations seek ways to fulfill their growing energy needs without increasing their greenhouse gas emissions. Nuclear reactors produce no carbon dioxide, Podowski said, which gives this energy source an advantage over coal and other fossil fuels for large-scale electricity production.


The main challenge of nuclear power plants, he said, is that they produce radioactive waste as a byproduct of energy production. But several governments around the world, including the United States, are working tirelessly with universities, research consortia, and the private sector to design and develop new, so-called "fourth generation" nuclear reactors that are safer and produce less waste. These reactors will be necessary in the coming decades as nuclear reactors currently in use reach the end of their life cycle and are gradually decommissioned.


The type of reactor that Podowski's team will be modeling, a sodium-cooled fast reactor, or SFR, is among the most promising of these next-generation designs. The primary advantage of the SFR is its ability to burn highly radioactive nuclear materials, which today's reactors cannot do, Podowski said.


Whereas current reactors source their power from uranium, SFRs can also source their power from fuel that is a mixture of uranium and plutonium. In particular, SFRs will be able to burn both weapons-grade plutonium and pre-existing nuclear waste, Podowski said. Thanks to their high temperatures, SFRs will also produce electricity at higher efficiency than current nuclear reactors.


So along with producing less toxic waste, SFRs should be able to actively help reduce the amount of existing radioactive materials by burning already-spent nuclear waste, he said. SFRs also offer a viable, productive way to start getting rid of the world's stockpile of weapons-grade nuclear fuel.


"The idea is to design reactors that can use this material and that are safe," Podowski said. "With this project, we are trying to improve the understanding of the physics of the system in order to provide the necessary advancements for the design of new, safer, and better reactors."


To expedite this understanding, Podowski's team will construct an incredibly detailed computer model of an SFR. The model will allow researchers to zoom in and watch as individual molecules of fission gas and fuel material interact with other molecules inside the reactor, or zoom out to simulate and test the behavior of the reactor as a whole. Creating such a model, not to mention running hundreds or thousands of simulations with slightly modified models and conditions, requires a tremendous amount of computing power and would not be possible without the help of supercomputers, Podowski said.


In order to construct the model and run these massive simulations, Podowski's team will develop and deploy a suite of powerful, high-performance software tools capable of performing such a task. Since no one computer code or technology is robust enough to model the wide variety of systems that comprise an SFR, the team will use different computer codes for different parts of the model and then develop new ways of linking those differently coded segments together into a single, cohesive, seamless package.
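In outline, linking differently coded segments usually means exchanging interface data between solvers once per time step. The toy loop below illustrates that coupling pattern only; both "solvers" and all numbers are stand-ins, not part of the project's actual software suite.

```python
# Generic operator-splitting loop that couples two separately coded solvers by
# exchanging data once per time step. Both "solvers" here are simple stand-ins.

def thermal_solver(fuel_temp, coolant_temp, dt):
    """Stand-in for a fuel/thermal code: fuel relaxes toward the coolant."""
    return fuel_temp + dt * 0.5 * (coolant_temp - fuel_temp)

def coolant_solver(coolant_temp, fuel_temp, dt):
    """Stand-in for a sodium-coolant hydraulics code: coolant heated by fuel."""
    return coolant_temp + dt * 0.1 * (fuel_temp - coolant_temp)

fuel, coolant, dt = 900.0, 650.0, 0.1           # illustrative temperatures (C)
for step in range(100):
    fuel = thermal_solver(fuel, coolant, dt)     # pass coolant state to code A
    coolant = coolant_solver(coolant, fuel, dt)  # pass updated fuel state to code B
print(round(fuel, 1), round(coolant, 1))         # the two coupled states converge
```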


The researchers will use simulations to study fuel performance, local core degradation, fuel particle transport, and several other aspects of the SFRs. By better understanding how design and operational issues will affect the reactor at different stages in its life cycle, Podowski said, the new study will help to dramatically improve the design and safety of SFRs long before the first physical prototype is ever built.


"Nuclear reactors are safe, but nothing is perfect," Podowski said. "So the issue is to anticipate what could happen, understand how it could happen, and then take actions to both prevent it from happening and, in the extremely unlikely instance of an accident, be able to mitigate the consequences."


Podowski will lead a team of more than 10 researchers on the three-year project. Rensselaer associate professor Kenneth Jansen, assistant professor Li Liu, and research assistant professor Steven Antal - all of the Department of Mechanical, Aerospace, and Nuclear Engineering - are listed as co-PIs and will contribute to the study. Podowski said he also expects to hire a postdoctoral researcher and at least three doctoral students to work on the project.


The rest of the team includes James Glimm from Stony Brook University; David Keyes from Columbia University; as well as Lap Cheng and Roman Samulyak from Brookhaven National Laboratory.


Supercomputer


With 4700 gigaflops and 950 processors, Horseshoe is one of the world's most powerful computers. Developed at the Department of Mathematics and Computer Science, the supercomputer enables researchers at the University to carry out complex calculations.



In just one week, the supercomputer can conduct calculations and simulations that would take a year on a normal computer. Few researchers in the world have access to this type of computer, giving Danish researchers a rare opportunity to develop world-class expertise.



Using the supercomputer to create complex simulations, researchers at the University have studied the interaction of fat and protein in living organisms' cell membranes. This knowledge enables our researchers to contribute to a better understanding of several diseases.



Horseshoe is the result of advanced research in computer technology. The supercomputer is built from standard computer components available in any high-street shop. By linking the many components together, researchers created a cluster with significant capacity. The University's research team is one of few in the world to have mastered this technology. Computer science students gain valuable knowledge about cluster computing, an area in increasing demand.



A supercomputer is a computer that is considered, or was considered at the time of its introduction, to be at the frontline in terms of processing capacity, particularly speed of calculation.



Supercomputer challenges, technologies


A supercomputer generates large amounts of heat and must be cooled. Cooling most supercomputers is a major HVAC problem.
Information cannot move faster than the speed of light between two parts of a supercomputer. For this reason, a supercomputer that is many meters across must have latencies between its components measured at least in the tens of nanoseconds. Seymour Cray's supercomputer designs attempted to keep cable runs as short as possible for this reason: hence the cylindrical shape of his Cray range of computers. In modern supercomputers built of many conventional CPUs running in parallel, latencies of 1-5 microseconds to send a message between CPUs are typical.
Supercomputers consume and produce massive amounts of data in a very short period of time. According to Ken Batcher, "A supercomputer is a device for turning compute-bound problems into I/O-bound problems." Much work on external storage bandwidth is needed to ensure that this information can be transferred quickly and stored/retrieved correctly.
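The speed-of-light constraint mentioned above reduces to one line of arithmetic; the machine size used here is an assumed example, not a figure from the text.

```python
# Minimum one-way signal latency across a machine, set by the speed of light.
speed_of_light = 3.0e8          # metres per second (in vacuum; slower in cable)
machine_size = 10.0             # metres across, an assumed example
print(machine_size / speed_of_light * 1e9, "ns minimum")   # ~33 ns one way
```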

Technologies developed for supercomputers include:



- Vector processing
- Liquid cooling
- Non-Uniform Memory Access (NUMA)
- Striped disks (the first instance of what was later called RAID)
- Parallel filesystems





Wireless Sensors To Monitor Bearings In Jet Engines Developed


Dimitrios Peroulis, an assistant professor of electrical and computer engineering at Purdue, holds a new MEMS sensor at an "environmentally controlled probe station." The wireless sensors are being developed to detect impending bearing failure in jet engines. The probe station recreates extreme conditions inside engines, enabling researchers to test the sensors.


Researchers at Purdue University, working with the U.S. Air Force, have developed tiny wireless sensors resilient enough to survive the harsh conditions inside jet engines to detect when critical bearings are close to failing and prevent breakdowns.


The devices are an example of an emerging technology known as "micro electromechanical systems," or MEMS, which are machines that combine electronic and mechanical components on a microscopic scale.


"The MEMS technology is critical because it needs to be small enough that it doesn't interfere with the performance of the bearing itself," said Farshid Sadeghi, a professor of mechanical engineering. "And the other issue is that it needs to be able to withstand extreme heat."


The engine bearings must function amid temperatures of about 300 degrees Celsius, or 572 degrees Fahrenheit.


The researchers have shown that the new sensors can detect impending temperature-induced bearing failure significantly earlier than conventional sensors.


"This kind of advance warning is critical so that you can shut down the engine before it fails," said Dimitrios Peroulis, an assistant professor of electrical and computer engineering.


Findings will be detailed in a research paper to be presented on Tuesday (Oct. 30) during the IEEE Sensors 2007 conference in Atlanta, sponsored by the Institute of Electrical and Electronics Engineers. The paper was written by electrical and computer engineering graduate student Andrew Kovacs, Peroulis and Sadeghi.


The sensors could be in use in a few years in military aircraft such as fighter jets and helicopters. The technology also has potential applications in commercial products, including aircraft and cars.


"Anything that has an engine could benefit through MEMS sensors by keeping track of vital bearings," Peroulis said. "This is going to be the first time that a MEMS component will be made to work in such a harsh environment. It is high temperature, messy, oil is everywhere, and you have high rotational speeds, which subject hardware to extreme stresses."


The work is an extension of Sadeghi's previous research aimed at developing electronic sensors to measure the temperature inside critical bearings in communications satellites.


"This is a major issue for aerospace applications, including bearings in satellite attitude control wheels to keep the satellites in position," Sadeghi said.


The wheels are supported by two bearings. If mission controllers knew the bearings were going bad on a specific unit, they could turn it off and switch to a backup.


"What happens, however, is that you don't get any indication of a bearing's imminent failure, and all of a sudden the gyro stops, causing the satellite to shoot out of orbit," Sadeghi said. "It can take a lot of effort and fuel to try to bring it back to the proper orbit, and many times these efforts fail."


The Purdue researchers received a grant from the U.S. Air Force in 2006 to extend the work for high-temperature applications in jet engines.


"Current sensor technology can withstand temperatures of up to about 210 degrees Celsius, and the military wants to extend that to about 300 degrees Celsius," Sadeghi said. "At the same time, we will need to further miniaturize the size."


The new MEMS sensors provide early detection of impending failure by directly monitoring the temperature of engine bearings, whereas conventional sensors work indirectly by monitoring the temperature of engine oil, yielding less specific data.
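As a rough illustration of why direct measurement matters (a toy model for this article, not the Purdue team's data or method), consider a failing bearing whose temperature climbs steadily while the surrounding oil warms up only with a lag; an alarm based on the oil reading trips well after one based on the bearing itself. All numbers below are hypothetical.

# Illustrative only: toy comparison of direct bearing-temperature sensing
# versus indirect oil-temperature sensing. Ramp rate, lag time constant,
# starting temperature and alarm threshold are all hypothetical.

def first_alarm_time(temps, threshold, dt=1.0):
    """Return the first time (seconds) at which a temperature series crosses the threshold."""
    for step, temp in enumerate(temps):
        if temp >= threshold:
            return step * dt
    return None

dt = 1.0                 # seconds per simulation step
steps = 600
bearing_temp = 150.0     # starting temperature, degrees C (hypothetical)
oil_temp = 150.0
tau = 120.0              # oil responds slowly: first-order lag time constant, seconds
bearing, oil = [], []

for _ in range(steps):
    bearing_temp += 0.5 * dt                            # failing bearing heats up steadily
    oil_temp += (bearing_temp - oil_temp) / tau * dt    # oil only follows the bearing with a lag
    bearing.append(bearing_temp)
    oil.append(oil_temp)

threshold = 300.0        # hypothetical alarm threshold, degrees C
print("direct bearing sensor alarms at", first_alarm_time(bearing, threshold, dt), "s")
print("oil-temperature sensor alarms at", first_alarm_time(oil, threshold, dt), "s")

In this toy run the direct reading crosses the threshold minutes before the lagging oil reading does, which is the kind of head start the researchers describe.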


The MEMS devices will not require batteries and will transmit temperature data wirelessly.


"This type of system uses a method we call telemetry because the devices transmit signals without wires, and we power the circuitry remotely, eliminating the need for batteries, which do not perform well in high temperatures," Peroulis said.


Power will be provided using a technique called inductive coupling, which uses coils of wire to generate current.
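In textbook terms (a general relation, not a detail taken from the paper), driving an alternating current through an external primary coil induces a voltage in a secondary coil on the sensor:

\[
V_2(t) = -M \,\frac{dI_1(t)}{dt}, \qquad M = k\sqrt{L_1 L_2}, \qquad 0 \le k \le 1,
\]

where \(I_1\) is the current in the primary coil, \(L_1\) and \(L_2\) are the two coils' self-inductances, and \(k\) is a coupling coefficient set by their geometry and separation.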


"The major innovation will be the miniaturization and design of the MEMS device, allowing us to install it without disturbing the bearing itself," Peroulis said.


Data from the onboard devices will not only indicate whether a bearing is about to fail but also how long it is likely to last before it fails, Peroulis said.
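One simple way such a prognosis could work (a sketch that assumes a roughly linear temperature rise; the authors' actual algorithm is not described in the article) is to fit the recent temperature trend and extrapolate to an alarm threshold. The readings and the threshold below are hypothetical.

# Illustrative only: estimate time remaining until a bearing-temperature
# threshold is crossed, using a least-squares line through recent readings.

def estimate_time_to_threshold(times, temps, threshold):
    """Fit a line through (times, temps) and return the time remaining until the threshold."""
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(temps) / n
    slope = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, temps)) / \
            sum((t - mean_t) ** 2 for t in times)
    intercept = mean_y - slope * mean_t
    if slope <= 0:
        return None                      # temperature not rising: no failure trend detected
    crossing = (threshold - intercept) / slope
    return max(crossing - times[-1], 0)

# Hypothetical readings: minutes elapsed and bearing temperature in degrees C.
times = [0, 5, 10, 15, 20]
temps = [220, 232, 241, 255, 266]
print("estimated minutes until 300 C:", estimate_time_to_threshold(times, temps, 300.0))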


The research is based at the Birck Nanotechnology Center in Purdue's Discovery Park and at Sadeghi's mechanical engineering laboratory.






TNT speeds up with mobile technology








The rollout of wearable mobile computers and scanners is paying dividends on the global mail company's previous investment in tracking technology.
TNT, the parcel courier and mail company, has made significant improvements to its delivery services with the rollout of mobile computers.


TNT Express UK, which is the carrier's largest division and its main business-to-business arm, has reduced vehicle loading time by 30 per cent and achieved on-time delivery rates of 98 per cent since rolling wearable computers and scanners out to its delivery drivers a year ago.


David Higgins of TNT said 'track and trace' is incredibly important to TNT's customer relationships. "It is imperative that we stay at the forefront of service technology to keep our competitive advantage," he said.


TNT Express turned to long-standing provider Motorola for its MC9000 series, chosen so that drivers can operate the devices even while wearing gloves. The carrier has now deployed 2,750 of the devices to its UK van drivers and is expanding to more than 8,500 across its Europe, Middle East and Africa division. The UK TNT Express Specialist Services delivery team opted for 900 smaller, lighter MC70 devices.


TNT Express also wanted a scanner that left workers' hands free, to optimise productivity and vehicle loading time. As a result, TNT played an integral part in the beta testing of the WT4000 wearable scanner. It is currently using 155 of these wearable mobile computers in the UK and has plans to implement a further 800.


The new wearable mobile computers and scanners have increased worker productivity and load accuracy, while reducing vehicle loading time. Instead of checking a manifest on the vehicle itself, operatives can now see the status of the order simply by scanning the package and consulting the wearable mobile computer. Using the wearable mobile computer has reduced the time taken to load vans and trucks by up to 30 per cent, while delivery accuracy is running at 98 per cent.


Higgins added: "Although the net effect on efficiency per parcel may seem small, getting each van out of the warehouse ten minutes earlier, each shift on each day, has a huge effect on the business efficiency and means that we can deliver more parcels and maintain a high level of service performance through a premium on time delivery."
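To put rough numbers on that claim using the figures quoted above: if each of the 2,750 UK vans leaves the depot about 10 minutes earlier per shift, that is roughly 2,750 x 10 = 27,500 minutes, or about 458 extra van-hours of delivery time per working day across the UK fleet.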


TNT is now planning to implement further wearable mobile computers and MC9000s worldwide and is currently investigating how the use of radio frequency identification (RFID) technology can help its business.





Novel Nanostructure Response Opens Possibilities for Electrical Devices.


Novel Nanostructure



A University of Arkansas physicist and her colleagues have examined the dielectric susceptibilities of nanostructures (that is, the response of their polarization to electric fields) and found novel, seemingly contradictory properties that may change how such materials can be used by scientists and engineers to build electronic devices.


Inna Ponomareva, Laurent Bellaiche and Raffaele Resta of the Università di Trieste reported their findings in the journal Physical Review Letters.


Ponomareva and her colleagues examined a property called the dielectric susceptibility of a material, or its polarization response to an electric field. High dielectric responses mean highly sensitive devices can be built, so knowing how to maximize this property in nanostructures will help scientists and engineers make small, efficient electronic devices. The researchers used physical and mathematical models to examine the effect of an electric field on a nanostructure of lead zirconate, a ferroelectric material -- one that can retain an electrical polarization even after the applied electric field is removed.
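In standard notation (a textbook definition, not a formula taken from the paper), the dielectric susceptibility \(\chi\) links the polarization \(P\) to the applied field \(E\):

\[
P = \varepsilon_0 \chi E, \qquad \chi = \frac{1}{\varepsilon_0}\,\frac{\partial P}{\partial E},
\]

so a large \(\chi\) means a small field produces a large change in polarization -- the property device designers want to maximize.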


At the nanoscale, scientists have discovered that the dielectric response has three distinct aspects, unlike in bulk materials. These include the change of polarization with respect to the external field, called the external susceptibility, and the change in polarization with respect to the internal field, called the internal susceptibility. Both of these depend on the shape of the material -- that is, on whether the object is a nanorod, a nanodot or a nanofilm. The third aspect -- called the intrinsic susceptibility -- is a characteristic of the material itself.
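A sketch of how these quantities are usually related for a finite sample (standard electrostatics, not necessarily the authors' exact formulation): the internal field differs from the external one because the sample's own polarization creates a depolarizing field that depends on its shape,

\[
E_{\text{int}} = E_{\text{ext}} - \frac{N P}{\varepsilon_0}, \qquad
\chi_{\text{ext}} = \frac{1}{\varepsilon_0}\frac{\partial P}{\partial E_{\text{ext}}}, \qquad
\chi_{\text{int}} = \frac{1}{\varepsilon_0}\frac{\partial P}{\partial E_{\text{int}}},
\]

where \(N\) is a depolarizing factor fixed by the geometry (different for a nanofilm, nanorod or nanodot). This is why the external and internal responses are shape-dependent, while the intrinsic susceptibility characterizes the material itself.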


Ponomareva and her colleagues determined that the internal susceptibility can be negative - in other words, a positive electric field created a negative polarization within the material. This finding contradicted what was previously thought.


"It was believed that negative susceptibility meant that the system was unstable," Ponomareva said. Such negative sign can open the door to the realization of novel technological devices.


The researchers also wanted to see what would happen when the electric field was supplied by perfect electrodes that were 100 percent efficient, and when it was supplied by less efficient electrodes.


"In many practical applications, it is really hard to find perfect electrodes," Ponomareva said. Based on their calculations, they found that the highest external dielectric response occurred for electrodes that are around 90 percent efficient. This indicates a point at which the material can be most easily manipulated by an external electric field.


"It's important to know what happens from many angles," she said. "These characteristics may have useful applications, but right now we have more of a fundamental interest in them."


Ponomareva is a research assistant professor in the J. William Fulbright College of Arts and Sciences.


Nanotechnology
Nanotechnology developments over the past two decades and the ability to measure and manipulate matter at atomic and molecular scales have led to the discovery of novel materials and phenomena.
Nanotechnology is a multidisciplinary area that deals with the synthesis, manipulation and characterization of matter at the sub-100-nanometer level (1 nanometer = 1 billionth of a meter). Nanotechnology is still emerging, although commercial products are already on the market.
Research and development focuses on practical applications such as energy, homeland security, healthcare, food and agriculture, the environment, new materials and electronics.
On the other hand, the importance of the social dimensions of nanotechnology should also be recognized. Research aimed at understanding the benefits and risks to human health and the environment should be carried out, and methods for nanotechnology risk assessment and management should be developed. According to the National Science and Technology Council, the areas of society that may be affected by nanotechnology include economic, educational, workforce, ethical and legal aspects.




