
Sunday, October 28, 2007

U-M Scientists Find New Cause of Neurodegeneration


A small region of a mutant mouse brain magnified 1000X. In the normal tissue the cells are tightly packed with no gaps, but in the mutant there are large holes caused by the loss of neurons, which follows from the absence of the molecule PI(3,5)P2 when the Vac14 gene is suppressed. (Credit: Image courtesy of University of Michigan)




Diseases that cause neurons to break down, such as Alzheimer's, multiple sclerosis and Creutzfeldt-Jakob disease (the human form of mad cow disease), remain elusive to scientists and resistant to treatment.

A new finding from University of Michigan researchers demonstrates an unexpected link between a virtually unknown signaling molecule and neuron health.


In a study in PNAS, graduate student Yanling Zhang, postdoctoral fellow Sergey Zolov and Life Sciences Institute professor Lois Weisman connect the loss of this molecule to massive neurodegeneration in the brain.


The molecule PI(3,5)P2 is a lipid found in all cells at very low levels. Lipids are a group of small organic compounds. While the best studied lipids are fats, waxes and oils, PI(3,5)P2 is a member of a unique class of lipids that signal the cell to perform special tasks.


Weisman said it was surprising to find that PI(3,5)P2 plays a key role in the survival of nervous system cells.


"In mice, lowered levels of PI(3,5)P2 leads to profound neurodegeneration," said Weisman. "It suggests that we have a good place to look to find treatments for neurodegenerative diseases such as Alzheimer's."


Weisman, who is also a professor of Cell & Developmental Biology at the U-M Medical School, and her colleagues began from clues that were hidden in a conserved genetic pathway in yeast (a pathway that has remained the same in yeast, plants and humans over evolutionary time). Studies in yeast showed that the enzyme that manufactures the lipid is governed by the FIG4 and VAC14 genes, which exist in yeast, mice and humans.


Working with two independently derived mouse models, Weisman's team and collaborators including graduate student Clement Chow and Professor Miriam Meisler of the Department of Human Genetics at the U-M Medical School, reached the same conclusions in a pair of important papers for neuroscience research.


Building on research from Meisler, a mouse geneticist, and Weisman, a yeast geneticist, the collaborators published a paper in Nature, July 5, 2007, showing that in mice, the FIG4 gene is required to maintain normal levels of the signaling lipid and to maintain a normal nervous system. Importantly, they found that human patients with a very minor defect in their FIG4 genes had serious neurological problems.


The signaling lipid PI(3,5)P2 (short for phosphatidylinositol 3,5-bisphosphate) is part of a communication cascade that senses changes outside the cell and promotes actions inside the cell to accommodate to the changes.


Weisman's team found that mice missing the VAC14 gene, which encodes a regulator of PI(3,5)P2 levels, suffer massive neurodegeneration that looks nearly identical to the neurodegeneration seen in the FIG4 mutant mice. In both cases, PI(3,5)P2 levels are half of normal. That both mutants show the same lipid deficit and the same neurodegenerative problems provides evidence of a direct link between the lipid and neuronal health.


The new findings indicate that when Vac14 is removed, the cell bodies of many of the neurons appear as empty spaces and the brain takes on a spongiform appearance.





Space Station Has Power System Damage


This image provided by NASA television shows the hatch opened on the Quest airlock and astronaut Scott Parazynski waiting to exit on the second spacewalk of the mission early Sunday, Oct. 28, 2007. (AP Photo/NASA)




Two spacewalking astronauts unhooked a 35,000-pound girder from the international space station Sunday, starting the delicate process of moving the giant solar power tower to another part of the orbiting outpost.


Spacewalkers Scott Parazynski and Daniel Tani started their 6 1/2-hour jaunt by disconnecting cables and unscrewing bolts that connected the girder to the space station's backbone.



Spacewalking astronauts found evidence of damage to a key part of the International Space Station's power system today.


It was the second of five scheduled spacewalks during the shuttle mission. More than six hours of outdoor activities were originally to be devoted to unbolting a solar array atop the International Space Station so it could be moved to the side of the station, and to some work on the new "Harmony" module that astronauts had installed earlier in the week and first entered on Saturday.


Those tasks proceeded well, as has virtually everything else in this otherwise exceptionally smooth mission. But those successes could well be overshadowed by the discovery of metal shavings in one of the station's enormous rotating joint assemblies.


The part, known as the Solar Alpha Rotary Joint, or SARJ, is 10 feet across, and one sits toward each end of the station's long truss. The motorized joint allows the solar panels to rotate and constantly face the sun during the sunlit part of each orbit.


"It's quite clear," said Daniel Tani, one of the two spacewalkers, describing what he saw after removing a protective cover over a motor. "There's metal-to-metal scraping, or something, and it's widespread."


A sharp-eyed space station flight controller had recently noticed that the joint on the right side of the station was experiencing unusual vibrations as it rotated. Further examination revealed that the motor on that joint was using greater-than-expected amounts of current, which suggested that it was having to work harder than it should to turn the paddlewheel-like array. Mission managers added the inspection to the spacewalk schedule on Friday.
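
The reasoning behind that catch is simple threshold monitoring: compare the measured motor current against its expected value and flag sustained excursions. Here is a minimal Python sketch of that idea; the current values and margin are invented for illustration and are not NASA telemetry.

```python
# Illustrative sketch only: values are invented, not NASA telemetry.
EXPECTED_CURRENT_A = 0.5   # assumed nominal drive-motor current, amps
MARGIN_A = 0.2             # assumed tolerance before a reading is flagged

def flag_high_current(samples):
    """Return readings that exceed the expected current plus margin."""
    return [s for s in samples if s > EXPECTED_CURRENT_A + MARGIN_A]

telemetry = [0.48, 0.51, 0.93, 0.49, 0.88]  # made-up current readings
print(flag_high_current(telemetry))          # -> [0.93, 0.88]
```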


The shavings suggest that moving parts may be misaligned and grinding against each other, or perhaps that a piece of debris from the ground or from space may have gotten into the works. Mission managers had hoped the problem with the rotary joint would be easy to spot and easy to fix - something like a bolt out of place or an insulating blanket that was dragging and increasing friction, or even a leftover shop rag that was carried up to space and became lodged in the wrong place but could be removed. Before taking the cover off, Mr. Tani conducted a visual inspection of every bolt and blanket on the exterior of the device, which was made by Lockheed Martin at its Space Systems facility in Sunnyvale, and found no problems.


The problem could have ripple effects that go beyond this mission. If NASA wants a second look at the joint, another spacewalk will have to be added to the schedule. With five spacewalks already on the shuttle mission's calendar, it would be difficult to squeeze in another. At the same time, the days between the end of this shuttle mission and the arrival of the next shuttle in December are packed with activities for the three-person space station crew, so even a single additional spacewalk could mean delaying the December mission.


Kirk Shireman, the deputy space station program manager at the Johnson Space Center in Houston, said in a media briefing on Friday that there are backup motors and controllers for each rotary joint, so the system might still be able to work after a switchover.


During the same briefing, Derek Hassman, the lead space station flight director, said that the troubled joint could be "parked" in a position that allows it to pick up a fair amount of sunlight throughout the orbit while NASA continues to investigate the problem. "As long as we can get it into an attitude that's reasonably good for power generation, combined with what the other SARJ can produce, we wouldn't have any significant power impacts that we couldn't deal with," he said.






Lunar Challenge: Armadillo's MOD Shows Capability but Falls Short


Armadillo's MOD Meets with Limited Success on Day 1 of Lunar Challenge.


The first day of the Northrop Grumman Lunar Lander Challenge, a prize competition sponsored by NASA, saw Armadillo Aerospace, a private space company, make its mark in its first attempt to fly a simulated lunar lander. The task in the challenge is to simulate a lunar landing flight.


The challenge is a two-day event and the headline act of the 2007 X Prize Cup space and air show. It is broken into two levels, and teams get two tries at each level. The event is part of a NASA initiative to tap new technology for its space programs.


Armadillo Aerospace, which is owned by John Carmack, the creator of Doom, completed the first phase of the challenge by successfully launching and flying a privately built lunar lander, the MOD, for a duration of one and a half minutes. The craft was not successful in its second attempt at flight.


The MOD stands 12 feet tall, measures 8 feet wide and weighs 1,400 pounds at take-off, while its engine produces 1,800 pounds of thrust. It is a liquid-oxygen rocket with computer controls and a host of sensors, including GPS. Armadillo said it was more a prototype of vehicles it planned to use for tourist trips into space in the future.


During the first attempt, the MOD completed all the required tasks of the first level challenge - launching vertically off the launch pad to a height of 50 meters, then straightening out and traveling horizontally for another 50 meters, and finally landing back on the flat launch pad. However, during the second attempt, it toppled over five seconds prior to landing on the pad.


The toppled landing meant there was no clear winner on the first day of the challenge, even though Armadillo was the only company participating in the Lunar Lander event. The company has two more attempts tomorrow to win the challenge and take home the prize checks, one for each successful level.


The total prize money for the event is a whopping $2 million. Completing the first level successfully, which requires the craft to fly 50 meters vertically, then another 50 meters horizontally, and then land safely on the flat launch pad, fetches the winner $350,000, while the second-place participant takes home $150,000 for this level.


The second level is much the same as the first, the major difference being the landing surface. While the first level required safe landing on a flat surface, the second level would require safe landing on a rocky surface similar to that of the moon. The winner of this more difficult round would take home $1 million, while the runner-up would get $500,000.


Doug Comstock, director of NASA's Innovative Partnerships Program, said Armadillo had two more tries to emerge the winner tomorrow. If Armadillo can iron out the kinks in its craft, it has a good chance of winning, considering it is the only participant in the fray for the second year running.


Of the nine teams originally scheduled to compete this year, the other eight could not make it to the challenge. Most were not allowed to compete because of an inability to comply with the safety requirements laid down by the Federal Aviation Administration, while one mystery team opted out before the challenge was scheduled to start.






South Africa's coal-to-liquids (CTL) plants have relied exclusively on coarse coal


Hitherto, South Africa's coal-to-liquids (CTL) plants have relied exclusively on coarse coal for their feed material, with Sasol either stockpiling the fines or selling them to power utility Eskom for use in their coal-fired power stations. Some of the coal has, nevertheless, found application in its boilers.


But Sasol's South Africa energy cluster head, Benny Mokaba, tells Engineering News that the feedstock for the multibillion-rand expansion, which forms part of a multibillion-rand expenditure profile for the cluster over the next nine years, will be a blend of gas (15%) imported from Mozambique and fine coal (85%).


"This is a different type of coal technology. We will, for the first time, be seeking to gasify, on a commercial basis, fine coal, which is a material that we had been stock-piling," Mokaba explains.


He contends that the ability to exploit the fines has both commercial benefits and environmental virtue.


"It elongates the life expect- ancy of our mine and the Secunda plant enables us to add to our production profile. In addition, we will deal with an above-ground resource that has, until now, been an environmental hazard."


TURNING GROWTH NECESSITY TO ENVIRONMENTAL VIRTUE
Further, the integration of additional gas-to-liquids capacity at the sprawling Secunda complex for the 15% balance of the expansion will offer foreign-exchange earnings for impoverished Mozambique, the originator of the feedstock, while lowering the overall emission profile at Secunda on a per-ton-of-product basis.


This environmental angle is central, with the expansion having also been coupled with two other programmes:



a 300-MW cogeneration project, which would reduce Secunda's gas-flaring as well as its reliance on the power grid at a time when Eskom is struggling to keep pace with electricity demand growth; and



a R4,5-billion environmental remediation programme, which aims to help support Secunda in meeting its target of reducing greenhouse-gas emissions (on a per-ton basis) by 10% by 2015.


Mokaba says Sasol is in advanced negotiations to secure specially designed turbines that will enable Secunda to convert flared gas into electricity; the petrochemicals complex currently draws about 1 200 MW of electricity from the national grid.


"In all, there are something like nine key elements to the expansion and we are accelerating our discussions with various contractors and suppliers with the intention of being able to bring on the new capacity and the cogeneration by 2009," he explains.


This 30 000-bl/d expansion will be introduced ahead of a proposed 80 000-bl/d greenfield inland CTL facility, dubbed 'Project Mafutha', which is also seen as necessary, owing to material demand growth for transport fuels in South Africa, leading to rising import levels.


Sites in the Free State and the coal-rich Waterberg, in Limpopo province, are being studied simultaneously, with the type and quality of coal likely to be the crucial factor in deciding where to locate the plant.


Should the fines technology prove commercially successful, it could also have an influence on the technology chosen at Mafutha, which Sasol also aims to make its most environmentally acceptable CTL facility yet.


The group accepts that its current generation of CTL plants still adds significantly to greenhouse-gas emissions and remains water intensive. But the group is reportedly working on a range of new technologies that could reduce the environmental footprint of the current and future fleet.


Mokaba indicates that carbon-capture and sequestration techniques may even be incorporated into the Project Mafutha design, with Sasol scientists evaluating the possibility of storage in deep-level mines, as well as deposition into other deep geological formations, such as deep-saline reservoirs, and coal bed methane recovery opportunities.


BIG HUMAN RESOURCES SCALE-UP
Mafutha could also create employment for up to 13 000 South Africans across the cluster, from mining to marketing, while the 20% Secunda expansion could add thousands more.


For this reason, the group is sharpening its focus on growing its skills base in anticipation of project and operational growth.


A programme known as 'Project TalentGro' has been initiated by executive director Nolitha Fakude as part of a multipronged approach to developing and raising the level of internal skills.


A new division has been created to recruit and train new employees, while the scale of Sasol's learnership and apprentice training has already increased by 233% between 2004 and 2007. The group has also set aside R140-million specifically for artisan training.


High-level project management and technical skills are also receiving priority attention, with Sasol Technology having already increased its staff complement by 400 people, a 25% increase, in a bid to mitigate the constraints emerging in the global project economy.


Mokaba acknowledges that these actions will have cost implications for the cluster, echoing CFO Christine Ramon's recent warning that expenses related to the expansion programme locally and abroad will have to increase.


'These costs have to be seen in the context of preparing for a material expansion,' Mokaba argues, adding that financial resources are having to be directed towards project studies, human-resources development and the retention of key personnel.


Overall, the company has plans to invest up to R65-billion over the next three years on replacement and expansionary capital, with only R10-billion of that having been approved.


About half of this capital will be spent by businesses within Mokaba's cluster, including mining, synfuels, oil and gas, with a heavy weighting being given to growth projects.


RESPONDING TO DEMAND
Underpinning these aggressive expansion plans is the surging market demand for liquid fuels.


South African Petroleum Industry Association figures show that the sale of liquid fuels has been rising steadily from about 20,8-billion litres in 2000 to over 24-billion litres last year, with that growth trend continuing strongly in 2007.
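
A quick back-of-the-envelope calculation, using the 2000 and 2006 endpoints quoted above, shows what annual growth rate those figures imply:

```python
# Quick check of the growth implied by the SAPIA figures quoted above.
start_litres = 20.8e9   # sales in 2000
end_litres = 24.0e9     # sales in 2006 ("last year")
years = 6

cagr = (end_litres / start_litres) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # roughly 2.4% per year
```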


There have already been some short-term supply challenges, and it is now widely accepted that fuel imports are a permanent feature of South Africa's energy economy.


It is also accepted that a key limitation on the creation of additional crude-based refineries is the scale of the investment required to make them commercially and technically viable. These refineries typically require a minimum scale of about 300 000 bbl/d, which could have a destabilising effect on a modestly sized market such as South Africa's.


Mokaba believes the addition of a fine-coal-conversion technology to the technology mix could further magnify the attractiveness of CTL as a possible solution.





RFID Is Inescapable - and Now Implantable


24hoursnews: RFID technology is used widely in the public and private sector to assist businesses with asset tracking and security. Hospitals use RFID-equipped bracelets and patches to track newborn movements, prevent accidental switching of infants, reduce prescription medicine and surgical errors, and monitor the location of equipment. Public transportation systems worldwide use RFID-equipped cards to track and bill use of the services, such as "EZ Pass" type equipment for toll roads or payment cards for mass transit systems. Corporations and schools use RFID chips in identification cards to control access to restricted areas. Many cars have RFID chips in their ignition keys, and keyless ignition works because of the technology as well.

In most of these situations the type of RFID product used is a chip in a bracelet or some kind of device that is portable and removable, but recently the subject of implantable RFID, radio-frequency identification, is receiving a lot of attention in the news. On September 9, 2007, an Associated Press article said that research in the 1990s showed that implanted RFID chips caused cancer in mice and rats and may cause cancer in cats, dogs, and humans. Verichip, the manufacturer specifically mentioned in the article, has responded to the accusation with assertions that research confirming the safety of their implantable chips was not included in the article, that mice and rats are far more likely to develop tumors at the site of any type of injection, and that the FDA has cleared the chip as a Class II Medical Device.

Although an RFID chip in particular may not cause cancer, the inflammation caused by surgical implantation of a foreign object can increase the instance of tumor development. According to the National Cancer Institute, "Inflammation is a response to acute tissue damage, whether resulting from physical injury, ischemic injury, infection, exposure to toxins, or other types of trauma. It can play a role in tumor suppression by stimulating an antitumor immune response, but more often it appears to stimulate tumor development," and "chronic inflammation is also clearly correlated with increased risk of developing cancer."

While millions of people experience acute tissue damage from surgery, injury or infection every year, the deliberate implanting of a foreign object that could produce chronic inflammation for the purposes of tracking has activists concerned, especially when combined with fears about privacy.

According to Verichip, a leading manufacturer of RFID products, all of its implantable RFID chips are "passive," which means that the chips themselves do not contain a power source such as a battery. The chips only transmit data when in range of a reader, and therefore are not continuously transmitting. With current passive-chip technology, the reader needs to be within about ten feet of the chip in order to extract data. These chips are therefore not equipped for any kind of long-range or GPS tracking - they are less trackable than most cell phones.

As for the actual data on the chip, the only information available is a 16-digit identifier that must be matched up to a database. Simply knowing the number will not identify name, address, or any other personal information unless the person with the scanner also has access to the proper database.
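
That separation between the identifier and the database is the crux of the privacy argument, and a minimal sketch makes it concrete. The record contents and IDs below are invented for illustration:

```python
# A minimal sketch of the point above: a scanned 16-digit ID is useless
# without access to the matching database. Record contents are invented.
registry = {
    "1234567890123456": {"name": "J. Doe", "notes": "medical history ..."},
}

def lookup(chip_id, db):
    """Resolve a scanned chip ID against a database, if one is available."""
    return db.get(chip_id, "unknown ID - no personal data recoverable")

print(lookup("1234567890123456", registry))  # resolves only with db access
print(lookup("9999999999999999", registry))  # a bare number reveals nothing
```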

Neither the limited transmission range nor the limited data capability found in current chips assuages the fear of privacy advocates who point out that criminally intent people or overzealous employers could still take advantage of the technology. An office building with readers built into doorways and halls could effectively track the movement of employees throughout the day, or a stalker could install readers in homes to track the movement and routines of victims. Databases can be hacked and open up the possibility of identity theft, or simple abuse by the keepers of the information.

One of the biggest concerns of activists is that eventually the government will push to require RFID chips either in identification cards or implantable chips, tied to a centralized database that would maintain a complete medical history and would also be able to track general movement every time an ID had to be presented. In an apparent first step in that direction, Verichip was in discussions in 2006 with the Pentagon to replace military dog tags with implantable chips tied back to identification and medical history databases, along with other discussions to implant immigrants and guest workers.

So far, three states (Wisconsin, North Dakota, and California) have banned the forced implantation of RFID chips by employers. Meanwhile, Verichip recently celebrated the launch of a partnership with Alzheimer's Community Care in Florida, where 90 Alzheimer's patients were implanted with RFID chips to assist in identification if they wander away from their caregivers.

Although consumers today are not in any way anonymous, considering all of the information we provide to businesses in the course of our daily lives, the specter of the population as a whole being persuaded or coerced into accepting the idea of implantable RFID chips should raise serious health and privacy questions.





Bulbs are accelerating global warming: next is mercury contamination


The incandescent light bulb was downright amazing when it was invented in 1809 by Humphry Davy. Nope, it wasn't invented by Thomas Edison -- that's just another American history lie, much like the stories about Christopher Columbus "discovering" America and being some sort of upstanding hero. In truth, he and his men were butchers who committed numerous atrocities against the Native Americans (see A People's History of the United States by Howard Zinn). U.S. history is largely a collection of politically convenient lies, and the story of the invention of the light bulb by Thomas Edison is just one of many such distortions.

Unfortunately, very little has changed about the light bulb since the turn of the 20th century. The device still wastes 95 percent of the electricity it consumes. And thanks to a deliberate design by manufacturers to encourage repeat sales (i.e. they are deliberately engineered to burn out), light bulbs still burn out after about 1,000 hours, requiring consumers to toss them into the garbage and buy new ones. (It's true: Light bulbs were invented in 1991 that last 60,000 hours, but companies refuse to mass produce them, since repeat sales of light bulbs would plummet. The bulbs sold to consumers today are designed to self-destruct.)
Incandescent lights are a safety hazard (glass shards, anyone?) and an environmental hazard, since they produce massive carbon dioxide emissions from the coal power plants used to power these bulbs. They're incredibly cheap to purchase up front, but astonishingly expensive to use over time. A typical incandescent light bulb is ten times more expensive to operate than an LED light bulb. It also produces ten times as much carbon dioxide, contributing to global warming. Want to warm the climate? Turn on the lights!
So why, then, are so many people still using incandescent light bulbs? Primarily because they have no idea what it costs to actually operate them. The fact that these light bulbs are secretly slipping dollars out of your pocket every time they're used seems to go unnoticed by most consumers. All they see is the price tag at the store. And there, incandescent lights look really cheap.
The $500 incandescent light bulb
But what if the price of the light bulb at the store included the entire cost of the electricity needed to actually power it? If that incandescent light bulb actually lasted 50,000 hours like LED lights do, the cost of buying the bulb together with all the electricity needed to power it would be a whopping $500! Would you pay $500 for a light bulb?
Of course, incandescent lights don't last 50,000 hours. They last only about 1,000. Which means you have to buy fifty bulbs, replace them fifty times and throw fifty burned-out bulbs in the garbage, all while still paying nearly $500 in electricity anyway. In other words, paying for 50,000 hours' worth of light from an incandescent light bulb actually costs MORE than $500!
That's no bargain. Not by a long shot. Especially when a $100 ten-watt LED light bulb can operate for 50,000 hours using only about $54 in electricity. (We're assuming 10 cents per kilowatt-hour for these calculations. Folks in California are paying a lot more than that, but in some states, it's less...)
Would you rather pay $500 for light, or $154? If you love overpaying for stuff, and destroying the environment, and piling more garbage onto landfill, then keep buying incandescent light bulbs! They will raise your electricity bills, fill your trash with shards of glass, use up natural resources and accelerate global warming faster than any other light source on the planet today.
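
For readers who want to check the arithmetic, here is a small Python sketch of the comparison. The 100-watt incandescent wattage and 10 cents/kWh rate are assumptions; the article's own $54 LED electricity figure implies a slightly higher rate, which is why the totals differ by a few dollars:

```python
# Back-of-the-envelope check of the figures above, assuming a 100 W
# incandescent bulb and 10 cents/kWh.
RATE = 0.10        # dollars per kWh, assumed
HOURS = 50_000     # service period being compared

incandescent = 100 / 1000 * HOURS * RATE   # $500 in electricity alone
bulbs_needed = HOURS // 1_000              # 50 replacement bulbs on top
led = 100 + 10 / 1000 * HOURS * RATE       # $100 bulb plus $50 of power

print(f"Incandescent electricity: ${incandescent:.0f}, plus {bulbs_needed} bulbs")
print(f"LED total cost:           ${led:.0f}")
```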
Are Compact Fluorescent Lights the answer?
But what about CFLs? Everybody's crazy about CFLs all of a sudden, it seems. People know that CFLs use only about one-third the electricity of incandescent lights. Of course, they flicker and hum, and they take a long time to warm up, but they do save on electricity compared to the extremely inefficient incandescent light bulb. So what's not to like about CFLs?


Mercury, for one thing.
All fluorescent lights contain mercury, period. It's the dirty little secret of the CFL industry. If you break a fluorescent light, you release a powerful neurotoxic heavy metal into your home! Birth defects, neurodegenerative diseases, developmental disorders, dementia... these have all been linked to mercury exposure. It's not even debated in the scientific literature. Even doctors readily admit that mercury is extremely toxic to the human body. (Dentists, of course, remain in bewildering denial and continue to place mercury fillings into the mouths of children, seemingly oblivious to the neurotoxicity of this extremely dangerous heavy metal...)
There's enough mercury in a single fluorescent light bulb to contaminate 7,000 gallons of fresh water.
I cringe to think about how much water could be contaminated by the recent fluorescent light giveaway programs hosted by big box retailers like The Home Depot, which gave away an astonishing 1 million fluorescent lights containing approximately 3 million mg of mercury (that's a whopping 3 kilograms of mercury!). And on what day did they choose to distribute these toxic light bulbs all across the country? Earth Day, of course! (It would all be rolling-on-the-floor hilarious if not for all the deformed babies that will probably result from widespread mercury contamination of our environment...)
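
The arithmetic behind those figures is easy to verify; a quick sketch, taking the per-bulb numbers above at face value:

```python
# Checking the arithmetic in the giveaway example above.
bulbs = 1_000_000
mg_per_bulb = 3                 # ~3 mg of mercury per CFL, as stated
gallons_per_bulb = 7_000        # contamination figure claimed above

print(bulbs * mg_per_bulb / 1e6, "kg of mercury in total")   # 3.0 kg
print(bulbs * gallons_per_bulb, "gallons of water at risk")  # 7 billion
```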
So why are people rushing out to buy mercury light bulbs and place them in their homes? Because no one told them about the mercury, that's why! Of the hundreds of consumers I've talked to about this issue, very few (less than 4%) were aware of the mercury in fluorescent light bulbs. Sure, it's printed in microscopic text on the packaging of CFLs, but nobody reads that.
So most consumers keep on buying mercury light bulbs and bringing them right into their homes and communities, oblivious to the extremely hazardous materials found inside each light. I launched http://www.ecoleds.com/ because I wanted to provide an eco-friendly alternative to toxic CFLs and wasteful incandescent lights. My aim is to educate consumers about the advantages of LED lights and make them so popular that even Wal-Mart starts selling them, putting my own company out of business.
I will only consider EcoLEDs.com a meaningful success when LED lights are sold at mass merchandisers and incandescent lights become a thing of the past. I hope The Home Depot stops giving away toxic fluorescent lights and starts selling LED lights instead.
Isn't it interesting how the U.S. government requires Energy Saver statistics to be printed on washing machines, dryers and other household appliances, but NOT on incandescent light bulbs (which are, by any measure, the least efficient household appliances of all)? I think we should start with mandated labeling that shows the lifetime cost of each bulb sold at retail so that consumers can start to see the difference in the total cost of ownership right there at the point of purchase.
That would, for the first time, make consumers acutely aware of what it costs them to operate a light bulb, not to even mention the cost to the planet.

Home Computers Help Researchers Better Understand the Universe


Want to help unravel the mysteries of the universe? A new distributed computing project designed by a University of Illinois researcher allows people around the world to participate in cutting-edge cosmology research by donating their unused computing cycles.


The project is called Cosmology@Home, and is similar to SETI@Home, a popular program that searches radio telescope data for evidence of extraterrestrial transmissions.


"When you run Cosmology@Home on your computer, it uses part of the computer's processing power, disk space and network bandwidth," said project leader Benjamin D. Wandelt, a professor of astronomy and of physics at Illinois.


"Our goal is to search for cosmological models that describe our universe and agree with available astronomical and particle physics data," Wandelt said.


To achieve this goal, participating computers calculate the observable predictions of millions of theoretical models with different parameters. The predictions are then compared with actual data, including fluctuations in the cosmic microwave background, large-scale distributions of galaxies, and the acceleration of the universe.
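
In outline, each participating machine evaluates candidate parameter sets and the project keeps the ones whose predictions best match the data. Here is a toy Python sketch of that workflow; the two-parameter model, the "data" and the fitting grid are invented stand-ins, not the project's actual cosmological calculations:

```python
# A toy sketch of the approach described above: compute predictions for
# many candidate parameter sets and score each against observed data.
import numpy as np

x = np.array([0.0, 1.0, 2.0])            # toy observation points
observed = np.array([1.0, 0.62, 0.36])   # toy measurements
sigma = np.array([0.05, 0.05, 0.05])     # toy uncertainties

def predict(a, b):
    return a * np.exp(-b * x)            # toy two-parameter model

def chi_squared(a, b):
    return float(np.sum(((predict(a, b) - observed) / sigma) ** 2))

candidates = [(a, b) for a in np.linspace(0.5, 1.5, 21)
                     for b in np.linspace(0.1, 1.0, 19)]
best = min(candidates, key=lambda p: chi_squared(*p))
print("best-fit parameters:", best)
```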


In addition to picking out possible models, Cosmology@Home could help design future cosmological observations and prepare for the analysis of future data sets, such as those to be collected by the Planck spacecraft, Wandelt said.






Hottest Chile Pepper Discovered








World's Hottest Chile Pepper Discovered


Researchers at New Mexico State University recently discovered the world's hottest chile pepper. Bhut Jolokia, a variety of chile pepper originating in Assam, India, has earned Guinness World Records' recognition as the world's hottest chile pepper by blasting past the previous champion, Red Savina.


In replicated tests of Scoville heat units (SHUs), Bhut Jolokia reached one million SHUs, almost double the SHUs of Red Savina, which measured a mere 577,000.


Dr. Paul Bosland, director of the Chile Pepper Institute at New Mexico State University's Department of Plant and Environmental Sciences, collected seeds of Bhut Jolokia while visiting India in 2001.


Bosland grew Bhut Jolokia plants under insect-proof cages for three years to produce enough seed to complete the required field tests.


"The name Bhut Jolokia translates as 'ghost chile,'" Bosland said, "I think it's because the chile is so hot, you give up the ghost when you eat it!"


Bosland added that the intense heat concentration of Bhut Jolokia could have significant impact on the food industry as an economical seasoning in packaged foods.







Seven from MIT named AAAS Fellows


The American Association for the Advancement of Science (AAAS) has awarded the distinction of Fellow to 471 members, including seven MIT faculty members.


Fellows are recognized for their efforts toward advancing science applications that are deemed scientifically or socially distinguished. New Fellows will be presented with an official certificate and the society's gold and blue (representing science and engineering, respectively) rosette pin on Feb. 16 at the 2008 annual meeting in Boston. The following MIT faculty are new AAAS Fellows:


Emery N. Brown, a professor in the Harvard-MIT Division of Health Sciences and Technology and in the Department of Brain and Cognitive Sciences, was cited for "fundamental contributions to statistical modeling of dynamic biological phenomena, especially involving circadian rhythms, functional imaging signals and neuronal spike trains." Brown is also affiliated with Massachusetts General Hospital.


Jeffrey P. Freidberg, a professor in the Department of Nuclear Science and Engineering and the Plasma Science and Fusion Center, was named a Fellow for "distinguished contributions to research and teaching in the areas of theoretical plasma physics and magnetohydrodynamics as applied to problems in magnetic fusion."


Klavs F. Jensen, the Warren K. Lewis Professor of Chemical Engineering and Materials Science and Engineering, was cited for "the elegant use of detailed simulations of reactive systems to gain new insight into the underlying basic physical and chemical rate processes." Jensen is also head of the Department of Chemical Engineering.


Daniel G. Nocera, the Henry Dreyfus Professor of Energy in the Department of Chemistry, was named for "distinguished contributions to the development of renewable energy at the molecular level, with emphasis on the splitting of water with solar light."


Leona D. Samson, the Ellison American Cancer Society Professor, was cited for "distinguished contributions to cancer prevention and treatment, particularly for elucidating ways in which cells, tissues and animals respond to carcinogenic and chemotherapeutic agents." Samson is also a professor of toxicology and biological engineering in the Department of Biological Engineering, and is director of the Center for Environmental Health Sciences.


Joseph M. Sussman, J.R. East Professor of Civil and Environmental Engineering and Engineering Systems, was cited for "contributions to understanding large, complex engineering systems with emphasis on transportation, freight and traveler systems, and for pioneering work in transportation systems education." Sussman is also director of the Association of American Railroads Affiliated Lab.


Maria T. Zuber, the Earle Griswold Professor of Geophysics and Planetary Science, was named a Fellow for "outstanding research contributions and scientific leadership in the geophysical studies of Earth and the solid planets." Zuber is also head of the Department of Earth, Atmospheric and Planetary Sciences, and director of the Earth and Planetary Geodynamics Group.





Human-generated ozone will damage crops, according to MIT study


Could reduce production by more than 10 percent by 2100.


A novel MIT study concludes that increasing levels of ozone due to the growing use of fossil fuels will damage global vegetation, resulting in serious costs to the world's economy.


The analysis, reported in the November issue of Energy Policy, focused on how three environmental changes (increases in temperature, carbon dioxide and ozone) associated with human activity will affect crops, pastures and forests.


The research shows that increases in temperature and in carbon dioxide may actually benefit vegetation, especially in northern temperate regions. However, those benefits may be more than offset by the detrimental effects of increases in ozone, notably on crops. Ozone is a form of oxygen that is an atmospheric pollutant at ground level.


The economic cost of the damage will be moderated by changes in land use and by agricultural trade, with some regions more able to adapt than others. But the overall economic consequences will be considerable. According to the analysis, if nothing is done, by 2100 the global value of crop production will fall by 10 to 12 percent.


"Even assuming that best-practice technology for controlling ozone is adopted worldwide, we see rapidly rising ozone concentrations in the coming decades," said John M. Reilly, associate director of the MIT Joint Program on the Science and Policy of Global Change. "That result is both surprising and worrisome."


While others have looked at how changes in climate and in carbon dioxide concentrations may affect vegetation, Reilly and colleagues added to that mix changes in tropospheric ozone. Moreover, they looked at the combined impact of all three environmental "stressors" at once. (Changes in ecosystems and human health and other impacts of potential concern are outside the scope of this study.)


They performed their analysis using the MIT Integrated Global Systems Model, which combines linked state-of-the-art economic, climate and agricultural computer models to project emissions of greenhouse gases and ozone precursors based on human activity and natural systems.


Expected and unexpected findings.
Results for the impacts of climate change and rising carbon dioxide concentrations (assuming business as usual, with no emissions restrictions) brought few surprises. For example, the estimated carbon dioxide and temperature increases would benefit vegetation in much of the world.


The effects of ozone are decidedly different.


Without emissions restrictions, growing fuel combustion worldwide will push global average ozone up 50 percent by 2100. That increase will have a disproportionately large impact on vegetation because ozone concentrations in many locations will rise above the critical level where adverse effects are observed in plants and ecosystems.


Crops are hardest hit. Model predictions show that ozone levels tend to be highest in regions where crops are grown. In addition, crops are particularly sensitive to ozone, in part because they are fertilized. "When crops are fertilized, their stomata open up, and they suck in more air. And the more air they suck in, the more ozone damage occurs," said Reilly. "It's a little like going out and exercising really hard on a high-ozone day."


What is the net effect of the three environmental changes? Without emissions restrictions, yields from forests and pastures decline slightly or even increase because of the climate and carbon dioxide effects. But crop yields fall by nearly 40 percent worldwide.


However, those yield losses do not translate directly into economic losses. According to the economic model, the world adapts by allocating more land to crops. That adaptation, however, comes at a cost. The use of additional resources brings a global economic loss of 10-12 percent of the total value of crop production.


The regional view.
Global estimates do not tell the whole story, however, as regional impacts vary significantly.


For example, northern temperate regions generally benefit from climate change because higher temperatures extend their growing season. However, the crop losses associated with high ozone concentrations will be significant. In contrast, the tropics, already warm, do not benefit from further warming, but they are not as hard hit by ozone damage because ozone-precursor emissions are lower in the tropics.


The net result: regions such as the United States, China and Europe would need to import food, and supplying those imports would be a benefit to tropical countries.


Reilly warns that the study's climate projections may be overly optimistic. The researchers are now incorporating a more realistic climate simulation into their analysis.


Reilly's colleagues are from MIT and the Marine Biological Laboratory. The research was supported by the Department of Energy, the Environmental Protection Agency, the National Science Foundation, NASA, the National Oceanographic and Atmospheric Administration and the MIT Joint Program on the Science and Policy of Global Change.


It is part of the MIT Energy Initiative (MITEI), an Institute-wide initiative designed to help transform the global energy system to meet the challenges of the future. MITEI includes research, education, campus energy management and outreach activities, an interdisciplinary approach that covers all areas of energy supply and demand, security and environmental impact.





LHC@home moves from its Geneva home to the GridPP project at Queen Mary, University of London



24hoursnews: The UK's GridPP project is taking the lead in the LHC@home project, which uses spare computing power on people's PCs to analyse data about the Cern particle accelerator.
LHC@home, a project that lets the public donate spare computing power to Cern scientists, has moved from its Geneva home to the GridPP project at Queen Mary, University of London.


The distributed computing project uses volunteers' desktop machines to help run simulations of the Large Hadron Collider (LHC), to ensure that protons travelling the 27 kilometre circuit stay in their orbits. The LHC is set to start operations at Cern next year, and is being used to search for evidence of the Higgs particle, by recreating the conditions of the universe just after the Big Bang.


"Like its larger cousin, SETI@home, LHC@home uses the spare computing power on people's desks," said Dr Alex Owen, who runs the project in the UK. "But rather than searching for aliens, LHC@home models the progress of sub-atomic particles traveling at nearly the speed of light around Europe's newest particle accelerator, the Large Hadron Collider (LHC)."


So far, over 40,000 people from 100 countries have run LHC@home, contributing what would equal 3,000 years of computing on a single machine. The programme is based on the BOINC platform, which also runs the Search for Extraterrestrial Intelligence (SETI) project, as well as distributed computing projects for modelling climate change and the spread of diseases.
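
Those two figures imply a surprisingly modest average contribution per volunteer, as a one-line calculation shows:

```python
# Average contribution implied by the figures above.
volunteers = 40_000
cpu_years = 3_000
days_each = cpu_years * 365 / volunteers
print(f"~{days_each:.0f} days of single-machine computing per volunteer")  # ~27
```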


"We started trial running LHC@home from a computer server in the UK in June, and have spent the last few months working with the physicists who use the data it produces. Now, with the official launch of the UK base for the project, we're ready to fully exploit this fantastic resource," said Neasan O'Neill of GridPP.


The programme will eventually be used for other particle physics research, such as modelling the operations of different parts of the particle detectors. The actual processing of the expected 15 million gigabytes of data that the LHC will produce annually will be handled by a grid computing network built by 17 universities and research centres across the UK.





The Sensitive Side of Carbon Nanotubes: Making Powerful Pressure Sensors


When the block is compressed, individual carbon nanotubes start to buckle, which in turn decreases the block's electrical resistance. Researchers can measure this resistance in order to determine precisely how much pressure is being placed on the block.

Blocks of carbon nanotubes can be used to create effective and powerful pressure sensors, according to a new study by researchers at Rensselaer Polytechnic Institute.




Taking advantage of the material's unique electrical and mechanical properties, researchers repeatedly squeezed a 3-millimeter nanotube block and discovered it was highly suitable for potential applications as a pressure sensor. No matter how many times or how hard they squeezed the block, it exhibited a constant, linear relationship between how much force was applied and electrical resistance.


"Because of the linear relationship between load and stress, it can be a very good pressure sensor," said Subbalakshmi Sreekala, a postdoctoral researcher at Rensselaer and author of the study.


A sensor incorporating the carbon nanotube block would be able to detect very slight weight changes and would be beneficial in any number of practical and industrial applications, Sreekala said. Two potential applications are a pressure gauge to check the air pressure of automobile tires, and a microelectromechanical pressure sensor that could be used in semiconductor manufacturing equipment.


Despite extensive research over the past decade into the mechanical properties of carbon nanotube structures, this study is the first to explore and document the material's strain-resistance relationship. The paper, titled "Effects of compressive strains on electrical conductivities of a macroscale carbon nanotube block," was published in a recent issue of Applied Physics Letters.


Over the course of the experiment, the researchers placed the carbon nanotube block in a vice-like machine and applied different levels of stress. They took note of the stress applied and measured the corresponding strain put on the nanotube block. As it was being squeezed, the researchers also sent an electrical charge through the block and measured its resistance, or how easily the charge moved from one end of the block to the other.


The research team discovered that the strain they applied to the block had a linear relationship with the block's electrical resistance. The more they squeezed the block, the more its resistance decreased. On a graph, the relationship is represented by a neat, straight line. This means that every time the block is exposed to a load of X, its resistance can reliably be expected to decrease by Y.


The reliability and predictability of this relationship make the carbon nanotube block an ideal material for creating a highly sensitive pressure sensor, Sreekala said.


The pressure sensor would function similarly to a typical weight scale. By placing an object with an unknown weight onto the carbon nanotube block, the block would be squeezed down and its electrical resistance would decrease. The sensor would then send an electrical charge through the nanotube block, and register the resistance. The exact weight of the object could then be easily calculated, thanks to the linear, unchanging relationship between the block's strain and resistance.
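
In code, that lookup is just the inversion of a straight line. The following Python sketch illustrates the idea; the slope and unloaded resistance are invented values, since a real sensor would be calibrated against known weights:

```python
# A minimal sketch of the calibration idea: invert the linear
# strain-resistance relationship to recover the applied load.
SLOPE = -0.02       # assumed ohms per newton (resistance falls under load)
R_UNLOADED = 150.0  # assumed resistance of the unloaded block, ohms

def load_from_resistance(r_ohms):
    """Map a measured resistance back to the load pressing on the block."""
    return (r_ohms - R_UNLOADED) / SLOPE

print(load_from_resistance(148.0))  # a 2-ohm drop -> 100 N in this toy model
```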


A study published earlier this year, written by Rensselaer senior research specialist Victor Pushparaj, who is also an author of the pressure sensor paper, showed that carbon nanotubes are able to withstand repeated stress yet retain their structural and mechanical integrity. Electrical resistance decreases as the block is squeezed, as the charged electrons have more pathways to move from one end of the block to the other.


In the new study, Sreekala and the research team found that the nanotube block's linear strain-resistance relationship holds true until the block is squeezed to 65 percent of its original height. Beyond that, the block's mechanical properties begin to fail and the linear relationship breaks down.


The team is currently thinking of ways to boost the nanotubes' strength by mixing them with polymer composites, to make a new material with a longer-lived strain-resistance relationship.


"The challenge will be to choose the correct polymer so we don't lose efficiency, but retain the same response in all directions," Sreekala said.





Three Atomic Nuclei Created; Super-heavy Aluminum Isotopes May Exist


Researchers at Michigan State University's National Superconducting Cyclotron Laboratory, NSCL, have created three never-before-observed isotopes of magnesium and aluminum. The results not only stake out new territory on the nuclear landscape, but also suggest that variants of everyday elements might exist that are heavier than current scientific models predict.


"It's been a longstanding project since the beginning of nuclear science to establish what isotopes can exist in nature," said Dave Morrissey, University Distinguished Professor of chemistry and one of the paper's authors. "This result suggests that the limit of stability of matter may be further out than previously expected; really, it shows how much mystery remains about atomic nuclei."


The particles that comprise atomic nuclei, protons and neutrons, are held together by the nuclear force. One of the four fundamental forces that collectively describe the interactions of all matter in the cosmos, the nuclear force has been the subject of scientific inquiry since the 1930s.


Despite much progress in nuclear physics during the subsequent decades, understanding of how the nuclear force and other effects play out inside nuclei is far from complete. For example, even today scientists aren't sure exactly what combinations of protons and neutrons can make up most atomic nuclei.


One way experimental nuclear physicists explore this issue is by using accelerator facilities to create reactions that, in effect, kluge together piles of protons. An element is defined by its number of protons. For example, hydrogen has one proton; helium, two protons; oxygen, eight protons; uranium, 92 protons. Whenever physicists establish a new proton limit, they invariably garner attention for conjuring new elements. In October 2006, a team of Russian and American scientists generated worldwide headlines for creating an element with 118 protons, the most protons ever recorded in a single nucleus.


Another way to probe nuclear stability is to see how many neutrons can be loaded onto nuclei of more quotidian elements, which is the focus of much of the work at NSCL. Elements can exist as different isotopes, which contain the same number of protons but different numbers of neutrons. As an example, the most abundant stable isotope of carbon has six protons and six neutrons. However, trace amounts of carbon-13 and carbon-14 -- with seven and eight neutrons respectively -- also can be found on Earth.


The neutron limit, referred to as the neutron dripline, is a basic property of matter. Yet remarkably, despite more than a half-century of inquiry, scientists know the dripline location only for the eight lightest elements, hydrogen to oxygen. So one very basic question -- what's the heaviest isotope of a given element that can exist? -- remains unanswered for all but eight of the hundred or so elements on the Periodic Table.


In an experiment that ran earlier this year at NSCL, researchers successfully created and detected three new super-heavy isotopes of magnesium and aluminum: magnesium-40, with 12 protons and 28 neutrons; aluminum-42, 13 protons and 29 neutrons; and aluminum-43, 13 protons and 30 neutrons. If the everyday version of aluminum were a 160-pound adult, aluminum-43 would be a muscular, 255-pound heavyweight.
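
The naming convention encodes the bookkeeping directly: the number after the element name is the mass number, the sum of protons and neutrons. A tiny Python check of the three new isotopes:

```python
# The bookkeeping behind the isotope names: mass number = protons + neutrons.
isotopes = {
    "magnesium-40": (12, 28),
    "aluminum-42": (13, 29),
    "aluminum-43": (13, 30),
}
for name, (protons, neutrons) in isotopes.items():
    mass_number = int(name.split("-")[1])
    assert mass_number == protons + neutrons
    print(f"{name}: Z={protons}, N={neutrons}, A={protons + neutrons}")
```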


"Evidence of particle stability for magnesium-40 obtained at NSCL is a major step in the field of rare isotope physics," said Hiro Sakurai, chief scientist at RIKEN in Japan, who was not involved in the research. The RIKEN research institute in Saitama, Japan, is home to the world's most powerful accelerator facility for creating radioisotope beams.


The fleeting appearance of these three nuclear newcomers is significant for several scientific and technical reasons.


First, when it comes to magnesium, the results indicate that the dripline extends at least as far as, and possibly beyond, magnesium-40. The isotope wasn't detected in several dripline-focused experiments conducted around the world since 1997, and the research community had begun to suspect that it was beyond the bounds of stability. Though it's difficult to compare across disciplines, the physicists' success in detecting three magnesium-40 nuclei in the course of an 11-day experiment is roughly similar to the achievement of biologists who finally snap an image of an elusive and thought-to-be-extinct animal after years of traipsing through the jungle.


"The discovery of the hitherto unknown heaviest magnesium and aluminum isotopes at NSCL is a milestone in rare isotope research and is a great accomplishment for the worldwide scientific community exploring unstable nuclei close to the so-called neutron dripline," said Horst Stocker, director of Gesellschaft fur Schwerionenforschung, GSI, who was not involved in the research. Darmstadt, Germany-based GSI is one of the world's top accelerator facilities for producing heavy-ion beams for research.


Second, aside from being a similarly interesting outlier, aluminum-42 carries added importance since it is a near-dripline nucleus with an odd number of neutrons. Isotopes of lighter elements that toe the edge of existence generally have even numbers of neutrons due to the fact that neutrons naturally pair up inside nuclei. With an even number of neutrons, the nuclei in effect have a tidy, complete set of such pairs that collectively form a sort of energetic scaffolding that increases stability.


According to one of the leading theoretical models, aluminum-42 shouldn't exist. That it does suggests that the dripline may in fact tilt in the direction of more novel, neutron-rich isotopes, an implication that will help to extend nuclear theory and point the way to future experiments.


The NSCL result "alters the landscape of known nuclei, it alters our understanding of the forces that bind nuclei into stable objects, and it has important implications for future attempts with next-generation facilities to map the evolution of nuclear structure and existence into the most weakly bound nuclei," said Rick Casten, D. Allan Bromley Professor of Physics at Yale University, also not involved in the research.


The experimental technique itself also is noteworthy. Creating and measuring rare isotopes is always needle-in-a-haystack work that requires researchers to hunt for a few desired nuclei from a swarm of fast-moving and mostly known and therefore less interesting particles. But in this experiment, NSCL researchers achieved a hundred- to thousand-fold boost in their ability to filter out what can be thought of as junk. They did so by essentially jury-rigging the facility to filter the beam twice. The result was an ability to detect and measure isotopes so rare that they represent less than one in every million billion particles that passed by the detectors.


The dual filtering process, more properly known as two-stage separation, is a fixture in most new and planned facilities for rare isotope beam research, including the proposed upgrade of NSCL. This experiment marks one of the first uses of two-stage separation in the world and the first time the technique has been tried at NSCL, which typically filters and purifies particles only once in its A1900 separator.


NSCL detectors returned just one blip of data consistent with the existence of aluminum-43. This generally isn't enough to count as a discovery, according to the conventions of nuclear science. However, more than 20 instances of its immediate neighbor, aluminum-42, were observed. Because of this relative abundance, and because, due to pairing, the 30 neutrons in aluminum-43 should prove more stable than the 29 neutrons in aluminum-42, the solitary signature of aluminum-43 etched in the data logs carries more credibility than usual.


"Experiments such as these are paving the way into the new era of nuclear structure studies that technological developments are opening to investigation for the first time ever," said Yale's Casten.


The findings appear in the October 25 issue of the journal Nature.





Bird Flu Finds Children's Lungs Faster


Pediatric tissue sample. The researchers found that a particular form of MAA (MAA1) displayed widespread binding throughout the respiratory tract, but was particularly good at binding to children's cells in the lower respiratory tract. (Credit: Image courtesy of BioMed Central)



New findings about how the virus binds to the respiratory tract and lung, reported in the journal Respiratory Research, suggest children may be particularly susceptible to avian influenza. The results also mean that previous receptor distribution studies may have to be re-evaluated.

John Nicholls and colleagues at the University of Hong Kong and Adelaide Women and Children's Hospital, in Australia, describe a modified technique to visualize the putative receptors for influenza viruses in the upper and lower respiratory tract, including the lung.


Sialic acid molecules on the cell surface act as chemical beacons for the influenza virus. Once the virus finds sialic acid, it can attach and infect the cell, although the precise distribution of sialic acid molecules affects how easily the virus can find host cells to infect.


The team has turned to lectins - molecules which bind sugars - to help them differentiate receptors for human and avian influenza viruses. The researchers used an improved staining technique to see how well two lectins, Sambucus nigra agglutinin (SNA) and Maackia amurensis agglutinin (MAA), bind to different forms of sialic acid on respiratory tract cells in healthy adults and children. SNA is particularly good at identifying the receptor for human influenza viruses while MAA identifies the receptor for avian viruses - including H5N1.


The researchers found that a particular form of MAA (MAA1) displayed widespread binding throughout the respiratory tract, but was particularly good at binding to children's cells in the lower respiratory tract as well as the upper respiratory tract of adults. Although this MAA1 binding is not unique for avian influenza receptors, these findings could explain how avian influenza infects children more readily than it does adults. This may explain previous findings from this group that avian H5N1 viruses can infect the human upper respiratory tract, even though these tissues were thought to lack receptors for these viruses.


"Understanding the how and why of avian virus infection of humans is a very complex process involving research into properties of H5N1 virus, the host receptor and the cellular response" said Dr John Nicholls. "We believe that the studies we have done investigating where the receptors are located and their distribution with age is a small step towards unravelling this process and help in finding ways to diminish the potential treat from this emerging infection."





New Room on Space Station




Astronauts Open New Room on Space Station


Astronauts aboard the International Space Station (ISS) have successfully opened a new room they installed on the orbiting laboratory Friday. The addition is part of U.S. space shuttle Discovery's complex two-week mission to boost the station's capability. VOA's Alex Villarreal reports from Washington.


Crews entered the Harmony module for the first time on Saturday after Station Commander Peggy Whitson and Discovery crewmember Paolo Nespoli opened the hatches.


The astronauts wore protective gear during the grand opening. They began setup of an air circulation system in the Italian-made compartment to make it safe for crewmembers to be inside.


Shuttle Flight Director Rick LaBrode said the day's activities went extremely well. He offered his praise to the crew and the new module.


"It's beautiful," he said. "It's bright, shiny. The report from the crew is that it's just as clean as can be. Perfect shape."


Astronauts attached the bus-sized module to a temporary location on the station during the mission's first spacewalk on Friday.


Harmony will provide docking ports for European and Japanese research laboratories to be installed on the station during upcoming missions. It will be moved to a permanent location after the shuttle departs.


LaBrode said other activities Saturday stabilized the onboard computer system. Both Discovery and the space station have had networking problems during the past few days.


"You've heard me report over pretty much the duration of the mission that we've been fighting on board network problems," he said. "Well, I'm cautiously optimistic that that I think we resolved these problems."


Time had also been set aside Saturday for a more focused inspection of Discovery's heat protection system. But mission managers canceled the plans after NASA engineers analyzed photos for damage and found nothing to warrant further checks.


Crews now have more time to prepare for Sunday's spacewalk, the second of the mission. During the spacewalk, astronauts will prepare a massive solar panel section to be moved by robotic arm to another part of the ISS.


They will also inspect a rotating joint on the station. The joint is needed to keep the station's solar wings turned toward the sun for power. It has experienced problems with increased friction for the past month and a half.


Discovery launched on Tuesday and is expected to return to Earth on November 6.







