
Friday, October 12, 2007

Weaves remote-control magic


Technology shows us something new every day, and sometimes it really does look like magic.


Microsoft Xwand weaves remote-control magic


Cinderella's fairy godmother has nothing on Microsoft's Xwand. Point this digital magic wand at a light, and the lights go on. Point it at the stereo and turn it to the right, and the volume goes up. "We're trying to get away from ... the universal remote where you have 300 buttons," says Andy Wilson, the Microsoft researcher who was key in developing the device.
The Xwand, a prototype, contains a set of motion sensors that tell a nearby computer whether it is pointing left or right, up or down.


The computer uses that information to adjust whatever device the Xwand is pointing at.


Microsoft is experimenting with adding voice-recognition features. To turn on the light, you'd point the Xwand at it and say, "On." That would keep the lights from flickering every time you moved the Xwand around.
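A rough sketch of how such pointing-plus-voice logic could work. Everything here is invented for illustration; the device table, angle tolerance, and function names are not from Microsoft's actual prototype:

```python
import math

# Hypothetical room map: device name -> (yaw, pitch) bearing in degrees
DEVICES = {"lamp": (30.0, 5.0), "stereo": (120.0, 0.0), "tv": (200.0, 10.0)}

def nearest_device(yaw, pitch, tolerance=15.0):
    """Return the device the wand is pointing at, or None if nothing is close."""
    best, best_err = None, tolerance
    for name, (dy, dp) in DEVICES.items():
        err = math.hypot(yaw - dy, pitch - dp)
        if err < best_err:
            best, best_err = name, err
    return best

def handle(yaw, pitch, spoken_word=None):
    """Only act when a spoken word confirms the gesture, so the
    lights don't flicker every time the wand waves around."""
    target = nearest_device(yaw, pitch)
    if target is None or spoken_word is None:
        return None
    return f"{target}: {spoken_word}"

print(handle(32.0, 4.0))        # pointing at the lamp but silent -> None
print(handle(32.0, 4.0, "on"))  # pointing at the lamp and saying "on"
```

The voice command acts as a gate: orientation selects the target, but no command is issued until speech confirms intent, which matches the flicker-avoidance idea described above.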


It will be years before the technology is ready for consumers, but when it is, Wilson says, he sees it being used in homes and offices.





A new private jet for Google's founders


Google is reportedly expanding the fleet of private jets that has privileges to take off from a small, Nasa-operated airfield near the company's headquarters in Mountain View, California.


As part of an agreement with the space agency, the company's top executives can already drive the four miles up the road to Moffett Field, which is generally closed to private aircraft, and board any of the three jets that they keep there.


Now Larry Page and Sergey Brin, Google's billionaire founders, are adding a fourth aircraft to the fleet which, for $1.3 million a year, they park at the airfield in return for assisting Nasa with its information-gathering activities.


The plane that will join the Boeing 767-200 and two Gulfstream Vs already in Google's hangar is, according to the New York Times, a Boeing 757, which is large by corporate jet standards.


Under an agreement between Nasa and H211 LLC, a company controlled by Google's senior executives, the search firm always intended to park four planes at the airfield, but the leasing arrangements of the 757 had not been agreed at the time details of the rest of the fleet emerged.


The new plane is due to begin flights in November, documents seen by the paper suggest.


A spokesman for Google would not comment on the new plane, and H211 LLC, which counts the Google chief executive, Eric Schmidt, among its principals, could not be reached.


"Our company's senior-most executives have entered into a notable public-private partnership with the space agency," Matt Furman, a Google spokesman, said. "As a result of that arrangement, NASA scientists now have access to aircraft for experiments they might not otherwise be able to perform."


He added that the fees paid - which were for landing rights and hangar access - helped to "significantly defray" the cost of running Moffett Airfield.


Google, whose offices are a seven-minute drive from the airfield, already has a broad research agreement with Nasa, and in August, the agency's scientists used one of the company's planes to observe a meteor shower.


Local residents expressed opposition to the idea that Nasa open up its runways as a means of helping pay for facilities. "The Google flights represent the possibility that the camel's nose is under the tent," one was quoted as saying when news of the agreement broke last month.


But Anna Eshoo, a Democratic representative whose district includes the airfield, told the New York Times: "You have to live with your neighbours. You are not out in the middle of the desert. You are in the heart of Silicon Valley."


From another commentary:


The fun thing about having lots of money is you can buy as many toys as you want. Over at Google it's a veritable playground.


A company controlled by Google co-founders Sergey Brin and Larry Page and CEO Eric Schmidt, with the strange name of H211 LLC, has an agreement to land four jets at Moffett Field, according to documents released to The New York Times after the paper filed a Freedom of Information Act request. Moffett Field, which is operated by NASA Ames Research Center, is very close to Google's Mountain View, Calif., headquarters.


So, in addition to the two Gulfstream Vs and a Boeing 767, the Google billionaires anticipate landing a Boeing 757 at the airfield starting sometime next month, the documents show.


Google and NASA have a public-private partnership that gives NASA scientists access to the planes and provides fees that help defray the costs of running Moffett, a Google spokesman told the Times. Oh, and the Googlers have bought carbon offsets to mitigate the Boeing 767's negative impact on the environment, he says.


The Google billionaires are paying $1.3 million annually for the Moffett rights.


Some may see the planes as a sign of indulgence. They have the community up in arms over the increase in flight traffic, and they have left other CEOs in the area jealous that they weren't the ones to score a Moffett deal.


Actually, maybe these aren't just toys to the Google guys. Maybe they are a reflection that the company has become so important it needs special privileges, and a way to evacuate large groups of people from the area quickly.








Nobel physics prize is a nanotechnology milestone, UCSB researcher says


"The Swedish Academy cited this year's physics prize as one of the first major applications of nanotechnology. This should remind people that everyday objects we use already incorporate sophisticated nanoscale devices," said W. Patrick McCray, a UCSB nanotechnology researcher.



Abstract:
This week's announcement of the 2007 Nobel Prize in Physics generated considerable interest for CNS researcher and UC Santa Barbara historian W. Patrick McCray. For the past two years, McCray and his colleagues Timothy Lenoir (Duke University) and Cyrus Mody (Rice University) have studied the history of nanoelectronics. The recent news from Stockholm helped demonstrate the relevance of their work for understanding the societal impact of nanotechnologies.



On October 8, the Royal Swedish Academy of Sciences awarded the 2007 Nobel to Albert Fert and Peter Grünberg for their discovery of giant magnetoresistance (GMR). GMR is the process whereby a small magnetic field can trigger a large change in electrical resistance. This discovery is at the heart of modern hard drive technology, and it has stimulated the manufacture of a new generation of electronics. The Nobel citation also noted that Fert and Grünberg's work heralded the advent of new and potentially more powerful forms of memory storage using "spintronics" in which information is stored and processed by manipulating the spins of electrons.
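The "large change in electrical resistance" has a standard quantitative form: the GMR ratio compares resistance with the magnetic layers antiparallel versus parallel. A worked example with made-up resistance values, not Fert and Grünberg's measured data:

```python
# GMR is conventionally quoted as the relative change in resistance
# between antiparallel (R_AP) and parallel (R_P) magnetizations:
#     GMR = (R_AP - R_P) / R_P
def gmr_ratio(r_antiparallel, r_parallel):
    return (r_antiparallel - r_parallel) / r_parallel

# Illustrative numbers: a 50% change, far larger than the few-percent
# ordinary magnetoresistance effects -- hence "giant".
print(gmr_ratio(1.5, 1.0))  # -> 0.5
```

In a hard drive read head, a tiny stray field from a magnetized bit flips the relative alignment of the layers, and this large resistance swing is what makes the bit easy to detect.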


For over two years, McCray and his colleagues have documented the emergence of spintronics research. Discovery of the GMR phenomenon, according to McCray, marked the beginning of the spintronics field. "Just as it is impossible to imagine life today without the transistor," said McCray, "spintronics and many other fields in nanotechnology are hard to predict, but they may have a major impact on our society and economy. The GMR phenomenon helped enable a major change in how we interact with technology and the possibilities afforded by it."


Most of the electronics industry is based on manipulating the charges of electrons moving through circuits. But the spin of electrons might also be exploited to gain new control over data storage and processing. Spintronics, an area of physics research in which UCSB is especially strong, is the general name for this branch of electronics. One area of nano-research that appears most exciting to scientists, commercial firms, and government patrons is the development of nanoelectronics which replace or complement traditional transistor technologies, explained McCray. "The potential economic and social effects of this transformation may be profound, and now the connection of a Nobel Prize to it might really increase its visibility for the public," McCray said.



Science Background


Nanotechnology is the manipulation of materials on a very small scale. One nanometer is one billionth of a meter. By comparison, DNA is two nanometers wide, a red blood cell is 10,000 nanometers wide, and a single strand of hair is 100,000 nanometers thick. Nanotechnology holds great potential in virtually every sector of the economy, including electronics, medicine, and energy.


About CNS-UCSB
The mission of the Center for Nanotechnology in Society (CNS) at the University of California, Santa Barbara is to serve as a national research and education center, a network hub among researchers and educators concerned with nanotechnologies' societal impacts, and a resource base for studying these impacts in the U.S. and abroad.


The CNS carries out innovative and interdisciplinary research in three key areas:


· the historical context of nanotechnologies;


· the institutional and industrial processes of technological innovation of nanotechnologies along with their global diffusion and comparative impacts; and


· the social risk perception and response to different applications of nanotechnologies.


The CNS is funded by an award from the National Science Foundation.



Contacts:
Valerie Walston
(805) 893-8850
W. Patrick McCray
(805) 893-2665


More
If "giant magnetoresistance" is not the first phrase most people think of when they think about their cool new portable music players, perhaps it should be. Without it, our wafer-thin iPods would be the size of Texas toast.


Giant magnetoresistance, or GMR for short, is the technology that has allowed laptops to shrink and storage bytes to boom. It enables computers to stuff more than a trillion bits of data on a storage cell the size of a fingernail, or, in terms of songs, all the music you've ever listened to in your life on a player no bigger than a keychain.


While GMR has been a driving technology behind our modern digital age, it has done so quietly. Until now, relatively few people outside of engineering circles had ever heard of it.


That may have ended today, however, when the Royal Swedish Academy of Sciences announced it will award the 2007 Nobel Prize in Physics jointly to Albert Fert of the Université Paris-Sud in France, and Peter Grünberg of Forschungszentrum Jülich, Germany for their early GMR work. In awarding this particular achievement, the academy marks the beginning of a new epoch as this is the first Nobel prize for a true form of "nanotechnology," which promises to revolutionize many areas of science and modern life.





Malaysia's first astronaut is orbiting the Earth after months of training and a successful launch from Kazakhstan.


A Russian Soyuz spacecraft carrying Malaysia's first astronaut, a U.S. astronaut and a Russian cosmonaut blasted off to rendezvous with the International Space Station on Wednesday.


Thousands of Malaysians watched the blast-off live on television as the TMA-11 rocket carrying Sheikh Muszaphar Shukor, an orthopaedic surgeon and university lecturer from Kuala Lumpur, lifted off from Russia's Baikonur Cosmodrome in the Kazakh steppe.


Sheikh Muszaphar Shukor is accompanying American Astronaut Peggy Whitson and Russian Cosmonaut Yury Malenchenko on a 12-day mission to the International Space Station. Chad Bouchard reports from Bangkok.


A Russian Soyuz TMA-11 spacecraft carrying Malaysia's first space traveler lifted off from Kazakhstan's Baikonur space center Wednesday night.


The spacecraft is scheduled to dock with the International Space Station on Friday.











Malaysian astronaut Sheikh Muszaphar Shukor gives the thumbs-up sign during a training session in Star City outside Moscow, 18 Sep 2007

Sheikh Muszaphar Shukor, a Malaysian orthopedic surgeon, is due to research the effects of micro-gravity and space radiation on cells, and conduct experiments on proteins in an effort to develop an HIV vaccine.

In an interview with VOA from Kazakhstan, Malaysia's Science, Technology and Innovations Minister, Jamaluddin Jarjis, said he hopes the mission will inspire a new generation of Malaysian scientists.


"Putting our man, our Malaysian man in space, is basically - we want to raise the bar for Malaysia in terms of acquiring knowledge for the future, especially the young ones, the five million kids in school," he said. "And also we are quite proud, because in conjunction with our 50th anniversary of the nation, that we are positioning ourselves as part the - connected to the world."


Muszaphar is a member of Malaysia's Malay ethnic group, and much advance study and debate went into deciding how he would honor his Muslim religious duties while in space.


The Muslim requirement to face in the direction of Mecca during daily prayers, for example, is a challenging prospect while orbiting hundreds of miles above the earth in a weightless environment. An imaginary line from Mecca into space was drawn, and it was decided that Muszaphar would face that line at the start of his prayers, and continue facing the same direction throughout the flight.


He also pledged to follow religious practice during the last days of the Muslim holy month of Ramadan, which coincide with the beginning of the mission.


Malaysian clerics exempted Muszaphar from fasting while in space, but he says he will observe the fasts anyway.


The country's Ministry of Religion has written the world's first handbook for Muslim astronauts to sort out that and other religious issues.



Malaysia paid Russia $25 million to allow Muszaphar's participation, part of a $900 million package linked to Malaysia's purchase of 18 Russian fighter jets.


The 35-year-old surgeon is scheduled to return to Earth October 21, while his two companions remain behind in the space station.






IBM and Second Life creator Linden Lab are teaming up in an effort to create avatars that can jump from one virtual world to the next.


Under the plan, the companies will work together to develop standards that, if broadly employed, would allow online Web users to move their digital personas seamlessly across virtual environments like Second Life and other 3-D worlds.


To kick-start the effort, Linden Lab has launched an open forum called the Architecture Working Group -- where some of the more tech-savvy Second Life citizens can help create a roadmap for the project. Other virtual worlds that Second Lifers could potentially connect to include Dreamville, There, and The Sims Online.


IBM says universal standards will help drive the use of virtual worlds beyond gaming and entertainment and make them more practical for businesses. Among other things, the company envisions online malls where avatars can stroll around, chat with "sales avatars", view product demonstrations, and make purchases.


"We see users demanding more from these environments and desiring virtual worlds that are fit for business," said Colin Parris, IBM's VP for digital convergence, in a statement.


IBM has already embraced Second Life for internal use. The company has held staff meetings and other employee events in the virtual world, and CEO Sam Palmisano has his own, persistent avatar. The company also recently issued a list of employee conduct rules for Second Life.


However, some IBM employees are using Second Life in ways the company likely didn't intend. A group of disaffected workers at IBM's Italian operations recently held a virtual demonstration against IBM in Second Life to protest wages and working conditions.






Saturn's Moon Titan


New near-infrared images from Hawaii's W. M. Keck Observatory and Chile's Very Large Telescope show for the first time a nearly global cloud cover at high elevations and, dreary as it may seem, a widespread and persistent morning drizzle of methane over the western foothills of Titan's major continent, Xanadu.


In most of the Keck and VLT images, liquid methane clouds and drizzle appear at the morning edge of Titan, the arc of the moon that has just rotated into the light of the sun.


"Titan's topography could be causing this drizzle," said Imke de Pater, UC Berkeley professor of astronomy. "The rain could be caused by processes similar to those on Earth: Moisture laden clouds pushed upslope by winds condense to form a coastal rain."


Lead author Mate Adamkovics, a UC Berkeley research astronomer, noted that only areas near Xanadu exhibited morning drizzle, and not always in the same spot. Depending on conditions, the drizzle could hit the ground or turn into a ground mist. The drizzle or mist seems to dissipate after about 10:30 a.m. local time, which, because Titan takes 16 Earth days to rotate once, is about three Earth days after sunrise.
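The local-time arithmetic in that last sentence is simple proportion. A sketch, assuming sunrise falls at 6:00 local time (an assumption made only for this illustration):

```python
TITAN_DAY_IN_EARTH_DAYS = 16.0  # Titan rotates once every ~16 Earth days

def earth_days_after_sunrise(local_hour, sunrise_hour=6.0):
    """Convert a Titan local clock time to elapsed Earth days since sunrise."""
    fraction_of_local_day = (local_hour - sunrise_hour) / 24.0
    return fraction_of_local_day * TITAN_DAY_IN_EARTH_DAYS

# 10:30 a.m. local time is 4.5/24 of a Titan day after sunrise:
print(earth_days_after_sunrise(10.5))  # -> 3.0
```

So a drizzle that burns off by mid-morning, Titan-style, has in fact persisted for about three Earth days.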


"Maybe only Xanadu has misty mornings," he said.


Adamkovics, de Pater and their colleagues in UC Berkeley's Center for Integrative Planetary Studies report their observation in the Oct. 11 issue of Science Express, an online version of the journal Science.


Titan, larger than the planet Mercury, is the only moon in the solar system with a thick atmosphere, which is composed mostly of nitrogen and resembles Earth's early atmosphere. Previous observations have shown that the entire moon is swathed in a hydrocarbon haze extending as high as 500 kilometers, becoming thinner with height. The south pole area exhibits more haze than elsewhere, with a hood of haze at an altitude between 30 and 50 kilometers.


Because of its extremely cold surface temperature - minus 183 degrees Celsius (-297 degrees Fahrenheit) - trace chemicals such as methane and ethane, which are explosive gases on Earth, exist as liquids or solids on Titan. Some level features on the surface near the poles are thought to be lakes of liquid hydrocarbon analogous to Earth's watery oceans, and presumably these lakes are filled by methane precipitation. Until now, however, no rain had been observed directly.


"Widespread and persistent drizzle may be the dominant mechanism for returning methane to the surface from the atmosphere and closing the methane cycle," analogous to Earth's water cycle, the authors wrote.


Actual clouds on Titan were first imaged in 2001 by de Pater's group and colleagues at Caltech using the Keck II telescope with adaptive optics and confirmed what had been inferred from spectra of Titan's atmosphere. These frozen methane clouds hovered at an elevation of about 30 kilometers around Titan's south pole.


Since then, isolated ethane clouds have been observed at the north pole by NASA's Cassini spacecraft, while both Cassini and Keck photographed methane clouds scattered at mid-southern latitudes. And in 2005, the Huygens probe, built by the European Space Agency and released by Cassini, plummeted through Titan's atmosphere, collecting data on methane relative humidity.



These data provided evidence for frozen methane clouds between 25 and 30 kilometers in elevation and liquid methane clouds - with possible drizzle - between 15 and 25 kilometers high. The extent of the clouds detected in the descent area was unclear, however, because "a single weather station like Huygens cannot characterize the meteorology on a planet-wide scale," said UC Berkeley research astronomer Michael H. Wong.


The new images show clearly a widespread cloud cover of frozen methane at a height of 25 to 35 kilometers - "a new type of cloud, a big global cloud of methane," Adamkovics said - that is consistent with Huygens' measurements, plus liquid methane clouds in the tropopause below 20 kilometers with rain at lower elevations.


Because earlier observers thought that the methane droplets in these clouds were too sparse to be seen, they referred to the frozen and liquid methane clouds as "sub-visible."


"The stratiform clouds we see are like cirrus clouds on Earth," Adamkovics said. "One difference is that the methane droplets are predicted to be at least millimeter-sized on Titan, as opposed to micron-sized in terrestrial clouds - a thousand times smaller. Since the clouds have about the same moisture content as Earth's clouds, this means the droplets on Titan are much more spread out and have a lower density in the atmosphere, which makes the clouds 'subvisible' and thus hard to detect."


If all the moisture were squeezed out of Titan's clouds, it would amount to about one and a half centimeters (six-tenths of an inch) of liquid methane spread around Titan's surface, Adamkovics said. This is about the same moisture content as some of Earth's clouds.


Since 1996, de Pater and colleagues have been using infrared detectors on the Keck telescopes to regularly monitor clouds and hazes on Titan. In past years, they have also used the VLT. The advantage of observing at infrared wavelengths is that Titan's haze is relatively transparent. At optical wavelengths, these haze layers form an impenetrable layer of photochemical smog.


By observing at different infrared wavelengths, scientists can probe different altitudes in Titan's atmosphere, depending on the strength of the methane absorption at that wavelength. Then, by using the methane absorption profile, they can pinpoint particular altitudes in Titan's atmosphere, allowing astronomers to see the surface and judge the altitude of methane clouds. Adamkovics first saw evidence of widespread, cirrus-like clouds and methane drizzle when analyzing Feb. 28, 2005, data from a new instrument on the European Southern Observatory's VLT - the Spectrograph for INtegral Field Observations in the Near Infrared (SINFONI).


Sharper images and spectra taken on April 17, 2006, by the OH-Suppressing Infra-Red Imaging Spectrograph (OSIRIS) on Keck II confirmed the clouds. Both instruments measure spectra of light at many points in an image rather than averaging across the entire image. By subtracting light reflected from the surface from the light reflected by the clouds, the researchers were able to obtain images of the clouds covering the entire moon.
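The subtraction step can be pictured with a toy one-dimensional "image." This is only a schematic of the idea, not the actual OSIRIS or SINFONI reduction pipeline; the brightness values and the scaling factor are invented:

```python
# Toy 1-D "image": brightness at each pixel in two wavelength bands.
# In a surface-sensitive band the haze is transparent down to the ground;
# in a cloud-sensitive band light reflects off clouds as well.
surface_band = [1.0, 1.0, 1.0, 1.0, 1.0]
cloud_band   = [0.5, 0.5, 0.9, 0.5, 0.5]  # pixel 2 hides a cloud

def cloud_map(cloud_band, surface_band, surface_scale=0.5):
    """Subtract the (scaled) surface contribution, leaving only clouds.
    surface_scale is an invented calibration factor."""
    return [c - surface_scale * s for c, s in zip(cloud_band, surface_band)]

print(cloud_map(cloud_band, surface_band))
# cloud-free pixels subtract to ~0, and only the cloudy pixel stands out
```

The real instruments do this per-wavelength across a full spectral cube, but the principle is the same: remove the surface-reflected component so the faint cloud signal becomes visible.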


"Once we saw this in both data sets, we altered our radiative transfer models for Titan and recognized that the only way to explain the data was if there was liquid or solid methane in the atmosphere," Adamkovics said. "This is a big step in helping us understand the extent to which solid clouds and liquids are spread throughout Titan's atmosphere."


UC Berkeley graduate student Conor Laver is the fourth author on the Science Express paper. The work was supported by the National Science Foundation and Technology Center for Adaptive Optics (CfAO), NASA and the Center for Integrative Planetary Science (CIPS) at UC Berkeley.






Boeing: Delaying Delivery of Its 787


The Boeing Company's announcement yesterday that it would delay initial deliveries of the 787 Dreamliner by six months is a blow to a program that had been seen as the most successful in commercial aviation - a seemingly perfect blend of new technology, marketing and production line innovations.



Photo: Boeing workers at the 787's debut in Everett, Wash., in July. (Robert Sorbo/Reuters)






Yet, as the company's stock fell below $100 a share yesterday, Boeing officials remained confident that the program is on track for the long haul. Analysts, too, saw the delay as a temporary setback, not of the same magnitude as the problems that its rival, Airbus, has experienced in producing its superjumbo A380, which has fallen behind schedule by two years.


Boeing's delivery delay was caused by the difficulties its global chain of suppliers has had in completing their work, as well as by unanticipated problems with the plane's flight-control software.


This delivery delay comes after Boeing announced last month a three-month delay in the plane's flight-test program caused, in part, by a worldwide shortage of fasteners that hold together the plane's fuselage, wing and tail sections.


"We are very disappointed over the schedule changes that we are announcing today," said W. James McNerney Jr., Boeing's chief executive. "Notwithstanding the challenges that we are experiencing in bringing forward this game-changing product, we remain confident in the design of the 787, and in the fundamental innovation and technologies that underpin it."


With 710 orders worth $100 billion from 50 airlines, the Dreamliner has been the fastest-selling commercial aircraft in history. It is also one of the most innovative. It is being made, in pieces, all over the world, with only the final assembly taking place at Boeing's plant in Everett, Wash.


Its fuselage will be the first to make extensive use of composite materials rather than traditional aluminum. It will use new energy-efficient engines, and its interior cabin is being designed to provide more humidity and bigger windows for passenger comfort.


The first delivery of the planes, to All Nippon Airways of Japan, is now scheduled for late November or early December 2008, rather than the original date of May. The first test flight will take place at the end of March 2008, rather than at the end of this year, Boeing said.


While Boeing said the delays would not lower the company's earnings for this year or for 2008, the announcement was clearly a setback in the image of a program that had appeared to be nearly flawless in its execution. It also showed that Boeing, which had the program on a highly ambitious schedule, might have been overly optimistic about what it could deliver - and when.


"Annoyance is the first word that comes to mind," said Howard Rubel, an analyst with Jefferies & Company. "It's annoying because they have done so many good things to get this program right. But this proves that the program is a little more complicated than they expected."


In a conference call with analysts, Mr. McNerney said that Boeing anticipated producing 109 Dreamliners through the end of 2009, three fewer than initially planned. When pressed by analysts over whether this new delivery schedule, in light of the supply chain problems, was still realistic, Mr. McNerney maintained that it was.


"Recognizing that there is risk with any new airplane program," Mr. McNerney said, "we still remain confident that this new plan is achievable and we are all aligned to make it happen."


Boeing's shares fell $2.77, or 2.7 percent, to $98.68.


Scott Carson, head of Boeing Commercial Aviation, said yesterday it was too early to determine what penalties Boeing might have to pay to customers as a result of the delay.


"We have taken this into account in our financial models," said Mr. Carson. "In some cases, our customers say, 'We will work with you.' Some will insist on some form of compensation."


One of the big American customers for the 787 is Northwest Airlines, which has orders for 18 Dreamliners and had expected to get its first planes in August 2008. The Dreamliner is central to Northwest's plans to expand its international routes by more than 4 percent a year through 2010.


But Ben Hirst, a Northwest spokesman, said a six-month delay would probably not hurt its plans. "A longer delay, obviously, we would have to recalibrate," he said.


Continental Airlines has 25 787s on order. The first one was expected in 2009 with deliveries continuing through 2013, said Dave Messing, a spokesman. "It's too early to tell what impact, if any, Boeing's announced 787 program delay will have on Continental," he added. Continental has not determined whether the 787s will replace existing planes or add to its fleet, Mr. Messing said.


Cai von Rumohr, an analyst with Cowen & Company, said that the delay gave Boeing that "chance to do it right" and added that it should not hurt Boeing in its competition with Airbus, which suffered after announcing a six-month delay in its A380 superjumbo jet in June 2006. It has also had problems in producing the A350. The A350, a midsize, wide-bodied plane, would compete directly with the 787 but is not expected to be available until 2013.


"People got over the setbacks in the A380," Mr. von Rumohr said. "They are not going to go to Airbus, whose plane is not going to be ready until 2013."


Production of the 787 is being spread to suppliers across the globe in an effort to cut costs and spread the financial risk involved in the program. Yet even with this new and far-flung production - as well as the new composite body - Boeing had planned for an ambitious test-flight program for the Dreamliner, scheduling just nine months from first flight until first delivery, two months less than the tests on its newest commercial plane, the 777.


But Boeing maintained that the delay would provide the company with the breathing room to work out its supply chain problems and get the program back on schedule.


"The reason we think we will meet the new timetable is the detailed bottoms-up planning that we have done to assure that we can make it," Mr. McNerney said.


Source: http://www.nytimes.com





Genomic Alterations 2.0: How to make the most of the evolving technology for detecting copy number variants.


Between missing chunks of chromosomes and single nucleotide polymorphisms (SNPs) lies a vast middle ground of genomic alterations. Among these are copy-number variations (CNVs) - the differences between individuals in the number of copies of a genomic region. "The total nucleotide content that is encompassed by CNVs most certainly exceeds that of SNPs," says Stephen Scherer of The Hospital for Sick Children in Toronto.


The recent surge of interest in CNVs has induced a proliferation of technologies designed to detect them in normal DNA, congenital diseases, and cancer cells, in which copy-number changes may induce their unruly divisions.


Scientists have largely turned to comparative genomic hybridization (CGH) arrays, which involve hybridizing two genomes - one as a reference and one to be tested - that fluoresce in different colors. Measuring the color that dominates at a given region determines whether the test genome contains an insertion or deletion at that site. Bacterial artificial chromosome (BAC) arrays allow researchers to mine the entire genome for CNVs. Two companies, NimbleGen Systems and Agilent Technologies, now make oligonucleotide probes specifically for CNV detection. Alternatively, genotyping arrays, originally designed to identify SNPs, have been modified by users and by companies - mainly Affymetrix and Illumina - to uncover CNVs.
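At bottom, the per-probe call on a two-color CGH array is a ratio test. A minimal sketch of that logic, with a conventional but arbitrary log2-ratio cutoff (real analysis pipelines fit thresholds per experiment and smooth across neighboring probes):

```python
import math

def call_cnv(test_intensity, ref_intensity, threshold=0.3):
    """Classify one probe from a two-color CGH array.
    log2(test/ref) > threshold  -> copy-number gain in the test genome
    log2(test/ref) < -threshold -> loss; otherwise no change.
    The 0.3 cutoff is illustrative, not a standard."""
    ratio = math.log2(test_intensity / ref_intensity)
    if ratio > threshold:
        return "gain"
    if ratio < -threshold:
        return "loss"
    return "neutral"

print(call_cnv(3.0, 2.0))   # log2(1.5) ~= 0.58 -> gain
print(call_cnv(1.0, 2.0))   # log2(0.5) = -1.0  -> loss
print(call_cnv(2.0, 2.0))   # log2(1.0) = 0     -> neutral
```

The log scale makes gains and losses symmetric around zero, which is why CGH results are conventionally reported as log2 ratios rather than raw intensity ratios.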


Researchers are systematically comparing the pros and cons of different platforms, says Charles Lee of Harvard Medical School in Boston, but so far, "there's no consensus on which is the best platform to use." SNP arrays are usually more cost-effective for genome-wide association studies that match known CNVs with gene expression or a particular disease, while oligonucleotide arrays are a better choice for detecting novel variants.


According to Evan Eichler of the University of Washington in Seattle, "the problem is that current prices are all too high." Regardless of platform, one array runs from $300 to $1,000, he says, but "we need prices at about $100 to $150 to screen a large number of samples genome-wide" in order to detect rare variants. Cost is rapidly becoming less of a barrier, Scherer says, with prices of all the major platforms, especially SNP arrays, dropping substantially in the past year.


Since the technology is still relatively new, key kinks remain to be worked out. The Scientist tracked down five researchers who found the fixes to their CNV detection woes.


BAC IN TIME

User: Frank Speleman, Ghent University Hospital, Belgium


The project: Using BAC arrays left over from the Human Genome Project to screen cell lines for CNVs that associate with disease, including neuroblastoma and Hodgkin lymphoma.


The problem: BAC arrays are time-intensive to create, and the arrays themselves are not completely reproducible. Also, they might miss copy-number differences smaller than 50-100 kb.


The solution: Oligonucleotide and SNP arrays offer much quicker answers to copy-number questions, Speleman says. He expects that BAC arrays will die out in the near future, but his group continues to use them, mainly because of the time they've already invested in creating them. Moreover, since their genome coverage is comprehensive, BACs also offer more robust data than other arrays.




ANEUPLOID PROBLEMS

User: Ken Lo, Roswell Park Cancer Institute, Buffalo, New York


The project: Looking for CNVs in medulloblastoma and glioblastoma cell lines.


The problem: Tumor samples have two characteristics that make copy-number analyses difficult: abnormal numbers of chromosomes, and a heterogeneous cell composition.


The solution: Lo's group uses both BAC arrays and Affymetrix SNP arrays, and picks through their data to manually correct suspected errors. For example, a detected single-copy gain might actually be a loss in a tetraploid cell or a gain in a diploid cell. They then look for a loss of heterozygosity (the loss of one parental allele) to decide which is the case.


Current platforms "are all designed based on the premise that the natural ploidy state of your DNA sample is two," Lo says. What's more, he adds, "sometimes, you can't tell whether [a data problem] is a ploidy issue or a tumor heterogeneity issue." Tumors are often mixtures of many types of cells, and copy-number changes occur differently in each. Since the cells are merged before DNA extraction, the results reflect an average across different cell types. His group is collaborating with Yuhang Wang, a computer scientist at Southern Methodist University in Dallas, who is developing algorithms to control for these issues. For now, says Lo, "you have to really, really think about the results."
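The manual correction Lo describes amounts to a small decision rule. A hedged sketch of that logic (the function name and labels are hypothetical, not from his group's pipeline):

```python
def interpret_gain_call(loh_present):
    """Disambiguate an apparent single-copy gain in a tumor sample.

    Per the rule described in the text: loss of heterozygosity
    (one parental allele missing) points to a loss against a
    tetraploid baseline of four copies; retained heterozygosity
    points to a true gain against a diploid baseline of two.
    """
    if loh_present:
        return "loss in tetraploid"
    return "gain in diploid"

print(interpret_gain_call(True))
print(interpret_gain_call(False))
```

This only covers the single ambiguity the text mentions; mixed cell populations, which average several copy-number states together, need algorithmic deconvolution of the kind Wang is developing.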




OLD DOG, NEW TRICKS

User: George Zogopoulos, University of Toronto


The project: Genome-wide scans to detect CNVs in the general population and in patients with gastrointestinal cancer.


The problem: Genotyping platforms like the Affymetrix array that Zogopoulos uses can generate noisy data and don't cover the whole genome, particularly regions rich in repetitive sequences that are likely to contain CNVs.


The solution: "Given that [SNP arrays] weren't primarily designed for this, it's important to validate using a second laboratory approach," Zogopoulos says. He and his colleagues confirm their results with quantitative PCR. The sensitivity of PCR is often just at the level needed to detect copy-number changes, so a high number of replicates is run to generate statistical power, he says.
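The replicate strategy can be sketched as averaging the test-to-reference quantity ratio over many qPCR replicates and flagging a change only when the ratio sits well away from the diploid expectation of 1.0. The replicate numbers and the two-standard-error rule below are illustrative assumptions, not values from the study:

```python
from math import sqrt
from statistics import mean, stdev

def copy_number_ratio(test_reps, ref_reps):
    """Average test/reference quantity ratio across qPCR replicates.

    Returns (mean ratio, standard error of the mean). A ratio near
    1.0 suggests two copies; near 1.5, a single-copy gain; near 0.5,
    a single-copy loss.
    """
    ratios = [t / r for t, r in zip(test_reps, ref_reps)]
    return mean(ratios), stdev(ratios) / sqrt(len(ratios))

# Illustrative replicate quantities for a suspected single-copy loss
test = [0.52, 0.48, 0.55, 0.47, 0.50, 0.51]
ref = [1.00, 1.02, 0.98, 1.01, 0.99, 1.00]
r, se = copy_number_ratio(test, ref)

# Call a change only if the ratio is more than 2 standard errors from 1.0
changed = abs(r - 1.0) > 2 * se
print(round(r, 2), changed)
```

More replicates shrink the standard error, which is why the fine half-copy differences qPCR must resolve demand so many repeats.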


Despite their drawbacks, Zogopoulos stuck with SNP-based arrays for their sensitivity to very fine-scale copy-number changes and their ability to detect both SNPs and CNVs in a single assay. "We had generated the data for a different project, and we took advantage of the wealth of genetic data and reanalyzed it for copy-number variation."




MULTIPLEX FOR CONTROL

User: Matthew Hurles, Genome Dynamics and Evolution Group, The Wellcome Trust Sanger Institute, Cambridge, UK


The project: Screening genomic samples from thousands of individuals to look for an association with common conditions such as diabetes, rheumatoid arthritis, and hypertension.


The problem: Results from such a large sample are often plagued by what Hurles terms "batch effects." Quality between sets of extracted DNA can vary, or discrepancies in DNA processing might arise simply because different people run the samples at different times. "Systematic differences can really screw up your association studies," Hurles says.


The solution: Most manufacturers offer "multiplex" arrays, which contain fewer probes but multiple sets of the same probes, Hurles says. "They're not going to give you whole-genome coverage, but they're targeted towards the CNVs that you already know exist." One of the best ways to control for batch issues is to run control genomic samples on each subarray in the multiplex setup, he says. "That kind of approach minimizes any systematic differences that you might get between cases and controls. It doesn't get rid of the effect, but it means it has less of an impact on your association testing." As for quality differences in your original DNA samples, Hurles says, "ideally you control that by not having it in the first place."
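Running the same control genomic sample on every subarray lets you rescale each batch against it. A minimal sketch of that normalization, with made-up batch names and intensities:

```python
def normalize_batches(batches, control_key="control"):
    """Rescale each batch so its shared control sample averages 1.0.

    batches maps batch name -> {sample name -> list of probe
    intensities}. Dividing every sample by its batch's own control
    removes systematic batch-to-batch offsets; as Hurles notes, it
    does not eliminate the effect, but it reduces its impact on
    association testing.
    """
    normalized = {}
    for name, samples in batches.items():
        control = samples[control_key]
        scale = sum(control) / len(control)
        normalized[name] = {
            s: [x / scale for x in vals] for s, vals in samples.items()
        }
    return normalized

# Two batches with a 2x systematic intensity difference
batches = {
    "batch1": {"control": [1.0, 1.0], "case_A": [1.5, 0.5]},
    "batch2": {"control": [2.0, 2.0], "case_B": [3.0, 1.0]},
}
norm = normalize_batches(batches)
print(norm["batch2"]["case_B"])
```

After scaling, case_B in the brighter batch becomes directly comparable to case_A, so a case-control comparison no longer picks up the batch offset as a spurious association.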




Technorati :

Microsoft's mobile team reorgs, focuses on Live services: Microsoft and Nokia


The mobile unit recently reorg'ed in a way that gives the marketing of Microsoft's Live services more visibility.


The Mobile Communications business - under Senior Vice President Pieter Knook - has "realigned," in Microsoft's words. The result: "Two separate discipline teams get rolled into one," a corporate spokeswoman acknowledged, while, at the same time, "two robust product marketing organizations" get created. These marketing orgs are "Platform Planning and Product Management" and "Mobile Services Planning and Product Management."


The Mobile Services planning team is "responsible for planning and marketing of Windows Live and new mobile service offerings delivered with and through Mobile Operators on Windows Mobile and other converged device phone platforms," the spokeswoman added.


As a result of the reorg:


John O'Rourke will lead the new Platform Planning and Product Management team.
Bart Wojciehowski will lead the Mobile Services Planning and Product Management organization.
Tony Mestres will continue leading the Worldwide Partner and Segment Engagement team.
Scott Horn will continue to lead the Mobile Communications Business' Campaigns and Communications group.
All of these execs continue to report to Knook.


Mobile communication falls under Microsoft's Entertainment and Devices (MED) division. Microsoft considers mobile as a "major component of the Connected Entertainment vision" for MED.


Microsoft has been stepping up its campaign to get mobile phone makers to add support for Windows Live services and the Windows Live suite. In August, it struck such a deal with Nokia for S60 and S40 devices.


source: http://blogs.zdnet.com/microsoft/?p=666






2007 MIT Innovations in Management Conference: Creating and Maintaining a Sustainable Business Strategy


A letter for you:


The 2007 MIT Innovations in Management Conference is scheduled for December 5-6, 2007. The annual conference is offered by MIT's Industrial Liaison Program (ILP) and co-sponsored by MIT Sloan. This year's theme is "Creating and Maintaining a Sustainable Business Strategy."
Introducing MIT's new Sustainable Business Lab (S-Lab), the conference will feature research outcomes, application concepts, emerging trends, best practices, and enabling technologies that address issues related to encouraging, planning, and measuring sustainability.



For more information on the conference, you can download the brochure or visit the ILP website.



We worked with the ILP to schedule the conference around the dates of some of our executive programs, allowing participants to attend both the conference and a program in one trip. The programs offered near the conference dates are:



We hope to see you in December.



Best regards,



Diana V. García-Martínez


Director, Open Enrollment Programs


Wong Auditorium, Tang Center


From climate change and deforestation to accelerating rates of social inequality and environmental degradation, our current business models are consuming natural and social capital at unsustainable rates. How can business respond? Can sustainability and profit be aligned? How can investments in sustainable products and practices create new markets and build profitable, healthy enterprises?


Developed in conjunction with senior faculty of the MIT Sloan School of Management, this program will seek to answer the following questions:

• What kinds of businesses can be built around making products and services that address the problems of sustainability?
• How can existing profit-driven companies place sustainability at the heart of what they do? What evidence do we have that such practices work?
• How does one value the long-term and intangible costs and benefits?
• What organizational and industry structures can be created to help support and diffuse more sustainable business practices (e.g., industry consortia, public-private partnerships, etc.)?
• How can business take advantage of new scientific and engineering developments to achieve sustainability targets and create new business opportunities in sustainability-related markets?
• How can you successfully manage the change process required to implement best practices around sustainability?


Registration Fees:

Full Registration Fee: $1,750
ILP Members*: complimentary


*A $50 processing fee will apply to each complimentary registrant who fails to attend without canceling in advance. Cancellations must be received in writing via email: register@ilp.mit.edu no later than November 28, 2007.


Registration Payment Methods:

Please make checks payable to MIT.
VISA, MasterCard, American Express and Discover accepted. (Payable in US dollars only.)

Cancellation Policy:

Cancellations received in writing via email to register@ilp.mit.edu on or before November 28, 2007 are entitled to a full refund less a $50 processing fee. No refunds will be made after November 28, 2007. Substitutions may also be made in writing by November 28, 2007; any substitutions after that date will be made at the Conference On-Site Registration Desk.

Accommodations:

A block of rooms has been reserved at the Hyatt Regency Cambridge hotel near the MIT campus. Please call for reservations directly at 1-800-223-1234 or 617-492-1234. Rooms are assigned on a first-come, first-served basis, and reservations must be made no later than November 4, 2007. Please mention the MIT ILP Innovations in Management Conference to receive the $169 room rate, plus local tax, single or double occupancy.





A colossal project called XFEL, located in Germany, will allow the sciences to gain understanding of solar cells and fuel cells, and to watch how atoms and molecules combine.


The X-ray laser project XFEL holds new possibilities in experimental research. The XFEL project will enable researchers to film chemical reactions, map the atomic details of molecules, and capture the 3D images of the nanocosmos. The acronym XFEL stands for, "X-ray free-electron laser."


An easy description of what the XFEL does: it accelerates electrons to high energies and then makes them emit high-intensity X-ray laser flashes.


The X-ray laser is a European endeavor connected to the DESY research center and is currently in the planning stages. Construction at the DESY center in Hamburg, Germany is set to begin in early 2008 and is expected to be completed by 2013, according to an XFEL news release.


The plan involves a connection between the DESY site in Hamburg-Bahrenfeld and the city of Schenefeld. Most of the facility will be housed underground, but portions will be visible above ground. Construction of the project will cost 986 million euros. The XFEL project comprises three sites in all: DESY Bahrenfeld, Osdorfer Born, and Schenefeld.


The XFEL accelerator tunnel will begin at DESY Bahrenfeld, where the electrons will be prepared for acceleration. The site will also hold the access shafts and halls used to construct and install the components required in the tunnels.
The main accelerator will end at Osdorfer Born, where the electron bunches will be separated and distributed to the various tunnels for generating light. The Schenefeld facility is where the experiments with the X-ray laser will take place. It will be home to over 350 scientists, from Germany and the international community, chosen for their interests and expertise in the area.


The range of applications for this technology includes improving scientific knowledge of the processes taking place in fuel cells and solar cells. Scientists will also be able to watch biomolecules at work and observe in detail how atoms and molecules combine to make materials.


The science can be broadly applied to chemistry, biology, materials science, and physics. The possibilities for using XFEL technology in cross-disciplinary experimentation can only be imagined at this point. The unique feature of the XFEL X-ray laser is its measured flash exposure time of a quadrillionth of a second. This infinitesimal exposure time ensures the photograph or image will not be blurred.


Source: http://www.physorg.com/news111337354.html





Research: Functional divergence of former alleles may explain an asexual organism's evolutionary success


Asexual organisms typically go extinct within one million years because a lack of genetic exchange doesn't allow for the removal of deleterious mutations or the sharing of advantageous ones. But a class of aquatic invertebrates called bdelloid rotifers has persisted for 35 to 40 million years, earning the term "ancient asexuals."


The divergence of alleles into separate genes with different but advantageous functions could explain the puzzling evolutionary success of certain asexual organisms, researchers report today in Science.


"This could point the way, in part, as to why bdelloids are so successful," David Mark Welch of the Marine Biological Laboratory in Woods Hole, Mass., told The Scientist.


Alan Tunnacliffe at the University of Cambridge and his colleagues examined genes associated with surviving dry spells, or desiccation tolerance, and found two copies of lea genes, which are known to preserve enzymes during desiccation in multiple organisms. Their sequences differed by about 13 percent, which is greater than allelic differences in sexual animals. The researchers also localized the genes to different chromosomes, as would be expected of alleles of the same gene, and therefore also of former alleles.
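The 13 percent figure is a per-site divergence over aligned sequences. A minimal sketch of the calculation, using toy sequences rather than real lea gene data:

```python
def percent_divergence(seq1, seq2):
    """Percentage of aligned positions at which two sequences differ.

    Assumes the sequences are already aligned to equal length
    (a real comparison would first align them and handle gaps).
    """
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    diffs = sum(a != b for a, b in zip(seq1, seq2))
    return 100.0 * diffs / len(seq1)

# Toy aligned sequences differing at 2 of 16 sites
a = "ATGGCTTACGATCGTA"
b = "ATGGCATACGATCGTT"
print(percent_divergence(a, b))  # 12.5
```

At this scale, 13 percent divergence means roughly one mismatch every eight bases, well above what homologous alleles in sexual species accumulate.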


Tunnacliffe and his colleagues found that the two genes provide different protective benefits to the animal during desiccation. One gene protects proteins from aggregating, while the other appears to associate with the cell membrane, perhaps preventing it from leaking. "Sequence divergence and subsequent functional divergence helped these organisms survive desiccation," Tunnacliffe told The Scientist.


The evidence supports the idea that these were former alleles that accumulated enough mutations to become separate genes, a process termed the "Meselson effect." Matthew Meselson at Harvard University and Mark Welch first described the process in bdelloids in 2000 in a paper that has been cited more than 140 times. The difference in Tunnacliffe's findings, said Mark Welch, is "he was able to come up with some functional assays," rather than just divergent sequences.


Such divergence gives asexual organisms an advantage, the authors argue -- the effect could not occur in sexually reproducing animals, because alleles become homogenized during recombination. The findings suggest asexual reproduction could actually be an "evolutionary mechanism for the generation of diversity," they write.


So far the Meselson effect has not been observed in other organisms, perhaps because the phenomenon is unique and linked to bdelloids' desiccation tolerance, said Mark Welch, who wrote an accompanying commentary in Science. Another reason is that very few asexual organisms forgo meiosis entirely, which is part of the definition of the effect.


However, Roger Butlin at the University of Sheffield told The Scientist that additional genes are not necessarily a straightforward solution to asexuality. "Having more copies of genes doesn't get you out of the problem of [disadvantageous] mutation accumulation," he said. "I think we have to look elsewhere for how they've managed to remain asexual for so long." Butlin said bdelloids' large population size and ability to distribute widely might have contributed to their success.


Butlin said the next step will be to look at the evolutionary fates of other gene copies in bdelloids and Tunnacliffe said he will start to look for other functionally divergent genes. "I think this must be going on throughout the genome," Tunnacliffe said.


The authors assume these genes were former alleles, rather than gene duplication, but their assumption makes sense, Mark Welch noted. "If it was a gene duplication, and if we are right about the structure of the bdelloid genome, then there should be four copies," he said. But because Tunnacliffe found only two divergent genes, it appears they were former alleles. "I personally think they've got it right."


Tunnacliffe's functional assays were done in vitro. He said he would like to do more studies on the activities of the two genes' proteins. "What we'd really like to know is, do these proteins do the same job in a living animal?"







MIT finds new hearing mechanism
Discovery could lead to improved hearing aids


MIT researchers have discovered a hearing mechanism that fundamentally changes the current understanding of inner ear function. This new mechanism could help explain the ear's remarkable ability to sense and discriminate sounds. Its discovery could eventually lead to improved systems for restoring hearing.


The research is described in the advance online issue of the Proceedings of the National Academy of Sciences the week of October 8.


MIT Professor Dennis M. Freeman, working with graduate student Roozbeh Ghaffari and research scientist Alexander J. Aranyosi, found that the tectorial membrane, a gelatinous structure inside the cochlea of the ear, is much more important to hearing than previously thought. It can selectively pick up and transmit energy to different parts of the cochlea via a kind of wave that is different from that commonly associated with hearing.


Ghaffari, the lead author of the paper, is in the Harvard-MIT Division of Health Sciences and Technology, as is Freeman. All three researchers are in MIT's Research Laboratory of Electronics. Freeman is also in MIT's Department of Electrical Engineering and Computer Science and the Massachusetts Eye and Ear Infirmary.


It has been known for over half a century that inside the cochlea sound waves are translated into up-and-down waves that travel along a structure called the basilar membrane. But the team has now found that a different kind of wave, a traveling wave that moves from side to side, can also carry sound energy. This wave moves along the tectorial membrane, which is situated directly above the sensory hair cells that transmit sounds to the brain. This second wave mechanism is poised to play a crucial role in delivering sound signals to these hair cells.


In short, the ear can mechanically translate sounds into two different kinds of wave motion at once. These waves can interact to excite the hair cells and enhance their sensitivity, "which may help explain how we hear sounds as quiet as whispers," says Aranyosi. The interactions between these two wave mechanisms may be a key part of how we are able to hear with such fidelity - for example, knowing when a single instrument in an orchestra is out of tune.


"We know the ear is enormously sensitive" in its ability to discriminate between different kinds of sound, Freeman says. "We don't know the mechanism that lets it do that." The new work has revealed "a whole new mechanism that nobody had thought of. It's really a very different way of looking at things."


The tectorial membrane is difficult to study because it is small (the entire length could fit inside a one-inch piece of human hair), fragile (it is 97 percent water, with a consistency similar to that of a jellyfish), and nearly transparent. In addition, sound vibrations cause nanometer-scale displacements of cochlear structures at audio frequencies. "We had to develop an entirely new class of measurement tools for the nano-scale regime," Ghaffari says.


The team learned about the new wave mechanism by suspending an isolated piece of tectorial membrane between two supports, one fixed and one moveable. They launched waves at audio frequencies along the membrane and watched how it responded by using a stroboscopic imaging system developed in Freeman's lab. That system can measure nanometer-scale displacements at frequencies up to a million cycles per second.


The team's discovery has implications for how we model cochlear mechanisms. "In the long run, this could affect the design of hearing aids and cochlear implants," says Ghaffari. The research also has implications for inherited forms of hearing loss that affect the tectorial membrane. Previous measurements of cochlear function in mouse models of these diseases "are consistent with disruptions of this second wave," Aranyosi adds.


Because the tectorial membrane is so tiny and so fragile, people "tend to think of it as something that's wimpy and not important," Freeman says. "Well, it's not wimpy at all." The new discovery "that it can transport energy throughout the cochlea is very significant, and it's not something that's intuitive."


This research was funded by the National Institutes of Health.



Using a stroboscopic imaging system developed in MIT Professor Dennis Freeman's lab, Freeman's team obtained this video of wave motion along the ear's tectorial membrane (at top, the actual video showing nanometer-scale displacements, and at bottom, the same video, motion magnified (Liu et al., 2005) to make the motion more apparent).
View via MIT TechTV




Technorati :

Passenger screening machines for airlines: "millimeter-wave passenger imaging technology"


It's the newest weapon in the TSA's air safety arsenal. It's called "millimeter-wave technology," and it's on the job beginning Thursday at Sky Harbor International Airport in Phoenix.


The machine creates a 3-D image of the passenger's body, then sends it to a viewing station in another room where a TSA agent looks for potential threats.


"It's passenger imaging technology, so it allows us to see the entire image of the passenger's body and anything that might be hidden on the person," said Ellen Howe of TSA.


The new technology also includes new privacy protections. The screener in the viewing room can't see the passenger's face, and the images from the machine are deleted once the traveler is cleared to fly.


You'll see the new machine after passing through the first layer of airport security. It's an option for travelers selected for extra screening who don't want to be patted down by an officer.


"This way, they won't have to have anyone touch them, and they can get through the process very quickly," said Howe.


"You don't have to worry about being patted down; they don't have to have somebody there to pat you down. It'll save time, I think, if anything," said traveler Mark Bongiovi.


"Any time they can improve the process and make it more efficient for travelers, it's a good thing," said traveler Wendy Gilpin.


TSA officials say from start to finish the scan takes about 60 seconds. The field tests start Thursday in Phoenix and in the weeks ahead the TSA will be testing in other major cities.


The new type of walk-through security machine will debut at several U.S. airports in the coming days as the Transportation Security Administration tries out the latest in body-scanning technology.


It's called "millimeter-wave passenger imaging technology," and it produces a more detailed picture than the metal detectors now used at airports to screen for weapons and explosives.


Because it produces such a detailed image, however, technology and privacy experts at the American Civil Liberties Union are not satisfied that the new technology meets privacy standards.


In a written statement issued Thursday, Barry Steinhardt of the ACLU said the technology can pick up graphic body images and even medical details like whether a passenger has a colostomy bag.


Steinhardt called the screening an "assault on the essential dignity of passengers that citizens in a free nation should not have to tolerate."


TSA spokeswoman Ellen Howe said privacy will be respected with the new millimeter-wave technique.


"We want to preserve passengers' privacy and make them feel comfortable with trying a technology like this," she said.


A TSA officer will escort a passenger to the machine for the test, but the person looking at the actual body scans will be at a different location and will not see the passenger, the TSA said.


In addition, the scans will have a "modesty filter" to blur out faces, and no images will be saved.


But the ACLU expressed concern that TSA officers would not be able to resist the temptation to save images of certain people, such as celebrities, and that the plan to blur faces might later be changed.


This is how the new scanners work. The passenger steps into a machine where he or she is quickly scanned with radio waves.


Those waves reflect off the body to transmit a three-dimensional image of the passenger that looks like a fuzzy photo negative. A TSA officer studies the image on a screen and looks for unusual shapes that might mean a passenger is carrying something suspicious.


Passengers who are asked to undergo a second screening can choose a pat-down search or the millimeter-wave test.


The TSA says the machines scan a passenger twice, each scan taking less than two seconds. But it takes another minute or two for a screener to review the scans before signaling a passenger to move on.


The TSA demonstrated the screening technology at a news conference Wednesday near Washington.
Howe said the millimeter wave is harmless and "can see more than a magnetometer," which is the first screening machine airport passengers encounter.


"A magnetometer only picks up metal or weapons, so this could see other materials that might be hidden on the body and it also produces an image" rather than just a beep, she said.


Asked if the millimeter wave could detect an object hidden in a body cavity, she said only that the TSA will learn more about the technology as it's tested at U.S. airports.


The TSA has been testing another type of imaging technology called backscatter. This technology also came under some fire because it shows very detailed body images -- which led some critics to call it an electronic strip search. So, the backscatter was altered and blurred to show more of an outline of the body.


The TSA will continue to test the backscatter scanners in some airports. TSA officials said they are a long way from deciding whether they'll settle on just one of these imaging technologies.


Phoenix Sky Harbor Airport in Phoenix, Arizona, begins using the new machines Thursday -- to be offered as an option for people who are asked to be screened a second time.


Los Angeles International Airport in California and John Fitzgerald Kennedy International Airport in New York are also slated to try the machines.


Britain, Spain, Japan, Australia, Mexico, Thailand and the Netherlands are already using millimeter-wave screening. In the United States, some courthouses and jails are trying it.



