Friday, September 12, 2008

Nanotechnology at the interface of cell biology, materials science and medicine

The atomic force microscope (AFM) and related scanning probe microscopes have become versatile tools to study cells, supramolecular assemblies and single biomolecules, because they allow investigations of such structures in native environments. Quantitative information has been gathered about the surface structure of membrane proteins at lateral and vertical resolutions of 0.5 nm and 0.1 nm, respectively, about the forces that hold together protein–protein and protein–nucleic acid assemblies and keep single proteins in their native conformation, and about the nanomechanical properties of cells in health and disease. Such progress has been achieved mainly through the constant development of AFM instrumentation and sample preparation methods.

This special issue of Nanotechnology presents papers from leading laboratories in the field of nanobiology, covering a wide range of topics in the form of original and novel scientific contributions. It addresses achievements in instrumentation, sample preparation, automation and in biological applications. These papers document the creativity and persistence of researchers pursuing the goal of unraveling the structure and dynamics of cells, supramolecular structures and single biomolecules at work. Improved cantilever sensors, novel optical probes, and quantitative data on supports for electrochemical experiments open new avenues for characterizing biological nanomachines down to the single molecule. Comparative measurements of healthy and metastatic cells promise new methods for early detection of tumors, and possible assessments of drug efficacy. High-speed AFMs document possibilities to monitor crystal growth and to observe large structures at video rate. A wealth of information on amyloid-type fibers as well as on membrane proteins has been gathered by single molecule force spectroscopy, a technology now being automated for large-scale data collection.

With continued progress in basic research, and with a strong industry supporting instrumentation development by improving robustness and reliability and making new instruments available to the community, nanobiology has the potential to develop into a field with great impact on our understanding of the complexity of life, and to make a major contribution to human health.

This special issue of Nanotechnology on nanobiology would not have been possible without the highly professional support from Nina Couzin, Amy Harvey and the Nanotechnology team at IOP Publishing. We are thankful for their most constructive and effective help in pushing the project forward. We are also thankful to all the authors who have contributed with excellent original articles, as well as to the referees who have helped to make this special issue such an insightful document of a rapidly moving field.


Andreas Engel (1) and Mervyn Miles (2)
(1) University of Basel, Switzerland
(2) University of Bristol, UK

MIT work could help develop better computer vision systems


Watch and learn: Time teaches us how to recognize visual objects
In work that could aid efforts to develop more brain-like computer vision systems, MIT neuroscientists have tricked the visual brain into confusing one object with another, thereby demonstrating that time teaches us how to recognize objects.

As you scan this visual scene (indicated with green circle), you spot a beaver out of the corner of your eye. As you glance towards it, the image is swapped for a monkey. Using analogous stimuli to produce swaps at specific locations in the visual field, MIT graduate student Nuo Li and professor James DiCarlo show that the brain starts to confuse different objects after a few hours' exposure to this altered visual world. The confusion is exactly what is expected if the brain uses temporal contiguity to learn how to recognize objects. Movie courtesy of Nuo Li, McGovern Institute for Brain Research at MIT

It may sound strange, but human eyes never see the same image twice. An object such as a cat can produce innumerable impressions on the retina, depending on the direction of gaze, angle of view, distance and so forth. Every time our eyes move, the pattern of neural activity changes, yet our perception of the cat remains stable.

"This stability, which is called 'invariance,' is fundamental to our ability to recognize objects -- it feels effortless, but it is a central challenge for computational neuroscience," explained James DiCarlo of the McGovern Institute for Brain Research at MIT, the senior author of the new study appearing in the Sept. 12 issue of Science. "We want to understand how our brains acquire invariance and how we might incorporate it into computer vision systems."

A possible explanation is suggested by the fact that our eyes tend to move rapidly (about three times per second), whereas physical objects usually change more slowly. Therefore, differing patterns of activity in rapid succession often reflect different images of the same object. Could the brain take advantage of this simple rule of thumb to learn object invariance?

In previous work, DiCarlo and colleagues tested this "temporal contiguity" idea in humans by creating an altered visual world in which the normal rule did not apply. An object would appear in peripheral vision, but as the eyes moved to examine it, the object would be swapped for a different object. Although the subjects did not perceive the change, they soon began to confuse the two objects, consistent with the temporal contiguity hypothesis.

In the new study, DiCarlo, who is also an associate professor in the MIT Department of Brain and Cognitive Sciences, and graduate student Nuo Li sought to understand the brain mechanisms behind this effect. They had monkeys watch a similarly altered world while recording from neurons in the inferior temporal (IT) cortex -- a high-level visual brain area where object invariance is thought to arise. IT neurons "prefer" certain objects and respond to them regardless of where they appear within the visual field.

"We first identified an object that an IT neuron preferred, such as a sailboat, and another, less preferred object, maybe a teacup," Li said. "When we presented objects at different locations in the monkey's peripheral vision, they would naturally move their eyes there. One location was a swap location. If a sailboat appeared there, it suddenly became a teacup by the time the eyes moved there. But a sailboat appearing in other locations remained unchanged."

After the monkeys spent time in this altered world, their IT neurons became confused, just like the previous human subjects. The sailboat neuron, for example, still preferred sailboats at all locations -- except at the swap location, where it learned to prefer teacups. The longer the manipulation, the greater the confusion, exactly as predicted by the temporal contiguity hypothesis.
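The learning effect described above can be sketched as a toy simulation. This is my own illustration of the temporal-contiguity idea, not the authors' actual model: a single IT-like unit's response to an object at a peripheral location drifts toward its response to whatever object lands on the fovea right after the saccade, so at the swap location the "sailboat" preference gradually becomes a "teacup" preference.

```python
# Toy sketch (hypothetical, not the study's model) of temporal-contiguity
# learning: a peripheral response drifts toward the post-saccade foveal one.

LOCATIONS = ["swap", "normal"]
# The unit initially prefers sailboats over teacups at every location.
resp = {("sailboat", loc): 1.0 for loc in LOCATIONS}
resp.update({("teacup", loc): 0.2 for loc in LOCATIONS})
foveal = {"sailboat": 1.0, "teacup": 0.2}   # responses at fixation

def exposure(obj, loc, eta=0.05):
    """One trial: object appears at loc, the eyes saccade to it."""
    if loc == "swap":
        # In the altered world, the object is swapped mid-saccade.
        seen_after = "teacup" if obj == "sailboat" else "sailboat"
    else:
        seen_after = obj
    # Hebbian-style drift toward the post-saccade (foveal) response.
    resp[(obj, loc)] += eta * (foveal[seen_after] - resp[(obj, loc)])

# A few "hours" in the altered visual world.
for _ in range(200):
    for obj in ("sailboat", "teacup"):
        for loc in LOCATIONS:
            exposure(obj, loc)

# Preference reverses only at the swap location, as in the experiment.
print(resp[("sailboat", "swap")] < resp[("teacup", "swap")])      # True
print(resp[("sailboat", "normal")] > resp[("teacup", "normal")])  # True
```

The point of the sketch is that no external supervision is needed: the update rule only uses what the eyes saw before and after each saccade, yet it reproduces the location-specific confusion the recordings showed.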

Importantly, just as human infants can learn to see without adult supervision, the monkeys received no feedback from the researchers. Instead, the changes in their brain occurred spontaneously as the monkeys looked freely around the computer screen.

"We were surprised by the strength of this neuronal learning, especially after only one or two hours of exposure," DiCarlo said. "Even in adulthood, it seems that the object-recognition system is constantly being retrained by natural experience. Considering that a person makes about 100 million eye movements per year, this mechanism could be fundamental to how we recognize objects so easily."
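The "100 million eye movements per year" figure can be sanity-checked with a back-of-envelope estimate; the assumptions here (about 3 saccades per second, about 16 waking hours per day) are mine, and they land in the same order of magnitude as the quoted number.

```python
# Back-of-envelope check (my own assumptions) of the quoted figure of
# roughly 100 million eye movements per year.
saccades_per_sec = 3               # "about three times per second"
waking_secs_per_year = 16 * 3600 * 365
per_year = saccades_per_sec * waking_secs_per_year
print(per_year)                    # 63072000 -- tens of millions per year
```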

The team is now testing this idea further using computer vision systems viewing real-world videos.

This work was funded by the NIH, the McKnight Endowment Fund for Neuroscience and a gift from Marjorie and Gerald Burnett.

MIT quantum insights could lead to better detectors



Improved efficiency could enable research, military and medical uses
David Chandler, MIT News Office
September 11, 2008

A bizarre but well-established aspect of quantum physics could open up a new era of electronic detectors and imaging systems that would be far more efficient than any now in existence, according to new insights by an MIT leader in the field.

MIT Professor of Mechanical Engineering Seth Lloyd has found that a peculiar quantum-physics property called entanglement can be harnessed to make detectors--similar in principle to radar systems used to track airplanes in flight or ships at sea--that are as much as a million times more efficient than existing systems. In addition, beams of entangled light could be swept across a scene to reconstruct a detailed image, with a similar improvement in efficiency.

The new findings, being reported this week in the journal Science, are purely theoretical, but Lloyd says that laboratory experiments have already proven the feasibility of both the light sources and the detectors needed for such a quantum-based photodetection system, so he anticipates that within a year it should be possible to build a laboratory-scale system to demonstrate the new concept.

"It should be possible to have at least a proof-of-principle demonstration within six months to a year," Lloyd said.

For example, military applications could include improved night-vision systems, which send out beams of infrared light--invisible to the naked eye--to sweep across a scene, and then use an infrared detector to reconstruct an image from the light that is reflected back. A more efficient system, using the quantum-entanglement effect, would make it much more difficult for an adversary to detect the fact that such a system was being used, because there would be so much less infrared light needed to provide the illumination.

Theoretically, such a system could be used to allow medical diagnostic systems such as CT scans to work with a vastly reduced X-ray output, thereby making them much safer for the patient, but such applications would be much further in the future. It could also someday be used for safer microscope imaging of living organisms.

Entanglement is a strange property that was deduced theoretically on the basis of the laws of quantum physics, and has been demonstrated over the last several years in a variety of laboratory experiments. Under certain circumstances, when an atom gives off two photons of light at the same time, the two are "entangled" even as they go off in different directions, so that anything that changes one of the photons simultaneously changes the other as well.

This odd property makes it possible to perform seemingly impossible feats such as "quantum teleportation," in which all of the properties of one subatomic particle are recreated in a different particle some distance away. It has also been demonstrated as a way of producing seemingly foolproof encryption systems for data transmission. But explanations of exactly what underlies the entanglement phenomenon remain controversial.

Lloyd says that he cannot provide a simple, intuitive explanation for why the quantum illumination system described in this report actually works, but is certain that the theoretical calculations demonstrating it are correct. "It is as if the two entangled photons retain a memory of each other long after any such memory should have faded away," he said.

MIT's Hockfield urges 'energy revolution'


Hockfield urges Congress to unleash 'energy revolution'
Says increased federal R&D funding key to progress


Reimagining energy - Op-ed by Susan Hockfield, Washington Post, Sept. 11, 2008
Testimony by Susan Hockfield before the House Select Committee on Energy Independence and Global Warming - Transcript

MIT President Susan Hockfield urged Congress Wednesday to sharply increase federal funding for energy research, saying such a move could help unleash an "energy revolution" capable of resolving several of America's problems at once.

"We stand on the verge of a global energy technology revolution," Hockfield said in testimony before the House Select Committee on Energy Independence and Global Warming in Washington. "The question before us is: Will America lead it and reap the rewards? Or will we surrender that advantage to other countries with clearer vision?"

At the hearing, titled "Investing in the future: R&D needs to meet America's energy and climate challenges," Hockfield said boosting federal energy research could simultaneously help address the problems of a shaky economy, geopolitical instabilities linked to energy consumption and security, and the growing evidence of climate change.

"If one advance could transform America's prospects," she said, "it would be having a range of clean, renewable, low-carbon energy technologies, ready to power our cars, our buildings and our industries, at scale, while creating jobs and protecting the planet." Toward that end, the MIT Energy Initiative, in addition to a range of important scientific and engineering advances, has already generated landmark reports on nuclear, geothermal and coal technologies, and has additional reports in the works on solar power, cap-and-trade policy and other energy approaches.

Chaired by Massachusetts Congressman Edward Markey, the House Select Committee on Energy Independence and Global Warming was created last year to address issues related to the urgent challenges of oil dependence and climate change. In addition to Hockfield, the committee heard testimony from Stephen Forrest, vice president of research at the University of Michigan; Jack Fellows, vice president of the University Corporation for Atmospheric Research; and Daniel Kammen, professor at UC-Berkeley.

While federal funding for energy research has helped power the economy in the past, Hockfield noted, it has dwindled alarmingly in recent years, from 10 percent of the federal research budget in 1980 to just 2 percent today. At the same time, corporate R&D by energy companies has also plummeted, she said, to less than one-quarter of 1 percent of revenues, compared to the 18 percent invested by pharmaceutical companies.

"Congress funded the basic research that spawned the information technology revolution and the biotech revolution," she said. "Today, to spark an energy revolution, Congress must lead again."

Hockfield pointed out that at the beginning of World War II, former MIT Dean of Engineering and Vice President Vannevar Bush persuaded President Franklin D. Roosevelt to make major investments in R&D, which resulted in innovations that not only helped to win the war but also spurred an ongoing partnership between the government and universities that "launched many of our most important industries, produced countless medical advances and spawned virtually all of the technologies that define our modern quality of life."

There is great potential for a similar impact today, she said.

Hockfield was asked for her impression of how much interest there was among students in working on such energy technologies. "The students' interest level is absolutely deafening," she said. "Students are wildly enthusiastic." As an example, she pointed to work done by the student-led MIT Energy Club, with its more than 700 members.

To take the lead in developing the new energy technologies the world needs, Hockfield said, the United States should triple its investment in energy research promptly, then move to a higher level as the Department of Energy builds its capacity to translate basic research to the marketplace. She called for industry, government and universities to work together on a collaborative "roadmap" to plan those next steps for coming years. And she emphasized the importance of spreading that research money broadly across a portfolio of energy research areas, not just those that seem poised for the most immediate return.

"We can't choose winners now; we don't know what they will be," she said.

The first step, she suggested, is to set up the collaborative panel to create a detailed strategic plan for the coming years.

"We need work going on across a range of technologies," Hockfield said. "We need to develop everything we can get our hands on." By doing so, she said, "we can turn this global energy challenge into a global opportunity."

Hockfield will speak on energy again next week in Washington, at a press conference Wednesday at the National Press Club, which will also feature two energy industry leaders and the director of a national laboratory. The event will highlight the importance of federally funded R&D to the nation's commercial competitiveness. And MIT will also be represented at a hearing on energy policy this Friday before the Senate Energy Committee, which will hear testimony from Institute Professor John Deutch.

See details:

Testimony before the House Select Committee on Energy Independence and Global Warming

MIT awaits data from world's biggest physics experiment


The compact muon solenoid (CMS) experiment at CERN's Large Hadron Collider will look for the Higgs boson, shown here in simulation.
Dozens of MIT physicists are waiting anxiously to sift through data from the world's biggest physics experiment, which officially started today when scientists sent the first beam of protons zooming at nearly the speed of light around the 17-mile Large Hadron Collider near Geneva, Switzerland.

Some 40 MIT researchers are among the thousands of physicists from around the world collaborating on the LHC, the world's most powerful particle accelerator. MIT has the largest American university group working on one of the collider's four detectors, known as the CMS (compact muon solenoid) detector, and a smaller group working on another LHC detector known as ATLAS (a toroidal LHC apparatus).

The first circulating beam is a major accomplishment on the way to the ultimate goal: high-energy beams colliding in the centers of the LHC's particle detectors. Scientists participating in these experiments will analyze the collisions in search of extraordinary discoveries about the nature of the physical universe. Beyond revealing a new world of unknown particles, the LHC experiments could explain why those particles exist and behave as they do. They could reveal the origins of mass, shed light on dark matter, uncover hidden symmetries of the universe and possibly find extra dimensions of space.

"The start of the LHC culminates about 20 years of design and construction work. The accelerator and the experiments are ready to go. We expect LHC data to arrive on MIT campus very shortly," says MIT Professor Bolek Wyslouch of the CMS group. "We hope to see new particles and new processes that may explain probably the most fundamental properties of matter."

For physicists, the excitement about the first beam event is unparalleled. "For much of my career, starting in the early 70's, the standard model of high-energy physics has worked marvelously well but some of its foundations still remained untested," says MIT physicist Frank Taylor, the U.S. ATLAS muon project leader. "Theoretical physicists have been very creative over the last three and a half decades with many beautiful ideas which are mathematically consistent but may not represent nature. Now we have an instrument to check these theories and perhaps to find something not even dreamed of. We're very excited!"

Added Professor Steven Nahn, another member of the CMS team, "The LHC represents the first opportunity in a long time to both close the chapter on the prevailing model of how our world works on the most fundamental levels, and, at the same time, perhaps start a whole new chapter. I feel like I'm Vasco Núñez de Balboa seeing the Pacific for the first time -- a whole new ocean out there -- not sure how big it is or what it contains, but it is certainly worth exploring."

Other MIT members of the CMS team are Associate Professors Christoph Paus and Gunther Roland, Professor Wit Busza and senior research scientist George Stephans.

The LHC is operated by the European Organization for Nuclear Research (CERN). The accelerator is located on the outskirts of Geneva near the French border, lying below farmland at depths ranging from 60 to 120 meters.

East and west: Europe is the best

LHC proves that Europe is the center of physics

America vs. Europe? These days, the consensus of global commentary is that Europe leads the world in physics.
The successful start-up of the Large Hadron Collider represents not just a huge victory for particle physics but also a victory for Europe. Once upon a time there was a brain drain from Europe to the U.S. -- not only Albert Einstein in the 30s but also Wernher von Braun in the 40s ("Once the rockets are up, who cares where they come down? That's not my department, says Wernher von Braun") and all the way through the 1970s, 80s and 90s.
But today? There's no doubt that Europe -- especially CERN -- is the center of the science world. The Europeans took the lead in building the LHC, kicking in $6 billion. The US contribution? Just over $500 million, Alan Boyle reports at MSNBC.
Besides the LHC, there’s the ITER fusion research center in southern France and potentially another fusion project, the HiPER laser-fusion facility.
Meanwhile, in Washington, politicians yanked support for ITER and ripped $94 million out of physics research. Some of the funding has been restored but many positions were lost.
Michio Kaku points to the cancellation of the planned Superconducting Super Collider in 1994.

"Let's be blunt about this: There could be a brain drain of some of our finest minds to Europe, because that's where the action is," Kaku said. "We had our chance, but Congress canceled our supercollider back in 1994. We're out of the picture. We can basically tag along after the Europeans, begging them for time on their machine -- but really, the action is in Europe now."

What will the US role be for the next major project, the International Linear Collider? The US is supposedly interested but it will have to compete with newly rich nations like China and India that boast serious scientific minds of their own. Beijing just hosted an exploratory meeting on hosting the ILC.
