
Sunday, November 4, 2007

Technology : MIT develops 'tractor beam' for manipulation of cells on silicon

E. coli cells are manipulated on a silicon chip by MIT researchers using 'optical tweezers' to form the letters 'MIT.'

Tool could manipulate tiny objects on a chip

In a feat that seems like something out of a microscopic version of Star Trek, MIT researchers have found a way to use a "tractor beam" of light to pick up, hold and move around individual cells and other objects on the surface of a microchip.

The new technology could become an important tool for both biological research and materials research, say Matthew J. Lang and David C. Appleyard, whose work is being published in an upcoming issue of the journal Lab on a Chip. Lang is an assistant professor in the Department of Biological Engineering and the Department of Mechanical Engineering. Appleyard is a graduate student in Biological Engineering.

The idea of using light beams as tweezers to manipulate cells and tiny objects has been around for at least 30 years. But the MIT researchers have found a way to combine this powerful tool for moving, controlling and measuring objects with the highly versatile world of microchip design and manufacturing.

Optical tweezers, as the technology is known, represent "one of the world's smallest microtools," says Lang. "Now, we're applying it to building [things] on a chip."

Says Appleyard, "We've shown that you could merge everything people are doing with optical trapping with all the exciting things you can do on a silicon wafer … There could be lots of uses at the biology-and-electronics interface."

For example, he said, many people are studying how neurons communicate by depositing them on microchips where electrical circuits etched into the chips monitor their electrical behavior. "They randomly put cells down on a surface, and hope one lands on [or near] a [sensor] so its activity can be measured. With [our technology], you can put the cell right down next to the sensors." Not only can motions be precisely controlled with the device, but it can also provide very precise measurements of a cell's position.

Optical tweezers use the tiny force of a beam of light from a laser to push around and control tiny objects, from cells to plastic beads. They usually work on a glass surface mounted inside a microscope so that the effects can be observed.

But silicon chips are opaque to light, so applying this technique to them is not an obvious move, the researchers say, since the optical tweezers use light beams that have to travel through the material to reach the working surface. The key to making it work in a chip is that silicon is transparent to infrared wavelengths of light--which can be easily produced by lasers, and used instead of the visible light beams.
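The reason infrared passes through silicon is its band gap: photons with energy below about 1.12 eV (a standard textbook figure, not quoted in the article) cannot excite electrons across the gap and so are not absorbed. A quick sketch of the cutoff wavelength, using λ = hc/E:

```python
H_C_EV_NM = 1239.84   # Planck's constant times the speed of light, in eV*nm

def cutoff_wavelength_nm(bandgap_ev):
    """Wavelength below which photons carry enough energy to cross
    the band gap and be absorbed; longer wavelengths pass through."""
    return H_C_EV_NM / bandgap_ev

# Silicon's band gap is about 1.12 eV, so wavelengths longer than
# roughly 1107 nm (near infrared) are transmitted rather than absorbed:
print(cutoff_wavelength_nm(1.12))   # ~1107 nm
```

Common infrared lasers used with silicon work at wavelengths of 1310 or 1550 nm, comfortably beyond this cutoff.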

To develop the system, Lang and Appleyard weren't sure what thickness and surface texture of wafers, the thin silicon slices used to manufacture microchips, would work best, and the devices are expensive and usually available only in quantity. "Being at MIT, where there is such a strength in microfabrication, I was able to get wafers that had been thrown out," Appleyard says. "I posted signs saying, 'I'm looking for your broken wafers'."

After testing different samples to determine which worked best, they were able to order a set that was just right for the work. They then tested the system with a variety of cells and tiny beads, including some that were large by the standards of optical tweezer work. They were able to manipulate a square with a hollow center that was 20 micrometers, or millionths of a meter, across--allowing them to demonstrate that even larger objects could be moved and rotated. Other test objects had dimensions of only a few nanometers, or billionths of a meter. Virtually all living cells come in sizes that fall within that nanometer-to-micrometers range and are thus subject to being manipulated by the system.

As a demonstration of the system's versatility, Appleyard says, they set it up to collect and hold 16 tiny living E. coli cells at once on a microchip, forming them into the letters MIT.

The work was supported by the Biotechnology Training Program of the National Institutes of Health, the W.M. Keck Foundation and MIT's Lincoln Laboratory.


Nanotechnology : MIT works toward 'smart' optical microchips

Light-powered micro-machines could advance telecommunications

Rings, one millionth of a meter in size, are the moving parts of a 'smart' micromachine that could be powered and controlled by light on an optical chip. The rings move around and adapt to the color of light that is traveling through the bar, right.

A new theory developed at MIT could lead to "smart" optical microchips that adapt to different wavelengths of light, potentially advancing telecommunications, spectroscopy and remote sensing.

Postdocs Peter Rakich, left, and Milos Popovic of MIT's Research Laboratory of Electronics stand in front of a monitor that shows a demonstration of the way they propose to control microchips with light.

Drawn by the promise of superior system performance, researchers have been exploring the concept of microchips that manipulate light instead of electricity. In their new theory, the MIT team has shown how such chips could feature tiny machines with moving parts powered and controlled by the very light they manipulate, giving rise to fundamentally new functionality.

"There are thousands of complex functions we could make happen by tinkering with this idea," said Peter Rakich, an MIT postdoctoral associate who invented the theoretical concept along with postdoc Milos Popovic. The work was described in the cover story of the November issue of Nature Photonics.

For example, such chips could one day be used to remotely adjust the amount of bandwidth available in an optical network, or to automatically process signals flowing through fiber-optic networks, without using any electrical power, Rakich said.

Coauthors on the paper were Marin Soljacic, assistant professor of physics; and Erich Ippen, the Elihu Thomson Professor of Electrical Engineering and professor of physics.

"The idea that opto-nanomechanical devices can be designed to self-adapt to all-optical control--i.e., by self-aligning their resonances to optical control frequencies and by permitting all-optical tuning and dimension control--is new and exciting," said Ippen.

Earlier this year an MIT team composed of many of the same researchers showed that photonic circuitry could be integrated on a silicon chip by polarizing all of the light to the same orientation. The current work shows how tiny mobile machines can be built on such chips, taking advantage of the substantial pressures exerted by photons as they strike the walls of a cavity.

In the macroscopic world, light waves do not exert significant forces, but in the unique world of the microscopic, coupled with ultrapure laser light, photons bouncing off the walls of a cavity can build up a measurable force called radiation pressure. This is similar to the pressure exerted by gas molecules trapped in an aerosol can.
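The scale of this force can be sketched with a back-of-envelope estimate (the power and build-up numbers below are illustrative assumptions, not figures from the paper): a beam of power P reflecting off a mirror exerts a force F = 2P/c, and a resonant cavity multiplies this by its build-up factor.

```python
C = 299_792_458.0   # speed of light, m/s

def radiation_force(power_w, buildup=1.0):
    """Force on a mirror reflecting a beam of `power_w` watts.
    Each reflected photon transfers momentum 2hf/c, so F = 2P/c;
    `buildup` models the resonant enhancement inside a cavity."""
    return 2.0 * power_w * buildup / C

# 1 mW of laser light, enhanced ~1000x inside a resonant microcavity:
print(radiation_force(1e-3, buildup=1000))   # ~6.7e-9 N
```

A few nanonewtons is negligible on everyday scales, but it is a substantial push on a part whose mass is measured in picograms.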

To take advantage of this radiation pressure, the researchers propose machines built from ring-shaped cavities only millionths of a meter in size located on the chip surface. When pressure on the cavity walls is high enough, the cavity is forced to move. This movement forms a critical part of an optical micromachine, which adjusts its configuration to respond to light in a predesigned way.

A unique application of this concept involves processing data that travels in fiber-optic networks. Today resonators employed in fiber-optic networks have to be synchronized with the incident light to ring at its frequency, in the same way an opera singer has to tune the pitch of her voice to make a wine glass ring.

Remarkably, a "smart" resonator based on the MIT concept could chase the frequency (color) of the laser light incident upon it. As the frequency of the laser beam changes, the frequency of the resonator will always follow it, no matter where it goes.

In other words, this new, unique resonator is like a wine glass that self-adjusts to the pitch of the singer's voice and follows it along throughout a song, Rakich said. He noted that physical systems that adapt to driving light and behave like these nanomachines do not exist elsewhere in nature.
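As a toy illustration only (a hypothetical first-order relaxation model, not the MIT team's actual optomechanical theory), the "chasing" behavior can be pictured as a resonance that is nudged toward the drive frequency at every step:

```python
def track(f_res, f_laser_schedule, gain=0.3):
    """Toy model: each step, radiation pressure nudges the resonance
    by a fraction `gain` of its detuning from the laser."""
    history = []
    for f_laser in f_laser_schedule:
        f_res += gain * (f_laser - f_res)
        history.append(f_res)
    return history

# Resonator starts at 193.0 THz; the laser jumps to 193.4 THz and the
# resonance follows, closing most of the gap within a few steps.
print(track(193.0, [193.4] * 20)[-1])   # close to 193.4
```

The real device achieves this passively, with no feedback electronics: the light itself supplies both the sensing and the restoring force.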

By coupling the resonating cavities with nano-scale cantilevers, optical devices analogous to microelectromechanical systems (MEMS) devices can be created.

Although the researchers focused on ring-shaped cavities, their model could be applied to other structures as well.

"Our objective now is to develop a variety of light-powered micro- and nanomachines with unique capabilities enabled by this technology," explained Popovic. "But the first step will be to demonstrate the concept in practice."

The research was funded in part by the Army Research Office through MIT's Institute for Soldier Nanotechnologies.


TECH Market and economy : Silicon Valley economy thrives despite uncertainty nationwide

Technology companies are prospering while Wall Street continues to deliver shocks like Merrill Lynch's $7.8 billion write-down due to a collapsing mortgage market and the reported looming departure of Citigroup Chief Executive Charles Prince. Adding to the woes, oil prices have topped $90 a barrel.

A week of turmoil in the financial industry and stock market has left technology stocks and Silicon Valley largely unscathed, reflecting trends that are making the region an economic star.

The tech-heavy Nasdaq composite index finished slightly higher over the past week, while the Dow Jones industrial average and Standard & Poor's 500 index had significant drops.

Apple had a surprisingly strong quarter, and Google stock passed $700 a share as the Mountain View Internet search company joined eBay, Yahoo, Intel, Seagate Technology and Genentech in reporting strong revenues. Microsoft revenue was up more than 23 percent from the same quarter in the previous year.

The strength extends to the broader Silicon Valley economy. A forecast by Spectrum Economics, released last month, reports that the region "unleashed itself from a faltering U.S. economy" during the past two years.

"Silicon Valley soared while the U.S. economy swooned," said the September 2007 update to the study, done for the San Jose Redevelopment Agency.

The United States may have lost its global competitive advantage in many areas, but technology is one where it retains a clear advantage, said Sung W. Sohn, Hanmi Bank's chief executive and president.

The housing market's troubles haven't affected the valley's overall economy, Spectrum said, while rising energy costs benefit high tech and green tech, one a valley staple and the other an emerging growth area.

"If you want to save energy, you have to use more electronics," Spectrum's chairman, Richard Carlson, said in an interview.

Venture capital is making a comeback in the wake of the mortgage bubble. "Compared to those screwball collateralized mortgage obligations, high tech looks simple," Carlson said.

Exports are buoying up the regional economy, too, as the declining dollar makes U.S. products a bargain to foreign buyers.

American technology companies "are huge exporters," Carlson said.

Though there was a drop of almost 2,200 in valley jobs in September over the month before, the numbers were skewed by a decline in public-sector hiring. That may be due to a late start for the school year, he said. Computer manufacturers added 1,800 jobs in September compared with a year earlier, but posted small losses from the previous month.

"The nation's economic problems will continue, but not at a level that is likely to threaten Silicon Valley's growth," the Spectrum study concluded.

The valley's job growth may slow somewhat but should outpace the nation's, according to Stephen Levy of the Center for Continuing Study of the California Economy.

"We'll be affected by housing, but less so because we didn't have a surge in home building or subprime loans," Levy said.

"We have the strongest new exciting sector in the world of opportunities around clean tech and the new funding for it," Levy said. "We are alive on the venture capital side and the Internet side. Nothing that's happened has threatened the resurgence of our economic base."

While the U.S. Commerce Department reported that the national economy turned in a solid 3.9 percent third-quarter growth in gross domestic product, economists worry that it may be the last encouraging quarterly number for a while.

That's because while consumer spending has remained strong, economists puzzle over where consumers were getting their spending money. The refinancing boom is over, and that was believed to have propelled consumer spending over the past two or more years.

"The 'Energizer Bunny' of the U.S. economy is the consumer," Carlson said. "The consumer has been taking some pretty serious hits, but it just keeps going."



EARTHQUAKE : Living on a fault: Homeowners shake off quake risk


While only a few people in the east foothills experienced the shaking so near the epicenter, the Sabins are hardly alone living close to danger. There are at least 369,000 more people living near three major faults - the San Andreas, the Hayward and the Calaveras - than during the 1989 Loma Prieta quake. That 6.9 temblor, centered in the Santa Cruz Mountains, killed 62 people and caused $6 billion in destruction.

The newer population estimate comes from a 24hoursNews review of U.S. Geological Survey fault maps and data collected in 1990 and 2007 by the U.S. Census Bureau and the California Department of Finance. It is almost certainly low.

It does not include the populations of unincorporated areas of Santa Clara, Alameda, Contra Costa and San Mateo counties. And it does not include the growth of cities such as San Jose and Palo Alto, only parts of which are within five miles of one of the faults.

But for perspective, those 369,000 people represent more than 40 percent of the total population growth in those four counties since 1990.

The most growth - more than 200,000 people - has been along the Hayward Fault, particularly in Milpitas, Fremont and Hayward - an area seismologists say could be primed for another big shake.

A tour of fault country, days after the Bay Area's strongest quake since Loma Prieta, took a reporter from the redwood-clad billionaire estates of Woodside to the sunny suburban hillsides of Fremont to the rural hills above San Jose, where somebody like Kathy Sabin can still have enough land to keep 47 animals near the nation's 10th-most-populous city.

Homeowner after homeowner, when asked why they moved so close to a known fault, said accepting earthquake danger is part of the bargain you strike in exchange for the incredible views, weather, culture and outdoor life of the Bay Area. None of them intends to move. And there is also a subtle psychological adjustment: It's not so much a discounting of earthquake danger, but a sense that other natural disasters elsewhere are somehow worse.

As a California native, "I'm kind of used to earthquakes," said Kathy Sabin. "I would be more afraid of a tornado or a hurricane."

If the East Coast and the Gulf Coast have their hurricanes and blizzards, and the Midwest its tornadoes and floods, earthquakes are "our" natural disaster. Even as they threaten us, they also seem to define who we are.

San Andreas Fault

Loaded with plywood building materials, the truck wound its way through the rural groves of Woodside, past gated estates, past women on horseback, past an aptly named "Why Worry Lane."

The center of Woodside is less than a mile from the San Andreas Fault, but there's a lot of construction. Town records say the value of building permits for new construction, additions and alterations is up 35 percent in 2007 over the same period last year.

Real estate agents can recount multimillion-dollar sales falling through because of a mountain lion in the back yard, but hardly ever due to the nearby fault.

Though people are aware of the fault, it has not hurt real estate values, said Jayne Williams, an agent with Coldwell Banker. She was speaking with a friend - Pam McReynolds of La Honda - in the Woodside center, across the street from where Williams had been during the Loma Prieta quake.

"We've grown up with it," Williams, a Woodside native, said of the fault.

Of course, Williams' sister moved to Cape Cod to escape earthquakes. But Williams and McReynolds said they wouldn't trade natural disasters with her.

Farther south along the San Andreas - even on Loma Prieta Way in the mountains above Los Gatos, where a wag might say you're asking for trouble - new homes are being built.

But asked for the most earthquake-safe place to build a house in the Bay Area, USGS seismologist Tom Brocher only chuckles.

"As a rule of thumb, if you're within five miles, you are going to be strongly shaken," he said.

"If I told you to live in San Ramon, you're living on the Calaveras Fault, or close to it. In Pacifica, you're on the San Andreas Fault. Almost all of us live within five miles or so of one of the major faults."

Brocher said people should consider the earthquake hazard zone disclosures amid the stack of papers they sign when they buy a home.

"All you want to do is sign and take ownership of the house," he said. "We would like people to pay attention to that. They should definitely be aware of the risk."

And where in the Bay Area does a USGS seismologist live?

Brocher said he chose his home in Millbrae as much for its flat lot - safe from earthquake-triggered landslides - and its distance from soil liquefaction hazards around San Francisco Bay, as for its relative proximity to a fault line. He's just 1.5 miles from the San Andreas.

"I told my wife, 'Look you're not going to have any view of the bay,' " he said.

Hayward Fault

A good chunk of the Bay Area's population wasn't living here during the Loma Prieta quake. Many weren't even alive. About 29 percent of the population of Santa Clara County is under 21 - too young to remember Loma Prieta even if they had been born in 1989.

So for many, the Alum Rock quake was a first. But not for Sandy Movahed. The Fremont resident definitely remembers 1989: The Loma Prieta quake demolished her office in San Francisco. And yet she bought a home with her husband a few hundred yards from the Hayward Fault. A court reporter who transcribed lengthy state hearings on earthquakes, Movahed knows plenty about the phenomenon. She is confident the Bay Area's strict building codes will provide some protection.

"Bottom line," she says, "nobody knows when there's going to be an earthquake."

And one more thing: Her sunny neighborhood under the grassy hills, which overlooks the silver sweep of San Francisco Bay, has "probably the most perfect weather in the world."

Calaveras Fault

At the epicenter of Tuesday's quake, nobody seemed too concerned about future quakes.

Don and Joann Reed said the Loma Prieta quake caused a mini-tsunami that washed right out of their swimming pool. This quake did nothing like that.

"We're on fractured rock here, and I hear tell that fractured rock is better for you than solid rock," Joann Reed said.

Living on the Calaveras Fault for 32 years, the retired couple dismissed the thought of moving to escape a quake.

"Earthquakes don't happen as often as the hurricanes, and the tornadoes and the floods and those bad electrical storms and the awful heat with the humidity in the summertime and the just intolerable cold in the wintertime," Joann Reed said of the troubled life elsewhere. "Here it's an occasional earthquake and the rest of the time it's just like heaven."


Single spins controlled by an electric field

Researchers in the Netherlands have shown that it is possible to control the spin of a single electron by using an electric field rather than a magnetic field, as is usually the case. The breakthrough could have applications in spintronics and quantum computing.

Spintronics is a growing area of research that exploits the spin as well as the charge of electrons. It has already been used to increase the amount of data that can be stored on hard disks and could someday form the basis of practical quantum computers that perform calculations by manipulating the spins of single electrons.

A key element of spintronics is the ability to flip the spin of an electron from a spin-up to a spin-down state. In the new work, a team led by Lieven Vandersypen at the Kavli Institute of Nanoscience at Delft University of Technology deposited metallic gold gates onto a gallium arsenide substrate, creating a small region where only a single electron can sit. The researchers were then able to use these so-called "quantum dots" to manipulate the spin of the electron in a controlled manner.

Although previously researchers have been able to flip the spins of electrons confined in these dots by applying a magnetic field, it is not easy to generate a magnetic field locally on a chip that is strong enough to rotate the spin. "To then manipulate an array of single spins is almost impossible," says Vandersypen.

In their new experiments, the team used two quantum dots separated by 0.2 µm. If the spins in the dots are both parallel, neither electron can hop from one dot to the other because of the Pauli exclusion principle. However, applying an electric field causes one of the spins to rotate.

Indeed, if the field is applied for long enough, the electron's spin rotates until it is anti-parallel to the other electron's; it can then jump across to the other dot, causing a current to flow. If the field is applied longer still, the spin returns to being parallel. Vandersypen's PhD students Katja Nowack and Frank Koppens, who carried out the experiment, found that the current varies sinusoidally when plotted against the time for which the electric field is applied. These are known as Rabi oscillations, and observing them proved the team could control the rotation of the spin.
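The sinusoidal variation is the fingerprint of Rabi oscillations. As a minimal sketch (the 5 MHz Rabi frequency below is an assumed, illustrative value, not the experiment's actual parameter), the flip probability of an ideal driven two-level system goes as sin²:

```python
import math

def flip_probability(t, f_rabi):
    """Probability an ideal two-level spin has flipped after being
    driven for time t: P(t) = sin^2(pi * f_rabi * t), the periodic
    signature of a Rabi oscillation."""
    return math.sin(math.pi * f_rabi * t) ** 2

f_rabi = 5e6                  # assumed 5 MHz Rabi frequency (illustrative)
t_pi = 1 / (2 * f_rabi)       # a "pi pulse" fully flips the spin
print(flip_probability(t_pi, f_rabi))        # -> 1.0 (anti-parallel: current flows)
print(flip_probability(2 * t_pi, f_rabi))    # -> ~0 (parallel again: current blocked)
```

Sweeping the pulse duration t and recording the current traces out exactly this sinusoid, which is what the measurement showed.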

The mechanism that allows an electric field to control the spin of an electron is the spin-orbit interaction: in the rest frame of a moving electron, an applied electric field appears partly as a magnetic field, which acts on the electron's magnetic moment. The team calculated that the coupling between the electric field in the gallium arsenide and the spin of the single electron in the quantum dot is strong enough to change the direction of the spin when the field is applied.

Having shown that it is possible to control single spins in quantum dots via localized electric fields, the researchers at Delft now plan to produce an array of quantum dots where each electron's spin state can be manipulated. They plan to use these arrays to form controllably coupled spins, which could pave the way for producing entangled states between the electrons.

The spin of a single electron has revealed itself to US researchers thanks to magnetic resonance force microscopy. Their research could lead to a new approach to three-dimensional imaging of biological molecules at atomic resolution and the long-awaited read-out devices for quantum computers.


MySpace joins Google's OpenSocial

MySpace has signed up to Google's new OpenSocial standard, which aims to bring interoperability to web applications that can be embedded in social networking websites.

This means that MySpace will be able to offer all applications created by third-party developers that are compatible with the OpenSocial application programming interface (API).

For developers, the addition of MySpace to OpenSocial is a major step, opening their applications to that social networking site's massive base of users.

On Tuesday, Google confirmed the existence of the OpenSocial programme, which is widely seen as not only Google's strongest move in social networking to date, but also as a response to the rising popularity - and threat - of Facebook.

Although Facebook is the second-most popular social networking site in the world, it is growing faster than MySpace, thanks in large part to the fact that Facebook opened its platform to external developers in May, something MySpace is now copying.

To date, Facebook has about 7,000 applications available for its members. It hasn't said whether it will participate in OpenSocial, although Google has said the door is open.

"Despite reports, Facebook has still not been briefed on OpenSocial. When we have had a chance to understand the technology, then Facebook will evaluate participation," said Brandee Barker, Facebook spokeswoman.

OpenSocial can, in theory, dilute this distinguishing feature of Facebook, by offering a core set of APIs that will let developers write an application once that is compatible with multiple sites.

In other words, OpenSocial seeks to address the inconvenience for developers of having to port applications to different social networking websites.

Other partners in OpenSocial include Oracle, Salesforce, Hi5, iLike, LinkedIn, Slide, Ning, Friendster, Six Apart and Plaxo.

Originally considered of interest only to teens and young adults for communicating with friends, social networking sites have broadened their appeal as they have proven useful for professional networking and business activities.

Within sites like Facebook, a lot of formerly dispersed online activities are united under a single virtual roof, making these sites very attractive for advertisers. Users share a lot of personal information on these sites, making the users easy to target with ads.

There are also question marks about advertising on social networks, primarily because their content is mostly unregulated, and sometimes objectionable, as it is generated by millions of individuals. In addition, social networking sites are under close watch by law enforcement agencies worldwide, because sexual predators have used these sites to stalk and victimise others, including minors.

An earlier sign of Google's sense of urgency about the social networking market was its reported courting of Facebook when the latter was recently seeking a partner to invest in the company and earn a deal to provide advertising to it.

Microsoft eventually won, buying a 1.6 percent stake that values Facebook at an eye-popping $15 billion, although the social networking company reportedly will have revenue of just $150 million this year.

OpenSocial represents "the first release of technical details" for the forthcoming MySpace application development platform, Google and MySpace said.

MySpace, which already runs ads from Google's ad network, has been in talks for the past year about possible collaborations in the area of social applications, the company said.

MySpace's own application development platform will launch "in a few months," but with the OpenSocial APIs, developers will be able to start writing applications for MySpace immediately.

"The applications will be able to be tested within the MySpace environment and then the MySpace Platform will officially go live in the coming months," read a company statement.

Some applications built with the OpenSocial APIs will be available in test mode on the MySpace site prior to the platform's launch.


Emergency operation saves space station

Solar panels fully deployed after spacewalkers make repairs

Astronauts successfully unfurled a torn solar power wing at the international space station on Saturday after spacewalker Scott Parazynski cut loose a tangled clump of wires and patched everything up.

His emergency surgery saved the solar energy panel - and the space station.

"This was just a fabulous effort," said Mike Suffredini, the space station's program manager. "Our baby is still beautiful to us."

In the tense buildup to the spacewalk - one of the most difficult and dangerous ever attempted - NASA repeatedly warned that station construction would have to be halted if the wing could not be fixed.

The prospect was so grave that NASA felt it had no choice but to put Parazynski practically right up against the swaying power grid, which was coursing with more than 100 volts of electricity. No other astronaut had ever been so far away from the safe confines of the cabin.

Even before Parazynski made his way back inside, the radio traffic was full of cheers and congratulations.

Shouts of "Yay! All right! Beautiful! Great news!" streamed from the linked shuttle-station complex once the wing was unfurled to its full 115-foot (35-meter) length. Mission Control promptly relayed thanks from NASA's top brass.

"It was an honor," Parazynski replied.

The commander of the docked shuttle Discovery, Pamela Melroy, who supervised the wing repairs, cautioned everyone to hold off on "the victory dance" until Parazynski and his spacewalking partner, Douglas Wheelock, were safely back inside. "Then we can all rejoice," she said.

Tense hours

It took almost an hour for Parazynski to be maneuvered back from the wing, riding on the end of a nearly 90-foot (27.5-meter) robotic arm extension that just barely reached the damage. That's how long it took him to get out there, too.

Parazynski worked on the damage for more than two hours, cutting hinge and guide wires that became snarled and snagged the wing when it was being extended Tuesday. The astronauts had just relocated a massive beam at the space station, and finished extending its first solar power wing, when the second wing got hung up after extending only 90 feet (27.5 meters).


Innovation software increases efficiency

Invention Machine has released the Japanese version of its Goldfire Innovator software.
Invention Machine, a provider of software that accelerates the process of innovation, has released the Japanese version of Goldfire Innovator. Invention Machine's Goldfire Innovator software brings unprecedented structure and discipline to the innovation process, enabling engineers to systematically tackle engineering problems across a product's lifecycle to reduce the delivery time for new product innovations, rapidly resolve problems in existing products and continuously improve production processes.

Goldfire Innovator empowers users to rapidly fuel product pipelines with more competitive, more cost-effective and higher performing product offerings.

Goldfire Innovator brings efficiency and repeatability to the product innovation process by integrating a problem analysis workbench with a semantic knowledge engine and relevant technical content.

Goldfire Innovator enables organisations to better access internal corporate knowledge as well as worldwide technical literature, such as a database of more than 15 million patents and a library of thousands of proprietary scientific effects.

This allows them to deliver precise answers to problem challenges, stimulate concept generation and to avoid duplication of effort.

Goldfire Innovator includes a significant number of customer-driven enhancements to Invention Machine's widely used TechOptimiser and semantic technologies, bringing together problem analysis methodologies and concept generation capabilities along with pinpoint access to worldwide patent databases and rich scientific and engineering content.

Beyond significant ease of use enhancements over earlier products, Goldfire Innovator features preconfigured and customisable workflows that guide users through problem solving processes, improve problem analysis and understanding, focus efforts on solving the right problem, and avoid rework and conceptual dead-ends.

'We are committed to enabling our Japanese clients to make the innovation process more structured, repeatable and efficient', said Mark Atkins, Chairman, President and CEO of Invention Machine.

'By increasing research and ideation efficiencies, Goldfire Innovator allows engineers to conceive more and better ideas and to overcome technical obstacles to fuel product pipelines and improve production processes'.

'Goldfire Innovator provides the Japanese market with an innovation platform to enable sustainable growth, competitive advantage and customer base expansion'.

'Based on our eight years' experience consulting, supporting and marketing Invention Machine's products in Japan, we are sure that the Japanese version of Goldfire Innovator will become a powerful tool for engineers in Japanese manufacturing to solve their problems', said Yoshihisa Konishi, Project Manager, Knowledge Creation Business of Mitsubishi Research Institute.

'Especially in Japan, where the intellectual property problems are gaining importance, this problem solving environment will exert far reaching effects'.


Computer Sims Vital Tools in Exploring Nanoworld


Years ago, when Uzi Landman and his colleagues set out to uncover some of the rules that govern why a non-reactive metal like gold acts as a catalyst when it is in nanoclusters only a few atoms in size, they didn't sit down in a lab with the precious metal. Instead, they ran computer simulations and discovered that gold is a very effective catalyst when it is in clusters of eight to two dozen atoms. They also found that electrical charging of gold is crucial to its catalytic capabilities. Six years later, the team has verified their earlier predictions experimentally, and they stand ready to further explore environmental effects on catalysis.

This practice of partnering computer simulations with real-world experiments is becoming more vital as scientists delve deeper into realms where the actors are measured on the nanoscale, Landman told a group of scientists Thursday, February 17 at the annual meeting of the American Association for the Advancement of Science (AAAS).

"Small is different," said Landman, director of the Center for Computational Materials Science and professor of physics at the Georgia Institute of Technology. "We cannot use the way physical systems behave on the large scale to predict what will happen when we go to levels only a few atoms in size. In this size regime, electrons transport electricity in a different way, crystallites have different mechanical properties, gold nanowires are twenty times stronger than a bulk bar of gold, and inert metals may exhibit remarkable catalytic activity. But we know the rules of physics, and we can use them to create model environments in which we can discover new phenomena through high-level computer-based simulations."

Computers are constantly becoming more powerful and capable of more detailed explorations, at the same time that scientists across the globe are taking a growing interest in the science of the small. The intersection of these two trends, said Landman, is allowing scientists to investigate realms that are too small for today's technology to explore experimentally.

It's not just a matter of making faster calculations, he said. "Experimentally, we can't always go down to the resolution we need to see, explain and predict things, but with computer simulations we can go to any resolution we need," said Landman. "Therefore, you can ask questions, deeper questions, on how materials behave on the small scale, even if you can't get to that fine resolution experimentally."

This doesn't mean that experiments aren't necessary, said Landman. "It's a supplementary and complementary approach. The pillars of scientific methodology are now experimentation, analytical theory and computer simulation."

In addition to their work on nanocatalysis, Landman and colleagues have used simulations to explore other phenomena, such as the possibility of producing and maintaining a stable flow of liquid on the nanoscale. Their models predicted that it is possible to produce liquid jets only six nanometers wide. To date, teams of engineers collaborating with Landman's theory group are building nozzles that can produce jets in the 100 nanometer range. Within one year, said Landman, they expect to produce "nanojets" in the 10 nanometer range.

"The opportunity to make new discoveries in ways that weren't possible before is an incredible gift and it has come about only because we can now simulate environments on the computer that are either not yet possible, too expensive or too dangerous to do in the lab," said Landman. "We are now at a point in history where the science of the small holds the promise of producing a windfall of scientific discoveries. Computers serve as tools for discovery in this exciting adventure."


Research predicts size-induced transition to nanoscale half-metallicity

How big does a cluster of metal atoms actually have to be before it starts acting like a metal: ductile, malleable and a conductor?

The emergence of metallic attributes, usually referred to as the transition to metallicity, is among the most intricate aspects of the size evolution of the properties of atomic clusters that are metals in bulk quantities. Researchers at Argonne and other research centers worldwide are looking for an answer to this question, which is central to establishing the limits of miniaturization in nanoelectronic devices.

An even more intricate question is whether researchers can identify a nanoscale analog of the bulk half-metallic state and the size-driven transition to that state.

A recent study by Argonne theorists suggests that the answer to this question is yes. Their work represents the first prediction of a nanoscale analog of the bulk half-metallic state.

Unlike normal metals, in which electrons with both alpha and beta spins carry the electrical current, half metals are elements or compounds with spin-polarized conductivity. Electrical current in half metals translates into spin transport, which lies at the foundation of spintronics technology. Scientists in the field of spintronics study how to use the "spin," or magnetic properties, of particles such as electrons to develop novel and better sensors, recording devices, switches and quantum computers.

Even the common metallic state becomes a complex phenomenon at the nanoscale, said Julius Jellinek of Argonne's Chemical Science and Engineering Division.

"Small or medium atomic clusters of metallic elements may lack all attributes normally associated with the bulk metallic state," he said. "These attributes then grow in as clusters grow in size. The same should be true of the half-metallic state, and our research shows that it is."

The Argonne theorist collaborated with an experimental group at the Johns Hopkins University led by Kit Bowen, Jr. The experiments have indicated that, as a small, negatively charged manganese cluster grows, the gap between the energies of its two most external electrons decreases and closes when the cluster size reaches six atoms.

Computations and subsequent analysis by Jellinek and his colleagues revealed that the closure of the gap between the electron energy levels takes place in one spin manifold, but not the other. This spin-polarized nature of the energy gap closure is what constitutes the nanoscale analog of the bulk half-metallic state. Understanding the finite-size analog of the bulk half-metallicity and the size-driven transition to it is central for many areas of nanoscience and nanotechnology, in particular nanospintronics.
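A toy calculation can make the idea of spin-polarized gap closure concrete. The single-particle energy levels below are invented for illustration (they are not from the Argonne study); the point is only that the gap between the highest occupied and lowest unoccupied level can close in one spin manifold while staying open in the other:

```python
# Toy illustration of spin-resolved gap closure. The energy levels (in eV)
# are invented; a real calculation would come from density functional theory.

def gap(levels, n_occupied):
    """HOMO-LUMO gap for one spin manifold: LUMO energy minus HOMO energy."""
    levels = sorted(levels)
    return levels[n_occupied] - levels[n_occupied - 1]

alpha = [-5.2, -4.8, -4.1, -4.1, -3.0]  # degenerate levels straddle the boundary
beta = [-5.0, -4.6, -3.9, -2.7, -2.1]   # a clear gap remains

g_alpha = gap(alpha, n_occupied=3)  # gap closes: 0.0 eV
g_beta = gap(beta, n_occupied=3)    # gap stays open: 1.2 eV

# The nanoscale analog of half-metallicity: one spin channel is gapless
# ("metallic"), while the other remains gapped ("insulating").
print(f"alpha gap: {g_alpha:.2f} eV, beta gap: {g_beta:.2f} eV")
```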

The prediction of finite-size half-metallicity must still be tested experimentally using future spin-polarized photoelectron spectroscopy measurements. "The finite-size analog of half-metallicity may be more ubiquitous than the bulk half-metallic state," Jellinek said. "Nanoscale half-metallicity may emerge as a transient state in the size-driven evolution of properties of systems even for elements and substances that are not half-metals in bulk quantities."

The study was published in the journal Physical Review B and republished by the Virtual Journal of Nanoscale Science and Technology. Collaborators on this research were Julius Jellinek, Paulo H. Acioli and Juan Garcia-Rodeja from Argonne National Laboratory, and Weijun Zheng, Owen C. Thomas and Kit Bowen, Jr. from the Johns Hopkins University.

Argonne National Laboratory, a renowned R&D center, brings the world's brightest scientists and engineers together to find exciting and creative new solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.


New technique makes atomic-level microscopy 100 times faster


Using an existing technique in a novel way, Cornell physicist Keith Schwab and colleagues at Cornell and Boston University have made the scanning tunneling microscope (STM) -- which can image individual atoms on a surface -- at least 100 times faster.
The simple adaptation, based on a method of measurement currently used in nano-electronics, could also give STMs significant new capabilities -- including the ability to sense temperatures in spots as small as a single atom.

The STM uses quantum tunneling, or the ability of electrons to "tunnel" across a barrier, to detect changes in the distance between a needlelike probe and a conducting surface. Researchers apply a tiny voltage to the sample and move the probe -- a simple platinum-iridium wire snipped to end in a point just one atom wide -- just a few angstroms (10ths of a nanometer) over the sample's surface. By measuring changes in current as electrons tunnel between the sample and the probe, they can reconstruct a map of the surface topology down to the atomic level.
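The extreme height sensitivity behind that atomic-scale map comes from the exponential distance dependence of the tunneling current. A back-of-the-envelope sketch (the decay constant is a typical assumed value, not a figure from the article):

```python
# Tunneling current falls off exponentially with the tip-sample gap d:
# I ~ I0 * exp(-2 * kappa * d). For typical metal work functions of a few
# electron-volts, kappa is on the order of 1 per angstrom (assumed here).
import math

kappa = 1.0  # decay constant, 1/angstrom (typical order of magnitude)
I0 = 1.0     # normalization; the absolute current is not the point

def current(d_angstrom):
    return I0 * math.exp(-2.0 * kappa * d_angstrom)

# Pulling the tip back by a single angstrom cuts the current by e^2 ~ 7.4x,
# which is why the STM can resolve individual atoms.
print(current(4.0) / current(5.0))
```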

Since its invention in the 1980s, the STM has enabled major discoveries in fields from semiconductor technology to nano-electronics.

But while current can change in a nanosecond, measurements with the STM are painfully slow. And the limiting factor is not in the signal itself: It's in the basic electronics involved in analyzing it. A theoretical STM could collect data as fast as electrons can tunnel -- at a rate of one gigahertz, or 1 billion cycles per second of bandwidth. But a typical STM is slowed down by the capacitance, or energy storage, in the cables that make up its readout circuitry -- to about one kilohertz (1,000 cycles per second) or less.

Researchers have tried a variety of complex remedies. But in the end, said Schwab, an associate professor of physics at Cornell, the solution was surprisingly simple. By adding an external source of radio frequency (RF) waves and sending a wave into the STM through a simple network, the researchers showed that it's possible to detect the resistance at the tunneling junction -- and hence the distance between the probe and sample surface -- based on the characteristics of the wave that reflects back to the source.

The technique, called reflectometry, uses the standard cables as paths for high-frequency waves, which aren't slowed down by the cables' capacitance.
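In rough terms, the readout works like radar on a transmission line: the tunneling junction terminates the line, and the fraction of the RF wave that reflects back depends on the junction's resistance. A toy model (the component values and matching-network topology are assumptions for illustration, not the authors' actual circuit):

```python
# Toy RF reflectometry readout: the tunneling junction is modeled as a
# variable resistor R_t in parallel with stray capacitance, fed through a
# matching inductor on a 50-ohm line. The reflected-wave magnitude |Gamma|
# changes with R_t, so it encodes the tip-sample distance at RF speeds.
import numpy as np

Z0 = 50.0                 # transmission-line impedance, ohms
L = 470e-9                # assumed matching inductance, henries
C = 0.5e-12               # assumed stray capacitance, farads
w = 1.0 / np.sqrt(L * C)  # drive at the LC resonance (about 330 MHz here)

def reflection(R_t):
    Zc = 1.0 / (1j * w * C)                    # capacitor impedance
    Z = 1j * w * L + (R_t * Zc) / (R_t + Zc)   # L in series with (R_t || C)
    return abs((Z - Z0) / (Z + Z0))            # |reflection coefficient|

# Larger tip-sample gap -> larger tunneling resistance -> more reflection.
for R_t in (1e5, 1e6, 1e7):
    print(f"R_t = {R_t:.0e} ohm -> |Gamma| = {reflection(R_t):.3f}")
```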

"There are six orders of magnitude between the fundamental limit in frequency and where people are operating," said Schwab. With the RF adaptation, speeds increase by a factor of between 100 and 1,000. "Our hope is that we can produce more or less video images, as opposed to a scan that takes forever."

The setup also offers potential for atomic resolution thermometry -- precise measurements of temperature at any particular atom on a surface -- and for motion detection so sensitive it could measure movement of a distance 30,000 times smaller than the size of an atom.


Lava provides window on early Earth

Researchers at Harvard and the University of Hawaii believe they've resolved a long-standing controversy over the roots of islands - volcanoes in the middle of tectonic plates - showing that the islands' lava provides a window into the early Earth's makeup.

Assistant Professor of Geochemistry Sujoy Mukhopadhyay and Helge Gonnermann at the University of Hawaii ran sophisticated computer models examining changes in the gases dissolved in magma as it rises from the mantle through the Earth's crust. The magma emerges as lava, sometimes in spectacular eruptions. As it cools, it can pile up to enormous heights, building ocean islands such as Hawaii's Mauna Kea, the world's tallest mountain measured from its base to its summit.

The controversy revolves around how one interprets apparently conflicting evidence presented by the helium in the magma of oceanic islands versus that in mid-ocean ridges - the long undersea mountain chains that run along the sea floor where tectonic plates spread apart and new oceanic crust is created.

One measure - the ratio of two different types of helium called isotopes - indicates that the lava making up oceanic islands is in part derived from the Earth's mantle and has been unchanged since the formation of the Earth.

The second measure, however - the magma's low concentration of helium - seems to indicate that the part of the mantle that melts to produce the oceanic island has been previously melted, which would let helium gas escape. This would indicate that the lavas making up oceanic islands like Hawaii have been recycled, going through a process of melting and solidifying and melting again, like lavas that erupt in the mid-ocean ridges.

In a report in the Oct. 25 issue of the journal Nature, Gonnermann and Mukhopadhyay explain that the low concentration of helium in island magma doesn't have to mean that it has been recycled. The two showed that helium would be lost from the island magma as it moved to the surface for the first time and as the enormous pressure it was under decreased. As the pressure declines, gases such as helium and carbon dioxide dissolved in the magma form bubbles, much like bubbles in a soda bottle when the top is popped.

The presence of a larger amount of carbon dioxide in ocean island lavas compared with mid-ocean ridge lavas is key, Mukhopadhyay said, because the carbon dioxide forms bubbles that give helium a gas phase to escape into from the liquid magma. Once the magma reaches the Earth's surface, the carbon dioxide and helium are lost to the air or water.
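The soda-bottle picture can be sketched with a simple closed-system partitioning model. The solubility ratio below is an invented round number, not a measured value; it only illustrates how a growing CO2 bubble fraction strips helium out of the melt:

```python
# Hedged sketch of batch degassing: helium partitions between the melt and
# the CO2 bubbles it contacts. Because helium strongly prefers the gas
# phase, even a small bubble volume fraction removes most of it.

def he_fraction_in_melt(gas_melt_volume_ratio, k_melt_gas=0.01):
    # k_melt_gas ~ effective melt/gas solubility ratio (illustrative value):
    # returns the fraction of helium still dissolved after equilibration.
    return k_melt_gas / (k_melt_gas + gas_melt_volume_ratio)

# As magma rises and pressure drops, the bubble fraction grows:
for v in (0.001, 0.01, 0.1):
    print(f"gas/melt volume ratio {v:5.3f} -> He left in melt: "
          f"{he_fraction_in_melt(v):.1%}")
```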

Mukhopadhyay said that these results have far-reaching effects in the understanding of how the Earth's geology works. Dominant theories hold that a slow circulation within the mantle - the layer between the crust and the core - coupled with the movement of the continental plates bringing material to the surface and back down again, have recycled the entire Earth over billions of years, leaving no material from the primordial Earth to be studied.

If the oceanic island lava is a remnant of the primordial Earth, however, it will require rethinking those theories to allow parts of the Earth to remain in their original state.

"We're showing that the geochemical data from ocean islands are indeed consistent with parts of the mantle not having melted over Earth history, and now we have to come up with scenarios or models of mantle convection that leave certain parts of the mantle untouched," Mukhopadhyay said.

Gonnermann said there may be a layer hidden somewhere in the lower mantle that is out of the main circulation or there may be pockets of primordial material scattered throughout.

"The challenge is to understand how this can be," said Gonnermann, who began the work in 2005 as Daly Postdoctoral Fellow at Harvard's Department of Earth and Planetary Sciences.

Mukhopadhyay said noble gases such as helium are powerful tools to understand degassing of volatiles over long periods of time because they are inert. Unlike carbon dioxide and water, which are also present in the mantle's rocks, helium doesn't interact with plants, animals, bacteria, and other biological entities, so scientists can be assured that its presence, absence, or the state it's in is a result of geological, not biological, processes.

Using helium and other nonreactive noble gases as tools, Mukhopadhyay said he'd like to continue looking back in time, examining the longstanding question of how the release of gases from the Earth's rocks influenced the makeup of the early atmosphere.


NEC Develops Security Minded -Picture Perfect ATM Display

Sensing the ever-growing problems of identity and personal data theft, NEC has developed a sharp, clear, security-minded display module for ATM technology.

NEC's LCD Technologies division has developed a TFT LCD module that maintains sharpness and clarity while shifting effortlessly between wide and narrow viewing angles. The improvement is aimed specifically at commercial ATM machines, where the security of displayed data can be, and has been, compromised by over-the-shoulder peeks.

NEC produced a working model of this security-minded display by placing a polarizing plate at the back of the new panel. The panel disperses light across either a 140-degree or a 30-degree angle. Previous methods placed screens in front of the panel to change the viewing angle, which seriously hampered the display's sharpness and clarity.

According to a NEC Headquarters press release, the angle-switching control enables the light from the backlight system to be switched from a diffused pattern to a straight pattern by control signals.

The distinct advantage of the new LCD module is its ability to show non-private commercial advertising in one mode and then switch to the privacy mode when personal data is displayed.

Currently, NEC has produced two fields of visions for the display. The possible development of other views and flexibility of the technology is under consideration by the NEC LCD Technologies division.


'Phononic Computer' Could Process Information with Heat

Most computers today use electrons to carry information, while theoretical optical computers use photons. Recently, physicists from Singapore have proposed a third type of computer: a "phononic computer," which would use heat, carried by phonons, to perform operations similar to its electronic counterpart.

"Heat is very abundant and very often it is regarded as useless and harmful for information processing," said Professor Baowen Li of the National University of Singapore. "The merit of our paper is that we demonstrate that, in addition to the existing electrons and photons, the phonons can also perform a similar function. This provides an alternative way for information processing. Moreover, the heat can be harnessed for use."

Li and co-author Lei Wang from the NUS have demonstrated how to make thermal logic gates for possible use in future phononic computers, with their results published in a recent issue of Physical Review Letters.

Logic gates, one of the basic elements of computers, perform an operation on one or more logic inputs to produce a single logic output. In electronic logic gates, the inputs and outputs are represented by different voltages. However, in a thermal logic gate, the inputs and outputs are represented by different temperatures.

The key element of the logic gate is the thermal transistor, invented by Li's group last year, which works much as a field-effect transistor controls electric current. The thermal transistor is composed of two weakly coupled terminals plus a third control terminal.

"Like all other theoretical modeling, we use heat bath to produce heat, which is a kind of random atomic or molecular motion," Li explained. "To conduct heat, you don't need too much external power. Any temperature difference will lead to heat conduction."

In the researchers' model, heat is conducted by lattice vibration, and the overlap between the vibration spectra of the two terminals determines the heat current. When the two spectra overlap, heat can travel easily between the terminals, representing the "on" state. When the vibration spectra do not overlap, very little heat (or none) passes through, representing the "off" state. The "negative differential thermal resistance" (NDTR) that arises from the match or mismatch of the vibrational spectra of the terminals' interface particles makes both the "on" and "off" states stable, which makes thermal logic operations possible.

"Like we explain in our Physical Review Letters article, all these logic gate functions can be achieved only when the system has the so-called negative or super response, by which we mean that the large temperature difference (change) will induce the small heat current," Li said. "This is the so-called 'negative differential thermal resistance.'" The NDTR phenomenon was also discovered by Li's group in 2006.

The researchers demonstrate how combining thermal transistors can be used to build different thermal logic gates, such as a signal repeater. A signal repeater "digitizes" the heat input, so that when the temperature is higher or lower than a critical value, the output is either "on" or "off," but not in between. By connecting a few thermal transistors in series, the researchers achieved a nearly ideal repeater. Besides signal repeaters, they also demonstrated a NOT gate, which reverses the input signal, and an AND/OR gate, made from the same thermal transistor model.
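The digitizing behavior of the signal repeater can be mimicked with a toy threshold model. The sigmoid response and all the temperatures here are assumptions standing in for the real transistor's nonlinear heat-current characteristic, not the model in the Physical Review Letters paper:

```python
# Toy "thermal signal repeater": each stage pushes an input temperature
# toward one of two stable set-points (T_OFF or T_ON), depending on
# whether it sits below or above a critical switching temperature.
import math

T_OFF, T_ON = 0.03, 0.30       # assumed "logical 0/1" temperatures
T_CRIT = (T_OFF + T_ON) / 2.0  # switching threshold

def repeater_stage(T_in, gain=25.0):
    # Soft threshold around T_CRIT; higher gain -> sharper digitization.
    s = 1.0 / (1.0 + math.exp(-gain * (T_in - T_CRIT)))
    return T_OFF + (T_ON - T_OFF) * s

def repeater(T_in, stages=4):
    # Chaining a few stages in series, as in the article, sharpens the output.
    T = T_in
    for _ in range(stages):
        T = repeater_stage(T)
    return T

print(repeater(0.10))  # below threshold: settles near T_OFF
print(repeater(0.25))  # above threshold: settles near T_ON
```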

While the current model simply shows the feasibility of thermal logic gates, Wang and Li predict that an experimental realization of the devices in nanoscale systems may not be too far off. They point out that another thermal device, the solid-state thermal rectifier, was experimentally demonstrated in 2006, just a few years after the proposed theoretical model.

"One advantage of a phononic computer might be that we don't need to consume a lot of electricity," Li said. "We may use the redundant heat produced by electronic devices or provided by Mother Nature to do useful work. Another advantage is that, one day, human beings may control and use heat wisely so that we can save a lot of energy, which is a big issue nowadays."


Computer Museum: A peek back at the history of computing

For an industry that's just 30 years old, personal computing has a lot of history.

The ConBrio 200R, one of four that exist in the world.

Here at the Computer History Museum, just a stone's throw from the Microsoft campus in Silicon Valley, PC industry veterans, tech enthusiasts, and even a few kids came out for the annual Vintage Computer Festival.

The event is highlighted by seminars and panels on topics like "Deconstructing the Intel 4004" and "The Disk Drive Industry Family Tree," but the real payoff is the Exhibit Hall, in which hobbyists display their dusty, yellowed sets of two-decades-old computers, usually arranged around a theme: calculators, Macintosh, Hewlett-Packard, Atari, and more. The collections are usually culled from attics, basements, garages and, of course, eBay.

Of the sessions, a clear crowd favorite was the DigiBarn presentation. DigiBarn is a computer museum housed on a farm in California's Santa Cruz Mountains, famous among the PC hobbyist crowd for its extensive library of seminal computers and accessories. DigiBarn is run by Bruce Damer and Allan Lundell, who describe their museum as the container for "the garages of Silicon Valley."

DigiBarn first opened 10 years ago as Damer's ode to the birth of the graphical interface, first developed by Xerox. Now his collection includes a range of PCs, calculators, and even flight computers for airplanes.

The centerpiece of this year's festival was the LINC (Laboratory Instrument Computer), first developed in 1961 at Washington University in St. Louis. DigiBarn's Damer hailed it as the "first personal workstation devoted to one person." The hulking gray metal box, covered in knobs and sporting a tiny display, was invented for online biomedical research in individual research labs.

In the exhibit hall, Tom Wilson of Woodside, Calif., an attendee of the Vintage Computer Festival for several years, showed off his collection as an exhibitor for the first time. He brought along his Atari 800 and Atari 400 computers, along with an Atari cassette player, a printer, and popular games like Donkey Kong, Frogger, Q*bert, and Pac-Man, and let anyone try their hand at them.

Next to a vast collection of calculators sat a hulking combination computer and keyboard--not the computer peripheral, but the musical kind. Called the ConBrio 200R, it's one of four that exist in the world. It was originally invented in 1980 by three students at the California Institute of Technology to write synthesized music. It recently fell into the hands of Brian Kehew, a Los Angeles-based music producer. When he's not touring as the keyboard player for rock band The Who, as he did last summer, Kehew is hatching a plan to be the first person to use the ConBrio to record music. But first he has to figure out how to use it.

"There's no owner's manual," Kehew said. "I'm learning how to work it by myself with tips from the old (inventors)."


Space Station's Damaged Panel Is Fixed

Astronauts patched a damaged solar panel on the international space station yesterday during a tricky and dangerous seven-hour spacewalk.

Perched on the tip of an extension of the station's long robotic arm, astronaut Scott E. Parazynski snipped off tangles of broken and frayed wires that had ripped open two spots on the huge solar array, and installed five jury-rigged straps to reinforce the damaged area, allowing the panel to finally unfurl fully.

In this image provided by NASA television, astronaut Scott Parazynski (top) gets into foot restraints with the assistance of astronaut Douglas Wheelock at the end of the 90-foot robotic arm and boom extension (center), which carried Parazynski on a 45-minute ride to the damage site at the start of the spacewalk to repair the damaged solar array on Saturday, Nov. 3, 2007.

"Excellent work, guys, excellent," space station commander Peggy A. Whitson said after the tense, painstaking job was finally done.

The spacewalk was considered particularly risky, with Parazynski venturing farther from the safety of the station than ever before. The repairs were unusually complicated because the astronauts were unable to fully assess the damage before getting close to the array and had to hope that their quickly improvised repair plans would work. Normally, such a repair mission would take weeks or even months of preparation and rehearsal.

But without the repairs, the damaged solar wing could have become structurally unstable, posing a hazard to the outpost and requiring that it be jettisoned.

Without the panel, the station would not have enough power to continue expanding. That could have forced a postponement of the installation of the next component, a European laboratory, next month. NASA is under pressure to complete the construction of the station before it retires the aging space shuttle fleet in 2010.

So, wearing protective spacesuits, Parazynski and astronaut Douglas H. Wheelock ventured out of the station, orbiting about 213 miles above the East Coast, just past 6 a.m. to begin the unprecedented job.

"Go out there and fix that thing for us," Whitson radioed just before the pair left the safety of the station's airlock.

With Wheelock positioned at the base of the solar array, Parazynski anchored his feet to the end of a 50-foot boom from the space shuttle; the boom was grasped in the middle by the station's 58-foot robotic arm. The arm carried him on a slow-motion, 45-minute trip half a football field away to just barely reach the damaged panel.

Dramatic live-television images showed Parazynski atop the extended arm with the bright orange solar array behind him and the brilliant blue and white Earth below.

Once there, Parazynski, an emergency-room physician, assessed the full extent of the damage for the first time, describing a daunting "hairball" of tangled wires in the area that was mangled when the solar panel was deployed Tuesday. The panel suffered two tears; the largest was about 2 1/2 feet long.

All the tools and all the metal parts of Parazynski's spacesuit were wrapped with insulating tape to minimize the risk of the astronaut getting shocked by the electric array, which is generating 160 volts. His bulky gloves were also covered with extra mittens for added protection.

Using an L-shaped device, dubbed a "hockey stick," to periodically and gently nudge the array away, as well as needle-nose pliers, vice-grips and clippers to cut away and secure loose wires, Parazynski methodically completed the repair, radioing his colleagues each step of the way.

Parazynski, one of the most experienced spacewalkers, installed five "cuff links," which the astronauts had pieced together from spare parts aboard the station. The three- to five-foot long pieces are insulated cables with aluminum plates at each end that Parazynski slipped into holes in the array like cuff links into a shirt sleeve. They are designed to support the damaged area and prevent further tearing.

Once all five were installed, Parazynski and Wheelock watched closely as ground controllers slowly unfurled the solar panel to its full 110-foot length. Parazynski then rode the robotic arm back to safety, returning inside the station with Wheelock at 1:22 p.m.

"It certainly was a really good day overall," said Dina Contella, lead spacewalk officer, during a briefing afterward.

The torn array, however, is just one of the problems facing the station. Metal shavings were discovered earlier in a joint on the station, jamming control of the solar arrays on that side. Space station managers are trying to determine how to fix that malfunction.


Three robots finished the DARPA Urban Challenge

Stanford, CMU robots cross the finish line.

Three robots finished the DARPA Urban Challenge within the allotted time Saturday, a new milestone in the development of self-driving vehicles.

In the running for the $2 million first prize and $1 million second prize are Stanford's robotic VW Passat, Virginia Tech's modified Ford Escape Hybrid, and Carnegie Mellon's autonomous Chevrolet Tahoe. These teams finished the urban challenge's three missions within the allotted six hours and without significant problems.

Other teams including the Ben Franklin Racing Team's robotic Toyota Prius also completed the course, but it's uncertain whether it crossed the finish line in time given various stops in the race.

DARPA plans to name the winners Sunday after compiling and evaluating all of its data on the race vehicles, including data on their speed and compliance with basic traffic rules. DARPA officials had said that the fastest car wouldn't necessarily win if it didn't pass all of the driving rules.

But DARPA Director Tony Tether said that he hadn't seen anything egregious among the first three finalists.

"We have a winner," Tether said in an interview with CNET here at the former George Air Force Base.

Stanford University's robot car Junior crossed the finish line first, but it's unclear whether it will take first prize like it did in the 2005 Grand Challenge, a 132-mile race across the Nevada desert.

CMU's robot Boss followed Junior across the finish line a couple of minutes later, but Boss had started the race as much as 40 minutes after Junior left the starting gate. It had been scheduled to leave the starting gate first early Saturday, but it experienced technical problems involving interference with its Global Positioning System, thanks to a local Jumbotron. CMU thus made up significant time over its three required missions during the day.

Virginia Tech's team, VictorTango, crossed the finish line third. The team had been the first to leave the starting gate, around 8 a.m., and it was also the first to complete its first two missions.

DARPA Urban Challenge : Robots drop fast in driverless car race

Oshkosh Truck's TerraMax truck is eliminated from the race after nearly running into an old shopping center.

(Credit: Stefanie Olsen/CNET Networks)

After two hours in the race, three teams have fallen out of the DARPA Urban Challenge, leaving eight driverless cars to finish the urban course.

So far, Team Oshkosh, Team Annieway, and Intelligent Vehicle Systems have been eliminated from the challenge for various reasons.

Team Oshkosh's entry, an Oshkosh truck weighing more than 24,000 pounds, nearly ran into an old shopping center here at the former George Air Force Base after it ran over a parking lot curb. The Intelligent Vehicle Systems team, a collaboration among Honeywell, Ford, and Delphi, had difficulty negotiating what to do at a stop sign.

Without those three teams, Stanford, the Massachusetts Institute of Technology and Carnegie Mellon are among the eight teams remaining in the race.

A little after 10 a.m. PDT, Virginia Tech's VictorTango team completed its first mission in the challenge, after which it moved back to the starting chute. Stanford University's robot Junior had already completed its first route, too.

After finishing a mission, each robot receives another mission definition file, or MDF, a file delivered on a USB drive that lists waypoints on the course to guide the robot's basic navigation. Teams must complete three missions in six hours total.
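As a rough illustration of the mission-file idea, here is a toy parser that turns an ordered checkpoint list into a mission plan. The format below is invented for illustration only; the actual DARPA MDF format is more elaborate.

```python
# Hypothetical sketch: parse a toy mission file into an ordered list of
# checkpoint IDs. Blank lines and '#' comments are skipped; the first
# token on each remaining line is treated as the checkpoint ID.

def parse_mission(text):
    """Return the ordered list of checkpoint IDs from a toy mission file."""
    checkpoints = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        checkpoints.append(line.split()[0])
    return checkpoints

mission = """
# toy mission definition
cp_12  first checkpoint
cp_07  second checkpoint
cp_31  final checkpoint
"""
print(parse_mission(mission))  # ['cp_12', 'cp_07', 'cp_31']
```

A real autonomy stack would hand this ordered list to a route planner, which finds drivable paths between consecutive checkpoints while obeying traffic rules.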

As Virginia Tech hit a milestone, the team of the University of Central Florida hit something else. The robot, Knight Rider, ran into an abandoned house on the course. It remains to be seen whether Knight Rider will be cut from the contestants.

Space Station's Damaged Panel Is Fixed

Astronauts patched a damaged solar panel on the international space station yesterday during a tricky and dangerous seven-hour spacewalk.

Perched on the tip of an extension of the station's long robotic arm, astronaut Scott E. Parazynski snipped off tangles of broken and frayed wires that had ripped open two spots on the huge solar array, and installed five jury-rigged straps to reinforce the damaged area, allowing the panel to finally unfurl fully.

In this image provided by NASA television, astronaut Scott Parazynski (top) gets into foot restraints with the assistance of astronaut Douglas Wheelock at the end of the 90-foot robotic arm and boom extension (center), which carried Parazynski on a 45-minute ride to the damage site at the start of the spacewalk to repair the damaged solar array on Saturday, Nov. 3, 2007.

World's Most Complex Silicon Phased-array Chip Developed

"This is the first 16-element phased-array chip that can send at 30 to 50 GHz. The uniformity and low coupling between the elements, the low current consumption, and the small size (it is just 3.2 by 2.6 millimeters) are all unprecedented. As a whole system, there are many, many firsts," said Gabriel Rebeiz, the electrical engineering professor at the UCSD Jacobs School of Engineering leading the project.

This chip - the UCSD DARPA Smart Q-Band 4x4 Array Transmitter - is strictly a transmitter. "We are working on a chip that can do a transmit and receive function," said Rebeiz.

"This compact beamforming chip will enable a breakthrough in size, weight, performance and cost in next-generation phased arrays for millimeter-wave military sensor and communication systems," DARPA officials wrote in a statement.

"DARPA has funded us to try to get everything on a single silicon chip, which would reduce the cost of phased arrays tremendously. In large quantities, this new chip would cost a few dollars to manufacture. Obviously, this is only the transmitter. You still need the receiver, but one can easily build the receiver chip based on the designs available in the transmitter chip. Our work addresses the most costly part of the phased array: the 16:1 divider, phase shifters, amplitude controllers, and the uniformity and isolation between channels," said Rebeiz.

The chip also contains all the CMOS digital circuits necessary for complete digital control of the phased array, and it was fabricated in the commercial Jazz SBC18HX process. This is a first, and it greatly reduces the fabrication complexity of the phased array. The chip has been designed for use at the defense satellite communications band, the Q-band, which runs from 40 to 50 GHz.

"If you take the same design and move it to the 24 or 60 GHz range, you can use it for commercial terrestrial communications," said Rebeiz who is also a lead on a separate project, funded by Intel and a UC-Discovery Grant, to create silicon CMOS phased array chips that could be embedded into laptops and serve as high speed data transfer tools.

The Intel project is a collaboration between Rebeiz, Larry Larson and Ian Galton - all electrical engineering professors at the UCSD Jacobs School of Engineering. Larson also serves as the chair of the Department of Electrical and Computer Engineering.

"If you wanted to download a large movie file, a base station could find you, zoom onto you, and direct a beam to your receiver chip. This could enable data transfer of hundreds of gigabytes of information very quickly, and without connecting a cable or adhering to the alignment requirements of wireless optical data transfer," explained Rebeiz who estimated that this kind of system could be available in as little as three years.

Phased Array Background Information

Phased arrays have been around for more than half a century. They are groups of antennas in which the relative phases of the signals that feed them are varied so that the effective radiation pattern of the array is reinforced in a particular direction and suppressed in undesired directions. This property - combined with the fact that radio waves can pass through clouds and most other materials that stymie optical communication systems - has led engineers to use phased arrays for satellite communications, and for detecting incoming airplanes, ships and missiles.

Some phased arrays are larger than highway billboards and the most powerful - used as sophisticated radar, surveillance and communications systems for military aircraft and ships - can cost hundreds of millions of dollars. The high cost has prevented significant spread beyond military and high-end satellite communication applications. Engineers are now working to miniaturize them and fully integrate them into silicon-based electronic systems for both military and commercial applications.

The new UCSD chip packs 16 channels into a 3.2 mm by 2.6 mm die. The input signal is divided on-chip into 16 paths of equal amplitude and phase using an innovative design, and the phase and gain of each of the 16 channels are controlled electronically to direct the antenna pattern (the beam) in a specific direction.

By manipulating the phase, you can steer the beam electronically in nanoseconds. With the amplitude, you control the width of the beam, which is critical, for example, when you send information from one satellite to another but don't want the signal to reach any nearby satellites. And with combined amplitude and phase control, you can synthesize deep nulls in the antenna pattern to greatly reduce the effect of interfering signals from neighboring transmitters.
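The phase relationship behind electronic beam steering can be sketched numerically. The snippet below models a generic 16-element uniform linear array (an illustrative textbook model, not the UCSD design): feeding element n with a phase offset proportional to n times the sine of the steering angle makes the element signals add coherently in that direction, so the array response peaks where the beam is pointed.

```python
import numpy as np

# Illustrative uniform linear array: N elements spaced d wavelengths apart.
# Element n gets phase -2*pi*n*d*sin(theta0), so all contributions are
# in phase for a signal arriving from (or sent toward) direction theta0.

def array_factor(theta, theta0, n_elem=16, d_over_lam=0.5):
    """Normalized array response at angle theta when steered to theta0."""
    n = np.arange(n_elem)
    phase = 2 * np.pi * d_over_lam * n * (np.sin(theta) - np.sin(theta0))
    return np.abs(np.exp(1j * phase).sum()) / n_elem

theta0 = np.deg2rad(30)  # steer the beam to 30 degrees
angles = np.deg2rad(np.linspace(-90, 90, 721))
af = [array_factor(t, theta0) for t in angles]
peak = np.rad2deg(angles[int(np.argmax(af))])
print(f"beam peaks at {peak:.1f} degrees")  # ~30.0
```

Changing theta0 moves the peak instantly, which is the numerical analogue of the nanosecond electronic steering described above; tapering the per-element amplitudes would widen the main beam and lower the sidelobes.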

How To Predict The Future Of The Past Tense

Verbs evolve and homogenize at a rate inversely proportional to their prevalence in the English language, according to a formula developed by Harvard University mathematicians who've invoked evolutionary principles to study our language over the past 1,200 years, from "Beowulf" to "Canterbury Tales" to "Harry Potter."

Writing this week in the journal Nature, Erez Lieberman, Jean-Baptiste Michel, and colleagues in Harvard's Program for Evolutionary Dynamics, led by Martin A. Nowak, conceive of linguistic development as an essentially evolutionary scheme: Just as genes and organisms undergo natural selection, words -- specifically, irregular verbs that do not take an "-ed" ending in the past tense -- are subject to powerful pressure to "regularize" as the language develops.

"Mathematical analysis of this linguistic evolution reveals that irregular verb conjugations behave in an extremely regular way -- one that can yield predictions and insights into the future stages of a verb's evolutionary trajectory," says Lieberman, a graduate student in applied mathematics in Harvard's School of Engineering and Applied Sciences and in the Harvard-MIT Division of Health Sciences and Technology, and an affiliate of Harvard's Program for Evolutionary Dynamics. "We measured something no one really thought could be measured, and got a striking and beautiful result."

"We're really on the front lines of developing the mathematical tools to study evolutionary dynamics," says Michel, a graduate student in systems biology at Harvard Medical School and an affiliate of the Program for Evolutionary Dynamics. "Before, language was considered too messy and difficult a system for mathematical study, but now we're able to successfully quantify an aspect of how language changes and develops."

Lieberman, Michel, and colleagues built upon previous study of seven competing rules for verb conjugation in Old English, six of which have gradually faded from use over time. They found that the one surviving rule, which adds an "-ed" suffix to simple past and past participle forms, contributes to the evolutionary decay of irregular English verbs according to a specific mathematical function: It regularizes them at a rate that is inversely proportional to the square root of their usage frequency.

In other words, a verb used 100 times less frequently will evolve 10 times as fast.

To develop this formula, the researchers tracked the status of 177 irregular verbs in Old English through linguistic changes in Middle English and then modern English. Of these 177 verbs that were irregular 1,200 years ago, 145 stayed irregular in Middle English and just 98 remain irregular today, following the regularization over the centuries of such verbs as help, laugh, reach, walk, and work.

Lieberman and Michel's group computed the "half-lives" of the surviving irregular verbs to predict how long they will take to regularize. The most common ones, such as "be" and "think," have such long half-lives (38,800 years and 14,400 years, respectively) that they will effectively never become regular. Irregular verbs with lower frequencies of use -- such as "shrive" and "smite," with half-lives of 300 and 700 years, respectively -- are much more likely to succumb to regularization.
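The scaling behind these half-lives is simple enough to state in a few lines of code. This sketch encodes only the relationship reported in the paper (regularization rate inversely proportional to the square root of usage frequency, so half-life grows with the square root of frequency); the reference values are placeholders, not the study's data.

```python
import math

# Half-life of an irregular verb relative to a reference verb, under the
# reported scaling: rate ~ 1/sqrt(frequency), hence half-life ~ sqrt(frequency).
# ref_freq and ref_half_life are illustrative placeholders.

def relative_half_life(freq, ref_freq=1.0, ref_half_life=1.0):
    """Half-life relative to a reference verb, under the sqrt scaling."""
    return ref_half_life * math.sqrt(freq / ref_freq)

# A verb used 100x less frequently regularizes 10x as fast:
print(relative_half_life(0.01))  # ~0.1, i.e. one tenth the half-life
```

This is why "be" and "think" are effectively immortal while rare verbs like "shrive" and "smite" are predicted to regularize within centuries.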

Lieberman, Michel, and their co-authors project that the next word to regularize will likely be "wed."

"Now may be your last chance to be a 'newly wed'," they quip in the Nature paper. "The married couples of the future can only hope for 'wedded' bliss."

Extant irregular verbs represent the vestiges of long-abandoned rules of conjugation; new verbs entering English, such as "google," are universally regular. Although fewer than 3 percent of modern English verbs are irregular, this number includes the 10 most common verbs: be, have, do, go, say, can, will, see, take, and get. Lieberman, Michel, and colleagues expect that some 15 of the 98 modern irregular verbs they studied -- although likely none of these top 10 -- will regularize in the next 500 years.

The group's Nature paper gives a quantitative, strikingly precise account of something linguists have long suspected: the most frequently used irregular verbs are repeated so often that they are unlikely ever to go extinct.

"Irregular verbs are fossils that reveal how linguistic rules, and perhaps social rules, are born and die," Michel says.

"If you apply the right mathematical structure to your data, you find that the math also organizes your thinking about the entire process," says Lieberman, whose unorthodox projects as a graduate student have ranged from genomics to bioastronautics. "The data hasn't changed, but suddenly you're able to make powerful predictions about the future."

Lieberman and Michel's co-authors on the Nature paper are Nowak, professor of mathematics and of biology at Harvard and director of the Program for Evolutionary Dynamics, and Harvard undergraduates Joe Jackson and Tina Tang. Their work was sponsored by the John Templeton Foundation, the National Science Foundation, and the National Institutes of Health.
