
Saturday, April 5, 2008

A modern computer contains two different types of components: magnetic components and semiconductor components

Hybrid computer materials may lead to faster, cheaper technology
A modern computer contains two different types of components: magnetic components, which perform memory functions, and semiconductor components, which perform logic operations. A University of Missouri researcher, as part of a multi-university research team, is working to combine these two functions in a single hybrid material.

This new material would allow seamless integration of memory and logical functions and is expected to permit the design of devices that operate at much higher speeds and use considerably less power than current electronic devices.

Giovanni Vignale, MU physics professor in the College of Arts and Science and expert in condensed matter physics, says the primary goal of the research team, funded by a $6.5 million grant from the Department of Defense, is to explore new ways to integrate magnetism and magnetic materials with emerging electronic materials such as organic semiconductors.

The research may lead to considerably more compact and energy-efficient devices. The processing costs for these hybrid materials are projected to be much less than those of traditional semiconductor chips, resulting in devices that should be less expensive to produce.

“In this approach, the coupling between magnetic and non-magnetic components would occur via a magnetic field or flow of electron spin, which is the fundamental property of an electron and is responsible for most magnetic phenomena,” Vignale said. “The hybrid devices that we target would allow seamless integration of memory and logical function, high-speed optical communication and switching, and new sensor capabilities.”

Vignale studies processes by which magnetic information can be transferred from one place to another.

“One of the main theoretical tools I will be using for this project is the time-dependent, spin-current density functional theory,” Vignale said. “It is a theory to which I have made many contributions over the years. The results of these theoretical calculations will be useful both to understand and to guide the experimental work of other team members.”

The researchers focused a powerful drug directly on tumors in rabbits using drug-coated nanoparticles

A tumor treated with fumagillin nanoparticles (left) is smaller than an untreated tumor. Nanoparticles containing an image-enhancing metal (yellow) show that the treated tumor has much less blood vessel growth than the untreated tumor.

Nano-sized technology has super-sized effect on tumors
Anyone facing chemotherapy would welcome an advance promising to dramatically reduce their dose of these often harsh drugs. Using nanotechnology, researchers at Washington University School of Medicine in St. Louis have taken a step closer to that goal.

The researchers focused a powerful drug directly on tumors in rabbits using drug-coated nanoparticles. They found that a drug dose 1,000 times lower than used previously for this purpose markedly slowed tumor growth.

"Many chemotherapeutic drugs have unwanted side effects, and we've shown that our nanoparticle technology has the potential to increase drug effectiveness and decrease drug dose to alleviate harmful side effects," says lead author Patrick M. Winter, Ph.D., research assistant professor of medicine and biomedical engineering.

The nanoparticles are extremely tiny beads of an inert, oily compound that can be coated with a wide variety of active substances. In an article published online in The FASEB Journal, the researchers describe a significant reduction of tumor growth in rabbits that were treated with nanoparticles coated with a fungal toxin called fumagillin. Human clinical trials have shown that fumagillin can be an effective cancer treatment in combination with other anticancer drugs.

In addition to fumagillin, the nanoparticles' surfaces held molecules designed to stick to proteins found primarily on the cells of growing blood vessels. So the nanoparticles latched on to sites of blood vessel proliferation and released their fumagillin load into blood vessel cells. Fumagillin blocks multiplication of blood vessel cells, so it inhibited tumors from expanding their blood supply and slowed their growth.

Human trials have also shown that fumagillin can have neurotoxic side effects at the high doses required when given by standard methods. But the fumagillin nanoparticles were effective in very low doses because they concentrate where tumors create new blood vessels. The rabbits that received fumagillin nanoparticles showed no adverse side effects.
Senior author Gregory M. Lanza, M.D., Ph.D., associate professor of medicine and of biomedical engineering, and Samuel A. Wickline, M.D., professor of medicine, of physics and of biomedical engineering, are co-inventors of the nanoparticle technology. The nanoparticles measure only about 200 nanometers across, or 500 times smaller than the width of a human hair. Their cores are composed mostly of perfluorocarbon, a safe compound used in artificial blood.
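The size comparison quoted above is easy to verify with a quick back-of-envelope calculation; a minimal sketch, assuming the "500 times smaller than a human hair" figure implies a hair width of about 100 micrometers:

```python
# Back-of-envelope check of the nanoparticle size comparison.
particle_nm = 200   # nanoparticle diameter, from the article
ratio = 500         # "500 times smaller than the width of a human hair"

hair_nm = particle_nm * ratio
print(f"Implied hair width: {hair_nm} nm = {hair_nm / 1000:.0f} micrometers")
# A human hair is typically 50-120 micrometers wide, so the comparison holds.
```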

The nanoparticles can be adapted to many different medical applications. In addition to carrying drugs to targeted locations, they can be manufactured to highlight specific targets in magnetic resonance imaging (MRI), nuclear imaging, CT scanning and ultrasound imaging.

In this study, researchers loaded blood-vessel-targeted nanoparticles with MRI contrast agent and were able to make detailed maps of tumor blood vessel growth using standard MRI equipment. The MRI scans showed that blood vessel formation tended to concentrate in limited areas on the surface at one side of tumors instead of dispersing uniformly, which was a surprise.

"Using the blood-vessel targeted nanoparticles, we get a far more complete view of tumor biology than we would get with any other technique," Winter says. "If you followed a tumor over a period of time with the nanoparticles and MRI scans, you would have a much better understanding of the tumor's reaction to treatment."

The researchers say they believe nanoparticle technology will be very useful for monitoring cancer treatment results in both the short and long term.

"It gives you a way of determining whether you should continue treatment, change the dose or even try a different treatment altogether," Lanza says.

Prior work has shown that the nanoparticles can be loaded with many kinds of drugs. The researchers used fumagillin nanoparticles in these experiments to demonstrate the feasibility of this approach, but they plan further investigations with other versions of the nanoparticles.

"What this report clearly demonstrates is that our nanoparticles can carry chemotherapeutic drugs specifically to tumors and have an effect at the tumor site," Lanza says. "Sometimes when I give presentations about our nanotechnology, people react as if it was science fiction or at best a technology of the distant future. But we've shown that the technology is ready for medical applications now."

The nanoparticles will be tested this year in preliminary human clinical trials to determine the optimal method for using them as imaging agents. These studies will lay essential groundwork for using the nanoparticles as therapeutic agents.

Researchers have used graphene to measure an important and mysterious fundamental constant and glimpse the foundations of the universe.

Magnified image of research samples with small holes covered by graphene. Light passing through them can be seen with the naked eye.
Graphene gazing gives glimpse of foundations of universe
Researchers at The University of Manchester have used graphene to measure an important and mysterious fundamental constant - and glimpse the foundations of the universe.

The researchers from The School of Physics and Astronomy, led by Professor Andre Geim, have found that the world’s thinnest material absorbs a well-defined fraction of visible light, which allows the direct determination of the fine structure constant.

Working with theorists from the University of Minho in Portugal, Geim and colleagues report their findings online in the latest edition of Science Express. The paper will be published in the journal Science in the coming weeks.

The universe and life on this planet are intimately controlled by several exact numbers; so-called fundamental or universal constants such as the speed of light and the electric charge of an electron.

Among them, the fine structure constant is arguably the most mysterious. It defines the interaction between very fast-moving electrical charges and light, or electromagnetic waves, and its exact value is close to 1/137.

Prof Geim, who with Dr Kostya Novoselov discovered graphene – a one-atom-thick gauze of carbon atoms resembling chicken wire – in 2004, says: “Change this fine-tuned number by only a few percent and life would not be here, because nuclear reactions in which carbon is generated from lighter elements in burning stars would be forbidden. No carbon means no life.”

Geim, now working with PhD students Rahul Nair and Peter Blake, has for the first time produced large suspended membranes of graphene, so that one can easily see light passing through this thinnest of all materials.

The researchers have found that the carbon monolayer is not crystal-clear but notably opaque, absorbing a rather large 2.3 percent of visible light. The experiments, supported by theory, show that this number divided by pi gives the exact value of the fine structure constant.
The fundamental reason for this is that electrons in graphene behave as if they have completely lost their mass, as shown in the previous work of the Manchester group and repeated by many researchers worldwide.
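The arithmetic behind this claim is simple enough to check; a quick sketch, using the article's rounded 2.3 percent figure (so the result is only approximate):

```python
import math

absorption = 0.023            # fraction of visible light absorbed by graphene
alpha = absorption / math.pi  # graphene absorbs pi * alpha of incident light
print(f"alpha ~ {alpha:.5f}, 1/alpha ~ {1 / alpha:.1f}")
# 1/alpha comes out close to the famous value of about 137.
```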

The accuracy of the optical determination of the constant so far is relatively low, by metrological standards.

But researchers say the simplicity of the Manchester experiment is “truly amazing” as measurements of fundamental constants normally require sophisticated facilities and special conditions.

With large membranes in hand, Prof Geim says, it requires barely anything more sophisticated than a camera to measure the visual transparency of graphene.

“We were absolutely flabbergasted when we realized that such a fundamental effect could be measured in such a simple way. One can have a glimpse of the very foundations of our universe just by looking through graphene,” said Prof Geim.

“Graphene continues to surprise beyond the wildest imagination of the early days when we found this material.

“It works like a magic wand – whatever property or phenomenon you address with graphene, it brings you back sheer magic.

“I was rather pessimistic about graphene-based technologies coming out of research labs any time soon. I have to admit I was wrong. They are coming sooner rather than later.”

T-REX is monster light source

T-REX is monster light source with multiple applications

When it comes to laser-based light sources, there are few brighter than T-REX, an LLNL project developed jointly by the NIF & Photon Science Principal Directorate and the Physical Sciences Directorate.

Technically known as the Thomson-Radiated Extreme X-ray Source, T-REX is an advanced, laser-based light source in which novel, energetic, picosecond laser pulses are scattered from relativistic electrons to produce monochromatic, highly collimated, tunable X-rays and gamma-rays.

The system will be able to study isotopes, allowing researchers to address challenges in homeland and international security, nonproliferation, advanced nuclear power systems and nuclear waste identification. For example, it will support the Department of Homeland Security's FINDER project for high-confidence detection of nuclear materials to enhance port security. Addressing these national security missions may also lead to new possibilities for medical and industrial applications of isotope-specific imaging.

T-REX achieved megaelectronvolt (MeV)-class first light late last month. On March 26, its 10-picosecond electron beam was powered up to 120 MeV and collided with UV laser photons, producing gamma rays at 0.776 MeV and making it the brightest such instrument in the world in this energy range.
"We are still working on verifying the absolute record brightness of the source," said Chris Barty, program director for the Lab's Photon Science and Applications Program.

"But without a doubt, the ~0.75 MeV radiation produced by T-REX is unique in the world with respect to its brightness, spectral purity, tunability, pulse duration and laser-like beam character," he said.
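Those numbers hang together under the standard expression for head-on Compton (inverse-Thomson) backscattering, where the scattered photon energy is boosted by roughly a factor of 4γ². A rough sketch, assuming a frequency-tripled 355 nm UV laser photon (the article does not state the wavelength, so that value is an assumption):

```python
# Rough check of the T-REX gamma-ray energy via head-on Compton backscattering:
# E_gamma ~ 4 * gamma**2 * E_laser (valid while 4*gamma*E_laser << electron rest energy)
electron_energy_mev = 120.0
electron_rest_mev = 0.511
laser_wavelength_nm = 355.0               # assumed: frequency-tripled Nd:YAG UV line
laser_photon_ev = 1239.84 / laser_wavelength_nm

gamma = electron_energy_mev / electron_rest_mev
e_gamma_mev = 4 * gamma**2 * laser_photon_ev / 1e6
print(f"Predicted gamma-ray energy: {e_gamma_mev:.2f} MeV")
# ~0.77 MeV, close to the reported 0.776 MeV.
```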

The system builds on a past Livermore project called Picosecond Laser-Electron Interaction for the Dynamic Evaluation of Structures (PLEIADES), which was funded by the Laboratory Directed Research and Development (LDRD) Program.

In 2003, the PLEIADES system generated record pulses of 70-kiloelectronvolts (keV) X-rays. Traditionally, beams in this particular energy regime are created in synchrotron facilities.

According to Barty, "With its MeV-range capabilities, T-REX's peak brightness will be up to 10 orders of magnitude greater than current third-generation synchrotron light sources."

Bright gamma-ray pulses tuned to specific nuclear energy levels may be used to detect specific nuclei and isotopes, through a process called nuclear resonance fluorescence, first described by Edward Teller in 1948.

Nearly all nuclei have a set of nuclear "fingerprints" — several photon-excited states unique to individual isotopes. When a photon with the defined energy hits a targeted nucleus, the photon is absorbed. The excited nucleus then decays, radiating photons of the characteristic energy in all directions. The absorption of resonant photons as well as the emitted energy spectrum can be used to identify the nuclear species or isotope of the target.

The Department of Homeland Security's Domestic Nuclear Detection Office is funding research to explore this imaging and detection capability. The proposed system, called fluorescence imaging in the nuclear domain with extreme radiation (FINDER), could be used to image the isotopic composition of materials inside well-shielded objects, such as cargo containers moving through an inspection terminal. If successful, a FINDER system based on T-REX technology could provide a solution to the challenge of detecting concealed highly enriched uranium.

Barty added that his team is pursuing dynamic applications for T-REX, such as capturing "isotope snapshots" of the movement of materials with a 100-billionth-of-a-second shutter speed.

"It's a new area that may very well have a large impact on the Lab's core national security missions," said Barty.

To fix a potentially fatal shaking problem on its snazzy new moon rocket, NASA is considering something that works for mud-stained pickups: heavy-duty shock absorbers.

NASA goes low-tech to fix high-tech problem
To fix a potentially fatal shaking problem on its snazzy new moon rocket, NASA is considering something that works for mud-stained pickups: heavy-duty shock absorbers.

For nearly half a year, NASA's No. 1 technical problem in designing its Ares I rocket, which will eventually propel astronauts back to the moon, has been a sound wave vibration problem from its solid rocket motors.

If the vibrations hit the right frequency, they could shake the astronauts to death -- or at the least make it impossible for them to work. The astronauts would be in the Orion crew capsule launched on top of the Ares.

The leading solution is to put weight on springs in parts of the bottom end of the rocket and underneath astronauts' seats to dampen the vibrations. Think MacPherson struts, said Garry Lyles, who heads a NASA team working on the problem.

"These are actually absorbers that are used in vehicles today, especially 1-ton and 1½-ton pickup trucks," Lyles said Thursday.

He said it's possible that further analysis and tests will reveal that the shaking problem that's turned up in computer models of the still unbuilt Ares may be a non-issue. But engineers are seeking solutions just in case.

NASA is not ready to proclaim the case closed and still considers it the highest level of potential problem, Lyles said.

Ares project manager Steve Cook called it "a very manageable issue."

There are many such challenges facing NASA's return-to-the-moon program, according to a report issued Thursday by outside federal auditors.

The Government Accountability Office highlighted other potential problems, including too much weight in both the rocket and Orion capsule, design issues with a new engine for a booster, insufficient facilities for certain types of testing, and private industry's inability to make the Orion capsule's 1960s-style peel-away heat shield.

None of the technical problems is "a fatal flaw," the report's author, Christine Chaplain, told a House Science subcommittee Thursday.

Former astronaut Kathryn Thornton, associate dean of engineering at the University of Virginia, said experts believe that one of the biggest problems is that the space agency is set on a schedule of returning people to the moon by 2020 without enough money.

Internet security: Debugging the Efforts To Tackle Cybercrime

As we increasingly depend on the Web, maintaining a high level of security for data and related services is essential.
Officials weakened another provision that had rattled ISPs, which were concerned that companies could be asked to create an exhaustive list of data types that law enforcement authorities could seek. Instead, service providers are encouraged to spell out the data available but with the recognition that not all data is available for every investigation.
The Council of Europe settled on voluntary guidelines Wednesday to strengthen cooperation between the police and Internet service companies, starting a long process to build support for a common global system to combat cybercrime.
The ambition of the group is to build on its binding international treaty on cybercrime that has already been signed by 43 nations, including the United States, Japan and most Western European countries. Their aim is to help investigators obtain data quickly when tracking cybercrime that spreads across many national borders.

The guidelines -- adopted at a special conference in Strasbourg of more than 200 people representing law enforcement agencies, trade groups for Internet service providers and companies ranging from Microsoft to eBay -- are also a practical attempt to smooth uneasy confrontations that service providers complain are common when investigators seek information.

"Anybody can take them, use them if they like," said Alexander Seger, who heads the council's technical cooperation unit, which developed the guidelines over the past six months. "If service providers and law enforcement believe their cooperation is perfect, they may not need them," Seger said. "But if they want to improve their cooperation, this may be useful for them."

Seger noted that countries that signed the international treaty -- which dates back to 2001 and defines forms of cybercrime like child pornography and fraud -- wanted guidance for practical issues.

But trade industry groups sought to limit the pool of information that investigators could fish from, and expressed concern about the cost and liability of providing information to investigations that fail or go awry. Pavan Duggal, a lawyer and consultant on cybercrime legislation in India, recalled an incident where a service provider in India gave information in error to investigators, which resulted in the jailing of the wrong man.

The Council of Europe, based in Strasbourg, represents 47 states, including all the members of the European Union, and five nonvoting members, including the United States, Canada, Japan and Mexico. It seeks to promote global cooperation through binding treaties that harmonize international standards.

Seger, who presided over the evolution of the guidelines for law enforcement and private companies, said the suggestions would be presented to the council's cybercrime convention committee this week, with the goal of making them more formal recommendations.

The guidelines provide a standard format for the exchange of information between investigators and service providers, setting out a system for the police to approach a special 24-hour network with specific data requests that can link them to service providers in other countries.

But the guidelines also take note of privacy considerations and existing human rights conventions, spelling out that legal authorities must proceed with "due diligence" to verify information given by service providers.

Michael Rotert, who is chairman of ECO, an association representing the German Internet industry, and vice president of the European trade industry for Internet service providers, had pushed hard for reimbursement for private companies that aid investigators, warning that small service providers could be bankrupted by sweeping, labor-intensive requests.

By the time the guidelines were completed, the Council of Europe added that the "issue of cost reimbursement should be considered by relevant parties."

Council officials weakened another provision that had rattled the ISP trade group, which was concerned that companies could be asked to create an exhaustive list of data types that law enforcement authorities could seek.

Instead, service providers are encouraged to spell out the data available but with the recognition, according to the guidelines, "that not all this data will be available for every criminal investigation."

"Now we have a very easy description of things that should be done," said Jean-Christophe Le Toquin, Internet safety director for Microsoft, which had supported the council's efforts to develop guidelines with a contribution of more than $500,000.

He said that bringing together officials from law enforcement and the Internet industry was "a topic very few people wanted to touch."

But Microsoft officials at the conference said that Internet service providers would benefit from improved cooperation with law enforcement authorities, which would aid them in their own efforts to track down people who were misusing the system through fraud, phishing schemes or their latest bane -- typosquatting.

Typosquatting is a form of cybersquatting that occurs when a Web site is created with the misspelled name of a common brand like Microsoft.

"It's going to be a very useful document," Le Toquin said, "and we are definitely going to use it."

New Storage System Delivers Speed: Atrato's V1000 Storage

Atrato's Velocity1000 (V1000) is a high-performance storage system built on a Self-maintaining Array of Identical Disks (SAID), which packs 100 to 200 2.5-inch hard drives into a small chassis. It provides up to 50 terabytes of storage for thousands of simultaneous users and can handle more than 11,000 input/output operations per second (IOPS). According to Henry Baltazar, a storage analyst with The 451 Group, that figure is more impressive than it sounds, because the V1000 sustains those I/Os directly from its disks rather than from cache, as conventional arrays do; the chassis can also be adjusted for more speed or more capacity.
Companies that seek to thrive in the world of Web 2.0 and entertainment-on-demand need to be able to provide high-volume, on-demand storage to thousands of simultaneous users. The first product offering from Atrato Inc. promises to provide the necessary performance.
Atrato has launched the Velocity1000 (V1000) storage system. The company is addressing a major challenge to high-performance IT environments -- not more storage, but rather the speed at which data can be accessed.

"It solves a fairly unique problem" for companies that have large stores of data, according to Henry Baltazar, a storage analyst with The 451 Group. "It's hard to give random access to that data when you have thousands of people trying to get at that stream" at the same time.

At the core of Atrato's offering is a Self-maintaining Array of Identical Disks (SAID) that can handle more than 11,000 IOPS (input/output operations per second, a performance measurement). Baltazar said this measurement might be more impressive than it sounds. Conventional arrays can deliver more I/Os than that -- but only for data stored in cache. Where the V1000 differs is that it can handle this many I/Os for data served from its disks.

"If you can cache the data set, hundreds of thousands of I/Os are not out of line," Baltazar told us. "But if 3,000 people want to watch Lost at the same time, that's a very different problem."
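The 11,000-IOPS figure is plausible simply from the spindle count; a rough sketch, assuming 150 drives each sustaining about 75 random IOPS (both numbers are my assumptions — the article gives only the 100-200 drive range, and 75 IOPS is a typical figure for a 2.5-inch drive of that era):

```python
# Back-of-envelope: aggregate random IOPS of a many-spindle array.
drives = 150          # assumed, within the article's 100-200 range
iops_per_drive = 75   # assumed typical for a 2.5-inch hard drive circa 2008

total_iops = drives * iops_per_drive
print(f"Aggregate: ~{total_iops} IOPS")
# ~11,250, in line with the quoted 11,000+ from disk rather than cache.
```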

SAID Architecture

The SAID architecture crams 100 to 200 2.5-inch hard drives into a small chassis, offering as much as 50 terabytes of storage at high-performance rates. Baltazar said the chassis can be optimized for either higher spindle counts or for capacity.

Atrato says this architecture is the solution that IT managers are seeking in other, less effective, ways.

"IT managers assume that by adding rack units, they will get the increased IOPS needed," noted Dan McCormick, cofounder and CEO of the company. "The reality is that over-allocating not only fails to deliver the expected performance gains, but also adds costly power- and space-intensive overhead to the data center."

Need for Partnerships

Baltazar said that the company has some strong competition for its target market, including IBM, EMC, Network Appliance and Dell, and while the company has just attracted $18 million in venture capital, it will likely have to use much of that on marketing and sales efforts to prove to potential customers that the SAID model is a viable one. He added that the company would have a better chance of success if it partners with large resellers and creates an OEM agreement with a major storage vendor.

"I think that's why they're going after this vertical specifically," Baltazar said, rather than looking to position the V1000 for typical enterprise applications such as archiving e-mail or working with Oracle. He added that the product lacks key components for that space anyway, such as replication.

For now, the company has a lot of work to do. "They've got to go out there and get those partnership relationships forged," Baltazar said. Atrato announced Tuesday that SRC Computers, founded by Seymour Cray, is working together with Atrato on a solution-integration partnership for high-performance computing environments.

Pricing on the V1000 starts in the low six figures, depending on configuration.

Second Life: IBM Project Aims To Put Second Life Inside the Firewall

IBM and Linden Labs are working together on an enterprise version of Second Life, in which employees can cross between a private virtual world inside a firewall and the public Second Life Web site outside it. The tools under development are based on the Second Life Grid. IBM's task is to ensure security for custom-built virtual worlds inside enterprises.

IBM and Linden Labs, creator of the Second Life virtual world, announced Thursday that they are developing tools for enterprise-quality virtual worlds. The goal is to solve a key problem for enterprises that want to use the avatar-based environment: the need to cross back and forth across a corporate firewall.
IBM will test an approach that will allow users to traverse both the public Second Life "mainland" and IBM's custom-built world behind a firewall -- without having to log on and off.

The solutions are based on the Second Life Grid, Linden's platform that allows organizations to create private worlds.

'Major Milestone'

"The goal is to allow IBM employees to access public spaces and private spaces within one Second Life client interface while privatizing and securing portions of the Second Life Grid behind IBM's firewall," IBM said.

Colin Parris, IBM vice president for digital convergence, said the company sees a "need for an enterprise-ready solution that offers the same content-creation capabilities but adds new levels of security and scalability." With security-rich additions, custom virtual environments can become a "viable option for enterprises," he added.

"Deploying regions of the Second Life Grid behind IBM's firewall is a major milestone in the evolution of the Internet and will help accelerate the growth and adoption of all virtual worlds," said Ginsu Yoon, Linden Labs' vice president of business affairs.

Extending Familiar Tools

But is there really a place for the fun and games of virtual worlds in a fast-paced business environment? Definitely, said Charles King, principal analyst with Pund-IT, in an e-mail. Second Life can be a "simple extension of common collaborative tools in use at many companies," he said.

IBM's Lotus suite, for example, supports collaboration features ranging from Facebook-style employee profiles to instant messaging to online meetings to instant-dial VOIP telephony, King said. "The Second Life technology simply takes that a step further, leveraging avatars in these employee interactions but doing so in secure company-controlled environments," he added.

For large companies with employees all over the world, "I think over time Second Life technology could offer employees new ways to engage that are extensions of the Lotus tools they're already using. How it will eventually evolve is a question mark, but its main initial benefit is likely to be its essential familiarity," King said.

As virtual worlds continue to grow in popularity, "government should be aware of virtual worlds and how they are impacting business, education and general society," said Jill Hurst-Wahl, a social-networking consultant, in an e-mail. "With all of that activity, governments should be aware of virtual worlds, understand how they are being used, and then look for ways of interacting with their citizens through those worlds."

IBM said there is strong enterprise interest in the technology, but security concerns have been a roadblock. "We talk to customers all the time who want to use this technology in their companies, but they worry about keeping the conversations and information secure," said Neil Katz, chief technology officer of IBM's digital convergence group.

The effort is just part of the collaboration between IBM and Linden Labs. Last year, the companies announced a project to develop standards for virtual worlds. A key goal is to allow users to cross seamlessly between different worlds.

Coming Soon to a Mobile Phone Near You
Most of the innovative new features coming to mobile phones are likely to first become available for higher-end devices like smartphones. But most companies say they plan to eventually roll their services out to a broader base of cell phone users, particularly as even basic phones come equipped with more advanced features.

How cool? In the coming months, you'll be able to dictate text messages and surf the Web just by speaking commands -- no tapping or clicking required. If you're trying to figure out where to go to lunch, you'll be able to call up a map marked with local eateries your friends and family recommend. And you'll be able to film movie clips on your cell phone and send them live to somebody else's gadget.

Rapid hardware advances are making all these new offerings possible. Cell phones are morphing into minicomputers, packed with more processing power and bigger screens, and more of them are coming loaded with features like GPS. Faster connections are also driving the changes. Developers can work with tools like streaming video that wouldn't be practical with creaky connections.

Of course, everything isn't going to change overnight. Not all of these applications will work on all devices, and to use some of them, you may have to get a phone with particular features like GPS or a built-in video camera.

Here's a sampling of the new applications scheduled to hit the market soon.

Cyber Fraud Steals $239 Million From Consumers

The FBI reports that online fraud reached an all-time high last year, with $239 million stolen from consumers.
U.S. consumers reported losing more than $239 million from online fraud last year, up from $198 million in 2006, according to data released today by the FBI.

Internet auction fraud (35.7 percent) and merchandise non-delivery (24.9 percent) were the most frequently reported types of cyber fraud. The median loss per fraud incident last year was approximately $680, the report said. The most costly scams involved investment fraud, at about $3,500 per incident, and check fraud, at about $3,000. In nearly 74 percent of the cases, the perpetrators contacted the victim via e-mail.
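
For context, the growth and per-incident figures quoted above can be checked with a few lines of arithmetic (an illustration using the article's numbers, not part of the FBI report):

```python
# Year-over-year growth in reported online-fraud losses (figures from the article).
losses_2006 = 198_000_000  # USD
losses_2007 = 239_000_000  # USD

growth_pct = (losses_2007 - losses_2006) / losses_2006 * 100
print(f"Reported losses grew {growth_pct:.1f}% from 2006 to 2007")  # about 20.7%

# The most costly categories, per incident, versus the overall median loss.
median_loss = 680
per_incident = {"investment fraud": 3500, "check fraud": 3000}
for scam, cost in per_incident.items():
    print(f"{scam}: about {cost / median_loss:.1f}x the median loss")
```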

By: Jennifer Hong
Apr 4, 2008, 2:27 PM EDT

Portions of this article are from Brian Krebs' "Security Fix" blog at the Washington Post Web site. Brian Krebs keeps readers updated on computer security developments as they arise -- Internet scams, innovative viruses and worms, useful security tools and resources, and important security patches for popular software titles.

The cases in the FBI's investigative report are drawn from nearly 220,000 fraud complaints reported last year to the IC3, a partnership between the FBI and the National White Collar Crime Center. As such, the figures should not be viewed as representative of the total cost of cyber crime, because most cyber fraud is not reported, said Tom Kellerman, vice president of security awareness at Core Security Technologies, a Boston-based security firm.

"Most fraud is related to identity theft, such as setting up fraudulent lines of credit for individuals, and intellectual property theft for corporations which is very difficult to ascertain and measure," Kellerman said. "The real statistic to note is what percentage of cybercrime cases are actually noticed let alone investigated and those of us in the know fully understand that this number is marginal."

Security Fix recently published an analysis on the true costs of cyber crime that offered evidence to suggest the growing problem may cost businesses and consumers more than $100 billion a year. However, that number will remain purely speculative until financial institutions are forced to more accurately classify and report cyber-fraud in all its forms and until regulators start publishing more data about it.

The first manned, hydrogen-powered plane has been successfully tested in the skies above Spain

The hydrogen-powered plane is capable of carrying two people

The first manned, hydrogen-powered plane has been successfully tested in the skies above Spain, its makers say.

The small, propeller-driven craft, developed by aviation giant Boeing, made three short flights at an airfield south of Madrid, the company said.

It was powered by hydrogen fuel cells, which produce only heat and water as exhaust products.

The tests could pave the way for a new generation of greener aircraft, the company said.

Boeing's chief technology officer John Tracy said the flights were "a historical technological success" and "full of promises for a greener future".

Small future

Three test flights of the two-seater aircraft took place in February and March at an airfield at Ocana, south of Madrid. The plane was modified to include a hybrid battery and fuel cell system developed by UK firm Intelligent Energy.

The fuel cells, which create electricity by combining oxygen and hydrogen, were used to power an electric motor coupled to a propeller.

During take-off the plane's batteries were used to provide an additional boost, but whilst in the air, the plane relied entirely on the cells.

Boeing said the plane has a flying time of 45 minutes but tests were limited to around half that time.

Although the test had been successful, the firm said it did not believe fuel cells could be the primary power source for large passenger aircraft.

However, fuel cells could be used as a secondary source of energy for large planes, according to Nieves Lapena, the engineer responsible for the test flights, though this may take some time to develop.

"In my opinion, we are talking about a delay of about twenty years," she said.

Green skies

Hydrogen-powered planes have been flown before, but never with a human pilot on board.

In 2005, California-based AeroVironment successfully completed test flights of its Global Observer craft which was powered by liquid hydrogen.

Other companies are also seeking to develop more environmentally-friendly planes, amid concerns over their contribution to climate change.

Earlier this year, the airline Virgin Atlantic conducted the first commercial flight powered partly by biofuel.

And last year, defence firm Qinetiq flew a solar-powered plane for 54 hours, smashing the official world record for the longest-duration unmanned flight.

Zephyr, as the craft was known, could be used for military applications, as well as for Earth-observation and communications.

Other unmanned prototypes have been shown off by the American space agency Nasa.

However, in 2010, Swiss balloonist Bertrand Piccard plans to launch Solar Impulse, a manned plane in which he will attempt to circumnavigate the globe.

To carry the precious payload, the craft will have a huge wingspan of 80m (262ft), wider than the wings of the Airbus A380.

As the plane is piloted by only one person at a time, it will have to make frequent stopovers. The current plan is for the journey to be broken into five legs, each lasting four to five days.

MySpace unveils new music service

Executives from MySpace officially announced the creation of MySpace Music, a service that will be jointly operated by News Corp.'s MySpace and, at least initially, three out of the four top record labels.

The Thursday morning teleconference MySpace held with the press was anticlimactic since details about the service have been leaking for weeks.

The service will roll out gradually over the next three to four months and offer free streaming music, unprotected MP3 downloads, ringtones, and e-commerce offerings such as merchandise and ticket sales, said MySpace CEO Chris DeWolfe. The goal is to make MySpace a one-stop shop for everything music. Among the top four music companies, EMI was the lone holdout. A source with knowledge of the negotiations said that MySpace and EMI continue to seek a deal.

(For more on what lies ahead for EMI, read what the incoming chief of its digital unit, Douglas Merrill, had to say in this interview with CNET from Wednesday: "Will former Google exec help save the music industry?")

The partnership with MySpace is another sign that the music industry has decided to embrace the Web and digital technology instead of waging war against it. As CD sales continue to shrink and piracy expands, the labels are moving toward the inevitable: a redefining of how they make money from music. With MySpace Music, the labels will get an equity stake in the new joint venture and a share of all the revenues the service collects.

To this point, none of the challengers to Apple's iTunes has been able to gather an audience of any relevance or to cut licensing deals that would provide it with a music offering that equals or surpasses Apple's.

That changed today.

MySpace has 110 million users, 30 million of whom listen to music on the site. Combine those numbers with the 5 million music acts that promote themselves on the site and MySpace already has impressive music credentials. James McQuivey, an analyst with Forrester Research, said MySpace could help modernize the music industry.

"MySpace has the audience and environment to enable the music industry to get to the next digital level," McQuivey said. "What iTunes offers is a good buying experience but that's not all people do with music. They talk about it, they share it, they try things out. Remember, this is the kind of activity that (record label) Universal Music Group was suing MySpace for previously."

McQuivey continued: "I think the labels said to themselves, 'Oh, if we enable fans to have a fully immersive experience, they might spend more on music.' MySpace can offer a place where all aspects of the music experience can be expressed. Imeem was getting close to this, but MySpace, if they don't mess it up, should take the music industry to Music 2.0."

Thomas Hesse, president of global digital business at Sony BMG Music Entertainment agreed that part of what attracted the record companies to MySpace was its audience.

"MySpace is already one of the largest music communities on the Internet," Hesse said during an interview with CNET. "We're aligning our efforts to reach fans through every conceivable platform."

DeWolfe did not disclose what prices might be, nor would he disclose information about the status of a copyright-infringement suit brought against MySpace by Universal Music last year. A source said that the suit was settled for a large sum.

Although DeWolfe declined to discuss financial terms of the deal, the source said that it is nonexclusive, meaning that the labels are free to make similar arrangements if they choose. Facebook has reportedly been talking to the labels about launching its own music service.

Three Record Companies Team Up With MySpace for Music Web Site

In the latest effort by the ailing music industry to bolster its declining prospects, three of the industry’s four major companies have struck a deal with the social networking site MySpace to start a music Web site.

MySpace's co-founders, Tom Anderson, right, and Chris DeWolfe, center, with United States Army Gen. James Lovelance, inaugurated the Operation MySpace concert last month.

MySpace said on Thursday that as part of the deal it would turn its popular MySpace Music site into a joint venture, bringing in Universal Music Group, Sony BMG Music Entertainment and Warner Music Group as minority owners. The music companies are expected to make their entire digital music catalogs available for listening and downloading on the new site, which will be introduced later this year.

The deal highlights the music companies’ scramble to keep pace as consumers migrate toward the fast-changing market for digital downloads, upending the industry’s traditional approach to marketing and distribution. It is also an attempt to encourage competition to Apple’s iTunes Store, which some music executives have criticized for exercising too much control in pricing and on other business terms.

In a sign of how quickly the landscape is shifting, Apple said Thursday that it had surged to become the nation’s largest music retailer, surpassing Wal-Mart for the first time, based on data from research firm NPD Group for the first two months of this year.

The latest deal also comes as MySpace is angling to differentiate itself from rivals like Facebook and retain its role as a central site for music fans. Many thousands of musical artists, from top stars to garage bands, have pages on MySpace where fans can interact with them and listen to songs. But Web surfers have been flocking to music-oriented social networks like Buzznet and Imeem, where listeners can also hear music free.

Chris DeWolfe, chief executive of MySpace, a division of the News Corporation, described the new service as a one-stop source for all music, in all its various digital incarnations.

Visitors to the site will be able to listen to free streaming music, paid for with advertising, and share customized playlists with their friends. They will also be able to download tracks to play on mobile devices, putting the new site in competition with similar services like those from Apple, and eMusic.

A subscription-based music plan, where users pay a monthly amount for unlimited access to downloadable tracks, is also being considered, Mr. DeWolfe said. Additional products like tickets, T-shirts, ring tones and other music merchandise will also be available.

“This is really a mega-music experience that is transformative in a lot of ways,” he said. “It’s the full 360-degree revenue stream.”

Some artists already offer ways to buy T-shirts and other items from their MySpace pages, but music executives involved in the planning of the new venture suggest that its marketing efforts will be much more comprehensive. MySpace Music will be run by an executive team that will report to a board composed of representatives from MySpace and the music labels.

EMI Group, the fourth major music corporation, was not part of the deal, but people involved in the negotiations said it would probably join soon.

The major record companies, who have suffered a long slump as CD sales have declined, are eager to prop up digital sales. Sales of albums in the United States, including digital sales, have declined roughly 11 percent so far this year, and sales of individual digital tracks, though up about 29 percent, have not increased enough to make up for that drop. Overall music sales dropped to $11.5 billion in 2006, from a peak in 1999 of nearly $14.6 billion.

The decline has forced the industry into a new age of experimentation. All four major record labels dropped copy restrictions for Amazon’s new music service, partly in an effort to counterbalance Apple’s strong position.

In another approach, the industry is seeking revenue that does not come directly from its customers, like the ad-supported element of the MySpace service. Music executives have also recently embraced such concepts as tacking extra fees onto the cost of portable music players or Internet access to compensate the industry for rampant piracy.

Michael Nash, Warner Music’s executive vice president for digital strategy, said it would be simplistic to view the MySpace venture as a gambit to challenge iTunes, which is closely tied into the iPod player. Unlike iTunes, he said, MySpace Music “is kind of a hardware-agnostic play” that wants to convert the existing social-networking audience into paying customers.

“It’s about being in business with that construct,” Mr. Nash said, referring to MySpace’s music site, which has emerged as the pre-eminent site for fans seeking to sample music from current artists.

Rich Greenfield, an analyst at Pali Capital, said MySpace was offering a big opportunity to the music companies.

“They have a huge community that wants to talk, share and learn about music,” Mr. Greenfield said. “Nobody else has that. There is music discovery happening on MySpace that is far deeper and broader than what’s happening on iTunes.”

But first MySpace will have to prove that it can actually sell music. Though the company earns $70 million a month in advertising for the News Corporation, according to estimates by Pali Capital, it has never successfully sold products on a wide scale. A download service for independent music, begun in 2006 with Snocap, a music start-up, was considered a disappointment.

MySpace has not always had a friendly relationship with the music companies. Universal Music sued MySpace on the grounds of copyright infringement in 2006, saying the site’s users were illegally sharing music and videos. Universal has decided to drop the lawsuit in exchange for an unspecified cash settlement, according to people briefed on the negotiations.

State Stem Cell Grants Awarded

The Connecticut Stem Cell Research Advisory Committee doled out almost $10 million in state grants to Connecticut scientists Tuesday, including one that has the potential to take some of the controversy out of stem cell research.

That grant went to a group of University of Connecticut scientists who formed a rare collaboration between researchers at the main campus in Storrs and the Health Center in Farmington.

Led by Theodore Rasmussen of the Center for Regenerative Biology at UConn, the group plans to coax human skin cells into embryonic cells through a process called nuclear reprogramming. The process, one of the hottest fields in biology, does not require the use of human embryos to create stem cells, removing a major ethical hurdle to stem cell research.

Stem cells are the building blocks for every type of cell in the body, capable of maturing into any type of tissue. Although the ability to use stem cells to cure disease remains a dream, there is hope that they someday could be used to treat a wide variety of ailments, including heart disease, Parkinson's disease and spinal cord injuries.

Another grant went to a private biotech company called Evergen that was started at UConn by cloning pioneer Xiangzhong "Jerry" Yang, who announced two years ago that he would attempt to be the first to clone a human embryo for the purpose of creating stem cells.

Yang has returned to his native China, where he is battling cancer, and his lab is being run by other researchers. And while it appears that nuclear reprogramming might make embryo cloning obsolete, the committee Tuesday set aside $900,000 for Evergen's work.

Before finalizing the awards late Tuesday afternoon, the 13-member advisory committee, composed of physicians, researchers, a dentist and a Parkinson's patient, spent two days poring over 87 grant applications seeking a total of more than $40 million.

The chosen scientists, including 10 young researchers just entering the field of stem cell investigation, two collaborative groups and a handful of established individual researchers, will share $9.8 million.

The money is part of Connecticut's $100 million investment in stem cell research — with $10 million a year to be awarded over 10 years. Tuesday's awards were the second round. The money is designed to promote stem cell research in Connecticut, despite a ban on using federal research dollars for embryonic stem cell research.

The longest debate, by far, centered on a proposal by Yale School of Medicine researcher D. Eugene Redmond to find a way to repair brain cells damaged by Parkinson's disease by transplanting stem cells from human fetal brain tissue into the brains of monkeys.

The so-called neural stem cells have shown promise in mice, but monkey brains are much more similar to those of humans and thus an essential component of testing before such treatment could ever be tried on human subjects, said Haifan Lin, director of the Yale Stem Cell Center.

Although the committee was enthusiastic about the project, it first cut Redmond's funding request by $500,000 because the monkey research is to be done on the Caribbean island of St. Kitts, and the committee opposed exporting state money. As the haggling continued, Redmond's request for $2 million ultimately was cut in half — a cut that Lin said could effectively kill the research.

Grants awarded Tuesday include:

•$1.8 million to improve laboratories at Yale so the university could create initiating cells that would be used by other researchers doing stem cell experiments; produce new lines of embryonic stem cells; and use new, faster techniques to identify genes that can be responsible for certain diseases.

•$250,000 to a UConn lab that is working on a new technique to separate stem cells.

•$900,000 to Evergen to continue its work on nuclear transfer to create cloned human embryos.

•$1.1 million to Redmond's Parkinson's research at Yale.

•$634,000 to the UConn collaborators working on the nuclear reprogramming alternative to using human embryos for stem cell research.

•About $500,000 each to at least three UConn researchers and one from Yale working on various stem cell projects.

•$200,000 each, for a total of $2 million, to 10 young scientists just getting started in stem cell research.
Contact Hilary Waldman at

"citizen scientists"

Wouldn't it be nice to have thousands of collaborators, collecting data and sharing observations, who didn't demand a salary at all? A nation-wide initiative called Project Budburst is enlisting the help of so-called "citizen scientists" to nip the effects of climate change in the bud. But is using the public as a data source scientifically sound?

The idea of citizen science is nothing new. Hobbyists interested in particular plants or animals have been collecting valuable data for centuries, often even corresponding with professional scientists, publishing papers, and presenting work at scientific meetings. Historically, however, amateur naturalists tended to come to the professionals when they found something interesting. Now, in large part thanks to the internet, it's the other way around.

Project Budburst is a field campaign to track the effects of global warming in the US by monitoring the seasonal activities of a variety of plant species. Volunteers from across the country are urged to watch for key "phenophases" — events such as first leafing, first flower or seed dispersal — and record their observations on the project's website. Project Budburst "allows individuals to feel they are part of a greater understanding of climate change," said project coordinator Sandra Henderson of the University Corporation for Atmospheric Research in Boulder, Colorado.

The project — which is backed by, among others, the National Science Foundation, the USDA Forest Service, and research institutions in seven states — has two main goals, Henderson told The Scientist. "First and foremost, it's an education and outreach effort. But we also hope to collect useful data that will help scientists."

A pilot project run last spring involved volunteer contributors across 26 states who recorded a total of 913 phenological events. Based on the high rates of participation in states such as Utah, Michigan, and Colorado, this year's scheme aims to be larger and more comprehensive nation-wide.

Using citizen science networks allows researchers to gather vast amounts of data and carry out large-scale studies more feasibly than would otherwise be possible, said Graham Appleton, head of publicity at the British Trust for Ornithology (BTO). He told The Scientist that the amount of time invested by volunteers in the UK each year was "equivalent to over one thousand full-time staff."

Humphrey Crick, a senior ecologist with the BTO, said the role of the scientist is "absolutely crucial" for designing citizen science projects with very clear instructions. He recognizes that collecting data in this way can introduce biases, but noted that their effects should be constant over time, so should not greatly affect the ability to reveal changes based on long-term trends. For example, the BTO's Common Birds Census suggests that the number of European starlings in the UK declined by 50% since the mid-1970s, a result that Crick said is robust even if the data — much of which was collected by non-scientists — doesn't necessarily indicate the absolute size of the bird's population. Furthermore, he said that statistical methods can be used to minimize biases and remove peculiar data.
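
Screening out "peculiar data" of the kind Crick describes can be done with standard robust statistics. The sketch below applies a median-absolute-deviation filter to made-up volunteer counts; the 3.5 cutoff is a common rule of thumb, not anything the BTO specifies:

```python
from statistics import median

def filter_outliers(values, threshold=3.5):
    """Drop observations whose modified z-score exceeds the threshold.

    Uses the median absolute deviation (MAD), which is far less
    sensitive to extreme values than the mean/standard deviation.
    """
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:  # values are (nearly) identical: nothing to flag
        return list(values)
    return [v for v in values if 0.6745 * abs(v - med) / mad <= threshold]

# Hypothetical starling counts from six volunteers; one looks like a typo.
counts = [10, 11, 9, 10, 12, 300]
print(filter_outliers(counts))  # [10, 11, 9, 10, 12]
```

Because the filter is based on medians rather than means, one wildly wrong entry cannot drag the cutoff along with it, which is why this style of screening suits large volunteer-collected datasets.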

Researchers at the Cornell Lab of Ornithology, which has been involved in large-scale, citizen science projects for over 40 years, use clearly defined protocols to "dampen out some of the variation" introduced by using many different data collectors, said Janis Dickinson, director of the citizen science program there. In their bird watching programs, they require their participants to log the time spent observing, and to record occasions even when no sightings are made.

Still, Dickinson acknowledged that some errors can creep into the data; though this introduces more variation, she feels the errors mostly balance each other out. "If you have a whole lot of data, you can handle a lot of error without introducing significant biases," she said.

What do you think: Can data collected by average joes be scientifically valid and useful? Share your thoughts on citizen science by posting a comment to this blog.

New patent rules overturned

A Virginia court struck down today (April 1) new patent rules that universities and biotech companies argued would have limited their ability to protect their intellectual property.

The new rules, which were finalized by the US Patent and Trademark Office (USPTO) last August, limit inventors to two continuing applications, which add claims to an existing patent, and cap the total number of claims in a patent at 25. "Specifically in life sciences that has a huge effect," Lisa Haile, a patent attorney and co-chair of the Global Life Sciences Sector at the law firm DLA Piper, told The Scientist, because the timeframe of life science discoveries is so long.

Previously, inventors were allowed to file unlimited continuing applications. University-based inventors and biotech companies could file continuances as the scope of their discoveries became clearer with further research, and, for example, could extend patent coverage from one or two new molecules to an entire class of compounds.

The agency, however, argued that the new rules would streamline the patent process and help reduce its backlog of cases. The rules were set to go into effect on November 1, 2007, but in response to a lawsuit filed against the USPTO by GlaxoSmithKline, the court issued an 11th-hour temporary injunction against them on October 31 while the case proceeded.

According to today's court ruling, the patent agency did not have the authority to make such substantive regulatory changes.

Haile noted, however, that the ruling can be appealed.

The NCI's bioinformatics network, caBIG, integrates cancer data from across the United States. Its goal: to speed the transition from research to therapy

By Kenneth Buetow
Artwork by Brendan Monroe.
I was at the National Cancer Institute's (NCI) Intramural Program Scientific Retreat this past January listening to a plenary presentation by Cambridge University's Bruce Ponder, when a fascinating question caught my attention. Ponder described how variation in a gene called fibroblast growth factor receptor 2 (FGFR2) is associated with breast cancer. Judah Folkman, a legend among cancer researchers, stood up with a question for Ponder: Has anyone looked at the role of endostatin in breast cancer susceptibility? Endostatin is part of the same network as FGFR2, he explained; moreover, endostatin is located on chromosome 21, and trisomy 21 is protective against breast cancer. Ponder replied that his group had never examined the endostatin locus.

As quickly as the session ended, I stepped up to a Web browser and connected to the online resource caBIG (Cancer Biomedical Informatics Grid). By simply entering the endostatin locus, I was able to see that Folkman's scientific hunch was right on target: Multiple variants within the locus are significantly associated with breast cancer, and those loci are protective. Sadly, Judah Folkman passed away two days later, before I had a chance to share with him the product of his insight.

caBIG is a response to a desperate need. From my position as a senior cancer researcher at the NCI, I see groundbreaking observations and insights in biomedicine accumulating at a dizzying rate. However, from the perspective of the approximately 1.4 million US patients who will hear their physicians say, "You have cancer," progress is unacceptably slow. Something needed to be done to expedite the transformation of scientific findings into clinical solutions.

Four years ago, my colleagues at the National Cancer Institute and I responded with the launch of caBIG - a smart World Wide Web of cancer research. Through the collaborative effort of member cancer centers, we collectively created more than 40 tools to squeeze the most out of cancer data, and a new, international infrastructure to connect the data. Researchers can use sophisticated tools to query multiple databases of raw data in order to generate or validate hypotheses. Already, researchers have published more than 45 articles, and we expect those numbers to grow as the value of these tools becomes obvious.

Biomedical researchers struggle to meaningfully integrate their findings. Cancer is an immensely complex disease and in order to get a sense of the big picture, scientists need to combine observations from genomics, proteomics, pathology, imaging, and clinical trials. There was, however, no systematic way to do this. Encouraged by the support of our community and spurred to the challenge by our advisory boards, we set out to put a new set of tools into the hands of scientists - tools that would allow them to manage and understand the tsunami of biomedical data becoming available.

The caBIG was conceived in 2003 and born in the spring of 2004. It is indeed a big idea: to develop a state-of-the-art informatics platform that provides researchers all the capabilities they'd need to fight the "war on cancer." A large-scale, global concept for connectivity such as caBIG was unheard of in biomedicine in 2004 and is still foreign in most research domains today.

So, how did we develop caBIG? Given the urgency - more than 500,000 cancer deaths occur annually - we needed to start fast and learn quickly. My team at the NCI organized and launched a developmental "pilot phase" with NCI Cancer Centers, a collection of more than 60 long-standing research communities distributed throughout the country. These Centers had a limited history of cross-center collaboration. Could they work together? How many would be willing to be pioneers?

To answer this, we went on the road to find out what cancer centers needed and what they were willing to share. The response was overwhelming. It became immediately clear that we needed to create a dynamic process that allowed rapid adjustment and growth. Our approach was to create a cross-institutional, virtual community composed of "workspaces." These workspaces would focus on topics ranging from creating a virtual tissue repository to building tools that incorporate different data sources in research. Individuals, organizations, and institutions would work together to contribute applications, infrastructure, data, and insights. Participants could benefit directly from the collective expertise of this international collaboration.

caBIG is a response to a desperate need. For this virtual community to succeed it was important to embrace the individual diversity of members and to connect them, as opposed to creating one big central resource where everyone needed to place their information. As such, caBIG focused on providing tools and infrastructure that could be run by individual laboratories, organizations, or institutions and connected electronically through the Internet. This strategy is called standards-based interoperability, and caBIG has realized it through a services-oriented architecture called caGrid. It is worth noting that caBIG adopted international standards where they existed and extended them as needed to address new problems.
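The interoperability strategy described above can be sketched in a few lines: each site keeps and serves its own data through a shared interface, and a client federates a query across sites instead of pulling everything into one central repository. This is only an illustration of the idea, not caGrid's actual API; all class names, sites, and data below are hypothetical.

```python
# Sketch of standards-based interoperability: every site exposes the same
# query interface over its own locally held data, and a client fans a
# query out across sites and merges the results. All names and data are
# invented for illustration; this is not the real caGrid interface.

from dataclasses import dataclass
from typing import List


@dataclass
class Specimen:
    site: str
    specimen_id: str
    diagnosis: str


class TissueService:
    """A site-local data service implementing the common query interface."""

    def __init__(self, site: str, specimens: List[Specimen]):
        self.site = site
        self._specimens = specimens

    def query(self, diagnosis: str) -> List[Specimen]:
        # Each site answers only from its own holdings.
        return [s for s in self._specimens if s.diagnosis == diagnosis]


def federated_query(services: List[TissueService], diagnosis: str) -> List[Specimen]:
    """Send the same query to every registered service and merge the results."""
    results: List[Specimen] = []
    for service in services:
        results.extend(service.query(diagnosis))
    return results


site_a = TissueService("center-a", [Specimen("center-a", "A-001", "glioma"),
                                    Specimen("center-a", "A-002", "melanoma")])
site_b = TissueService("center-b", [Specimen("center-b", "B-001", "glioma")])

matches = federated_query([site_a, site_b], "glioma")
print([s.specimen_id for s in matches])  # ['A-001', 'B-001']
```

The design point is that no site surrenders custody of its data: adding a new center means standing up one more service behind the shared interface, not migrating records into a central database.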

We used all possible mechanisms to invite, engage, and sustain relationships among participants. Weekly teleconferences, supported by Webcasts of presentations and countless listservs, kept members of the community connected; these discussions are now archived online. Today, these ongoing, virtual interactions continue, augmented by regular face-to-face meetings and a weekly e-newsletter, called "What's BIG this Week," which distributes meeting schedules and key discussion topics for the upcoming week.

The workspaces attract a wide diversity of participants, including informatics experts, clinicians, bench researchers, patient advocates, and senior executives at pharmaceutical companies. More than 190 organizations have participated in caBIG, including NCI-designated Cancer Centers, federal agencies, academic institutions, not-for-profit organizations, and biotech and pharma companies. Essentially, a corps of more than 1,000 individuals is finding creative ways to use caBIG and to sharpen its tools. The initiative isn't meant to serve only the "big science" centers. The tools empower individual laboratories, organizations, and institutions to innovate through traditional (often single investigator-driven) research programs.

Andrea Califano of Columbia University, for example, developed a software program called cancer Workbench (caWorkbench). The program allows an investigator to electronically grab and analyze microarray data from different sources and perform multiscale analysis of genomic and cellular networks. It has become one of the tools that can be freely downloaded or accessed through the caBIG Web site.

We intend for researchers to share not only their software, but also their data, where possible. There are obvious challenges associated with data sharing between industry and academia, and between academic researchers vying for the same funding pools. Also, appropriate protections need to be provided for research participants who have generously donated information and material. One caBIG workspace, the data sharing and intellectual capital workspace, focuses on these issues. It integrates expertise from technology transfer specialists, legal counsel, ethicists, security experts, institutional review boards, privacy authorities, and the advocacy community, among others, to create frameworks that guide data sharing and address security and protection of human subjects.

At the end of our pilot phase, we had assembled a vibrant community, a rich collection of tools, and a unique infrastructure to connect and share. The next challenge is to see whether and how the broader universe of biologists and clinicians will adopt them.

One of the first research programs to use caBIG on a large scale was a project aimed at finding cures for brain cancer. Brain cancer is relatively rare in the general population but is the leading cause of cancer mortality in children. The median survival of patients is approximately one year, and the few long-term survivors face significant lifelong neurocognitive deficits. Arguably, survival has not significantly improved in more than a decade, with rarity representing one of the main barriers to progress - no individual investigator or institution sees enough cases to conduct clinical research.

To address this issue, the Glioma Molecular Diagnostic Initiative (GMDI) was launched. Led by Howard Fine of the NCI, multiple investigators from around the world have collaborated to more accurately characterize gliomas, the most common form of brain tumor, using immunohistochemistry, genetics, and molecular biology.

In partnership with the caBIG community, Fine's group created the Repository of Molecular Brain Tumor Data (REMBRANDT). By customizing the caBIG infrastructure, REMBRANDT provides an unprecedented opportunity to conduct in silico research, both for hypothesis generation and external validation. It can be used for gene discovery, elucidation of the role of pathways, and molecular target identification and validation. Use of the data is free and without expectation of coauthorship, coinventorship, or any other type of remuneration.

REMBRANDT has already paid dividends. Sun and colleagues have discovered that stem cell factor (SCF) is a critical angiogenic factor in the pathogenesis of malignant gliomas (Cancer Cell 9:287-300, 2006). The authors were able to correlate SCF to clinical outcome. This association was made painstakingly through extensive in vitro and in vivo studies. Today, that same relationship can be revealed by simply querying the large patient population in REMBRANDT. The results of the REMBRANDT analysis show that questions concerning gene expression and survival are accessible through a few intuitive clicks of a mouse. Any researcher can use the underlying caBIG software (caIntegrator DataMart) that powers REMBRANDT to integrate large databases of disparate data sources and to query them with a number of statistical tools (see infographic).
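The kind of expression-versus-survival query described above can be illustrated with a toy example: split patients by expression of a gene and compare Kaplan-Meier survival estimates for the two groups. The estimator below is a minimal pure-Python sketch, not the actual caIntegrator/REMBRANDT interface, and every number is invented.

```python
# Minimal Kaplan-Meier sketch of an expression-vs-survival query.
# Patients are split into hypothetical "high" and "low" expression groups
# for a gene; survival times (months) and censoring flags are made up.

def kaplan_meier(times, events):
    """Return (time, survival) points for the Kaplan-Meier estimate.

    events[i] is 1 if the patient died at times[i], 0 if censored.
    Simplified: assumes no tied event times.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    curve = []
    for t, event in data:
        if event:
            # Each death multiplies survival by the fraction still surviving.
            survival *= (at_risk - 1) / at_risk
            curve.append((t, survival))
        at_risk -= 1  # the subject leaves the risk set either way
    return curve

# Hypothetical cohorts: high expression with shorter survival, low with longer.
high_expr = kaplan_meier([4, 6, 7, 9, 11], [1, 1, 1, 1, 0])
low_expr = kaplan_meier([10, 14, 18, 25, 30], [1, 1, 1, 0, 0])

print("high-expression curve:", high_expr)
print("low-expression curve:", low_expr)
```

A point-and-click tool like REMBRANDT wraps exactly this kind of computation behind the interface: the user picks a gene and a cutoff, and the system stratifies the cohort and draws the curves.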

Other tools are already generating novel scientific findings. Louise Showe's group at the Wistar Institute in Philadelphia was looking for a way to distinguish two cancers that had very similar histology, but which required very different treatment protocols. Her group wanted to distinguish the two cancers on a genetic basis. In order to perform her analysis, Showe had to integrate gene-expression arrays from four different institutions to get the volume of data she needed. In addition, the data to be integrated came from two kinds of Affymetrix chips. Showe and colleagues fine-tuned a tool called distance-weighted discrimination (DWD), which statistically manipulated the data from the different microarray platforms so that they could be analyzed as a single source. The group found a panel of 10 genes that positively distinguished the two cancers. The DWD tool is also now accessible to researchers on the caBIG Web site.
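DWD itself finds a separating direction by solving an optimization problem and removes batch differences along that direction, which is more machinery than fits here. As a deliberately simplified stand-in for the cross-platform adjustment it performs, the sketch below standardizes one gene's measurements within each chip type so the two scales become comparable before pooling. The platform names and values are invented.

```python
# Simplified illustration of cross-platform adjustment (a stand-in for DWD,
# which uses a more sophisticated discrimination direction): standardize a
# gene's values within each platform so differently scaled measurements
# can be pooled. All numbers are hypothetical.

def standardize_within_platform(samples):
    """Center one platform's values to mean 0 and scale to unit variance."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    sd = var ** 0.5 or 1.0  # guard against zero variance
    return [(x - mean) / sd for x in samples]

# One gene measured on two chip types; the raw scales differ by orders of
# magnitude, so the values cannot be pooled directly.
platform_a = [120.0, 150.0, 180.0]
platform_b = [8.1, 9.0, 9.9]

pooled = standardize_within_platform(platform_a) + standardize_within_platform(platform_b)
print(pooled)
```

After standardization the two platforms land on a common scale, which is the precondition for analyzing the merged arrays "as a single source" as Showe's group did; real DWD additionally preserves the biological signal that separates the classes while removing the platform signal.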

The caBIG community is also generating tools and infrastructure to connect collaborative networks and to make large databases easier to use. The NCI and the National Human Genome Research Institute are currently creating a comprehensive catalog of the gene changes that underlie multiple forms of cancer. Called The Cancer Genome Atlas (TCGA), it is a multidimensional, molecular characterization of cancer. In the pilot phase of the program, data on large-scale sequencing, gene expression, DNA fragment copy number changes, loss of heterozygosity, and the epigenetic state of the genome are being generated on a common collection of biospecimens that will be annotated with rich clinical information.

caBIG connects and integrates the terabytes of TCGA data generated at 11 different sites distributed throughout the United States. Using the caBIG infrastructure, data are shared electronically with the public. Out of concern for human subjects' privacy and protection, not everything is made public. A data-access committee reviews users and permits access to protected data only to those who qualify. However, with authorization, it is possible to use caBIG tools to comprehensively interrogate this unique data collection.

The power of this project is that it integrates massive, heterogeneous datasets using a user-friendly interface. For example, a researcher can browse the mutations found in a particular cancer using the Cancer Genome Workbench and then use the Pathway Interaction Database to look at how those mutations come together in a cellular pathway (see infographic). Using the TCGA portal, the researcher can combine the mutations with data on gene deletions and amplifications to study how they interact in multiple pathways.

Further, using tools similar to those in REMBRANDT, researchers can examine the joint effects of mutation and gene expression in altering survival time. By simply selecting from a menu of gene mutations observed in the sample set and choosing any genes characterized in the genome-wide expression studies, a user can see how different components interact to alter disease outcome.

While the current sample sizes in the TCGA are too small to generate anything but intriguing hypotheses, it is already producing provocative observations. For example, SCF expression appears to alter prognosis in glioblastoma multiforme (as above), but only in individuals who have a p53 mutation.
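The joint mutation-and-expression query described above amounts to stratifying the cohort on two variables at once. The sketch below groups a hypothetical patient set by p53 status and by high/low expression of a gene, then compares median survival per stratum; it mimics the pattern of the SCF finding (expression matters only in p53-mutant patients), but every value is invented.

```python
# Stratify hypothetical patients by p53 mutation status and expression
# level, then compare median survival in each of the four strata.
# All patient data are made up for illustration.

from collections import defaultdict
from statistics import median

# (p53_mutated, expression_high, survival_months)
patients = [
    (True,  True,  6), (True,  True,  7), (True,  True,  8),
    (True,  False, 14), (True,  False, 16),
    (False, True,  15), (False, True,  17),
    (False, False, 16), (False, False, 18),
]

strata = defaultdict(list)
for mutated, high, months in patients:
    strata[(mutated, high)].append(months)

for (mutated, high), months in sorted(strata.items()):
    label = f"p53 {'mut' if mutated else 'wt'} / expr {'high' if high else 'low'}"
    print(f"{label}: median survival {median(months)} months")
```

In this invented cohort, high expression shortens median survival only within the p53-mutant stratum, which is the shape of interaction effect the TCGA tools let a user spot from a menu rather than a script.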

One goal we have here at NCI is to connect, by the year 2010, all NCI comprehensive and community cancer centers in the United States. The data collected at each center will be shared (as appropriate), and all multicenter clinical cancer trials will connect to each other electronically and to the Food and Drug Administration. Perhaps most ambitiously, institutions will be collaborating and publishing studies through caBIG. The longer-term goal is to extend the standards, infrastructure, and vocabularies that were pioneered for cancer to connect the entire biomedical enterprise, regardless of the disease studied.

As a result, I envision huge advances that will bring a new era in molecular medicine. Benefits include early identification of disease-causing genes, which will enable us to delay or prevent the progression to clinical symptoms. Subgrouping of diseases by genetic biomarkers will allow us to predict how a disease will advance and how amenable it will be to therapeutic options. Moreover, the capacity to monitor patient response to treatment will obviate useless approaches and make it possible to prescribe "the right drug for the right patient at the right time."

It's a daunting challenge. But we need - and intend - to move at warp speed to serve the patient community. By putting the right data in the right hands at the right time, we can quicken discovery, eliminate unnecessary redundancy of research, and better understand clinical success and failure. This comprehensive, integrated view fully embraces the complexity of cancer and will allow us to determine rationally how to combine multiple interventions.


Kenneth Buetow is the NCI Associate Director for Bioinformatics and Information Technology, a laboratory chief in the Laboratory of Population Genetics at the National Cancer Institute, and the founder of the caBIG project.
