
Friday, February 29, 2008

Robots as part of daily life

Bill Passed to Vitalize Korea's Robotics Industry
Korea's robotics industry is coming alive with support from lawmakers. The National Assembly passed a bill Tuesday aimed at stimulating the development and distribution of "intelligent robots."
An intelligent robot refers to a service or personal robot that interacts with people for various needs, such as educating children or helping the elderly in their daily lives.

An intelligent robot is a step forward from a one-armed manipulator that's limited to repetitive tasks, as intelligent robots are able to recognize their surroundings and cope with changing situations.

The new law calls for developing research plans and policies to promote the intelligent robot market as well as creating industrial standards for robotics products.

The law also allows for establishing special investment firms targeting robotics manufacturers.

The Commerce, Industry and Energy Ministry says that as of 2006 Korea ranked sixth in the world in domestic robotics market size, and that the industry is growing by 35 percent per year.

The Center for Intelligent Robots has already been carrying out projects to advance intelligent robotics and create better infrastructure for utilizing robots.

The center says that by 2020 Korea could be one of the leading countries for intelligent robot production, as it already relies heavily on robots in manufacturing and possesses advanced IT, another key ingredient for putting robots to work.
Japan experiments with robots as part of daily life
At a university lab in a Tokyo suburb, engineering students are wiring a rubbery robot face to simulate six basic expressions: anger, fear, sadness, happiness, surprise and disgust.

Hooked up to a database of words clustered by association, the robot, dubbed Kansei, or "sensibility," responds to the word "war" by quivering in what looks like disgust and fear. It hears "love," and its pink lips smile.

"To live among people, robots need to handle complex social tasks," said project
leader Junichi Takeno of Meiji University. "Robots will need to work with emotions, to understand and eventually feel them.

While robots are a long way from matching human emotional complexity, Japan is perhaps the closest to a future, once the stuff of science fiction, where humans and intelligent robots routinely live side by side and interact socially.

Robots are already taken for granted in Japanese factories, so much so that they are sometimes welcomed on their first day at work with Shinto religious ceremonies. Robots make sushi. Robots plant rice and tend paddies.

There are robots serving as receptionists, vacuuming office corridors, spoon-feeding the elderly. They serve tea, greet company guests and chatter away at public technology displays. Now startups are marching out robotic home helpers.

They aren't all humanoid. Paro, for instance, is a furry robot seal fitted with sensors beneath its fur and whiskers; designed to comfort the lonely, it opens and closes its eyes and moves its flippers.

For Japan, the robotics revolution is an imperative. With more than a fifth of the population 65 or older, the country is banking on robots to replenish the work force and care for the elderly.

In the past several years, the government has funded a plethora of robotics-related efforts, including some 4.6 billion yen (US$42.7 million; €28.7 million) for the first phase of a humanoid robotics project, and 1.1 billion yen (US$10.2 million; €6.8 million) a year between 2006 and 2010 to develop key robot technologies.

The government estimates the industry could surge from about 558 billion yen (US$5.2 billion; €3.5 billion) in 2006 to 3 trillion yen (US$26 billion; €17.5 billion) in 2010 and nearly 7.5 trillion yen (US$70 billion; €47 billion) by 2025.

Besides financial and technological power, the robot wave is also favored by the Japanese mind-set.

Robots have long been portrayed as friendly helpers in Japanese popular culture, a far cry from the rebellious and violent machines that often inhabit Western science fiction.

This is, after all, the country that invented Tamagotchi, the hand-held digital pets that captivated children around the world.

Japanese are also more accepting of robots because the native Shinto religion often blurs boundaries between the animate and inanimate, experts say. To the Japanese psyche, the idea of a humanoid robot with feelings doesn't feel as creepy, or as threatening, as it might in other cultures.

Still, Japan faces a vast challenge in making the leap, commercially and culturally, from toys, gimmicks and the experimental robots churned out by labs like Takeno's to full-blown human replacements that ordinary people can afford and use safely.

"People are still asking whether people really want robots running around their homes, and folding their clothes," said Damian Thong, senior technology analyst at Macquarie Bank in Tokyo.

"But then again, Japan's the only country in the world where everyone has an electric toilet," he said. "We could be looking at a robotics revolution."

That revolution has been going on quietly for some time.

Japan is already an industrial robot powerhouse. Over 370,000 robots worked at factories across Japan in 2005, about 40 percent of the global total and 32 robots for every 1,000 Japanese manufacturing employees, according to a recent report by Macquarie, which did not have figures for subsequent years.

And they won't be claiming overtime or drawing pensions when they're retired.

"The cost of machinery is going down, while labor costs are rising," said Eimei Onaga, CEO of Innovation Matrix Inc., a company that distributes Japanese robotics technology in the U.S. "Soon, robots could even replace low-cost workers at small firms, greatly boosting productivity."

That's just what the Japanese government has been counting on. A 2007 national technology roadmap by the Trade Ministry calls for 1 million industrial robots to be installed throughout the country by 2025.

A single robot can replace about 10 employees, the roadmap assumes, meaning Japan's future million-robot army of workers could take the place of 10 million humans. That's about 15 percent of the current work force.

"Robots are the cornerstone of Japan's international competitiveness," Shunichi Uchiyama, the Trade Ministry's chief of manufacturing industry policy, said at a recent seminar. "We expect robotics technology to enter even more sectors going forward."

Meanwhile, localities looking to boost regional industry clusters have seized on robotics technology as a way to spur advances in other fields.

Robotic technology is used to build more complex cars, for instance, and surgical equipment.

The logical next step is robots in everyday life.

At a hospital in Aizu Wakamatsu, 300 kilometers (190 miles) north of Tokyo, a child-sized white and blue robot wheels across the floor, guiding patients to and from the outpatients' surgery area.

The robot, made by startup Tmsk, sports perky catlike ears, recites simple greetings, and uses sensors to detect and warn people in its way. It helpfully prints out maps of the hospital, and even checks the state of patients' arteries.

The Aizu Chuo Hospital spent about 60 million yen (US$557,000; €375,000) installing three of the robots in its waiting rooms to test patients' reactions. The response has been overwhelmingly positive, said spokesman Naoya Narita.

"We feel this is a good division of labor. Robots won't ever become doctors, but they can be guides and receptionists," Narita said.

Still, the wheeled machines hadn't won over all seniors crowding the hospital waiting room on a weekday morning.

"It just told us to get out of the way!" huffed wheelchair-bound Hiroshi Asami, 81. "It's a robot. It's the one who should get out my way."

"I prefer dealing with real people," he said.

Another roadblock is money.

For all its research, Japan has yet to come up with a commercially successful consumer robot. Mitsubishi Heavy Industries Ltd. failed to sell even one of its pricey toddler-sized Wakamaru robots, launched in 2003 as domestic helpers.

Sony Corp. pulled the plug on its robot dog, Aibo, in 2006, just seven years after its launch, even though it was initially popular. With a price tag of a whopping 200,000 yen (US$2,000; €1,350), Aibo never managed to break into the mass market.

One of the only commercially successful consumer robots so far is made by an American company, iRobot Corp. The Roomba vacuum cleaner robot is self-propelled and can clean rooms without supervision.

"We can pretty much make anything, but we have to ask, what are people actually going to buy?" said iRobot CEO Helen Greiner. The company has sold 2.5 million Roombas _ which retail for as little as US$120 (euro81) _ since the line was launched in 2002.

Still, with the correct approach, robots could provide a wealth of consumer goods, Greiner stressed at a recent convention.

Sure enough, Japanese makers are catching on, launching low-cost robots like Tomy's 31,290 yen (US$280; €185) i-Sobot, a toy-like hobby robot that comes with 17 motors, can recognize spoken words and can be remote-controlled.

Sony is also trying to learn from past mistakes, launching a much cheaper 40,000 yen (US$350; €235) rolling speaker robot last year that built on its robotics technology.

"What we need now isn't the ultimate humanoid robot," said Kyoji Takenaka, the head of the industry-wide Robot Business Promotion Council.

"Engineers need to remember that the key to developing robots isn't in the lab, but in everyday life."

Still, some of the most eye-catching developments in robotics are coming out of Japan's labs.

Researchers at Osaka University, for instance, are developing a robot to better understand child development.

The "Child-Robot with Biomimetic Body" is designed to mimic the motions of a toddler. It responds to sounds, and sensors in its eyes can see and react to people. It wiggles, changes facial expressions, and makes gurgling sounds.

The team leader, Minoru Asada, is working on artificial intelligence software that would allow the child-robot to "learn" as it progresses.

"Right now, it only goes, 'Ah, ah.' But as we develop its learning function, we hope it can start saying more complex sentences and moving on its own will," Asada said. "Next-generation robots need to be able to learn and develop themselves."

For Hiroshi Ishiguro, also at Osaka University, the key is to make robots that look like human beings. His Geminoid robot looks uncannily like him, down to the black, wiry hair and slight tan.

"In the end, we don't want to interact with machines or computers. We want to interact with technology in a human way so it's natural and valid to try to make robots look like us," he said.

"One day, they will live among us," Ishiguro said. "Then you'd have to ask me: 'Are you human? Or a robot?'"

Microsoft cutting price of Vista: Microsoft e-mails reveal Intel pressure over Vista

Microsoft has said it plans to cut the cost of its Windows Vista operating system sold at retail outlets.
Although no exact date has yet been given, Microsoft said price cuts would be introduced in 70 countries.

In the US, the cost of the most expensive version, Vista Ultimate, will be reduced to $319 (£161) from the current retail price of $399.

Analysts said Microsoft was aiming to boost the number of customers upgrading to Vista, which was introduced in 2007.

'Greater opportunities'

The price cuts apply to the packaged versions of Vista, which account for less than 10% of its sales.

By contrast, 90% of Vista sales are to PC manufacturers, which sell the operating system pre-installed on their machines.

"We anticipate these changed will provide greater opportunities... to sell more stand-alone copies of Windows," said Brad Brooks, a Microsoft corporate vice president.

Microsoft says it has now sold 100 million Vista licences since it was launched.

As far back as 2005, Microsoft executives knew that confusing hardware requirements for the Windows Vista Capable program might get them in trouble. But they did it anyway--over the objection of PC makers--at the behest of Intel, according to e-mails released as part of a class-action lawsuit pending against Microsoft.

In early 2006, Intel's Renee James, vice president and general manager of the company's software and solutions group, prevailed on Microsoft's Will Poole to change the proposed requirements for Microsoft's "Vista Ready" marketing program to include an older integrated graphics chipset that couldn't run Vista's Aero interface. At the time, Intel was worried that it wouldn't be able to ship the more advanced 945 chipset, which was capable of running Aero, in step with Microsoft's proposed schedule for the introduction of the marketing program.

This led to the creation of the "Vista Capable" logo, which is the reason Microsoft is now in court, facing a class-action lawsuit on the part of PC owners who bought so-called Vista Capable machines in late 2006 only to find those machines could only run Vista Basic, which doesn't feature the Aero interface. The potential for confusion was well-understood both outside the company, as noted here in this CNET News.com story from March 2006, and within the company, as multiple e-mail threads reveal.

A treasure trove of e-mails has been released as part of that case, and the Seattle Post-Intelligencer's Todd Bishop has spotlighted a number of e-mails that call into question whether Microsoft was acting, at least in part, on Intel's behalf when it set the requirements for the Vista Capable marketing program. Several pages of e-mails were redacted by the court. All e-mails quoted in this report were taken verbatim, typos and all, from a PDF file put together by the Seattle Post-Intelligencer in a blog posted by Bishop yesterday.

"In the end, we lowered the requirement to help Intel make their quarterly earnings so they could continue to sell motherboards with the 915 graphics embedded," Microsoft's John Kalkman wrote in a February 2007 e-mail to Scott Di Valerio, who at the time managed Microsoft's relationships with the PC companies and recently took a job with Lenovo. The change took place in January 2006, and was formally rolled out by Poole, currently corporate vice president of Microsoft's unlimited potential group, without the knowledge of Jim Allchin, the now-departed Microsoft executive who was supposed to be in charge of Vista's development

Intel declined to comment on specific e-mails until it had a chance to review them. But in response to the Kalkman e-mail, read to an Intel representative, the company said, "We do not know who John Kalkman is. We do know that he is not qualified to know anything about Intel's internal financials or forecasts related to chipsets, motherboard or any other products. He would have no visibility into our financial needs in any given quarter."

The planning for the Vista Capable program started long before it was publicly announced in May 2006, a few months after the final delay in Vista's ship date was announced. The idea was to mimic what Microsoft did with Windows XP, to assure customers buying PCs sold within a few months of the launch date that their hardware could run the new operating system when it was formally released. This helps PC makers avoid a swoon in demand in the weeks and months prior to the launch of a new operating system.

Microsoft knew that Vista's Aero interface would put a significant strain on the hardware used in those PCs, and so in 2005 it started putting requirements together for the Vista Ready program using Intel's 945 chipset as the baseline chipset needed for designation as "Vista Ready."

Eric Charbonneau, a Microsoft executive whose title isn't identified in the e-mails, told his direct reports in August 2005 in an e-mail that the older 915 chipset wouldn't cut it. "Any OEM who plans to ship an Intel 915 chipset system (using UMA, without separate discrete graphics hardware) for Summer 2006 needs to know that: 1. Their systems will not be eligible for the Windows Vista Ready designation..." Simply put, the 915 chipset couldn't support the Windows Vista Display Driver Model (WDDM), and that capability was a requirement at the time for being able to slap a "Vista Ready" sticker on a PC.

However, at some point between that e-mail and January 2006, Microsoft changed its stance on the 915 chipset. The 945 chipset was Intel's top-of-the-line integrated graphics chipset when it was introduced in May 2005, but Intel still sold lots of lower-end 915 chipsets for both desktops and notebooks. Intel didn't launch the notebook version of the 945 chipset until January 2006, and was apparently concerned that it would be unable to get enough 945 systems into the market by the middle of 2006, the (at the time) launch expectation for the Vista Ready program.

With notebooks a far faster-growing segment of the PC market than desktops, Intel apparently felt that if only 945 chipsets were deemed Vista Ready, demand for systems with 915 chipsets--still a significant mix of its products--would fall off the face of the earth. It also feared it would be unable to produce enough 945 chipsets to meet its commitments to PC makers--orders that might otherwise go to Advanced Micro Devices.

In January 2006, Poole sent an e-mail to several Microsoft executives informing them that the plan had changed, and that Intel approved. "I went over the new plan with Renee tonight. Not surprisingly, she is pleased with the outcome. I told her we wanted to communicate to OEMs and retail first, and then they can cascade their own communication. They are losing orders every day, so we need to get a simple communication out ASAP."

In February 2006, one month after Will Poole informed the Vista team of the decision, Microsoft's Will Johnson wrote an e-mail laying out some more of the specifics.

"We have removed the WDDM requirement for Vista Capable machines, the modern CPU and 512 RAM requirements remain intact, but the specific component that enables the graphical elements of Windows Vista (re: aeroglass) has been removed. This was based on a huge concern raised by Intel regarding 945 chipset production supply and the fact that we wanted to get as many PCs as possible logo'd by the 4/1 US retail REV date. The push to retail should be that while this opens up a wider band of machines to being Vista Capable retailers should be very aggressive in communicating to their OEMs (and thus Intel) to maximize production of 945 chipset equipped machines going forward."

According to e-mails exchanged, many inside Microsoft were appalled at the decision to let Intel's supply concerns dictate its marketing policies. Now Microsoft had to go out and create a two-tiered program promoting both "Vista Capable" machines and "Vista Premium Ready" machines.

A Vista Capable sticker would simply mean the PC could run Vista Basic, allowing PC makers to promote their PCs as "Vista" PCs while glossing over the fact that the minimum hardware requirements for that label couldn't really handle the improved graphics that were one of the major reasons to upgrade to Vista. This confusion was exactly what Microsoft and its PC partners had hoped to avoid when they first drew up the requirements, and several e-mails show those concerns were shared widely prior to, and following, Poole's decision.

Hewlett-Packard was particularly incensed, since it had decided to adopt Intel's 945 chipset more aggressively, believing it was the only chipset that would support the Vista Ready program.

Microsoft's Mark Croft wrote in response to Poole's e-mail that, "We need good messaging for the elimination of WDDM in Capable, as we have had this as a requirement since inception over 18 months ago."

But perhaps the most surprised executive inside Microsoft at the move was Allchin, the head of the Vista development team.

"We really botched this," he wrote in a thread responded to Poole's e-mail. "I was not involved in the decision making process and I will support it because I trust you thinking behind the logic. BUT, you have to do a better job with customers that what was shown here. This was especially true because you put me out on a limb making a commitment. This is not ok."


Will Poole, co-head of Microsoft's emerging markets efforts, who authored the e-mail acknowledging pressure from Intel.

Later, in a private e-mail, Mike Ybarra of Microsoft pleaded with Allchin to step in and reverse the decision. "Jim, I am passionate about this and believe this decision is a mistake," he wrote. "We are caving to Intel. We worked hard the last 18 months to drive the UI experience and we are giving this up."

Allchin appeared to agree in his response, but seemed resigned to fate.

"It might be a mistake. I wasn't involved and it is hard for me to step in now and reverse everything again," he wrote to Ybarra. "We might be able to thread the needle here if we make 'capable' just related to 'old' type hardware."

And so, confusion began, just as Microsoft employees and partners predicted it would. Some Microsoft marketing units started saying that the even older 865 chipset would now qualify for the Vista logo program, a notion that was squashed. But it was easy to see where the confusion stemmed from once the requirement for WDDM was dropped, as essentially anything relatively modern that could easily run Windows XP would be capable of running Vista Basic.

Anantha Kancheria wrote to Rajesh Srinivasan as part of a discussion in March 2006 around the 865 confusion, and employed a little gallows humor.

"Based on the objective criteria that exist today for capable even a piece of junk would qualify. :) So based on that yes 865 would qualify. For the sake of Vista customers, it would be a complete tragedy if we allowed it. I don't know how to help you prevent it."

The 865 was eventually scrubbed from the program, but the 915 was allowed to remain. And so, PCs with the 915 chipset were sold as Windows Vista Capable, while others sold with the 945 chipset or better were labeled Vista Premium Ready. As predicted, confusion ensued, and even Microsoft executives and directors were snared.

Steve Sinofsky, the former head of Microsoft Office development and current head of Windows and Windows Live development, wrote an e-mail to Microsoft's Brad Goldberg in July 2006 asking about a Dell Latitude he had purchased that he thought was labeled as "Vista Ready," but in reality didn't have enough graphics hardware to run Vista.


Steven Sinofsky, Microsoft senior vice president, Windows and Windows Live Engineering Group

Goldberg, then vice president of Windows product management, explained, "Some PCs that are windows vista capable will run aero and some will not. In the interim we've created a marketing designation that allows OEMs to market PCs as "premium ready." every pc that is premium ready will run aero."

Goldberg continued, "for holiday oems will be heavily pushing premium ready machines but because Intel was late with their integrated chipset the majority of the machines on the market today are windows vista capable but not premium ready. originally we wanted to set the capable bar around aero but there are a bunch of reasons why we had to back off...a bit messy and a long story that I'm happy to walk you through if helpful. :)" Goldberg has since been reassigned.

In January 2007, Jon Shirley, a former Microsoft COO and current member of the board of directors, wrote CEO Steve Ballmer an e-mail complaining about driver support for some peripherals he wanted to use with his Vista PC. Ballmer forwarded the e-mail to Sinofsky, asking for input on whether Microsoft should be doing anything differently.

Sinofsky launched into a post-mortem on Vista itself, with this paragraph pertaining to Intel.

"Intel has the biggest challenge. Their "945" chipset which is the baseline Vista set "barely" works right now and is very broadly used. The "915" chipset which is not Aero capable is in a huge number of laptops and was tagged as "Vista Capable" but not Vista Premium. I don't know if this was a good call. But these function will never be great. Even the 945 set has new builds of drivers coming out consistently but hopes are on the next chipset rather than this one."

Ballmer's response? "Righto thanks."

Microsoft is now defending itself against claims the Vista Capable program was misleading and unfair, all thanks to a decision to allow Intel to sell older chipsets that couldn't run Vista's Aero interface--really one of the main reasons to upgrade--with the word "Vista" attached. As the e-mails show, many within the company knew they were heading down this path when they embarked on a two-tier logo program, but the need to keep Intel happy--over the objection of the world's largest PC maker--won out in the end.

UPDATED: 6:25 p.m., PST - Microsoft issued the following statement after this blog was posted: "We included the 915 chipset as part of the Windows Vista Capable program based on successful testing of beta versions of Windows Vista on the chipset and the broad availability of the chipset in the market. Computers equipped with this chipset were and are capable of being upgraded to Windows Vista Home Basic. Microsoft authorized the use of the Premium Ready designation on PCs that could support premium features of Windows Vista."

Dell Profit Falls Short, Stock Down Late

Dell reported a 6.5% drop in net income in the quarter ended Feb. 1 to $679 million from $726 million the same period a year ago. Contributing to the decline was $83 million in expenses related to research and development stemming from the recent acquisitions of EqualLogic and Everdream, and $54 million in expenses related to severance costs and facility closures from the company's restructuring efforts.

The world's second-largest personal-computer maker, which has been moving beyond direct sales into the retail sector to better compete with No. 1 Hewlett-Packard Co. (HPQ), warned that it could incur costs as it realigns its business to boost growth and profitability. In recent after-hours trading, Dell shares were at $20.07, down 3.8% from Thursday's close. Dell's stock has lost nearly a third of its value since October as recession worries hurt the technology sector.

Dell reported net income of $679 million, or 31 cents a share, for the quarter ended Feb. 1. In the prior-year period, Dell's net income was $726 million, or 32 cents a share. The latest quarter included 11 cents in charges and stock-compensation costs.

Revenue rose 11% to $15.99 billion from $14.47 billion. Analysts polled by Thomson Financial had expected the Round Rock, Texas, company to post earnings, excluding items, of 36 cents a share on revenue of $16.27 billion.

Gross margin rose to 18.8% from 17.1%.

U.S. consumer revenue grew 12%, driven by a 25% increase in shipments, aided by new product offerings and the company's expansion into retail. "Unit share increased by over three points - the largest quarterly gain in over three years," Dell said.

Revenue at the company's mobility unit, which sells laptops, rose 24%.

Looking ahead, Dell warned it will "continue to incur costs" as it realigns its business to improve growth and profitability. "While the company believes these actions are necessary to drive long-term sustainable value, they may adversely impact the company's near-term performance." Dell also said results could be "adversely impacted by more conservative spending by its customers."

However, Dell said it is "benefiting from accelerating growth and an improving mix of products and geographic regions," and it "expects to achieve substantial improvements in cost and productivity."

Dell said earlier this month it is cutting more than 1,200 jobs, about 900 of them at a call center in Canada. That followed last year's announcement that Dell would lay off 10% of its work force.

In November, Dell warned that its near-term performance might be hurt by further restructuring and a slower decline in component costs. It said the restructuring moves were "necessary to drive long-term sustainable value." Dell also warned it might be hurt by a seasonal shift in mix.

Before Thursday's results, some observers had said Dell was starting to see light at the end of the tunnel, after ceding its crown to rival Hewlett-Packard as the world's largest computer maker.

In January, as part of its effort to halt declining market share and falling sales, Dell added Best Buy Co. (BBY) to the list of retail chains that sell its computers. The list already included Wal-Mart Stores Inc. (WMT) and Staples Inc. (SPLS). Also in January, tracking firm Gartner Inc. said Dell's worldwide PC shipments grew 17% in the fourth quarter.

Dell, which has resumed its stock-buyback program following the end of an internal accounting probe, said it spent $4 billion to buy back 179 million shares of common stock in the quarter. It said it expects to buy back at least $1 billion of stock in the first quarter.

Comcast faces a backlash of bad publicity and increasing skepticism about the way the telecommunications giant runs its high-speed Internet service

After a hearing into Comcast Corp.'s Internet policies this week, the company faces a backlash of bad publicity and increasing skepticism about the way the telecommunications giant runs its high-speed Internet service.

Critics have denounced Comcast for paying people to occupy seats in the cramped Harvard Law School lecture hall where the Federal Communications Commission hearing was held, preventing many critics from gaining admittance. Comcast officials said they were merely trying to save enough seats for company executives. But Josh Silver, executive director of the Internet activist group Free Press, said the hired guests stayed on, preventing many Comcast critics from attending the hearing.

There are about 300 seats in the lecture hall, and when they were all filled people were turned away.

"Comcast had these guys sit through the first entire half of the day to keep those seats full," Silver said.

Despite having friends in the audience, Comcast took a verbal beating at Monday's hearing, from FCC commissioners and hostile witnesses alike. The controversy over Comcast's network management policies has helped revive the once-dormant debate on "Net neutrality," the concept of forcing Internet companies to treat all data on their networks exactly alike.

Companies like Google Inc. and online video provider Vuze Inc., which use the Internet to distribute their services, say Net neutrality is vital to their businesses. But Internet providers like Comcast say there are legitimate business and technical reasons for them to offer different levels of service to different kinds of traffic.

Broadband providers designed their network for users who mostly swap e-mails and visit websites - tasks which don't transmit, or upload, very much data. So broadband systems are designed to receive, or download, data much faster than they can upload it. "We have to engineer and manage the network for typical usage of a vast majority of customers," said Mitch Bowling, Comcast's senior vice president of online services.

But these days, many Internet subscribers use peer-to-peer software that lets thousands of Internet users share large files by uploading and downloading them to each others' computers. So when a computer with BitTorrent downloads a TV show, it starts uploading the same show to other users. As a result, many Internet users now upload far more traffic than broadband providers expected. And Bowling said just a few of these users can consume most of the capacity on a neighborhood Internet "node," which may serve several hundred households.
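
The scale of that problem is easy to sketch with rough numbers. The sketch below is a back-of-the-envelope illustration only; the assumed node upstream capacity and per-seeder upload rate are hypothetical, since the article specifies only that a node serves several hundred households.

```python
# Rough sketch of upstream saturation on a shared neighborhood node.
# All numbers are illustrative assumptions, not Comcast figures.
NODE_UPSTREAM_MBPS = 10.0    # assumed shared upstream capacity of one node
HOUSEHOLDS_ON_NODE = 300     # "several hundred households," per the article
SEEDER_UPLOAD_MBPS = 0.4     # assumed sustained upload rate of one BitTorrent seeder

def seeders_to_saturate(node_mbps: float, per_seeder_mbps: float) -> int:
    """Smallest number of continuous uploaders that fills the node's upstream link."""
    return int(node_mbps // per_seeder_mbps) + 1

n = seeders_to_saturate(NODE_UPSTREAM_MBPS, SEEDER_UPLOAD_MBPS)
print(f"{n} seeders (~{n / HOUSEHOLDS_ON_NODE:.0%} of households) can fill the upstream link")
```

Under those assumptions, a few dozen always-on uploaders, a small fraction of the households on the node, would be enough to consume the shared upstream capacity.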

Comcast copes with the problem by sometimes slowing down BitTorrent data uploads from its customers' computers. Last year, BitTorrent users complained about problems on the Comcast network, but the company refused to confirm the policy until last October. Comcast still insists it has done nothing wrong. "We have to engineer and manage the network for typical usage of a vast majority of customers," said Bowling. "We are simply managing the network for the greater good."

But Vuze, which filed the FCC complaint that led to Monday's hearing, uses BitTorrent to send videos to its subscribers. Comcast's policy to slow these file transfers could hurt Vuze's business. Indeed, the company argued that Comcast could use "network management" as an excuse to fend off Vuze's challenge to Comcast's cable TV business. "By degrading the high-quality video content by which Vuze differentiates itself in the marketplace, network operators can seek a competitive edge," said Vuze in its FCC complaint.

Virtually everyone who spoke at the hearing agreed that Comcast should provide far more information about how it manages its network. But there's less uniformity about whether to regulate the practice or ban it altogether.

Free Press wants Net neutrality legislation that would make it illegal for networks to discriminate against particular kinds of Internet traffic, such as BitTorrent. The group's general counsel, Marvin Ammori, said if Comcast installed better technology, it would be able to handle BitTorrent traffic with little trouble. But Comcast has said it will reduce its capital expenditures this year compared with 2007.

Comcast's Bowling said that merely improving the network won't eliminate the need to throttle back some kinds of traffic. He said peer-to-peer programs like BitTorrent tend to use more bandwidth as it becomes available, so network clutter doesn't get better with more capacity. "You can't outrun this problem by building more bandwidth," he said.

The organizer of a federal hearing Monday at Harvard Law School on Comcast's treatment of subscriber Internet traffic said yesterday that "seat-warmers" hired by the company prevented other people from attending.

Comcast acknowledged that it hired an unspecified number of people to fill seats, but said those people gave up their spots when Comcast employees arrived to take their places.

Catherine Bracy, administrative manager of Harvard University's Berkman Center for Internet and Society, disputed that assertion, saying most of the three dozen seat-warmers who arrived hours before the Federal Communications Commission hearing remained during the event's opening hours, as many other people were turned away.

"No employees came in to take those seats when the event started," Bracy said.

The hearing was held in response to complaints to the FCC that Comcast, the nation's largest cable company, hampered file-sharing traffic on its Internet service. The company has said that its traffic management practices are necessary to keep other Internet traffic flowing smoothly.

Comcast said it hired seat-holders only after the advocacy group Free Press urged its backers to attend. "For the past week, the Free Press has engaged in a much more extensive campaign to lobby people to attend the hearing on its behalf," the company said in a written statement.

A Comcast spokeswoman declined to comment yesterday on Bracy's statements.

US Representative Edward Markey, a Massachusetts Democrat who attended the hearing, favors Net neutrality, but said Comcast may have to manage its data traffic because of the way the company designed its network. The real problem, said Markey, is a lack of competition.

In most communities there are no more than two broadband providers - the cable TV company and the phone company. These companies have already attached wires to all local homes. A would-be rival would have to spend millions building a wired network of its own - a massive barrier to competition.

Markey favors an "unbundling" policy, in which the federal government would require cable and phone companies to sell wholesale access to the lines going into consumers' homes. This would let many companies get into the Internet access business without having to string their own wires.

In an unbundled world, said Markey, Net neutrality will take care of itself. Companies that discriminate against BitTorrent traffic would lose business to those that didn't. "Competition is a proxy for regulation," he said.

Markey acknowledged that Congress might not embrace his idea. But he said the only alternative is a Net neutrality law that could saddle Internet access providers with burdensome regulations. "If unbundling isn't possible," said Markey, "then Net neutrality is going to be the rule of the road."

Privacy question on cyber security plan


Congress worries that .gov monitoring will spy on Americans

House lawmakers yesterday raised concerns about the privacy implications of a Bush administration effort to secure federal computer networks from hackers and foreign adversaries, as new details emerged about the largely classified program.

Einstein, an intrusion-detection system that DHS calls an "early warning system" for cyber-incidents, is described in a Homeland Security document from September 2004 as "an automated process for collecting, correlating, analyzing, and sharing computer security information across the federal civilian government." It's still only in place at 15 federal agencies, but Homeland Security Secretary Michael Chertoff is requesting $293.5 million from Congress in next year's budget to roll it out government-wide.

The round-the-clock system captures traffic flow data, which currently includes source and destination IP addresses and ports, Internet Control Message Protocol data, and the length of data packets. According to an internal 2004 privacy impact assessment (PDF), "the program is not intended to collect information that will be retrieved by name or personal identifier." Members of the U.S. Computer Emergency Readiness Team, which coordinates federal responses to cyber attacks, analyze the downloaded records once per day in hopes of detecting worms and other "anomalous activity," pinpointing trends, and advising agencies on how best to configure their systems.
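
For concreteness, here is a minimal sketch of what one such flow record might look like, based only on the fields listed above. The field names, types, and the toy anomaly check are assumptions for illustration, not Einstein's actual schema or analysis logic.

```python
# Illustrative sketch of a traffic-flow record holding the kinds of fields the
# article describes (source/destination IPs and ports, ICMP data, packet length).
# Names and types are assumptions, not the program's actual schema.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class FlowRecord:
    timestamp: datetime
    src_ip: str
    dst_ip: str
    src_port: Optional[int]      # None for protocols without ports (e.g. ICMP)
    dst_port: Optional[int]
    protocol: str                # "tcp", "udp", "icmp", ...
    icmp_type: Optional[int]     # populated only for ICMP traffic
    icmp_code: Optional[int]
    packet_bytes: int            # length of the data packet

    def is_anomalous(self, baseline_bytes: int) -> bool:
        """Toy check: flag packets far larger than a per-protocol baseline."""
        return self.packet_bytes > 10 * baseline_bytes
```

Analysts batch-reviewing a day's worth of such records could then look for trends and outliers across agencies, along the lines the article describes.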

At a hearing convened here Thursday by the U.S. House of Representatives Homeland Security Committee, politicians directed pointed questions to Department of Homeland Security officials about their plans to expand an existing "intrusion detection" system known as Einstein. Among other things, the system will monitor visits from Americans--and foreigners--visiting .gov Web sites.

The unclassified portions of the project, known as the "cyber initiative," focus on drastically reducing the number of connections between federal agency networks and the Internet, and more closely monitoring those networks for malicious activity. Slightly more than half of all agencies have deployed the Department of Homeland Security's program.

But administration officials have not said how far monitoring would go, and whether oversight would extend to networks operated by state, local, and private sector entities, including government defense contractors.

A more real-time scrutiny of federal data flows is necessary because "our adversaries are very adept at hiding their attacks in normal everyday traffic," DHS Undersecretary Robert Jamison told the House Homeland Security Committee yesterday. He added that DHS is developing a privacy impact assessment on the new capabilities, which will be open to public review upon completion.

Some Democrats on the oversight panel were not assuaged by the administration's testimony. Rep. Bob Etheridge (D-N.C.) said he remained concerned about the program's impact on the privacy of his constituents. "It looks a little like the fox is guarding the hen house," he said.

But Jim Lewis, director of the technology arm of the Center for Strategic and International Studies, a Washington think tank, called the privacy concerns premature and overblown.

"There's a big difference between intercepting and reading e-mail and reacting to suspicious traffic going across your network," said Lewis, whose employer is working with Congress and the private sector on a set of cyber security policy recommendations for the next president.

Tuesday, February 26, 2008

Google Working On Undersea Cables For Broadband


Google finally confirmed the rumors that it is turning to undersea cables, announcing that, together with five other partners, it will build “Unity”, a trans-Pacific undersea fiber-optic cable linking the United States and Japan. The investment will cost approximately $300 million and became necessary as demand threatens to outstrip the current capacity of trans-Pacific cables.

The international consortium includes Bharti Airtel, India’s leading integrated telecom service provider; Global Transit, a South Asian network operator; KDDI, a Japanese information and communication company; Pacnet, a leading Asian telecom service provider; SingTel, a leading Asian communications group with operations in Europe, the U.S. and Asia Pacific; and, of course, Google.

The 10,000-kilometer cable, which has been designed as a five-fiber-pair cable system with each fiber pair capable of up to 960 Gbps, will link Chikura, near Tokyo, to Los Angeles and the West Coast, and is expected to meet the new demands in data and Internet traffic.

“The Unity cable system allows the members of the consortium to provide the increased capacity needed as more applications and services migrate online, giving users faster and more reliable connectivity,” Unity spokesperson Jayne Stowell said in a statement.

The cable system is expected to respond to the demand in data and Internet traffic between the two continents and raise the current capacity by 20 percent, according to Google, and potentially add up to 7.68 Tbps of bandwidth.
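
As a quick sanity check on those figures, using only the numbers quoted above: five fiber pairs at 960 Gbps each comes to 4.8 Tbps, so the 7.68 Tbps figure appears to assume expansion to eight fiber pairs. That reading is an inference from the arithmetic, not something stated in the announcement.

```python
# Back-of-the-envelope check of the Unity capacity figures quoted above.
GBPS_PER_FIBER_PAIR = 960
INITIAL_FIBER_PAIRS = 5

print(INITIAL_FIBER_PAIRS * GBPS_PER_FIBER_PAIR / 1000)  # 4.8 Tbps with the initial five pairs
print(7.68 * 1000 / GBPS_PER_FIBER_PAIR)                 # 8.0 fiber pairs implied by 7.68 Tbps
```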

The construction is to begin immediately, after an official agreement has already been signed on February 23 in Tokyo. NEC Corporation and Tyco Telecommunications will be responsible for constructing and installing the cable system, which is set to become available in the first quarter of 2010.

Google is not the only one to have made such a move; other lines involving Verizon and AT&T are already under construction or under way. And it's no wonder, since bandwidth demand has been growing by an average of 64 percent a year and is expected to double within the next five years.

Flexible Thinking, Nanotechnology Lead to Nokia’s “Morph” Concept Phone


Nokia has built a flexible mobile phone using nanotechnology which the company says will become increasingly commonplace in the future.

The Morph device can be used as a keyboard, then bent around the wrist and worn as a bracelet.

The handset giant first touted the system five years ago and it has taken researchers at the Nokia Research Centre in Cambridge this long to get a working prototype.

Dr Tapani Ryhanen, head of the research centre, said: "We hope that this combination of art and science will showcase the potential of nanoscience to a wider audience.

"The research we are carrying out is fundamental to this as we seek a safe and controlled way to develop and use new materials."

Nokia claimed that the technology will be in mainstream phones by 2015, but that there are technical challenges still to overcome. Chief among these is power, and Nokia is investigating the use of new battery materials.

Professor Mark Welland, head of the Nanoscience Group at the University of Cambridge, said: "Developing the Morph concept with Nokia has provided us with a focus that is artistically inspirational and sets the technology agenda for our joint nanoscience research."

more...
Nokia Research Center and the University of Cambridge announced a research partnership, and here we have the first fruits of their labor, though this is far from being ready for consumers.

The Morph is a nanotechnology-based concept phone. It's flexible and stretchable, has self-cleaning surfaces and transparent electronics, and can charge itself via solar power. It's so flexible it can even wrap around your wrist when not in use. Give it the ability to wrap around your wrist while in use, plus a webcam, and you've got yourself a Dick Tracy wrist TV.

The Morph is on display from February 25th through May 12, 2008, at the “Design and the Elastic Mind” exhibition at the Museum of Modern Art in New York City.

While I wouldn’t expect to see any of this tech anytime soon, Nokia does say it is possible at least some elements of its design could see the light of day in actual phones within the next seven years.

And people wonder how Nokia keeps its market-leading position.

Watch a video with the Morph in “action”:

Adobe Rolls Out Air 1.0


Adobe Systems on Monday formally released Adobe AIR, Flex 3 and BlazeDS technologies for enabling the rapid development of rich Internet applications (RIAs).

With these technologies, developers can create applications that run online or offline and across any platform.

AIR was developed under the codename of "Apollo." Its main competition for building RIAs is JavaFX from Sun Microsystems and Silverlight from Microsoft.

However, Adrian Ludwig, group product manager for Adobe AIR, noted some differences.

"Silverlight isn't on the desktop. It's more of a competitor to Flash. The key to AIR is coming from the Web to the desktop. Compared to JavaFX, we delivered something. We're out there. JavaFX is not," he told InternetNews.com.

Indeed, the ability to develop a Web-based application and then use that exact same code to create a desktop application—no rewrite necessary—is a key selling point of AIR.
It's designed to be cross-platform, as Java is, requiring a just-in-time compiler on those different platforms. The tradeoff is a minor performance hit, but it's not noticeable, said Ludwig.

Adobe's RIA technologies include tools, frameworks, servers, services and runtimes for building applications quickly without requiring a lot of programming skill. They use all the latest Web technologies, many of which have come from Adobe, like Flash, Flex and PDF as well as HTML and AJAX.

Flex is a free, open-source framework for building RIAs. Adobe Flex Builder 3 is focused mostly on adding new features for developers as opposed to user-oriented features. This includes an enhanced debugger, memory and usage profiling and integration with Adobe's creativity suites, such as Photoshop.

Making the framework open source is designed to appeal to developers. "It allows developers to be confident they can develop on it," said Ludwig. "They can go into the framework and make modifications to it, customize it to their environment."

"It's no longer a proprietary Adobe framework," he added. "It's about embracing developers outside of Adobe who want to make it better, for their use or the greater good of all of us."

BlazeDS is an accompanying technology for Flex that is a data services layer to relay information between clients and back-end services. Like Flex, it's open source and available for download from Adobe.

Ludwig said Adobe, with this release, has added a remoting protocol so clients can call services and applications on the server.

Adobe launched AIR, Flex and BlazeDS on Monday at its Engage event in San Francisco, a show designed to showcase Internet application development. A number of firms announced AIR-based applications, including AOL, The New York Times Co. and NASDAQ.

Salesforce.com made a little news of its own, announcing the new Force.com Toolkit for Adobe AIR and Flex, which would allow for developing RIAs on Salesforce.com's Force.com platform. Developers will have direct access to the Force.com Web services API to create applications in AIR's rapid application development language while having the power of a Force.com server-side application.

For Robert Blatt, vice president and general manager of AOL's personal media division, AIR had three things the other RIA technologies didn't have. "First is the cross-platform nature of it and the ability to build an app you could deploy in browsers and a desktop. While other technologies can do that if you try real hard, with AIR it just works so that reduces our development time," he told InternetNews.com.

Secondly, he said, is the ubiquity of Adobe's technology. "Their technology is already on the consumer's desktop and they don't have to download any more technology to get it to work. The last thing is they are doing a fantastic job erasing the boundaries of online and offline, and for my application that's absolutely essential," he said.

Turning an application loose on a desktop could be a risky proposition, as many security experts have raised concerns regarding Web 2.0's permissive nature, but Blatt has confidence in AIR. "There are valid concerns but I feel Adobe has done a nice job of limiting the capability and informing the consumer so that they don't download stuff to their computer they don't want," he said.

more....
The new online/offline platform from Adobe serves as a stepping stone for Web 2.0 applications to migrate to the desktop.


Delivering on its promise to merge online and offline content with the project once code-named “Apollo,” Adobe released the first version of the newly renamed Air on Monday, a technology designed to bring both worlds together. Adobe Air effectively allows previously online-only services to offer dedicated applications that can work with or without an Internet connection.

For instance, one of the first Adobe Air offerings, EBay Desktop, allows users to manage and sort through auctions using a more powerful interface than the one available online. It enables features such as instant updates on auction changes and live feeds of interesting auctions as they appear that were not previously possible with the version of eBay constrained to a browser window. Another application, Buzzword, serves as a word processor that seamlessly allows users to save content online or off. The Monday launch brought Air-driven programs from a host of other companies, including Nasdaq, Nickelodeon, and AOL.

Like Adobe’s older standard for online application development, the ubiquitous Flash, Air is totally free to download. Unlike Flash, the Air software developer’s kit is also free, allowing amateur developers to dabble in Adobe’s software environment without paying and likely expanding the library of available applications. Both the Air client and Air SDK can be found at Adobe’s Air download page, while a list of available applications is also available through Adobe.

Monday, February 25, 2008

Google to store patients' health records in test of new service


The U.S. health-care system is the most costly in the world. Yet it's also remarkably antiquated. The medical records of as many as 90% of patients are hidden away in old-fashioned filing cabinets in doctors' offices. Prescriptions are scribbled on paper. Most Americans need to fill out separate medical histories for each specialist they visit.


"We are trained, like Pavlov's dogs, to repeat the same information 17 times," says Scott Wallace, chief executive officer of the National Alliance for Health Information Technology, a not-for-profit alliance of health care providers, information technology vendors, and health and technology associations. The result: mistakes, duplicated tests, botched diagnoses, and billions of dollars in unnecessary costs and lost productivity.


Many providers, including Kaiser Permanente and Cleveland Clinic, have invested millions of dollars in information technology systems and creating electronic medical records for patients. Here's the rub: Much of that information can't be shared from one doctor or hospital to the next. As a result, blood-test results in the database of an Arizona doctor, for instance, are of little use when the patient is visiting a doctor halfway across the country.


Linking systems "is the real challenge in this industry," says Dr. C. Martin Harris, chief information officer at the Cleveland Clinic.


Pilot Project Kicks Off


In an effort to meet that challenge, the Cleveland Clinic and Google on Feb. 21 announced a project to give patients and doctors better access to electronic medical records. "It is clear that one of the big needs is assembling health records from a variety of places and giving people control of those records," explains Marissa Mayer, vice-president for search products and user experience at Google. And while the Cleveland/Google project may revolutionize medical record-keeping and improve how hospitals and physicians provide care, it also raises concerns over patient privacy and the security of sensitive information.


Here's how the pilot project, officially begun on Feb. 18, works. The Cleveland Clinic already keeps electronic records for all its patients. The system has built-in smarts, so that it will alert doctors about possible drug interactions or when it's time for, say, the next mammogram. In addition, 120,000 patients have signed up for a service called eCleveland Clinic MyChart, which lets patients access their own information on a secure Web site and electronically renew prescriptions and make appointments.


The system has dramatically cut the number of routine calls to the doctor and boosted productivity, though it has yet to effectively deal with information from an outside physician, Harris says. Those records are typically still on paper, and have to be laboriously added to the Cleveland Clinic system. It is a big problem, especially for the clinic's many patients who spend winters in Florida or Arizona, where they see other doctors.


Adding Google's technology lets patients jump from their MyChart page to a Google account. Once on Google, they'll see the relevant health plans and doctors that also keep electronic medical records. That means the patient can choose to share information between, say, the Arizona doctor and the Cleveland Clinic.


The system is still fairly primitive compared with sophisticated electronic information-sharing systems such as an ATM network. The information being shared is limited to data on allergies, medications, and lab results. That's because this data is more easily put in a standard form that can be read by different computer systems.


Paring Health-Care Costs


For the health-care system as a whole, though, it's an important move forward.


"What Martin [Harris] has done is revolutionary," says Wallace of the National Alliance for Health Information Technology. "It may not be the perfect solution, but it is a better solution than we have now." Over time, more information can be added, and more patients and doctors will be able to access the records. And if the pilot program works, Google intends to roll out a comparable service for the general public.


One payoff: cutting health-care costs. "There's a real potential to affect the slope of the health-care cost curve," Harris says. "I believe this kind of exchange is the way we will get the total value out of an electronic medical record."


Projects like the one started by Cleveland and Google could also have big implications for business. Companies want employees to take greater charge of their health care. Experts say employees can do a better job of that by gaining control over, and access to, records, and that they'll get a leg up, technology-wise, from the participation of such players as Google and Microsoft. "I think Google is spectacular on this," Wallace says. "Health care is a mainstream issue, and getting the purveyors of information involved in this is a brilliant step."


What's In It for Google?


How the e-health program plays out for Google is less clear. Mountain View (Calif.)-based Google is not the first high-tech giant to dip a toe into health care. Microsoft, for example, launched a health records and information service, HealthVault, in October (BusinessWeek.com, 10/4/07). The company has more than 100 partners including the Mayo Clinic, a nonprofit medical practice and large online health-information network, and hopes to use its large health software business to help bring new players on board.


On Feb. 20, the company released source code to help outside organizations and developers integrate their information and build programs around the HealthVault platform. "We think that we are the best health search out there, and we think more and more we are going to convince people of that," said Sean Nolan, HealthVault's chief architect.


Being late to the game has hurt Google in the past. The company's finance site, launched in May 2006, has failed to gain much traction. It ranks 16th in the business information category of Hitwise, a company that measures Web traffic. Yahoo's much older finance site has remained No. 1 for much of the past three years. Similarly, Google's payment service Google Checkout, launched in June 2006, has failed to grab market share from eBay's leading payments service, PayPal.


Thorny Privacy Issues


When it comes to online health information, the obvious prize is the estimated $500 million to $1 billion market for health-related search advertising. Google won't admit to aiming for that market, though, and those familiar with the project suggest revenue could come from other sources. "They aren't wedded to advertising," Wallace says. "Their attitude is that this is such a nascent area, they can play around for a while and find a way to make huge amounts of money." It's not yet clear how that might happen. "The unanswered question is what is the business model that justifies the investment of these big players," says David Lansky, senior director of the health program at the John and Mary R. Markle Foundation, a nonprofit dedicated to improving information technology in health care.


One worry is that the companies might be tempted to sell personal information. While strict laws govern patient privacy at hospitals and health-care providers, "there is no federal regulation of what these middle-layer players can do with your data," Lansky explains. And while consumers might trust Google or Microsoft now, what might happen in years or decades? "This is deeply personal information that is being collected about you and your family," says Jeff Chester, executive director of the Center for Digital Democracy. "There is unease about marketers being able to access that vast range of information."






MIT neuroscientists see design flaws in computer vision tests


The human brain easily recognizes that these cars are all the same object, but the variations in the car's size, orientation and position are a challenge for computer-vision algorithms.

For years, scientists have been trying to teach computers how to see like humans, and recent research has seemed to show computers making progress in recognizing visual objects.

A new MIT study, however, cautions that this apparent success may be misleading because the tests being used are inadvertently stacked in favor of computers.

Computer vision is important for applications ranging from "intelligent" cars to visual prosthetics for the blind. Recent computational models show apparently impressive progress, boasting 60-percent success rates in classifying natural photographic image sets. These include the widely used Caltech101 database, intended to test computer vision algorithms against the variety of images seen in the real world.

However, James DiCarlo, a neuroscientist in the McGovern Institute for Brain Research at MIT, graduate student Nicolas Pinto and David Cox of the Rowland Institute at Harvard argue that these image sets have design flaws that enable computers to succeed where they would fail with more-authentically varied images. For example, photographers tend to center objects in a frame and to prefer certain views and contexts. The visual system, by contrast, encounters objects in a much broader range of conditions.

"The ease with which we recognize visual objects belies the computational difficulty of this feat," explains DiCarlo, senior author of the study in the Jan. 25 online edition of PLoS Computational Biology. "The core challenge is image variation. Any given object can cast innumerable images onto the retina depending on its position, distance, orientation, lighting and background."

The team exposed the flaws in current tests of computer object recognition by using a simple "toy" computer model inspired by the earliest steps in the brain's visual pathway. Artificial neurons with properties resembling those in the brain's primary visual cortex analyze each point in the image and capture low-level information about the position and orientation of line boundaries. The model lacks the more sophisticated analysis that happens in later stages of visual processing to extract information about higher-level features of the visual scene such as shapes, surfaces or spaces between objects.
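For readers who want a feel for what such a "toy" model looks like, here is a rough Python sketch in the same spirit: a small bank of oriented Gabor-like filters followed by a trivial classifier. It illustrates the general approach only, not the authors' published model, and the filter parameters and pooling grid are arbitrary choices for this example.

import numpy as np
from scipy.signal import convolve2d

def gabor(theta, size=11, sigma=2.5, wavelength=5.0):
    """One oriented Gabor filter: a local edge detector at angle theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * xr / wavelength)
    return g - g.mean()

# Four orientations, loosely analogous to orientation-tuned V1 neurons.
FILTER_BANK = [gabor(t) for t in np.linspace(0, np.pi, 4, endpoint=False)]

def v1_like_features(image):
    """Rectified filter responses, max-pooled onto a coarse 8x8 grid.
    Assumes a grayscale array whose height and width are multiples of 8."""
    feats = []
    for f in FILTER_BANK:
        r = np.abs(convolve2d(image, f, mode="same"))
        h, w = r.shape
        pooled = r.reshape(8, h // 8, 8, w // 8).max(axis=(1, 3))
        feats.append(pooled.ravel())
    return np.concatenate(feats)

def train_nearest_centroid(features, labels):
    """Fit a per-class mean feature vector; classify by nearest centroid."""
    classes = sorted(set(labels))
    centroids = {c: np.mean([f for f, l in zip(features, labels) if l == c], axis=0)
                 for c in classes}
    return lambda feat: min(classes, key=lambda c: np.linalg.norm(feat - centroids[c]))

The point of such a baseline is that it captures only local edge position and orientation; any dataset it classifies well is probably not stressing invariance to real-world variation.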

The researchers intended this model as a straw man, expecting it to fail as a way to establish a baseline. When they tested it on the Caltech101 images, however, the model did surprisingly well, with performance similar to or better than that of five state-of-the-art object-recognition systems.

How could that be? "We suspected that the supposedly natural images in current computer vision tests do not really engage the central problem of variability, and that our intuitions about what makes objects hard or easy to recognize are incorrect," Pinto explains.

To test this idea, the authors designed a more carefully controlled test. Using just two categories--planes and cars--they introduced variations in position, size and orientation that better reflect the range of variation in the real world.
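As a rough illustration of what controlled variation could mean in practice, the sketch below uses Pillow to paste an object image onto a blank canvas at a random scale, rotation and position. The specific ranges are invented for this example and are not the values used in the study.

import random
from PIL import Image

def place_with_variation(obj_img, canvas_size=(256, 256)):
    """Paste a grayscale object image onto a blank canvas at a random
    scale, rotation and position (illustrative parameter ranges)."""
    canvas = Image.new("L", canvas_size, color=0)
    scale = random.uniform(0.4, 1.0)
    w, h = obj_img.size
    obj = obj_img.resize((max(1, int(w * scale)), max(1, int(h * scale))))
    obj = obj.rotate(random.uniform(-45, 45), expand=True)
    x = random.randint(0, max(0, canvas_size[0] - obj.size[0]))
    y = random.randint(0, max(0, canvas_size[1] - obj.size[1]))
    canvas.paste(obj, (x, y))
    return canvas

Generating the two categories this way removes the photographer's tendency to center and frame objects, which is exactly the bias the authors argue inflates scores on existing image sets.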

"With only two types of objects to distinguish, this test should have been easier for the 'toy' computer model, but it proved harder," Cox says. The team's conclusion: "Our model did well on the Caltech101 image set not because it is a good model but because the 'natural' images fail to adequately capture real-world variability."

As a result, the researchers argue for revamping the current standards and images used by the computer-vision community to compare models and measure progress. Before computers can approach the performance of the human brain, they say, scientists must better understand why the task of object recognition is so difficult and the brain's abilities are so impressive.

One approach is to build models that more closely reflect the brain's own solution to the object recognition problem, as has been done by Tomaso Poggio, a close colleague of DiCarlo's at the McGovern Institute.

Trip of a lifetime: MIT hosts next generation of science leaders


MIT Professor Eric Lander describes the future of biology in an inspirational talk to high school students as part of the annual meeting of the American Junior Academy of Science.

It's not every day that high school students get the chance to visit MIT research labs and see concepts that they've learned about in classes come to life.

But that's exactly what happened Thursday, Feb. 14, as high-schoolers from around the country descended on MIT as part of the annual meeting of the American Junior Academy of Science (AJAS).

The AJAS meeting was held in conjunction with the annual meeting of the American Association for the Advancement of Science in Boston. Most of the 120 high school students in attendance won their way to Boston through science fair projects, which they presented at a poster session on Friday, Feb. 15.

On Thursday, the students got a taste of life and research at MIT, including lab tours, an afternoon at the MIT Museum and a talk by MIT Biology Professor Eric Lander.

Lander, director of the Broad Institute, offered students a glimpse of cutting-edge research in the field of genomics--something they will not learn about in their biology classes, he said.

"Textbooks always tell you about what we know, but what's interesting is what we don't know," said Lander. "Textbooks don't like to write about what we don't know, because it's hard to test you on it."

Lander told the students that biology is in the midst of a revolution that will transform the field, much as the development of the periodic table of elements transformed the study of chemistry in the 1800s.

The sequencing of the human genome, completed in 2003, is just the first step of that revolution, Lander said. Ongoing projects to map human genetic variation and determine the function of all human genes will open even more doors.

In about 10 or 15 years, scientists will have unprecedented resources and knowledge at their fingertips to help them study how human diseases arise and how to fight them, Lander said.

"The high school students of 2025 are not going to be able to understand what it was like to study biology in the benighted 20th century," he said.

After Lander's talk, students flocked to the front of the Broad Auditorium to ask questions or have their photo taken with him.

"His talk was incredible," said Zach Silver, a student from Pine Crest High School in Ft. Lauderdale, Fla.

Students also had the chance to tour about 20 MIT research labs. One group visited the Department of Aeronautics and Astronautics, where they heard from graduate students and postdoctoral researchers working on a variety of projects.

Christy Edwards, a graduate student in aero-astro, demonstrated the microsatellites that MIT students have been developing for several years.

Three of the volleyball-sized satellites are now onboard the International Space Station, where scientists will fine-tune the satellites' performance before they are sent into space on their own. "It's like driver's ed for satellites," Edwards explained.

In the Man-Vehicle Lab, the students got a peek at the lightweight, skintight spacesuit that Professor Dava Newman and her students are designing for future excursions in space.

Sachein Sharma, a sophomore at the Texas Academy of Science, said he enjoyed the tour of MIT's aero-astro projects. "It's very interesting and exciting," he said. "MIT is a great place for that kind of thing."

Sharma won his way to the AJAS conference by designing a new type of blade for a wind turbine. He envisions that someday wind turbines could be used in space, possibly on Mars' surface.

His research advisor, Cathy Bambanek, a chemistry teacher at the Texas Academy of Science, said she was impressed that much of MIT's research seems to be student driven.

"It's amazing," she said. "It seems like the students have a lot of input as far as what kind of projects they would like to work on."

The day at MIT was hosted by biology instructor Mandana Sassanfar and sponsored by the School of Science, School of Engineering, School of Architecture and Planning, Department of Biology and MIT Museum.

Learning about brains from computers, and vice versa


For many years, Tomaso Poggio's lab at MIT ran two parallel lines of research. Some projects were aimed at understanding how the brain works, using complex computational models. Others were aimed at improving the abilities of computers to perform tasks that our brains do with ease, such as making sense of complex visual images.

But recently Poggio has found that the work has progressed so far, and the two tasks have begun to overlap to such a degree, that it's now time to combine the two lines of research.

He'll describe his lab's change in approach, and the research that led up to it, at the American Association for the Advancement of Science annual meeting in Boston, on Saturday, Feb. 16. Poggio will also participate in a news briefing Friday, Feb. 15, at 3PM.

The turning point came last year, when Poggio and his team were working on a computer model designed to figure out how the brain processes certain kinds of visual information. As a test of the vision theory they were developing, they tried using the model vision system to actually interpret a series of photographs. Although the model had not been developed for that purpose--it was just supposed to be a theoretical analysis of how certain pathways in the brain work--it turned out to be as good as, or even better than, the best existing computer-vision systems, and as good as humans, at rapidly recognizing certain kinds of complex scenes.

"This is the first time a model has been able to reproduce human behavior on that kind of task," says Poggio, the Eugene McDermott Professor in MIT's Department of Brain and Cognitive Sciences and Computer Science and Artificial Intelligence Laboratory.

As a result, "My perspective changed in a dramatic way," Poggio says. "It meant that we may be closer to understanding how the visual cortex recognizes objects and scenes than I ever thought possible."

The experiments involved a task that is easy for people, but very hard for computer vision systems: recognizing whether or not there were any animals present in photos that ranged from relatively simple close-ups to complex landscapes with a great variety of detail. It's a very complex task, since "animals" can include anything from snakes to butterflies to cattle, against a background that might include distracting trees or buildings. People were shown the scenes for just a fraction of a second, a task that relies on a particular part of the human visual cortex, known as the ventral visual pathway, to recognize what is seen.

The visual cortex is a large part of the brain's processing system, and one of the most complex, so reaching an understanding of how it works could be a significant step toward understanding how the whole brain works--one of the greatest problems in science today.

"Computational models are beginning to provide powerful new insights into the key problem of how the brain works," says Poggio, who is also co-director of the Center for Biological and Computational Learning and an investigator at the McGovern Institute for Brain Research at MIT.

Although the model Poggio and his team developed produces surprisingly good results, "we do not quite understand why the model works as well as it does," he says. They are now working on developing a comprehensive theory of vision that can account for these and other recent results from the lab.

"Our visual abilities are computationally amazing, and we are still far from imitating them with computers," Poggio says. But the new work shows that it may be time for researchers in artificial intelligence to start paying close attention to the latest developments in neuroscience, he says.

MIT's crossword king girds for annual battle of wits


Math professor and crossword puzzle fiend Kiran Kedlaya works on Friday's New York Times puzzle. He will compete in the American Crossword Puzzle Tournament on Feb. 29.
A surprising number of crossword puzzle fans have backgrounds in math, computer science or some other technical field. That's certainly the case at MIT, where graduate students in the math department gather most weekday afternoons over tea to tackle The New York Times crossword puzzle.

Teamwork is encouraged, but one person usually stays on the sidelines: associate math professor Kiran Kedlaya, a champion crossword puzzle solver who could likely finish the puzzle by himself in less than 10 minutes.

"They won't let him join, because he's too good," says Michael Sipser, head of the math department.

Kedlaya, one of the top crossword solvers in the United States, is heading to Brooklyn, N.Y., this weekend for his 11th appearance in the American Crossword Puzzle Tournament.

Kedlaya studies number theory and algebraic geometry in his academic life; he enjoys crosswords because they let him combine his math skills with his interest in words and language.

"When I do crosswords, I'm using a part of my brain I don't get to use much in my job," says Kedlaya, who also composes puzzles for MIT's Mystery Hunt, held during IAP.

His best crossword tournament finish was in 2006, when he came in second place. He began doing crossword puzzles seriously in college, then started going to the tournament, made famous in the 2006 documentary "Wordplay," while a grad student at MIT.

Kedlaya finished fourth in 2005, the year the tournament was filmed for "Wordplay," and makes a couple of brief appearances in the movie.

To get ready for the tournament, Kedlaya does two or three crossword puzzles a day, mostly from The New York Times. Practice is critical to improve speed and perform well in the tournament, he says.

"Some people might be naturally good at crosswords, but you don't get to be this fast without training," says Kedlaya. "There's something similar to athletic training going on here. People are getting into shape, doing their daily puzzles, trying to get psyched up."

The New York Times crossword puzzles, which get more difficult as the week goes on, are the gold standard by which puzzle solvers judge themselves. For a really good solver, Monday's puzzle would take about three minutes, while Friday and Saturday puzzles would take seven to 10 minutes, says Kedlaya. Sunday's puzzle, which is larger, could take eight to 15 minutes.

Unlike Scrabble, another game popular with math-oriented people, crossword puzzles require knowledge of word meanings. Kedlaya suspects that is why they appeal to disparate groups of people like mathematicians and computer scientists, writers and editors, and musicians.

"There seems to be some conflation between math skills, music skills and language skills," he says.

The national tournament draws several hundred people and is the pre-eminent event for crossword puzzle solvers. Competitors solve seven puzzles during the first day of the event, with a 15-minute time limit for each one. The atmosphere can get pretty intense, Kedlaya says.

"It's like taking the SAT," he says. "Everyone is in there working on their paper. Nobody is talking."

The solvers with the top three scores (fastest times and fewest mistakes) compete in the final round, held on the second day. In that round, which Kedlaya reached in 2006, finalists solve the puzzles on large easels at the front of an auditorium, wearing headphones to block out the color commentary broadcast to spectators.

"It is pretty stressful," Kedlaya says. "You're standing in front of a room of more than 500 people, writing on an easel that is sturdy, but not completely sturdy. If you hit it too hard, it shakes."

Teaching in front of a large classroom is excellent practice for this, he says.

For the past three years, 23-year-old Tyler Hinman has won the tournament. However, Kedlaya says he's confident about his own chances going into the tournament this year. "There are a number of solvers that have a shot at the top place, and I'm in that group," he says.

MetaRAM Develops New Technology That Quadruples Memory Capacity



MetaRAM Develops New Technology That Quadruples Memory Capacity of Servers and Workstations; Reduces Price by Up to 90 Percent

MetaSDRAM™ for AMD and Intel®-Based Systems Now Available.
MetaRAM, a fabless semiconductor company focused on improving memory performance, today announced the launch of DDR2 MetaSDRAM™, a new memory technology that significantly increases server and workstation performance while dramatically decreasing the cost of high-performance systems. Using MetaRAM's DDR2 MetaSDRAM, a quarter-terabyte, four-processor server with 16 cores starts at under $50,000*, up to a 90 percent reduction in system cost** -- all without any system modifications. MetaSDRAM, designed for AMD Opteron™ and Intel® Xeon®-based systems, is currently available in R-DIMMs from Hynix Semiconductor, Inc. and SMART Modular Technologies. Servers and workstations from Appro, Colfax International, Rackable Systems and Verari Systems are expected in the first quarter of 2008.

"I've spent my career focused on building balanced computer systems and providing compatible and evolutionary innovations. With the emergence of multi-core and multi-threaded 64 bit CPUs, I realized that the memory system is once again the biggest bottleneck in systems and so set out to address this problem," said Fred Weber, CEO of MetaRAM. "MetaRAM's new MetaSDRAM does just that by bringing breakthrough main memory capacity to mainstream servers at unprecedented price points, without requiring any changes to existing CPUs, chipsets, motherboards, BIOS or software."

MetaSDRAM is a drop-in solution that closes the gap between processor computing power, which doubles every 18 months, and DRAM capacity, which doubles only every 36 months. Until now, the industry addressed this gap by adding higher-capacity DRAM, which is not readily available and is exponentially more expensive, to each dual in-line memory module (DIMM) on the motherboard.

The MetaSDRAM chipset, which sits between the memory controller and the DRAM, solves the memory capacity problem cost effectively by enabling up to four times more mainstream DRAMs to be integrated into existing DIMMs without the need for any hardware or software changes. The chipset makes multiple DRAMs look like a larger capacity DRAM to the memory controller. The result is "stealth" high-capacity memory that circumvents the normal limitations set by the memory controller. This new technology has accelerated memory technology development by 2-4 years.

MetaRAM Company Details

MetaRAM received its first round of funding in January 2006, demonstrated its first working samples in July 2007 and released its first chipset into production in November 2007. The company was co-founded by industry luminary and former AMD CTO Fred Weber and is funded by venture firms including Kleiner Perkins Caufield & Byers, Khosla Ventures, Storm Ventures and Intel Capital.

"Kleiner Perkins invested in MetaRAM because we believed in the founders and their technical vision. MetaRAM has assembled a first class team and executed flawlessly in bringing the DDR2 MetaSDRAM chipset to market in a short period of time. MetaRAM has the leadership, vision, and talent to challenge existing technological limitations and open new capabilities for computing," said Bill Joy, Partner of Kleiner Perkins Caufield and Byers, and a member of MetaRAM's board of directors.

"The rapid adoption of Quad-Core Intel® Xeon® processors and platform virtualization, combined with the growth of data intensive applications, is driving demand for increased server memory capacity," said Bryan Wolf, managing director, Enterprise Platforms, Intel Capital. "MetaRAM's technology presented an opportunity for Intel to participate as both an investor and a strategic technology collaborator to deliver a compatible solution that enhances system performance."

MetaSDRAM Technical Details

MetaSDRAM, underpinned by more than 50 pending patents, solves the memory capacity problem affordably by enabling multiple mainstream DRAMs to look like a larger capacity DRAM to the CPU. The MetaSDRAM chipset combines four separate 1Gb DDR2 SDRAMs into a single virtual 4Gb device that behaves exactly as a monolithic 4Gb DDR2 SDRAM would.
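The following is a conceptual sketch, in Python, of the address-steering idea only: the extra high-order address bits of the larger "virtual" device select which physical chip actually responds. The real chipset implements this at the DDR2 command and chip-select level; the class below is an illustration, not MetaRAM's design.

class VirtualDram:
    """Expose several identical memory chips as one flat address space,
    the way the controller would see a single larger device."""

    def __init__(self, num_chips=4, bytes_per_chip=1 << 20):
        # bytes_per_chip is scaled down for this demo; a real 1Gb DDR2
        # part holds 2**27 bytes (128MB).
        self.bytes_per_chip = bytes_per_chip
        self.chips = [bytearray(bytes_per_chip) for _ in range(num_chips)]

    def _decode(self, addr):
        # High-order address bits pick the physical chip; in hardware this
        # corresponds to the chipset driving extra chip-select lines that
        # the memory controller never sees.
        return self.chips[addr // self.bytes_per_chip], addr % self.bytes_per_chip

    def read(self, addr):
        chip, offset = self._decode(addr)
        return chip[offset]

    def write(self, addr, value):
        chip, offset = self._decode(addr)
        chip[offset] = value & 0xFF

# The controller addresses one device four times the size of any single chip.
dram = VirtualDram()
dram.write(3 * (1 << 20) + 8, 0xAB)   # steered to the fourth physical chip
assert dram.read(3 * (1 << 20) + 8) == 0xAB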

The DDR2 MetaSDRAM chipset is optimized for low power and high performance. MetaRAM's MetaSDRAM features include:


-- WakeOnUse™ power management which improves the power efficiency of the DRAMs, thus enabling two to four times the memory to fit into a typical system's power delivery and cooling capabilities.
-- Dynamic command scheduler that ensures that the MetaSDRAM is compatible with the JEDEC DDR2 protocol.
-- Low latency circuit design and an innovative clocking scheme, which allow the MetaSDRAM-based DIMMs to fit into existing memory controller designs.
-- Unique split-bus stacked DRAM design that enables flexible access of the multiple DRAMs in a stack.


MetaSDRAM Chipset Availability


-- MetaSDRAM MR08G2 chipset enables 2-rank 8GB DIMMs and is capable of functioning at speeds up to 667MT/s. It consists of an AM150 Access Manager and 5 FC540 Flow Controllers working as a group. The chipset is currently in full production and is available at $200 each in 1,000 kit quantities.
-- MetaSDRAM MR16G2 chipset enables 2-rank 16GB DIMMs and is capable of functioning at speeds up to 667MT/s. It consists of two AM160 Access Managers and 9 FC540 Flow Controllers. The chipset is qualified for production and is priced at $450 each in 1,000 kit quantities.


Compatible Platforms


-- AMD: Platforms based on Dual-Core and Quad-Core AMD Opteron™ processors
-- Intel: Platforms based on Dual-Core and Quad-Core Intel® Xeon® processors with the 5100 MCH


Module Availability

Modules are currently available from:


-- Engineering samples are currently available from Hynix Semiconductor: 8GB PC2-4200 R-DIMM Module (HYMP31GP72CUP4-C6). For more information, please visit www.hynix.com
-- Qualification samples are currently available from SMART Modular Technologies (NASDAQ: SMOD): 8GB PC2-4200 R-DIMM Module (SG5721G4MG8C66HM), $1500 budgetary pricing. For more information, please visit www.smartm.com

Server and Workstation Availability

Servers and workstations are expected in Q1 from:


-- Appro: Appro XtremeServers and XtremeWorkstations. For more information please visit www.appro.com
-- Colfax International: Colfax CX1254-N2 and CX1460-N2 1U Rackmount Servers and Colfax High-End Workstation CX980. To configure and purchase please visit www.colfax-intl.com
-- Rackable Systems
-- Verari Systems: Newly introduced BladeRack® 2 X-Series blade-based storage and server solutions, and Verari's high-end visualization workstations. For more information please visit www.verari.com


Target Markets

MetaRAM products are designed for high performance rack-mount servers and workstations that run compute-intensive applications such as CAD/EDA simulations, database transaction processing (OLTP), business intelligence, digital content creation, and virtualization. These and other heavy workload applications are the backbone of industries like aerospace, automotive, financial services, animation, oil and gas exploration, and semiconductor design and simulation.

MetaRAM is headquartered in San Jose, Calif. and employs 35 people. More information on MetaRAM and its breakthrough product can be found on its web site at www.MetaRAM.com.

About MetaRAM

MetaRAM is a fabless semiconductor company focused on improving memory performance. The company's first product -- MetaSDRAM™ -- enables four times the amount of standard memory to be placed into existing systems without any modifications. The company is privately held, and venture funded by Kleiner Perkins Caufield and Byers, Khosla Ventures, Storm Ventures, and Intel Capital and is headquartered in San Jose, California. For more information, please go to www.metaram.com.

MetaSDRAM and WakeOnUse are Trademarks of MetaRAM

All other trademarks are the property of their respective owners

*The Colfax CX1460-N2 is a 1U rackmount server with 256GB of DDR2 memory and four AMD Opteron 8000 series processors. Starting under $50,000.






New Server Chips Quadruple Memory Capacity
Startup Metaram has developed a technique to pack more RAM onto a memory module.
Startup company Metaram on Monday is expected to announce technology that overcomes traditional server memory limitations and allows users to quadruple memory without adding new hardware.

Targeted at servers, the MetaSDRAM chipset sits between the DRAM module and a memory controller, processing commands and manipulating the controller to allow the system to have up to four times more memory.

The capability of Metaram's chipset to read the additional memory means memory makers can pack more RAM on a memory module, overcoming limitations that typically throttle the amount of memory that can fit in servers.

For example, an 8-socket x86 server is typically limited to 256GB of RAM, but MetaSDRAM chipsets quadruple that to 1TB of RAM.

"That allows the system to overcome traditional limitations to read the additional RAM on a [memory module]," said Jeremy Werner, senior manager of marketing at Metaram.

The ability to plug four times the memory into a slot on a motherboard is very attractive and allows servers to perform better, said Nathan Brookwood, an analyst at Insight 64. "If you can put a terabyte of memory in a system, your entire Oracle database can sit in the memory. That's a rocket booster," Brookwood said.

It also results in cost savings, Brookwood said. Users can add four times the memory capacity without adding CPUs, he said.

Memory manufacturers can add the chipset to existing memory module designs, according to the company. Hynix and Smart Modular Technologies are building the technology into their memory modules, according to Metaram.

Metaram is shipping separate chips that can help double and quadruple the DRAM capacity of memory modules. The MetaSDRAM MR08G2 chip, which helps double the capacity of memory modules, is available to memory makers for US$200 in quantities of 1,000. Metaram did not share pricing information on the chipsets that quadruple memory. The chips are compatible with Advanced Micro Devices- and Intel-based x86 systems, Metaram said.

With the MetaSDRAM chips, Metaram has found a way for users to fit memory modules into existing infrastructure that users can adopt quickly, Brookwood said. This follows the rationale of Fred Weber, one of the founders of Metaram and former chief technology officer for Advanced Micro Devices.

"It reflects the same design philosophy when AMD came up with their Opteron boxes," Brookwood said. "Intel said x86 couldn't do 64-bits, but Weber said that the problem with Itanium is it doesn't fit into existing infrastructure," Brookwood said. Weber and AMD figured out how to fit the 64-bit architecture into chips that could be implemented into existing infrastructure, Brookwood said.

While Metaram's technology overcomes bottlenecks facing traditional system architecture, it could have its limits, analysts said.

"It is not a revolutionary product, but it is a novel way to handle additional memory," said Will Strauss, principal analyst at Forward Concepts. PCs and servers support only limited memory today, and this product will be effective until new PC designs are introduced in the future, he said.

Adobe AIR launches


Adobe Systems on Monday is set to finally release its Adobe Integrated Runtime (AIR) software, which is on the leading edge of a movement to make Web applications act more like traditional desktop applications.

At the company's Engage event in San Francisco on rich Internet application design, executives will announce the availability of AIR 1.0, a free download for Windows and Macintosh.



The wall between the web and your computer continues to crumble with today’s launch of Adobe AIR, a runtime environment that allows you to deploy Internet applications on the desktop. AIR has already been available in public testing mode for several months, but the official launch should lead to greater usage — and, if we’re lucky, a flood of innovative web/desktop hybrids.

AIR offers a “best of both worlds” approach, says Michele Turner, an Adobe vice president of product management and marketing. Web developers can use the technologies they’re used to, such as HTML and Ajax, and the applications can be built quickly and accessed remotely. But, like a desktop program, AIR apps can also read and write local files, as well as work with other applications on your computer.

AIR’s official launch puts it ahead of competitors JavaFX and Mozilla Prism, which are still in development or public testing. (Many think of Microsoft Silverlight as a competitor, but that’s a misconception, Turner says, because it’s a browser plug-in, not a desktop environment.)

Adobe is also releasing Adobe Flex 3, a tool for building Flash applications. Like AIR, Flex was already available in public testing mode. Components of the Flex software developer kit were already open source, but with this release the Flex SDK is now completely open.

A number of big financial players will use Adobe AIR to keep customers up-to-date about site news and account status, including eBay, Deutsche Bank and NASDAQ. Cable TV children's channel Nickelodeon has created a video jigsaw puzzle application, and The New York Times is using AIR to build the desktop component of ShifD, which will allow Times readers to move newspaper content back and forth between their computers and their mobile devices.

Start-ups are already making use of AIR too. For example, Unknown Vector used AIR to build its desktop video player (our coverage), and Acesis, which launched at last month's DEMO, based parts of its medical-records software on AIR (our coverage).

Adobe today announced the availability of its Adobe Integrated Runtime (AIR) cross-operating system for taking rich Internet applications (RIA) to the desktop.

Adobe also released Flex 3, an open-source development tool set aimed at helping developers build RIAs.

AIR is a runtime environment for building RIAs in Adobe Flash, HTML and AJAX. The product includes the Safari WebKit browser engine, SQLite local database functionality, and APIs that support desktop features such as native drag and drop and network awareness.

Nasdaq Stock Market Inc. and the American Cancer Society are among several organizations running beta versions of AIR to bridge the gap between the Web and the desktop. Both said they turned to the technology because it doesn't require that developers learn new skills.

"AIR takes the capabilities of Flex and Flash and extends that to the desktop," said David Wadhwani, general manager and vice president of Adobe's platform business unit. "With the release of AIR, we've expanded our developer base to the millions of AJAX and HTML developers of the world."

Wadhwani added that FedEx Corp. has developed an AIR application to track packages in real time on the desktop, and Deutsche Bank AG is using AIR to provide alerts about financial transactions.

In addition, business intelligence software vendor Business Objects SA has been working with Adobe to develop reports on transactional data that run in AIR and can be e-mailed to multiple users who can then access live feeds from those reports to do an analysis, he said.

Adobe also released Flex Builder 3, its commercial Eclipse-based plug-in for developing RIAs. Flex Builder 3 integrates with Adobe's Creative Suite 3 set of tools to make it easier for designers and developers to work together, Adobe said. It will be available in two versions: The standard edition is $249, while the professional version costs $699.

Finally, Adobe also made available its BlazeDS open-source tool that promises to help developers boost the data transfer capabilities and performance of RIAs. BlazeDS is made up of components from Adobe's LiveCycle Data Services suite.

Sunday, February 24, 2008

U.S. 'confident' over satellite hit



Videotape of the Navy mission to shoot down a dying spy satellite made available Thursday shows an interceptor missile ascending atop a bright trail of burning fuel, and then a flash, a fireball, a plume of vapor. A cloud of debris left little doubt that the missile had squarely hit its mark as the satellite spent its final days orbiting more than 130 miles above the Pacific Ocean.
A different kind of doubt still lingers, though, expressed by policy analysts, some politicians and scientists, and not a few foreign powers, especially China and Russia:

Should the people of the world be breathing a sigh of relief that the risk of a half-ton of frozen, toxic rocket fuel landing who knows where has passed? Or should they be worried about the latest display of the United States’ technical prowess, and see it as a thinly veiled test for a shadow antisatellite program?

Defense Secretary Robert M. Gates, who personally gave the order to go ahead with the satellite shootdown Wednesday, told reporters in Hawaii on Thursday that he was prepared to share some details of the operation with China to ease its concerns that the debris might still prove dangerous. Adm. Timothy J. Keating, the commander of American forces in the Pacific, has reached out to several nations in the region to explain the mission, as well.

Addressing the diplomatic concerns, senior officials dismissed questions raised by the Chinese and the Russians, and echoed by some arms control analysts, about whether the episode was really a test of space weaponry. They pointed out that the missile used in the operation, the Navy’s SM-3 interceptor, was designed to counter a limited ballistic missile attack and had to be reprogrammed for this unexpected task, the likes of which the authorities are unlikely ever to face again.

In missile defense, an interceptor must find a red-hot enemy warhead as it arcs on a relatively short ballistic path, a task often described as “hitting a bullet with a bullet.” This time, the target — much larger than a warhead, almost the size of a school bus — was circling Earth predictably about 16 times a day.

It was still a bit of a long shot. The fuel tank that was the bull’s eye was only about 40 inches across.

And although the United States has hit test targets in space before — including a satellite destroyed in 1985 in a demonstration of an antisatellite weapon launched from a fighter jet — the successful demonstrations have been relatively few and far between.

What Wednesday’s successful strike in space conclusively proved was not infallibility but a robust and flexible military capability that can be cited by either side in what no doubt will be the ensuing debate.

The mission was conducted from Navy warships. So the United States can move this capability at will over three-quarters of the globe.

The missile-defense interceptor was converted to an antisatellite capability in little more than a month. No expensive research and development program. No battles with Congress over money. No starting from scratch on white boards in some laboratory.

This demonstration of military agility has to cause any adversary to pause.

“This was uncharted territory,” said Gen. James E. Cartwright of the Marines, who is vice chairman of the Joint Chiefs of Staff. “The technical degree of difficulty was significant here.”

General Cartwright noted that important elements of the nation’s missile defense system had been used, in particular the sensors.

“That was the key piece that we would take from the missile defense system,” he said.

To ready the missile-defense rocket for the mission, he said: “We added a lot of instrumentation. We made some modifications to the software to be able to go after a satellite.”

In somewhat theatrical language, the mission was hailed by Riki Ellison, president of the Missile Defense Advocacy Alliance, one of the more energetic groups promoting the development of ballistic missile defenses.

“The factual reality of using deployed missile defenses to destroy a falling satellite or a ballistic missile or even a meteor from space that would risk human life is an achievement for mankind,” a statement from the organization said.

Yet, even the successful mission in no way proves that the United States is safe from nuclear attack, or that it can do what it wants in space.

Mr. Gates, at the start of a weeklong series of meetings in Asia, said that the debate over whether the United States’ missile defense system worked was “behind us” but that issues remained about exactly what types of missile threats the system could be used against.

“The question of whether this capability works has been settled,” Mr. Gates said in Hawaii after a tour of the destroyer Russell, which participated in the satellite operation. “The question is against what kind of threat, how large a threat, how sophisticated a threat.”

The White House and the Pentagon said the hazard posed by the failed National Reconnaissance Office satellite was from its hydrazine fuel. It may be 24 to 48 hours before officials can state with certainty that the fuel tank was punctured and that the hydrazine is no longer a threat.

But Representative Edward J. Markey, a Massachusetts Democrat on the House Homeland Security Committee, said, “The geopolitical fallout of this intercept could be far greater than any chemical fallout that would have resulted from the wayward satellite.”

Mr. Markey said: “The Bush administration’s decision to use a missile to destroy the satellite based on a questionable ‘safety’ justification poses a great danger of signaling an ‘open season’ for other nations to test weapons for use against our satellites. Russia and China are sure to view this intercept as proof that the United States is already pursuing an arms race in space, and that they need to catch up.”

The Chinese warned Thursday that the United States Navy’s action could threaten security in outer space. Liu Jianchao, the Chinese Foreign Ministry spokesman, said at a news conference in Beijing that the United States should promptly share data about the passage of the remaining pieces of the satellite.

“China is continuously following closely the possible harm caused by the U.S. action to outer space security and relevant countries,” Mr. Liu said, according to The Associated Press.




The U.S. is confident that its shooting down of a disabled spy satellite with a missile managed to destroy its potentially toxic fuel tank.


Marine Gen James Cartwright said there was an 80-90% chance that the satellite's fuel tank had been destroyed.

A fireball, a vapor cloud and spectral analysis showing the presence of hydrazine all indicated that the tank had been hit, he told reporters.

The operation has been criticized by China and Russia. "We're very confident that we hit the satellite," Gen Cartwright said at a Pentagon briefing hours after the missile was fired. "We also have a high degree of confidence that we got the tank."

It would take another 24-48 hours for officials to confirm whether the operations had been completely successful, he said.

Gen Cartwright said he could not rule out that hazardous material might fall to earth, but said there was no evidence of this happening so far.

He added that officials would continue to track debris falling over the Atlantic and Pacific Oceans over the next two days. "Thus far we've seen nothing larger than a football," he said.

The satellite, USA 193, was struck 153 nautical miles (283 km) above earth by an SM-3 missile fired from a warship in waters west of Hawaii.

Arms race?

Operatives had only a 10-second window to hit the satellite, which went out of control shortly after it was launched in December 2006.

The missile needed to pierce the bus-sized satellite's fuel tank, containing more than 450kg (1,000lbs) of toxic hydrazine, which was otherwise expected to survive re-entry.

USA 193 lost control shortly after launch on a Delta II rocket. China called on the US on Thursday to provide more information about the mission.

Russia suspects the operation was a cover to test anti-satellite technology under the U.S. missile defense program.

The U.S. denies the operation was a response to an anti-satellite test carried out by China last year, which prompted fears of a space arms race.

US officials had said that without an attempt to destroy the fuel tank, and with the satellite's thermal control system gone, the fuel would have been frozen solid, allowing the tank to resist the heat of re-entry.

If the tank were to have landed intact, it could have leaked toxic gas over a wide area - harming or killing humans if inhaled, officials had warned.

""The intent here was to preserve human life ... it was the hydrazine we were after,"" Gen Cartwright said on Thursday.

The U.S. has also denied that it shot down the satellite to prevent parts of it from falling into the hands of foreign powers.

Gen Cartwright said most of the satellite's intelligence value was likely to have been destroyed.
