
Saturday, October 27, 2007

Intel Chip Production for Next-Gen Processors: 45nm Microprocessors

Volume production of a new generation of microprocessors for desktop personal computers, laptops, servers and other computing devices officially began today inside Intel Corp.'s first high-volume 45nm manufacturing factory in Chandler, Arizona. The first 45nm chips, which were produced at Intel's D1D fab in Hillsboro, Oregon, will be unveiled next month and will serve high-performance desktops and mainstream servers.

Intel (NSDQ: INTC) has started high-volume production of its next-generation quad-core microprocessors, which the company plans to start selling next month.
Production of the 45-nanometer chips started at the company's Chandler, Ariz., manufacturing plant, called Fab 32, on Thursday. The microprocessors are built to pack more transistors than ever before on a single chip, which means more processing power at the same level of power consumption as older chips.

Intel's $3 billion Fab 32 is the first factory to start volume production of the new chips. The processors will eventually replace older models throughout Intel's product line for PCs, laptops, and servers, as well as ultra-low power chips for mobile Internet and consumer electronic devices.

Intel's first 45-nm chip, codenamed Penryn, will be a quad-core desktop processor in the form of the Core 2 Extreme QX9650. The chip ships Nov. 12 at a clock speed of 3.0 GHz, making it Intel's fastest quad-core processor. Pricing hasn't been released, but media reports have pegged it at $999 to computer makers.

Besides having a better performance-to-power ratio, Intel's 45-nm line will also be more environmentally friendly, because it won't use halogens.

Fab 32, Intel's sixth 300mm wafer factory, is the second factory to make the new processors, although it's the first to begin high-volume production. The first to make a 45-nm chip for Intel was the company's Oregon development facility, called D1D, in January. D1D is now moving toward high-volume production, along with two other Intel plants, Fab 28 in Israel and Fab 11x in New Mexico. The latter two, which will also use 300mm wafers, are scheduled to start production next year. The use of such large silicon wafers is a plus because it lowers manufacturing costs.

At 1 million square feet, Fab 32 is the size of more than 17 football fields. More than 1,000 people work at the plant, which is among Intel's most environmentally friendly factories with a 15% reduction in global warming emissions.

Intel is ahead of rival Advanced Micro Devices in 45-nm manufacturing. AMD won't ship similarly made chips until sometime in 2008. Intel's smaller competitor shipped its first quad-core processor in September, about a year after Intel.


"Without carbon capture and sequestration, we are all toast."

Research on a dire problem--carbon capture--gets going
Jiang Lin, a scientist with the China Sustainable Energy Program at Lawrence Berkeley Lab, issued that gloomy proclamation earlier this week, and it's a fitting description of the current world situation when it comes to global warming. To make matters worse, when I asked Lin how the world is responding to the challenge, the answer was: not well.

"We haven't invested in deep research or spent much money in testing out the scenarios," he said. "There are a lot of uncertainties."

Still, it's not over yet, and the University of Texas this week announced it has received a $38 million grant to study the feasibility of injecting carbon dioxide into brine-filled underground wells over a 10-year period.

The project is part of the Southeast Regional Carbon Sequestration Partnership (SECARB), funded by the National Energy Technology Laboratory of the Department of Energy. SECARB's goal is to study carbon-dioxide injection and storage capacity of the Tuscaloosa-Woodbine geologic system that stretches from Texas to Florida. The region has the potential to store more than 200 billion tons of the gas, which the department says is equal to about 33 years of emissions.
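As a quick sanity check on those figures (a back-of-envelope sketch, not taken from the DOE report), dividing the claimed storage capacity by the stated time span gives the implied annual emissions rate:

```python
# Back-of-envelope check of the DOE figure: 200 billion tons of CO2
# storage capacity equated to roughly 33 years of emissions.
capacity_tons = 200e9   # claimed capacity of the Tuscaloosa-Woodbine system
years = 33              # years of emissions the DOE says this equals

implied_annual_emissions = capacity_tons / years
print(f"Implied annual emissions: {implied_annual_emissions / 1e9:.1f} billion tons/year")
# roughly 6 billion tons per year, consistent with U.S. CO2
# emissions in the mid-2000s
```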

Beginning in the fall, SECARB scientists will start to inject a million tons of carbon dioxide a year into a brine reservoir near Natchez, Mississippi. The brine is up to 10,000 feet below the surface.

In some ways, the U.S. is the Saudi Arabia of gaping holes. The U.S. has produced more oil than anywhere else in the world, historically speaking--250 billion gallons have been sucked out of the ground here--so there is lots of empty space underground, according to Chevron CTO Don Paul, who spoke this week at the Dow Jones Alternative Energy Innovations Conference.

Sequestration, though, poses logistical and financial challenges, Paul said. Just to capture the carbon dioxide coming out of power plants, factories and other "stationary" carbon-dioxide emitters, it would take an infrastructure the same size as the natural gas infrastructure.

"That's a lot of pipe," Paul said. Paul also issued some interesting facts on peak oil.

UT awarded $38 million to study carbon dioxide storage
A 10-year, $38 million project to study the feasibility of storing carbon dioxide underground to combat global warming has been awarded to the University of Texas.

The university's Bureau of Economic Geology will inject carbon dioxide into brine formations deep underground about 15 miles east of Natchez, Miss. It's thought that sequestering major greenhouse gases emitted by power plants and other sources could reduce atmospheric emissions that contribute to global warming.

"This is the next step in a series of bureau-led experiments to test much-needed carbon capture and storage technologies," said Scott Tinker, the state geologist and director of the bureau, a unit of UT's Jackson School of Geosciences.

The Southeast Regional Carbon Sequestration Partnership, a federally funded organization, awarded the project. Various universities, corporations and national laboratories are also involved in the project, but the bureau is coordinating it, UT spokesman J.B. Bird said.

Officials will begin injecting carbon dioxide this fall into the formations of brine, or salt water. Key tasks include estimating the storage capacity of such reservoirs and developing methods for documenting carbon dioxide retention. The project is the first intensively monitored, long-term effort of its type in the nation.

BIONIC ARM: An innovative artificial limb that controls movements by thought.


A man who lost both of his arms in an accident is getting some high-tech help with an innovative artificial limb that controls movements by thought.

Two years ago, a healthy Jesse Sullivan, 56, was at his job repairing utility lines when he accidentally touched a live wire, costing him both arms up to the shoulders.

Like most amputees, Sullivan was fitted with a traditional artificial prosthesis, relying on chains and buttons to move his arm. But then doctors at the Rehabilitation Institute of Chicago offered a "bionic" arm for his other lost limb, putting him at the forefront of biomechanical technology.

"I didn't really didn't know what was available. It was a scary thing," Sullivan remembers. "I thought maybe it would be like the 'Six Million Dollar Man' [on TV]."

To get the new arm, Sullivan first underwent surgery to graft existing nerve endings from his shoulder onto the pectoral muscle on his chest. Those nerves grew into the muscle after about six months. Electrodes on the graft can now pick up any thought-generated nerve impulses to the now-absent limb and transmit those to the mechanical prosthesis, controlling the movements of the arm.

Sullivan's doctor says this is the first time a nerve-muscle graft has been used to control an artificial limb.

Now, when Sullivan thinks about closing his hand, the nerve that used to make the hand close spurs a little piece of his chest muscle to contract, said Dr. Todd Kuiken, one of Sullivan's doctors at the Rehabilitation Institute of Chicago. Sensors over that muscle then tell the hand to close via tiny connecting wires.

"This is 1920s surgery but it's for a 21st century application," said Kuiken. "So what's really novel about this is not so much the surgical technique but the reason for doing the surgery and using it to help control artificial limbs and make them work better."

New wave technology
Some researchers have used electrodes implanted in the brain or in the scalp, while others have experimented with detectors outside the body, such as in Sullivan's case. But the basic idea behind neuroprosthetic devices is the same: creating communication between the brain and the outside object that needs to be moved.

"To move something, you have to get a command signal from the brain to [an object,] whether it's a wheelchair, robot or your own arm," explains Dr. Brian Schmit, an assistant professor of biomedical engineering at Marquette University.

The technology has the potential to dramatically change the lives of those who have lost limbs or who are paralyzed. Some of the thought-driven devices being developed could navigate wheelchairs, control a robotic arm's movement, or even move a computer mouse, according to Schmit.
"If you can provide access to a computer to someone who can't move, it opens a lot of doors," Schmit says. "The brain can change to new circumstances. The body has an amazing ability to learn and adapt."

About 8 percent of the estimated 387,500 amputees in the United States have lost an arm or both arms, according to the Northwestern University Prosthetic-Orthotic Center.

"With these new prostheses, these patients can now use their nerves in the natural way to control their artificial hand so that you have a more natural feel to its use," Kuiken said. "It's faster and more agile."

For now, the medical procedure performed on Sullivan is limited to amputated arms. The hope is that one day it can be applied to other limbs as well.

Sullivan said the initial uncertainty of the experimental surgery was worth it if it can help others later.

"If this benefits another person then it is well worth it," Sullivan said. "And it has benefited me so I'm well satisfied."

Southern California Devastated by Wildfires: Is It an Effect of Global Warming?

Around a million people have already been displaced by the disastrous fires in Southern California, which threaten the rich and the poor alike and are still burning out of control. Most of those affected are in the San Diego area, where hundreds of homes have been destroyed and hundreds of thousands have fled their homes to protect their lives. President Bush, who declared a state of emergency in California, is scheduled to visit the area Thursday.
But scientists said the extreme conditions that stoked the wildfires could become more common as the world warms.

Are the massive fires burning across Southern California a product of global warming?

Scientists said it would be difficult to make that case, given the dangerous mix of drought and wind that has plagued the region for centuries or more.

Still, research suggests that rising temperatures are already increasing fire damage in many parts of the West.

In a study published last year in the journal Science, researchers looking at Western federal forests found nearly seven times more land burned from 1987 to 2003 than in the previous 17 years.

The analysis mainly attributed this to a 1.5-degree rise in average spring and summer temperatures. With spring arriving earlier and snow melting faster, the forests dried out sooner, extending the average fire season by more than two months.

The study, however, found Southern California was different from the rest of the West, with no increase in the frequency of fire as temperatures rose.

"In Southern California, it's hot and dry much of the year," said Anthony Westerling, a climate scientist at UC Merced and the study's lead author. In other words, Southern California was already perfect for fire.

"That is a fire-prone environment regardless of whether we are in a climate-change scenario," said Tom Wordell, a wildfire analyst at the National Interagency Fire Center in Boise, Idaho. "I don't want to be callous, because many people are homeless and suffering, but if you live in a snake pit, you're going to get bit."

But eventually global warming could make Southern California's occasional droughts more persistent, exacerbating the fire danger.

Conditions as dry as the Dust Bowl of the 1930s could prevail in the Southwest by the middle of this century, according to a study published this year in Science.

The researchers based their conclusion on climate models showing how warmer temperatures would expand the reach of a powerful atmospheric circulation pattern known as the Hadley cell. Changes to the cell would dry air through the subtropics, including a swath from Colorado to California.

The study suggested that the transformation may already be underway. The Southwestern United States has been in drought since 2000, although tree-ring records show there have been far drier periods during the last millennium.

Scientists said more persistent drought would inevitably lead to more fires, as long as intermittent periods of moisture allowed vegetation to grow as fodder for flames.

In Southern California, hillsides were ripe for fire because big rains two years ago allowed vegetation to flourish, then severe drought during the last year dried it out.

The future of the Santa Ana winds is harder to predict.

Norman Miller, a climate scientist at Lawrence Berkeley National Laboratory, published an analysis last year in Geophysical Research Letters predicting that rising temperatures, fueled by greenhouse gas emissions, would eventually push peak Santa Ana winds from mid-October to late November.

That could, over decades, make fires worse by giving the landscape more time to dry out. Global warming, he said, could intensify wind flow by increasing the difference between inland and coastal temperatures.

The Santa Ana winds form when high-pressure air in the Great Basin, between the Rocky Mountains and the Sierra Nevada, rushes toward the low pressure of the coastlands. As the air descends, it heats up, gaining speed as it is forced through narrow canyons.

The winds, which typically gust to 45 mph, were recorded at more than 80 mph several times this week -- strong but inside the range of normal variability.

iPhone off the charts: Apple limits iPhone to two per customer

Apple has begun limiting sales of its popular iPhone to two per customer and no longer accepts cash for payment.

"Customer response to the iPhone has been off the charts, and limiting iPhone sales to two per customer helps us ensure that there are enough iPhones for people who are shopping for themselves or buying a gift," Apple spokeswoman Natalie Kerris said

“We’re requiring a credit or debit card for payment to discourage unauthorized resellers,” said Apple spokesperson Natalie Kerris, calling the demand for the iPhone "off the charts."

The previous limit was five iPhones per person, which tempted hackers to buy in quantity to resell them or unlock them from their intended use with AT&T Mobility, ZDNet reported.

Apple believes that of the 1.4 million iPhones bought since they came on the market in late June, an estimated 250,000 were purchased with the intent to unlock them, Wired News said Saturday.

The new restrictions also are aimed at ensuring enough iPhones for increased holiday sales. Analysts estimate Apple's iPhone manufacturing capability is restricted to no more than 500,000 iPhones a month.

The iPhone soon will launch in Germany, the United Kingdom and France and with holiday sales gearing up, Apple wants to ensure no market runs out of phones, Wired News reported.

Newtoon teaches Physics on your Phone

A mobile phone and web-based gaming activity that embeds physics learning into the core of its application, Newtoon is a collaborative project between UK-based Futurelab and Soda Creative that is designed to encourage children to create, play, edit and share micro-games based on Newton's laws of physics.

By motivating children to make use of their own phones for learning and encouraging mobile applications within the classroom, the project aims to offer teachers an engaging and exciting new tool for education, as well as to inspire students to bring science into their lives outside the school walls. There are two key aspects: the 'microlab', which allows teachers to demonstrate and explain physics principles, and the 'microgame', which allows pupils and teachers to create their own games based on these principles, explained in the scenarios given on their website:

Scenario 1

A science teacher is anxious about KS3 Unit 8J: Magnets and electromagnets. She wonders how she can excite her pupils about the world of magnetism. The teacher launches Newtoon on the whiteboard and searches for a tutorial on 'magnets'. She opens a research microlab and by moving and rotating the bar magnet, she demonstrates that the ferrous bar always attracts while the bar magnet both attracts and repels depending on polarity. On their desktops, the pupils then select 'dog's dinner', a micro-game which explores magnets. Racing against the clock, the pupils steer a dog towards the bone, avoiding the magnetic forces.

Scenario 2

During the science lesson, all the pupils' games are collected into a game-carousel at the Newtoon website. At home, a pupil, Laura, loads the game-carousel onto her mobile phone and challenges her family to play her creations. "How does it work?" her mum asks. Laura explains that her game, 'dream-date', uses magnetic variables to make her game characters attract and repel each other depending on how 'cute' they are, using pictures she has imported from the internet. She then shows her mum that her game has been the most played by her classmates, and that she has improved in her understanding of physics.

Having been prototype tested in schools around the UK already this year, and with trials due to launch any day now, this is an exciting new system for the future of learning that may finally begin to bring about the much-debated rethink of how teaching is done.

'The evolution of a gaming community has the potential to invoke an interactive and collaborative classroom culture with doing, debating and deliberating science at its heart. This will involve exploring the possibilities of a 21st century science curriculum.'


Newtoon is a mobile phone and web activity which aims to embed physics learning in mobile gaming. It enables young people to author, play, edit and share fast-paced microgames for their mobile phones, where game rules are based on a set of Newtonian physics principles.
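The kind of magnet micro-game described in the scenarios above can be sketched in a few lines. The Python snippet below is purely illustrative: the function names, constants, and Euler integration scheme are my assumptions, not taken from the actual Newtoon engine. It shows a game object being pulled toward (or pushed away from) a fixed magnet by an inverse-square force.

```python
import math

def magnet_force(pos, magnet_pos, strength):
    """Inverse-square force on an object; positive strength attracts,
    negative strength repels."""
    dx = magnet_pos[0] - pos[0]
    dy = magnet_pos[1] - pos[1]
    r = math.hypot(dx, dy)
    if r < 1e-6:              # avoid division by zero at the magnet itself
        return (0.0, 0.0)
    f = strength / (r * r)    # inverse-square magnitude
    return (f * dx / r, f * dy / r)

def step(pos, vel, magnet_pos, strength, dt=0.1):
    """One Euler integration step: update velocity from the force,
    then position from the velocity."""
    fx, fy = magnet_force(pos, magnet_pos, strength)
    vel = (vel[0] + fx * dt, vel[1] + fy * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel

# An attracting magnet at x=10 pulls the object rightward over a few steps.
pos, vel = (0.0, 0.0), (0.0, 0.0)
for _ in range(10):
    pos, vel = step(pos, vel, magnet_pos=(10.0, 0.0), strength=50.0)
print(pos)  # x has increased toward the magnet
```

Flipping the sign of `strength` turns attraction into repulsion, which is exactly the attract/repel-by-polarity behaviour the magnet scenario demonstrates.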


Senate approves extension of Internet tax ban

The Senate has approved legislation extending a moratorium on state Internet access taxes for seven years.

With only days left before the Internet tax ban was set to expire, the Senate reached a compromise between lawmakers who proposed a shorter extension and those who insisted it should be made permanent.

"By keeping the Internet tax-free and affordable, Congress can encourage Internet use for distance learning, telemedicine, commerce and other important services," Sen. Ted Stevens, of Alaska, said in a statement on Thursday night.

The vote came about two weeks after the House of Representatives approved a four-year extension of the Internet tax ban.

The two chambers must work out their differences on the bill before a final version can be approved and signed by President George W. Bush.

On Friday, Bush listed the Internet tax ban extension among a list of tasks that Congress had failed to accomplish.

"I urge Congress to keep the Internet tax-free -- and to get a bill to my desk that I can sign," Bush said.

The state tax ban has been in place since 1998. It was last renewed by Congress in 2004 for three years and is scheduled to expire in November.

Internet service providers say the price of Internet access could rise by as much as 17 percent if the moratorium on state taxes were allowed to expire.

Some senators, including many Republicans, had argued that a permanent ban on Internet taxes is needed to spur more investment by broadband service providers. They complained that Senate Democratic leaders had blocked a vote on a permanent moratorium.


Space-Based Solar Power Effort

Pentagon Promotes Space-Based Solar Power Effort

Huge solar arrays placed into a continuously and intensely sunlit orbit around the earth would be able to generate gigawatts of electrical energy that could be electromagnetically beamed back to earth. The receiving stations down on the ground would be designed to deliver the power to the existing electrical grid.

A new report from the Pentagon's National Security Space Office (NSSO) postulates that space-based solar-power platforms could begin fulfilling planetary demand for electricity by 2050. The report noted that while significant challenges remain, the technologies for making extraterrestrial relay stations a reality "are more executable than ever before and current technological vectors promise to further improve its viability."
The study's authors are advising the U.S. government to inaugurate a coordinated national program for fostering the technology's development, with the first step consisting of a proof-of-concept demonstration in outer space.

The best way to convince the public that the concept is viable is to show people that the technology actually works, said NSSO spokesperson Lt. Colonel Paul Damphousse. "It's not a stretch to prepare equipment to put on the space station to demonstrate beaming" and to test other vital components, Damphousse noted.

A Flying Hoover Dam

The new NSSO study, which includes input from more than 170 experts worldwide, might seem like science fiction to some, but so did the article "Extra-Terrestrial Relays" published by mathematician and science fiction author Arthur C. Clarke in 1945. Today the world derives incalculable benefits from Clarke's pioneering vision of how communication platforms in geostationary orbit over the earth's equator could relay TV and radio programs to virtually every inhabitable place on the planet.

The space station would give scientists the ability to test a wide variety of devices and component technologies far more rapidly than would be possible anywhere else in space right now, said John Mankins, president of the Space Power Association and a contributor to the report.

"We could use it to validate key concepts of operations: automated assembly, repair, maintenance," Mankins explained. "And it could be a staging point for larger-scale demonstrations" which are "achievable within a decade, not 50 years away."

The first large-scale system could plausibly be on the scale of the Hoover Dam, which would represent enough power to light a city, Mankins noted. But the power could also "be directed to more than one ground location where the markets are. It will be a matter of identifying the new opportunities, project by project."

Extraterrestrial Power Relays

According to the NSSO's Space-Based Study Group, a single kilometer-wide band of geosynchronous earth orbit experiences enough solar flux in one year to nearly equal the amount of energy contained within all known recoverable conventional oil reserves on Earth today.
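That claim can be checked roughly. The sketch below is a back-of-envelope estimate using assumed round numbers for the orbit radius, solar constant, and mid-2000s oil-reserve figures; none of the inputs are taken from the NSSO report.

```python
import math

# Rough check: does a 1 km-wide band of geosynchronous orbit really
# intercept about as much solar energy per year as all known
# recoverable oil reserves contain?
GEO_RADIUS_M = 42_164_000     # geosynchronous orbit radius (approx.)
SOLAR_CONSTANT = 1366         # W/m^2 at Earth's distance from the sun
SECONDS_PER_YEAR = 3.156e7

band_area = 2 * math.pi * GEO_RADIUS_M * 1000          # 1 km-wide band, m^2
energy_per_year = band_area * SOLAR_CONSTANT * SECONDS_PER_YEAR  # joules

BARRELS = 1.2e12              # ~1.2 trillion barrels of proven reserves (est.)
JOULES_PER_BARREL = 6.1e9     # rough thermal energy content of a barrel
oil_energy = BARRELS * JOULES_PER_BARREL

print(f"band: {energy_per_year:.1e} J/yr, oil: {oil_energy:.1e} J")
# both quantities come out near 1e22 J, so the "nearly equal"
# claim is at least the right order of magnitude
```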

Huge solar arrays placed into a continuously and intensely sunlit orbit around the earth would be able to generate gigawatts of electrical energy that could be electromagnetically beamed back to earth. The receiving stations down on the ground would be designed to deliver the power to the existing electrical grid, convert it into synthetic hydrocarbon fuels, or even broadcast it directly to consumers.

"In the coming century we will need to find as much energy as the world uses today in green form, not just once, but two, three, or more times over," Mankins noted. "And in technological competitiveness, we need to do ambitious things as a nation to renew our technological strength in all areas."

A U.S.-funded demonstration would engage the interest of foreign governments concerned about future energy demands, the report's authors noted. Moreover, full deployment of the technology in space would help nations avoid future military conflicts over increasingly scarce energy resources, they said.


Lunar Lander Challenge: DIY engineers try to fly their homemade rockets

As you know, NASA has said it will be going back to the moon for human missions, and other governments have said they'll do the same. The big event at X Prize Cup 2007 is the Northrop Grumman Lunar Lander Challenge, in which DIY engineers try to fly their homemade rockets from one concrete pad to another, 100 meters away. NASA put up the $2 million in prize money, hoping to get a sense of how a new generation of mooncraft might look. Instead of paying hundreds of millions to a giant corporation for paper plans, NASA, along with Northrop Grumman, is checking out the crowd-sourcing approach to space exploration. I spoke to William Pomerantz, the director of Space Projects for The X Prize Foundation and the man overseeing the competition.

William Pomerantz: It's an annual competition for teams that can build a rocket with the power required to go from lunar orbit to the lunar surface and back.
The Apollo LM (lunar module), which was built by the Grumman Corporation and did the job perfectly every time, has been retired. They are all in museums. No one has tried to do that job again in the last 35 years. Right now there is not a spacecraft that can do the job.

NASA, by spending an incredibly small amount of money for them, hopes to get people thinking in this direction. The teams are designing vehicles, first on paper and then building models, and working the kinks out. And they are showing how, in a short time period, a small group of people can build, test and get one of these vehicles operational many times within a couple of hours. Even better for NASA is that multiple teams are doing this. That was one reason we took that $2 million and, rather than winner take all, split it four ways. Levels one and two are like JV and varsity, and each of those levels is divided into first and second prizes.

The idea was to give teams a stepping stone. We knew level two was incredibly difficult, so level one provides them with a benchmark somewhere in between, so they can have a pat-on-the-back moment, get some seed money into their systems, and have a way to promote themselves to investors or possible partners. And by offering a second prize, if we have a clear frontrunner there would still be incentives for other teams to keep working.


New room added at space station

The US space shuttle Discovery linked up with the International Space Station (ISS) yesterday on a mission to prepare the orbital outpost for new European and Japanese laboratories.
With shuttle commander Pamela Melroy at the controls, Discovery eased up to the station and latched onto a docking port at 8:40 a.m. EDT.
Working both outside the station and within it, the astronauts moved the Harmony module, which will serve as a connection point for two new laboratories for the station, to a temporary location on the side of the station.
The space station’s robot arm, operated by Stephanie Wilson and Daniel Tani, smoothly moved the 16-ton module out of the shuttle and onto the station, where automatic bolts secured it in place in a temporary home on the left side of the station’s living quarters.

The work outside was more strenuous. Astronauts Scott E. Parazynski and Douglas Wheelock began their spacewalk shortly after 6 a.m. Eastern time. They prepared the Harmony module for its removal from the shuttle’s payload bay and performed some of the preliminary work for the other big task of the mission, moving an enormous set of solar arrays and the truss they stand on from their initial position atop the station to the permanent home on the far end of the truss on the station’s left side.

So far, technical difficulties on the mission have been minor.

So little insulating foam was shed from the shuttle’s external tank that mission managers have determined that a more focused inspection of the shuttle’s heat shield is unnecessary. When that word was passed up to the shuttle on Thursday afternoon before the crew sleep period was to begin, the shuttle commander, Pamela A. Melroy, responded enthusiastically, ”Oh, man, that is fantastic news.”

She said that it was a relief to know that tile and panel damage was not a concern and that they would be able to take the time that would have gone to inspection and use it to further prepare the Harmony module for entry. “We just can’t wait to get inside,” Ms. Melroy said.

The spacewalk, for the most part, went smoothly. The astronauts struggled occasionally with balky bolts and hose connectors, which are optimistically called "quick-disconnect" devices. They were wary of the small amounts of frozen ammonia that drifted away from some of the hoses, because it could contaminate the atmosphere within the station if brought in on the space suits. The amount of ammonia, which is used as a coolant, was small.

At one point, Paolo Nespoli, the Italian astronaut who was coordinating the spacewalk from inside the station, asked his colleagues to take a small break to enjoy what might be the greatest perk of working in space: the view. He asked them to look over the starboard side as the station passed over Houston.

The two spacewalkers oohed and aahed as the familiar coastline slid by below.

“Hello, Houston!” Dr. Parazynski said.

The spacewalkers were back in the airlock before noon.

Over the communications system, Ms. Melroy congratulated Dr. Parazynski and Mr. Wheelock on the work of the entire team, which she said she watched while making lunch for the crew.

Dual-3G laptop chip by Qualcomm: fostering compatibility with cellular broadband technologies

Leading wireless chipmaker Qualcomm on Wednesday launched a chip that will make it easier to build laptops compatible with the two dominant cellular broadband technologies in the United States.

Currently, business-oriented laptops are generally available with chips for AT&T's, Verizon Wireless' or Sprint Nextel's networks. AT&T's network uses a technology called HSPA, or High-Speed Packet Access, while Sprint and Verizon Wireless use EV-DO, or Evolution-Data Optimized.

Both network technologies are also being rolled out overseas, with HSPA being the dominant choice.

Qualcomm's new Gobi chip can connect to either type of network, which should make it easier for laptop users to shop around for the carrier that has the best coverage and prices in their area.

The chips are available immediately, and Qualcomm expects them to appear in laptops in the second quarter of next year.

The chips may increase the choices for cellular broadband users, but those networks are competing not just with each other but with WiMax, another long-range wireless technology that promises higher data speeds on a network that's cheaper to build. Qualcomm's chip does not support WiMax.

In the U.S., Sprint is building a WiMax network, in alliance with Clearwire, which already has a network in parts of the country.

More about Qualcomm

The world's leading wireless operators, device manufacturers and content providers are partnering with Qualcomm to revolutionize the mobile marketplace. With significant investments in research and development, we're creating the wireless technologies that power the mobile experience that consumers want now. And to ensure our partners' success in the wireless broadband evolution, Qualcomm is innovating the wireless device, service and network landscape with solutions that converge communication, computing and CE platforms into a seamless mobile broadband experience.

