
Saturday, November 10, 2007

Sony Ericsson announces first USB Mobile Broadband Modem

Sony Ericsson announced the MD300, a state-of-the-art business tool that allows busy professionals to quickly and easily connect their PCs or laptops to the internet whenever and wherever they need. The first Sony Ericsson Mobile Broadband product with a USB interface, the MD300 is simple to install and start up. It comes pre-installed with everything you need to go online, including PC software.

"The MD300 combines high data download speeds with simple installation and configuration, making it the ideal choice for busy professionals who demand no-compromise connectivity," says Johan Tysklind, Marketing Director Mobile Computing at Sony Ericsson. "The MD300 is the first offering in our forthcoming portfolio of mobile broadband products that will combine Sony Ericsson's dependability with a stylish form factor."

As compact and easy to use as a USB memory stick, the Sony Ericsson MD300 is a mobile broadband modem that plugs into any standard USB port. With Microsoft Windows software and drivers built in, the modem installs and configures itself when plugged in, so getting connected takes only seconds. Mac drivers will be available for download. The MD300's simple, intuitive Wireless Manager interface makes using it a breeze. This PC interface will also give instant access, through a pre-configured button, to the PlayNow mobile entertainment experience from Sony Ericsson.

The MD300 is all you need to get your mobile office up to speed, whether you're on an HSDPA, UMTS, EDGE or GPRS network anywhere in the world. Its portfolio of accessories, including a desk stand and USB extension cable, makes it versatile enough to use with your laptop or desktop computer at home or in your hotel room.

The MD300 will be available in luxury black and metallic silver in selected markets from Q1 2008.


The world's smallest double slit experiment

To perform the experiment, a supersonic jet of hydrogen (source at bottom) is ionized by a beam of x-rays from the Advanced Light Source (not shown). The doubly photoionized molecule blows apart, and the protons (red) strike the detector at left while the electrons (blue), trapped in a magnetic field, strike the detector at right. The energy of all the particles and the original orientation of the molecule can be determined from the measured results.

The big world of classical physics mostly seems sensible: waves are waves and particles are particles, and the moon rises whether anyone watches or not. The tiny quantum world is different: particles are waves (and vice versa), and quantum systems remain in a state of multiple possibilities until they are measured -- which amounts to an intrusion by an observer from the big world -- and forced to choose: the exact position or momentum of an electron, say.

On what scale do the quantum world and the classical world begin to cross into each other? How big does an "observer" have to be? It's a long-argued question of fundamental scientific interest and practical importance as well, with significant implications for attempts to build solid-state quantum computers.

Researchers at the Department of Energy's Lawrence Berkeley National Laboratory and their collaborators at the University of Frankfurt, Germany; Kansas State University; and Auburn University have now established that quantum particles start behaving in a classical way on a scale as small as a single hydrogen molecule. They reached this conclusion after performing what they call the world's simplest -- and certainly its smallest -- double slit experiment, using as their two "slits" the two proton nuclei of a hydrogen molecule, only 1.4 atomic units apart (a few ten-billionths of a meter). Their results appear in the November 9, 2007 issue of Science.

The double slit experiment

"One of the most powerful ways to explore the quantum world is the double slit experiment," says Ali Belkacem of Berkeley Lab's Chemical Sciences Division, one of the research leaders. In its familiar form, the double slit experiment uses a single light source shining through two slits, side by side in an opaque screen; the light that passes through falls on a screen.

If either of the two slits is closed, the light going through the other slit forms a bright bar on the screen, striking the screen like a stream of BBs or Ping-Pong balls or other solid particles. But if both slits are open, the beams overlap to form interference fringes, just as waves in water do, with bright bands where the wavecrests reinforce one another and dark bands where they cancel.
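The bright and dark bands described above follow from a simple rule: constructive interference where the path difference between the two slits is a whole number of wavelengths, destructive where it is a half-integer number. A generic textbook calculation of the fringe pattern (not part of the Berkeley experiment; the wavelength, slit spacing and screen distance below are illustrative values):

```python
import math

def double_slit_intensity(x, wavelength, slit_separation, screen_distance):
    """Relative intensity at position x on the screen (small-angle approximation).

    Bright fringes appear where the path difference d*x/L is a whole number
    of wavelengths; dark fringes where it is a half-integer number.
    """
    phase = math.pi * slit_separation * x / (wavelength * screen_distance)
    return math.cos(phase) ** 2

# Illustrative numbers: green light, 0.1 mm slit spacing, 1 m to the screen.
lam, d, L = 500e-9, 1e-4, 1.0

center = double_slit_intensity(0.0, lam, d, L)                 # central bright band
first_dark = double_slit_intensity(lam * L / (2 * d), lam, d, L)   # first dark band
first_bright = double_slit_intensity(lam * L / d, lam, d, L)       # next bright band
print(center, first_dark, first_bright)
```

With both slits open the intensity oscillates between these extremes across the screen; with one slit closed the cosine term disappears and so do the fringes.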

So is light particles or waves? The ambiguous results of early double slit experiments (the first on record was in 1801) were not resolved until well into the 20th century, when it became clear from both experiment and the theory of quantum mechanics that light is both waves and particles -- moreover, that particles, including electrons, also have a wave nature.

"It's the wave nature of electrons that allows them to act in a correlated way in a hydrogen molecule," says Thorsten Weber of the Chemical Sciences Division, another of the experiment's leading researchers. "When two particles are part of the same quantum system, their interactions are not restricted to electromagnetism, for example, or gravity. They also possess quantum coherence -- they share information about their states nonlocally, even when separated by arbitrary distances."

Correlation between its two electrons is actually what makes double photoionization possible with a hydrogen molecule. Photoionization means that an energetic photon, in this case an x-ray, knocks an electron out of an atom or molecule, leaving the system with net charge (ionized); in double photoionization a single photon triggers the emission of two electrons.

"The photon hits only one electron, but because they are correlated, because they cohere in the quantum sense, the electron that's hit flies off in one direction with a certain momentum, and the other electron also flies off at a specific angle to it with a different momentum," Weber explains.

The experimental set-up used by Belkacem and Weber and their colleagues, being movable, was employed on both beamlines 4.0 and 11.0 of Berkeley Lab's Advanced Light Source (ALS). In the apparatus a stream of hydrogen gas is sent through an interaction region, where some of the molecules are struck by an x-ray beam from the ALS. When the two negatively charged electrons are knocked out of a molecule, the two positively charged protons (the nuclei of the hydrogen atoms) blow themselves apart by mutual repulsion. An electric field in the experiment's interaction region separates the positively and negatively charged particles, sending the protons to one detector and the electrons to a detector in the opposite direction.

"It's what's called a kinematically complete experiment," Belkacem says, "one in which every particle is accounted for. We can determine the momentum of all the particles, the initial orientation and distance between the protons, and the momentum of the electrons."

What the simplest double slit experiment reveals

"At the high photon energies we used for photoionization, most of the time we observed one fast electron and one slow electron," says Weber. "What we were interested in was the interference patterns."

Considered as particles, the electrons fly off at an angle to one another that depends on their energy and how they scatter from the two hydrogen nuclei (the "double slit"). Considered as waves, an electron makes an interference pattern that can be seen by calculating the probability that the electron will be found at a given position relative to the orientation of the two nuclei.

The wave nature of the electron means that in a double slit experiment even a single electron is capable of interfering with itself. Double slit experiments with photoionized hydrogen molecules at first showed only the self-interference patterns of the fast electrons, their waves bouncing off both protons, with little action from the slow electrons.

"From these patterns, it might look like the slow electron is not important, that double photoionization is pretty unspectacular," says Weber. The fast electrons' energies were 185 to 190 eV (electron volts), while the slow electrons had energies of 5 eV or less. But what happens if the slow electron is given just a bit more energy, say somewhere between 5 and 25 eV? As Weber puts it, "What if we make the slow electron a little more active? What if we turn it into an 'observer?'"
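What makes the two protons act as slits for the fast electron is that its de Broglie wavelength at these energies is comparable to the 1.4 a.u. internuclear distance. A back-of-the-envelope, non-relativistic estimate (the 190 eV energy and 1.4 a.u. spacing are from the article; the rest is standard constants):

```python
import math

H = 6.626e-34        # Planck constant, J*s
M_E = 9.109e-31      # electron mass, kg
EV = 1.602e-19       # joules per electron volt
BOHR = 5.292e-11     # one atomic unit of length, m

def de_broglie_wavelength(energy_ev):
    """Non-relativistic de Broglie wavelength: lambda = h / sqrt(2*m*E)."""
    momentum = math.sqrt(2 * M_E * energy_ev * EV)
    return H / momentum

lam = de_broglie_wavelength(190)        # the fast electron from the article
print(lam / BOHR)                       # wavelength in atomic units
```

The result comes out near 1.7 a.u., the same scale as the 1.4 a.u. "slit" spacing, which is why the fast electron's waves scattering off both protons produce a visible interference pattern.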

As long as both electrons are isolated from their surroundings, quantum coherence prevails, as revealed by the fast electron's wavelike interference pattern. But this interference pattern disappears when the slow electron is made into an observer of the fast one, a stand-in for the larger environment: the quantum system of the fast electron now interacts with the wider world (e.g., its next neighboring particle, the slow electron) and begins to decohere. The system has entered the realm of classical physics.

Not completely, however. And here is what Belkacem calls "the meat of the experiment": "Even when the interference pattern has disappeared, we can see that coherence is still there, hidden in the entanglement between the two electrons."

Although one electron has become entangled with its environment, the two electrons are still entangled with each other in a way that allows interference between them to be reconstructed, simply by graphing their correlated momenta from the angles at which the electrons were ejected. Two waveforms appear in the graph, either of which can be projected to show an interference pattern. But the two waveforms are out of phase with each other: viewed simultaneously, interference vanishes.

If the two-electron system is split into its subsystems and one (the "observer") is thought of as the environment of the other, it becomes evident that classical properties such as loss of coherence can emerge even when only four particles (two electrons, two protons) are involved. Yet because the two electron subsystems are entangled in a tractable way, their quantum coherence can be reconstructed. What Weber calls "the which-way information exchanged between the particles" persists.

Says Belkacem, "For researchers who are trying to build solid-state quantum computers this is both good news and bad news. The bad news is that decoherence and loss of information occur on the very tiny scale of a single hydrogen molecule. The good news is that, theoretically, the information isn't necessarily lost or at least not completely."


The Open Source Time Machine Replicant


Some members of the open source community have taken such a shine to Time Machine, they have been working on bringing the feature to Linux, even while some Mac users are complaining about problems with the technology.

Dubbed "Flyback," the project has been working on developing an rsync-based GUI (graphical user interface) solution for Linux with Time Machine as the model.

Flyback lacks the OpenGL-based 3-D effects of Time Machine, as well as many of the other features of Apple's (Nasdaq: AAPL) implementation.

Function vs. Form

The developers' focus has instead been on the principal purpose of Time Machine, which is snapshot-based backups that allow you to revisit previous incarnations of your hard drive.
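The snapshot idea behind Time Machine, and behind the rsync technique it echoes, rests on hard links: in each new snapshot, unchanged files are hard links to the previous snapshot's copies, so every snapshot looks like a complete backup while only changed files consume new space. A minimal sketch of that idea in plain Python (a real tool such as Flyback drives rsync with its --link-dest option instead; the function name and the size/mtime change test here are illustrative):

```python
import os
import shutil

def take_snapshot(source, snapshots_dir):
    """Create a new snapshot of `source` under `snapshots_dir`.

    Files identical to those in the previous snapshot (same size and
    mtime) are hard-linked rather than copied, so unchanged data is
    stored on disk only once, yet every snapshot is a full tree.
    """
    existing = sorted(os.listdir(snapshots_dir)) if os.path.isdir(snapshots_dir) else []
    previous = os.path.join(snapshots_dir, existing[-1]) if existing else None
    snap = os.path.join(snapshots_dir, "snap-%03d" % len(existing))
    for root, dirs, files in os.walk(source):
        rel = os.path.relpath(root, source)
        dest_root = os.path.join(snap, rel)
        os.makedirs(dest_root, exist_ok=True)
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(dest_root, name)
            st = os.stat(src)
            prev = os.path.join(previous, rel, name) if previous else None
            if prev and os.path.exists(prev):
                pst = os.stat(prev)
                if (st.st_size, int(st.st_mtime)) == (pst.st_size, int(pst.st_mtime)):
                    os.link(prev, dst)      # unchanged: hard link, no new space
                    continue
            shutil.copy2(src, dst)          # new or changed: real copy
    return snap
```

Revisiting a "previous incarnation of your hard drive" is then just a matter of copying files back out of the snapshot directory for the date you want.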

An interesting aspect of the project is the fact that an Apple technology is being looked on with admiration from at least some quarters of the open source community.

On the Flyback site at Google (Nasdaq: GOOG) Code, the About section said, "Apple's Time Machine is a great feature in their OS," adding that Linux has "almost all of the required technology" already built in to recreate it.

The Interface Debate

In a thread at Slashdot on Flyback, much of the discussion has centered around how important Apple's GUI for Time Machine is to making it a good product.

In that discussion, for instance, Slashdot member "robot love" noted, "I realize the interface doesn't do the heavy lifting in an application, but I wish the FLOSS crowd would finally clue in to the fact that ease-of-use matters."

The rest of the discussion on this issue shows how perceptions of Apple and Mac OS X have shifted in a positive direction in recent years.

While researching for this article, we also found a how-to based on Time Machine principles. In that article, the author wrote that Time Machine itself harkens back to Linux techniques written about some 10 years ago by Mike Rubel.


Nanotechnology focuses on new applications

Scientific advances today are accomplished at the intersections of various fields, according to Frans Johansson's brilliant book, "The Medici Effect." Breakthroughs come when disparate disciplines collide in new ways. This innovation is readily seen in nanotechnology, or the creation and use of materials -- even machines -- at the atomic or molecular scale. While the "sexiest" nanotechnology focuses on new applications, many possibilities exist to vastly improve existing techniques and procedures.

I got a lesson on one such potential use recently at Oak Ridge National Lab, which -- by design -- is sort of a "Medici effect" all its own, meaning the lab steers scientists from various fields into multidisciplinary efforts to solve vexing problems. Being a strategy consultant to Oak Ridge, I'm like a kid in a candy shop when it comes to receiving briefings from lab scientists because -- no matter the project -- it's easy to imagine real-world applications ranging far beyond the subject at hand.

As an expert on globalization, I focus a lot on transparency, with my analytic mantra being, "connectivity drives code." By that I mean, the more you engage the larger world (connectivity), the more you become subject to rules (code).

Want to live all by yourself in a shack in the woods? That means fewer rules for you, because your code is simply shutting yourself off from the outside world.

Want to travel all over this planet and engage in all sorts of commerce? That's going to mean a whole lot more rules apply, and with all those rules comes an abundance of transparency. You will be increasingly tracked, tagged and located by networks.

But what if you're someone who believes in that more primitive, isolated life and you're willing to fight and kill and die to impose that choice on others?

If that's your chosen ideology, then you will destroy other people's connectivity to keep that integrating world at bay. You'll live largely off the grid and engage global networks for the twin purposes of winning converts and sowing chaos.

In short, you'll leave no traces, just destruction, so tracking you will be no mean trick. You're like a criminal who doesn't want to leave any fingerprints behind.

Police have detected fingerprints at crime scenes for over a century to identify culprits. In the old days, the primary method involved spreading "fairy dust" (i.e., various powders) over surfaces suspected of containing fingerprints -- hence "dusting for prints." If the suspect left behind oily enough prints, the dust would stick to them and reveal identifying information. Your fingers get oily, for example, when you touch oily body parts like your face or hair.

But say our suspect is more careful, washing his hands or using gloves or leaving prints solely on harder-to-dust surfaces, like certain metals or plastic bags or a victim's skin.

By the late 1980s the new gold standard in lifting "cleaner" prints involved superheating special glue until it vaporized and could bond with the targeted fingerprint, creating a sort of protected cast visible to the naked eye. This technology was superior to dusting because it could reveal prints based on less residual material, interacting with base components such as amino acids and glucose.

But this technology still suffered a time limit: the longer a print dried out, the fewer chemical components were left behind to react to the superglue.

So today's cutting edge in fingerprinting involves boosting the signal, so to speak. You want to be able to compile an identifying print from the slimmest amount of biological residue left behind.

By constructing new forms of dust employing nano-engineered shapes (e.g., rods, cubes, spheres, pyramids), scientists are figuring out how to enhance the most difficult-to-obtain fingerprints. These particles are used to shift the wavelength of light that is directed against targeted surfaces, resulting in an identifiable scattering signal.

Where can this go?

How about a rape kit that lifts the perpetrator's prints off the victim's body? Or international inspectors scanning mass graves to gather evidence for a war crimes prosecution? Or ... you get the idea.

In this increasingly connected world, it's our inability to finger bad actors that -- in the end -- allows them to create the most terror. Make better fairy dust, crack tougher codes, connect more dots, create more transparency, and you've got fewer bad actors.

In this global war, the smallest things will matter most.

Thomas P.M. Barnett is a distinguished strategist at the Oak Ridge Center for Advanced Studies and senior managing director of Enterra Solutions LLC.



US newspapers to build online advertising network

A group of US newspapers is considering joining forces to form a 'one-stop' online marketing network.

Reports suggest a consortium featuring Tribune, Gannett, Media News, Hearst and Cox Newspapers could provide competition to Yahoo's network, which was established in 2006.

Tribune alone owns the Chicago Tribune, the LA Times, Newsday and a number of TV stations.

Although unwilling to comment on the deal itself, Tim Landon, president of Tribune Interactive, told the Chicago Tribune: "We have ongoing discussions all the time with newspaper companies and online companies. We have a good relationship with all of them."

The paper reports the deal is needed to help bolster revenue as print advertising slows.

It is also thought that a number of other newspaper groups could join the network. The Washington Post and McClatchy are two names that have been linked with a possible expansion.

Commenting on the rumours, Leon Levitt, the vice president of digital media at Cox, told the Tribune "the more the better".


Google and Microsoft look to join behavioural ad targeting body

Behavioural internet advertising is an extremely effective way of reaching a huge audience, but it is causing some controversy in the US.

A representative of the Texas attorney general's office told ClickZ News that some states in the US could be looking to target the behavioural ad sector and certain online advertising firms.

A possibility is that the Federal Trade Commission could launch an investigation into possible privacy infringements.

However, the industry itself believes that self-regulation is adequate, with most of the major internet players signing up to the Network Advertising Initiative (NAI).

The NAI is the primary self-regulatory body for behavioural ad technologies. Yahoo is already a member, and Microsoft and Google are both also looking to join.

Behavioural ad targeting is becoming increasingly refined. The growth of social networks, on which users post personal information, is likely to mean that online advertisers will be able to pinpoint their audiences exactly and not bother internet users with irrelevant promotional material.



Finally: Google massacres PageRank of spammy paid links

A recent Google algorithm update has led to many sites' PageRank dropping drastically, although almost all of Direct Traffic's clients saw their own results improve.

This 'PageRank massacre' saw websites penalised for selling links that were random, site-wide or placed in big blocks, with many prominent publishers affected. Several well-known sites dropped from PR7 to PR5, while Search Engine Guide and Search Engine Journal both slipped from PR7 to PR4. Other sites that were hit include Problogger, Andy Beard, Courtney Tuttle and StatCounter (which suffered a PR10 to PR6 fall).

Google has been equivocating about this latest update for some time, but Direct Traffic believes that the search engine cannot relinquish its reliance on links completely, since links are the key feature of its algorithm.
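PageRank itself is, at heart, the stationary distribution of a "random surfer" over the web's link graph, which is why selling links from high-ranked pages can distort it. A toy power-iteration sketch of the idea (heavily simplified: uniform damping, every page assumed to have at least one outbound link; page names are invented for illustration):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. `links` maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Each page keeps a baseline share and receives a damped share of
        # the rank of every page linking to it, split across that page's links.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            for q in outs:
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# "hub" is linked by both other pages and ends up with the highest rank.
graph = {"hub": ["a", "target"], "a": ["hub"], "target": ["hub"]}
print(pagerank(graph))
```

A paid site-wide link works by inserting extra edges into this graph, funnelling rank toward the buyer, which is exactly the signal Google is now trying to discount.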

Moving forward, the company will improve its algorithm in a slow, measured way so as not to throw search listings into chaos.

It will first of all remove those paid link offenders who are easiest to spot - for example, sites that have collections of links in the footer with no surrounding text, those with links from irrelevant websites, and link spammers (especially those using link churning - changing link positions or text at random).

Direct Traffic has always had a focus on responsible Link Building, and maintains that websites should always have the end user in mind.

Andy Boyd at Webmaint points out that Google is simply looking to root out poor quality link sellers, and that so long as you keep your quality high you will actually benefit as these players are removed.

Eric Ward from Search Engine Land has added his voice, welcoming the devaluation of web directories - which Direct Traffic has never valued as a long-term solution for link sourcing.

Direct Traffic welcomes the moves being made by Google to improve website relevance by taking the low-quality link sellers out of the picture.

"Google is just making relevancy count more, for us that is always much more of a concern than PR," says James Helliwell from Team DT.

Andy Beard, whose site was negatively affected, speculates that Google may be keeping blog network interlinking down.

"Many of the reputable sources that have received a penalty are part of extensive blog networks, and they have one factor in common," he wrote. "They have massive interlinking between their network sites.

"They may also sell links or advertising that passes PageRank on some of their less visible properties, but those properties benefit from the high PageRank sites that link to them, with Sitewide Links.

"Some of these sites have been known to add or knock millions off of the price of Apple shares in the past, what do you think it is going to do to Google?"


Social Networks Find Ways To Monetize User Data

Sites are quickly finding ways to help advertisers parlay user data and actions into targeted marketing.

Social networks are sitting on a treasure trove of personal data in the form of profiles chock full of information about the people who use their sites. These sites are quickly finding ways to turn this data into sources of income by giving advertisers opportunities to use it for targeted marketing.

This explains why, a week after consumer groups asked the Federal Trade Commission to look into youth-oriented marketing at social networking sites, Facebook and MySpace -- the two largest social networks -- announced plans to give advertisers better tools to reach their predominantly youthful audience.

Much of the data is supplied by social network users themselves--age, gender, ZIP code, phone number, schools--and other information is generated by users' actions, such as lists of friends and groups they belong to. This data is used to segment the audience into specific categories--drinkers, sports enthusiasts, and so on--for targeted marketing.
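Mechanically, this kind of segmentation is simple attribute filtering over profile records. A hypothetical sketch of the idea (the field names, rules and categories here are invented for illustration, not any network's actual schema):

```python
# Invented example profiles of the kind a social network might hold.
profiles = [
    {"name": "ana", "age": 22, "zip": "94110", "groups": ["Craft Beer Fans", "Runners"]},
    {"name": "ben", "age": 35, "zip": "60601", "groups": ["Wine Club"]},
    {"name": "cal", "age": 19, "zip": "94110", "groups": ["Runners", "Ski Trips"]},
]

# Each segment is a rule applied to a profile.
segments = {
    "drinkers": lambda p: any("Beer" in g or "Wine" in g for g in p["groups"]),
    "sports_enthusiasts": lambda p: any(g in ("Runners", "Ski Trips") for g in p["groups"]),
    "youth": lambda p: p["age"] < 25,
}

def segment_audience(profiles, segments):
    """Map each segment name to the users whose profile matches its rule."""
    return {name: [p["name"] for p in profiles if rule(p)]
            for name, rule in segments.items()}

print(segment_audience(profiles, segments))
```

An advertiser then buys impressions against a segment ("youth in ZIP 94110") rather than against individual, identified users.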

What makes MySpace and Facebook data so valuable to advertisers is that it provides insight into the habits and affinities of their audiences, which skew toward the desirable youth demographic.

Facebook's new social approach to advertising, called Facebook Ads, lets advertisers "target exactly who they want based on how people affiliate themselves," says Chamath Palihapitiya, Facebook's VP of product marketing and operations.

For example, a restaurant called Junnoon has created a Facebook page that lets users make reservations. When a customer does so, the person's Facebook friends get alerted through the social newsfeed, and Junnoon has the option of buying a sponsored message to accompany that notification.

No user data is shared with advertisers using Facebook Ads. However, third-party developers who create applications for the Facebook platform can make use of social data, using an application called Facebook Beacon.

Fandango uses Beacon to let movie ticket buyers share their entertainment plans with their friends on Facebook. Users can opt out of Beacon's information-sharing system, Palihapitiya says. Facebook doesn't provide a way to opt out of social ads.

Third-party developers that create applications for the Facebook platform also can access and share data about users, depending on their privacy settings, including personal data, though not contact information.

Users can opt out of this sort of data sharing, but there's a catch: while Facebook says in its privacy policy that it has taken steps to restrict possible misuse of such information, it adds that "we of course cannot and do not guarantee that all platform developers will abide by such agreements."

Facebook launches online advertising system to rival MySpace

Social networking site Facebook has launched a new online advertising system that will allow companies to target specific audiences.

Under the scheme, advertisers will be able to build their own pages and allow users to identify themselves as fans of a product. Users can view related media and become friends or 'fans' of the product, allowing their details to be used in an advertisement.

"What we're building here is a massive network of real world connections through which people can share information," said the company's chief executive, Mark Zuckerberg.

"In essence, these social ad feeds turn users into brand ambassadors," Business Week writes. "Users are continually promoting what they like and what they've bought to their online connections, who, ideally, value their opinions and may even share their interests."

This move is ideal for internet advertisers looking to target specific audiences in the fast-growing world of social networking.

Rival site MySpace has also launched a targeted ad platform, called SelfServe, that will allow companies to place internet advertising.



A Planetary System That Looks Familiar

They say there is no place like home, but it is beginning to look as if there is a place sort of like home 41 light-years from here in the constellation Cancer.

Astronomers reported Tuesday that there were at least five planets circling a star there known as 55 Cancri, where only four had been known before, making it the most extensive planetary system yet found outside our own. It is also the one that most resembles our solar system, with a giant planet orbiting far out from the star and four smaller ones circling closer in.

The new addition to the system circles 55 Cancri at roughly the distance of Venus in our own solar system, in the so-called habitable zone where it is warm enough for liquid water. But, with 45 times the mass of Earth, the planet is more apt to resemble Neptune or Saturn than Earth, and thus would be a deadly environment for any kind of life that we know.

"It's a system that appears to be packed with planets," Prof. Debra Fischer of San Francisco State University said of 55 Cancri. She is the leader of the team that reported its results in a paper to be published in The Astrophysical Journal and in a telephone news conference on Tuesday from NASA's Jet Propulsion Laboratory in Pasadena, Calif.

The scientists said the discovery augured well for the chance that with time and more data, astronomers would find places out there that look like home. They also said it marked the beginning of a transition between studying planets and studying planetary systems.

Another team member, Geoff Marcy, a professor at the University of California, Berkeley, said the discovery had him "jumping out of my socks." He said, "We now know our Sun and its family of planets is not unusual."

Jonathan I. Lunine, a professor at the University of Arizona who was not part of the work, said that astronomers were on the verge of beginning to answer a question posed by Albertus Magnus, the medieval German philosopher and priest who wondered whether there was but one world or many worlds. We now know, Dr. Lunine said, "how lonely the universe is, how far we live from distant stars."

In the last decade, about 250 planets have been discovered around other stars - the vast majority of them by the so-called wobble technique of monitoring a star's light for signs of the slight to-and-fro motion induced by the gravitational tugs of orbiting planets.

As technology and techniques have improved, the planet hunters have been able to move down the scale from Jupiter-size planets to ones only a few times as massive as Earth. But detecting rocky planets like Earth is probably beyond the current technology and must await future space-based missions, the astronomers admit.
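The wobble technique works because star and planet orbit their common center of mass: the star's reflex velocity is the planet's orbital velocity scaled down by their mass ratio. A rough circular-orbit estimate, using Sun-Jupiter numbers as a sanity check and then the new 45-Earth-mass planet (assuming a roughly solar-mass star; eccentricity and orbital inclination are ignored):

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # kg
M_EARTH = 5.972e24     # kg
AU = 1.496e11          # m

def reflex_velocity(star_mass, planet_mass, orbit_radius):
    """Star's wobble speed (m/s) for a circular orbit viewed edge-on."""
    v_planet = math.sqrt(G * star_mass / orbit_radius)   # planet's orbital speed
    return v_planet * planet_mass / star_mass            # scaled by mass ratio

# Jupiter (~318 Earth masses at 5.2 AU) tugs the Sun at roughly 12-13 m/s.
print(reflex_velocity(M_SUN, 318 * M_EARTH, 5.2 * AU))

# The new 55 Cancri planet: ~45 Earth masses at ~0.75 AU, solar-mass star assumed.
print(reflex_velocity(M_SUN, 45 * M_EARTH, 0.75 * AU))
```

Both signals are a few meters per second, which is why an Earth-mass planet, whose tug is a hundred times smaller still, sits below the current detection threshold.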

One of the first of these "exoplanets" discovered, in 1996, was at 55 Cancri. Dr. Fischer and her colleagues have been observing that star for 18 years, adding more planets to the list of its retinue as they have made their presence known.

The outermost and heaviest planet in the system, which is four times as massive as Jupiter, circles at a distance of 500 million miles, slightly farther than Jupiter in our own system, and takes 14 years to complete an orbit.

The star's three innermost planets all circle more tightly than Mercury, at distances from 3.5 million to 22 million miles. The closest of the three is also the smallest, only 18 times as massive as Earth and surely permanently scorched.

The new planet, which Dr. Fischer called "one of the more annoying planets" because it resisted being folded into their mathematical models for such a long time, basks in the lukewarm light of its star from a distance of around 70 million miles, taking 260 days to complete one orbit. Although too massive for life itself, Dr. Marcy said, the planet could harbor rocky moons, just as Saturn and Neptune in our own solar system do, and these would be warmed to the same lukewarm temperatures as Earth.
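The quoted distance and period hang together through Kepler's third law, P² ∝ a³/M. A quick consistency check, assuming 55 Cancri's mass is roughly solar (it is in fact slightly below one solar mass, which would lengthen the period a little):

```python
MILES_PER_AU = 9.296e7   # one astronomical unit is about 93 million miles

def orbital_period_days(a_au, star_mass_solar=1.0):
    """Kepler's third law in solar units: P[yr]^2 = a[AU]^3 / M[solar]."""
    return 365.25 * (a_au ** 3 / star_mass_solar) ** 0.5

# ~70 million miles is ~0.75 AU; the article quotes a 260-day period.
print(orbital_period_days(70e6 / MILES_PER_AU))
```

The result lands within about 10% of the quoted 260 days, which is as close as the rounded figures in the article allow.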

The moons would have to be as massive as Mars, however, in order to keep their water from escaping into empty space. Dr. Marcy said, "All bets are off on what evolutionary biology would be like on one of these moons."

The astronomers said they were also intrigued by the large gap -- a band about 450 million miles across -- between the new planet and the outermost one, in which they have detected nothing. There is a similar, but smaller, gap in our own solar system between Jupiter and Mars, caused by the disruptive effects of Jovian gravity on planetary formation. Dr. Lunine suggested that the more massive Cancri planet could have had a similar and deeper disruptive effect.

But the possibility remains that rocky planets could be lurking beneath detectability in that gap. Dr. Lunine said, "This gives us a name and an address to point our space telescopes at in the future."


New Windows on old PCs

Microsoft program puts new Windows on old PCs.

Under a new program announced Friday, large companies that sell refurbished PCs can get a brand spanking new copy of Windows to put on the machines--provided they pay Microsoft for the privilege.

The initiative, which provides refurbishers with a special version of Windows XP, could help save more machines from heading to the landfill. In many cases, though, it means Microsoft will be getting paid twice for putting Windows on the same PC.

That's because, to properly resell a refurbished PC using its original copy of Windows, Microsoft requires that resellers have either the "certificate of authenticity" that came with the PC or its restore disks--things that often get lost along the way. Businesses can also try to get a duplicate copy from the manufacturer, but that's a difficult and time-consuming process that doesn't scale well for the large refurbisher handling thousands of machines a month.

Microsoft won't say how much it is charging for the special versions of Windows XP, other than to say it is somewhat less than a computer maker would pay to put Windows on a new machine.

The company has had a smaller program that allowed refurbishers to put Windows onto machines destined for charities and educational institutions, but the new program addresses the broader market of PCs that are resold for general use.

For Microsoft, the refurbished PC market is an area worthy of more attention. The company did a study in 2004 and found that 20 million computers a year were being sold through formal refurbishment operations. The company estimates today that number has grown to 28 million, with growth in the refurbished market likely outpacing new PC sales growth.

"It's a part of the market that's been growing in both size and importance as PC specs improve and as countries tighten (their) environmental regulations," said Hani Shakeel, a senior product manager at Microsoft.

Today, rather than deal with the thorny licensing restrictions, many refurbishers just sell their PCs "naked"--that is, without any operating system--leaving it up to the buyer to install Linux or a full retail copy of Windows, or perhaps go the piracy route.

Microsoft is launching the program with two large refurbished-PC sellers on board, but hopes to sign up additional North American refurbishers as well as computer makers worldwide. PC makers already have the right to sell refurbished versions of PCs they made originally, but they could use the new program to resell other brands of PCs they get through various take-back programs.

The idea of more PCs getting reused is one that is hard to argue with, as more and more usable PCs sit idle because of the hassles and concerns associated with re-use. But it does seem to me that there should be an answer that doesn't necessitate paying Microsoft twice to run Windows on the same PC.

