
Tuesday, April 29, 2008

Intel and Cray team up on supercomputers


Intel is to embark on a partnership programme with supercomputing specialist Cray Inc.

The two companies will develop the main components for a new generation of supercomputers using multi-core chips and advanced interconnection methods.

The new systems will be targeted at traditional supercomputing markets, such as engineering calculations and scientific modelling and analysis.

Cray president and chief executive Peter Ungaro said: "This collaboration provides the HPC market segment with access to the best microprocessors the industry has to offer at any point in time, in the most advanced supercomputers in the world."

The two companies expect the initiative to bear fruit in 2010, when Cray plans to ship the first of its Cascade line of supercomputers.

The Cascade project, which is partially backed by a grant from the US Defense Advanced Research Projects Agency, is an attempt to use multiple processor types and computing methods in a 'hybrid' supercomputer.

The move also marks a win for Intel. Cray had originally planned to base Cascade on chips from rival vendor AMD.
More
Shares of supercomputer maker Cray Inc. rose Monday after an analyst upgraded the company, citing greater chip availability from Advanced Micro Devices Inc.

The Seattle-based company's stock rose 48 cents, or 7.6 percent, to $6.83.

Northland Securities analyst Chad Bennett raised his rating to "Outperform" from "Market Perform" and kept a $9 price target. The target implies he expects the stock to rise about 42 percent over Friday's $6.35 close.
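For readers who want to check that figure, the implied upside follows directly from the two prices quoted in the article; nothing else is assumed in this quick calculation:

```python
# Sanity check of the implied upside in the analyst's price target.
price_target = 9.00   # Bennett's price target, in dollars
friday_close = 6.35   # Friday's closing price

implied_upside = (price_target / friday_close - 1) * 100
print(f"Implied upside: {implied_upside:.1f}%")  # ~41.7%, i.e. "about 42 percent"
```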

Bennett said news from AMD that it will begin shipping its quad-core processors gives him "increasing comfort" on chip supply.

The company is set to post first-quarter results on Tuesday, but Bennett said investors should look at the rest of the year, especially after the AMD news.

GTA IV


HOW can Grand Theft Auto IV ever live up to the expectations bestowed upon it? From its first unveiling over a year ago, hype levels have been steadily escalating towards the stratosphere, and surely it would be impossible for Rockstar to satiate the lust of those deprived of a fresh Grand Theft Auto experience for nearly four years?

It's clear that this iteration of Grand Theft Auto is a markedly different beast to its predecessor San Andreas, and whatever your feelings about the last outing there's no doubting the series' new direction works towards creating a more immersive experience. Nearly all vestiges of videogame signifiers have been shorn away and for once it's perfectly valid to state that playing the game is akin to watching a cinema blockbuster. The HUD has been refined, appearing only when Niko is engaged in action, and the omnipresent map at the bottom left of the screen is often all that remains to remind onlookers that this is an interactive entertainment.

Videogame artefacts such as hidden packages have been omitted – though those who have a fetish for collecting need not worry, as in their place is a series of collectables of a more naturalised manner – and the jumps that have long marked out the Grand Theft Auto games are more subtly implemented. Indeed, it was only after a few hours' play that we realised they were still there, so well camouflaged were they amidst the architecture of Liberty City.

And what a creation the city is. Grand Theft Auto IV's Liberty City is one of the finest worlds we've seen in gaming since we gallivanted around the Hyrule of Ocarina of Time. It's a world that lives and breathes with its own authenticity, and with an effortlessness that has rarely been glimpsed in gaming to date. A storm comes in and the lighting engine paints the streets of Broker with a melancholic tint, passers-by erecting their umbrellas in the downpour while those more ill-prepared raise their suitcases above their heads and run for cover, and in the background a lone saxophonist plays under the shelter of a bandstand. The sun shines and the streets burst into life, light rays bouncing off the pavement and glistening over the bonnets of traffic. The level of detail informs every aspect of Liberty City, with even the most secluded alleyway exuding its own atmosphere and conspiring to make Grand Theft Auto IV's world one of the most complete witnessed to date.

This whole world is painted with its own distinct perspective - yes, this is a more realistic Grand Theft Auto than ever before, but it's also one of the most stylised entries, with a filter applied to the graphics attaining an effect that's akin to pointillism and at times can seem almost impressionistic. That's not to say there aren't occasional dips in framerate or texture creep, but none of this ever impinges on what is undoubtedly one of the most achingly beautiful videogame creations to date.

Above all it's a gritty creation, with the more down to earth and grimy nature of Niko's story reflected in each brick that builds Liberty City. At the beginning of the game, when confined to the Broker District by a terrorist threat, Niko can take a peek at what lies ahead of him in his quest, his view of the peaks of the skyscrapers of Algonquin filtered through the dirt of Liberty City that blights the lens. These details stretch to the interiors as well – this is a world of squalor, and no more is that evident than when climbing a flight of stairs to Niko's first abode in the slums of Broker, the strip lights humming and a muffled television audible through the door of a neighbouring flat.

This Liberty City is unmistakably a mirror image of contemporary New York, and with this fresh focus Rockstar has delivered one of its most potent satires yet. From the terrorist alert that initially locks down the city to the feeds from Weasel News that beautifully ape a certain real-life feral news service reporting on Niko's more outlandish escapades, to the mayoral election that is so brutally fought out over the airwaves between the fictional Michael Graves and John Hunter, it's never too difficult to ascertain the real-life sources for Grand Theft Auto IV's swipes.


Spyware targets frustrated GTA IV gamers
Gamers desperate to get their mitts on Grand Theft Auto IV are being targeted in an opportunistic spyware scam. Spam emails offer prospective marks free entry to a draw offering a PlayStation 3 loaded with the much-anticipated game as a prize.

In reality, these illicit emails are loaded with spyware designed to swipe personal financial information from compromised PCs.

Grand Theft Auto IV for the PS3 and the Xbox 360 was released today to delirium from avid gamers. But some would-be buyers have been left disappointed, as game stores have been unable to fulfill demand; even some fans who pre-ordered the game have been left empty-handed.

Spammers are seeking to exploit this disappointment with a carefully targeted spam scam.

Consumer-focused spam filtering firm ClearMyMail claims that more than half of the junk mail being blocked by its service on Tuesday is Grand Theft Auto IV-related. The vast majority of the junk mail messages offer the opportunity to win a PlayStation 3 complete with the game.

"We are seeing unprecedented levels of spam in relation to the game; with more than half of the spam our service is blocking relating to Grand Theft Auto, most of which contain viruses and spyware," said Dan Field, Managing Director of www.ClearMyMail.com. Field advised keen gamers to wait until they can legitimately purchase the game rather than fall victim to "opportunists" capitalising on pent-up demand.

Stuart Rowe, chief operating officer at online retailer Play.com, said the game has created unprecedented demand. "We're experiencing trading levels similar to what we would see in the run up to Christmas, taking over 80 orders per minute at peak times," he said. "We have had to recruit extra warehouse staff to work through the night to ensure product arrives on the day of launch."

Google extends PageRank to image search


Google engineers last week presented an interesting paper at the WWW2008 conference in Beijing which proposes to apply its PageRank system of finding relevant Web pages to radically improve the accuracy of Google's image search results. The new technology is being called VisualRank, according to a fascinating story on the subject in the New York Times.

The paper, titled "PageRank for Product Image Search," (PDF) was published by Yushi Jing and Shumeet Baluja of Google. In it they talk about using PageRank to analyze the "visual link structure" that can be created among a group of similar images. This paper proposes to move away from the current model of many image search engine rankings, described as using "the text clues of the pages in which images are embedded to rank images."

The new model would "identify 'authority' nodes on an inferred visual similarity graph and propose an algorithm to analyze the visual link structure that can be created among a group of images." A numerical weight would be assigned to each image and, according to the paper, ranking would occur based upon "expected user behavior given the visual similarities of the images to be ranked." The assumption, this blog points out, is that "people are more likely to go from an image to other similar images."

Nearly a decade ago, Google unveiled an algorithm called PageRank, reinventing the way we search for web pages. Now, the company says, it has a technology that can do much the same for online image search.

Last week, at the International World Wide Web Conference in Beijing, two Google-affiliated researchers presented a paper called "PageRank for Product Image Search," trumpeting a fledgling algorithm that overhauls the primitive text-based methods used by the company's current image search technologies.

"Our experiment results show significant improvement, in terms of user satisfaction and relevancy, in comparison to the most recent Google Image Search results," Shumeet Baluja and Yushi Jing tell the world from the pages of their research paper, available here.

Of course, the most recent Google Image Search results are often rubbish. Currently, when ranking images, the big search engines spend little time examining the images themselves. Instead, they look at the text surrounding those images.

By contrast, Google's PageRank for Product Image Search - also known as "VisualRank" - seeks to actually understand what's pictured. But the technology goes beyond classic image recognition, which can be time consuming and/or expensive - and which often breaks down with anything other than faces and a handful of other image types. In an effort to properly identify a wider range of objects, Baluja and Jing have merged existing image processing techniques with the sort of "link analysis" made famous by PageRank.

"Through an iterative procedure based on the PageRank computation, a numerical weight is assigned to each image," they explain. "This measures its relative importance to the other images being considered."

With classic image recognition, you typically take a known image and compare it to other images. You might use a known photo of Paris Hilton, for instance, to find other Paris pics. But VisualRank takes a different tack. Google's algorithm looks for "visual themes" across a collection of images, before ranking each image based on how well it matches those themes.

As an example, the researchers point to an image search on the word "McDonald's." In this case, VisualRank might identify the famous golden arches as a theme. An image dominated by the golden arches would then be ranked higher than a pic where the arches are tucked into the background.
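The paper's actual feature extraction and weighting are more involved than a news story can convey, but the core idea – running the PageRank iteration over a graph of visual similarities rather than hyperlinks – is simple enough to sketch. A minimal illustration follows; the similarity scores below are invented for the example, and the real algorithm derives them from local image features:

```python
import numpy as np

# Sketch of the VisualRank idea: PageRank iterated over a visual-similarity
# graph instead of a hyperlink graph. The similarity matrix is made up for
# illustration only.
similarity = np.array([
    [0.0, 0.9, 0.8, 0.1],   # image 0 strongly resembles images 1 and 2
    [0.9, 0.0, 0.7, 0.2],
    [0.8, 0.7, 0.0, 0.1],
    [0.1, 0.2, 0.1, 0.0],   # image 3 is the odd one out
])

# Column-normalise so each image distributes its "vote" across its neighbours.
transition = similarity / similarity.sum(axis=0)

n = len(similarity)
rank = np.full(n, 1.0 / n)   # start from uniform weights
damping = 0.85               # standard PageRank damping factor

for _ in range(50):          # iterate towards the fixed point
    rank = (1 - damping) / n + damping * transition @ rank

print(rank)  # images matching the dominant visual theme rank highest
```

Run on the toy matrix above, the first three images (which all resemble one another) end up with the highest weights, which is exactly the "visual theme" behaviour the article describes.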

Baluja and Jing recently tested their algorithm using images retrieved by Google's 2000 most popular product searches, and a panel of 150 people decided that VisualRank reduced the number of irrelevant results by 83 per cent. The question is whether this could be applied to Google's entire database of images.

At the moment, this is just a research paper. And Google isn't the first to toy with the idea of true image search. After launching an online photo sharing tool that included face and character recognition, Silicon Valley-based Riya is now offering an image-rec shopping engine, known as Like.com, that locates products on sale across the web. And the transatlantic image rec gurus at Blinkx are well on their way with video search.

Apple Refreshes iMac

Apple this morning announced an update to its iMac line. The latest version of the popular all-in-one system includes new Intel Core 2 Duo processors with 6MB L2 cache and a 1066 MHz front-side bus, a standard 2GB of memory on most models, and the most powerful graphics yet available on the system. The top of the line units feature a 3.06 GHz Intel processor and NVIDIA GeForce 8800 GS graphics.

Starting at $1,199, the line offers a slew of other niceties, including built-in AirPort Extreme 802.11n Wi-Fi, Bluetooth 2.1+EDR, Gigabit Ethernet, an iSight video camera, USB 2.0 ports, and FireWire 400 and FireWire 800 ports. On the software side of the equation, each new iMac ships with the iLife suite and Mac OS X 10.5 Leopard.

The iMac's "aluminum and glass all-in-one design has been an incredible hit with our customers and is just one of the reasons Mac sales are growing three and a half times faster than PC sales," Philip Schiller, the company's senior vice president of worldwide product marketing, said in a statement. "With the latest Intel processors, a faster new graphics option and more memory, customers now have even more reasons to love the iMac."

Apple is also touting the new iMac's environmental friendliness. The system is composed of "highly recyclable and durable materials including scratch-resistant glass and professional grade aluminum," Apple said. Each entry in the line is rated EPEAT Silver and meets Energy Star's new 4.0 power consumption requirements. The company is also offering free recycling of old Macs and PCs through its Apple Recycling program.

The new models are available now through Apple's retail and online stores. Prices start at $1,199 for the 20-inch 2.4 GHz iMac. The 20-inch 2.66 GHz iMac will go for $1,499, and the 24-inch 2.8 GHz iMac for $1,799.

Yesterday’s refresh of the iMac line is interesting, not because of the bump in speed that the refresh itself offers but because of the direction that Apple is going in. Is Apple preparing the way for gaming on the Mac?

First, let's look closer at that 3.06GHz processor option for the 24-inch iMac. Some of you might be wondering what that processor is. Well, I can tell you now that it's not an early incarnation of the Centrino 2 technology, as some have suggested. It's a special run of an existing CPU, overclocked to handle the 3.06GHz clock speed and a 1,066MHz front-side bus. It's definitely hosted on a Santa Rosa motherboard (the Intel GM965 northbridge and ICH8 southbridge give that away). Apple's working the current technology hard to get 3.06GHz out of something designed to give 2.8GHz here. Why?

Then there's the GPU. This seems to be a stock nVIDIA 8800M GTS (even though Apple calls it an 8800 GS – but then again, Apple called its Mobility HD 2600 XT parts the HD 2600 Pro). This is a high-end GPU and certainly offers far more power than most Mac users currently need from the iMac.

Putting the overclocked processor and a high-end nVIDIA GPU in a box makes this system look like a gaming system to me. Sure, Apple is constrained by its use of mobile parts in the iMac, but the company does seem intent on squeezing as much power as it can out of the components.

Whether Apple is starting to cater for gamers using Boot Camp, or preparing the way for Mac OS X-based gaming, I'm not sure, but either way it looks to me like Apple is getting into gaming. And why not? It's a lucrative market!

Nanoengineered barrier invented



Nanoengineered barrier invented to protect plastic electronics from water degradation


A breakthrough barrier technology from Singapore A*STAR's Institute of Materials Research and Engineering (IMRE) protects sensitive devices like organic light emitting diodes (OLEDs) and solar cells from moisture 1000 times more effectively than any other technology available in the market, opening up new opportunities for the up-and-coming plastic electronics sector.



A team of scientists from Singapore's Institute of Materials Research and Engineering (IMRE) has developed a new patented film that has the highest reported water vapour barrier performance to date, as tested by the UK Centre for Process Innovation.
The tests have shown that the new film is 1,000 times more impervious to moisture than existing technologies. This means a longer lifetime for plastic electronic devices such as solar cells and flexible displays that use these high-end films but whose sensitive organic materials are easily degraded by water vapour and oxygen.

The new technology is a boon to the burgeoning plastic electronics industry, which aims to deliver flexible, lightweight and cheap electronic products to consumers in ways that silicon electronics may never reach, such as disposable or wraparound displays, cheap identification tags, low-cost solar cells, and chemical and pressure sensors.

IMRE is a research institute of Singapore's Agency for Science, Technology and Research (A*STAR), and its breakthrough technology comes as Singapore seeks to jumpstart a local plastic electronics industry as part of the country's long-term plan to anchor new knowledge-intensive industries in the economy.

The global plastic electronics industry is projected to grow to a market size of more than US$23 billion in the next five years.

The performance of devices like organic light emitting diodes (OLEDs) and solar cells is sensitive to moisture because water and oxygen molecules seep past the protective plastic layer over time and degrade the organic materials which form the core of these products.

Current commercially available films used to protect these materials have a barrier property, or water vapour transmission rate, of about 10⁻³ g/m² per day – one thousandth of a gram per square meter per day – at 25°C and 90% relative humidity (RH).

However, the ideal film for organic devices would require a barrier property of better than 10⁻⁶ g/m²/day – one millionth of a gram per square meter per day – at 39°C and 90% RH.
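To put those transmission rates in perspective, here is a rough, illustrative calculation. Only the two WVTR figures come from the article; the device area and the tolerable moisture dose are invented for the sake of the example:

```python
# Rough illustration of what the quoted water vapour transmission rates (WVTR)
# mean in practice. Device area and tolerable moisture dose are hypothetical.
commercial_wvtr = 1e-3   # g/m^2/day, typical commercial barrier film
required_wvtr   = 1e-6   # g/m^2/day, quoted requirement for organic devices

device_area_m2 = 0.01            # hypothetical 10 cm x 10 cm flexible display
tolerable_moisture_g = 0.001     # hypothetical dose before the organics degrade

for label, wvtr in [("commercial", commercial_wvtr), ("required", required_wvtr)]:
    days = tolerable_moisture_g / (wvtr * device_area_m2)
    print(f"{label}: ~{days:,.0f} days (~{days / 365:.1f} years)")
# commercial: ~100 days; required: ~100,000 days. The factor of 1,000 in
# barrier performance translates directly into device lifetime.
```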

Defects such as pinholes, cracks and grain boundaries are common in thin oxide barrier films when fabricated onto plastic substrates. These defects cause a 'pore effect', where oxygen and water molecules are able to seep through and penetrate the plastic barrier.

Current barrier technologies focus on reducing these defects by using alternate organic and inorganic multi-layers coated on plastic. These multiple layers "stagger" corresponding pores in adjacent layers and create a 'tortuous', lengthy pathway for water and oxygen molecules, making it more difficult to travel through the plastic.

In contrast, IMRE has taken an innovative approach to resolving the 'pore effect' by literally plugging the defects in the barrier oxide films with nanoparticles. This reduces the number of layers needed in the barrier film to just two in IMRE's unique nanoengineered barrier stack, which consists of barrier oxide layers and nanoparticulate sealing layers.

The nanoparticles used in the barrier film have a dual function - not only sealing the defects but also actively reacting with and retaining moisture and oxygen.

The result is a breakthrough moisture barrier performance of better than 10⁻⁶ g/m²/day, or one millionth of a gram per square meter per day, which surpasses the requirements for flexible organic device substrates.

The barrier film also has a lag time of more than 2300 hours at 60°C and 90% RH (i.e. the time required for moisture to pass through the barrier film under those conditions).

These plastic barrier properties were tested and validated by the Centre for Process Innovation, UK.

"With a level of protection that surpasses the ideal requirements for such films to date, manufacturers now have the opportunity to extend the lifetime of plastic electronic devices by leaps and bounds!," says Senthil Ramadas, principal investigator of the project.


A stumbling block in developing ultra-high barrier substrates has been the lack of an appropriate testing methodology.

Overcoming this hurdle, the IMRE project team has developed a highly sensitive moisture and oxygen permeation measurement system in tandem with the film itself, one able to measure permeation rates of less than 10⁻⁸ g/m²/day. The system has been successfully used in a number of service-based industry projects.

Adds Senthil, "Together with our expertise in encapsulation processes and permeation measurement technologies, we are also able to provide a total solution package for industries such as flexible solar cell and OLED display producers."

Recognising the potential of the high performance substrate technology, Exploit Technologies Pte Ltd (ETPL), the commercialisation arm of A*STAR, has funded the team through a 'flagship project' that seeks out research with excellent commercialisation potential.

Boon Swan Foo, executive chairman of ETPL, said: "Exploit Technologies sees commercial potential in A*STAR IMRE's breakthrough barrier film technology. It has excellent promise for enabling the fast-growing plastic electronics industry. We want to take this technology from the lab to the market."

"The research team is already in talks with solar cells and flexible displays and lighting industry manufacturers who are currently evaluating the barrier films for product qualification", says Dr. Mark Auch, a member of the IMRE team who is actively involved in the commercialization of the technology.

IMRE has already signed agreements with a number of companies to advance the technology into the commercial domain. These include a collaboration agreement with G24Innovations, a thin-film solar cell manufacturer, to look into developing the films for use in solar cells.

Clemens Betzel, the president of G24Innovations, who was in Singapore for the signing of the cooperation agreement, said, "The cutting edge work of IMRE's Barrier Substrates is likely to mean significant progress for Dye Sensitized Solar Cells, as exclusively manufactured today by G24I. We are looking forward to broadening our relationship with IMRE in the coming months."

IMRE has also signed a commercialisation agreement with KISCO (Asia), a subsidiary of the Japanese parent company KISCO Ltd., to commercialise and market the barrier films in the Asia Pacific region.

"We have a long-standing research relationship with IMRE and are very familiar with their work. We have high confidence in the quality of IMRE's barrier films and we believe, that this partnership will be beneficial to both parties," says Albin Tan, General Manager of KISCO (Asia), Singapore.

Current barriers have a series of alternating polymer and metal oxide layers that make up the plastic. This staggers adjacent 'pinholes', natural defects in the layers, thus slowing the passage of moisture and air through the 'pinholes'.

The secret behind the effectiveness of IMRE's technology lies in the unique barrier stack design, where nanoparticles are used when layering the barrier films. The design has a special layer of nanoparticles between the "pinhole" oxide layers. The innovativeness becomes clear as the nanoparticles "plug" the gaps and cracks in the oxide layer thus making for a more impermeable layer. In addition to sealing of oxide barrier film's defects, the nanoparticles absorb and retain the water and oxygen molecules. This concept helps reduce the number of barrier stacks to two or three only.

IMRE has successfully resolved the 'pore effect' issue in multi-layered barrier stacks and developed ultra-high barrier plastic substrates (barrier properties < 10⁻⁶ g/m²/day) for high-barrier applications. Our calcium test results show that there is no calcium oxidation up to 2,300 hours at 60°C and 90% relative humidity.

Source: Agency for Science, Technology and Research (A*STAR), Singapore

Scientists make chemical cousin of DNA



Biodesign Institute scientist John Chaput and his research team have made the first self-assembled nanostructures composed entirely of glycerol nucleic acid -- a synthetic analog of DNA. The nanostructures contain additional properties not found in natural DNA, including the ability to form mirror image structures. The ability to make mirror image structures opens up new possibilities for nanotechnology. Credit: Biodesign Institute at Arizona State University


Scientists make chemical cousin of DNA for use as new nanotechnology building block


In the fast-growing world of nanotechnology, researchers are continually on the lookout for new building blocks to push innovation and discovery to scales much smaller than the tiniest speck of dust.


Working at this scale holds great potential for advancing medical and electronic applications. DNA, often thought of as the molecule of life, is an ideal building block for nanotechnology because it self-assembles, snapping together into shapes based on natural chemical rules of attraction. This is a major advantage for Biodesign researchers like Hao Yan, who rely on the unique chemical and physical properties of DNA to make their complex nanostructures.


While scientists are fully exploring the promise of DNA nanotechnology, Biodesign Institute colleague John Chaput is working to give researchers brand new materials to aid their designs. In an article recently published in the Journal of the American Chemical Society, Chaput and his research team describe the first self-assembled nanostructures composed entirely of glycerol nucleic acid (GNA), a synthetic analog of DNA.

"Everyone in DNA nanotechnology is essentially limited by what they can buy off the shelf," said Chaput, who is also an ASU assistant professor in the Department of Chemistry and Biochemistry. "We wanted to build synthetic molecules that assembled like DNA, but had additional properties not found in natural DNA."

The DNA helix is made up of just three simple parts: a sugar and a phosphate molecule that form the backbone of the DNA ladder, and one of four nitrogenous bases that make up the rungs. The nitrogenous base pairing rules in the DNA chemical alphabet fold DNA into a variety of useful shapes for nanotechnology, given that "A" can only form a zipper-like chemical bond with "T" and "G" can only pair with "C."
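Those pairing rules are simple enough to state directly in code. The small sketch below is our illustration of the rule, not anything from the paper: given one strand, the complement is fully determined, which is what lets designed strands snap together predictably.

```python
# Watson-Crick pairing rules: each base binds only its complement.
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """Return the complementary strand, read in the opposite direction."""
    return "".join(PAIRS[base] for base in reversed(strand))

print(complement("ATGC"))  # prints GCAT
```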

In the case of GNA, the sugar is the only difference from DNA. The five-carbon sugar commonly found in DNA, called deoxyribose, is substituted by glycerol, which contains just three carbon atoms.

Chaput has had a long-standing interest in tinkering with the chemical building blocks used to make molecules like proteins and nucleic acids that do not exist in nature. When it came time to synthesize the first self-assembled GNA nanostructures, Chaput had to go back to basics. "The idea behind the research was to start with a simple DNA nanostructure that we could just mimic."

The first self-assembled DNA nanostructure was made by Ned Seeman's lab at New York University in 1998, the very same laboratory where ASU professor Hao Yan received his Ph.D. Chaput's team, which includes graduate students Richard Zhang and Elizabeth McCullum, was not only able to duplicate these structures but, unique to GNA, found it could make mirror-image nanostructures.

In nature, many molecules important to life like DNA and proteins have evolved to exist only as right-handed. The GNA structures, unlike DNA, turned out to be 'enantiomeric' molecules, which in chemical terms means both left and right-handed.

"Making GNA is not tricky, it's just three steps, and with three carbon atoms, only one stereo center," said Chaput. "It allows us to make these right and left-handed biomolecules. People have actually made left-handed DNA, but it is a synthetic nightmare. To use it for DNA nanotechnology could never work. It's too high of a cost to make, so one could never get enough material."

The ability to make mirror image structures opens up new possibilities for making nanostructures. The research team also found a number of physical and chemical properties that were unique to GNA, including having a higher tolerance to heat than DNA nanostructures. Now, with a new material in hand, which Chaput dubs 'unnatural nucleic acid nanostructures,' the group hopes to explore the limits on the topology and types of structure they can make.

"We think we can take this as a basic building block and begin to build more elaborate structures in 2-D and see them in atomic force microscopy images," said Chaput. "I think it will be interesting to see where it will all go. Researchers come up with all of these clever designs now."


Evidence modern birds came from dinosaurs



Scientists study evidence modern birds came from dinosaurs


It looks like chickens deserve more respect. Scientists are fleshing out the proof that today's broiler-fryer is descended from the mighty Tyrannosaurus rex. And, not a surprise, they confirmed a close relationship between mastodons and elephants.


Fossil studies have long suggested modern birds were descended from T. rex, based on similarities in their skeletons.


Now, bits of protein obtained from connective tissue in a T. rex fossil show a relationship to birds, including chickens and ostriches, according to a report in Friday's edition of the journal Science.


"These results match predictions made from skeletal anatomy, providing the first molecular evidence for the evolutionary relationships of a non-avian dinosaur," Chris Organ, a postdoctoral researcher in biology at Harvard University said in a statement.


Co-author John M. Asara of Harvard reported last year that his team had been able to extract collagen from a T. rex and that it most closely resembled the collagen of chickens.


They weren't able to recover dinosaur DNA, the genetic instructions for life, but DNA codes for the proteins they did study.


While the researchers were able to obtain just a few proteins from T. rex, they have now been able to show the relationships with birds.


With more data, Organ said, they would probably be able to place T. rex on the evolutionary tree between alligators and chickens and ostriches.


"We also show that it groups better with birds than modern reptiles, such as alligators and green anole lizards," Asara added.


The dinosaur protein was obtained from a fossil found in 2003 by John Horner of the Museum of the Rockies in a barren fossil-rich stretch of land that spans Wyoming and Montana. Mary H. Schweitzer of North Carolina State University and the North Carolina Museum of Natural Sciences discovered soft-tissue preservation in the T. rex bone in 2005.


The research of Organ and Asara indicates that the protein from the fossilized tissue is authentic, rather than contamination from a living species.


The researchers also studied material recovered from a mastodon fossil and determined it was related to modern elephants.


Their research was funded by the National Institutes of Health, the National Science Foundation, the Paul F. Glenn Foundation and the David and Lucile Packard Foundation.


Meanwhile, in another paper in Science, researchers report refining a method to determine ancient dates that will allow them to better pinpoint events such as dinosaurs' extinction.


A team led by Paul Renne, director of the Berkeley Geochronology Center and an adjunct professor of earth and planetary science at the University of California, Berkeley, said they were able to refine the so-called argon-argon dating method to reduce uncertainty. The method compares the ratio of two types of the element argon found in rocks.


The greater precision matters little for recent events in the last few million years, according to Renne, but it can be a major problem for events in the early solar system. For example, a one percent difference at 4.5 billion years is almost 50 million years.


The new system reduces that potential uncertainty to one-fourth of one percent, the researchers said.
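The uncertainty figures quoted above follow from simple arithmetic, which can be made explicit:

```python
# Making the dating-uncertainty arithmetic in the passage above explicit.
age_years = 4.5e9   # age of early solar system events, in years

old_uncertainty = 0.01 * age_years     # one percent
new_uncertainty = 0.0025 * age_years   # one-fourth of one percent

print(f"1% of 4.5 billion years   : {old_uncertainty / 1e6:.0f} million years")
print(f"0.25% of 4.5 billion years: {new_uncertainty / 1e6:.2f} million years")
# 45 million years ("almost 50 million") versus about 11 million years
```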


One of the closest views yet of a black hole jet



Astronomers get closer view of black hole jet


While we may never know what it looks like inside a black hole, astronomers recently obtained one of the closest views yet. The sighting allowed scientists to confirm theories about how these giant cosmic sinkholes spew out jets of particles travelling at nearly the speed of light.


Ever since the first observations of these powerful jets, which are among the brightest objects seen in the universe, astronomers have wondered what causes the particles to accelerate to such great speeds. A leading hypothesis suggested the black hole's gigantic mass distorts space and time around it, twisting magnetic field lines into a coil that propels material outward.


Now researchers have observed a jet during a period of extreme outburst and found evidence that streams of particles wind along a corkscrew path away from the black hole, as the leading hypothesis predicts.


"We got an unprecedented view of the inner portion of one of these jets and gained information that's very important to understanding how these tremendous particle accelerators work," said Boston University astronomer Alan Marscher, who led the research team. The results of the study are detailed in the April 24 issue of the journal Nature.


The team studied a galaxy called BL Lacertae (BL Lac), about 950 million light years from Earth, with a central black hole containing 200 million times the mass of our Sun. Since this supermassive black hole's jets are pointing nearly straight at us, it is called a blazar (a quasar is often thought to be the same as a blazar, except its jets are pointed away from us).


The new observations, taken by the National Science Foundation's Very Long Baseline Array (VLBA) radio telescope, along with NASA's Rossi X-ray Timing Explorer and a number of optical telescopes, show material moving outward along a spiral channel, as the scientists expected.


These data support the suggestion that twisted magnetic field lines are creating the jet plumes. Material in the center of the galaxy, such as nearby stars and gas, gets pulled in by the black hole's overwhelming gravity and forms a disk orbiting around the core (the material's inertia keeps it spiraling in a disk rather than falling straight into the black hole). The distorted magnetic field lines seem to pull charged particles off the disk and cause them to gush outward at nearly the speed of light.


"We knew that material was falling in to these regions, and we knew that there were outbursts coming out," said University of Michigan astronomer Hugh Aller, who worked on the new study. "What's really been a mystery was that we could see there were these really high-energy particles, but we didn't know how they were created, how they were accelerated. It turns out that the model matches the data. We can actually see the particles gaining velocity as they are accelerated along this magnetic field."


The astronomers also observed evidence of another phenomenon predicted by the leading hypothesis - that a flare would be produced when material spewing out in the jets hit a shock wave beyond the core of the black hole.


Sunday, April 27, 2008

Plethora of interacting galaxies on Hubble's birthday



NASA Releases Largest Collection of Hubble Images

In celebration of the 18th anniversary of Hubble’s launch, NASA released a series of 59 new images of colliding galaxies. Astronomy textbooks typically present galaxies as staid, solitary, and majestic island worlds of glittering stars.


But galaxies have a wild side. They have flirtatious close encounters that sometimes end in grand mergers and overflowing “maternity wards” of new star birth as the colliding galaxies morph into wondrous new shapes.

As this astonishing Hubble atlas of interacting galaxies illustrates, galaxy collisions produce a remarkable variety of intricate structures.

Interactions are slow, stately affairs, despite the typically high relative speeds of the interacting galaxies, taking hundreds of millions of years to complete. The interactions usually follow the same progression, and are driven by the tidal pull of gravity. Actual collisions between stars are rare, as so much of a galaxy is simply empty space, but as the gravitational webs linking the stars in each galaxy begin to mesh, strong tidal effects disrupt and distort the old patterns, leading to new structures and finally to a new stable configuration.

Most of the 59 new Hubble images are part of a large investigation of luminous and ultraluminous infrared galaxies called the GOALS project (Great Observatories All-sky LIRG Survey). This survey combines observations from Hubble, NASA's Spitzer Space Telescope, NASA's Chandra X-ray Observatory and NASA's Galaxy Evolution Explorer. The Hubble observations are led by Professor Aaron S. Evans from the University of Virginia and the National Radio Astronomy Observatory (USA). The images released today can be seen here.

The Hubble Space Telescope (HST) will be repaired and overhauled in August. Seven astronauts will fly the space shuttle Atlantis to rendezvous with Hubble and carry out the revamping mission, which has been designated STS-125. The goal of the mission is to keep the orbiting telescope working until a replacement is launched, expected in 2013.

The U.S. astronauts selected for the next servicing mission to the Hubble Space Telescope had already begun their training in February last year.

NASA had intended to mothball the Hubble before the new telescope was in place, a decision that was met with protests from astronomers, who have been able to look 2.2 billion light years and more into space because they do not have to peer through Earth's atmosphere.

Missions to the space station are easier because the ISS crew is on hand to help inspect the shuttle. The ISS also offers up to three months' refuge for a visiting crew in case of an emergency. The Hubble, which orbits 580 kilometers above Earth, offers neither. That means the shuttle would have to survive on its own for up to 25 days, with a second shuttle on stand-by at a separate launch pad for a rescue mission.

A year ago, the Hubble telescope's most far-seeing camera shut down due to a possible power failure and other problems, prompting NASA engineers to put the entire telescope on temporary standby. The Advanced Camera for Surveys (ACS) was installed in 2002 in a special shuttle mission to replace an older camera – in orbit since 1990 – and was hailed as the gateway to some of humankind's most spectacular views of the universe.

The August STS-125 mission aims to install the Cosmic Origins Spectrograph and to replace a wide-field camera in operation since 1993 with the Wide Field Camera 3. This latest camera will be the first on the Hubble that can cover everything from the ultraviolet to the infrared spectrum.

The James Webb observatory is expected to replace Hubble in 2013 at the earliest. The Hubble Space Telescope was first conceived in 1946 by astronomer Lyman Spitzer, built from 1979 onwards, and launched in 1990.

more...

Today, in celebration of the Hubble Space Telescope's 18th launch anniversary, 59 views of colliding galaxies constitute the largest collection of Hubble images ever released to the public. This new Hubble atlas dramatically illustrates how galaxy collisions produce a remarkable variety of intricate structures in never-before-seen detail.
Astronomers observe only about one out of a million galaxies in the nearby universe in the act of colliding. However, galaxy mergers were much more common long ago, when galaxies were closer together because the expanding universe was smaller. Astronomers study how gravity choreographs their motions in this game of celestial bumper cars and try to observe them in action.

For all their violence, galactic smash-ups take place at a glacial rate by human standards - timescales on the order of several hundred million years. The images in the Hubble atlas capture snapshots of the various merging galaxies at various stages in their collision.

Most of the 59 new Hubble images are part of a large investigation of luminous and ultra-luminous infrared galaxies called the GOALS project (Great Observatories All-sky LIRG Survey). This survey combines observations from Hubble, NASA's Spitzer Space Telescope, NASA's Chandra X-ray Observatory, and NASA's Galaxy Evolution Explorer. The majority of the Hubble observations are led by Aaron S. Evans of University of Virginia, Charlottesville/NRAO/Stony Brook University.

Wednesday, April 23, 2008

Politician takes on Wikipedia


A left-wing German politician has filed charges against online encyclopaedia Wikipedia for promoting the use of banned Nazi symbols in Germany.

Katina Schubert, a deputy leader of the Left party, said she had filed the charge with Berlin police on the grounds that Wikipedia's German language site contained too much Nazi symbolism, particularly an article on the Hitler Youth movement.

Wikipedia to be converted into a book
Wikipedia, the online encyclopaedia written by volunteers, is to be published in Germany as a book for people who prefer turning pages to clicking links, publishing multinational Random House said on Tuesday.
Editors will distil 50,000 of the most popular entries in the German version of Wikipedia into the 1,000-page volume, which goes on sale in September. When Wikipedia began, it was perceived as making books redundant, with no future for printed encyclopaedias.
The book will draw on the Wikipedia community's unconventional ideas of what knowledge people want, rather than prescriptions by scholars. There will be entries for Carla Bruni (the French first lady), PlayStation 3, and Donald Duck's fellow characters.
Football stadiums or the US television series Dr House will rate as entries alongside the more usual nations and statesmen.
Random House, part of the Bertelsmann group of Germany, said the selection of 50,000 headwords would be based on the most common terms searched for by the 15 million monthly users of Wikipedia in German.
"It's a document of the zeitgeist," said Beate Varnhorn, chief of the Bertelsmann Lexicography Institute, adding that professional editors would check the facts and edit out incongruous passages.
She said the volume would appeal to homes that had no permanent internet connection, since books were always available, but could also be bought by people who just like to browse for interesting facts.
Arne Klempert, a spokesperson for Wikipedia Germany, said the definitions would only be short summaries of the Wikipedia articles and there was no breach of the rights of Wikipedia contributors.
Commercial republication was allowed under the Wikipedia rules accepted by the site's users. Those rules also applied to Random House, which would not be allowed to claim copyright over the book.
"They can't re-monopolise it," said Klempert, who said Random House had taken the initiative and proposed the idea to Wikipedia.
"This will demonstrate that open-source writing also offers publishing houses opportunities for commercial development."
The German Wikipedia is second in size to the English Wikipedia. It was once calculated that it would take at least 750 thick volumes to print all 2.3 million articles in the English-language version. – Sapa-dpa

More...

Students 'should use Wikipedia'
Students should be allowed to use the online encyclopaedia Wikipedia as it has become more accurate and trustworthy, its founder Jimmy Wales said in comments published by BBC Online on Friday.

"You can ban kids from listening to rock 'n' roll music, but they're going to do it anyway," he was quoted as telling the Online Information conference in London this week.

"It's the same with information and it's a bad educator that bans their students from reading Wikipedia."

Wales's comments come amid continued questions over the accuracy of the site, where online users can write and update entries, compared with other, more authoritative paid-for sources like the Encyclopaedia Britannica.

He had previously said Wikipedia lacked the authority for academic work, because of often unsourced, biased or inaccurate information.

Students who copied information from the site deserved to be marked down, he told the BBC in 2005.

But new fact-checking procedures introduced since then, including real-time peer reviews, had made Wikipedia more trustworthy, he said.

"There is no substitute for peer critique," he told delegates.

As long as an article contained accurate citations, he saw "no problem" with students using it as a reference work, although he added that academics would "probably be better off doing their own research".


About Wikipedia


Wikipedia 'is the best'
The German version of the do-it-yourself online reference work Wikipedia is better than Germany's most prestigious commercial encyclopaedia, the weekly magazine Stern asserted on Wednesday.

It engaged WIK, a research institute, to compare 50 randomly chosen articles from Wikipedia with 50 matching articles in the regularly updated online version, www.brockhaus.de, of the Brockhaus, Germany's equivalent of the Encyclopaedia Britannica.

Wikipedia is a website which can be altered by anyone who notices a mistake or wants to improve the information displayed, provided the contributor presents sufficient documentary evidence to back up the new information.

Stern said Wikipedia's average rating was 1.7 on a scale where 0 is best and 5 is worst. The Brockhaus rated 2.7 on the same measure.

The articles were assessed for accuracy, completeness, how up to date they were and how easy they were to read.

In 43 of the 50 matches, the Wikipedia article was judged the winner.

The co-operative project trumped Brockhaus by being up to date with the news. It recorded Italian tenor Luciano Pavarotti's death the day after it happened, while Brockhaus had still not noted it weeks later. But Stern said Wikipedia also had the lead in accuracy.

The German-language section of Wikipedia, numbering 673,000 articles, is the second biggest after the English version.

Bilingual readers say the German articles tend to be more formal than the English ones.

Microsoft's Live Mesh preview


Ray Ozzie, Microsoft's chief software architect, has laid out the principles behind Live Mesh, its Web-based data storage and software system.

Microsoft is preparing to take its most ambitious step yet in transforming its personal computer business into one tied more closely to software running in remote data centers.
Microsoft has officially unveiled a preview of Live Mesh, the web services platform seen as a key plank of the company's aggressive software plus services strategy.
Chief software architect Ray Ozzie, who has been evangelising the project for some time, lifted the skirt on Microsoft’s Live Mesh last night.

The service will initially provide file sharing and folder synchronisation for Windows XP and Vista PCs to a closed beta of about 10,000 testers.

There are also plans to roll out Live Mesh to Apple Macs and other platforms, but the firm hasn't set a date for when customers can expect to see that happen.

The move is Microsoft's latest attempt to build the web platform of choice for consumers by merging more of its software within a SaaSy cloud. In recent months, Redmond has been working hard at blurring the lines by making its applications' capabilities available as services.

But the likes of Google, Salesforce, Amazon and Facebook might have something to say about Microsoft's online strategy. All those firms are equally keen to be the dominant Web 2.0 force.

Down the road, Microsoft hopes to bring more features to Live Mesh, including allowing customers to connect and synch all of their digital devices such as phones, games consoles, and music players.

Microsoft already has in its armoury Exchange and SharePoint Online, and Dynamics CRM Live – which was given the official red carpet treatment earlier this week.

Ozzie has also tackled the issue of making Microsoft Office productivity and collaboration available on the PC, mobile, and as a hosted service via Office Live in a direct challenge to the increasingly popular online office suite Google Apps.

Developers will probably be attracted to Microsoft's "open" platform offer to let them write code for Live Mesh in a variety of flavours, from Atom and RSS to JavaScript.

Meanwhile, consumers can expect to have at least 5GB of personal online storage and unlimited peer-to-peer data for synchronising information between devices.

Microsoft, which is still hotly pursuing Yahoo! in a hostile takeover bid, said it was also looking at a number of business models to monetise Live Mesh. These include paid subscriptions and advertising.

Pilot toxicology study of intravenously injected carbon nanotubes



The toxicity issues surrounding carbon nanotubes (CNTs) are highly relevant for two reasons. Firstly, as more and more products containing CNTs come to market, there is a chance that free CNTs get released during their life cycles, most likely during production or disposal, and find their way through the environment into the body. Secondly, and much more pertinent with regard to potential health risks, is the use of CNTs in biological and medical settings. CNTs' interesting structural, chemical, electrical, and optical properties are being explored by numerous research groups around the world with the goal of drastically improving the performance and efficacy of biological detection, imaging, and therapy applications. In many of these envisaged applications, CNTs would be deliberately injected or implanted in the body. For instance, CNT-based molecular delivery vehicles have been developed for intracellular gene and drug delivery in vitro.
What these CNTs do once inside the body and after they discharge their medical payloads is not well understood. Cell culture studies have shown evidence of cytotoxicity and oxidative stress induced by single-walled carbon nanotubes (SWCNTs), depending on whether and to what degree they are functionalized or oxidized. A recent report also found that inhaled single-walled CNTs can cause damage to the lungs in animal studies. On the other hand, another study (New nanotube findings give boost to potential biomedical applications) reported that the CNTs leave the body without accumulating in organs and without observable toxic effects (read more about this ongoing debate in The detection of carbon nanotubes and workplace safety).
So of course you need to take these results with a grain of salt (see Comparing apples with oranges - the problem of nanotubes risk assessment).
For most medical applications like drug delivery, the most relevant route into and through the body for CNTs would be the circulatory system. However, close to nothing is known about the acute and chronic toxicity of SWCNTs when they enter the bloodstream. A new study at Stanford University tested non-covalently pegylated SWCNTs as a 'least toxic scenario', and oxidized, covalently functionalized nanotubes as a 'most toxic scenario', in mice. It was found that SWCNTs injected intravenously into nude mice do not appear to have any significant toxicity during an observation period of four months following injection.
"Our study demonstrates the first systematic toxicity evaluation of functionalized SWCNTs following intravenous injection" Dr. Sanjiv Sam Gambhir tells Nanowerk. "Single administrations of high doses did not lead to acute or chronic toxicity, but we observed some changes in red blood cells. Because of the small number of animals used in the tests, our findings must be considered a pilot study. Although more extensive series are needed to confirm our results and show equivalence in other mouse strains, they do encourage further exploration of functionalized SWCNTs in biomedical applications in living animals."

Liver and spleen histology. a–f, Haematoxylin and eosin stains of liver (a–c) and spleen (d–f) tissues of mice injected with phosphate buffered saline (PBS) (a,d), non-covalently pegylated SWCNTs (SWCNT PEG) (b,e) or covalently functionalized nanotubes (SWCNT O PEG) (c,f). Finely granular brown-black pigments were seen in sinusoidal liver cells of SWCNT PEG (b, arrows) and SWCNT O PEG (c, arrows), as well as a golden-brown pigment in spleen macrophages of SWCNT PEG and SWCNT O PEG (e,f), without signs of cellular or tissue damage. (Reprinted with permission from Nature Publishing Group)
Gambhir, a professor in Stanford University's Departments of Radiology and Bioengineering, director of the Molecular Imaging Program at Stanford (MIPS) and head of Stanford Nuclear Medicine, collaborated on this project with Stanford researchers from MIPS, Hongjie Dai's group in the Department of Chemistry, and the Department of Comparative Medicine. The scientists published their findings in the March 30, 2008 online edition of Nature Nanotechnology ("A pilot toxicology study of single-walled carbon nanotubes in a small sample of mice"). This work was funded in part by the National Cancer Institute's (NCI) Center for Cancer Nanotechnology Excellence (CCNE).
All aspects of toxicity – including EKG, blood pressure, temperature, complete blood count, and electrolytes – were monitored repeatedly during the four-month study period. Gambhir says that, although minor changes occurred, there were no statistically significant differences between mice given SWCNTs and those not given them.
"At the end of the monitoring period of 4 months all mice were sacrificed and a full tissue histology was done to look for signs of organ toxicity; but there was none." he says. "The Kupfer cells in the liver (a special type of cell that sits within the sinusoids, which are the phagocytes of the liver) had eaten up the nanotubes and they were found in these cells within the liver."
Gambhir and his group were motivated to do this study because they are developing imaging agents for cancer detection that rely on carbon nanotubes. Of course, before these technologies can be moved to a human trial stage the question of toxicity must be much better understood than it is today. Although preliminary, this recent study gives some hope in at least not finding any obvious toxicity problems.

Music listening enhances cognitive recovery and mood

We know from animal studies that a stimulating and enriched environment can enhance recovery after stroke, but little is known about the effects of an enriched sound environment on recovery from neural damage in humans. In humans, music listening activates a widespread bilateral network of brain regions related to attention, semantic processing, memory, motor functions, and emotional processing. Music exposure also enhances emotional and cognitive functioning in healthy subjects and in various clinical patient groups. The potential role of music in neurological rehabilitation, however, has not been systematically investigated. This single-blind, randomized, and controlled trial was designed to determine whether everyday music listening can facilitate the recovery of cognitive functions and mood after stroke. In the acute recovery phase, 60 patients with a left or right hemisphere middle cerebral artery (MCA) stroke were randomly assigned to a music group, a language group, or a control group. During the following two months, the music and language groups listened daily to self-selected music or audio books, respectively, while the control group received no listening material. In addition, all patients received standard medical care and rehabilitation. All patients underwent an extensive neuropsychological assessment, which included a wide range of cognitive tests as well as mood and quality of life questionnaires, one week (baseline), three months, and six months after the stroke. Fifty-four patients completed the study. Results showed that recovery in the domains of verbal memory and focused attention improved significantly more in the music group than in the language and control groups. The music group also experienced less depressed and confused mood than the control group. These findings demonstrate for the first time that music listening during the early post-stroke stage can enhance cognitive recovery and prevent negative mood. The neural mechanisms potentially underlying these effects are discussed.

Full report

Tuesday, April 22, 2008

Technology Market: Intel vs AMD

Intel Challenges AMD By Slashing Prices In Half
Intel Core 2 Quad and Xeon processor prices are reduced by 50 percent in an effort to clear out 65nm inventory and to offer a low-price alternative to rival AMD.
The Intel Core 2 Quad Q6700 (2.66 GHz) has been reduced to $266, down from $530. Likewise, the Intel Xeon X3230 (2.66 GHz) was reduced to $266. The Intel Core 2 Duo E6850 was cut from $266 to $183, and Intel's Celeron 430 dropped to $34.

Intel made a surprising spring price cut on some of its processors, such as the Core 2 Quad Q6700 and the Intel Xeon X3230. Some cuts run as deep as 50 percent, while other processors have had their prices reduced by around 30 percent.

Intel has not spelled out the motivation for the move, but company spokesman Patrick Ward noted that most of the affected processors are built on 65-nanometer technology, while Intel is now promoting its 45nm chips, Computerworld reports.

“We’re transitioning from 65nm to 45nm,” said Ward. “We’re in the process of refreshing our line. If you see a 65nm [chip], it’s older technology and we’re moving from it.”

The price cuts include the Core 2 Quad Q6700, which dropped from $530 to $266, and the Intel Xeon X3230, which also fell from $530 to $266. Besides these significant drops, the Core 2 Duo E6850 now costs $183, down from $266, while the Xeon 3085 is now $188, down from $266.

Cuts of under 20 percent include the Intel Core 2 Duo E6600, down 16 percent from $266 to $224; the Intel Core 2 Duo E4600, down 15 percent from $133 to $113; and the Intel Pentium Dual-Core E2200, down 12 percent from $84 to $74, among others.
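To make the discount arithmetic concrete, here is a minimal Python sketch (prices as quoted above; the chip list is abridged) that recomputes each percentage cut as (old - new) / old:

    # Recompute the quoted discounts from the article's prices.
    cuts = {
        "Core 2 Quad Q6700": (530, 266),
        "Xeon X3230": (530, 266),
        "Core 2 Duo E6850": (266, 183),
        "Core 2 Duo E6600": (266, 224),
        "Core 2 Duo E4600": (133, 113),
        "Pentium Dual-Core E2200": (84, 74),
    }
    for chip, (old, new) in cuts.items():
        pct = (old - new) / old * 100
        print(f"{chip}: ${old} -> ${new} ({pct:.0f}% cut)")

Running it reproduces the article's figures: roughly 50 percent for the Q6700 and X3230, about 31 percent for the E6850, and 16, 15 and 12 percent for the smaller cuts.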

Dan Olds, an analyst at Gabriel Consulting Group Inc., told the same publication: “This really keeps up the pressure on AMD. Intel blankets the market from high end to low end, with multiple choices at almost every price point – each competitive with AMD on either performance or price or both.”

As AMD prepares its new 45nm line, Intel is doing the same, refreshing its current product line.

“They’re making sure they have a compelling price and/or performance value proposition in every segment where they compete with AMD,” Olds also said. “In short, it isn’t getting any easier to compete with Intel.”

Windows XP SP3

Microsoft released to manufacturers (RTM) the final code for Windows XP SP3. The upgrade provides support for WPA2 and the Peer Name Resolution Protocol (PNRP) used in Windows Vista, among other things. The public version will be available for download via the Web on April 29. Based on our initial installation, the upgrade will be effortless for most Windows XP users.

The last Service Pack for Windows XP, SP2, was released in August 2004. The initial release took some users all night to download and install. Microsoft had originally planned to ship it in June 2004 but pushed the date back. Despite numerous glitches still present in the code, Windows XP SP2 was formally made public on August 20, 2004, and Microsoft had to work hard to convince users to upgrade.


Microsoft confirmed today that the final version of Windows XP Service Pack 3 has been released to PC manufacturers right on schedule. The update will be available for end users to download next Tuesday, April 29, and will be pushed out via Windows Update in June. A post on Microsoft's TechNet developer site confirmed the release.

Microsoft gave us an early look at the update as a 580MB disk image. What we saw is barely changed from our preview of an early beta of SP3, and on first inspection Windows XP SP3 is highly unremarkable.


Far from being a new operating system, Windows XP SP3 is really an accumulation of updates for compatibility, security, and performance. It doesn't contain new features found in Vista, aside from Network Access Protection (NAP), which lets XP systems work with Windows Server 2008's ability to enforce system health requirements before allowing access to network assets. Beyond that, the only genuinely new features are "Black Hole" Router Detection, more detailed descriptions in the Security Options control panel, kernel-level support for FIPS 140-1 Level 1 compliant cryptography, and a new Product Activation system that allows installation without immediately requiring a product key.

On a 1.5GHz Athlon system with 1GB of RAM, the installation process took a little over an hour. The setup lists third-party drivers, performs a system inventory, checks that there is enough space for installation, backs up files, installs the new OS files, and performs a cleanup. After that, a reboot is required. For a look at the process, see our XP SP3 slideshow.

Windows XP SP3 will be available via Windows Update as a 70MB download and at the Microsoft Download Center as a full installation weighing in at 580MB. It will also be made available to volume license customers, TechNet subscribers, and MSDN subscribers. As a cumulative update, it can be installed on top of SP1 or SP2, and works with any edition of XP. The update, however, is not applicable to the 64-bit version of Windows XP. In an overview document, Microsoft specifically mentions that it works with Media Center Edition, but our preview of the beta noted that Media Center updates were stopped after installing SP3. We haven't yet tested whether this has been corrected, so stay tuned for our results.
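Because SP3 is cumulative but 32-bit-only, it can be worth confirming what a machine is running before applying it. A minimal sketch, assuming Python is installed on the target PC (purely illustrative, not a Microsoft-supplied check):

    # Report the Windows release and service pack level via the standard library.
    import platform

    release, version, csd, ptype = platform.win32_ver()
    print(f"Windows {release}, build {version}, service pack: {csd or 'none'}")

    # SP3 can be applied on top of the original release, SP1 or SP2,
    # but only on 32-bit Windows XP.
    if release == "XP":
        print("This machine is a candidate for SP3 (assuming 32-bit XP).")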

Finally, Microsoft noted that the processes system administrators can use to deploy XP to multiple machines have not changed; further information for them is available at Deploy Windows XP Professional.

Monday, April 21, 2008

10,000 rpm (revolutions per minute) hard drive by Western Digital


The latest hard drives to tout the high-end Raptor name get a 3.0 Gbit/s SATA interface and supposedly outperform their older brothers by 35 percent.
Western Digital picked an extremely appropriate name for its new 10,000 rpm hard drive. Dubbed the VelociRaptor, this new drive screamed through the PC World Test Center's performance tests. Velocity is clearly the raison d'être for this drive: the VelociRaptor handily bested our tested field of hard drives to become our top overall performer.
Western Digital took its Raptor line of high-performance hard drives a step further on Monday with the introduction of the VelociRaptor. The company claims its newest SATA drive performs up to 35 percent faster than the last generation.

Unlike many hard drives, which show strengths and weaknesses in our tests, the $300 VelociRaptor actually demonstrated its strength across the PC World Test Center's entire suite of hard drive tests. In one of its most impressive feats, the VelociRaptor required just 89 seconds to write 3.06GB of files and folders, besting the next-best drive in our chart, the Western Digital Caviar SE16 750GB, by 32 seconds--a 26 percent improvement.
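A quick back-of-the-envelope conversion of that result into sustained throughput (file sizes and times as reported above):

    # 3.06 GB written in 89 s (VelociRaptor) vs. 89 + 32 = 121 s (Caviar SE16).
    size_mb = 3.06 * 1024
    for drive, seconds in [("VelociRaptor 300GB", 89), ("Caviar SE16 750GB", 121)]:
        print(f"{drive}: ~{size_mb / seconds:.0f} MB/s sustained write")
    print(f"Improvement: {32 / 121:.0%}")  # matches the 26 percent figure

That works out to roughly 35 MB/s for the VelociRaptor against about 26 MB/s for the runner-up.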

The VelociRaptor is an interesting drive for more than just its performance, though: The latest in Western Digital's family of Raptor 10,000 rpm drives, the 300GB VelociRaptor doubles the capacity of WD's previous-generation 150GB Raptor drive.

WD plans to target the drive at gamers and PC enthusiasts first, though the VelociRaptor is also designed for enterprise-class applications, and the company expects it to be adopted in enterprise settings as well. The drive carries a mean time between failures rating of 1.2 million hours, which puts it on a par with enterprise-grade drives.

Installing the drive was easy, though you'll notice, as soon as you take the drive out of the box, that the VelociRaptor is no ordinary hard drive. With the VelociRaptor, WD came up with an innovative design approach to achieving a high-performance desktop hard drive: it squeezed its 10,000 rpm drive into a 2.5-inch chassis, whereas desktop hard drives--be they 7200 rpm or 10,000 rpm--have traditionally been 3.5-inch drives. (Although the drive itself measures only 2.5 inches, the VelociRaptor is designed for a 3.5-inch drive bay.)

WD says it chose the 2.5-inch form factor for a couple of reasons. From a mechanical standpoint, a smaller platter doesn't flutter as much at its outer edges when spinning at higher rotational speeds. And advances in areal density, even in smaller 2.5-inch platter designs, meant that WD could halve the platter area and still double the VelociRaptor's capacity compared with the two-year-old 150GB Raptor drive.

Heat generation remains a constant concern with hard drives, particularly when the platters spin as rapidly as they do in a 10,000 rpm model. WD tackles the issue head on by mounting the 2.5-inch VelociRaptor drive in a heat-sink sled. The IcePack heat sink helps the VelociRaptor run cooler than the previous-generation Raptor; WD says it reduces the drive's temperature by about 5 degrees. The sled also doubles as the VelociRaptor's mounting adapter, so the 2.5-inch drive fits smoothly into a 3.5-inch drive bay.

The VelociRaptor (also referred to as the WD3000GLFS) will initially be available at the end of April, shipping in a RAID 0 configuration on Alienware's high-performance ALX gaming desktop. The $300 drive will enter mass distribution when it goes on sale in mid-May at Western Digital's Web site and selected resellers.

New portfolio of printers

Hewlett-Packard’s Imaging & Printing Group (IPG) has added teeth to its Print 2.0 strategy, formulated by the company a few months ago, with the roll-out of 25 printing products and solutions, the company’s largest single offering to date.
Comprising printers, software and Web-based resources, this slew of offerings is aimed at small and medium businesses (SMBs) and home offices (SOHOs).

“These products will enable our SMB customers to have a level playing field against larger competitors by improving their marketing effectiveness, increasing productivity and reducing the costs involved in their printing needs,” Herbert Koeck, H-P Vice-President (Commercial Printing), Asia-Pacific, said at the launch event, titled “Business Go Print 2.0”, organised by the company for media persons from Asia-Pacific and Japan.

While the number and range of printers unveiled by the company - 17 in all, covering both laser and inkjet technologies and catering to single-function, multi-function and all-in-one needs - was in itself eye-catching, there were a couple of other noteworthy first-time offerings from the company.

First-time offerings

The company has decided to offer ColorSphere, its patented ‘chemically grown’ toner technology that was hitherto found only in H-P’s higher-end printers, across all its colour laser printers, including the entry-level ones. For an SMB customer, this provides access to professional-quality prints at a much lower cost.

In another first, HP came out with its first LED-based scanner, moving away from the traditional halogen bulb. LED (light-emitting diode) technology uses much less power and is ready instantly, eliminating the warm-up time that halogen-bulb-based scanners need. The company has incorporated these scanners in its new multi-function range of printers.

Speaking about the fleet of printers, Christopher Morgan, Senior Vice-President, Imaging and Printing Group, Asia-Pacific & Japan, said, “The needs of SMBs are various and vary from one case to another. Now we can say that we have a printer on offer for any SMB’s needs and expectations.”

New resources

The company has also come out with a few new software tools and online resources to help SMBs conceptualise, design and print their own marketing material, eliminating the need for a professional service provider.

These tools and templates ship with all the printers and can also be downloaded from the company’s website for free.

Among the eye-catchers at the event were the entry-level colour laser printer CP 1215, which, while offering ColorSphere technology and inline single-pass printing, can also print in black at the same cost-per-page as a monochrome printer; the multi-function CM 1312, which boasts instant-on scanning and printing; the single-function monochrome ‘workhorse’ P4515, which claims to have broken the one-second-per-page barrier with a print speed of 62 pages per minute; and the H-P in-house Marketing Starter Kit.

The imaging tools and templates contained in the Kit work harmoniously with a word-processor and are easy to use even for a layman.
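The P4515's one-second-per-page claim, incidentally, is simple arithmetic to verify:

    # Seconds per page at the P4515's rated speed of 62 pages per minute.
    print(60 / 62)   # ~0.97 s per page, just under the one-second barrier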

Compare
KUSA - Printers have moved beyond just printing documents.


Some all-in-one printers let you remove red-eye when you print photos. Some even let you print from your cell phone. Consumer Reports wanted to know: should you push your plain printer aside for a multipurpose, all-in-one machine? Manufacturers are offering many more features on all-in-one inkjet printers, while their plain inkjet printers are generally bare-bones versions.

To compare quality, Consumer Reports put 45 printers through a number of tests. Standardized documents were printed and then compared. Testers also printed photos and evaluated them. Tests show both kinds of inkjet printers were pretty comparable in performance. But there could be big differences in how much it costs to print a page of text or a photo.

Over the life of the printer, ink and paper costs can add up to more than the initial sticker price. For an all-in-one inkjet printer, Consumer Reports recommends the Canon Pixma MP520, a Best Buy at $140. If all you do is print and you don't need an all-in-one, Consumer Reports named another Canon Pixma a Best Buy: the Canon Pixma iP4500, at $120.
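A minimal sketch of that trade-off (the per-page costs below are hypothetical placeholders, not Consumer Reports figures):

    # Lifetime cost = sticker price + pages printed * cost per page.
    def lifetime_cost(sticker, cost_per_page, pages):
        return sticker + cost_per_page * pages

    pages = 10_000   # assumed pages over the printer's life
    for name, sticker, cpp in [("Printer A", 140, 0.05), ("Printer B", 100, 0.09)]:
        print(f"{name}: ${sticker} up front -> ${lifetime_cost(sticker, cpp, pages):,.0f} total")

With these made-up numbers, the printer that is $40 cheaper up front ends up costing $360 more over its life, which is exactly the effect the testers warn about.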

There are a number of money-saving moves you can make when printing. If it's something from the Web, first preview the pages so that you can delete blank pages at the end. When possible, print on a lower-quality setting. And using double-sided printing can cut down on the amount of paper you use.

Sunday, April 20, 2008

The Big-Bang Machine: Large Hadron Collider


Since the world is going to end later this year when the European Organization for Nuclear Research finally turns on the Large Hadron Collider (which will create mini black holes, magnetic monopoles, or convert all the matter in the universe into exotic strangelets that will grow big enough to become matter-sucking maelstroms), this might be a good time to make amends and prep for it all.
Unlocking the secrets of the universe doesn't come cheap. The European Organization for Nuclear Research (CERN) has spent at least 10 years and $8 billion building the Large Hadron Collider (LHC), the world's biggest particle accelerator, under the Alps. This month, CERN will power up the LHC, and, later this summer, start smashing particles together to try to understand the beginnings of life, the universe -- and everything. Scientists hope the answers to vexing mysteries (What causes mass? How did matter survive the big bang?) will eventually emerge from the debris. Here's a cheat sheet to the world's biggest science project.

1. Two beams of protons are propelled in opposite directions around a 17-MILE CIRCULAR TUNNEL, located at least 165 feet beneath the French-Swiss border. Building below the surface meant lower costs (no need to buy up acres and acres of land) and a natural rock shield for the radiation produced by the LHC.

2. The particles will be guided around the tunnel by more than 1,600 superpowerful, cylinder-shaped ELECTROMAGNETS, some of which weigh more than 30 tons. The protons will zoom around the ring up to 11,245 times per second, reaching 99.9999991% of the speed of light (see the sanity-check sketch after this list).

3. At four points in the ring, magnets will push the beams together, causing up to 600 million PROTON COLLISIONS per second. If all goes as planned, these high-speed, high-energy crashes will create bursts of rare forces and particles that haven't been seen since the big bang 13.7 billion years ago.

4. Four huge PARTICLE DETECTORS -- the biggest, ATLAS, is 150 feet long, 82 feet high, and has more than 100 million sensors -- will track and measure the particles at each collision. Filters will discard all but the 100 most interesting crashes per second. This will still produce enough data to fill a 12-mile-high stack of CDs per year.
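Those figures hang together. Here is a minimal sanity check of the lap rate in item 2 and the CD stack in item 4 (the roughly 15 petabytes per year and the CD dimensions are outside assumptions, not numbers from this article):

    # Item 2: laps per second at 99.9999991% of c around the ~26.7 km ring
    # (the article rounds the circumference up to 17 miles).
    C = 299_792_458            # speed of light, m/s
    ring_m = 26_659            # LHC circumference in metres
    print(f"{0.999999991 * C / ring_m:,.0f} laps/s")   # ~11,245

    # Item 4: ~15 PB/year on 700 MB CDs, each ~1.2 mm thick.
    cds = 15e15 / 700e6
    print(f"stack ~{cds * 1.2e-3 / 1609.34:.0f} miles high")

The lap rate comes out at the article's 11,245 per second, and the CD stack lands in the mid-teens of miles, the same order of magnitude as the 12-mile figure.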

Then what? The results will be analyzed by 100,000 processors and thousands of scientists around the world. CERN predicts that within a year it will be able to identify particles that had previously existed only in theory. Physicists will be hunting for the elusive Higgs boson, or "god particle," which is believed to imbue matter with mass. "We'll either find it," says CERN's James Gillies, "or prove that it doesn't exist." Particle physicists also hope to learn more about the composition of dark matter and dark energy -- the invisible stuff that makes up 96% of the universe. And they want to test whether science fiction is actually reality by finding evidence of extra dimensions beyond our 3-D world.

General Motors introduces hybrid car sales in China


Customers look at Buick and Chevrolet cars at a sales office in Beijing in this Oct. 18, 2006 file photo. (AP Photo/Greg Baker, File)



General Motors Corp. will sell its first gas-electric hybrid cars in China in July, introducing a model created in part by GM's Shanghai design center, the company said Saturday.

The Buick LaCrosse will be the second hybrid to enter the Chinese automobile market, following Toyota Motor Corp.'s Prius in early 2006.

The LaCrosse is due to be unveiled Sunday at the Beijing auto show, GM managers said. They said it would be priced under 300,000 yuan ($43,000; 27,000 euros) — comparable to the Prius.

"We don't expect to see a very high volume of sales of this car in China in a short period of time. But we bring this technology to help China support sustainable growth and bring consumers in that direction," Joseph Liu, GM China's vice president for sales, told reporters.

The car was developed with contributions from GM's Pan Asia Technical Automotive Center in Shanghai, said Maryann Combs, the center's president.

Hybrids improve fuel efficiency and cut emissions by recovering energy through the brakes as the vehicle slows. But they also cost more because they require both gasoline and electric motors.

GM says the LaCrosse will be the first hybrid made in China. The Prius is assembled from imported parts.

Toyota has sold about 2,500 Priuses in China since 2006, but sales are slowing in part because Chinese drivers are unfamiliar with hybrids, according to reports in the industry press.

GM has made China, the world's second-largest and fastest-growing vehicle market, a key part of its efforts to develop alternative power sources. It announced plans in October for a $250 million (€158.43 million) fuel research center in Shanghai.

On Saturday, chairman Rick Wagoner took part in opening an automotive energy research center partly financed by GM at Tsinghua University in Beijing, the alma mater of President Hu Jintao.

GM hopes the Chinese government will introduce incentives to encourage sales of alternative vehicles, Liu said.

The company plans to introduce a hybrid version of the Cadillac Escalade SUV in China next year, followed by an all-electric car as early as 2010, he said.

GM says it sold just over 1 million vehicles in China last year.

20 Percent of Scientists Admit Using Brain-Enhancing Drugs -- Do You?

Nature released the results of an online survey in which 20 percent of respondents, largely drawn from the scientific community, admitted to using brain-enhancing drugs like Ritalin (methylphenidate) and Provigil (modafinil).

Sixty-two percent of the drug-taking respondents reported using Ritalin, while 44 percent reported using Provigil and only 14 percent had tried beta blockers like propranolol (respondents could report more than one drug).

The 1,427-person survey was launched after a pair of articles this winter touched off a storm of questions about widespread neuroenhancer use in the scientific community. Jonathan Eisen of UC-Davis, an evolutionary biologist, even successfully spread an April Fools' rumor that the National Institutes of Health was planning to regulate the use of brain "steroids" as a condition of funding scientists.

All of this led me to ask scientists (and other Wired.com readers) three questions:

1. Have you used cognitive enhancers?
2. Did they work for you?
3. Would you talk to me about your experiences?

Feel free to comment or e-mail (onemosi@gmail.com).

Fitting MySQL into Sun's orbit

Sun recently weathered a hail of criticism when the company hinted that some add-ons for the popular open-source database MySQL might be available only to paying customers.

Today Marten Mickos, former CEO at MySQL AB and now senior vice president of Sun’s database group, backed off the statement, saying that Sun has not made an official decision.

Sun, which acquired MySQL earlier this year for over $1 billion, raised the ire of the MySQL community when it suggested that some high-end features due to arrive in MySQL 6 would be available only to paying customers.

A shrill chorus of critics on Slashdot and throughout the online world loudly condemned the potential move and accused Sun and MySQL of betraying the community that has helped make it successful. MySQL claims users in the tens of millions.

Mickos responded to the Slashdot post saying:

In 6.0 there will be native backup functionality in the server available for anyone and all (Community, Enterprise) under GPL.

Additionally we will develop high-end add-ons (such as encryption, native storage engine-specific drivers) that we will deliver to customers in the MySQL Enterprise product only. We have not yet decided under what license we will release those add-ons (GPL, some other FOSS license, and/or commercial).
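For context, while that native 6.0 backup feature is still forthcoming, routine MySQL backups today are typically taken with the stock mysqldump client that ships with the server. A minimal sketch, with placeholder database name and credentials:

    # Logical backup of one database via mysqldump (placeholders throughout).
    import subprocess

    def dump_database(db, user, password, out_path):
        """Write a SQL dump of `db` to `out_path`."""
        with open(out_path, "w") as out:
            subprocess.check_call(
                ["mysqldump", "--single-transaction",
                 f"--user={user}", f"--password={password}", db],
                stdout=out,
            )

    dump_database("example_db", "backup_user", "secret", "example_db.sql")

The --single-transaction flag takes a consistent snapshot of InnoDB tables without locking them, which matters for the kinds of Web workloads discussed below.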



MySQL is used by a host of major Web businesses, including Google, Yahoo, Facebook and Amazon.com. Under Mickos' leadership, outside analysts estimate, the private company garnered about $50 million in revenue last year while making its source code available for anyone to use or adapt at no cost. MySQL charges for technical support and collects licensing fees from companies that use the code in their own proprietary software.

But Mickos recently traded his chief executive title for that of senior vice president at tech giant Sun Microsystems, which acquired MySQL for $1 billion this year. And last week, as 2,000 programmers and users gathered for the annual MySQL conference at the Santa Clara Convention Center, some open-source enthusiasts blogged with alarm on learning that MySQL may offer "add-on" features to paying subscribers, without including them in the free version.

Others said that's not unusual. Mickos posted responses on several blogs, saying the move predates Sun's acquisition and is part of an effort to build revenue while keeping the core product open-source. He also said there's no decision on how the add-ons may ultimately be licensed.

Mickos' new boss, Sun CEO Jonathan Schwartz, has called the open-source model a cornerstone of Sun's business, and describes MySQL as a key element of Sun's drive to supply everything from hardware to software for clients around the world.
We asked Mickos how MySQL fits into Sun's efforts; the following was edited for length and clarity.

Q Is Sun's approach to open source any different from what yours has been?


A With open source, you can't find two companies with exactly the same view. But when we discussed the acquisition, they said, "We are acquiring you for what you are, not to change you."

Q Sun said downloads of your products increased from 50,000 a day to 60,000 after the deal was announced. Is that continuing?


A They did increase, but we may have reached saturation levels. (Laughs.) How many developers are there in the world?

Another thing we measure is the number of blog postings mentioning MySQL, and they have grown significantly.

Q That could be a positive or a negative, though.


A No, this is open source. It doesn't matter what they say. A blog posting saying, "MySQL stinks and here are all the defects," is a positive one. And this is what closed-source companies don't get.

When somebody is complaining about your product, they are saying, "I would love to love you but I cannot currently." These people will become your most passionate users if you listen to them and if you take action.

Q What's the conversion rate, or number of customers who are paying for MySQL compared with those who use it for free?


A We don't call it a conversion rate, because there are many who will never convert. We have worldwide about 1,000 non-paying users for every paying customer. But then, kids at the age of 11, they download our stuff and play around with it. Every student downloads it, every developer.

It's common thinking: Hey, you're open source and it's free - you will never amount to anything. But it's not true.

Yes, we will have millions who will never pay us. If you run a blog on WordPress, you don't necessarily need me to service you or your database. But if you are YouTube or Flickr or Facebook or Nokia or Cisco, then you need us, and you pay. You couldn't scale the way you scale if you didn't have MySQL and if you didn't have support from the vendor, from Sun.

Q Are you expecting to see MySQL integrated into other Sun products?


A Integrating with Sun's products is an attractive thing, but it's not the main idea. The main idea of the acquisition and of our business generally is to be the platform for the Web economy. Now, with the help of Sun, we will be able to accelerate this.

We had in our field organization 200 people; Sun has 17,000. We have a lot of usage in the big corporations, but we haven't been able to sell to them and become key vendors for them yet. With Sun, we get instant access to the big Fortune 500 and Fortune 2000 customers.

And to those who do want more than the database, we'll be able to say, "If you need the operating system, the middleware, the hardware, the development environment, you can get that from us, too."

Q Are you expecting your number of employees (currently about 400 people) to grow, and will your workforce become more concentrated here in Northern California?


A We have a distributed organization, with about 70 percent of our people working from home in 30 countries around the world.

We are a growth business and the mandate we have from Sun is to continue to grow even faster than before. So we'll continue to hire the best people, wherever they are. We have programmers in China, India, Scandinavia, the Ukraine.

The interesting thing is we have one programmer in California and he's in Los Angeles. (Laughs.) Offices are so last century.

Q Would you expect MySQL's corporate culture to change now that you are part of Sun?


A I do think it will change, but we've always had that mindset at MySQL of saying: "What are the core values that we must not change?" - but then let us change everything else.

Q You've been in Silicon Valley for five years, but you have worked in other parts of the world. What's your sense of how Silicon Valley's role in the tech industry has changed?


A Silicon Valley has changed from being a production center to a trading place. There's still development and production being done here, but this is a place where you get together to strike a deal, build partnerships, have your (conferences and other) events.

Q One component of MySQL is a product called InnoDB, developed by a company that was acquired by Oracle. Do you feel any pressure to drop that component, for competitive reasons?


A We continue to use the component. It's very good, but there are many alternatives, so it's not that we are dependent on them. The surprising thing is, it's good for Oracle to have a presence in our ecosystem, and it's good for us that that little piece of software now has Oracle's backing and we can trust it.

Q It's an open-source product, but you pay a license fee for it?


A It is open-source. We do pay them money for the commercial benefit we get from it. They provide us support and other things.

We always had the view that the fact that it's open-source doesn't mean you can't pay money for it. We pay money for value and our customers pay us money for value.
