
Thursday, September 6, 2007

Linux will be on a Third of Smart Phones in 2012




By 2012, Linux will be running on nearly 31 percent of all smart devices, thanks to a growth rate faster than Windows Mobile and Symbian, according to predictions from a research firm.


Linux smartphones will grow at more than 75 percent per year, according to ABI Research, and will be running on 331 million devices by 2012.
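As a quick sanity check on those projections (our own arithmetic; the roughly 20-million-unit 2007 installed base is an assumption, not an ABI figure), compounding 75 percent annual growth over five years does land near the 331 million mark:

```python
# Back-of-the-envelope check of ABI's projection.
base_2007 = 20_000_000   # assumed 2007 installed base (hypothetical)
growth = 1.75            # "more than 75 percent per year"

units = base_2007
for year in range(2008, 2013):
    units *= growth
    print(f"{year}: {units / 1e6:.0f}M devices")
# 2012 comes out around 328M, close to the article's 331 million.
```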


"Serious initiatives from the likes of Intel and Access are gathering pace and momentum, whilst the carrier community continues to identify Linux as one of the few operating systems that it intends to support in its long-term plans," said ABI research director Stuart Carlaw.


Symbian won't be too pleased with the figures, as it claims to currently have 72 percent of the smartphone market. However, Symbian's figures are very regional: it has around 90 percent of the Europe and "rest of the world" sectors, but it hasn't cracked the U.S. (it has less than ten percent there) and is only around 65 percent of the market in China and Japan, according to Canalys figures that Symbian quotes.


In China and Japan, Linux smartphones already have more than 30 percent market share, having grown massively since 2004 as earlier Canalys graphs show.


Access, which owns the Palm operating system, has created the Access Linux Platform (ALP). It is also planning a move to Linux for the Palm OS. Intel, meanwhile, is supporting Linux in its ultra-mobile platform.


"Linux is benefiting from growing support in the handset OEM community, most notably Motorola, but also Nokia with less traditional types of devices aimed at mobile broadband applications," said Carlaw.


Motorola has revealed plans to have Linux on 60 percent of its handsets within the next two years, and is a founding member of the LiMo group.


Nokia, meanwhile, is heavily committed to Symbian, but has put Linux on its N800 Internet tablet.


The other wild card is Google. The company's entry into the phone hardware market is still a rumor, but most of the latest rumors suggest a Linux-based phone.






Moray eels' hidden jaws pack second bite.



Moray eels, those snake-like predators that lurk in coral reefs, use a second set of jaws to pull prey back into their throats with deadly efficiency, researchers said on Wednesday.


Biologists have known for some time that moray eels have a second set of jaws, known as pharyngeal jaws, as do many other bony fish. But until now, biologists had never seen them put to such unique use.


"They spotted this outrageous behavior of the pharyngeal jaw thrusting way forward into the mouth, which was not suspected before," said Mark Westneat, who studies feeding mechanisms of coral reef fishes at the Field Museum of Natural History in Chicago.


"The surprise and interest was the extent of the movement, and how it grasped the prey and yanked it back into the throat," Westneat, who wrote a commentary on the findings, said in a telephone interview.


"It's one of these great 'Oh wow' stories in basic biology."


Rita Mehta and colleagues at the University of California, Davis, discovered the moray's special feeding ability using high-speed digital cameras that captured the second set of jaws jutting forward during feeding.


Mehta, whose study appears in the journal Nature, said the jaws allow the eels to swallow large prey.


Mehta had set out to understand the purpose of this second set of jaws in moray eels, a diverse group of some 200 species.


ORAL GYMNASTICS


She and UC Davis Professor Peter Wainwright used X-ray and other imaging equipment from the university's veterinary school to work out how the jaws could move.


It turns out the moray accomplishes its oral gymnastics by elongating the jaw muscle, allowing the second set of jaws -- armed with large curved teeth -- to bite into the prey.


When not in use, the moray's extra set of jaws rests behind the eel's skull. When in use, the jaws move almost the length of the animal's skull.


"What this enables moray eels to do is to grip their prey at all times," Mehta said. "It's definitely a good predator."


Of the roughly 30,000 species of fish, most devour their prey by means of suction, or as in the case of sharks, by biting off large chunks.


Mehta and Wainwright suspect moray eels may have evolved this fierce feeding method through hunting in tight spaces, such as the crevices of coral reefs.


In the wild, moray eels can reach 10 feet in length.


They are now looking into how the moray's jaws evolved. Other species of eel, such as the American eel Anguilla, feed by suction.






Finding Reality in Fantasy: Strange New World - Virtual Reality



It's getting nippy here in the Strange New World. Fall is definitely in the air, and that means it's time to channel our inner teenager by heading back indoors for two of our picks of this week - getting ready for some football on a fab new better-quality LCD TV and playing what is probably the hottest video game right now: "Bioshock."


And for our third pick - just to stay at least a bit grown-up - first there was teller-less banking, now we have bank-less banking. Big consumer banks are warming to the idea of offering account information and services on mobile devices. The systems are in early tests, so only a few banks are trying it, but sometime soon you'll be balancing your checkbook, well, anywhere.


Here, then, are our picks for this first week of September 2007:


Finding Reality in Fantasy


We don't know how you spent your Labor Day weekend, but we confess that we spent ours exploring the Ayn Rand-inspired underwater world of "Bioshock", the first-person shoot'em-up from Take 2 Interactive ($49.99).


"Bioshock" is a wonder of modern gaming hardware and design that even game-aloof adults can admire.


You are set afloat off a bizarre island - a la "Lost," but with an early-to-mid-20th-century American Gothic vibe - ditched after a fantastically realistic ocean plane crash. Then you fight for your life in a way cool underwater megalopolis. Not only is the game great - every object such as walls, boxes and windows is in play - but the story is compelling and tailored just to you.


Players use a combination of bio-engineered capabilities and good, old-fashioned guns and ammo to conquer a blizzard of enemies. And we couldn't give away the story line if we tried: With more than 40 different power and weapon combos, the game can be beaten in many, many different ways. Viewer-decided gaming plot lines have always felt contrived to us, but not here. "Bioshock" pulls off the amazing trick of seeming to offer a different experience for each and every user.


The larger back story on this title is that this game is just one in a flashy new set of titles hitting stores over the next several months: "Assassin's Creed" and "Halo 3" are the major ones to watch. These titles promise to finally take full advantage of the latest in gaming hardware.


Even if the thought of video games gives you digital hives, be sure to at least get a glimpse of these new games. They promise to change our perception of what is a game, what is a movie … and, honestly, what is reality.


If You Are Going to Buy a TV for Thursday Night …


Football is finally back today, as Peyton Manning and the Colts and Reggie Bush and the Saints kick off the 2007 NFL season. Many - OK, we can say it - guys mark this most auspicious of sporting events with a new TV. The big news this fall is that there seems to be a real winner in the plasma/LCD wars. Liquid crystal displays have become the technology to beat not only for cheaper sets but for better ones as well.


Sharp has come to market with a beautiful line of ultraflat TVs - the AquosHD D64U line. These sets offer full 1080p resolution, side viewing angles of essentially 180 degrees, an impressive 2,000-to-1 contrast ratio, and all the necessary inputs and connectors - and, get this, they are only about 3 inches thick.


That makes the Aquos a flat-panel TV that is pretty much flat.


Better yet, this fabulous set in its 42-inch size lists for just $2,099, a price you should be able to beat at most stores. Yes, you will see a difference between the Aquos and a top-of-the-line plasma TV, say a Pioneer Elite, but at these prices, the difference is negligible. If you're looking for a better TV, the D64U is a downright steal.



Dialing for Dollars


Remember when we got excited over picture-mail or T9 text messaging? Yeah, ancient history. Nowadays you really have to do something impressive to raise a nerd's eyebrow. How about full-on banking on your cell phone? And this is not just a balance e-mailed to you in a text message. We're talking the full experience of an ATM, minus the people, cash dispenser and homeless guy asking for change.


Two big banks in the United States are testing the waters of delivering many of their features on mobile phones. CitiBank launched Citi Mobile earlier this year. The downloadable application lets you do most of your banking right from your cell phone. Check balances. Transfer funds. Let your credit card balance pile up by just paying the minimum - the works.


We know what you're thinking: With new cell phone hacks coming along every day, what kind of idiot would take a chance like that? CitiBank says security is excellent. The only hitch is you need a phone capable of running the app, so you are limited to more powerful units that can handle the code, though that list is growing.


Bank of America has also launched a phone banking application that is open to all phones that run the mobile Web.


So we figure the question becomes: Which bank will be the first to give us a mortgage over the phone?


Jonathan Blum and Dan Evans co-host "Strange New World," a weekly syndicated radio show. Blum hosts the blog






How to Recover (Almost) Anything - PC Matter


It's amazing how fast a single keystroke or mouse click can change your life. One false move, and bang! An hour's, day's, or even lifetime's work can slip away into digital oblivion. But not everything that disappears is lost forever. These tips will help you retrieve the seemingly irretrievable: from files long ago removed from the Recycle Bin, to hard drives you pronounced dead in years past, to text messages zapped from your cell phone's SIM card. Get it back, Loretta!


Recover a missing or deleted file: The file was there just a second ago--you'd swear to it! Before you panic and start shopping for a file-recovery program, make sure that you don't make things worse. If you're certain that you deleted the file, refrain from running any software that saves files to the hard drive, USB flash drive, or memory card that the file was stored on; doing so may overwrite recoverable data.


Begin by checking the obvious. If the file isn't in XP's Recycle Bin, click Start, Search and use Windows' 'When was it modified?' option (if you don't see this option, click View, Explorer Bar, Search and in the left pane select All files and folders). In Vista, choose Start, Search, click the down arrow to the right of Advanced Search, and select Date modified in the Date dropdown menu on the left. Look for any recently created, altered, or renamed files. If you find the one you're looking for, save it onto at least two different storage devices.
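If you'd rather script that search than click through the dialogs, a few lines of Python can list recently modified files (a minimal sketch; the root folder and 24-hour cutoff are our own choices, not from the article). Because it only reads file metadata, it writes nothing to the drive you're trying to protect:

```python
import os
import time

ROOT = r"C:\Users"    # folder to scan -- adjust as needed
CUTOFF_HOURS = 24     # report files touched within the last day

now = time.time()
for dirpath, dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            mtime = os.path.getmtime(path)  # last-modified timestamp
        except OSError:
            continue  # skip unreadable files (permissions, broken links)
        if now - mtime < CUTOFF_HOURS * 3600:
            print(time.ctime(mtime), path)
```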


If you come up empty, there's a good chance you can recover the file with an undelete utility. Two freebies--PC Inspector File Recovery and FreeUndelete--are well worth a try.


But what if you've accidentally reformatted a drive, for example? For situations where you need extra data recovery horsepower, QueTek's $49 File Scavenger offers many of the recovery capabilities of far more expensive programs. Meanwhile, Kroll Ontrack's $500 Easy Recovery Professional is the Cadillac of data recovery programs; it comes with Ontrack's high-powered data recovery tools and a suite of file repair utilities. Though it's too expensive for most individuals, it's not a bad investment for a small business or for a midsize company's IT department. Beware the fine print for Ontrack's stripped-down, $89 Easy Recovery Lite version, however; it allows you to recover only 25 files at a time--a major inconvenience if you have lots of data to recover.


Recover files from a dead or dying hard drive: Strange noises or an outburst of corrupted-file messages could very well portend the imminent failure of your hard drive. Copy important files to another drive or to a removable medium immediately. If you can't access some files that you simply must have, you may have to turn to an expensive data-recovery service such as DriveSavers. If you'd like to take a crack at restoring the files yourself (a much iffier proposition), watch our video, "How to Resurrect a Crashed Hard Drive".






Serious Games Making Maths More Exciting For KS3



Serious Games challenging us to play a better education


Via: Bowland - Maths case studies for Key Stage 3


A new initiative is being developed for KS3 maths, funded by the Bowland Trust with additional support from the DfES. It is aimed at developing thinking, reasoning and problem-solving skills in KS3 pupils. It will consist of an initial 23 case-study problems, each of which pupils will address through open questions; each problem can be explored at various depths.




The work is currently in its development stage, with 15 developers working on the 23 case studies. Each case study will include classroom materials together with supporting training materials for teachers specific to that case study. The formal launch will be in the spring of 2008, with the case studies available to schools for the 2008/9 school year.


A generic development package for the whole programme is also being developed which will be made available to teachers who would like to avail themselves of it. It is being designed to help teachers develop skills to make maximum use of the concepts behind the case studies.


Caspian Learning Unveils 3D Adventure Games

Caspian Learning has been awarded a contract by The Bowland Trust and the DfES to develop two case-study problems for UK schools designed to make Key Stage 3 maths more engaging.


The games, Exploring and Applying Algebra - The Velletri Scrolls and Murder at Mega Bank - Applying Mathematics and Proportional Reasoning Skills, are for potential use in the 2008/09 academic year.


The following is an overview of the two adventure games:


Murder at Mega Bank


A 3D learning-based game in which students must use proportional reasoning skills, forensic techniques and investigative methods to identify the culprit(s) of a crime - a daring and ambitious robbery and murder at Mega Bank. The case study will be accompanied by an easy-to-follow teacher's support pack.




Learners will begin with information gathering activities in phase one, where they examine the crime scene, identify the key actors in the scenario, identify key unknowns / eliminate irrelevant information and identify forensic and investigative methods to use to progress and test the unknowns.



In lessons 3 and 4, learners move on to an analysis and interpretation phase, where they test, analyse and interpret samples, applying forensic testing, mathematical modelling, explanation and illustration to assess the evidence and draw conclusions.


The Velletri Scrolls



The Da Vinci Code meets Lara Croft: The Velletri Scrolls is a 3D learning-based game where learners must assemble clues, unravel codes and solve problems to find the fictional riches of the Emperor Augustus.





In phase one, learners must gather evidence and tools to interpret maps, papers and decipher mathematical problems. In phase two, learners use the elements gathered in phase one, as well as analytical thinking and algebraic skills to solve problems that progress the scenario.





The case study supports different levels of difficulty, with the maths problems themselves admitting different solutions and more reward given for more sophisticated ones. This means that students can follow their own path through and are not precluded from progressing. The application will record any areas of difficulty and feed back to the students and teacher.


Initiative Overview




In September 2006, the Bowland Trust issued an open invitation to submit ideas for case studies that would help to develop in pupils the skills of thinking, reasoning, analysing, interpreting and problem solving. Caspian was awarded a contract to develop two case studies for the project, selected alongside 14 other companies out of over 200 proposals.




Caspian's technology is based on proven learning principles. The case studies use 3D learning-based avatar games where different thinking processes will be explicitly embedded within the gameplay and mapped closely to curriculum objectives.




Students will be measured automatically by the applications, giving teachers a precise performance measurement for each learner.




The case studies will use and complement mathematical topics within KS3. It will be for teachers to decide whether they wish to use one or more of the case studies in their teaching, and at what point in the Key Stage. The case studies will act as an extra resource that will stimulate pupils' interest to such an extent that teachers will actively want to use them.




Chris Brannigan, CEO, Caspian Learning, comments: "In developing these games we wanted to bring the excitement and glamour of adventure gaming into the world of maths, to reverse the common thought that it's a boring subject. It's vital pupils see maths as a relevant and interesting subject - the UK has relatively few A-level and undergraduate students in maths precisely because they've been turned off the subject at school. Our games will help teachers looking for that extra something to keep a class enthused."







Tech Market - Apple iPod




Palm-sized Nano

Leading up to Apple's grand unveiling of its refreshed iPod line, the chatter was all about the so-called "phat" iPod Nano. Turns out the "phat" Nano is anything but: Sure, it's wider than the previous slim Nano stick; but, its form is actually svelte, stylish, and lightweight. The new Nano is packed with more capabilities--namely, video playback and casual gaming--than its music-only predecessor. Plus, it carries a rated battery life of 24 hours for audio, and 5 hours for video--about enough to get you through the first two installments of The Pirates of the Caribbean series.


Fifteen Random Thoughts About the New iPods


Whenever I attend an Apple product launch, I know the drill: By the end of the day, I'll have a head full of random thoughts and questions regarding the stuff that was unveiled. As usual, I'll document 'em here for posterity.


First, though, a few plugs for other iPod-related content here on PC World--all of it courtesy of Melissa Perenson, our senior products editor. Here's Melissa's video report on the new 'Pods. Here's a slideshow she put together. And here are her thoughts on today's news.


Now for my iPod-related brain dump:


They really are beautiful. You won't see this until you see the new iPod lineup in person, but the industrial design is probably the best that Apple or anyone else in consumer electronics has ever done--they just look great. Especially the Nano: The change in dimensions not only accommodates the larger screen, but somehow makes the player positively endearing. Funny but true: There's one model called the iPod Touch, but the metal finishes on the Classic and Nano are the ones that make those two players feel as good as they look.


I'm reserving judgment on the new user interface. I've always liked the streamlined minimalism of the iPod UI. The new one as seen on the Nano and Classic is a departure, with a fair amount of graphical frippery--like cover art floating behind menus--that serves no great purpose. I'm not saying it's a mistake, but I'd want to live with it awhile before declaring it an improvement on the old one.


Bye Bye, Classic? Finally, the iPod that we think of when we think of iPods has a name--it's the Classic. That doesn't seem like a name you'd give a product you expected to sell forever--Coca-Cola Classic notwithstanding. I kinda wonder if Apple now thinks of the Touch as the flagship iPod, and if it won't be long until the Classic gives way to a Touch with a big honkin' hard drive or, conceivably, a ton of flash RAM.


A hundred and sixty gigs! For now, though, the high-end Classic's 160GB of space is pretty darn startling. (Normally at Apple events, I feel like I'm surrounded by people who'll ooh and aah at the most mundane of spec bumps; when Jobs unleashed this one, I was oohing and aahing with the best of 'em.) I wonder how many people will buy this model, and what percentage of them will immediately fill them up?


Rotating storage lives! The 160GB Classic certainly shows there's still a place for hard disks inside iPods--if Apple were to put 160GB of flash storage inside an iPod, it would have to charge several thousand dollars for it. I suspect, though, that by Fall 2008, most iPods will be solid-state, with one or two disk-based models left in the lineup.


Will the Touch succeed? Until now, there's been a logical progression of iPod models, from small, low-capacity, and cheap (Shuffle) to big, high-capacity, and relatively pricey (full-sized iPod). The Touch ends that clarity by being large, low-capacity, and relatively pricey. Will people spend $399 for an iPod that won't hold all their music? I'm not sure.


Is the Touch really a computer? I think Apple's being pretty savvy selling it as a media player and downplaying the fact it contains Safari--which means it can do just about anything you can do on the Web. (I'm thinking of the fact that devices like Sony's Mylo, which are in some ways similar to the Touch but sold on the strength of their computing and communications features, never seem to go anywhere.) However it's marketed, the Touch is the first phone-less iPod that can do a heckuva lot of things that have nothing to do with enjoying entertainment, and you gotta think that Apple is quietly but intentionally expanding the iPod's mission with this device.


What, no multi-touch iPod I can put all my music on? The most important product Apple didn't announce today--and the iPod I and a lot of other people want--is a model equipped with a big touchscreen and at least 80GB of storage space. I'm not entirely sure why one didn't show up--maybe it's hard to make one as thin as Mr. Jobs likes his music players--but it seems a safe bet that we'll get one within the next year, if not a lot sooner.


Will anyone turn the Touch into a Wi-Fi VoIP iPhone? Technically, it's probably doable without a huge amount of effort--you can make Skype calls on an iPhone, and there are plug-in microphones for other iPods. I'm sure someone will try, but I can't figure out whether Apple will consider it a laudable use of its device or a nefarious threat to iPhone sales.


When will we be able to download video on an iPod? The iTunes store you can get to from the Touch and iPhone is the iTunes Music Store. Movies and TV would eat up a lot more Wi-Fi bandwidth, but we'll presumably see them at some point.


Why no music sharing a la the Zune? iPod Touches (or is that iPods Touch?) apparently can't use their Wi-Fi connections to talk to each other. I'll bet Apple would never introduce a sharing feature as ridden with DRM-related gotchas as the Zune's "squirting," but I'm still curious whether it's trying to figure out a way to make sharing make sense.


Prediction winners and losers. Think Secret correctly predicted we'd get a touch-screen iPod (although it said it would likely have a hard drive) and nailed the new Nano. Not perfect, but not bad. VNUnet, however, flopped with its confident-sounding piece on iPods with HD radio. Pure fantasy, at least for now.


Starbucks' cup runneth over. I'm not a great audience for an extended discussion of the Wonders of Starbucks--I drink maybe one cup of coffee every two years--but I'll bet I'm not the only person in the audience who thought that Chairman Howard Schultz's presentation was interminable. (Especially given that Mr. Jobs himself kinda rushed through some pretty interesting stuff, like the new iPod user interface.) On the bright side, Schultz was a polished enough presenter to hold his own during a Jobs keynote, which you can't say about most of the other execs who manage to get on stage at these events. (Cue flashback to the debut of Motorola's ROKR phone.)


What's really behind the Starbucks-Apple partnership? The coffee kingpins are going to spend years--and, presumably, millions and millions of dollars--setting up the technology they need to let customers spend 99 cents to download the song they're listening to. You gotta think that there's a master strategy behind it all that's not apparent yet. (More than one person I talked to wondered why you won't be able to use an iPod to pay for your latte: Maybe you will someday.)


No John, No Paul, No George, No Ringo. This was approximately the 6,172nd Apple event preceded by pundits confidently predicting it would involve the announcement that Beatles music would be available for download. Jobs seemed to taunt us, even--his demos involved both solo Lennon and solo McCartney at various points. I was willing to believe that Paul was waiting in the wings at the Moscone Center up to the moment that Jobs bid us all farewell. But the iTunes Store remains Fab Fourless.


Those are my iPod-related thoughts and questions at the moment. Got any answers, rejoinders, or musings of your own?





Green light for human-animal embryo research


There has been a mixed reaction to a decision by British medical research regulators to allow, in principle, the creation of hybrid embryos from humans and animals. The Human Fertilisation and Embryology Authority has cleared the way for scientists researching degenerative diseases to inject human DNA into cow or rabbit eggs. The resulting stem cells from the hybrid embryos could lead to new treatments for conditions like Parkinson's and Alzheimer's.

Angela McNab, the Chief Executive of the HFEA, said: "Many people initially have a disquiet about this type of research, but once people understand much more about what's involved, they're able to focus more on what the potential benefits of the science are, and they feel much more comfortable about it."

But religious groups and other opponents say combining human and animal material breaks an absolute taboo.

Anthony Ozimic, from the Society for the Protection of Unborn Children said: "The hype surrounding hybrids is being promoted by those with a vested interest in the government's stem cell research fund, and yet again patients with degenerative conditions are being given false hope by the profit-hungry biotech industry."

Researchers still have to apply for licences to use the technique for specified medical projects. Two teams of British scientists have already done so.





Dark energy research urged for NASA


A proposed NASA mission to study a mysterious force thought to be accelerating the expansion of the universe should be the first in the agency's "Beyond Einstein" program to be developed and launched, the National Research Council recommended today.


Beyond Einstein is NASA's research roadmap for five proposed space missions set to begin in 2009 that will study areas in science that build on and extend the work of physicist Albert Einstein.


The missions include Constellation-X and the Laser Interferometer Space Antenna (LISA), which will measure X-rays and look for as-yet-undetected gravitational waves, respectively, as well as the Inflation Probe (IP), the Black Hole Finder Probe (BHFP) and the Joint Dark Energy Mission (JDEM).


The National Research Council report recommended that JDEM be the first mission to be deployed since it is already in the prototype phase and will require less development than the other missions.


"All of the mission areas in the Beyond Einstein program have the potential to fundamentally alter our understanding of the universe," said committee co-chair Charles Kennel of the University of California, San Diego. "But JDEM will provide direct insight into a key Beyond Einstein science question, and it is the most technically feasible option for immediate development."


'Lambda'


Dark energy is a mysterious force that scientists think is speeding up the expansion of spacetime and that constitutes some three-quarters of the energy density of the universe.


It was initially proposed by Einstein as a counterforce to the gravitational attraction of matter to explain why the universe appeared static, neither growing nor shrinking. But he later dismissed his idea as a mistake when observations by astronomer Edwin Hubble revealed the universe was in fact expanding.


Dark energy, which Einstein called lambda, was revived in the late 1990s when astronomers discovered that the universe was not only expanding, but expanding at an accelerated clip.


Gravity waves


The report also recommended that LISA become the flagship mission of the program and that more money be funneled into the project because it could provide an entirely new way of observing the universe. However, the report committee believes that more testing is required before it launches. Specifically, the mission must await the results of the LISA Pathfinder mission in 2009 that will test some of the critical technologies to be used in the final LISA mission.


The report was sponsored by the U.S. Department of Defense and NASA. The Research Council is the principal operating agency of the National Academy of Sciences and the National Academy of Engineering.


Dark energy facts.


In physical cosmology, dark energy is a hypothetical form of energy that permeates all of space and tends to increase the rate of expansion of the universe. [1] Assuming the existence of dark energy is the most popular way to explain recent observations that the universe appears to be expanding at an accelerating rate. In the standard model of cosmology, dark energy currently accounts for almost three-quarters of the total mass-energy of the universe.


Two proposed forms for dark energy are the cosmological constant, a constant energy density filling space homogeneously,[2] and scalar fields such as quintessence or moduli, dynamic fields whose energy density can vary in time and space. In fact, contributions from scalar fields that are constant in space are usually also included in the cosmological constant. The cosmological constant is thought to arise from the vacuum energy. Scalar fields that do change in space are hard to distinguish from a cosmological constant, because the change may be extremely slow.


High-precision measurements of the expansion of the universe are required to understand how the speed of the expansion changes over time. The rate of expansion is parameterized by the cosmological equation of state. Measuring the equation of state of dark energy is one of the biggest efforts in observational cosmology today.
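For readers who want the formula behind that sentence, here is the textbook parameterization (standard cosmology, not something spelled out in this article): a component with pressure p and density ρ has equation-of-state parameter w, and the Friedmann acceleration equation shows why w matters:

\[
w = \frac{p}{\rho c^2}, \qquad
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right)
                  = -\frac{4\pi G}{3}\,\rho\,(1 + 3w).
\]

Any component with w < -1/3 makes the right-hand side positive and so accelerates the expansion; a cosmological constant corresponds to w = -1 exactly, while quintessence models let w vary in time.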


Adding the cosmological constant to cosmology's standard FLRW metric leads to the Lambda-CDM model, which has been referred to as the "standard model" of cosmology because of its precise agreement with observations.


Nature of Dark energy.


The exact nature of this dark energy is a matter of speculation. It is known to be very homogeneous, not very dense, and not known to interact through any of the fundamental forces other than gravity. Since it is not very dense--roughly 10^-29 grams per cubic centimeter--it is hard to imagine experiments to detect it in the laboratory. Dark energy can have such a profound impact on the universe, making up 70% of all energy, only because it uniformly fills otherwise empty space. The two leading models are quintessence and the cosmological constant.
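The quoted density squares with a quick estimate from the critical density of the universe (our arithmetic, using the standard Friedmann relation and assuming H_0 ≈ 70 km/s/Mpc, neither of which appears in the article):

\[
\rho_c = \frac{3H_0^2}{8\pi G} \approx 9 \times 10^{-30}\ \mathrm{g\,cm^{-3}},
\qquad
\rho_\Lambda \approx 0.7\,\rho_c \approx 6 \times 10^{-30}\ \mathrm{g\,cm^{-3}},
\]

which is indeed of order 10^-29 grams per cubic centimeter.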









Why we have sex


In Channel Ten's new series Californication, a Los Angeles writer named Hank sleeps with five - or was it 10? - busty, cellulite-free women in the very first episode. And while he seems to be getting a healthy dose of casual sex, he still finds himself begging his ex-wife to take him back, lamenting that his life just isn't the same without her.


While bedding a bevy of women in the space of 30 minutes and wanting to ditch it all for his one true love is fittingly fantastical (it is Hollywood after all), it got me thinking about a recent survey disclosing the reasons why people have sex.


So why do we have sex? Is it for love? Intimacy? Pleasure? The answer isn't as simple as you once might have thought. Scientists have found that there are a whopping 237 reasons why we engage in the horizontal hanky-panky. And it wasn't so easy to find the answers either.


Researchers at the University of Texas spent a whole five years on the project, using their very own money to study the "why" behind sex.


The No. 1 reason? "I was attracted to the person," followed by - surprise, surprise - physical pleasure. In the No. 3 spot was to express love, followed by the need to feel desired and the quest to deepen the relationship. Then there are the less-common answers, which range from wanting a promotion to feeling closer to God. Hmph ...




The study, carried out by clinical psychology professor Cindy Meston and her colleague David Buss, surveyed 444 men and women aged 17 to 52. The researchers were surprised to find that it refuted many of the gender stereotypes - mostly "... that men only want sex for the physical pleasure and women want love," says Meston. "That's not what I came up with in my findings."


Instead, according to the scientists, the most surprising find was that women these days are doing it like men: purely for physical pleasure.


Psychologist Jo-Anne Baker says she's noticed a similar trend in her own practice. "Women want to feel pleasure as well as feeling empowered as a sexual woman," she says. "They want to enjoy sex for what it is: pleasure, expression, connection and affection, rather than doing it differently with the hope of a deep psychological connection."


She also says that animal attraction does play a major role and that our brains play a discerning role too. "If not, we would be having sex with everyone we are attracted to."


To find out more, I made a visit to the University of NSW to have a chat to Dr Rob Brooks from the Evolution and Ecology Research Centre. By his reckoning, sex feels good as a pay-off to encourage us to procreate. "If sex was cold and nasty all of the time, we would find that very few individuals would do it and we or one of our ancestors would have died out," he says. Indeed.


Are there any reasons not to? "The odds are overwhelmingly in favour of it, aren't they?" says Jed, my male mate, who provides an overwhelming source of sexual amusement and fodder for this column. And I think he might just be right ...


Why do you think people have sex? Love, lust, emotional attachment?


ASK SAM EXTRA: Kiss = Sex?


"Don't have sex man," advised comedian Steve Martin to blokes everywhere, saying: "It leads to kissing and pretty soon you have to start talking to them."


Yet a recent US study has found that most men kiss women in the hope of increasing the likelihood of having sex! And that men are less discriminating than women when it comes to deciding whom to kiss or have sex with. What do you think?





Advancing the utility infrastructure through automation - AMI 360


AMI 360: Beyond meter-to-bill.


Utilities worldwide are embracing a "Smart Grid" vision for the future, leveraging new infrastructure to improve asset management and optimize resource planning. This modernization and automation encompasses smart metering, system monitoring and control electronics, and information management - all operating holistically to achieve utility enterprise-wide benefits. Often, a key contributing element to this goal is Advanced Metering Infrastructure (AMI).


To realize the full range of potential benefits from this effort, it is critical that a full 360-degree view of possible device-level interactions, communications requirements and capabilities, and user-focused applications is examined and quantified.


KEMA has proven experience in helping utilities and other key stakeholders realize that AMI is more than advanced meters, communications, and back-office systems. Our team continually hones its skills with practical hands-on knowledge supported by robust analytical tools. We are ready to recommend strategies, evaluate options, and develop deployment plans that help utilities meet their current and future needs.


About AMI 360


Frequently, the primary utility AMI initiative focuses on mainstream applications such as meter-to-bill. While this is an essential and fundamental "through the meter" area, the expanded endpoint visibility provided by the infrastructure now makes it both reasonable and appropriate to examine services that can be provided behind and alongside the meter.


Through-the-meter services include:


· On cycle and off cycle billing


· Revenue protection


· Meter-centric Customer Services


· Meter to bill process efficiencies and streamlining



Behind-the-meter services include:


· Pre-payment systems


· Web presentment


· Appliance monitoring and control


· Programmable Communicating Thermostats


· Smart device management (e.g., plug-in electric hybrid vehicles)


Alongside-the-meter services include:


· Load forecasting


· Distribution engineering planning


· Secondary voltage monitoring and control


· Related utility telemetry applications (e.g., distribution automation, stray voltage)


· Net Metering


Meet KEMA's AMI 360 Team


Autovation 2007 - October 1 & 2, 2007 (Reno, NV)

KEMA is home to a global network of experts focused on helping utilities develop intelligent networks and communication systems to reach their vision of the future.


Our newly integrated team combines existing disciplines from advanced metering and meter data management, to intelligent network management, information systems integration, and network communications.


Key members of the Intelligent Networks and Communications team will be at the AMRA International Symposium - Autovation 2007 - in Reno, Nevada on October 1 and 2. Our experts will be available to discuss the full 360-degree view of AMI, including:


· Advanced Metering Infrastructure


· Protocols and standards


· Grid automation and intelligent control


· Enterprise-wide integration


· Telecommunications


Want to take part in the AMI 360 discussion?


· VISIT - Visit us at Autovation 2007 in the KEMA booth (#410)


· MEET - Schedule a one-on-one meeting at Autovation by contacting Rob Wilhite, our North American Intelligent Networks and Communications team leader


· LISTEN - Hear us speak at Autovation


· INFORM - Read and contribute to KEMA's complimentary newsletter "Automation Insight"


www.kema.com/strategic_metering


MIT research details parasitic battles




Work indicates impact on evolution.


Scientists at MIT and the Technion Israel Institute of Technology have for the first time recorded the entire genomic expression of both a host bacterium and an infecting virus over the eight-hour course of infection.


The work, reported in the Sept. 6 issue of Nature, likely will encourage scientists in several fields to rethink their approach to the study of host-virus systems. Such systems are believed to play a key evolutionary role by facilitating the transfer of genes between species.


Professors Debbie Lindell of the Technion and Sallie Chisholm of MIT and co-authors say that their study of a system involving the marine bacterium Prochlorococcus leads them to speculate that viral infection may play a role in shaping the genetic repertoire of families of bacteria, even though individual infected bacteria die.


This could indicate that the meeting between a marine bacterial host and its virus may not be just a battle between two individuals, but an evolutionarily significant exchange that helps both species become more fit for life in the ocean environment.


"The current status of host-virus relations has been influenced by a rich history of interactions," said Lindell, who conducted the research as a postdoctoral associate in Chisholm's lab before joining the Technion faculty in late 2006. "While we can't definitively pin down the sequence of past co-evolutionary events, our findings suggest a novel means through which the exchange of beneficial genes between host and virus have been triggered."


And, because the pattern of genomic expression in this host-virus system differed significantly from that in the more commonly studied system of intestinal bacteria such as E. coli and a virus called T7, the research will likely lead to increased appreciation for the need to study diverse types of marine bacteria, rather than relying on a single system as a broad model.


"We hope this work will encourage scientists to explore a wide range of host-pathogen systems and thus lead to a significant broadening of our understanding of the diversity of the host-pathogen interactions existing in nature," said Chisholm, one of the discoverers of Prochlorococcus in 1985.


In previously studied host-virus systems, a virus hijacks the bacterial host cell and shuts down genome expression immediately, preventing the bacterium from conducting its own metabolic processes. The attacking virus redirects expression to its own genome and activates the genes beneficial for its activity.


But uncharacteristically, in the system of Prochlorococcus and virus P-SSP7, an unprecedented 41 of the bacterium's 1,717 genes were upregulated. That is, the researchers detected increased quantities of the messenger RNA encoded by these genes in the cell during the infection process. The upregulation of so many host genes during infection is a phenomenon unseen before in the world of bacteriology.


Lindell and Chisholm believe the most plausible scenario to explain this is that the bacterium activates certain genes in response to infection as a means of self-protection. The virus has "learned" to use those genes to its own advantage and so incorporates them into its own genome. Later, when infecting another bacterium, the virus upregulates those genes itself to facilitate its own reproduction within the host bacterium. When a bacterium survives an infection, those viral modified genes are incorporated back into the bacterial DNA, making that bacterium and its descendants more likely to survive in the harsh ocean environment.


Other MIT authors are graduate students Gregory Kettler and Maureen Coleman, postdoctoral associate Matthew Sullivan, and Jacob Jaffe of the Broad Institute. Additional authors are from Humboldt University, Harvard Medical School, and the University of Freiburg.


Funding for this research came from the Department of Energy, the Gordon and Betty Moore Foundation, and the National Science Foundation.





Adult brain can change, study confirms


Work could aid interventions following stroke.


It is well established that a child's brain has a remarkable capacity for change, but controversy continues about the extent to which such plasticity exists in the adult human primary sensory cortex.


Now, neuroscientists from MIT and Johns Hopkins University have used converging evidence from brain imaging and behavioral studies to show that the adult visual cortex does indeed reorganize--and that the change affects visual perception. The study appears online Sept. 5 in an advance publication of the Journal of Neuroscience.


The authors believe that as scientists find ways to use this adaptive ability, the work could have relevance to topics ranging from learning to designing interventions for improving recovery following stroke, brain injury, or visual disorders.


Animal studies conducted two decades ago using single-cell recording of neurons found that the adult animal brain can change, but shed little light on the adult human brain. In 2005, a functional magnetic resonance imaging (fMRI) study led by Professor Nancy Kanwisher at the McGovern Institute for Brain Research at MIT found evidence of plasticity in the visual cortex of adults with macular degeneration, an eye disease that deprives regions of the cortex of visual information.


But another fMRI study of macular degeneration found no such evidence, and an animal study using both single cell recordings and fMRI also questioned the 20-year-old animal work.


Lead author Daniel Dilks, a postdoctoral associate in Kanwisher's lab who conducted the current work while a graduate student at Johns Hopkins in senior author Michael McCloskey's lab, jumped into the fray when he found BL, a stroke patient.


BL's stroke damaged the optic radiation fibers, which transmit information from the eye to the primary visual cortex, but the cortex itself remained intact. The damage eliminated input from the upper left visual field to the corresponding region of the primary visual cortex, thereby depriving a region of cortex and creating a blind area in the upper left visual field.


The researchers wanted to find out what happened to that deprived piece of cortex. "We discovered that it took on new functional properties, and BL sees differently as a consequence of that cortical reorganization," explains Dilks.


BL had reported that things "looked distorted" in the lower left visual field (below his blind area). The researchers hypothesized that the distortions resulted from cortical reorganization in the deprived cortex. To isolate that distortion, they had BL fixate on a center dot while objects, such as squares, appeared in various parts of the visual field. As expected, BL saw nothing when a square appeared in his blind area.


But when the square appeared just below the blind area, he perceived the square as a rectangle extending upwards into the blind area. Likewise, he saw triangles as "pencil-like", and circles as "cigar-like".


Subsequent fMRI studies confirmed that the visually deprived cortex (representing the upper left visual field) was responding to information coming from the lower left visual field. The deprived cortex assumed new properties, a hallmark of plasticity, and that explained the visual distortions.


Dilks is continuing this work in postdoctoral studies in Kanwisher's lab. In addition to Michael McCloskey, John Serences of the University of California, Irvine, and Benjamin Rosenau and Steven Yantis, both of Johns Hopkins, coauthored the Journal of Neuroscience paper. An Integrative Graduate Education and Research Traineeship and a Graduate Research Fellowship, both from the National Science Foundation, and the NIH funded the Johns Hopkins work.





Intel's Sweet On Virtualization's Future With Quad-Core Xeon MP


Intel on Wednesday took the wraps off six quad-core Xeon processors it hopes will entice companies to expand their collection of virtualization servers without having to worry about software upgrades for a couple of years.

The 7300 series of Xeon chips -- bundled under the Tigerton family name -- was designed for multiprocessor servers, a staple of database, enterprise resource planning, and Java enterprise applications. The product line completes Intel's expandable offerings for servers that scale up to 32-way systems.


The processors include frequencies up to 2.93 GHz at 130 watts, several 80-watt processors, and one 50-watt version, running at 1.86 GHz, optimized for four-socket blades and high-density rack form factors. The chips are available for ordering now. Prices range from $856 to $2,301 in quantities of 1,000.


In addition to the core architecture, the processors also are equipped with a chipset package designed for expanding virtualization environments. The chipsets are augmented by what Intel calls its "dedicated high-speed interconnect." Unlike Intel's front-side bus connection that shares data pathways, the updated interconnect links a single processor to a single chipset.


Also native to the new Xeon 7300 chips is Intel's attempt to make a series of chipsets socket compatible. Dubbed "VT FlexMigration," the design will make the Xeon 7300 series consistent with Intel's other multi-processor chips through at least its next-generation 45-nanometer process core micro-architecture, code-named Nehalem, which is expected in late 2008.


"With FlexMigration, if you are going to be buying servers over a period of time, you can add new servers to the system in a virtual environment, and not worry about migration to future hardware," Tom Kilroy, Intel VP and co-general manager of the Digital Enterprise Group, told InformationWeek. "That means, if your Caneland system is about to fail, you can convert a whole machine over to a server running a Nehalem chip without shutting down applications."


The future migration and virtualization issue became apparent as previous versions of VMware software had to be upgraded along with updated hardware. Going forward, that is expected to be less of an issue, as VMware and Intel have worked together to optimize VMware ESX Server on the Xeon 7300, said VMware VP of R&D Stephen Alan Herrod.


VMware and Intel are expected to announce further collaboration in the next two weeks, when each hosts a developer's conference.


Intel is optimistic about the market adoption of the Xeon 7300 series, as a number of white-boxes have been shipping to the company's partners since June for internal specification testing, Kilroy said.


The latest addition to Intel's stockpile of server partners, Kilroy noted, was Sun Microsystems, which has certified its hardware for Intel x86 processors.


"Sun brings along all of their ISVs [Independent Software Vendors],'' he said, "as well as the Solaris operating system and thousands of applications that operate on it."








Facebook Exposes Users To Search Engines







Facebook is opening up, for better or worse.

Internet users can now search for Facebook members on the Facebook site without logging in.


"Starting today, we are making limited public search listings available to people who are not logged in to Facebook," said Facebook engineer Philip Fung in a blog post on Wednesday. "We're expanding search so that people can see which of their friends are on Facebook more easily. The public search listing contains less information than someone could find right after signing up anyway, so we're not exposing any new information, and you have complete control over your public search listing."


In a few weeks, these "public search listings" will be made accessible through Internet search engines like Ask, Google (GOOG), MSN Live, and Yahoo (YHOO). Fung said that "this will help more people connect and find value from Facebook without exposing any actual profile information or data."


Fung maintains that Facebook users who do not want public search listings can indicate as much on Facebook's Search Privacy page.


As Danny Sullivan at Search Engine Watch and others have noted, Facebook profiles that are linked to from outside Facebook have been accessible through search engines for months. Google currently lists 25,000 Facebook profiles. Sullivan said he believes that today's announcement reflects a change in the default setting of Facebook's privacy controls from "Restricted" to "Everyone" as Facebook optimizes its listings for external indexing and access.


For Facebook, the changes mean more traffic. For users, that translates into less privacy, unless they opt-out.






Why Blu-Ray Should Never Have Existed: Technology Lesson Learned


A number of experts are pointing out why Blu-Ray is a mess. In hindsight, Blu-Ray should never have existed. Looking back at what happened with this technology can help us avoid similar mistakes in the future with a variety of products.


Today I'd like to cover the warning signs and point to Blu-Ray as the current example of a problem product that can crop up in any company, from IBM to Microsoft.


Now, I know a lot of people still believe Blu-Ray is winning (though that number declined sharply after Paramount and DreamWorks jumped ship), but if you really step back, you'll realize all it is doing is ensuring HD-DVD doesn't win either, and the impact of that on the movie industry has to be in the billions.


Danger Sign One: It Can't Stand on Its Own


I've seen this over and over again, and am surprised more of us don't point this out. If a product requires substantial support from the parent to keep it alive, including funding levels that probably can't be reasonably recouped, it has a very high likelihood of failing.


Successful products generally need some boost in terms of marketing and backing, but if they need sustained investment over long periods of red ink, at some point there is likely to be an executive change, and the new guy will immediately realize that the product needs to be killed.


We saw this years ago with OS/2. The level of investment was unprecedented, and Louis Gerstner, the CEO brought in to turn IBM around, was maneuvered into publicly promising to support it indefinitely so that he wouldn't kill it out of hand. Shortly thereafter, he quietly killed funding for the offering, leaving a lot of companies that had listened to the empty promise hanging in the wind.


With Blu-Ray, the warning sign was the tie-in to the PlayStation 3, which was the big crutch for the product. I was just as blind to this early on as everyone else, and didn't realize until too late that rather than the PlayStation assuring the success of Blu-Ray, Blu-Ray assured the failure of the PS3.


A product has to hit on three vectors: it has to work to expectations, it has to be something people want, and it has to be affordable. The fact that it wasn't affordable killed not only Blu-Ray but effectively took out the PS3 in the process. Without the PS3, Blu-Ray couldn't beat HD-DVD, which had no similar crutch and advanced into the market much more easily (and shipped much earlier).


Danger Sign Two: Key Competitive Advantage Unimportant


In the case of competing technologies, there are advantages and disadvantages between the products. For instance, in the case of Windows vs. Linux on the desktop, the key advantage is that Linux is open source, which, to the average Windows user, is not only unimportant but, when explained, might actually scare them away from the offering. Apple, on the other hand, is providing advantages consumers at least want, and is showing considerable success at the moment.


For Blu-Ray, the big advantages seem to be capacity and special features (something HD-DVD shared). On capacity, the reality was that you really didn't need as much as Blu-Ray offered for movies; since game developers (most of them) develop for several platforms, they were limited to standard DVD capacities, anyway. For backup, initially they had an argument, but with the growth of storage and the speed of writing to optical discs (which is very slow), both HD-DVD and Blu-Ray became impractical as backup and transport media for PC files. Portable hard drives are cheaper, easier to use (all you need is a USB port, not another Blu-Ray drive at the other end), vastly faster, and actually more portable.

For special features on Blu-Ray or HD-DVD movies, folks simply didn't care. They just wanted to watch the movie. So arguing over who had the best features quickly became a waste of time.


So Blu-Ray may have been better, but if the buyer doesn't care, it doesn't make any difference, and the vendors who haven't yet learned that lesson are way too prevalent.





Microsoft's Live Installer Coming this Week


Bloggers are responding to the long-expected news that Microsoft is releasing an installer this week for its Live-branded Web applications with a general nod; we knew it was coming, but it's not a market-changing event.


TechCrunch flippantly referred to the installer as a "thingy" in its headline and questioned the hyperbolic (imagine that) coverage at the New York Times, which compared Redmond's belated drive into Web-delivered software to its monopolistic assault on Netscape more than a decade ago.


TechCrunch's Michael Arrington writes:



The important new web services are all browser based, and Microsoft has no competitive advantage over offerings from Google, Yahoo, AOL and thousands of new web startups all trying to move users away from the desktop.



The NY Times piece says some components of the Live app family will be free (e-mail, photos and a blog-writing tool); others, like security, will carry a fee. We were unable to determine this morning whether the Live installer will be pushed via Windows Update - which would definitely be a competitive leg up for Microsoft.


Coverage of the Microsoft/Google arch-rivalry is always a bit convoluted - primarily because the companies' own strategies are often so fragmented. This quick note at Big Mouth Media points out that Google recently began distributing StarOffice for free. Of course, that's a traditional desktop client - exactly the kind of software Google is supposed to be driving to extinction.




Technorati :

3-D Fruit Fly Images Will Contribute to Brain Research


Side on: a 3-D image of a fruit fly, generated using optical projection tomography after first bleaching the fly's exoskeleton. Different organs can be clearly seen. The images mean scientists no longer have to dissect flies by hand to observe how genetic changes influence the loss of brain cells. (Credit: Public Library of Science)


The fragile head and brain of a fly are not easy things to examine, but MRC scientists have figured out how to make the task a little simpler - and they hope their research will shed light on human disease.


Using an imaging technique originally developed at the MRC Human Genetics Unit, called optical projection tomography (OPT), they have generated startling 3-D images of the inside of a fruit fly for the first time. The OPT images could help to speed up genetic research into Alzheimer's and other human diseases that affect brain cells.


Dr Mary O'Connell of the MRC Human Genetics Unit who led the research explained: ''Neurodegeneration, the gradual loss of function of brain cells that occurs in Alzheimer's, Parkinson's and motor neurone diseases, isn't a strictly human phenomenon. Insects are affected by it too. In the autumn, bees and wasps often develop erratic behaviour before they die.''


Because the fruit fly (Drosophila melanogaster) and humans share many genes with similar functions, the fly is widely used by genetic researchers to study how genes influence human disease.


''It's already known that defects in the equivalent fly genes involved in human brain diseases cause brain cells in fruit flies to lose function as they age,'' Dr O'Connell continued.


OPT could help researchers to look at how the fly brain changes in response to alterations in the normal activity of a specific gene without the risk of damaging tissue through dissection.


In a paper published in the September 5 issue of the journal PLoS ONE, the team describes how it has already used the technique to image individual cavities within the brain of an ageing fly and to watch the brain deteriorate.


MRC PhD student Leeanne McGurk who captured many of the OPT images explained why the technique works: ''The dark colour of the fly exoskeleton prevents us from seeing inside it using a standard light microscope. In the past this has meant scientists have had to tease apart fruit fly tissues by hand -- a laborious process. Now, we have got over the problem by bleaching the fly exoskeleton. When the fruit fly becomes colourless it is possible to use imaging techniques not only to view its internal organs but to generate 2D and 3D images of the entire fly. ''


Using OPT images in this way will allow scientists to visualise where and how the products of selected genes are present in the fly. These patterns of gene expression, as they are known, will help to identify genes that control parts of the central nervous system and so provide detailed information about the human brain.


Bleaching of the exoskeleton to clear away the colour also allows images to be generated using other microscopic techniques that depend on penetration of light.


Dr O'Connell concluded: ''This research is not simply limited to the study of conditions like Alzheimer's but can also be used to study fly anatomy. The shape and size of organs can be affected by diseases like diabetes so imaging may yield clues to further our understanding of other conditions too.''


The team, including Dr. Liam Keegan of the MRC Human Genetics Unit in Edinburgh, collaborated with scientists working on the Systems Biology Program at the Centre de Regulacio Genomica in Barcelona, Spain.






Technorati :

Nanotech initiative aims to reduce cost, power usage of embedded microchips


Houston's Rice University and Singapore's Nanyang Technological University announced on Sept. 4 an initiative dubbed the Institute for Sustainable Nanoelectronics (ISNE), a joint effort aimed at lowering the cost and power consumption of embedded microchips with nanoscale solutions. ISNE is being funded with $2.6 million in seed money from NTU and is based at the Singapore institution.
The centerpiece of the initiative is the probabilistic CMOS (complementary metal-oxide semiconductor) chip invented by Rice researcher Krishna Palem, the architect of ISNE. PCMOS chips can tolerate nanoscale defects, offering a tunable numerical precision that trades off errors for lower power consumption. Last year Palem demonstrated a cell phone display [http://www.eetimes.com/showArticle.jhtml?articleID=193600408] in which no appreciable difference in picture quality could be detected by the naked eye, even when the PCMOS was tuned to use five times less power than conventional embedded chips.
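

To make that "tunable precision" idea concrete, here is a minimal, purely illustrative Python sketch -- not Palem's actual hardware design -- in which the low-order bits of a 16-bit value are treated as unreliable in exchange for an assumed energy saving, so the error/energy knob can be turned and its effect measured (all costs and probabilities are made-up illustration values):

    import random

    WIDTH = 16  # bit width of the toy datapath

    def noisy_value(value, unreliable_bits, error_prob):
        # Hypothetical PCMOS-style model: each of the low-order
        # `unreliable_bits` bits flips with probability `error_prob`;
        # the remaining high-order bits are computed reliably.
        for bit in range(unreliable_bits):
            if random.random() < error_prob:
                value ^= 1 << bit
        return value

    def relative_energy(unreliable_bits, cheap=0.2, full=1.0):
        # Assumed toy energy model: an unreliable (undervolted) bit
        # costs `cheap` units versus `full` for a reliable bit.
        reliable_bits = WIDTH - unreliable_bits
        return (reliable_bits * full + unreliable_bits * cheap) / (WIDTH * full)

    random.seed(0)
    samples = [random.randrange(1 << WIDTH) for _ in range(10000)]
    for k in (0, 4, 8):
        err = sum(abs(noisy_value(v, k, 0.1) - v) for v in samples) / len(samples)
        print("unreliable low bits=%2d  energy=%.2f  mean abs error=%.1f"
              % (k, relative_energy(k), err))

Turning the knob (larger k) lowers the modeled energy while the mean error stays confined to the low-order bits, which is the trade-off the cell phone display demonstration exploited.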


ISNE will capitalize on the fact that today's graphics chips are over-engineered for small screens, and that the brain's tolerance of less-than-perfect images lets PCMOS and similar nanoscale technologies harness defects to reproduce visually indistinguishable results at lower cost and power. The ISNE charter also calls for platform-independent design methodologies, so that other nanotechnologies, such as photonics, can similarly benefit from trading off precision for lower cost and power.


Palem will direct the work of the International Network of Excellence - a team of computing experts from institutions including Rice, NTU and the Georgia Institute of Technology - from Rice, where he recently moved from Georgia Tech. ISNE will also partner with Rice's new Value of Information-based Sustainable Embedded Nanocomputing Center, which Palem recently established with seed funding from Rice.



More on power savings


Chip scheme saves power


Portland, Ore. -- As CMOS integrated circuits shrink, parameter variations--slight differences among transistors that were designed to be identical--increasingly plague semiconductor designers. At the same time, the ability to cram in more and more transistors is making chips hotter. Now a new architectural technique called probabilistic systems-on-chip offers a comprehensive framework for solving both problems at once.



"Platforms that use our architecture will have a new design parameter--the probability of errors--which can be traded off for significant energy savings," said Krishna Palem, an EE professor at the Georgia Institute of Technology and founder of the university's Center for Research in Embedded Systems and Technology. "The key to our approach is a novel voltage-scaling scheme called biased voltage scaling, which can trade off energy consumption for signal-to-noise ratio in a well-defined manner."



Palem recently designed a probabilistic arithmetic unit for computing a fast Fourier transform (FFT) for a synthetic-aperture radar application. The work won the Best Paper award at last month's CASES 2006 conference in Seoul, South Korea.



Now Palem, working with Nanyang Technological University in Singapore, has fabricated the first hardware chip demonstrating the probabilistic CMOS (PCMOS) technology. PCMOS treats parameter variations in devices, and the error-prone circuit behaviors they produce, as noise.



Palem began his work with applications that are inherently probabilistic anyway. "We have already shown enormous gains--300 to 400 times--for applications that exhibit a natural need for probabilities, such as neural networks, Bayesian reasoning networks, pattern recognition, speech recognition and the like," he said.



Because device-to-device parameter variations will be unavoidable at the nanoscale, chip designers can expect all semiconductor behavior to become probabilistic by 2016, according to the International Technology Roadmap for Semiconductors. To prepare for that future, Palem has begun developing probabilistic approaches to ordinary applications.



"Now we are looking at applications that operate on audio and video data streams, but which are not normally associated with probability--such as signal processing, filters and Fourier transforms," said Palem. "In this way, we have extended our probabilistic approach to encompass conventional computational primitives such as adders and multipliers."



To realize the probabilistic system-on-chip architecture, biased voltage scaling treats transistors differently, depending on where they are being used in a circuit. In the synthetic-aperture radar application, for example, "Error in the output of our probabilistic adder manifests itself as degradation in the signal-to-noise ratio of the radar image that is reconstructed by the FFT algorithm," said Palem. "In return for this degradation, which is visually indistinguishable from a reconstruction using conventional approaches, we realized a 5.6x energy savings."



Today, voltage scaling is used to lower chip power consumption by reducing operating voltages. Because the dynamic power of a CMOS circuit scales roughly with the square of its supply voltage, lowering the operating voltage from 3.3 to 1.1 volts would cut dynamic power roughly ninefold. Unfortunately, lowering the operating voltage that close to the noise floor also increases the probability of errors in the circuitry.



Palem's approach solves that problem by biasing voltage scaling toward parts of a circuit where errors can be tolerated. For instance, in an adder or multiplier, the operating voltage is dropped all the way to 1.1 V only for the least-significant bits, and is correspondingly increased for each more-significant bit; the most-significant bit is kept at full operating voltage.
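

As a rough sketch of that per-bit scheme (with assumed voltages and an assumed error model -- the real silicon behavior is obviously more complex), the Python model below runs each bit of an 8-bit adder at its own supply voltage and makes a bit's flip probability grow as its supply approaches the noise floor:

    import random

    WIDTH = 8
    FULL_V, MIN_V, NOISE_V = 3.3, 1.1, 0.9  # assumed supply/noise levels (volts)

    def bit_voltage(bit):
        # Biased scaling: the least-significant bit runs at MIN_V, and the
        # voltage rises with significance until the MSB gets FULL_V.
        return MIN_V + (FULL_V - MIN_V) * bit / (WIDTH - 1)

    def flip_probability(v):
        # Assumed error model: flips become likelier as the supply
        # approaches the noise floor; at full voltage the bit is reliable.
        return max(0.0, 0.2 * (1.0 - (v - NOISE_V) / (FULL_V - NOISE_V)))

    def biased_scaled_add(a, b):
        exact = (a + b) & ((1 << WIDTH) - 1)
        noisy = exact
        for bit in range(WIDTH):
            if random.random() < flip_probability(bit_voltage(bit)):
                noisy ^= 1 << bit
        return noisy

    # Errors concentrate in the cheap low-order bits, so the numerical
    # damage stays small even though most of the adder is undervolted.
    random.seed(1)
    diffs = [abs(biased_scaled_add(a, a // 2) - ((a + a // 2) & 0xFF))
             for a in range(170)]
    print("mean absolute error: %.2f" % (sum(diffs) / len(diffs)))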



"Using this approach, we have found that the amount of energy saved is seven times more than the lower signal-to-noise ratio we traded in to achieve it," said Palem.



The demonstration chip, which is currently being packaged, has an input for a knob that can be twirled to adjust the trade-off between energy consumed and signal-to-noise ratio.



Next, the Georgia Tech researchers plan to incorporate the delays in on-chip signal paths as a source of probabilistic behavior. By incorporating probability calculations into carry-skip adder design, for instance, entire sections of carry-propagation logic could be eliminated while the resulting errors are kept in check.
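

The carry-skip structure itself is easy to model. In the deterministic Python sketch below, bits are grouped into blocks; if every bit in a block would merely propagate an incoming carry, the carry bypasses the block's ripple chain. The probabilistic elimination of propagate logic described above is future work, so it is only flagged in a comment:

    def carry_skip_add(a, b, width=16, block=4):
        result, carry = 0, 0
        for start in range(0, width, block):
            bits = range(start, min(start + block, width))
            if all(((a >> i) ^ (b >> i)) & 1 for i in bits):
                # Every bit propagates: the carry skips the block unchanged,
                # and each sum bit is (a_i XOR b_i XOR carry) = 1 XOR carry.
                # A probabilistic variant would, roughly, drop some of this
                # propagate-detection logic and accept a small chance of a
                # wrong carry in exchange for less hardware and energy.
                for i in bits:
                    result |= (1 ^ carry) << i
            else:
                # Otherwise ripple the carry through the block bit by bit.
                for i in bits:
                    ai, bi = (a >> i) & 1, (b >> i) & 1
                    result |= (ai ^ bi ^ carry) << i
                    carry = (ai & bi) | (carry & (ai ^ bi))
        return result & ((1 << width) - 1)

    # Sanity check against ordinary addition (mod 2^16).
    assert all(carry_skip_add(x, y) == (x + y) & 0xFFFF
               for x in range(0, 600, 7) for y in range(0, 600, 11))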







Technorati :

ISO Rejects Microsoft's XML


Microsoft has failed in its attempt to have its Office Open XML document format fast-tracked straight to the status of an international standard by the International Organization for Standardization.


The proposal must now be revised to take into account the negative comments made during the voting process.


Microsoft expects that a second vote early next year will result in approval, it said Tuesday. That is by no means certain, however, given the objections raised by some national standards bodies.


Dual Requirements
A proposal must pass two voting hurdles in order to be approved as an ISO standard: it must win the support of two-thirds of voting national standards bodies that participated in work on the proposal, known as P-members, and also of three-quarters of all voting members.


OOXML failed on both counts, ISO announced, as the working day ended in its Geneva offices.


The proposal won the support of 74 percent of voting members -- just shy of the required 75 percent. But only 53 percent of the voting P-members supported it, well short of the required 67 percent.
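

As a worked check of the two hurdles using the figures reported above (a toy sketch of the rule as described in this article, not of ISO's full procedure):

    def iso_fast_track_passes(p_member_share, overall_share):
        # Hurdle 1: at least two-thirds of voting P-members in favor.
        # Hurdle 2: at least three-quarters of all voting members in favor.
        return p_member_share >= 2.0 / 3.0 and overall_share >= 0.75

    # OOXML's September 2007 ballot as reported: 53% of P-members and
    # 74% of all voting members in favor -- both hurdles missed.
    print(iso_fast_track_passes(0.53, 0.74))  # False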


Many of the national standard bodies voting against the OOXML proposal accompanied their votes with comments on what must be changed before they will vote in favor. ISO committee JTC1 must now reconcile those objections with the text, and find a compromise that will win enough votes to get through.


That will be difficult: the French Association for Standardization, Afnor, wants to split the proposal into two parts -- a "core" part, which it wants to see converged over the course of three years with the competing Open Document Format (ODF), already an ISO standard, and an "extensions" part dealing with compatibility with legacy documents in proprietary formats.


France is not alone in suggesting modifications to the standard: Brazil raised more than 60 objections, including issues of support for different languages and date formats, while the standards body in India was concerned that OOXML is incompatible with the ODF standard.


Costly Loss
Microsoft could miss out on revenue from the lucrative government market if OOXML is also rejected next year. Some governments, worried that the need for access to electronic archives held in proprietary formats leaves them hostage to their software vendor, have mandated the use of document formats that comply with open international standards.


Others are considering such a move, which could put Microsoft at a double disadvantage against open-source products such as OpenOffice.org, which not only store files natively in the Open Document Format but are also free.


Frederic Couchet, spokesman for APRIL, the French Association for the Promotion and Research of Free Computing, supported Afnor's suggestion of merging parts of OOXML into ODF.


"The OOXML format contains significant design flaws," and it will be difficult to correct them "other than by starting again from scratch, or by enriching the already existing standard, Open Document Format," he said Tuesday.


Industry body CompTIA said it was disappointed by the result, but echoed Microsoft's belief that the proposal will find enough supporters in the second round of voting. Microsoft is among the members of CompTIA; OOXML opponents Sun Microsystems Inc. and IBM Corp. are not.


Microsoft's Challenge
The ODF Alliance said that while Microsoft has every right to seek the ISO label for its format, "the ballot results show it has a long way to go before it earns it and can be considered a truly open, interoperable document format."


Office Open XML began as the default document format used by Microsoft's Office 2007 productivity suite. The company submitted the specification to ECMA International, an association of computer industry manufacturers, which modified it slightly and published it as the ECMA-376 standard before submitting it to ISO for fast-track approval as an international standard.





Technorati :
