
Thursday, April 10, 2008

Is there any possibility? Could the Large Hadron Collider destroy the universe?

Back in the old days before terrorists and tsunamis, it was always the labcoat-wearing mad scientists who were going to destroy the Earth in one of their crazy experiments. In the movies, scientists were always the ideal scapegoats - bald apart from two tufts of hair, certifiably mad, and without a friend in the world. Well, scientists are back in the firing line again. As the Large Hadron Collider (LHC) in Europe gets closer to completion, there is a paranoid groundswell telling the world that this science experiment will unleash uncontrollable forces, wreck the planet and kill us all.

The LHC has actually been designed to answer some of the big questions in physics, such as: what is mass? There are so many questions. For example, think about some of the sub-atomic particles, such as electrons and quarks. They are just points - they have no size. And in between them, there is a vacuum. So these sub-atomic particles are mathematical figments floating in nothing. Even worse, these mathematical figments have weird properties like charge and mass. The LHC, when it switches on in late 2008, will help solve some of these crazy mysteries. To do this, it will recreate some of the titanic energies found immediately after the Big Bang.

The 'Large' in LHC is not an exaggeration - it's enormous. It's an underground tunnel, shaped like a ring, that straddles the border of Switzerland and France. Over 2500 scientists from 37 countries are labouring to build just one of its four detectors - which, by itself, contains more iron than the Eiffel Tower. The LHC will generate so much raw data that if it were stored on CDs, the stack would grow at 1.6 kilometres per month. The project will employ about half of all the particle physicists in the world.
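
(As a rough sanity check of that CD-stack figure, here is a quick back-of-the-envelope calculation; the 700 MB capacity and 1.2 mm thickness of a standard CD are my assumed values, not numbers from the article.)

```python
# Rough estimate of the data rate implied by "a CD stack growing at 1.6 km per month",
# assuming a standard 700 MB disc about 1.2 mm thick (both assumed values).
CD_CAPACITY_MB = 700.0
CD_THICKNESS_MM = 1.2
STACK_GROWTH_MM_PER_MONTH = 1.6e6  # 1.6 kilometres expressed in millimetres

cds_per_month = STACK_GROWTH_MM_PER_MONTH / CD_THICKNESS_MM
terabytes_per_month = cds_per_month * CD_CAPACITY_MB / 1e6

print("CDs per month: about %.0f" % cds_per_month)           # ~1.3 million discs
print("Data per month: about %.0f TB" % terabytes_per_month)  # ~900 TB, close to a petabyte
```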

Its name tells you what it does. 'Hadrons' are microscopic particles (such as the protons or neutrons in the core of an atom) that, in turn, are made of even smaller sub-atomic particles. The LHC will collide beams of protons together. And hopefully the products of the collision will include the long-sought Higgs Particle, which is thought to endow everything in the universe with the strange property we call 'mass'.

The protons will travel at 99.99991% of the speed of light through pipes 27 kilometres in circumference, buried 50-100 metres underground. At that speed, the beam of protons will collectively carry the energy of an express train. They will be kept travelling in a curved path by the largest array of superconducting magnets ever built, cooled by 130 tonnes of liquid helium. The liquid helium will be colder than the temperature of deep space.

You've probably heard that mass and energy can be turned into each other. In a nuclear weapon, a small amount of mass is turned into a huge amount of energy. In the LHC, the opposite happens - energy is turned into mass. In a bizarre example of how mass and energy can be interchanged, two small fast-moving protons will collide to make much-heavier slower particles - as though two nippy Cessna planes collided to make a lumbering bus. The energy in the 'speed' of the protons will hopefully be converted to the mass of the Higgs Particle.
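
To put a rough number on that, here is a small sketch that takes the quoted speed at face value and computes the relativistic factor and the total energy carried by a single proton; the proton rest energy of roughly 938 MeV is an assumed textbook value, not a figure from the article.

```python
import math

# Lorentz factor and total energy of one proton at the speed quoted above.
BETA = 0.9999991                  # fraction of the speed of light
PROTON_REST_ENERGY_MEV = 938.272  # assumed textbook value

gamma = 1.0 / math.sqrt(1.0 - BETA ** 2)
energy_tev = gamma * PROTON_REST_ENERGY_MEV / 1e6

print("Lorentz factor: about %.0f" % gamma)              # ~745
print("Energy per proton: about %.1f TeV" % energy_tev)  # ~0.7 TeV at this speed
```

Each additional '9' in that speed roughly triples the energy available to be converted into new, heavier particles.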

Over the last decade, uninformed scare-mongers have spread disaster scenarios, with the LHC destroying the Earth, and even the universe. They say (quite correctly) that it's theoretically possible for the LHC to create mini-black holes. They then conveniently ignore the rest of the same theory that points out that the black holes would evaporate almost immediately. Instead, they wrongly claim that the mini-black holes would rapidly eat the Earth.
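
For the record, the "evaporate almost immediately" part comes from Hawking's evaporation-time estimate, which in its standard semiclassical form is roughly:

```latex
t_{\mathrm{evap}} \approx \frac{5120\,\pi\, G^{2} M^{3}}{\hbar\, c^{4}}
```

Plugging in a hypothetical collider-scale black hole of about 10 TeV/c^2 (around 2 x 10^-23 kg - my assumed figure, since the LHC's collisions top out near 14 TeV) gives a lifetime of order 10^-85 seconds, far too brief to swallow anything - with the caveat that the formula is only a semiclassical estimate at such tiny masses.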

The scare-mongers also claim that the colliding protons in the LHC have enormous energies, and so something totally unforeseen in our current theories might happen. Well, cosmic rays with energies many tens of millions of times greater than those of the speeding protons in the LHC have been smashing into all the planets and moons in our solar system for billions of years - and we're all still here. So let's give it a whirl, and see what we find.

ASUS touts the discrete graphics in its tiny desktop PC



ASUS Champions Essentio CS5110 Desktop PC
With all the buzz surrounding ASUS' Eee PC notebooks, it's easy to overlook other members of the family. The Eee PC range has gained attention in recent weeks thanks to the addition of the 4G-X model sporting Windows XP Home and the upcoming 8.9" Eee PC 900 model, which is due in the coming months.

Today, however, it's time for a new ASUS family member to shine. ASUS is now touting its Essentio CS5110 desktop, which it claims is "The world’s smallest Desktop PC equipped with a fully embedded discrete graphic card."

The CS5110 measures just 7.9" x 11.4" x 3.1" and weighs only 7.5 pounds. It is also relatively quiet, emitting just 23.9dB of noise at idle.

Despite the small dimensions, the CS5110 packs quite a punch. The desktop features an Intel G35-based motherboard and supports Core 2 Duo, Pentium Dual Core and Celeron D processors. Up to 4GB of DDR2 memory (via two SO-DIMM slots) is supported and the chassis can accommodate a single 3.5" SATA II HDD, and either a slot-loading DVD SuperMulti or Blu-ray drive.

Since ASUS touts the discrete graphics capabilities of the CS5110, it should be pointed out that the PC includes an NVIDIA GeForce 8600M GT graphics card with 256MB of memory.

Other features of the CS5110 include 802.11b/g/n, Bluetooth 2.0+EDR, GbE, six USB 2.0 ports, Firewire, an HDMI port, a VGA port and a 10-in-1 media reader.

RSA: Cyber Storm II Builds Network To Defend Against Cyber Crisis


Computer and internet security matters more than ever these days, now that we all depend on computers and the internet. Keeping that dependence secure is the point of exercises like this one: among the goals for Cyber Storm II, a government-sponsored computer security exercise that occurred last month, was testing information-sharing capabilities across organizations during a crisis.

SAN FRANCISCO--It turns out al-Qaida's leader and his cohorts aren't the biggest threat to our cybersecurity. You are.

Six years ago, Osama bin Laden represented the nightmare scenario for the computer security establishment. But more immediate cyberdangers lurk on the horizon. Experts attending the RSA conference that began here today say it's you--Mr. & Mrs. Computer User--who keep goofing up.

In fact, they contend, the future of cybersecurity hinges less on a latter-day version of spy-versus-spy against shadowy terror groups than on a more serious effort to instill best practices. Listening to their warnings was something akin to the scene in the movie Groundhog Day, where Bill Murray repeatedly wakes up to the same morning.

Security gurus have long urged the business world to turn network security into part of the corporate DNA. The message is not fully getting through. And now we're seeing the predictable results.

After listening to Symantec's John Thompson's morning keynote, I later kidded him about purposely scaring the hell out of people. He was a good sport about my joshing but pointed out that the information security landscape is increasingly punctuated by cases of data theft. He backed that up by reciting a litany of worrisome stats from his company's latest Internet security threat report. Truth be told, it makes for grim reading.
...............
By the accounts of panelists at the RSA Conference in San Francisco who participated in the exercise, the simulated cyber crisis was hugely valuable; they just couldn't share very much information about what went on.

Detailed information about Cyber Storm II will be made available later this summer in an after-action report, said Greg Garcia, assistant secretary for cybersecurity with the Department of Homeland Security.

It thus came as no surprise when U.S. CERT's deputy director Randy Vickers acknowledged that the exercise showed there were still some shortfalls in information sharing during the simulated crisis.

Other panelists included Michigan CIO Dan Lohrmann, New Zealand's managing director of critical infrastructure protection Paul McKittrick, Microsoft senior security specialist Paul Nicholas, and Dow senior information systems manager Christine Adams.

After the panelists had talked for forty-five minutes in very general terms about what their organizations hoped to accomplish, and in similarly vague terms about various "learnings" that emerged, questions were solicited from the audience.

One pony-tailed RSA attendee, presumably a security pro, expressed dissatisfaction with the lack of specific information disclosed about Cyber Storm II and asked bluntly, "Was there a red team and did they win?"

According to the color traditions observed by the military and security professionals, the red team typically represents an attacking enemy and the blue team typically represents the defenders or home country.

"We don't have a firm answer about winning or losing," said panel moderator Jordana Siegel, acting deputy director at Department of Homeland Security. She however did allow that the exercise had taught everyone a lot.

Generally speaking, the U.S. government has not been shy when it comes to proclaiming its successes.

But if the blue team got trounced, that should not be an entirely unexpected result, given that in the real-world version of Cyber Storm II -- now playing on the Internet and coming soon to a network near you -- the red team scores victories daily, against government agencies, businesses, organizations, and individuals.

Vickers insisted that the red team-blue team dynamic didn't quite fit Cyber Storm II. That may be Cyber Storm III. But Cyber Storm II in March was more about getting ready to be tested. It was more about networking, which is to say building interpersonal relationships across organizations among those who may one day face a real cyber crisis.

Citing the words used by Homeland Security Secretary Michael Chertoff at his RSA keynote speech on Tuesday, Garcia said, "It takes a network to defeat a network, and that network is the adversary."

Whatever else it did, Cyber Storm II strengthened the foundations of the blue team's network, the public-private partnership that oversees critical cyber infrastructure.

And as Microsoft's Nicholas observed, public-private partnership "is easy to say but it's hard to do."

What was lost tens of millions of years ago is now found.


A fossil animal locked in Lebanese limestone has been shown to be an extremely precious discovery - a snake with two legs.

Scientists have only a handful of specimens that illustrate the evolutionary narrative that goes from ancient lizard to limbless modern serpent.

Researchers at the European Synchrotron Radiation Facility (ESRF) in Grenoble, France, used intense X-rays to confirm that a creature imprinted on a rock, and with one visible leg, had another appendage buried just under the surface of the slab.

"We were sure he had two legs but it was great to see it, and we hope to find other characteristics that we couldn't see on the other limb," said Alexandra Houssaye from the National Museum of Natural History, Paris.

The 85cm-long (33in) creature, known as Eupodophis descouensi, comes from the Late Cretaceous, about 92 million years ago.

Unearthed near the village of al-Nammoura, it was originally described in 2000.

Its remains are divided across the two interior faces of a thin limestone block that has been broken apart.

A portion of the vertebral column is missing; and in the process of preservation, the "tail" has become detached and positioned near the head.

But it is the unmistakable leg bones - fibula, tibia and femur - that catch the eye. The stumpy hind-limb is only 2cm (0.8in) long, and was presumably utterly useless to the animal in life.

Current evidence suggests that snakes started to emerge less than 150 million years ago.

Two theories compete. One points to a land origin in which lizards started to burrow, and as they adapted to their subterranean existence, their legs were reduced and lost - first the forelimbs and then the hind-limbs.

The second theory considers the origin to be in water, from marine reptiles.

This makes the few known bipedal snakes in the fossil record hugely significant, because they could hold the clues that settle this particular debate.

"Every detail can be very important in establishing the great relationships and that's why we must know them very well," explained Ms Houssaye.

"I wanted to study the inner structure of different bones and so for that you would usually use destructive methods; but given that this is the only specimen [of E. descouensi], it is totally impossible to do that.

"3D reconstruction techniques were the only solution. We needed a good resolution and only this machine can do that," she told BBC News.

That machine is the European Synchrotron Radiation Facility. This giant complex on the edge of the Alps produces an intense, high-energy light that can pierce just about any material, revealing its inner structure.

For this study, the fossil snake was clamped to an inclined table and rotated in front of the facility's brilliant X-ray beam.

In a process known as computed laminography, many hundreds of 2D images are produced which can be woven, with the aid of a smart algorithm, into a detailed 3D picture.
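
For the curious, here is a minimal sketch of the closely related filtered back-projection used in ordinary computed tomography (laminography is a variant of the same idea adapted to flat slabs); it uses scikit-image and a made-up test image rather than real beamline data.

```python
import numpy as np
from skimage.transform import radon, iradon  # assumes scikit-image is installed

# Synthetic 2D "slice": a faint disc (rock matrix) containing a small dense inclusion (bone).
size = 256
y, x = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
phantom = 0.2 * (x ** 2 + y ** 2 < 0.8 ** 2).astype(float)
phantom += 0.8 * ((x - 0.2) ** 2 + y ** 2 < 0.1 ** 2)

# Simulate hundreds of X-ray projections taken at different rotation angles...
theta = np.linspace(0.0, 180.0, 400, endpoint=False)
sinogram = radon(phantom, theta=theta)

# ...then invert them back into a 2D slice; stacking many slices gives the 3D volume.
reconstruction = iradon(sinogram, theta=theta)
print("mean reconstruction error:", np.abs(reconstruction - phantom).mean())
```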

The finished product, which can be spun around on a computer screen, reveals details that will be measured in just millionths of a metre.

The E. descouensi investigation shows the second leg hidden inside the limestone is bent at the knee.

"We can even see ankle bones," ESRF's resident palaeontologist Paul Tafforeau said.

"In most cases, we can't find digits; but that may be because they are not preserved or because, as this is a vestigial leg, they were never present."

To modern eyes, it may seem strange to think of a snake with legs.

But look at some of the more primitive modern snakes, such as boas and pythons, and you'll see evidence of their legged ancestry - tiny "spurs" sited near their rear ends, which today are used as grippers during sex.

Gene's 'selective signature' helps scientists identify instances of natural selection in microbial evolution

Microbes, the oldest and most numerous creatures on Earth, have a rich genomic history that offers clues to changes in the environment that have occurred over hundreds of millions of years.

While scientists are becoming increasingly aware of the many important environmental roles played by microbes living today--they process the food in our intestines, they keep carbon moving through the ocean food web, they can be harnessed to process sewage and build specific proteins--they still know little about these tiny critters, particularly marine microbes, which generally are classified into species based on their ecological niche. For instance, two species of marine microbe might look very similar physically, but one may have adapted to life in a particularly dark part of the ocean, while its sister species may have adapted to feeding off a nutrient that is rare in most parts of the ocean, but exists in abundance in one small area.

Scientists at MIT who are trying to understand existing microbes by studying their genetic history recently created a new approach to the study of microbial genomes that may hasten our collective understanding of microbial evolution.

The researchers have reversed the usual order of inquiry, which is to study an organism, then try to identify which proteins and genes are involved in a particular function. Instead, they have come up with a simple mathematical formula that makes it possible to analyze a gene family (a single type of gene or protein that exists in many creatures) simultaneously in a group of ecologically distinct species.

This means that scientists can begin to identify occurrences of natural selection in an organism's evolution simply by looking at its genome and comparing it with many others at once. That allows them to take advantage of the nearly 2,500 microbes whose genomes have already been sequenced.

The new method determines the "selective signature" of a gene, that is, the pattern of fast or slow evolution of that gene across a group of species, and uses that signature to infer gene function or to map changes to shifts in an organism's environment.
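
As a toy illustration of the general idea (not the authors' actual formula), one simple way to build such a signature is to express each gene family's evolutionary rate in each species as a deviation from that species' genome-wide average; every number and name below is invented.

```python
import numpy as np

# Rows are gene families, columns are species, values are estimated evolutionary
# rates (e.g. substitutions per site). All numbers and names are made up.
gene_families = ["sugar_metabolism", "phenylalanine_pathway", "motility"]
species = ["I_loihiensis", "C_psychrerythraea", "E_coli"]
rates = np.array([
    [0.90, 0.10, 0.40],
    [0.80, 0.20, 0.45],
    [0.30, 0.50, 0.35],
])

# Normalise within each species, so a gene's signature records unusually fast or
# slow evolution relative to that species' genome-wide background rate.
signature = (rates - rates.mean(axis=0)) / rates.std(axis=0)

for name, row in zip(gene_families, signature):
    print(name, np.round(row, 2))
```

Gene families whose signature vectors rise and fall together across species would then be candidates for the functional modules discussed further down.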

"By comparing across species, we looked for changes in genes that reflect natural selection and then asked, 'How does this gene relate to the ecology of the species it occurs in?'" said Eric Alm, the Doherty Assistant Professor of Ocean Utilization in the Departments of Civil and Environmental Engineering and Biological Engineering. Natural selection occurs when a random genetic mutation helps an organism survive and becomes fixed in the population. "The selective signature method also allows us to focus on a single species and better understand the selective pressures on it," said Alm.

"Our hope is that other researchers will take this tool and apply it to sets of related species with fully sequenced genomes to understand the genetic basis of that ecological divergence," said graduate student B. Jesse Shapiro, who coauthored with Alm a paper published in the February issue of PLoS Genetics.

Their work also suggests that evolution occurs on functional modules--genes that may not sit together on the genome, but that encode proteins that perform similar functions.

"When we see similar results across all the genes in a pathway, it suggests the genomic landscape may be organized into functional modules even at the level of natural selection," said Alm. "If that's true, it may be easier than expected to understand the complex evolutionary pressures on a cell."

For example, in Idiomarina loihiensis, a marine bacterium that has adapted to life near sulfurous hydrothermal vents in the ocean floor, the genes involved in metabolizing sugar and the amino acid phenylalanine underwent significant changes (over hundreds of millions of years) that may help the bacterium obtain carbon from amino acids rather than from sugars, a necessity for life in that ecological niche. In one of I. loihiensis' sister species, Colwellia psychrerythraea, some of those same genes have been lost altogether, an indication that sugar metabolism is no longer important for Colwellia.

Shapiro and Alm focused on 744 protein families among 30 species of gamma-proteobacteria that shared a common ancestor roughly one to two billion years ago. These bacteria include the laboratory model organism E. coli, as well as intracellular parasites of aphids, pathogens like the bacteria that cause cholera, and soil and plant bacteria. They mapped the evolutionary distance of each species from the ancestor and incorporated information about the gene family (for instance, important proteins evolve more slowly than less-vital ones) and the normal rate of evolution in a particular species' genome in order to determine a gene's selective signature.

"These are experiments we could never perform in a lab," said Alm. "But Mother Nature has put genes into an environment and run an evolutionary experiment over billions of years. What we're doing is mining that data to see if genes that perform a similar function, say motility, evolve at the same rate in different species. To the extent that they differ, it helps us to understand how change in core genes drives functional divergence between species across the tree of life."

This work is part of the Virtual Institute for Microbial Stress and Survival. The research was also supported by additional grants from the U.S. Department of Energy Genomics: GTL Program, the National Institutes of Health, and a scholarship from the Natural Sciences and Engineering Research Council of Canada.

Langer a finalist for Millennium Technology Prize


Institute Professor Robert Langer
MIT Institute Professor Robert Langer has been chosen as a finalist for the Millennium Technology Prize, the world's largest prize for technology innovation.

Langer was chosen "for his inventions and development of innovative biomaterials for controlled drug release and tissue regeneration that have saved and improved the lives of millions of people," according to the Technology Academy Finland, which gives the award every other year.

The award goes to developers of a technology that "significantly improves the quality of human life, today and in the future."

This year's winner will be announced June 11. Winners receive 800,000 euros, and the other finalists each receive 115,000 euros.

Andrew Viterbi '56, SM '57, founder of Qualcomm, is also a finalist. He was honored for creating an algorithm that became "the key building element in modern wireless and digital communications systems, touching lives of people everywhere," according to the Technology Academy Finland.

The other finalists are Alec Jeffreys, who developed DNA fingerprinting techniques, and a trio of scientists who developed an optical amplifier that transformed telecommunications: David Payne, Emmanuel Desurvire and Randy Giles.

Previous winners include Tim Berners-Lee, creator of the World Wide Web and senior research scientist at MIT, and Shuji Nakamura, inventor of the blue light-emitting diode.

MIT retirees' scholarship fund makes first award


A desire among MIT retirees to stay current with, and connected to, the Institute and its students inspired the advisory committee of the MIT Retirees Association to establish the MIT Retirees Undergraduate Scholarship Fund in 2006. The fund serves as a way for retirees to express their appreciation to MIT and the many colleagues, mentors and friends who enriched their MIT experiences, and to contribute to MIT's future excellence by providing financial assistance to an undergraduate.

The fund recently awarded its first scholarship to senior Laura Harris. Harris is the granddaughter of the late Paul J. Harris, who worked at Lincoln Lab for more than 50 years until his retirement.

"I remember that my grandfather often tried to share his passions for science with me and my sister. Maybe this award will inspire other MIT employees and retirees to share their passions with their children and grandchildren, like my grandfather did with me. I'm sure he would have been very proud that I followed in his footstep," said Harris, who majors in computer science and engineering and plans to pursue a master's degree next year.

Gifts to the scholarship fund may be made without designation, in appreciation or memory of a colleague, mentor or loved one, or to celebrate a milestone or a professional or personal achievement. For more information about the MIT Retirees Association or the scholarship fund, visit web.mit.edu/retireesassoc or contact Jane Griffin (griffin@mit.edu).

Cashing in on user data with Google App Engine

The technology giant of the web really is coming to dominate control of the world's data. Google's announcement of its App Engine has naturally generated a lot of buzz, as well as some fear, uncertainty, and doubt. There is the concern that Google will corral even more user data via its App Engine, becoming a kind of 21st century data and advertising baron, as Microsoft has been the operating system and productivity software baron for the last three decades.

If you extrapolate from Google's growing share of search and advertising, and include a growing share of Web applications through its APIs and the fledgling App Engine, you could imagine a Google that becomes the dominant Internet operating system and infrastructure provider. It's still the early days of cloud computing, but the ground is shifting.

"It's funny that we waged the war to free ourselves of shackles of Microsoft and Hailstorm (a failed attempt to manage personal data)," said David Young, CEO of cloud infrastructure provider and App Engine competitor Joyent. "Now, for some reason, the digerati are anxious to run into exact same thing with Google. It's not evil, but they are tracking users and clickstreams, which (are) the real currency of the Web, and most people don't care. If you can get all data, you can target ads and the user experience, such as showing a site in a different color, depending on user profile."

The Web currency of user data and clickstreams is also vital to Joyent's business. The company has 10,000 customers, handles 5 billion page views a month, and provides infrastructure for 25 percent of the third-party applications running on Facebook. Through its Player's Club, Joyent provides free hosting to Facebook developers, as well as OpenSocial developers, in exchange for the data.

"We gather the data and work with ad networks to help their clients target site," Young said. Joyent works with ad networks such as Slide, RockYou, Social Media, Federated Media, and AdBrite. "With billions of visitors, Google can gather the data on its own, but the social networks allow companies like Joyent to get access to it as well," he said. Basically, the majority of developers are willing to share their user data in exchange for free infrastructure services.
"If I were Google, I would buy every big Web application, such as Six Apart and WordPress, out there to get access to clickstream and user data as people move across the Web. I think that is what App Engine is all about," Young said.

In light of App Engine, Joyent is offering a similar infrastructure service (but using MySQL, PostgreSQL, or Oracle databases rather than Google's Bigtable and file system). Like App Engine, the Joyent "Garden of Eden" program includes free infrastructure for Python Web applications in exchange for customer information and clickstream data.

However, Joyent isn't limiting the usage, and it will provide unlimited compute, storage, memory, and bandwidth, as well as root control. Google's App Engine, which is in beta, is limited to 500MB of storage, 200 million megacycles of CPU, and 10GB of bandwidth per day. Young figures that this would support 25,000 unique users a month, while Joyent will support a million users for free.
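
A quick back-of-the-envelope check of Young's figure, assuming the 10GB-per-day bandwidth cap is the binding constraint (the per-user traffic number below is just what his estimate implies, not an official figure):

```python
# What Young's "25,000 unique users a month" implies about traffic per user,
# if the 10 GB/day bandwidth quota is the limiting resource.
BANDWIDTH_GB_PER_DAY = 10
DAYS_PER_MONTH = 30
UNIQUE_USERS_PER_MONTH = 25000  # Young's figure

monthly_gb = BANDWIDTH_GB_PER_DAY * DAYS_PER_MONTH
mb_per_user = monthly_gb * 1024.0 / UNIQUE_USERS_PER_MONTH
print("Implied traffic budget: about %.0f MB per user per month" % mb_per_user)  # ~12 MB
```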

With all the hand-wringing about Google's increasing footprint and clout, the company is contributing code to the open-source world and driving data portability standards, such as the OpenSocial and Social Graph APIs. David Recordon notes the potential for App Engine sites to log in via Google Accounts.

Today that means that every App Engine site could have a shared sense of a user; the ability to understand who someone is across different App Engine sites and Google services. (Obviously I'd love to see Google move toward supporting OpenID for this sort of thing, but small bits piece by piece work for me.)

Imagine if Google Accounts added support for the (upcoming) OpenSocial REST APIs. All of a sudden, each of these App Engine sites could start injecting activity and querying for activity across each other. Maybe you'll argue that this just means that Google Accounts could become the next big social network, but isn't it a bit different when this functionality is just a part of your hosting infrastructure? What if Google Accounts ignored the notion of friends and instead left that to actual social networks? If done right, this really could be the first shipping glimpse of the distributed social Web that there is to come.
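
To make that "shared sense of a user" concrete, here is a minimal sketch of the Google Accounts sign-in that the App Engine Python SDK exposes to every hosted application, using the standard users and webapp APIs; the handler itself is just an illustration.

```python
# Minimal App Engine handler: the same Google Account identity is available to
# any App Engine site via the users API, which is the point Recordon is making.
from google.appengine.api import users
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app


class MainPage(webapp.RequestHandler):
    def get(self):
        user = users.get_current_user()
        if user:
            self.response.out.write('Hello, %s' % user.nickname())
        else:
            # Bounce through the shared Google Accounts login, then back here.
            self.redirect(users.create_login_url(self.request.uri))


application = webapp.WSGIApplication([('/', MainPage)], debug=True)

if __name__ == '__main__':
    run_wsgi_app(application)
```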

If Google's growth trends continue to accelerate, the company will colonize more Web territory, collecting more data and monetizing it across billions of users and sites. So far, Google has a head start, with its highly profitable search and ad business (which is why Microsoft is in hot pursuit of Yahoo) and is moving into new application territory.

The old guard--Microsoft, IBM, Hewlett-Packard, Dell, Sun Microsystems, Cisco Systems, Oracle--haven't yet revealed plans for colonizing Web users with end-to-end cloud-based platforms. They have stood by while Salesforce.com has become a company with $1 billion in annual revenue. Will they be standing on the sidelines as Google and others, such as the 22-person Joyent, prove the viability of cloud platforms as a service?

New collaboration between Google and Yahoo on AdSense

The technology field is diversifying so fast, and has become so critical, that it's impossible for any one company to do it all alone. So nowadays collaboration is the norm.

When Microsoft’s offer to acquire Yahoo was made public, Google stated it would be willing to help Yahoo fend off the acquisition in any way it could. It’s now or never.

Yahoo is talking with Google about having them handle some of the advertisements on its search pages, The Wall Street Journal reports. This partnership is said to be a test for what could become a larger one — which reads a lot like: A test, small enough to make sure the government doesn’t get involved with its antitrust hounds, but with enough promise to allow Yahoo to fend off Microsoft.

Yahoo has been in talks with several other companies about potential partnerships that would allow it to placate the shareholders clamoring for the company to accept Microsoft’s bid, but none of those discussions have proven fruitful. Yahoo has also stated that its financial outlook for the future is good even without any partnerships (our coverage), but that future seems cloudy at best.

While a Google ad partnership may make sense on paper to save Yahoo, one can’t help but wonder if giving away valuable search advertisements is a “cutting off the nose to spite the face” situation, as Allen Stern of CenterNetworks notes.

Microsoft has stepped up its effort in recent days to put pressure on Yahoo to accept its offer. Yahoo has been given three weeks to come to the table and formally talk about a deal before Microsoft will take a less valuable offer directly to the shareholders and seek to put new board members of its choosing in place (our coverage).

Yahoo responded to Microsoft’s threat by basically calling its chief executive, Steve Ballmer, a liar and a cheapskate (our coverage).

The countdown to a nasty public letter from Microsoft about any Yahoo/Google deal begins now.

Update: As we expected, it only took Microsoft a matter of hours to shoot back with a release about the Google/Yahoo deal. It’s only three sentences, but it emphasizes the point that Google and Yahoo combined control over 90 percent of the search advertising market.

The full statement by Microsoft general counsel Brad Smith:

“Any definitive agreement between Yahoo! and Google would consolidate over 90% of the search advertising market in Google’s hands. This would make the market far less competitive, in sharp contrast to our own proposal to acquire Yahoo! We will assess closely all of our options. Our proposal remains the only alternative put forward that offers Yahoo! shareholders full and fair value for their shares, gives every shareholder a vote on the future of the company, and enhances choice for content creators, advertisers, and consumers.”
