Monday, November 5, 2007

Security Warning: Zombie Storm Rising! Your PC Might Be a Zombie If ...


Technology now drives the global economy; almost nothing runs without computers and the Internet, and that brings new dangers. The greatest threat to global cyber security today, according to Internet Security Systems researcher Josh Corman, may be your mother's computer.


Or more precisely, the collected computers of all the world's mothers. Along with millions of other out-of-date and unsecured PCs strung together by the Internet--what Corman calls "the leper colony"--those machines represent a combined mass of computing power responsible for most of the Net's spam e-mails, much of its click fraud, and the vicious "denial of service" attacks that can knock sites offline and even destroy online businesses altogether.


Your PC Might Be A Zombie If ...
Since the beginning of the decade, cybercriminals have increasingly used malicious software to hijack unwitting PCs, turning about 20% of the world's computers into "zombies" that can be controlled and collected by the thousands into subservient criminal armies, according to research by security firm Trend Micro. Now, that zombie software is becoming more infectious and sophisticated: One strain in particular, the so-called "Storm worm," has enslaved between 15 and 50 million PCs, by security researchers' estimates. To make matters worse, Storm's zombies don't moan or drool blood, like their human-shaped counterparts. These digital undead, security researchers say, work in practically undetectable silence.


"The Storm worm is patient, resilient, adaptive and invisible," says Corman. "It's persisted unfettered for 10 months now, and a lot of us in the security industry think that it's the biggest threat we've ever seen."


So far, Storm's zombie army hasn't been used for much other than sending spam e-mails that grow its ranks. Storm's messages originally offered news about disastrous storms in Europe last January. Now they constantly evolve to tempt users into opening infected e-mail attachments by referring to recent news and using other "social engineering" tactics.


The Storm worm's final purpose still isn't clear: Some security experts worry that its massive botnet could be turned on government Web sites to flood them with denial of service attacks. Others say Storm's collection of zombies is now being split into pieces, which are sold to the highest bidder--a sort of commando bot-force, available for hire.


Most people probably wouldn't want their computer to lead a double life as an agent in a seething, cybercriminal organization. Then again, most people don't notice whether their machines are busy even when no one is at the keyboard.


Brian Grayek, vice president of security company CA, says that users can sometimes detect a slowdown in performance, hear their PC's hard drive whirring or see their Internet router's lights flashing when they're not using the computer, all signs that zombie software is at work. But David Perry, a spokesperson for Trend Micro, says that's no longer enough. "Everyone wants to be clever and think they can spot a zombie," Perry says. "But really there are no behavioral or visual clues."


Zombie hunting, Perry argues, should be left to the professionals: commercial anti-virus scanning programs. But even software scans may not be enough to detect the Storm worm, according to Internet Security Systems' Corman. He says that Storm mutates as often as every 30 minutes, updating far faster than the scanners trying to track it. Worse, the worm can "lobotomize" anti-virus software, Corman says, so that it appears to be running but has no effect.
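Rapid mutation is exactly what defeats signature-based scanning. As a toy illustration (not Storm's actual code, and not how any real anti-virus engine works in full), the Python sketch below shows the core weakness: a scanner that blocklists the hash of a known sample misses the very same payload once a single byte changes.

```python
import hashlib

def signature(payload: bytes) -> str:
    """A naive file signature: the hex digest of the payload's bytes."""
    return hashlib.md5(payload).hexdigest()

# A known-bad sample, and a scanner blocklist holding its signature.
known_bad = b"MALICIOUS-PAYLOAD-v1"
blocklist = {signature(known_bad)}

# A "mutation": the same payload with a single byte appended.
mutated = known_bad + b"\x00"

print(signature(known_bad) in blocklist)  # True  -- the original sample is caught
print(signature(mutated) in blocklist)    # False -- one changed byte defeats the signature
```

Real scanners use heuristics and behavioral analysis on top of signatures, but a worm that re-packs itself every 30 minutes still forces the defenders to update far faster than signature distribution normally allows.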


Storm's zombie network is endlessly innovative. Unlike past viruses that have hijacked armies of PCs, Storm doesn't place the command and control of the network in a single computer. Whereas other zombie networks can be "beheaded" and disabled, according to Corman, every enslaved member of Storm's army has equal autonomous power.
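The difference between a "beheadable" botnet and Storm's decentralized design can be sketched with a toy graph model (purely illustrative; this is not Storm's real protocol). Remove the hub from a centralized network and every bot is stranded; remove any single member of a peer-to-peer mesh and the rest stay connected.

```python
def reachable(adjacency, start):
    """Nodes reachable from `start` via simple graph traversal."""
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for peer in adjacency.get(node, ()):
            if peer not in seen:
                seen.add(peer)
                frontier.append(peer)
    return seen

def remove_node(adjacency, dead):
    """Return the network with `dead` (and all links to it) deleted."""
    return {n: [p for p in peers if p != dead]
            for n, peers in adjacency.items() if n != dead}

bots = ["b1", "b2", "b3", "b4"]

# Centralized: every bot talks only to the command-and-control hub "cc".
star = {"cc": bots, **{b: ["cc"] for b in bots}}

# Peer-to-peer: each bot knows only its neighbors; there is no hub.
ring = {b: [bots[(i - 1) % 4], bots[(i + 1) % 4]] for i, b in enumerate(bots)}

# Behead the star: no surviving bot can reach any other.
print(len(reachable(remove_node(star, "cc"), "b1")))  # 1 (just itself)

# Remove any one peer from the mesh: the survivors remain connected.
print(len(reachable(remove_node(ring, "b3"), "b1")))  # 3 (all survivors)
```

This is why, as Corman notes, takedowns that work against hub-and-spoke botnets have little effect on Storm.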


Even scarier, Storm's creators are actively attacking researchers who try to uncover the worm's secrets. Security firms Secureworks and Spamhaus have both been struck with denial of service attacks after publishing research on Storm, according to Corman.


But Bruce Schneier, chief technology officer of BT Counterpane, says that the simple element that keeps Storm more elusive than past viruses is its patience. "Symptoms don't appear immediately, and an infected computer can sit dormant for a long time," he wrote in his blog earlier this month. "If it were a disease, it would be more like syphilis, whose symptoms may be mild or disappear altogether, but which will eventually come back years later and eat your brain."


The only remaining defense against the expansion of Storm's zombie hordes may thus be prevention: if users stopped opening e-mail attachments from strangers--a basic security practice--Storm would lose its main avenue of infection.


But Schneier warns that education isn't any more likely to solve the problem, because users have no direct incentive: A subtle zombie sends spam to other users without victimizing its host. "The basic problem is that your company's security depends on my mother," he says.


That means, Schneier reluctantly admits, that the zombie epidemic will continue. "Short of finding the guys who wrote this and arresting them, there's no real solution," he says. "Annoying, isn't it?"





A Warning: Chemical weapons create a toxic waste nightmare


We want a livable environment for ourselves and for the next generation, so in every respect we have to consider the consequences of the policies we initiate, and be clear-eyed about the evolution and future of combat and weapons.


MEETING in the Hague next week, the signatories to the Chemical Weapons Convention (CWC) will celebrate the fact that Albania and the UK have destroyed their last chemical weapons and that India and South Korea are almost done. But there will be an elephant in the room: in their scramble to destroy weapons by a 2012 deadline, Russia and the US, which possess over 95 per cent of the world's chemical weapons, are creating thousands of tonnes of a nasty, toxic residue that they are having trouble disposing of.


Chemical weapons can be incinerated directly. But fearing a release of toxic gases, half the American and all the Russian weapons sites are breaking down the lethal molecules by adding alkali, a technique called hydrolysis. This creates a new problem: how to dispose of the resulting toxic soup, known as the hydrolysate.





Robotics: A Robot Revolution


On Saturday, when Carnegie Mellon's robotic Chevy Tahoe, known as "Boss," rolled across the finish line of the Defense Advanced Research Projects Agency (DARPA) Urban Challenge in Victorville, Calif., after 60 miles of urban driving, no driver stepped out to be showered with champagne and photographs. In fact, Boss had flawlessly accomplished the 19 missions given to the 11 finalist robots in the competition--parking at precise locations, negotiating a mix of on-road and off-road driving, and avoiding the other robotic and manned cars that roamed the streets of an abandoned air force base--all without a human behind the wheel.


"This is a wonderful day in the history of robotics," Carnegie Mellon team leader Red Whittaker said after the race. "It's as good as it gets."


CMU's Tartan Racing took home $2 million for first place in DARPA's Urban Challenge--a test of driverless cars on urban streets at the former George Air Force Base in Southern California's Mojave Desert. By doing so, the team regained its pride after two stinging race defeats in 2004 and 2005. And it stole some glory back from 2005's winner, Stanford University, in tackling what was effectively a harder challenge this year. (Stanford claimed the second prize of $1 million this year.)


Apart from a little competitive drama and at least one robot wreck, the DARPA Urban Challenge produced a more important win for robotics this year, one that everyone from Whittaker to Stanford's team leader Sebastian Thrun pointed out at the race Saturday. That was simply that the competition seeded the idea in people's minds that self-driving cars are possible. Moreover, proponents say the underlying technology will pave the way for a new generation of cars that will help save lives, either through assisted-driving applications for civilian cars or fully autonomous vehicles for the military.


Still, the drama of the competition was largely between CMU and Stanford.


In 2004, CMU was pegged the favorite in DARPA's first-ever challenge of autonomous driving vehicles, given the expertise of the university's robotics department and professor Whittaker. But CMU's autonomous car spun its wheels after only 7 miles on the 142-mile desert course, leaving no winner that year.


In 2005, CMU returned to the Grand Challenge more determined than ever with two race vehicles, heavily outfitted and modified Hummers. However, technical problems with the vehicles brought CMU defeat, and Stanford's team led by Thrun--the former protege of Whittaker--claimed the $1 million prize as a first-time entrant in the race.


Stanford also garnered global attention for accomplishing what hadn't been done before: engineering a car to drive itself more than 132 miles in the desert in less than 10 hours. It's rumored that after the race, CMU's team threw darts at a picture of Stanford's robot, Stanley.


This year, Whittaker's team will be remembered for engineering a robot that could master basic traffic rules while driving among other robots.


One race veteran put it like this: "Competition is huge for this event. The spirit of competition focuses everyone to solve the problem at hand."







Flashback: 50 Years Ago: The First Dog in Orbit


Even today, space research continues to reveal the unlimited possibilities of discovery.



Just a month after the Soviet Union stunned the world by putting the first artificial satellite into orbit, it boasted a new victory - a much bigger satellite carrying a mongrel dog called Laika.


The mission, 50 years ago Saturday, ended sadly for Laika but helped pave the way for human flight.


As with other episodes of the Soviet space program, Laika's mission was hidden under a veil of secrecy, and only after the collapse of the Soviet Union could the participants tell the real story behind it.


The satellite that carried Laika into orbit was built in less than a month in what was perhaps the world's fastest-prepared space mission ever.


Excited by the international uproar over the launch of Sputnik on Oct. 4, 1957, Soviet leader Nikita Khrushchev summoned Sergei Korolyov, the father of the Soviet space program, and ordered him to come up with "something new" to celebrate the Nov. 7 anniversary of the 1917 Bolshevik Revolution.


Khrushchev's demand was a shock even for Korolyov, whose team had managed to put together the first Sputnik in less than three months, said Georgy Grechko, a cosmonaut who started his career as a space engineer.


"We didn't believe that you would outpace the Americans with your satellite, but you did it. Now you should launch something new by Nov. 7," Korolyov quoted Khrushchev telling him, according to Grechko.


Boris Chertok, Korolyov's right-hand man, said the short notice made it impossible to design a principally new spacecraft, but there was also little sense in simply repeating the Sputnik launch.


"Korolyov rightly feared that this holiday gift could end up in an accident that would spoil a hard-won victory," Chertok wrote in his memoirs. But they couldn't argue with Khrushchev, and the decision to conduct the launch was made on Oct. 12.


When someone on Korolyov's team suggested putting a dog into orbit, he jumped at the idea.


Little was known about the impact of space flight on living things, and some believed they would be unable to survive the launch or the conditions of outer space.


The Soviet Union had experimented with launching dogs on short suborbital missions during ballistic missile tests, and some of them survived several such missions. All of them were stray mongrel dogs - doctors believed they were able to adapt quicker to harsh conditions - and all were small so they could fit into the tiny capsules.


Just nine days before the launch, Doctor Vladimir Yazdovsky chose one of them - 2-year-old Laika - for the mission.


Stories about how she was chosen vary. Some say Laika was chosen for her good looks - a Soviet space pioneer had to be photogenic. Others say space doctors simply had a soft spot for Laika's main rival and didn't want to see her die: since there was no way to design a re-entry vehicle in time for the launch, the glory of making space history also meant certain death.


"Laika was quiet and charming," Yazdovsky wrote in his book chronicling the story of Soviet space medicine. He recalled that before heading to the launchpad, he took the dog home to play with his children.


"I wanted to do something nice for her: She had so little time left to live," Yazdovsky said.


Working round-the-clock, Korolyov and his team combined a capsule that would carry the dog with basic life-support systems and elements of the first Sputnik. To simplify the design, they decided not to separate the satellite from the booster's second stage.


They worked without blueprints at a pace that was breathtaking even at the time of the space race and seems utterly impossible by today's standards.


"Now when we have computers, sophisticated industrial equipment, lasers and other things, no one is capable of making a new satellite in just one month," Grechko said in an interview. "Now it would take a month just to start doing the paperwork. Korolyov told us later that it was the happiest month of his life."


As a result of some last-minute technical problems, Laika had to wait for the launch in the cabin for three days. The temperatures were low, and workers put a hose connected to a heater into the cockpit to keep her warm.


On Nov. 3, Laika blasted off into space in Sputnik 2, which weighed 1,118 pounds - a show of Soviet ability to take big payloads into space.


Sputnik 1 weighed just 184 pounds. The first U.S. satellite, Explorer 1, launched on Jan. 31, 1958, weighed about 31 pounds.


When Laika reached orbit, doctors found with relief that her pulse, which had risen on launch, and her blood pressure were normal. She ate specially prepared food from a container.


According to official Soviet reports, the dog was euthanized after a week. Laika's mission drew a wave of protests from animal protection activists in the West.


It wasn't until after the Soviet collapse that some participants in the project told the true story: Laika indeed was to be euthanized with a programmed injection, but she apparently died of overheating after only a few hours in orbit. There was no information to indicate when exactly she died.


"It was impossible to build reliable life-support and thermal-control systems in such a short time," Chertok said in his memoirs.


Several other dogs died in failed launches before the successful space flight - and safe return to Earth - of Belka and Strelka in August 1960. After a few other flights with dogs, the Soviet Union put the world's first human - Yuri Gagarin - into space on April 12, 1961.


Gagarin is said to have joked: "I still don't understand who I am: the first human or the last dog in space."





Planet Hunting: Find Neptune and Uranus




Most people have seen the five brightest naked-eye planets, yet there is a sixth planet that can be spied without optical aid and another which can be picked up using just a good pair of binoculars.


You'll have to know exactly where to look for them, though.


Fortunately, both are currently well placed for viewing in our evening sky and with the bright moon out of the way this week, it will be a good time to search for them.


Uranus


Barely visible to the unaided eye on very dark, clear nights, the planet Uranus is now visible during the evening hours among the stars of Aquarius, the Water Carrier. It is best to study a sky map first, and then scan that region with binoculars. Using a magnification of 150-power with a telescope of at least three-inch aperture, you should be able to resolve it into a tiny, pale-green featureless disk.


Uranus, which lies at a mean distance of 1.8 billion miles (2.9 billion kilometers) from the sun, has a diameter of about 31,800 miles (51,100 kilometers). At last count, Uranus has 27 moons, all in orbits lying in the plane of the planet's equator, where there is also a complex of nine narrow, nearly opaque rings, discovered in 1977. Uranus has a rocky core, surrounded by a liquid mantle of water, methane, and ammonia, encased in an atmosphere of hydrogen and helium.


A bizarre feature of the planet is how far Uranus is tipped. Its north pole lies 98 degrees from a perpendicular to its orbit plane. Thus, its seasons are extreme: when the sun rises at its north pole, it stays up for 42 Earth-years; then it sets and the north pole is in darkness for 42 Earth-years.
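The 42-year figure is simple arithmetic, sketched below. The one input not stated in the article is Uranus's orbital period, roughly 84 Earth years: with a pole aimed nearly along the orbit plane, each pole faces the sun for about half an orbit.

```python
# Uranus takes roughly 84 Earth years to circle the sun (approximate
# sidereal period; this value is not given in the article itself).
uranus_orbital_period_years = 84

# With the spin axis tipped ~98 degrees, each pole spends about half
# the orbit in continuous sunlight, and the other half in darkness.
polar_day_years = uranus_orbital_period_years / 2
print(polar_day_years)  # 42.0
```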


Sir William Herschel discovered Uranus on March 13, 1781, noting that it was moving slowly through the constellation Gemini. Initially, however, Herschel thought he had discovered a new comet.


Neptune


Neptune, on the other hand, is much too faint to be viewed with the unaided eye, lying at a mean distance from the sun of 2.8 billion miles (4.5 billion kilometers). It is slightly smaller than Uranus, with a diameter of 30,800 miles (49,600 kilometers).


Neptune is about seven times dimmer than Uranus, but if you have access to a dark, clear sky and carefully examine our map, you should have no trouble in finding it with a good pair of binoculars. Neptune can be found among the stars of Capricornus, the Sea Goat. With a telescope, trying to resolve Neptune into a disk will be more difficult than it is with Uranus. You're going to need at least a four-inch telescope with a magnification of no less than 200-power, just to turn Neptune into a tiny blue dot of light.
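The "about seven times dimmer" figure follows from the astronomical magnitude scale, as the short calculation below shows. The apparent magnitudes used here are typical values, not figures from the article: Uranus around +5.7 and Neptune around +7.8, with each magnitude step equal to a brightness factor of 100**(1/5), about 2.512.

```python
# Typical apparent magnitudes (assumed values; brighter = smaller number).
m_uranus, m_neptune = 5.7, 7.8

# A difference of dm magnitudes corresponds to a 10**(dm / 2.5) brightness ratio.
brightness_ratio = 10 ** ((m_neptune - m_uranus) / 2.5)
print(round(brightness_ratio, 1))  # 6.9 -- roughly seven times dimmer
```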


Voyager 2 passed Neptune in 1989 and showed it to possess a deep-blue atmosphere, with rapidly moving wisps of white clouds as well as a Great Dark Spot, rather similar in nature to Jupiter's famous Great Red Spot.


Voyager 2 also revealed the existence of at least three rings around Neptune, composed of very fine particles.


Neptune has 13 moons, one of which, Triton, has a tenuous atmosphere of nitrogen and at nearly 1,700 miles (2,700 kilometers) in diameter, is larger than Pluto.


Discovery link


Neptune's discovery came about from long-term observations of Uranus. It seemed to astronomers that some unknown body was somehow perturbing Uranus' orbit.


In 1846, two astronomers, Urbain J.J. Leverrier (1811-1877) of France and John Couch Adams (1819-1892) of England, were independently working on this problem. Neither knew what the other was doing, but ultimately, both men had figured out the probable path of the supposed object that was disturbing the orbit of Uranus. Both believed that the unseen body was then in the constellation of Aquarius. Adams was a student at Cambridge University, England, and he sent his results to Sir George Airy (1801-1892), the Astronomer Royal, with specific instructions on where to look for it.


For some unknown reason Airy delayed a year before starting the search. In the meantime, Leverrier wrote to the Berlin Observatory requesting that they search in the place his observations directed. Johann Galle and Heinrich d'Arrest at Berlin did exactly as instructed, and found the new planet in less than an hour.






Ancient sea mud records supernova blast



It is the oldest telescope in the world - and it lies at the bottom of the ocean. Ancient sea floor sediments have revealed that a supernova exploded during the Pliocene era and may have caused a minor extinction event on Earth.
Levels of radioactive iron-60 suggest the supernova was between 60 and 300 light years away, says Brian Fields of the University of Illinois at Urbana-Champaign. "It didn't hit us or we wouldn't be here." Radiation from the blast could have weakened Earth's atmosphere, he says, exposing organisms to the sun's ultraviolet radiation. This coincides with an extinction peak, but Fields says there is no direct evidence of a link. The work was reported at a meeting of the Geological Society of America in Denver, Colorado, this week.



Supernova Blasts in a Nearby Galaxy

The nearby dwarf galaxy NGC 1569 is a hotbed of vigorous star birth activity which blows huge bubbles that riddle the main body of the galaxy. The galaxy's "star factories" are also manufacturing brilliant blue star clusters. This galaxy had a sudden onset of star birth about 25 million years ago, which subsided about the time the very earliest human ancestors appeared on Earth.


In this new image, taken with NASA's Hubble Space Telescope, the bubble structure is sculpted by the galactic super-winds and outflows caused by a colossal input of energy from collective supernova explosions that are linked with a massive episode of star birth.


One of the still unresolved mysteries in astronomy is how and when galaxies formed and how they evolved. Most of today's galaxies seem to have been already fully formed very early on in the history of the universe (now corresponding to a large distance away from us), their formation involving one or more galaxy collisions and/or episodes of strongly enhanced star formation activity (so-called starbursts).


While any galaxies that are actually forming are too far away for detailed studies of their stellar populations even with Hubble, their local counterparts, nearby starburst and colliding galaxies, are far easier targets.


NGC 1569 is a particularly suitable example, being one of the closest starburst galaxies. It harbors two very prominent young, massive clusters plus a large number of smaller star clusters. The two young massive clusters match the globular star clusters we find in our own Milky Way galaxy, while the smaller ones are comparable with the less massive open clusters around us.


NGC 1569 was recently investigated in great detail by a group of European astronomers who published their results in the January 1, 2004 issue of the British journal, Monthly Notices of the Royal Astronomical Society. The group used several of Hubble's high-resolution instruments, with deep observations spanning a wide wavelength range, to determine the parameters of the clusters more precisely than is currently possible from the ground.


The team found that the majority of clusters in NGC 1569 seem to have been produced in an energetic starburst that started around 25 million years ago and lasted for about 20 million years. First author Peter Anders from the Gottingen University Galaxy Evolution Group, Germany says "We are looking straight into the very creation processes of the stars and star clusters in this galaxy. The clusters themselves present us with a fossil record of NGC 1569's intense star formation history."


The bubble-like structures seen in this image are made of hydrogen gas that glows when hit by the fierce winds and radiation from hot young stars and is racked by supernovae shocks. The first supernovae blew up when the most massive stars reached the end of their lifetimes roughly 20-25 million years ago. The environment in NGC 1569 is still turbulent and the supernovae may not only deliver the gaseous raw material needed for the formation of further stars and star clusters, but also actually trigger their birth in the tortured swirls of gas.


The color image is composed of 4 different exposures with Hubble's Wide Field and Planetary Camera 2 through the following filters: a wide ultraviolet filter (shown in blue), a green filter (shown in green), a wide red filter (shown in red), and a Hydrogen alpha filter (also shown in red).






New Magnet Design Sheds Light On Nanotechnology And Semiconductor Research




Engineers at Florida State University's National High Magnetic Field Laboratory have successfully tested a groundbreaking new magnet design that could literally shed new light on nanoscience and semiconductor research.



When the magnet -- called the Split Florida Helix -- is operational in 2010, researchers will have the ability to direct and scatter laser light at a sample not only down the bore, or center, of the magnet, but also from four ports on the sides of the magnet, while still reaching fields above 25 tesla. By comparison, the highest-field split magnet in the world attains 18 tesla. "Tesla" is a measurement of the strength of a magnetic field; 1 tesla is equal to 20,000 times the Earth's magnetic field.
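The "20,000 times the Earth's magnetic field" comparison can be checked with one line of arithmetic. The Earth's surface field strength, about 50 microtesla, is an assumed typical value not given in the article.

```python
# Earth's magnetic field at the surface is roughly 50 microtesla
# (an assumed typical value; it varies with location).
earth_field_tesla = 5e-5

# How many Earth-fields fit in 1 tesla:
print(1 / earth_field_tesla)  # 20000.0
```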


Magnetism is a critical component of a surprising number of modern technologies, including MRIs and disk drives, and high-field magnets stand beside lasers and microscopes as essential research tools for probing the mysteries of nature. With this new magnet, scientists will be able to expand the scope of their experimental approach, learning more about the intrinsic properties of materials by shining light on crystals from angles not previously available in such high magnetic fields. In materials research, scientists look at which kinds of light are absorbed or reflected at different crystal angles, giving them insight into the fundamental electronic structure of matter.


The Split Florida Helix design represents a significant accomplishment for the magnet lab's engineering staff. High magnetic fields exert tremendous forces inside the magnet, and those forces are concentrated in the small space at its middle--precisely where the Mag Lab engineers cut their big holes.


"You have enough to worry about with traditional magnets, and then you try to cut huge holes from all four sides from which you can access the magnet," said lab engineer Jack Toth, who is spearheading the project. "Basically, near the midplane, more than half of the magnet structure is cut away for the access ports, and it's still supposed to work and make high magnetic fields."


Magnet engineers worldwide have been trying to solve the problem of creating a magnet with side access at the midsection, but they have met with little success in higher fields. Magnets are created by packing together dense, high-performance copper alloys and running a current through them, so carving out empty space at the heart of a magnet presents a huge engineering challenge.


Instead of fashioning a tiny pinhole to create as little disruption as possible, as other labs have tried, Toth and his team created a design with four big elliptical ports crossing right through the midsection of the magnet. The ports open 50 percent of the total space available for experiments, a capability the laboratory's visiting scientists have long desired.


"It's different from any traditional magnet that we've ever built before, and even the fabrication of our new parts was very challenging," Toth said. "In search of a vendor for manufacturing the prototypes, I had phone conversations where people would promise me, 'Jack, we looked at it from every possible angle and this part is impossible to machine.'"


Of course, that wasn't the case, and the model coil, crafted from a mix of copper-beryllium blocks and copper-silver plates, met expectations during its testing in a field higher than 32 tesla with no damage to its parts.


Though the National Science Foundation-funded model has reached an important milestone, years of work will go into the final product. The lab hopes to have a working magnet for its User Program by 2010, and other research facilities have expressed great interest in having split magnets that can generate high magnetic fields.






The 10 biggest Web annoyances


Many commonplace online frustrations -- some dating all the way back to the earliest days of the Web -- remain unfixed



In its relatively short life, the World Wide Web has already made many of our most mundane, tedious tasks quicker and easier to perform. But there are still a surprising number of activities -- from helping us buy concert tickets to protecting our privacy -- that, for one reason or another, the Web still can't get right, stirring the ire of even the most patient users. We look at 10 of the worst of them.


Beyond obvious, nagging problems such as e-mail spam, phishing lures, viruses and spyware, a great many commonplace online frustrations -- some dating all the way back to the earliest days of the Web -- remain unfixed.


We asked visitors at our online forums to identify what they consider the most dysfunctional aspects of the Web, and then we polled our readers to find out which of these problems they find the most aggravating. For each difficulty, we identified an "aggravation factor" -- the percentage of readers who were either "very annoyed" or "infuriated" by the issue. We start with the ones that irk our readers most, and work our way down.


1. Dubious privacy policies


Aggravation factor: 69%


Many business-focused Web sites -- particularly in the areas of health and financial services -- collect sensitive private information from users. The vast majority of these sites have established privacy policies to lay out what information the site collects and to delineate customers' rights. But the legal jargon in these policies is often laid on so thick that customers can't understand it, leaving them unsure about whether their private data is truly safe from misuse.


Amazon.com's online privacy notice, for example, is a 2,700-word document that links to a 2,600-word conditions-of-use page jam-packed with arcane legalese. Good luck figuring out your rights if you don't have a J.D. after your name. Privacy policies at some Web sites grant the sites very broad discretion in handling private data, including the right to use the data to market other products and services to members, and the right to share data with unknown, unnamed third parties -- leaving the person who supplied the data feeling exposed.


Consumer advocates have found this problem exceedingly difficult to correct because site owners (via their attorneys) go to extremes to avoid legal liability. Of course, you can refuse to patronize any site that you suspect might take liberties with your data. But short of hiring a lawyer to analyze the privacy policy, how do you determine that a site is untrustworthy before it's too late?


2. Difficult online forms


Aggravation factor: 65%


Filling out a simple form online -- be it for something as important as a loan application or as mundane as a news site registration -- can turn into an endless cycle of annoying browser refreshes. That's because online forms often mix required and optional fields without clearly distinguishing between the two. While filling out the form, you inevitably skip one of the required fields and then sometimes have to start all over again because the site wipes the page clean.


To be fair, things have improved in recent times as companies figure out that user frustration can hurt business. Still, since the problem is so easy to fix, its continued existence is mind-boggling. Site designers should clearly mark all required fields in a different color (red would work just fine). And if a user makes an error anyway, there's no reason to wipe all the fields clean. To move things along smoothly, Web site developers should highlight any field that still needs to be filled in.
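The fix described above is straightforward enough to sketch in a few lines of server-side code. This is an illustrative example only, not any particular site's implementation; the field names are hypothetical. The key points are that every missing required field is reported at once, and the user's input is never thrown away.

```python
# A minimal sketch of forgiving form validation: distinguish required from
# optional fields, flag every missing required field in one pass, and echo
# the submitted values back so the page can re-render them pre-filled.
REQUIRED = {"name", "email"}
OPTIONAL = {"phone", "company"}

def validate_form(submitted):
    """Return (cleaned_values, errors). Never discards what the user typed."""
    cleaned = {f: submitted.get(f, "").strip() for f in REQUIRED | OPTIONAL}
    errors = {f: "This field is required." for f in REQUIRED if not cleaned[f]}
    return cleaned, errors

values, errors = validate_form({"name": "Ada", "phone": "555-0100"})
# 'email' is flagged as missing, but 'name' and 'phone' are preserved
# so the form can be redisplayed without wiping the user's work.
```

A site following this pattern would highlight only the fields listed in `errors` and leave everything in `values` on screen.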



3. Overcommercialization of the Web


Aggravation factor: 62%


Interstitials, pop-ups, pop-unders, noisy Flash commercials, strobe-lit banner ads, video ads that load without user action... Just another day on the Web.


The idea of pushing advertising in exchange for free Web services has led to overcommercialization of the Web -- a major turnoff for surfers. At MySpace, Yahoo and even (we have to admit it) PCWorld.com, such advertising has grown more aggressive, increasingly annoying and impossible to avoid. On cluttered Web pages, ads jostle against each other and vie for screen real estate with the content that visitors actually came to see. The result? Slower connection speeds, slower page loads and far less user control over browsers.


Advertisements affect Web content, too. When sites measure the value of content by how many eyeballs it attracts to the ads, unusual, diverse or niche content can get squeezed out in favor of more reliably popular middle-of-the-road stuff. "I think in many ways, we have missed the potential of the Web -- much like we did with television," says Mike Tinsley, a disappointed Web user in Columbus, Ind. "When [the Web] was new, it held so much promise to be so useful for education, information and even entertainment. However, much like TV, the Web has sunk to the lowest common denominator, and I'm not sure we can ever get it back," Tinsley says.


The ad-driven online content industry will continue to devise innovative, eye-catching and obnoxious advertising formats, so things won't change for the better anytime soon. At the same time, browser makers and other software utility vendors may be able to offer some respite with features designed to restrain advertising annoyances. Browser makers like Microsoft and Mozilla should, by default, block animations or video ads from taking complete control of a Web page and obscuring the content a surfer is trying to view. At the very least, they should provide users an easy way to adjust the settings manually so as to block such intrusive annoyances.


4. Need for standards


Aggravation factor: 58%


Few things are more infuriating than going to a Web site and being told, "The page you have requested requires Internet Explorer to function properly."


The historical origin of this problem is Internet Explorer's incomplete (and sometimes incorrect) support for the core standards that are used to build Web pages. Because IE commands the largest market share among browsers, many Web designers build pages not to conform to standards, but to conform to IE. With Firefox's success, more and more sites (with the notable exception of some Microsoft sites) work properly in Mozilla's browser. But that still leaves users of Opera or Safari out in the cold. From online banking applications to newer Web 2.0-style sites, pages may not load properly on all browsers, which forces people to use different browsers for different sites.


If browsers were built to meet a consistent set of standards, this hiccup would disappear. Though each new version of IE has improved its support for standards, the problem persists because so many Web site developers continue to code only for IE, or IE and Firefox.


Having trouble creating a new document in Google Docs? The site's advice is so simplistic that it is unlikely to solve any real problems.


Among the high-profile offenders in this area are Google Docs, Washington Mutual and Yahoo -- none of which supports Opera or Safari.



5. Trolls in forums


Aggravation factor: 58%


The Internet can be a spacious platform for all sorts of community interaction, provided that the participants conduct themselves in a civil manner. Too often, though, they don't.


"I hate when I am on a forum and people just post random comments about how much somebody is a jerk or how their religion saves," said PC World reader Roberta Dikeman of Dublin, Calif. "Can we please stay on topic -- or post that drivel on your own sites!"


Hiding behind the pseudonymity of a Web alias, trolls disrupt useful discussions with ludicrous rants, inane threadjackings, personal insults and abusive language, deliberately baiting forum regulars into pointless controversy and disharmony.


Trolls lurk everywhere -- in Google and Yahoo news groups, in blog comment areas, and on specialty message boards created to offer technical help to users.


The free and fruitful exchange of ideas on the Web suffers when Web community owners have to moderate discussions and keep a tight rein on membership. But such actions are among the few effective ways to maintain civility and sanity in online forums. Another approach is for users to police the community themselves by collectively ignoring or dismissing malicious interlopers.


6. Buying event tickets


Aggravation factor: 54%


Sites like Ticketmaster have managed to transform one of the Internet's biggest conveniences -- the ability to buy and print out event tickets in a few mouse clicks -- into one of its biggest rip-offs. Never mind that automated ticketing companies have dispensed with much of the traditional overhead (staff, rent, equipment) associated with selling tickets at a physical location. Never mind that they don't have to print the tickets you buy or ship them to your home.


Ticketmaster.com, the world's largest ticketing agent, adds a $9 "convenience charge" to the price of every $32.50 ticket for a concert in San Francisco, for example, plus a $4.90 "processing fee" on top of every order. So if you buy one ticket, you pay nearly 43% of the face value of the ticket in fees to Ticketmaster! In contrast, assuming that the show isn't sold out, you can buy the same ticket at the Civic Auditorium box office sans convenience fees for $32.50 -- a savings of nearly $14.
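As a quick sanity check on those numbers, here is the arithmetic as a back-of-envelope calculation, using only the figures quoted above:

```python
# Fee arithmetic for a single-ticket order, using the article's figures.
face_value = 32.50
convenience_charge = 9.00   # per-ticket "convenience charge"
processing_fee = 4.90       # per-order "processing fee"

total_fees = convenience_charge + processing_fee   # 13.90
fee_ratio = total_fees / face_value                # ~0.428 of face value
box_office_savings = total_fees                    # what you save in person
```

Buying two or more tickets dilutes the per-order fee, but the per-ticket convenience charge alone is still over a quarter of the face value.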


One reason that Ticketmaster can impose such prices is that it faces little competition in the events ticketing business; the company holds exclusive contracts with the majority of venues in the U.S. In 1994, the rock band Pearl Jam famously complained to the U.S. Department of Justice that Ticketmaster's high prices were made possible by a monopoly, but the DOJ ultimately decided that Ticketmaster hadn't broken any antitrust laws.


7. Web 2.0 help doesn't help


Aggravation factor: 49%


Web 2.0 technology supports the delivery of useful applications in snazzy interactive Web interfaces, but if you need help wading through the site, the help section is often a dead end.


That's because the answers to many frequently asked questions presented there are too generic or obvious to be useful. For example, an application may not work properly because an essential browser plug-in is missing or because other software on the system is incompatible with the new app, but the FAQ and help pages on most sites don't address these problems specifically.


Rather than posting unhelpfully generic help sections and FAQs that fail to answer real-world questions, companies could invest in easy-to-use forums, wikis or chat rooms, and offer incentives to customers to assist each other in a community-driven environment.


8. The expense of e-books


Aggravation factor: 41%


Publishing and distributing books in electronic format should be a lot cheaper than doing it the old hard-copy way. No trees get pulped, and shipping costs vanish. So why should readers pay the same amount (or more) for the digital version of a book? Here's an example: At eBooks.com, Rhonda Byrne's The Secret retails for $15.29. Meanwhile, at Amazon.com, a hardcover copy of the same book (shipped to your doorstep) costs $13.17. Bizarre.


On average, publishers have set e-book prices for mass-market titles at between $8 and $16, the same range that they charge for the corresponding physical books. Supposedly, much of the sticker price goes to authors, who receive the same amount in royalties per book sold, regardless of the book's form. Publishers say they are still "working out the pricing models" -- that is, figuring out what people are willing to pay for the novelty of an e-book and what effect e-book sales will have on sales of hard copies.


9. Disappointing Web video


Aggravation factor: 38%


The picture quality of video delivered over the Internet gets better by the day, but the absence of top-shelf content continues to deter many would-be viewers from making the jump to online video.


Some major networks -- especially ABC and CBS -- have begun putting TV shows on the Web, but consumers are still struggling to find their favorite programs at a reasonable price.


In its TV Shows section, Apple's iTunes Store offers episodes at $1.99 a pop, but Rafat Ali, who tracks digital media at PaidContent.org, says that not all shows are available because large content owners (including HBO) believe that making online versions of their shows available will dilute the market for their cable television offerings.


"I can't go online and buy the last season of The Sopranos because HBO won't put it online. That's a big disappointment for a lot of viewers who love HBO's content," Ali says. "There are still a lot of hesitant content owners unwilling to put everything online."


10. Boring virtual worlds


Aggravation factor: 9%


Given the promise and hype surrounding virtual worlds, or metaverses, like Second Life, we found it interesting how few of our readers care about them. More than half of our survey takers said as much, while another 25% said that they aren't bothered at all by the quality of virtual worlds.


Yankee Group analyst Christopher Collins points out that while social networks like MySpace and Facebook continue to show phenomenal growth, the biggest virtual world, Second Life, has experienced a lower rate of traffic growth since its October 2006 peak.


Newcomers to virtual worlds (many of whom were attracted by the media hype) often leave for good after struggling with the basics of moving their avatar around or communicating with others "in-world." Their efforts aren't helped by the sites' often-clunky user interfaces or by regular software glitches. As of Oct. 7, 2007, according to Second Life's statistics, its virtual world had almost 10 million "total residents" (people signed up for the site), but only 1.3 million (13%) of them had logged in during the preceding 30 days. And only about 338,000 of them had logged in during the previous seven days.
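The engagement ratios implied by those Second Life figures are worth spelling out. This is simple arithmetic on the numbers quoted above, nothing more:

```python
# Engagement ratios from the Second Life statistics quoted above
# (as of Oct. 7, 2007).
total_residents = 10_000_000   # "almost 10 million" sign-ups
active_30_day = 1_300_000      # logged in during the preceding 30 days
active_7_day = 338_000         # logged in during the previous seven days

pct_30_day = active_30_day / total_residents * 100   # 13.0
pct_7_day = active_7_day / total_residents * 100     # ~3.4
```

In other words, only about one in thirty sign-ups was using the service in any given week.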


To attract wider audiences, virtual worlds will have to become at least as user-friendly, navigable and full of things to do as the real world. And they just might achieve that goal if the companies that operate them improve their software, introduce new technologies and learn lessons from their users.





Intel moves to consolidate 133 data centers into eight facilities


IT manager talks up benefits, but anonymous worker blogs about the pain of layoffs


An IT manager at Intel Corp. says that the company wants to run its data centers "like an Intel factory," a strategy that includes a plan to consolidate 133 existing IT facilities into eight data center hubs.


Intel currently has about 93,000 servers -- more than one for every employee. And many of them are based on single-core processors, according to Brently Davis, manager of the company's data center efficiency initiative.


Davis outlined Intel's data center consolidation strategy in a short blog posting and an accompanying video in which he responded to questions asked by someone who was outside of the camera's view. In response to some of the answers, the person asking the questions sprinkled in a couple of superlatives, such as, "Wow, that's amazing."


"Like most other companies these days, Intel is facing a growing demand for computing resources," Davis wrote in his blog posting. "As a result, our computing costs are going up along with that demand. All of these issues prompted us to take a hard look at our data center strategy to see where we could make it more efficient."


As part of the consolidation effort, Intel wants to increasingly move to servers with multicore processors as well as server virtualization software, Davis said in the video. Instead of running one operating system per server, the company wants to put four operating systems on individual machines, he added.


Davis said the consolidation moves may result in as much as $1.8 billion in cost savings for Intel over the next seven years. It is expected to take that long to fully implement the efficiency program, he said, although he added that the company hopes to accelerate the process and complete the work by 2010 or 2011.
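To put the virtualization piece of that plan in rough numbers, here is an illustrative estimate based only on the figures in this article (the 4:1 ratio and the server count come from the text; everything else is back-of-envelope, not Intel's published projection):

```python
# Back-of-envelope server consolidation estimate from the article's figures:
# ~93,000 mostly single-OS servers, moving to four OS instances per
# virtualized host.
servers_today = 93_000
os_instances_per_host = 4

# Ceiling division: every OS instance needs a home, so round up.
hosts_needed = -(-servers_today // os_instances_per_host)
print(hosts_needed)  # 23250
```

Even this crude estimate shows why the eight-hub plan is plausible: the physical footprint shrinks to roughly a quarter before counting any hardware refresh gains from multicore processors.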


Data center consolidation has become a mainstream trend within IT departments, including the ones at major technology vendors. For instance, Hewlett-Packard Co. has been reducing 85 data centers located worldwide into six major facilities, all in the U.S., under a consolidation plan announced two years ago.


If Intel did nothing, its server count would continue to rise and might reach 250,000 systems at some point, Davis said.


Intel is probably in the middle of the pack among large IT vendors in embarking on such a consolidation program, said Jonathan Eunice, an analyst at Illuminata Inc. in Nashua, N.H.


Eunice said companies have found that they can achieve dramatic savings through data center consolidation, which typically also involves consolidating applications and other software programs to ensure that different systems are running on the same versions of products and have the same patch levels. Consolidation also enables IT departments to move off of legacy hardware and applications to "a more modern foundation," he said.


While blogging by company officials is giving businesses a new way of reaching customers and the media, it's also arming some of their employees with a means of communicating anonymously with the outside world.


In Intel's case, a blogger who calls himself Intel IT Guy has been using his Intel Perspective Blog to detail the impact of ongoing cutbacks within the company's IT department.


Anonymous blogs are potentially fraught with credibility issues. But when Intel IT Guy described the layoff process in a September posting, an Intel spokesman confirmed that cuts were being made and said that the company intends to reduce its IT staff by as much as 10%.


In his most recent blog entry, posted on Wednesday, Intel IT Guy described IT workers at the company as "angry, frustrated, buried in work and looking for some leadership. And after worrying about keeping their jobs for three months, they now have to figure out how to try and keep the infrastructure from falling apart."


The blogger also criticized Intel's management for not having a concrete operating plan to follow the workforce reductions. "We need a vision, a strategy and a plan that shows where we're going and how we're going to get there," he wrote. "Resource reductions are an action, not a vision. They're a consequence, not a plan."






Space : One Collective Soul in Outer Space


A reality check on dreams for space: the repairs


The crews from the shuttle Discovery and the International Space Station had a farewell ceremony today and closed the hatch between the two craft. Discovery will undock tomorrow and prepare for its return to Earth on Wednesday.
It was a blubberfest.
This has been an intense mission - the planned tasks involved some of the toughest technical challenges in the history of the space station's construction process, and included adding a new room to the station, the Harmony module, and moving an enormous solar array and truss from its temporary position on top of the station to its far left side.
Astronaut Scott Parazynski worked along the truss assembly of the International Space Station on Saturday, preparing equipment for mounting on the boom extension. (Photo: NASA/Reuters)





But beyond those efforts, problems made the mission even tougher. The solar array tore as it was being re-deployed, setting off a scramble to come up with a spacewalk that could repair the tear and get the array functional before the shuttle left. Without that array fully extended and able to be rotated on its own rotary joint, space station construction would have been stalled and upcoming missions delayed.
And on top of that, spacewalkers detected damage to the rotary joint on the right side of the station, one that keeps the right-side solar arrays facing the sun - a problem that will have to be addressed down the road. In a high-risk, high-stakes spacewalk, Dr. Scott E. Parazynski fixed the array on Saturday.
So it's no surprise that the farewells are more than a little emotional. Clayton Anderson, who spent 137 days as a space-station crew member and will be coming home on Discovery, kept turning off his microphone as he was overcome with emotion as he thanked the "folks on the ground" - flight-control engineering and training teams in Houston, Huntsville and Moscow. "I say thank you," he said, his voice breaking. "You are indeed the best and the brightest that our world has to offer."
Over the communications loop, there was loud applause from the "folks on the ground."
Mr. Anderson then played the song "Reunion," by Collective Soul. His crewmates swayed to the music (which in zero gravity has to be seen to be believed) as it played, tinny over the orbit-to-ground transmission:
Change will come
Change is here
Love fades out
Then love appears
Now my water's turned to wine
And these thoughts I have
I now claim as mine
I'm coming home
Change has been
Change will be
Time will tell
Then time will ease
Now my curtain has been drawn
And my heart can go
Where my heart does belong
I'm going home
Discovery's commander for this mission, Pamela A. Melroy, also teared up as she thanked the station commander, Peggy A. Whitson, and the Russian cosmonaut on board the station, Yuri Malenchenko, and said goodbye to crew member Daniel M. Tani, who will stay aboard the station. "We promise we'll send somebody to come pick you up and bring you home," she joked.
"We're family now," she said.
Mr. Tani wiped his eyes repeatedly as well.
Col. Malenchenko made headlines in 2003 during his last stint aboard the station, when he got married from orbit: his bride, Ekaterina Dmitriev, was on Earth in the Villa Capri restaurant near the Johnson Space Center. A justice of the peace did the honors; Col. Malenchenko wore a bowtie with his flight suit and was represented on the ground by a paper cut-out.
One can only wonder what the stoic Col. Malenchenko thought of the waterworks from his American crewmates, but when it came time to say goodbye, he gave Mr. Anderson what looked like a real rib-crusher of a hug.
Anyone who wants to see the emotional session can tune in to NASA television, where the farewell ceremonies are replayed as part of the highlights reel that runs on the hour.




Neurology : Big MIT contingent at Society for Neuroscience meeting


MIT's excellence in brain research will be showcased next week in San Diego as Institute scientists give five of the 24 invited talks at the annual meeting of the Society for Neuroscience.


"This is an extremely high representation from one institution," said Mriganka Sur, chair of the Department of Brain and Cognitive Sciences and one of the MIT speakers.


The Society for Neuroscience is the world's largest organization of scientists devoted to studying the brain. Some 30,000 people are expected to attend the group's annual conference.


One of the five MIT researchers, H. Sebastian Seung, professor of computational neuroscience, will give the Presidential Special Lecture on "The Once and Future Science of Neural Networks." In his talk, Seung will describe a revolutionary new way to create a nanoscale-level map of the brain's axon and dendrite "wires" based on actual human brain tissue. This cutting-edge field, called computational neuroanatomy, is expected to confirm or deny long-held basic assumptions about how the brain works.


Seung's laboratory is one of a handful in the world working on a new initiative called "the connectome," which seeks to translate three-dimensional images of the brain at nanoscale resolution into a circuit diagram of all the brain's neurons and synaptic connections.


Current models of the human brain revolve around the belief that synaptic connections determine brain function and that synaptic plasticity underlies learning and memory. But now, "we have seen tantalizing hints that these basic ideas are at least partially true, as well as examples where they fall short," said Seung, who also is a Howard Hughes Medical Institute (HHMI) investigator. "The advent of high-throughput methods for gathering neurophysiological and neuroanatomical data will transform our ability to test the foundations of neural network theory."


The other MIT speakers and their topics are:


Mark F. Bear, Picower Professor of Neuroscience, director of the Picower Institute for Learning and Memory and HHMI investigator, will speak on "Modification of Cerebral Cortex by Experience." More than four decades of research on how synapses are formed, strengthened, weakened and lost under the influence of sensory experience have culminated in a deep understanding of the mechanisms for this synaptic plasticity. The knowledge ranges from novel insights into the pathophysiology of developmental disorders to new strategies to enhance perceptual learning and recovery from environmental deprivation, he said.


Li-Huei Tsai, Picower Professor of Neuroscience and HHMI investigator, will speak on "Mechanisms Underlying Prevention of Cognitive Decline and Restoration of Memory in Age-Dependent Neurodegenerative Disorders." The number of people with Alzheimer's disease is expected to triple from 5 to 15 million by the year 2050. Age-dependent neurodegeneration and dementia are currently incurable. Tsai's lecture will focus on recent research of cellular mechanisms, including synaptic plasticity, which have provided models for understanding neurodegeneration and memory loss. A particular emphasis will be placed on new strategies to prevent decline or restore memories.


Mriganka Sur, Sherman Fairchild Professor of Neuroscience, will speak on "New Approaches for Revealing Cortical Function: Plasticity and Dynamics of Visual Cortex Networks." Sur will talk about how a range of novel tools, including in vivo high-resolution imaging of neurons, synapses and astrocytes, cell-specific markers and genetically engineered probes, along with new experimental paradigms, are transforming the analysis of cortical networks.


Susan L. Lindquist, professor of biology, member of the Whitehead Institute for Biomedical Research and HHMI investigator, will give the Albert and Ellen Grass Lecture. Proteins begin as long strings that must fold precisely. The misfolding of certain amyloidogenic proteins associated with neuronal cells is responsible for certain neurodegenerative diseases. Surprisingly, she said, similar changes in the folding of other proteins may have beneficial effects in learning and memory. Lindquist's lecture will investigate therapeutic strategies to control the folding of amyloidogenic proteins.




