
Saturday, October 20, 2007

Comcast really does block BitTorrent traffic after all: Comcast Screws with File-Sharing Traffic


For a few months Comcast has been the subject of scattered reports that say it throttles BitTorrent traffic.


TorrentFreak said in August that Comcast was surreptitiously interfering with file transfers by posing as one party and then, essentially, hanging up the phone. But when we contacted Comcast at the time, it flatly denied doing it.


Thanks to tests reported Friday by the Associated Press, however, it's clear that Comcast is actively interfering with peer-to-peer networks even when relatively small files are being transferred.


The tests involved transferring a copy of the King James Bible through the BitTorrent network on Time Warner Cable, Cablevision, AT&T and three Comcast connections (in Philadelphia, Boston and San Francisco). Only the Comcast-connected computers were affected.


This is significant. The Gutenberg version of the King James Bible is only 4.24MB, which is relatively tiny and indicates that Comcast was singling out even small files.


Now, even though there's been some musing that Comcast can't do this, I'd be surprised if a court would say that it was somehow unlawful. Comcast's Terms of Service says: "You further agree to comply with all Comcast network, bandwidth, and data storage and usage limitations. You shall ensure that your bandwidth consumption using the Service does not exceed the limitations that are now in effect or may be established in the future. If your use of the Service results in the consumption of bandwidth in excess of the applicable limitations, that is a violation of this Policy...if the Service is used in a way that Comcast or its suppliers, in their sole discretion, believe violate this AUP, Comcast or its suppliers may take any responsive actions they deem appropriate."


Which is pretty broad.


The danger for Comcast is twofold. First, its hyperactive filtering may zap perfectly legitimate file transfers, which seems to have happened in one case involving a customer using Lotus Notes.


Second, it encourages countermeasures such as obfuscating BitTorrent traffic or encrypting it. That means that future efforts by Comcast to manage its traffic may be far more difficult. (If Comcast had merely slowed down BitTorrent transfers instead of cutting them off completely, users wouldn't be escalating this arms race as quickly.)


Probably the best result would be tiered pricing. BitTorrent users who are heavy users of bandwidth would pay more, while average home users would pay less. It's not perfect, and lots of Internet users may not like a tiered pricing model, but it's probably better than escalating a technological arms race, or not being able to use BitTorrent at all.





Comcast Screws with File-Sharing Traffic



Tests reveal Comcast meddles with P2P network connections

Independent testing performed by the AP has revealed that Comcast actively interferes with peer-to-peer traffic going to and from its high-speed internet subscribers, by impersonating users' machines and sending fake disconnect signals.
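The "fake disconnect signal" here was reportedly a forged TCP reset (RST) segment injected toward both ends of a transfer: each machine believes the other hung up. As a rough illustration only (not Comcast's actual implementation; the addresses and ports below are made up, with 6881 chosen because it is a traditional BitTorrent port), this sketch builds the 20-byte TCP header such a reset would carry:

```python
import struct

def tcp_rst_header(src_port, dst_port, seq, src_ip, dst_ip):
    """Build a minimal 20-byte TCP header with only the RST flag set.

    A forged reset like this, if its sequence number falls inside the
    victim's receive window, makes the receiver tear down the connection
    as if the peer had hung up.
    """
    offset_flags = (5 << 12) | 0x004        # data offset = 5 words, RST bit
    header = struct.pack("!HHIIHHHH",
                         src_port, dst_port,
                         seq, 0,            # seq number; ack unused for RST
                         offset_flags,
                         0,                 # advertised window = 0
                         0,                 # checksum placeholder
                         0)                 # urgent pointer
    # The TCP checksum also covers a pseudo-header of the IP addresses
    pseudo = struct.pack("!4s4sBBH",
                         bytes(map(int, src_ip.split("."))),
                         bytes(map(int, dst_ip.split("."))),
                         0, 6, len(header))  # zero, protocol=TCP, TCP length
    data = pseudo + header
    total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
    total = (total & 0xFFFF) + (total >> 16)  # fold carries into 16 bits
    total = (total & 0xFFFF) + (total >> 16)
    checksum = ~total & 0xFFFF
    return header[:16] + struct.pack("!H", checksum) + header[18:]

hdr = tcp_rst_header(6881, 51413, 12345678, "10.0.0.1", "10.0.0.2")
flags = struct.unpack("!H", hdr[12:14])[0] & 0x3F
print(flags)  # 4: the RST bit alone is set
```

The point of the sketch is simply that nothing in the header authenticates the sender, which is why an appliance in the middle can impersonate either endpoint.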

While traffic shaping - throttling a given piece of Internet traffic based on its type, such as BitTorrent or VoIP - is becoming increasingly common among ISPs interested in preserving quality of service, Comcast appears to be one of the first companies to actively impersonate individual connections. Most providers simply slow down some traffic in favor of other traffic, or block a protocol's port number to prevent it from functioning.
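The gentler "slow down some traffic" approach is classically implemented with a token-bucket shaper: a flow may only forward bytes while tokens are available, capping its sustained rate while still permitting short bursts. A minimal sketch (illustrative only, not any particular ISP's implementation):

```python
import time

class TokenBucket:
    """Classic token-bucket shaper: sustained throughput is capped at
    `rate` bytes/second, with bursts of up to `capacity` bytes."""

    def __init__(self, rate, capacity):
        self.rate = rate            # refill rate, bytes per second
        self.capacity = capacity    # maximum burst size, bytes
        self.tokens = capacity      # start with a full bucket
        self.last = time.monotonic()

    def allow(self, nbytes):
        # Refill tokens in proportion to elapsed time, up to capacity
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True             # forward the packet now
        return False                # over budget: delay or drop it

bucket = TokenBucket(rate=125_000, capacity=10_000)  # ~1 Mbit/s sustained
print(bucket.allow(1500))  # True: the burst allowance covers one packet
```

Shaping like this merely delays traffic, which is the contrast the article draws with Comcast's approach of terminating connections outright.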

According to the report, Comcast's technology affects users across many different networks, including eDonkey, Gnutella, and BitTorrent. Robb Topolski, a former software quality engineer at Intel and a Comcast subscriber, began to notice unexplained performance problems with his P2P software. Posting to the popular forum DSLReports.com, he collected similar reports from other Comcast users around the country.

In the case of BitTorrent, Comcast's technology only kicks in when a user's client has a complete copy of the file and is uploading it to other users, and not while downloading.

Comcast spokesman Charlie Douglas would not comment directly on the matter, instead only saying, "Comcast does not block access to any applications, including BitTorrent."

There are currently very few regulations regarding traffic shaping, and none that specifically cover Comcast's particular use. The FCC says that while consumers are entitled to run the applications and services of their choice, they are subject to measures of "reasonable network management" by their ISPs. The closest directive governing Comcast's behavior - which still doesn't directly apply - would be found in AT&T's conditions for acquiring BellSouth, where it had to agree not to manipulate traffic in any way based on its origin - not service type.

Comcast's "traffic discrimination" has important ramifications for the growing number of services that leverage P2P to distribute large files quickly and cheaply. A company like Blizzard Entertainment, which relies on BitTorrent for distributing World of Warcraft updates that often measure hundreds of megabytes, may have trouble reaching its players if it or they are behind a Comcast internet connection. This problem will only worsen if other ISPs decide on a similar course of action.

Ashwin Navin, co-founder and president of BitTorrent Inc., confirmed the AP's findings and noted that he has seen similar practices from several Canadian ISPs.

"They're using sophisticated technology to degrade service, which probably costs them a lot of money. It would be better to see them use that money to improve service," Navin said.





Errors blamed for nuclear arms going undetected


Air Force weapons officers assigned to secure nuclear warheads failed on five occasions to examine a bundle of cruise missiles headed to a B-52 bomber in North Dakota, leading the plane's crew to unknowingly fly six nuclear-armed missiles across the country.


That August flight, the first known incident in which the military lost track of its nuclear weapons since the dawn of the atomic age, lasted nearly three hours, until the bomber landed at Barksdale Air Force Base in northern Louisiana.


But according to an Air Force investigation presented to Defense Secretary Robert M. Gates on Friday, the nuclear weapons sat on a plane on the runway at Minot Air Force Base in North Dakota for nearly 24 hours without ground crews noticing the warheads had been moved out of a secured shelter.


"This was an unacceptable mistake," said Air Force Secretary Michael W. Wynne at a Pentagon news conference. "We would really like to ensure it never happens again."


For decades, it has been military policy to never discuss the movement or deployment of the nuclear arsenal. But Wynne said the accident was so serious that he ordered an exception so the mistakes could be made public.


On Aug. 29, North Dakota crew members were supposed to load 12 unarmed cruise missiles in two bundles under the B-52's wings to be taken to Louisiana to be decommissioned. But in what the Air Force has ruled were five separate mistakes, six missiles contained nuclear warheads.


According to the investigation, the chain of errors began the day before the flight when Air Force officers failed to inspect five bundles of cruise missiles inside a secure nuclear weapons hangar at Minot. Some missiles in the hangar have nuclear warheads, some have dummy warheads, and others have neither, officials said.


An inspection would have revealed that one of the bundles contained six missiles with nuclear warheads, investigators said.


"They grabbed the wrong ones," said Maj. Gen. Richard Newton, the Air Force's deputy chief of staff in charge of operations.


After that, four other checks built into procedures for checking the weapons were overlooked, allowing the plane to take off Aug. 30 with crew members unaware that they were carrying enough destructive power to wipe out several cities.


Newton said that even though the nuclear missiles were hanging on the B-52's wings overnight without anyone knowing they were missing, the investigation found that Minot's tarmac was secure enough that the military was never at risk of losing control of the warheads.


The cruise missiles were supposed to be transported to Barksdale without warheads as part of a treaty that requires the missiles to be mothballed. Newton said the warheads are normally removed in the Minot hangar before the missiles are assigned to a B-52 for transport.


The Air Force did not realize the warheads had been moved until airmen began taking them off the plane at Barksdale. The B-52 had been sitting on the runway there for more than nine hours, however, before they were offloaded.


Newton did not say what explanation the Minot airmen gave investigators for their repeated failure to check the warheads once they left the secured hangar, saying only that there was inattention and "an erosion of adherence to weapons-handling standards."


Air Force officials who were briefed on the findings said investigators found that personnel lacked neither the time nor the resources to perform the inspections, indicating that the weapons officers had become lackadaisical in their duties.


One official noted that until the Air Force was given the task of decommissioning the cruise missiles this year, it had not handled airborne nuclear weapons for more than a decade, implying that most of the airmen lacked experience with the procedures.


The Air Force has fired four colonels who oversaw aircraft and weapons operations at Minot and Barksdale, and some junior personnel have also been disciplined, Newton said. The case has been handed to a three-star general who will review the findings and determine whether anyone involved should face court-martial proceedings.


Despite the series of failures, Newton said, the investigation found that human error, rather than inadequate procedures, was at fault. Gates has ordered an outside panel headed by retired Gen. Larry D. Welch, a former Air Force chief of staff, to review the Pentagon's handling of nuclear weapons.




From CNN International :Air Force officers relieved of duty over loose nukes


A six-week probe into the mistaken flight of nuclear warheads across the country uncovered a "lackadaisical" attention to detail in day-to-day operations at the air bases involved in the incident, an Air Force official said Friday.


Four officers -- including three colonels -- have been relieved of duty in connection with the August 29 incident in which a B-52 bomber flew from Minot Air Force Base in North Dakota to Barksdale Air Force Base in Louisiana.


The plane unknowingly carried a payload of nuclear-tipped cruise missiles.


"Nothing like this has ever occurred," Newton said.


"Our extensive, six-week investigation found that this was an isolated incident and that the weapons never left the custody of airmen -- were never unsecured -- but clearly this incident is unacceptable to the people of the United States and to the United States Air Force."


The probe also found there was "an erosion of adherence to weapons-handling standards at Minot Air Force Base and at Barksdale Air Force Base," Newton said.


"We have acted quickly and decisively to rectify this," he added.





Relieved of duty were the Minot wing commander and maintenance crew commander, and the Barksdale operational group commander.


Minot's munitions squadron commander was relieved of duty shortly after the incident.


Newton didn't name any of the officers, but Col. Bruce Emig had been the commander of the 5th Bomb Wing at Minot.


A number of other personnel -- "under 100," Newton said, including the entire 5th Bomb Wing at Minot -- have lost their certification to handle sensitive weaponry.


The matter will be referred to an Air Force convening authority to find out whether there's enough evidence to bring charges or any other disciplinary action against any personnel, Newton said.


Air Force Secretary Michael Wynne called the incident "an unacceptable mistake and a clear deviation from our exacting standards."


"We are making all appropriate changes to ensure that this has a minimal chance of happening again, but we would really like to ensure that it never happens again," he said.


Wynne has convened a blue-ribbon panel to review all of the Air Force's security procedures and adherence to them. That panel is to report back on January 15.


The probe into the incident, which ended this week, lists five errors -- all of them procedural failures to check, verify and inspect, Newton said.


The investigation found that nuclear warheads were improperly handled and procedures were not followed as the missiles were moved from their storage facility, transferred to the bomber and loaded onto it, Newton said.


The bomber carried six nuclear warheads on air-launched cruise missiles, but the warheads should have been removed from the missiles before they were attached to the B-52.


A munitions crew at Barksdale followed proper procedure when the plane landed, discovering the error and reporting it up the chain of command, Newton said.


The weapons were secured in the hands of airmen at all times and had been stored properly at Minot, Newton said.







one happy cow



This is one happy cow


No smacking, no prodding, no shouting. Britain's most enlightened farms are revolutionising the way meat is reared. And the reason? Contented cattle make great steaks.



The boggy fields around Bill Cassells' bungalow are dotted with great humps of ginger hair, like an invasion of alien fungus. Venture a little closer and you find that these move. Behind curtain-sized fringes are huge brown eyes and above them a spread of horns a good three feet wide. These are Highland cattle and, for all that they look like half a ton of fright wig on legs, they're the epitome of mellow. They wander over to inspect us at Bill's call, and are hardly put out even when Murdo, OFM's photographer, starts planting lighting reflectors and flash guns around them. Anna, the five-year-old matriarch of this 75-strong herd, slouches forward to pose for the picture with all the cool of a catwalk veteran. These are happy cattle, these Highlanders. They live outside on the hills above Scotland's River Spey, eating heather and the moorland grass, supplemented in winter with draff, the malty remains of the grain used in the nearby whisky distilleries.



What could top the "Got Milk?" campaign from the 1990s?


The Happy Cows campaign from California!


Apparently there's a bit of competition between US states for buyers of cheese. Hmm, will I get Vermont cheese or California cheese? To help people decide, the California Milk Advisory Board commissioned Deutsch Los Angeles to make some very expensive but very effective TV ads. The result? A series of vignettes in which we are informed that dairy cows living in California just love it there. So the milk has to be better!


"Great cheese comes from happy cows.


Happy cows come from California."


On the Real California Cheese web site there are three of the TV spots available. "Big Sheep" - reminiscent of Braveheart. "Breaking Out" - a dairy farm set in the Arctic, or is it Vermont? "Girl Talk" - what the gender-related chat of cows might sound and look like.


To see the ads go to www.realcaliforniacheese.com


Click to get past the splash screen, then choose "Happy Cows", and "Happy Cow TV".


While you're there you can even download a happy cows calendar, or purchase a cute and cuddly set of happy cows puppets.



Now what makes this campaign interesting is the 'anti-campaign' by PETA, People for the Ethical Treatment of Animals, found at www.unhappycows.com. PETA took the California Milk Advisory Board to court for misrepresenting the lot of dairy cows. According to the suit, dairy cows often have to put up with dusty, muddy and dirty conditions, so picturing them in green fields is misleading. The suit was unsuccessful, and it may even have led to higher interest in the happy cows. And what do the happy/unhappy cows think of it all, I wonder?







Computers With 'Common Sense'



Using a little-known Google Labs widget, computer scientists from UC San Diego and UCLA have brought common sense to an automated image labeling system. This common sense is the ability to use context to help identify objects in photographs.


For example, if a conventional automated object identifier has labeled a person, a tennis racket, a tennis court and a lemon in a photo, the new post-processing context check will re-label the lemon as a tennis ball.


"We think our paper is the first to bring external semantic context to the problem of object recognition," said computer science professor Serge Belongie from UC San Diego.


The researchers show that the Google Labs tool called Google Sets can be used to provide external contextual information to automated object identifiers.



Google Sets generates lists of related items or objects from just a few examples. If you type in John, Paul and George, it will return the words Ringo, Beatles and John Lennon. If you type "neon" and "argon" it will give you the rest of the noble gases.


"In some ways, Google Sets is a proxy for common sense. In our paper, we showed that you can use this common sense to provide contextual information that improves the accuracy of automated image labeling systems," said Belongie.


The image labeling system is a three step process. First, an automated system splits the image up into different regions through the process of image segmentation. In the photo above, image segmentation separates the person, the court, the racket and the yellow sphere.


Next, an automated system provides a ranked list of probable labels for each of these image regions.


Finally, the system adds a dose of context by processing all the different possible combinations of labels within the image and maximizing the contextual agreement among the labeled objects within each picture.


It is during this step that Google Sets can be used as a source of context that helps the system turn a lemon into a tennis ball. In this case, these "semantic context constraints" helped the system disambiguate between visually similar objects.
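The "maximize contextual agreement" step can be pictured with a toy brute-force version: each segmented region keeps a ranked list of candidate labels with detector scores, and pairwise context scores reward label combinations that make sense together. The numbers below are invented purely for illustration (in the actual system the context scores come from Google Sets or from label co-occurrence in training data, and the search is not brute force):

```python
from itertools import product

# Candidate (label, detector score) lists per segmented region,
# mirroring the tennis-court example from the article.
candidates = {
    "region1": [("person", 0.9)],
    "region2": [("tennis racket", 0.8)],
    "region3": [("lemon", 0.6), ("tennis ball", 0.5)],
}

# Hypothetical pairwise context scores: high when two labels
# plausibly appear in the same scene.
cooccur = {
    frozenset(["tennis racket", "tennis ball"]): 1.0,
    frozenset(["person", "tennis racket"]): 0.8,
    frozenset(["person", "tennis ball"]): 0.6,
    frozenset(["person", "lemon"]): 0.1,
    frozenset(["tennis racket", "lemon"]): 0.0,
}

def best_labeling(candidates, cooccur):
    """Pick one label per region, maximizing detector scores plus
    contextual agreement over all label pairs in the image."""
    regions = list(candidates)
    best, best_score = None, float("-inf")
    for combo in product(*(candidates[r] for r in regions)):
        labels = [lab for lab, _ in combo]
        score = sum(s for _, s in combo)          # per-region evidence
        for i in range(len(labels)):              # pairwise context bonus
            for j in range(i + 1, len(labels)):
                score += cooccur.get(frozenset([labels[i], labels[j]]), 0.0)
        if score > best_score:
            best, best_score = dict(zip(regions, labels)), score
    return best

print(best_labeling(candidates, cooccur)["region3"])  # tennis ball
```

Even though "lemon" scores higher on appearance alone (0.6 vs. 0.5), the context bonus from the racket and the person tips the combined score toward "tennis ball", which is exactly the disambiguation the article describes.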


In another example, the researchers show that an object originally labeled as a cow is (correctly) re-labeled as a boat when the other objects in the image - sky, tree, building and water - are considered during the post-processing context step. In this case, the semantic context constraints helped to correct an entirely wrong image label. The context information came from the co-occurrence of object labels in the training sets rather than from Google Sets.


The computer scientists also highlight other advances they bring to automated object identification. First, instead of doing just one image segmentation, the researchers generated a collection of image segmentations and put together a shortlist of stable image segmentations. This increases the accuracy of the segmentation process and provides an implicit shape description for each of the image regions.


Second, the researchers ran their object categorization model on each of the segmentations, rather than on individual pixels. This dramatically reduced the computational demands on the object categorization model.


In the two sets of images that the researchers tested, the categorization results improved considerably with inclusion of context. For one image dataset, the average categorization accuracy increased more than 10 percent using the semantic context provided by Google Sets. In a second dataset, the average categorization accuracy improved by about 2 percent using the semantic context provided by Google Sets. The improvements were higher when the researchers gleaned context information from data on co-occurrence of object labels in the training data set for the object identifier.


Right now, the researchers are exploring ways to extend context beyond the presence of objects in the same image. For example, they want to make explicit use of absolute and relative geometric relationships between objects in an image - such as "above" or "inside" relationships. This would mean that if a person were sitting on top of an animal, the system would consider the animal to be more likely a horse than a dog.


Reference: "Objects in Context," by Andrew Rabinovich, Carolina Galleguillos, Eric Wiewiora and Serge Belongie of the Department of Computer Science and Engineering at the UCSD Jacobs School of Engineering, and Andrea Vedaldi of the Department of Computer Science, UCLA.


The paper will be presented on Thursday 18 October 2007 at ICCV 2007 - the 11th IEEE International Conference on Computer Vision in Rio de Janeiro, Brazil.


Funders: National Science Foundation, Alfred P. Sloan Research Fellowship, Air Force Office of Scientific Research, and Office of Naval Research.






Killifish can adapt



Killifish can adapt to life in a tree


Biologists in Belize and Florida have discovered that the mangrove killifish lives in trees when the water it usually lives in has disappeared.



The London Telegraph said scientists found hundreds of killifish hiding in the rotting branches and trunks of mangrove trees.


Scott Taylor of the Brevard County Environmentally Endangered Lands Program in Florida told New Scientist magazine that the fish -- which normally live in mangrove swamps -- are able to change their bodies and metabolism to cope with life out of water. The changes are reversed as soon as they return to the water.






Best Solar Homes:



Best Solar Homes: German Team Wins Solar Decathlon


Second place went to the University of Maryland. The Maryland LEAFHouse has one of the few technical innovations in the competition -- a waterfall that incorporates design and function to reduce both moisture and the energy needed for air conditioning.



The Solar Decathlon challenged 20 college and university teams, competing across 10 contests, to design, build, and operate the most attractive and energy-efficient solar-powered home.


The Solar Decathlon's homes are zero-energy, yield zero carbon, and include the latest high-tech solutions and money-saving benefits to consumers, without sacrificing comfort, convenience, and aesthetics. Each house must also produce enough "extra" energy to power an electric vehicle. Many of the solar power and building technologies showcased on the National Mall are available for purchase and use. Teams have worked for more than two years designing, building and testing their homes - the Solar Decathlon is the culmination of that work.


The ten contests that decide the Solar Decathlon measure many aspects of a home's performance and appearance. A perfect total score for all ten contests is 1,200 points.


First Place: Technische Universität Darmstadt


This team from Germany came to the Solar Decathlon hoping to have an impact on people, and it's safe to say that this happened. Darmstadt won the Architecture, Lighting, and Engineering contests. The Architecture Jury said the house pushed the envelope on all levels and is the type of house they came to the Decathlon hoping to see. The Lighting Jury loved the way this house glows at night. The Engineering Jury gave this team the highest possible innovation score, and said nobody integrated the PV system any better. Darmstadt was one of seven teams to score a perfect 100 points in the Energy Balance contest. All week, long lines of people waited to get into this house. Total points: 1,024.85.


Second Place: University of Maryland


At the beginning of the week, people wondered if the Maryland team would have a home-field advantage because they are so close to Washington, D.C. As the week progressed, and Maryland won the Communications contest and finished second in Architecture, Market Viability, and Lighting, it became clear that Maryland didn't need any advantage. The Communications Jury praised their excellent Web site and house tour. The Architecture Jury said the house definitely belonged in the top tier. The Lighting and Market Viability juries also had high praise. Maryland was one of seven teams to score a perfect 100 points in the Energy Balance contest. Total points: 999.807.


Third Place: Santa Clara University


This team wanted to build a sustainable solar house that is functional, elegant, and innovative - and they did just that. The Communications Jury lauded their friendly, enthusiastic house tour, which was informative, entertaining, and very much "on target" for public audiences. They were one of five teams to score a perfect 100 points in the Hot Water contest and one of seven teams to score a perfect 100 points in the Energy Balance contest. Their house almost didn't make it to the Solar Decathlon, because their transport truck broke an axle and delayed them by three days. Total points: 979.959



MORE NEWS.....




Thin-layer Solar Cells May Bring Cheaper Green Power


Scientists are researching new ways of harnessing the sun's rays which could eventually make it cheaper for people to use solar energy to power their homes.


The experts at Durham University are developing light-absorbing materials for use in the production of thin-layer solar photovoltaic (PV) cells which are used to convert light energy into electricity.


The four-year project involves experiments on a range of different materials that would be less expensive and more sustainable to use in the manufacturing of solar panels.


Thicker silicon-based cells and compounds containing indium, a rare and expensive metal, are more commonly used to make solar panels today.


The research, funded by the Engineering and Physical Sciences Research Council (EPSRC) SUPERGEN Initiative, focuses on developing thin-layer PV cells using materials such as copper indium diselenide and cadmium telluride.


Right now the project is entering a new phase for the development of cheaper and more sustainable variants of these materials.


The Durham team is also working on manipulating the growth of the materials so they form a continuous structure which is essential for conducting the energy trapped by solar panels before it is turned into usable electricity. This will help improve the efficiency of the thin-layer PV cells.


It's hoped that the development of more affordable thin-film PV cells could lead to a reduction in the cost of solar panels for the domestic market and an increase in the use of solar power.


Solar power currently provides less than one hundredth of one percent of the UK's home energy needs.


The thin-layer PV cells would be used to make solar panels that could be fitted to roofs to help power homes, with any surplus electricity being fed back to the National Grid.


This could lead to cheaper fuel bills and less reliance on burning fossil fuels as a way of helping to generate electricity.


Professor Ken Durose, Director of the Durham Centre for Renewable Energy, who is leading the research, said: "One of the main issues in solar energy is the cost of materials and we recognise that the cost of solar cells is slowing down their uptake.


"If solar panels were cheap enough so you could buy a system off the shelf that provided even a fraction of your power needs you would do it, but that product isn't there at the moment.


"The key indicator of cost effectiveness is how many pounds do you have to spend to get a watt of power out?


"If you can make solar panels more cheaply then you will have a winning product."
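Professor Durose's "pounds per watt" indicator is a simple ratio, and it can be sketched in a few lines. The panel prices and power ratings below are hypothetical examples chosen for illustration, not figures from the Durham research:

```python
# Illustrative sketch of the pounds-per-watt cost metric described above.
# All prices and power ratings here are made-up examples, not Durham data.

def pounds_per_watt(panel_cost_gbp, rated_output_watts):
    """Cost effectiveness: how many pounds buy one watt of peak output."""
    return panel_cost_gbp / rated_output_watts

# A conventional thick-silicon panel: 180 W for 700 pounds (hypothetical)
silicon = pounds_per_watt(700, 180)

# A cheaper thin-film panel: 120 W for 300 pounds (hypothetical)
thin_film = pounds_per_watt(300, 120)

print(f"silicon:   {silicon:.2f} GBP/W")    # ~3.89 GBP/W
print(f"thin-film: {thin_film:.2f} GBP/W")  # 2.50 GBP/W
```

On these made-up numbers the thin-film panel wins despite its lower output, which is exactly the trade-off the Durham project is betting on: cheaper materials per watt, not more watts per panel.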


To aid its research the university has taken delivery of a £1.7 million suite of high powered electron microscopes, funded by the Science Research Investment Fund, which have nano-scale resolution allowing scientists to see the effects that currently limit the performance of solar cells.


One of the microscopes is the first of its kind in the UK and Professor Durose said: "This instrument will put the North East right out in front.


"We are working on new ideas in renewable energy and this opens up tremendous opportunities in research."





MU researchers consider red wine



Here's another healthy reason to drink wine: It fights food-borne pathogens. A University of Missouri-Columbia study recently found promising evidence that harmful bacteria such as salmonella and E. coli can be killed with a little red wine or grape juice.


"People have been consuming wine for eons," said Azlin Mustapha, a microbiologist in the food science program at the College of Agriculture, Food and Natural Resources who used more than a dozen commercial wines in the research. "Even before we understood what it did, people were already using it for medicinal effects."


Anecdotal evidence and red wine's known cardiovascular benefits - as well as its potential for lowering cholesterol and acting as an anti-carcinogen - led Mustapha and doctoral student Atreyee Das to explore whether red wine could kill pathogens while allowing probiotics - "good" bacteria - to live.


The Centers for Disease Control and Prevention estimates that food-borne diseases affect 76 million Americans each year and result in 5,000 deaths. "What we're seeing now already is very promising," Mustapha said, explaining that when pathogens are introduced to a test tube containing certain red wines, pathogens are killed.


"Sixty percent concentration wine is enough to kill bacteria," said Das, 26, who began the research as her doctoral work in the winter semester. "We're trying to delve in deeper," she added, noting that additional pathogens will be tested. Das and Mustapha said animal tests will be conducted next, a necessary step toward learning whether the wines have a similar effect in humans.


Mustapha said that merlot, cabernet, pinot noir and shiraz varieties worked best, with no differentiation between brands or domestic and imported wines.


White wines and grape juice were also tested but had no effect, indicating that something present in the grapes' skin, which produces the red color, fights off the bacteria. Although the results of the experiment are promising, Mustapha warns this is no reason to head to a local vineyard on a daily basis.


"I would not recommend that people go out and consume wine in excess," she said, noting that it is not known whether drinking wine before exposure to pathogens or after the fact would be most effective.


The researchers estimate they have two or three more years of research on the subject. Mustapha said research like this might be used to produce a novel wine containing probiotics.








Unity3D













OTEE's Unity3D promises to be what the Mac needs and many of us have wanted for years, a program that is to 3D what Macromedia Director was to 2D. It's close enough that it will take your breath away, but it falls short of its promise in ways that may frustrate many potential users.


A Little History


There have been many radically innovative programs on the Macintosh: MacWrite, PageMaker, ImageStudio (the monochrome precursor to Photoshop), Live Picture (even if it ultimately failed), Illustrator, Fontographer, Authorware, FutureSplash Animator (Flash), Avid Media Composer, After Effects, Dreamweaver, Swivel 3D, HyperCard, Lightspeed Pascal, PowerPoint, and even Excel. Phew, what a long list.


One of the most radically innovative Macintosh programs started life as Videoworks, but became well-known as Director. Director is now circling the drain, with no significant updates since 8.5 (MX was simply "Carbonised", and MX 2004 added a bunch of DVD support and made Lingo more like JavaScript), over five years ago, and no Universal Binary.


So, what is a multimedia/game developer seeking reasonably state-of-the-art tools to do? Director is moribund, and most of its competitors - from mTropolis to iShell - have fallen by the wayside.


Unity3D - The Promise



Physics? I ducked that class in high school! Here is a whole bunch of programming you won't have to do.


Unity3D sets out to be the ultimate 3D game development environment. In many ways it succeeds, and in fact it succeeds so well in many respects that it comes close to being the ultimate multimedia development environment. For $250 (for the indie version) you can easily put together interactive 3D programs, dashboard widgets, and web plugins. With a little more effort you can produce 3D screen savers.


For $1500 (for the pro version) you can also produce Windows executables and gain access to a number of higher-end features, including render-to-texture, C++ plugins, and gorgeous preset full-screen and shader effects (such as the currently popular "bloom" effect). You can put together some things with little or no programming. If you just want to display a 3D model or animation, you don't need to write a line of code.


To produce an interactive walkthrough, you need only write a few lines of code. You can bring in models and animations produced using Maya, Cinema 4D, Cheetah 3D, or any program that can export to FBX, which covers most high-end 3D software, including Lightwave 3D, Softimage, and 3D Studio Max. With the first three programs, the workflow is seamless. Otherwise, you simply need to export an FBX. Unity3D also supports Blender files, although it currently cannot import animations.


If you do want to program interactivity, Unity3D offers you a choice of three scripting languages: "JavaScript," C#, and a fairly obscure language called Boo (think compiled Python). Each of these languages is pretty easy for a programmer familiar with C-like syntax (e.g. PHP) to pick up, although the environment is a little quirky.


Features


Unity3D is very much a state-of-the-art real-time 3D tool. First, it supports all the latest graphics hardware (you can even write your own vertex and pixel shaders). If you don't know what vertex and pixel shaders are, then the short answer is pixel shaders are the difference between DX8-10 and DX7. They're what CoreImage is built on. Vertex shaders are the difference between being able to barely animate two human figures at a time (in say the original Tomb Raider) and being able to animate forests.




Coding Unity3D. This much JavaScript code handles the character (collisions are handled without code in this case) and the camera movement. The behavior is completely configurable from the IDE (e.g. you can change the character's walking speed, turn rate, and so on).


Second, Unity has the Aegia Physics engine built in. You can simply click on an object and pick "rigidbody" from a menu to make that object subject to gravity and collisions. Stack up some objects, make them rigidbodies, click "play" and you see a Physics simulation instantly. Unity3D also supports hinges/joints, springs, forces, and so on. Unity's built-in particle system is terrific, and it plays well with Physics too.


Third, Unity offers excellent audio support, including effortless implementation of 3D positional audio (including doppler-shifting sounds from moving targets).


Pretty much everything you'd expect to work, whether it's importing animations from powerful animation tools like Maya, or normal maps, or sky boxes, just works. (If you've ever tried to work with Director's 3D import, and character animation in particular, this may sound like Science Fiction, but yes, it's true.)


The simplest way to put it is this. Go look at a current single player AAA console game. Chances are, Unity3D can do that stuff. But remember, I said "single player" and "console". Unity3D doesn't have much support for networked multiplayer games. It offers all the networking functionality present in .NET 1.x, but this is kind of like saying a C compiler lets you write networked multiplayer games. It's also lacking in 2D and video support, as we shall see.


Underpromise and Overdeliver


In some ways, Unity actually exceeds expectations. It comes with very good examples (more are available on the website), solid tutorials (including several video tutorials), and the features of Unity 3D that were developed internally are well documented. Unity3D has excellent forums and a growing wiki and the developers are very responsive to questions (and actually use the software themselves). In many respects it is the model of a bleeding edge 21st century software company, the kind of guys who ship a new version of their program while waiting for Dilbert's pointy-headed boss to shut up in a meeting.


Flies in the 2D Ointment



Setting up 2D interface objects. If you want to build a 2D interface in Unity, you'd better learn to love this little panel. The "Pixel Inset" values are misleadingly named; they should be called "left, top, right, bottom". Each 2D element is "pinned" to a screen x,y coordinate (each ranging from 0 to 1) by a handle (top/middle/bottom by left/center/right). So positioning an element involves setting the handle and then the "pixel inset" values as offsets from the handle. You certainly can't just pick up an object and drag it into position. And automating the process is difficult for other reasons.


All is not perfect in Unity land. From my perspective this comes down to five things.


First, Unity's JavaScript isn't really JavaScript - it's about as different from JavaScript as, say, ActionScript. Unlike JavaScript, it supports (optional) explicit types, lacks many dynamic features, but is compiled and runs insanely fast. The speed is wonderful, but the subtle differences from JavaScript and the minor but irksome inconsistencies of the underlying runtime environment can be infuriating (e.g. arrays have a .length property, but strings have a .Length property … or is it the other way around?) and aren't well-documented.


Second, Unity leverages a huge amount of open source functionality. E.g. it incorporates the Mono code-base for .NET functionality and the CLI (Common Language Infrastructure; hence C# and Boo; "JavaScript" is built on top of Boo). The functionality is welcome, but the lack of documentation and widely varying "standards" can be frustrating. Unity's documentation is pretty decent - as far as it goes - but you won't find third-party books on its flavor of JavaScript, or those portions of .NET it happens to support.


I should note that on the positive side, Unity's forums are incredibly lively, full of very helpful and knowledgeable people, and the developers are extremely responsive to questions, feedback, and bug reports.


Third, as much as the developers have lavished attention on its 3D capabilities, Unity3D's 2D capabilities are relatively awkward. You certainly can't just drag a 2D interface element into position, just forget you even thought of that. First you need to give it a pseudo coordinate, then you need to pin it to that coordinate, then you need to edit some pixel offsets. Building 2D interfaces (and most good 3D games have a significant 2D component) is, at best, fiddly. There's nothing - or very little - you can't do, but much of what you can do is, say, far more difficult than doing things in 3D is in Director.


Fourth, while the development environment is completely live (which is great), there's no actual debugger to let you step through code and watch variables. (Such is the curse of the new generation of programming languages.) The live environment lets you create fairly solid workarounds (and when a script crashes, the game keeps running; you just get an error message, and it will point you to the line of code that caused the problem), but sometimes you just want a debugger to step through code, and Unity doesn't have one.


Finally, Unity3D's undo is, at best, quirky. This would normally be a crippling deficiency, but oddly in Unity, it's merely annoying. In part this is because so much of your work will be done in a text editor or 3d tool (either of which probably have working undo). In part it's because its interface is so powerful in other ways (see the section on workflow, below). In any event, Unity certainly needs to improve its undo support.


This Revolution Can't Be Televised Yet


The major missing piece of functionality in Unity3D right now is video support. If you want to play a video on a TV in your virtual world, you'll need to have the Pro version of Unity3D, a sketchy third-party plugin, and a good measure of luck. I doubt it would get through any kind of QA process. This is sad, because if Unity3D had robust video functionality it would essentially eclipse Director in all major functional respects.


For multimedia developers, there is a potential workaround. Unity can run in a web browser and offers excellent browser/JavaScript integration, so you could combine Unity with Flash or QuickTime inside a browser, but you can't really allow the two things to overlap.


Of less importance to me is support for Director's many supported file formats, notably Flash (hardly surprising, but still), HTML, or humble text. (Why is it that developers of multimedia software always forget text?) Given all these shortcomings, you won't be able to seamlessly replace Director (or Flash) with Unity in your multimedia production just yet.


The Unity 3D Workflow


OTEE is justifiably proud of Unity3D's workflow. To develop 3D games/applications, aside from Unity itself, you'll need a compatible 3D modeling/animation tool (e.g. Maya, Cinema4D, or Cheetah3D), a suitable image editing tool (e.g. Photoshop), and probably an audio editor (e.g. Amadeus Pro). You may want to use your own text editor.


Unity uses its own version of Smultron (called "Unitron"), which is perfectly decent, although if you have your own favorite text editor, you'll probably prefer to stick with it. As mentioned earlier, Unity3D's workflow with the apps mentioned is seamless. When you double-click an asset in Unity, it opens in the source program. When you save changes, Unity updates automatically by doing all the export work without your having to.



This is the Unity3D development environment. 1: this is my game running. 2: this is the 3d world editor; my character is currently selected (but I can change the selection while the game runs - the entire interface is live and modeless). 3: this is my character's transform, which is visibly changing as I run around.


A word about what using Unity is like (in case you don't feel like downloading the rather hefty disk image to try it out for yourself). Unity runs its game engine live in the development environment (so you can see what you're doing live and interact with it live). Furthermore when you run your project, the development environment is live. E.g. if you instantiate an object at runtime, it appears in the development environment and its properties change in real time. You can pause the game, manipulate objects and properties, and continue. Live. I repeat: live.


Again, I should mention two other key features of Unity3D in practice:


If you write a configurable script (e.g. it might have several user-editable properties) all you need to do is declare them (e.g. var name : String = "Fred";) and the variable becomes user-editable in the main interface. If you create an enumeration and set a variable to that type, you get a popup menu in the editor interface. Powerful, elegant, and beautiful. (If you've written user-configurable behaviors in Director - shudder - you're probably drooling on your keyboard right now.)


If you want to reuse a particular configuration of objects, e.g. an object nested in another object with a few components and scripts, you can turn it into a "prefab".


You can then create instances of that prefab anywhere (e.g. by dragging it into a scene) or create instances of it at runtime with a single line of code. If you modify an instance, it remains linked to the prefab, and you can reset it to the prefab's values (entirely, or one property at a time) or make the prefab conform to the instance (with a single menu command). Again, the equivalent operations in Director range from difficult to near impossible, and if you've built reusable components in Flash, well, this is a lot easier.


Time to ship. You may wonder, after all this, how difficult it is to build an application with Unity. Well, you select a menu item. Now, congratulate yourself on a job well done and have a beer.


Unity not only publishes with a single click, it performs image compression "just in time". So you can keep your textures as layered Photoshop files knowing that you can tune the delivery format at the last moment by modifying Build Settings. It would be even cooler if Unity could do the same thing with level of detail in 3D models and audio compression, but I guess we can't have everything.


Performance


Obviously, all this functionality comes at a hefty price in performance. To run the Unity3D development environment you'll want the latest eight core Mac Pro with at least 16GB of RAM and dual 30″ monitors. This is only reasonable, since you probably want this kind of setup if you want to play with Aperture or Motion, right?


I'm joking.


Unity3D runs just fine on my 1.2 GHz iBook with 1.25GB of RAM (indeed, I wrote a little game on my iBook while we were snowed in in Denver over Christmas). I bought my license after trying it out on my (since supplanted) dual 1GHz G4 Power Mac. Not every high-end feature works on older hardware, although I've repeatedly been surprised at how many do, and GMA 950-equipped MacBooks and Mac Minis run Unity just fine. As projects become more complex, resource requirements increase, and you can never have too much memory. Certainly, if you can run the latest Maya or Cinema4D, Unity3D isn't going to trouble your hardware. It is not the resource hog you might reasonably expect it to be.


So, what's it for?


Is Unity3D a solution looking for a problem? When VideoWorks first came out it certainly seemed to be. (But then, it originally had no scripting language.) Unity3D was obviously conceived of as the ultimate game development tool, but it has obvious applications for developing education and training programs, interactive multimedia, VR applications (yes, they still exist), and presentations.


Its existence appears to be owed in large part to major initiatives in the EU toward funding the development of compelling educational software for use in schools. Clearly, for almost all of its potential users, its major missing feature is stronger 2D layout functionality. Unless you're developing a certain kind of game, you're going to need a fairly complex user interface at some point, and as of now building it in Unity3D is doable, but not pleasant.


Conclusions


Pros


Bleeding Edge 3D Graphics
Given its capabilities, gentle learning curve, good tutorials and documentation
Given its capabilities, modest hardware requirements
Amazingly Interactive IDE
Seamless workflow from Maya, Cinema4D, and Cheetah 3D; solid workflow from most high-end tools
Aside from undependable Undo, amazingly Mac-like
Lets you target Macintosh, Windows, and Web Browsers
Excellent Price ($250 for Indie License, $1500 for Pro)
Amazingly helpful and friendly people at OTEE, developer community, forums, and Wiki
Upcoming Wii support
Cons


Undependable Undo in IDE
Clumsy 2D Capabilities
No real video support
Gaps in the documentation send you to Google or the Forums (e.g. "JavaScript" and Mono)
Possibly a little "too visual" to work well for very large projects

It's not fair to review Unity3D as a Director replacement, since that's not what it's intended to be. Unity is intended to allow anyone, from a hobbyist to a small, serious development team, to produce their own Quake or (more likely) compelling 3D educational and training software or promotional tools. It is intended to let you deploy your finished product as a Mac or Windows program, or as a web plugin (or, soon, as a Wii game).


As such, with some of the generally minor issues discussed above set aside, it succeeds admirably. You will find building the 2D components a little painful, but nothing terrible. And you can always minimize your 2D interface components in favor of realtime 3D. But, life isn't fair. Whether it's intended as a Director replacement or not, Unity3D practically screams to be considered for this role. Unity is only missing video functionality and improved 2D functionality to be the next Director.


For now, if you're even mildly interested in developing 3D games, you should try Unity3D (it has a free trial).






A space odyssey: Man on the moon


Man on the moon, Malaysian in space... what else does the future hold?


WATCHING the Soyuz rocket launcher take off last week, carrying the first Malaysian into space, my mind went back 38 years, to when I viewed, on TV, the first manned landing on the Moon.


At least, that's what I hope it was, because my recollection of the event is somewhat blurred. Actually, it was ostensibly something of great significance that I watched, back in 1969 - I just can't recall what I made of it as a mere lad.


We did not have a TV set at home, and I vaguely remember my father talking about some major cosmic event. Sometime in the evening, the family walked over to a neighbour's house to watch this on TV.


There was something said about landing on the moon, and of course, the name "Apollo 11" was bandied around, and also, Neil Armstrong. What I saw on TV remains vaguely imprinted on my mind ... fuzzy black and white images perhaps, maybe even the voices of men communicating through a vast distance, although I can't ascertain to this day if these were merely figments of my imagination.


Later, I learnt I had watched the first manned lunar landing.


Almost four decades on, it's still a bit of a blur to me whether I actually saw the event on TV. I know and believe man landed on the moon in 1969, despite the conspiracy theories that have been floating around since the 1970s. The point is - did we watch it on TV the way I remember? Does anyone have a clear recollection of the events as shown on TV? I, for one, would certainly like to know.


Give me the details, everything! Anyway, looking ahead, when manned landings on the Moon resume, perhaps we'll see a Malaysian traipse across the lunar surface.


Think of the excellent spin-offs in the food and beverage industry - anyone for a teh tarik lunar or roti canai Marikh? In fact, I'm surprised we haven't seen items like mee goreng Soyuz or roti ISS being offered so far.


Any takers?


While we're on the subject of celestial objects... I recently received an irate missive from a friend who was pretty pissed off because, somewhere, someone had written that we would look to colonising the moon and other planets in future because we would run out of space on Earth.


He was angry that anyone would perpetrate such false hope, because, as far as he and the rest of mankind were concerned, we had just one habitable planet and that was it - screw it up and we're on the slide to oblivion.


He offered one thought to control the expanding population - have fewer children - and cited China as an example. Me, I'm an idealist, having grown up on a steady science-fiction diet of Arthur C. Clarke, Isaac Asimov, Ray Bradbury and Star Trek. I would like to think - well, hope - that somewhere in the future, in half a millennium or so, man would have reached across the chasms of space to call other worlds home.


And that these worlds would be better than we could ever imagine Earth to be now ... and that people like you and me could hop regularly across cosmic distances without undergoing months of rigorous specialised training for this.


Hopes and dreams, after all, are what take us forward and indeed, make today's farfetched vision a reality. Look at how far we've come in communication technology - just take the mobile phone and the Internet for example. Hmm, then again...


Dreaming may be the safer option.





Show Tests Roaches' Radiation Resistance


source : AP


Would cockroaches survive a nuclear holocaust that killed everything else? That question is being tested this week at the nearby Hanford nuclear reservation by a team from the "Mythbusters" show on the Discovery Channel, which expects to air the episode in about four months.


"It's been on the original list of myths since day one," said Kari Byron, who appears on the cable television series and was in town with Grant Imahara and Tory Belleci for the tests.


The crew is using an irradiator in the basement of Hanford's 318 Building just north of Richland. Pacific Northwest National Laboratory usually uses the device to calibrate dosimeters, which measure radiation exposure to humans and animals, and to check for radiation damage of video cameras, fiber optic cables and other equipment.


Lab operators agreed to the research for purposes of science education and workers donated their time, in some cases using part of their vacation allotments.


On Thursday afternoon, Byron and Imahara were cramming their uncooperative critters into a specially built roach condo to be exposed in the irradiator.


"I had to put myself in quite the mind-set to do it," Byron said.


A scientific supply company sent 200 cockroaches for the tests, "all laboratory-grade, farm fresh," Imahara said.


A control group of 50 will get no radiation, 50 others will be exposed to 1,000 rad, a lethal load of radiation for humans, 50 will be exposed to 10,000 rad and the last 50 to 100,000 rad.


The bugs will be watched over the next couple of weeks to see how soon they die.


"Contrary to popular belief, not a significant amount of research goes into cockroach radiation," Imahara said.


Flour beetles and fruit flies, also being irradiated for comparison, were a snap compared with the cockroaches, which did not take well to being corralled within a tiny block arrangement designed to make sure each bug gets the same dosage.


"They are very fast. They are very aggressive. They want to get away," Byron said. "They are opportunists."


The surviving bugs get a chauffeured ride back to San Francisco. A "Mythbusters" employee has been detailed to drive them because airlines won't let them in the passenger cabin and they can't be placed in the baggage hold without wrecking the experiment.


"We have to maintain reasonable temperature and humidity so they don't go into shock," Imahara said.





extra room for the sky



An extra room for the sky



In the quaint seaside community of Gloucester, Mass., on Cape Ann, one gray clapboard house stands out from the rest. It has a big white dome rising from the top, with a sliding shutter that opens to the sky and a powerful telescope inside.


"My wife got an ocean view and I got a view of the sky," said Dr. Mario Motta, 55, a cardiologist and astronomy enthusiast, of the house they built three years ago.


At a time when amateur astronomy is becoming increasingly popular - thanks in part to the availability of high-tech equipment like digital cameras that filter out light pollution - Motta and his wife, Joyce, are among a growing number of Americans incorporating observatories into new or existing homes. Manufacturers of observatory domes report increasing sales to homeowners, and new residential communities are being developed with observatories as options in house plans.


"As the baby boomers and wealthy tech types retire, they want challenging hobbies like astronomy, and have enough cash stashed away to afford to build their own observatories," said Richard Olson, president of the Ash Manufacturing Co. in Plainfield, Ill., which makes steel domes for observatories.


His customers used to be limited to academic and research institutions, but within the past five years, he said, homeowners have begun making requests, to the point where 25 percent of his sales are to people like Steve Cullen, a 41-year-old retired senior vice president of the Symantec Corp., who is building a home and observatory on 190 acres in Rodeo, N.M.


Cullen said he chose the location because it has "some of the darkest skies and clearest weather for space photography in the U.S." (Most sophisticated telescopes now allow for the addition of digital cameras.) He expects the total cost of his observatory, which is still under construction, to be close to $340,000, including a $225,000 telescope, but his is a high-end project.


Most home observatories have between $10,000 and $40,000 in equipment, including telescopes, computers, refractors, filters and tracking mechanisms, according to astronomy equipment retailers. The total budget for an observatory can range from $50,000 to more than $500,000, depending on how technologically advanced the equipment is and on the size and complexity of the structure.


Motta also photographs deep space from his home's observatory, posting his images of distant galaxies online and publishing them in astronomy magazines and journals.


His telescope, which he constructed himself, weighs well over a hundred pounds, and would be cumbersome to move outdoors if he didn't have an observatory. And like most sophisticated telescopes, it would also require at least an hour of careful recalibration if relocated.


"The reason why people don't use their telescopes is they are such a pain to haul out and set up," said John Spack, 50, a certified public accountant who had a domed observatory built on top of an addition to his house in Chicago last year. "Now, if I want to get up at 3 a.m. and look at something, I just open the shutter."


Like observatories at research facilities and museums, most home observatories now have computers that rotate the dome so the telescope is oriented toward precisely what the user wants to see. Once fixed on a point in space, the dome continues to slowly rotate to compensate for the earth's rotation, so whatever is in view doesn't move out of range.
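The arithmetic behind that compensation is straightforward: the sky appears to turn a full 360 degrees once per sidereal day (about 86,164 seconds), or roughly 15 arcseconds per second of time. A minimal Python sketch of the calculation (purely illustrative, not any vendor's dome-control software):

```python
# Back-of-the-envelope sidereal tracking rate (illustrative only).
SIDEREAL_DAY_S = 86164.0905  # seconds in one sidereal day

def tracking_offset_deg(elapsed_s: float) -> float:
    """Degrees the mount (and dome) must rotate to keep a fixed
    point in the sky centered after `elapsed_s` seconds."""
    return 360.0 * elapsed_s / SIDEREAL_DAY_S

# Rate in arcseconds of rotation per second of time: about 15.04
rate_arcsec_per_s = tracking_offset_deg(1.0) * 3600
```

Real control systems also convert this rotation into dome azimuth, which depends on where the telescope is pointing, but the underlying clock is the same.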


"It's all fully automated, real high-tech," said Spack, who estimated that he spent at least $100,000 to build and equip his observatory. Many home observatories also allow remote real-time views through the telescope from any computer with an Internet connection.


Roy and Elise Furman, who own a software company, view the cosmos through the telescope in their vacation house observatory in Portal, Ariz., both when they are there and when they are at home, in Philadelphia.


"Philadelphia skies are so light polluted, we got depressed trying to do astronomy," said Furman, 48. So the couple bought the Portal property, which is about 10 miles from Rodeo and part of a community called Arizona Sky Village, founded in 2003. Half of the 15 adobe-style homes there have matching domed observatories, and five more observatory homes are under construction. "We are a bunch of astronomy buffs looking through our telescopes out in the middle of nowhere," said Furman.


Other astronomy-themed residential developments include Deerlick Astronomy Village in Sharon, Ga., about 100 miles east of Atlanta, established in 2004, and Chiefland Astronomy Village in Chiefland, on Florida's west coast, which began in 1985 as a place for amateur astronomers to buy or rent land on which to camp. Within the last five years, several houses with observatories have been built there.


These communities encourage home observatories, but elsewhere, "people do run into problems with deed restrictions," said Jerry Smith, president of Technical Innovations, a manufacturer of observatory domes in Gaithersburg, Md. The company started in 1991 and primarily served universities and government agencies, but since 2002 individual consumers have accounted for 60 percent of the 1,400 domes it has sold.


To avoid overheating and warping the viewing equipment, Smith said, "it's better to have a white dome, because it's reflective, but we've had to do them in earth tones because that's the only way to get them approved by property owners' associations."


Domes in home observatories are typically made of metal or fiberglass and range in size from 8 to 30 feet in diameter. They are sold in kits from manufacturers like Ash or Technical Innovations and start at about $5,000, depending on the size, materials and features. The price includes a computer-controlled motorized system that opens the dome's sliding or hatch-like shutter and rotates the dome.


The telescope beneath the dome requires "a dedicated foundation so it's not subject to the vibrations transmitted by people walking around in the building," said Gregory La Vardera, an architect in Merchantville, N.J., who designed Cullen's observatory. This usually involves elevating the instrument on a discrete concrete pier. A telescope mount is bolted to the pier and the mount is motorized so it rotates the telescope in sync with the dome.


Observatories cannot be air-conditioned because any difference between the inside and outside air would distort the telescope's optics, La Vardera said. For comfort, most home observatories have a separate insulated and air-conditioned control room that houses all the computer equipment. These rooms often look like studies, with lots of space photography hanging on the walls.


"I have a lot of astronomy books on the bookshelves so I can feel knowledgeable," said Dr. M. Eric Gershwin, the chairman of clinical immunology at the University of California, Davis, about the control room in his home's observatory in Davis. An avid amateur astronomer, Gershwin, 61, had the observatory built 10 years ago and has been tweaking the instrumentation and control systems ever since. "You're never done," he said. "Right now I'm updating the computers."


Helping people with the installation and computerization of observatories has become a sideline for Kris Koenig, 45, a video producer from Chico, Calif., who specializes in astronomy-themed productions.


"It started a couple of years ago, when I helped set up the digital equipment in some public and private observatories locally," Koenig said, adding that he is now getting at least half a dozen calls for assistance a month just through word of mouth. He charges $500 to $1,000 an hour depending on the difficulty of the job, plus travel expenses. His most recent project involved linking a California home observatory's telescope to an entertainment center, so the images could be broadcast on a big-screen television.


The work is gratifying, he said. "It's great that so many people want to bring the universe home."






$30M Jacobs gift to support graduate fellowships


Focus is on electrical engineering and computer science


MIT announced Oct. 19 a $30 million gift from Joan and Irwin Jacobs (S.M. 1957, Ph.D. 1959) to support graduate fellowships for students in the School of Engineering. Irwin Jacobs, founder and chairman of San Diego-based Qualcomm Inc., a worldwide leader in digital wireless communication, received his master's and doctoral degrees from MIT in electrical engineering and computer science.


The Jacobs' gift creates the Irwin Mark Jacobs and Joan Klein Jacobs Presidential Fellowships. The $30 million gift will support at least 15 Jacobs Presidential Fellows annually in the Department of Electrical Engineering and Computer Science, with the first Fellows to be named in fall 2008.


"We ourselves were the beneficiaries of undergraduate scholarships, and then when I applied for graduate school, I also received a fellowship. It was very important for me to have fellowship support, and if we benefited, we think there are many others who can benefit as well," Irwin Jacobs said.


This latest gift from the Jacobses to MIT is part of the couple's long history of philanthropy, which also includes the creation of an endowed chair at MIT, the Joan and Irwin Jacobs Professorship in the Laboratory for Information and Decision Systems. The Jacobses are also major supporters of the San Diego Symphony and the University of California, San Diego.


"I am enormously grateful to Joan and Irwin Jacobs for their extraordinary generosity and support. MIT attracts some of the very best graduate students in the world, and Joan and Irwin Jacobs' magnificent gift ensures that those students will have the opportunity to pursue the cutting-edge education and research for which the Institute is known," said MIT President Susan Hockfield.


"As someone who has been an MIT student, MIT professor, inventor, entrepreneur, and co-founder of a successful technology company, Irwin Jacobs stands as an extraordinary role model for engineering and for MIT," said Dean of Engineering Subra Suresh. "I add my gratitude for the Jacobs' exceptional gift to support graduate fellowships in electrical engineering and computer science, which will help prepare outstanding students to follow in his footsteps."


The MIT Presidential Fellowship Program recruits the most outstanding students worldwide to pursue graduate studies at the Institute. Presidential Fellowships fund the tuition and living stipend of awardees for their first academic year at MIT. The Fellows are selected by the president and provost from a pool of candidates nominated by the deans and heads of departments and interdisciplinary programs. The program was started in 1999.


The Jacobs' gift is the largest to date to MIT's Campaign for Students, a fundraising campaign that began in fall 2006 to support undergraduate and graduate education and student life. The campaign will fund undergraduate scholarships, graduate fellowships, undergraduate education initiatives, and programmatic and capital investments in student life. The campaign's formal launch will be in October 2008 and will conclude in 2011 to coincide with the celebration of the 150th anniversary of MIT's founding.





Low carbon innovators : POWER FOR NEXT GENERATION




Financial and business support for low carbon entrepreneurs from the Carbon Trust


Progress is being made in the race to combat global warming: Old-style domestic light bulbs are being phased out by 2010, every new building must meet stringent energy efficiency guidelines, and businesses and industry are committed to cutting power consumption and switching to renewable energy.


None of these vital steps to curb carbon emissions can happen without an influx of ground-breaking low carbon technologies and products - the enablers of change.


"There are many brilliant inventors out there, but they are often struggling to transform their big ideas into commercial reality," says Rachael Nutter, Business Incubator Manager at The Carbon Trust. "We are here to help ideas develop into profitable, sustainable businesses."


The Carbon Trust is perhaps better known for its carbon-cutting advice and services to the corporate world than for its financial and commercialisation support in the field of early stage technological innovation.



"Within the organisation there is a vast amount of work being done to bring technologies to market that will generate clean electricity, introduce more energy-efficient industrial processes and make practical use of renewables," says Nutter.


As well as applying for research grants through the Applied Research scheme, fledgling companies and research institutions with innovative solutions can gain valuable consultancy help if they are accepted onto the Carbon Trust's Low Carbon Incubator programme.


Set up in 2004 with four incubator consultancy partners - Angle Technology, Imperial Innovations, Isis Innovation, and TTP - the Incubator offers business development support, market research and practical preparation for fundraising. The Carbon Trust and partners are currently working with 18 companies, and a further 34 companies have already been supported, many securing private sector funding as they move to market. To be accepted onto the programme companies have to demonstrate the emissions reduction their product would make possible and its commercial attractiveness.


"A small company may want to develop a low-cost Solar Photovoltaic solution [using daylight to power electrical equipment] for use in buildings," explains Nutter. "But typically the team will be scientists, not company directors. There will be no business plan and minimal understanding of market demand or how to exploit it."


She says that in the business arena, simply offering a 'green' alternative to the existing technology isn't enough. "Investors and end-users will want to know the commercial benefits. Will it introduce real efficiencies, and crucially, save money?"


Nutter adds: "Often in a specific market there is a problem that needs solving, but you must understand exactly what is required. There's no point starting to manufacture a 2kW product when the requirement is for 1kW. Also markets are changing so rapidly in line with incoming regulations that you can capitalise on new needs, but market knowledge is essential, and this is where our consulting partners really help."


Lontra is a London-based start-up that has benefited from both a Carbon Trust Applied Research grant and Incubator support. Its Blade Compressor can save up to 35 per cent of the energy normally required by industrial compressors.


"Working with the Carbon Trust and Imperial Innovations was invaluable," says Simon Hombersley, business development director at Lontra. "Imperial Innovations has an excellent reputation for clean energy commercialisation, and were able to introduce us to highly skilled technical people. For me the mentoring support was vital too. Being a scientific entrepreneur can be a relatively lonely role, but the contact and networking opportunities that open up really help you and your product develop. We've gone from research project to viable company, and now have the necessary funding to take the Blade Compressor to market."


Revenue modelling, honing a clear view of how to exploit their technology and building credibility in the market all came about through The Carbon Trust's support, says Hombersley.


Nutter says that many of the first wave of incubated innovations have profitability in their sights. "And because The Carbon Trust is constantly engaging with the market our network of contacts is growing significantly," she says. "Leading companies - Philips for example - have a genuine appetite for the next generation of low carbon technologies and want access to the cutting edge products that will save energy into the future. Leveraging these links will speed the passage of ground-breaking technology towards everyday use."





Viacom mounts renewed attack on Google


YouTube has recently taken the initiative to develop a copyright-protection system to combat piracy.


The content provider says the search giant is not doing enough to prevent clips being illegally shown on YouTube.


Viacom ramped up its offensive against Google yesterday, saying that it would not back down from its $1 billion lawsuit against the internet search engine.


Philippe Dauman, Viacom's chief executive, said that Google had not done enough to prevent content from being illegally uploaded to YouTube, and gave no impression that a settlement was near to being reached.


Viacom, the entertainment company which owns MTV and Nickelodeon, claims that Google allowed more than 160,000 clips of its programming to be uploaded to YouTube, the video-sharing website it owns.


Google denies that it is infringing Viacom's copyright, and claims that it removes unauthorised videos from YouTube when asked to by content owners.


Speaking at an internet conference in San Francisco, Mr Dauman said that he had "an open mind" about reaching an agreement with Google, which he described as a "responsible company", but that a settlement "wasn't quite there yet".


Referring to Google's proposed solution to the problem, a filtering system which allows new content being uploaded to be checked against a database of copyright material, he said: "They have a lot of tools, but they're not perfect. What no-one wants is a proprietary system that benefits one company to the exclusion of others."
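The filtering approach Dauman describes amounts to matching each upload against a registry of known works. The hypothetical Python sketch below uses exact hashes only to illustrate the workflow; real systems rely on perceptual fingerprints that survive re-encoding and cropping, which is part of why, as Dauman notes, the tools are not perfect:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual fingerprint; a real filter would hash
    # audio/video features, not raw bytes.
    return hashlib.sha256(data).hexdigest()

# Registry of fingerprints supplied by copyright owners (hypothetical data).
copyright_db = {fingerprint(b"episode-101-master")}

def allow_upload(data: bytes) -> bool:
    """Reject an upload whose fingerprint matches a registered work."""
    return fingerprint(data) not in copyright_db

print(allow_upload(b"home-movie"))          # True: not in the registry
print(allow_upload(b"episode-101-master"))  # False: match, upload blocked
```

An industry-standard system of the kind Dauman calls for would mean one shared registry and matching scheme rather than a separate proprietary filter per site.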


Mr Dauman said that what he would prefer would be an industry standard system, adding that it was "beyond the capacity of a company like ours, let alone smaller ones", to cope with a range of filtering technologies.


Earlier in the day Viacom and a range of other content producers, including Disney, CBS, Fox and NBC, as well as internet companies such as Microsoft and MySpace, announced that they would collaborate on a technology which would prevent users from uploading unauthorised material.


Google was not a party to the agreement, although analysts said it was not feasible for it to continue using its own technology while the rest of the internet and content industries worked to a common standard.


There was "a developing consensus among content creators and distributors" that whilst it was important content be widely available via the internet, there needed to be "rules of the road", Mr Dauman told an assembled audience at the Web 2.0 Summit in San Francisco.


The complaint of media companies such as Viacom is that they should not have to bear the onus - and cost - of policing sites such as YouTube, onto which vast amounts of content are uploaded each day, for unauthorised content.


Google argues that it complies with the terms of the Digital Millennium Copyright Act (DMCA), and takes down copyright-infringing material when requested to do so by the copyright owner.




Viacom's Bet on Web Diversity



Viacom CEO Philippe Dauman, whose company sued Google last year for $1 billion for alleged copyright violations on Google's YouTube video sharing site, journeyed into the belly of the beast a few minutes ago. He was, not surprisingly, unapologetic about the suit, which was not popular among the Web digerati. But in the process of defending his position, he did make it clear that Viacom is betting big on the notion that people online will travel to hundreds of individual Web sites for the content they want to view. That was underscored by today's announcement that Viacom would make clips of segments from The Daily Show With Jon Stewart available online for free. "We believe in fragmentation going forward on the Internet."

Of course, no one person wants to see all of Viacom's offerings, but I wonder if people really will click directly to all that many individual sites. The rise of YouTube may well depend on the presence of unauthorized videos, but there's a reason people flock there: They can find what they're looking for without having to click all over the Net. As Cisco senior VP Dan Scheinman said just a few minutes before, "The challenge of our era is, how do we find anything?"

Search helps, but it's clearly not the whole answer anytime soon. And I think people, online or off, want to gather where there are a whole bunch of other people.

Can Viacom fight that reality? Maybe, if it can get enough critical mass of fans for each of those sites. And it's hard to argue with $500 million in online revenue. But I can't imagine that will ever be completely sufficient. Still seems like there's more benefit in using YouTube--whose videos are hardly HDTV-quality--as a way to drive traffic to Viacom than in suing it and preventing users from finding what they want.



