
Thursday, August 16, 2007

MIT faculty, alumni among Technology Review's top young innovators


Several MIT faculty and alumni have been named to the TR35, Technology Review Magazine's annual compilation of the 35 top innovators worldwide under the age of 35.


Selected from more than 300 nominees by a panel of expert judges and the editorial staff of Technology Review, the TR35 is an elite group of accomplished young innovators who exemplify the spirit of innovation in business, technology and the arts.


"The quality and breadth of accomplishment of the 2007 TR35 winners is truly amazing," said Jason Pontin, editor in chief of Technology Review. "We honor them for their achievements today and into the future."


The 35 innovators will be profiled in the September-October issue of the magazine and will be honored at the 2007 Emerging Technologies Conference to be held Sept. 25-27 at MIT.


The MIT faculty on the list, which was announced today, are Ali Khademhosseini, Kristala Jones Prather and Mehmet Fatih Yanik.


Khademhosseini is an assistant professor in medicine and health sciences and technology at the Harvard-MIT Division of Health Sciences and Technology and the Harvard Medical School; Prather is the Joseph R. Mares (1924) Career Development Assistant Professor in the Department of Chemical Engineering; and Yanik is an assistant professor in the Department of Electrical Engineering and Computer Science.


Khademhosseini, 31, was selected for his work on improving engineered tissues using an approach he has likened to building with "living Legos."


"By giving cells the same interconnections they have in the body, Khademhosseini hopes to create tissues that can be used to test new drugs and, eventually, to rebuild organs," according to Technology Review.


The magazine chose Prather, 34, for her role in trying to make compounds using biological processes rather than chemical reactions--a technique that could avoid harsh solvents and toxic byproducts.


"What I'm interested in is designing organisms to be chemical factories," Prather told Technology Review.


Yanik, 29, was selected for inventing a way to stop light pulses on a chip and release them at will. The magazine said such a technology could allow engineers to route and store optical data in telecommunications networks and on microchips without having to convert it to electricity.


MIT alumni among the TR35 include David Berry, Khademhosseini, Ju Li, Christopher Loose, Anna Lysyanskaya, Prather, Neil Renninger and Yanik. The list also includes Javier Garcia-Martinez, a recent MIT postdoctoral fellow.





New Weapons Identified in Battle against Hospital Superbugs


Three drugs that are effective against antibiotic-resistant superbugs, such as MRSA, should be available for use within 2-3 years, according to the British company that discovered them.


They are the first antibiotics with a novel mode of action to be discovered in more than 30 years. E-Therapeutics, a spin-out company from Newcastle University, used techniques developed under an EPSRC-funded project, part of the UK e-Science Programme, to make the discoveries. The drugs are to enter clinical trials next month.


This story has received widespread news coverage, including the front page of the Daily Express (17 January) and BBC Radio 2.



EPSRC Support Helps Initiate World's First Plastic Electronics Factory


UK firm Plastic Logic, a spin-out company from the University of Cambridge, has announced that it will build the world's first factory to manufacture plastic electronics on a commercial scale. Plastic Logic announced that it has raised $100 million (£50.6 million) in order to establish this facility at Dresden, Germany.


The company was co-founded in 2000 by Sir Richard Friend and Henning Sirringhaus of the University of Cambridge's Cavendish Laboratory.


"This development comes on the back of a long-term programme of basic science supported by the Engineering and Physical Sciences Research Council," said Sir Richard Friend. "It is this support that has enabled us to stay at the top internationally."


This story was covered widely, including on BBC Radio 4's Today Programme, BBC News Online and in the Financial Times.







Plastic Logic's flexible displays
An example of "take anywhere, read anywhere" Plastic Logic flexible displays using E Ink® Imaging Film.


Sir Richard Friend





Green Power Stations are no Pipe Dream


Sky News, Science Daily, Tyne Tees TV and BBC Radio Newcastle are just some of the media outlets that are highlighting EPSRC funded research at Newcastle University that could cut greenhouse gas emissions from power stations.

The work involves controlling the combustion process with tiny tubes made from an advanced ceramic material. The material, known as LSCF, has the remarkable property of being able to filter oxygen out of the air. By burning fuel in pure oxygen it is possible to produce a stream of almost pure carbon dioxide, which has commercial potential for reprocessing into useful chemicals.


Details of the research and development project are also published today (3 August 2007) simultaneously in two technical publications - Materials World and The Chemical Engineer. A series of research papers has also been published in academic journals as the project has developed.


The tubes of LSCF, which stands for Lanthanum-Strontium-Cobalt-Ferric Oxide, have been tested successfully in the laboratory and the design is attracting interest from the energy industry. The Newcastle team is now carrying out further tests on the durability of the tubes to confirm their initial findings that they could withstand the conditions inside a power station combustion chamber for a reasonable length of time.



Comcast offers Internet, phone in Houston




To further solidify its cable entrance into the Houston market, Comcast has launched its high-speed Internet and digital voice products in the region. The Philadelphia-based cable, entertainment and communications company (NASDAQ: CMCSA) took over the 1.7 million-subscriber Houston market from Time Warner in June. Like Time Warner and area competitor San Antonio-based AT&T (NYSE: T), Comcast will let customers bundle their cable, Internet and phone services for $33 each in a package called the Triple Play. The service offers one bill, one point of contact for customer service and one installation visit to the home.





Intel Prepares X38 Express Launch



Expect Intel's high-end chipset to show up next month


Intel officially set its performance embargo on its upcoming X38 Express chipset for September 23. Motherboards based on the X38 Express chipset should show up in retail in early September, according to motherboard vendors. The September 23 non-disclosure lift date only applies to reviews and performance numbers for the X38 Express chipset. The situation will be similar to the P35 Express chipset launch, where motherboards were available before its Computex 2007 launch announcement and NDA lift date.


The new chipset is a member of the Bearlake family, which saw its initial debut with the G33 and P35 Express variants last June. Intel's X38 Express succeeds the 975X Express that made its debut with Intel's Pentium D Presler processors. Although the Intel 975X Express launched in late 2005, the chipset shared basics with Intel's 945 and 955X Express chipset families. Intel decided not to refresh the 975X Express with a Broadwater variant and held out for Bearlake.


Intel's X38 Express introduces PCIe 2.0 support to the LGA775 platform. PCIe 2.0 offers greater bandwidth than the existing PCIe standard: the per-lane signaling rate doubles to 5 gigatransfers per second (GT/s), which works out to roughly 4 Gbit/s of usable bandwidth per lane once the 20% encoding overhead is accounted for. The chipset also supports dual full-speed PCIe x16 slots for ATI CrossFire multi-GPU technology. Intel guidance does not show any indication of support for NVIDIA's SLI Technology.
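

For readers who want to sanity-check those figures, here is a minimal sketch of the arithmetic, assuming PCIe 2.0's 5 GT/s signaling rate and its 20% (8b/10b) encoding overhead; the helper function and its name are illustrative, not taken from Intel documentation.

```python
# Back-of-the-envelope PCIe bandwidth arithmetic. Assumes PCIe 2.0 signals at
# 5 GT/s per lane with 8b/10b encoding (20% overhead); values are illustrative.

def pcie_usable_gbps(transfer_rate_gts: float, encoding_overhead: float = 0.20,
                     lanes: int = 1) -> float:
    """Usable bandwidth in Gbit/s for a PCIe link of the given width."""
    return transfer_rate_gts * (1.0 - encoding_overhead) * lanes

print(pcie_usable_gbps(5.0))            # one PCIe 2.0 lane: 4.0 Gbit/s (~500 MB/s)
print(pcie_usable_gbps(5.0, lanes=16))  # a full x16 graphics slot: 64.0 Gbit/s (~8 GB/s)
```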


Officially, the Intel X38 Express chipset only supports DDR3 memory. However, motherboard vendors disagree and intend to release X38 Express based motherboards with DDR2 memory support. Motherboard manufacturers such as DFI, Foxconn, Gigabyte, MSI and others had DDR2-compatible X38 Express motherboards on display at Computex 2007. The DDR2-compatible solutions were either combination DDR3/DDR2 boards or dedicated DDR2-only motherboards.


Expect motherboards based on the Intel X38 Express to pop up in retail next month. DailyTech estimates the cost of entry around $200 for a no-frills board and around $300 for boards that include a kitchen sink in the package.



News inside news:


Intel i860


The Intel i860 (also 80860) was a RISC microprocessor from Intel, first released in 1989. The i860 was (along with the i960) one of Intel's first attempts at an entirely new, high-end ISA since the failed iAPX 432 from the early 1980s. It was released with considerable fanfare, and it overshadowed the release of the Intel i960, which many considered to be a better design. The i860 never achieved commercial success and the project was terminated in the mid-1990s.




Technical features


The i860 combined a number of features that were unique at the time, most notably its VLIW (Very Long Instruction Word) architecture and powerful support for high-speed floating-point operations. The design mounted a 32-bit ALU along with a 64-bit FPU that was itself built in three parts: an adder, a multiplier, and a graphics processor. The system had separate pipelines for the ALU, floating-point adder and multiplier, and could issue up to three operations per clock: one integer instruction plus one floating-point instruction, where the floating-point multiply-and-accumulate instruction counts as two operations.


All of the buses were 64 bits wide, or wider. The internal memory bus to the cache, for instance, was 128 bits wide. Both units had thirty-two 32-bit registers, but the FPU used its set as sixteen 64-bit registers. Instructions for the ALU were fetched two at a time to use the full external bus. Intel always referred to the design as the "i860 64-Bit Microprocessor".


The graphics unit was unique for the era. It was essentially a 64-bit integer unit using the FPU registers. It supported a number of commands for SIMD-like instructions in addition to basic 64-bit integer math. Experience with the i860 influenced the MMX functionality later added to Intel's Pentium processors.


One unusual feature of the i860 was that the pipelines into the functional units were program-accessible, requiring the compilers to carefully order instructions in the object code to keep the pipelines filled. In traditional architectures these duties were handled at runtime by a scheduler on the CPU itself, but the complexity of these systems limited their application in early RISC designs. The i860 was an attempt to avoid this entirely by moving this duty off-chip into the compiler. This allowed the i860 to devote more room to functional units, improving performance. As a result of its architecture, the i860 could run certain graphics and floating point algorithms with exceptionally high speed, but its performance in general-purpose applications suffered and it was difficult to program efficiently (see below).
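

To see why exposed pipelines push so much work onto the compiler, here is a toy model (plain Python, not i860 code; the three-cycle load latency is an assumption for illustration): the same four instructions finish sooner when the compiler hoists the second load so that neither add has to wait on an operand.

```python
# Toy model of compile-time instruction scheduling on a machine with exposed
# pipelines: an instruction that needs a result stalls until it is ready.
# Latencies are illustrative assumptions, not i860 figures.

LATENCY = {"load": 3, "add": 1}  # cycles until a result becomes available

def issue_cycles(schedule):
    """Issue (op, dest, sources) tuples in order; return total cycles,
    stalling whenever a source register is not yet ready."""
    ready = {}   # register -> cycle its value becomes available
    cycle = 0
    for op, dest, srcs in schedule:
        start = max([cycle] + [ready.get(r, 0) for r in srcs])  # wait for operands
        ready[dest] = start + LATENCY[op]
        cycle = start + 1                                        # one issue per cycle
    return max([cycle] + list(ready.values()))

naive = [          # each add immediately uses the value just loaded
    ("load", "r1", []), ("add", "r2", ["r1"]),
    ("load", "r3", []), ("add", "r4", ["r3"]),
]
scheduled = [      # both loads hoisted first, hiding their latency
    ("load", "r1", []), ("load", "r3", []),
    ("add", "r2", ["r1"]), ("add", "r4", ["r3"]),
]
print(issue_cycles(naive), issue_cycles(scheduled))  # e.g. 8 vs. 5 cycles
```

Dynamically scheduled CPUs do this kind of reordering in hardware at run time; the i860 left it entirely to the compiler, which is exactly where the performance problems described below came from.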



Performance (problems)


Paper performance was impressive for a single-chip solution; however, real-world performance was anything but. One problem, perhaps unrecognized at the time, was that runtime code paths are difficult to predict, meaning that it becomes exceedingly difficult to properly order instructions at compile time. For instance, an instruction to add two numbers will take considerably longer if the data is not in the cache, yet there is no way for the programmer or compiler to know whether it will be. Guess wrong and the entire pipeline stalls, waiting for the data. The entire i860 design was based on the compiler efficiently handling this task, which proved almost impossible in practice. While the XP versions were theoretically capable of peaking at about 60 MFLOPS, hand-coded assembly managed only about 40 MFLOPS, and most compilers had difficulty reaching even 10.


Another serious problem was the lack of any solution to quickly handle context switching. The i860 had several pipelines (for the ALU and FPU parts), and an interrupt could spill them and require them all to be re-loaded. This took 62 cycles in the best case, and almost 2,000 cycles in the worst. The latter is about 1/20,000th of a second at 40 MHz, an eternity for a CPU. This largely eliminated the i860 as a general-purpose CPU.






Versions, Applications


The chip was released in two versions, the basic XR (code-named N10) and the XP (code-named N11). The XP added larger on-chip caches, a second-level cache, faster buses, and hardware support for bus snooping, for cache consistency in parallel computing systems. The XR ran at 25 or 40 MHz, and a process shrink for the XP (from 1 micrometre to 0.8) bumped the XP to 40 and 50 MHz. Both ran the same instruction set.


At first the i860 was only used in a small number of very large machines like the iPSC/860 at Los Alamos National Laboratory. As the compilers improved, the general performance of the i860 did likewise, but by then most other RISC designs had already passed the i860 in performance.


Intel for a time tested the viability of the i860 as a workstation CPU, competing with MIPS Architecture chips and others. Microsoft initially developed what was to become Windows NT on internally designed i860-based workstations (code-named Dazzle), only porting NT to the MIPS (Microsoft Jazz), Intel 386 and other processors later. It is often rumoured that the 'N' and 'T' in Windows NT originally stood for "N-Ten", after the working name for the i860 core.


The i860 did see some use in the workstation world as a graphics accelerator. It was used, for instance, in the NeXTdimension, where it ran a cut-down version of the Mach kernel running a complete PostScript stack. In this role the i860 design worked considerably better, as the core program could be loaded into the cache and made entirely "predictable", allowing the compilers to get the ordering right. Another example was SGI Onyx Reality Engine 2, which used a number of i860XP processors in its geometry engine. This sort of use slowly disappeared as well, as more general-purpose CPUs started to match the i860's performance, and as Intel turned its focus to Pentium processors for general-purpose computing.


In the late 1990s Intel replaced their entire RISC line with ARM-based designs, known as the XScale. Confusingly, the 860 number has since been re-used for a motherboard control chipset for Intel Xeon (high-end Pentium) systems.





DARPA completes successful autonomous in-flight refueling without pilot intervention



The Defense Advanced Research Projects Agency (DARPA) is always on the forefront of technology. When DARPA isn't collaborating with research centers to develop technology for the consumer sector, it's looking at ways to advance military technology. DARPA's latest achievement is no exception.


DARPA and NASA successfully completed a demonstration of an autonomous system for in-flight refueling -- a procedure that has become all too familiar for military pilots on extended missions.


The Autonomous Airborne Refueling Demonstration (AARD) used a combination of GPS, video mapping and fly-by-wire antics to plug an F/A-18's refueling probe into the drogue basket trailed by a 707-300 tanker. DARPA and NASA completed 18 successful tests using a variety of control methods and under a varying range of flight conditions.


"The system further demonstrated the ability to join the tanker from up to two nautical miles behind, 1,000 feet below, and 30 degrees off heading, thus providing a ready transition from the waypoint control approach used by most unmanned aircraft to a fully autonomous refueling mode" DARPA noted in a press release. "In recent flights, automatic sequencing reflected improved confidence in the system, compared to last year's flight where pilot consent was required at specified points in the refueling maneuver."


"Skilled pilots can actually save some tricky, last second movement the basket has a habit of making, but in so doing they set themselves up for a basket strike, ripping off the basket from the hose, or sometimes breaking the probe or parts of the airplane," said NASA test pilot Dick Ewers.


The AARD further removes human pilots from the equation when it comes to combat situations. Unmanned aircraft are continually being developed by the military to take human pilots out of harm's way. An unmanned aircraft that has the ability to refuel on its own in combat situations would make for an excellent weapon in the military's vast arsenal.


News inside news:


Monocular, vision based, autonomous refueling system


This paper describes the design and implementation of a vision-based platform for automated refueling tasks. The platform is in principle an autonomous docking system, with refueling of vehicles as the specific application. The system is based on monochromatic, monocular vision and uses highly specialized image-processing schemes: very fast filtering and segmentation algorithms, followed by computation of image moments. A six-joint robotic arm (FANUC M-6i) and a controller unit (R-B) do the physical work, and a serial interface with very high-level commands connects a supercomputing machine to the robot's controller. A practical setup would probably be scaled down to a purpose-built robot and a single-processor controller with special VLSI chips for image processing. Results are very promising: the robot can identify the cap position, orientation, and height in real time with acceptable accuracy and reliability.
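

As a rough illustration of the moment-based step the abstract mentions (segment the image, then compute moments to recover the cap's position and orientation), here is a small NumPy sketch; it is an assumption-laden stand-in, not the paper's code, and it omits the height estimate.

```python
# Minimal sketch: recover a segmented cap's centroid and orientation from image
# moments. Pure NumPy; the synthetic blob below stands in for a real segmentation.
import numpy as np

def cap_pose_from_mask(mask):
    """mask: 2-D array of 0/1 pixels after thresholding/segmentation.
    Returns (cx, cy, theta): centroid in pixels, major-axis angle in radians."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("empty mask: segmentation found no cap pixels")
    cx, cy = xs.mean(), ys.mean()                    # first moments / area
    mu20 = np.mean((xs - cx) ** 2)                   # central second moments
    mu02 = np.mean((ys - cy) ** 2)
    mu11 = np.mean((xs - cx) * (ys - cy))
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # orientation of major axis
    return cx, cy, theta

# Synthetic test: an elongated blob tilted 45 degrees, centered at (120, 80).
yy, xx = np.mgrid[0:200, 0:200]
blob = (((xx - 120) + (yy - 80)) ** 2 / 900.0 +
        ((xx - 120) - (yy - 80)) ** 2 / 100.0) < 1.0
print(cap_pose_from_mask(blob))   # ~ (120.0, 80.0, 0.785)
```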







Apple, Google satisfaction ratings slip


American consumers still like Apple and Google, but not as much as they did last year, a new report indicates.


Both companies rank near the top of their respective industries in the latest update of the University of Michigan's American Customer Satisfaction Index, which will be released today. But both companies' rating in the index slipped over the past year.


That decline could have long-term repercussions for both companies, said Claes Fornell, a business professor at the University of Michigan who heads up the index. Companies that see their customer satisfaction rating change often see their profits change in the same direction, Fornell said.


On the flip side from Google, Yahoo has seen its rating in the index increase over the past year, so its score now is nominally higher than Google's.


"Whenever users are becoming more satisfied with your service, that's going to help," Fornell said.


Apple's rating fell from an 83 last year to a 79, while Google dropped from 81 to 78. Yahoo's score increased from 76 to 79.


All three were graded on a 100-point scale using responses from a national survey. Any change of two or more points is considered statistically significant, as is a difference of two or more points between any two companies' scores.


Apple's score dropped because of customers complaining about the reliability of its products, Fornell said. The company's sales and computer shipments have grown rapidly in recent years, outpacing the broader PC industry.


"We know it's just difficult to maintain quality at all levels if you do that," Fornell said.


While Apple is happy to remain the top-ranked PC maker in the index, the company "is going to try even harder" after its rating slipped, said company spokeswoman Natalie Kerris.


"Customer satisfaction is very important to Apple," she said.


Despite the decline, Apple's score was still three points higher than that of Hewlett-Packard, the second-highest-ranked computer maker.


Google's decline was a result of what was going on around it, Fornell said. While Yahoo and Ask.com have revamped the look and feel of their homepages, Google's has stayed largely static.


"Google's is a bit stale compared with Ask.com," said Fornell, noting that Ask's rating increased four points from last year to a 75.


The problem for Google is that it has to be cautious about updating its site to match those of competitors, he said.


"They run the risk of alienating people that really like it," he said.


"We are continually working to provide the best online experience for our users and welcome strong competition that helps drive market innovation," a Google representative said in a statement.


A Yahoo spokesperson said, "Yahoo is pleased with the results of this year's ACSI study, which reflect our continued efforts to enhance the consumer experience for our more than 500 million users."







General Motors will soon put fuel-cell SUVs in the hands of 100 lucky consumers. You could be one of them



Free Hydrogen-Powered Cars for Good Homes
General Motors will soon put fuel-cell SUVs in the hands of 100 lucky consumers. You could be one of them


Courtesy GM Corp.

EARLY RETIREMENT The hydrogen fuel cell in the Chevy Equinox is engineered for a life of about 49,700 miles.

You still can't walk into a dealership and buy a hydrogen-powered car, but if you live in Orange County, California, Westchester County, New York, or Washington, D.C., where it's easiest to find a hydrogen filling station, you may be able to borrow one for a couple years. Starting this month, in what will be the largest real-world test of fuel-cell passenger vehicles, GM's Project Driveway program is seeking good homes for 100 fuel-cell versions of its Equinox SUV. (Honda, by contrast, has leased just two of its FCX fuel-cell cars to customers.)


As with other fuel-cell vehicles, the Chevy Equinox Fuel Cell combines hydrogen and oxygen to generate electricity, with water vapor as the sole by-product - no smog-forming emissions, no greenhouse gases. Each Equinox stores nine pounds of hydrogen at 10,000 pounds per square inch; 2.2 pounds contain roughly the energy equivalent of a gallon of gasoline. At 50 mpg, the Equinox will travel about 200 miles on a tank. And if you're worried about H-bomb fender-benders, breathe easy: The car is designed to meet all highway safety standards.
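

For readers who want to check the range claim, here is a minimal sketch using only the figures quoted above (9 lb of hydrogen, 2.2 lb per gallon-of-gasoline equivalent, 50 mpg-equivalent).

```python
# Rough check of the article's range figure, using only numbers quoted above.
tank_lb = 9.0               # hydrogen stored on board
lb_per_gallon_equiv = 2.2   # hydrogen holding roughly one gallon of gasoline's energy
mpg_equivalent = 50.0       # quoted fuel economy

gallons_equiv = tank_lb / lb_per_gallon_equiv    # ~4.1 gasoline-gallon equivalents
print(round(gallons_equiv * mpg_equivalent))     # ~205 -> "about 200 miles on a tank"
```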


GM has yet to decide who will test the futuristic machines, which cost up to $1 million to build, but says that drivers will pay next to nothing, in exchange for their feedback. Interested? Check out chevrolet.com/fuelcell.





Xandros expands Microsoft partnership


San Francisco (IDGNS) - Linux distributor Xandros is licensing messaging protocols from Microsoft as part of an expansion of the partnership the two companies forged in June.
Xandros, which offers desktop and server versions of Linux, is acquiring the specification and licenses for Exchange ActiveSync and Outlook-Exchange Transport Protocol so its Scalix Mail Servers can better interoperate with Microsoft clients that now primarily interact with Microsoft's Exchange Server messaging infrastructure.


Microsoft and Xandros first announced a pact to make their products more interoperable in June during Microsoft's annual TechEd conference. Microsoft also agreed not to sue Xandros users for patent-infringement. Microsoft claimed earlier this year that Linux violates more than 230 patents it holds.


Microsoft has been collecting Linux vendors like Novell and Xandros as alliance partners in what some see as an effort to appear more friendly to the open-source OS, which is a strong competitor to its Windows Server OS. For their part, Linux vendors want to protect their customers from any potential patent-infringement claims from Microsoft. And Xandros, being one of the less successful Linux vendors, gains a competitive advantage by teaming up with the software giant.


ActiveSync enables Windows Mobile and Windows CE devices to synchronize with server-side information from Exchange. Xandros will develop a server-side implementation of Exchange ActiveSync so Xandros' Scalix Mail Servers can synchronize data over wireless networks directly with mobile clients that use ActiveSync, said Florian von Kurnatowski, director of program management for Scalix at Xandros.


Currently, ActiveSync-enabled devices can interact directly with Scalix Mail Servers if third-party software is installed locally on the client. Xandros' ActiveSync implementation will eliminate the need for that software, which can be cumbersome and is just an added expense for end-users, von Kurnatowski said.


Microsoft's Outlook-Exchange Transport Protocol allows desktop clients, such as Microsoft Outlook running on the Windows OS, to communicate directly with Exchange Server.


Von Kurnatowski added that creating an ActiveSync implementation is more important to Xandros than creating software using the Outlook-Exchange protocol, because of the complexity of getting users to install third-party software on mobile devices versus the relative ease of using Xandros' own Scalix Connect software on desktop clients. "It's less of a priority than ActiveSync," he said.


Xandros should produce the first results from its implementation of the protocols in six to 12 months, he said.



About Xandros


Xandros is a leading provider of Linux-based server, desktop and Windows-Linux cross-platform systems management tools. Xandros' mission is to help businesses lower costs of IT infrastructure through revolutionary design and workflow-driven management tools. The company's products empower Windows-centric businesses to benefit from the flexibility, reliability and security of Linux and open source, without requiring Linux expertise. Xandros was founded in 2001, and is headquartered in New York, with offices in Ottawa, Frankfurt and Sao Paulo.



Home under the sea


Funded by Australian Geographic magazine, Lloyd Godson's effort was part science experiment, part educational outreach. As a marine biologist, he wanted to learn more about sustainable living in a closed ecological system.


He fashioned the sub from mostly recycled scrap metal welded to keep water out. Inside, a "biocoil" full of water and algae helped to absorb carbon dioxide and supply oxygen [see "The BioSUB," below]. At the same time, he hoped to inspire future aquanauts by broadcasting live video to students worldwide.


Although his stay is not a record for length of time spent underwater (in 1992 aquanaut Rick Presley set that, at 69 days), Godson joins the rarefied ranks of human aquatic inhabitants, among them marine researcher Dennis Chamberland. In 1997 Chamberland lived in NASA's Scott Carpenter Space Analog Station, a seafloor abode near Key Largo, Florida, for 11 days to test life-support systems for space.


Godson is helping Chamberland prepare for a record-breaking 80-day stay aboard the Leviathan Habitat off the coast of Florida in 2009. Why not undertake the mission himself? "I like the things we have up here," he says.


The BioSUB


Container: The two-ton, mostly recycled-steel box is moored to the lakebed by 28 tons of concrete.


Biocoil: A pump churns water and algae through a coiled tube. The algae absorbs carbon dioxide and supplies oxygen.


Dive compressors: The algae-filled biocoil tube offers a nice experiment in human-plant symbiosis, but 12-volt compressors floating on the surface above the shelter are Lloyd Godson's primary source of oxygen.


Generator: A modified exercise bike powers a laptop and the biocoil pump. Godson also drew power from onshore methane fuel cells and solar panels.


Air monitor: This waterproof gas-detection device monitors oxygen and carbon dioxide.



Eliminate Pooled Water Under Home


Q: I just purchased a home and have noticed that I have a major problem under the house. Water is pooling up and making it a muddy mess. There are even mushrooms growing there.


My first inclination would be to install a French drain, but I want to talk with an engineer before I do that. Do you have any recommendations for plumbing engineers, someone who can tell me how they would install a drainage system?


A: Installing a French drain is a lot of work and a lot of expense, especially if you hire someone to do the work.


A French drain requires digging a trench around the house, lining the trench with plastic, placing perforated drainpipe in the trench and backfilling the whole shebang. Imagine digging a 3-foot-deep trench around the entire perimeter of even an average home. That's a lot of dirt to move.


There may be less drastic alternatives that will produce a satisfactory result.


Your instinct to seek professional advice is right. Spending a few dollars to get an expert opinion is often money well spent and will save money and trouble in the long run.


We must confess, we've never heard of a "plumbing engineer." This brings back memories of Art Carney on the old "Honeymooners" TV series. Carney's character, Ed Norton, worked in the New York City sewers as a "sanitation engineer." Funny thing is that these days waste disposal workers are closer to engineers than were their counterparts of the '50s.


There are several types of professionals you could call for help. At the top of the heap would be a soils engineer. This person should be able to analyze the composition of your soil and determine why water does not percolate through it and instead ends up under the house.


Next might be a landscape architect. These professionals are experts not only in horticulture but also in soils. He or she may well be able to suggest ways for you to redirect ground water so that it doesn't end up under the house. Oftentimes, landscape architects double as landscape contractors.


A structural and pest control expert may also be able to help. This may sound strange, but a large percentage of the infestations they see are attributable to water. Your mushroom garden in the crawl space indicates that you certainly have water.


We'd go with the soils engineer for the consultation. He or she may not be equipped to do the work but will be able to give you a number of ways to solve the problem.


An alternative we'd try before seeking a consultation is installing solid PVC drainpipe around the house to carry the water that collects in your gutters and runs down your downspouts away from the house. Connect the downspouts into the pipe so that the rainwater discharges well away from the foundation. This may be enough to dry out your crawl space.


First, lay the pipe on top of the soil to see if this is the solution. If it works, bury the pipe about 6 inches in the ground, making sure that it falls about 1/4 inch for each 10 feet of run. Run it so that the water discharges into a dry well or onto the lawn. A dry well is a gravel-lined hole that accepts water discharged from the pipes and eventually percolates into the soil.
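

If it helps to visualize the slope, here is a tiny helper built only from the numbers in the answer above (a roughly 6-inch starting depth and 1/4 inch of fall per 10 feet of run); the function name and defaults are just illustrative.

```python
# Illustrative helper: how deep the buried drainpipe sits at a given distance,
# assuming the ~6 in. starting depth and 1/4 in. of fall per 10 ft. quoted above.
def trench_depth_inches(run_feet, start_depth_in=6.0, fall_per_10ft_in=0.25):
    """Pipe depth (inches) at `run_feet` from the start, keeping a constant fall."""
    return start_depth_in + (run_feet / 10.0) * fall_per_10ft_in

print(trench_depth_inches(40.0))  # a 40-foot run ends only 1 inch deeper: 7.0
```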


Be sure to place the dry well downgrade from the house. This could solve the problem and save you the cost of a consultation fee and a good deal of the work, time and money to install a French drain.








Team finds way to create cancer stem cells


MIT scientists and colleagues have found a way to create in the lab large amounts of cancer stem cells, or cells that can initiate tumors. The work, reported in the August 13 issue of Cancer Cell, could be a boon to researchers who study these elusive cells. Labs could easily grow them for use in experiments.


The findings also contradict an assumption about the trajectory of cancer cells. According to current cancer models, any normal cell can evolve toward a malignant state through a series of alterations, including mutations. Given the right alterations, any cell could eventually acquire the ability to invade other tissues.


But the new study suggests that some normal cells are more prone to become tumor-initiating and have a higher potential to metastasize, or spread to other tissues.


According to the researchers involved in the current work, in some ways certain tumors resemble bee colonies. Each cancer cell in the tumor plays a specific role, and just a fraction of the cells serve as "queens," possessing the unique ability to maintain themselves in an unspecialized state and seed new tumors. These cells can also divide and produce the "worker" cells that form the bulk of the tumor.


These "queens" are cancer stem cells, and they are the cells recently created by MIT biology professor and Whitehead Institute Member Robert Weinberg and colleagues. They did so by isolating and transforming a particular population of cells from human breast tissue. After being injected with just 100 of these transformed cells, mice developed tumors that metastasized.


"The operational definition of a cancer stem cell is the ability to initiate a tumor, so these are cancer stem cells," said Weinberg.


Tan Ince didn't set out to engineer these potent cells. As a postdoctoral researcher in Weinberg's lab and gynecologic pathologist at Brigham and Women's Hospital, he was simply trying to create breast cancer models that look like real human tumors under the microscope and behave like those seen in many patients.


Now an independent investigator at Brigham and Women's and instructor at Harvard Medical School, Ince developed a recipe for a new chemically defined culture medium and managed to grow a different type of human breast cell that ordinarily dies in culture. He transformed it into a cancer cell by inserting specific genes through a standard procedure.


The engineered cells proved to be extremely powerful. When Ince injected more than 100,000 of them into a mouse with a compromised immune system, it quickly developed massive, deadly tumors. In initial experiments, a few tissue slices revealed a primary tumor structure that resembled that of cancer patients with metastases.


He repeated the experiment in other mice, reducing the number of cells in the injection to as few as 100. The cancer cells continued to seed tumors and those tumors metastasized.


In sharp contrast, scientists must inject about one million cells to get a tumor when working with the cancer cell lines routinely used in the laboratory.


The study also offers clues about the trajectory of cancer cells. A normal cell is thought to evolve progressively toward a malignant state through a series of genetic mutations. The early alterations confer uncontrolled growth, while later alterations enable the cell to migrate and invade other tissues.


The new study suggests, however, that some normal cells are more prone to become tumor-initiating cells and have a higher metastatic potential when they become cancer cells than other normal cells. The culture medium Ince created favors the growth of the human breast cells with high tumor-making and metastatic potential while the standard culture medium favors cells with low tumor-making potential. Although the two types are only slightly different, the cells behave completely differently after acquiring the same mutations.


Ince confirmed this behavioral difference by taking a single human breast tissue sample, splitting it in two and growing the cells in the two culture mediums, which select for different cells. Next, he transformed the two populations with the same tumor-initiating genes, injected them in mice and watched the result. The cells that were grown in the new culture medium were 10,000 times more potent as tumor initiators and were the only ones able to metastasize. Thus, genes that were previously thought only to initiate tumors also initiated metastasis, which is the main cause of cancer mortality in the clinic.


Ince's and Weinberg's colleagues on this work are Andrea Richardson, George Bell, Maki Saitoh, Samuel Godar, and James Iglehart. This research is funded by the Breast Cancer Research Foundation and the National Institutes of Health.





Nuclear power is the solution to climate change


Chances are good, gentle reader, that you are going to have to sit next to someone in the coming year who will assert that nuclear power is the solution to climate change. What will you tell them? There's so much to say. You could be sitting next to someone who hasn't really considered the evidence yet. Or you could be sitting next to scientist and Gaia theorist James Lovelock, a supporter of Environmentalists for Nuclear Energy™, which quotes him saying, "We have no time to experiment with visionary energy sources; civilisation is in imminent danger and has to use nuclear - the one safe, available, energy source - now or suffer the pain soon to be inflicted by our outraged planet."


If you sit next to Lovelock, you might start by mentioning that half the farms in this country had windmills before Marie Curie figured out anything about radiation or Lise Meitner surmised that atoms could be split. Wind power is not visionary in the sense of experimental. Neither is solar, which is already widely used. Nor are nukes safe, and they take far too long to build to be considered readily available. Yet Stewart Brand, of Whole Earth Catalog fame, has jumped on the nuclear bandwagon, and so has Greenpeace founding member turned PR flack Patrick Moore. So you must be prepared.


Of course the first problem is that nuclear power is often nothing more than a way to avoid changing anything. A bicycle is a better answer to a Chevrolet Suburban than a Prius is, and so is a train, or your feet, or staying home, or a mix of all those things. Nuclear power plants, like coal-burning power plants, are about retaining the big infrastructure of centralized power production and, often, the habits of obscene consumption that rely on big power. But this may be too complicated to get into while your proradiation interlocutor suggests that letting a thousand nuclear power plants bloom would solve everything.


Instead, you may be able to derail the conversation by asking whether they'd like to have a nuclear power plant or waste repository in their backyard, which mostly they would rather not, though they'd happily have it in your backyard. This is why the populous regions of the eastern U.S. keep trying to dump their nuclear garbage in the less-populous regions of the West. My friend Chip Ward (from nuclear-waste-threatened Utah) reports, "To make a difference in global climate change, we would have to immediately build as many nuclear power plants as we already have in the U.S. (about 100) and at least as many as 2,000 worldwide." Chip goes on to say that "Wall Street won't invest in nuclear power because it is too risky. . . . The partial meltdown at Three Mile Island taught investment bankers how a two-billion-dollar investment can turn into a billion-dollar clean-up in under two hours." So we, the people, would have to foot the bill.


Nuclear power proponents like to picture a bunch of clean plants humming away like beehives across the landscape. Yet when it comes to the mining of uranium, which mostly takes place on indigenous lands from northern Canada to central Australia, you need to picture fossil-fuel-intensive, carbon-emitting vehicles, and lots of them - big, disgusting, diesel-belching ones. But that's the least of it. The Navajo are fighting right now to prevent uranium mining from resuming on their land, which was severely contaminated by the postwar uranium boom of the 1940s and 1950s. The miners got lung cancer. The children in the area got birth defects and a 1,500 percent increase in ovarian and testicular cancer. And the slag heaps and contaminated pools that were left behind will be radioactive for millennia.


If these facts haven't dissuaded this person sitting next to you, try telling him or her that most mined uranium - about 99.28 percent - is fairly low-radiation uranium-238, which is still a highly toxic heavy metal. To make nuclear fuel, the ore must be "enriched," an energy-intensive process that increases the 0.72 percent of highly fissionable, highly radioactive U-235 up to 3 to 5 percent. As Chip points out, four dirty-coal-fired plants were operated in Kentucky just to operate two uranium enrichment plants. What's left over is a huge quantity of U-238, known as depleted uranium, which the U.S. government classifies as low-level nuclear waste, except when it uses the stuff to make armoring and projectiles that are the source of so much contamination in Iraq from our first war there, and our second.


Reprocessing spent nuclear fuel was supposed to be one alternative to lots and lots of mining forever and forever. The biggest experiment in reprocessing was at Sellafield in Britain. In 2005, after decades of contamination and leaks and general spewing of horrible matter into the ocean, air, and land around the reprocessing plant, Sellafield was shut down because a bigger-than-usual leak of fuel dissolved in nitric acid - some tens of thousands of gallons - was discovered. It contained enough plutonium to make about twenty nuclear bombs. Gentle reader, this has always been one of the prime problems of nuclear energy: the same general processes that produce fuel for power can produce it for bombs. In India. Or Pakistan. Or Iran. The waste from nuclear plants is now the subject of much fretting about terrorists obtaining it for dirty bombs - and with a few hundred thousand tons of high-level waste in the form of spent fuel and a whole lot more low-level waste in the U.S. alone, there's plenty to go around.


By now the facts should be on your side, but do ask how your neighbor feels about nuclear bombs, just to keep things lively.


The truth is, there may not be enough uranium out there to fuel two thousand more nuclear power plants worldwide. Besides, before a nuke plant goes online, a huge amount of fossil fuel must be expended just to build the thing. Still, the biggest stumbling block, where climate change is concerned, is that it takes a decade or more to construct a nuclear plant, even if the permitting process goes smoothly, which it often does not. So a bunch of nuclear power plants that go online in 2017 at the earliest are not even terribly relevant to turning around our carbon emissions in the next decade - which is the time frame we have before it's too late.


If you're not, at this point, chasing your poor formerly pronuclear companion down the hallway, mention that every stage of the nuclear fuel cycle is murderously filthy, imparting long-lasting contamination on an epic scale; that a certain degree of radioactive pollution is standard at each of these stages, but the accidents are now so many in number that they have to be factored in as part of the environmental cost; that the plants themselves generate lots of radioactive waste, which we still don't know what to do with - because the stuff is deadly . . . anywhere . . . and almost forever. And no, tell them, this nuclear colonialism is not an acceptable sacrifice, since it is not one the power consumers themselves are making. It's a sacrifice they're imposing on people far away and others not yet born, a debt they're racking up at the expense of people they will never meet.


Sure, you can say nuclear power is somewhat less carbon-intensive than burning fossil fuels for energy; beating your children to death with a club will prevent them from getting hit by a car. Ravaging the Earth by one irreparable means is not a sensible way to prevent it from being destroyed by another. There are alternatives. We should choose them and use them.






