
Wednesday, October 3, 2007

The Geothermal Energy Industry Connects With the Financial Community for a Major Deal Making Event


Meet leading project developers, investors, lenders, power purchasers, technology experts and other active market players looking to craft deals and discover:



  • How to benefit from the increasing flow of available capital

  • Who will be making deals

  • What terms are available

  • What opportunities exist for partners & investors


About The Summit:

Geothermal is poised to meet the rapidly growing demand for renewable, zero-carbon power, and expansion opportunities across the US hold enormous potential for growth. The outlook for geothermal has never been more promising, and the financial community sees it as the next big renewable play.


The Geothermal Finance & Investment Summit will bring together leading project developers, investors, lenders, EPC contractors, and other players to share their perspectives on the market for geothermal finance and investment. You will not only hear from this group about the latest developments in the geothermal finance and investment markets, but the Summit will also provide an outstanding opportunity to meet and network with active market players and to accurately gauge the current pulse of the industry.


The Summit is designed to provide the latest intelligence on the current market environment for putting together geothermal deals. The players in the market will discuss what they are looking for when they get involved in deals, what future opportunities exist for partners and investors, and how to successfully get deals done in 2008 and beyond. You will hear:



  • Developers offer their perspective on the current market

  • Investors discuss criteria used in evaluating geothermal investments

  • Lenders outline terms and structures for debt-financing

  • Technology companies and EPC contractors provide their perspective on physical development of projects

  • Utilities discuss purchasing geothermal baseload to meet RPS requirements





Financing Geothermal Projects


A full-day workshop, Financing Geothermal Projects, precedes the Summit. It will provide an in-depth tutorial by top lawyers and consultants on how best to structure your project and allocate risk for maximum project success.

To register or obtain more information about this unique event, please visit the event website at http://www.infocastinc.com/geotherm.html, or call (818) 888-4444.














MIT research helps convert brain signals into action


Lakshminarayan Srinivasan (S.M., Ph.D. 2006) is part of a team developing standardized mathematical methods to help neural prostheses work better. He is currently a medical student in the Harvard-MIT Division of Health Sciences and Technology and a postdoctoral researcher at the Center for Nervous System Repair at Massachusetts General Hospital.
The brain is one of the most wonderful things we know of; many believe that if we can one day truly understand it, we may live very different lives.
MIT researchers have developed a new algorithm to help create prosthetic devices that convert brain signals into action in patients who have been paralyzed or had limbs amputated.

The technique, described in a paper published as the cover article in the October edition of the Journal of Neurophysiology, unifies seemingly disparate approaches taken by experimental groups that prototype these neural prosthetic devices in animals or humans.

"The work represents an important advance in our understanding of how to construct algorithms in neural prosthetic devices for people who cannot move to act or speak," said Lakshminarayan "Ram" Srinivasan (S.M., Ph.D. 2006), lead author of the paper.

Srinivasan, currently a postdoctoral researcher at the Center for Nervous System Repair at Massachusetts General Hospital and a medical student in the Harvard-MIT Division of Health Sciences and Technology (HST), began working on the algorithm while a graduate student in MIT's Department of Electrical Engineering and Computer Science (EECS).

Trauma and disease can lead to paralysis or amputation, reducing the ability to move or talk despite the capacity to think and form intentions. In spinal cord injuries, strokes, and diseases such as amyotrophic lateral sclerosis (Lou Gehrig's disease), the neurons that carry commands from the brain to muscle can be injured. In amputation, both nerves and muscle are lost.

Neural prosthetic devices represent an engineer's approach to treating paralysis and amputation. Here, electronics are used to monitor the neural signals that reflect an individual's intentions for the prosthesis or computer they are trying to use. Algorithms form the link between neural signals that are recorded and the user's intentions that are decoded to drive the prosthetic device.

Over the past decade, efforts at prototyping these devices have divided along various boundaries related to brain regions, recording modalities, and applications. The MIT technique provides a common framework that underlies all these various efforts.

The research uses a method called graphical models that has been widely applied to problems in computer science like speech-to-text or automated video analysis. The graphical models used by the MIT team are diagrams composed of circles and arrows that represent how neural activity results from a person's intentions for the prosthetic device they are using.

The diagrams represent the mathematical relationship between the person's intentions and the neural manifestation of that intention, whether the intention is measured by electroencephalography (EEG), intracranial electrode arrays, or optical imaging. These signals could come from a number of brain regions, including cortical or subcortical structures.

Until now, researchers working on brain prosthetics have used different algorithms depending on what method they were using to measure brain activity. The new model is applicable no matter what measurement technique is used, according to Srinivasan. "We don't need to reinvent a new paradigm for each modality or brain region," he said.
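
To make the idea concrete, here is a minimal sketch of one common member of that family of algorithms: a linear-Gaussian state-space model, which can be drawn as exactly the kind of graphical model described above (a hidden intention driving observed neural activity) and decoded with a Kalman-style filter. The dynamics, observation matrices, and noise levels are made-up illustrations, not values from the MIT paper.

```python
# Illustrative sketch only: a linear-Gaussian state-space decoder of the kind
# that can be drawn as a graphical model (hidden intention -> neural observation).
# All matrices below are assumptions for the example, not values from the paper.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 0.9]])              # intention dynamics (e.g., position, velocity)
C = np.array([[1.2, 0.3], [0.2, 0.8], [0.5, 0.5]])  # maps intention to 3 recorded channels
Q = 0.01 * np.eye(2)                                 # process noise covariance
R = 0.10 * np.eye(3)                                 # observation (neural) noise covariance

def decode(observations):
    """Run a Kalman filter over neural observations and return decoded intentions."""
    x = np.zeros(2)    # current estimate of the user's intention
    P = np.eye(2)      # estimate covariance
    decoded = []
    for y in observations:
        # predict step: propagate the intention forward in time
        x = A @ x
        P = A @ P @ A.T + Q
        # update step: correct the prediction with the new neural measurement
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ (y - C @ x)
        P = (np.eye(2) - K @ C) @ P
        decoded.append(x.copy())
    return np.array(decoded)

# toy usage: simulated neural data for 100 time steps
rng = np.random.default_rng(0)
print(decode(rng.normal(size=(100, 3))).shape)  # (100, 2)
```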

Srinivasan is quick to underscore that many challenges remain in designing neural prosthetic algorithms before they are available for people to use. While the algorithm is unifying, it is not universal: the algorithm consolidates multiple avenues of development in prostheses, but it isn't the final and only approach these researchers expect to see in the years to come. Moreover, energy efficiency and robustness are key considerations for any portable, implantable bio-electronic device.

Through a better quantitative understanding of how the brain normally controls movement and the mechanisms of disease, he hopes these devices could one day allow for a level of dexterity depicted in movies, such as actor Will Smith's mechanical arm in the movie, "I, Robot."

The gap between existing prototypes and that final goal is wide. Translating an algorithm into a fully functioning clinical device will require a great deal of work, but also represents an intriguing road of scientific and engineering development for the years to come.

Other authors on the paper are Uri Eden (Ph.D. 2005), assistant professor in mathematics and statistics at Boston University; Sanjoy Mitter, professor in EECS and MIT's Engineering Systems Division; and Emery Brown, professor in brain and cognitive sciences, HST, and anesthesia and critical care at Massachusetts General Hospital. The cover image for the October issue of Journal of Neurophysiology that depicts this research was designed by Rene Chen (B.S. 2007) and Eric Pesanelli.

This work was sponsored by the National Institutes of Health and the National Science Foundation.

Could Adenine From Interstellar Dust Have Triggered Life On Earth? Elsewhere?


Some of the elements necessary to support life on Earth are widely known - oxygen, carbon and water, to name a few. Just as important in the existence of life as any other component is the presence of adenine, an essential organic molecule. Without it, the basic building blocks of life would not come together. Scientists have been trying to find the origin of Earth's adenine and where else it might exist in the solar system. University of Missouri-Columbia researcher Rainer Glaser may have the answer.


Life exists on Earth because of a delicate combination of chemical ingredients. Using a theoretical model, Glaser is hypothesizing the existence of adenine in interstellar dust clouds. Those same clouds may have showered young Earth with adenine as it began cooling billions of years ago, and could potentially hold the key for initiating a similar process on another planet.


"The idea that certain molecules came from space is not outrageous," said Glaser, professor of chemistry in MU's College of Arts and Science. "You can find large molecules in meteorites, including adenine. We know that adenine can be made elsewhere in the solar system, so why should one consider it impossible to make the building blocks somewhere in interstellar dust?"


Glaser believes astronomers should look for interstellar dust clouds that have highly-concentrated hydrogen cyanide (HCN), which can indicate the presence of adenine. Finding such pockets would narrow the spectrum of where life could exist within the Milky Way galaxy.


"There is a lot of sky with a few areas that have dust clouds. In those dust clouds, a few of them have HCN. A few of those have enough HCN to support the synthesis of the molecules of life. Now, we have to look for the HCN concentrations, and that's where you want to look for adenine," Glaser said. "Chemistry in space and 'normal chemistry' can be very different because the concentrations and energy-exchange processes are different. These features make the study of chemistry in space very exciting and academically challenging; one really must think without prejudice."


This theory describing the fusion of early life-forming chemicals is presented in the latest issue of the peer-reviewed journal "Astrobiology" and is co-authored by Brian Hodgen (Creighton University), Dean Farrelly (University of Manchester) and Elliot McKee (St. Louis University). The paper, "Adenine Synthesis in Interstellar Space: Mechanisms of Prebiotic Pyrimidine-Ring Formation of Monocyclic HCN-Pentamers," describes the absence of a sizeable barrier that would prevent formation of the skeleton needed for adenine synthesis. The article is also featured in the Aug. 6 issue of "Chemical & Engineering News."
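
The paper's title calls adenine an HCN pentamer, and a quick check of molecular formulas shows why: five HCN units contain exactly the atoms of adenine, C5H5N5. The short script below is just that arithmetic, using standard atomic weights.

```python
# Quick arithmetic check that adenine (C5H5N5) has the composition of five HCN units.
atomic_mass = {"H": 1.008, "C": 12.011, "N": 14.007}

hcn_mass = atomic_mass["H"] + atomic_mass["C"] + atomic_mass["N"]
adenine_mass = 5 * (atomic_mass["C"] + atomic_mass["H"] + atomic_mass["N"])

print(f"5 x HCN : {5 * hcn_mass:.3f} g/mol")   # ~135.13 g/mol
print(f"adenine : {adenine_mass:.3f} g/mol")   # same atoms, rearranged into the purine skeleton
```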


Source : http://www.sciencedaily.com/releases/2007/10/071002113036.htm





Lubrication oil appears to be an important yet little-recognized source of toxic particle emissions from motor vehicles


Lubrication oil appears to be an important yet little-recognized source of toxic particle emissions from motor vehicles -- even those fueled by clean-burning hydrogen, according to a joint study by government and academic researchers in Washington State and Minnesota.


Their study, a step toward cleaner-burning engines, will be published in the Oct. 1 issue of ACS' Environmental Science & Technology.


Scientists have long recognized diesel-fueled vehicles as important sources of air pollution that can increase the risk of asthma, bronchitis, and other health problems. Most research, however, has focused on diesel soot, rather than emissions produced by lubrication oil.


In the new study, Arthur L. Miller and colleagues modified a truck diesel engine to run on clean-burning hydrogen instead of diesel fuel, allowing the researchers to focus solely on particle emissions from lubrication oil.


They found that the hydrogen-powered engine emitted higher levels of metal-rich particles than the diesel-fueled engine. Lubrication oil was the primary source of these increased emissions. Emission particles identified include calcium, phosphorus, zinc, magnesium, and iron nanoparticles, all of which have the potential to cause lung damage when inhaled over long periods, they say.


"This study's findings may increase current knowledge about the role of lubrication oil in particle-formation dynamics as engine technology improves and cleaner internal combustion engines are developed," the researchers state.


More about toxic air


Dioxin, lead and particulate matter emissions from diesel-fueled engines are three of five toxic air contaminants that may cause children and infants to be especially susceptible to illness, according to a new evaluation conducted by the California Environmental Protection Agency's Office of Environmental Health Hazard Assessment (OEHHA).


Polycyclic organic matter and acrolein are the other two toxic air contaminants identified in the evaluation.


"This was one of the most extensive evaluations to date of the effects that toxic pollutants in our air may have specifically on children and infants," said OEHHA Director Dr. Joan Denton.


"There is increasing evidence that children and infants may be more vulnerable than adults to the toxic effects of many pollutants," Dr. Denton said. "However, most past scientific research focused on the effects of pollution on adults. For that reason, most air-quality regulations are based on the effects of air contaminants on adults, rather than children. This evaluation is a key step in California's efforts to ensure children receive the protection they deserve from toxic air contaminants."


The OEHHA evaluation was mandated by the Children's Environmental Health Protection Act (Senate Bill 25), which was authored by Senator Martha Escutia and signed into law by Governor Gray Davis in October 1999. The Act requires OEHHA to evaluate available information on toxic air contaminants and develop a list of up to five toxic air contaminants that may cause children and infants to be especially susceptible to illness. OEHHA is forwarding the list to the California Air Resources Board (ARB), which is required by the Act to review existing regulations for those contaminants and, if necessary, amend them or develop new regulations to ensure the adequate protection of children and infants.


The Act also requires OEHHA to continue evaluating the health effects of other toxic air contaminants on children and infants. Beginning in 2004, OEHHA will annually evaluate at least 15 contaminants and then present an updated list of contaminants to ARB, which will review and revise its regulations as needed.


Children may face greater risks than adults from air pollution, in part because their exposure to airborne pollutants is greater. Infants and children generally breathe more air per pound of body weight than adults, which increases their exposure to any pollutants in the air. Infants and children often breathe through their mouths, bypassing the filtering effect of the nose and allowing more pollutants to be inhaled. Children also tend to be more active physically than adults, and spend more time outdoors.
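
As a rough illustration of the per-body-weight point, the comparison below uses assumed, exposure-handbook-style daily inhalation volumes and body masses (not figures from the OEHHA evaluation); on numbers like these, a small child takes in several times more air per kilogram of body weight each day than an adult.

```python
# Illustrative comparison of inhaled air per kilogram of body weight.
# Daily inhalation volumes and body masses are rough assumptions for the example,
# not figures taken from the OEHHA evaluation.

def air_per_kg(daily_inhalation_m3, body_mass_kg):
    """Cubic meters of air inhaled per day, per kg of body weight."""
    return daily_inhalation_m3 / body_mass_kg

child = air_per_kg(daily_inhalation_m3=6.0, body_mass_kg=10.0)   # toddler-sized assumption
adult = air_per_kg(daily_inhalation_m3=16.0, body_mass_kg=70.0)

print(f"child: {child:.2f} m^3/day per kg")   # 0.60
print(f"adult: {adult:.2f} m^3/day per kg")   # 0.23 -- roughly two to three times lower
```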


Exposure to toxic air contaminants during infancy or childhood could interfere with the development of the respiratory, nervous, endocrine and immune systems, and could increase the risk of cancer later in life.


Beginning in early 2000, OEHHA scientists conducted an initial review of the toxicity and prevalence of more than 200 toxic air contaminants. OEHHA then oversaw focused reviews of the scientific literature on 36 of those contaminants, and selected 17 contaminants for further evaluation based on evidence of their potential effects on children. The state's Scientific Review Panel on Toxic Air Contaminants, a committee of independent scientists affiliated with the University of California, reviewed OEHHA's draft report and endorsed the selection of the final five contaminants, as described below:


Dioxins are a family of chemicals that include polychlorinated dibenzo-p-dioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs). Dioxins typically are released to the air during waste incineration, the burning of fuels to produce power for industrial purposes, and motor vehicle use. Dioxins persist for long periods of time in the environment. Airborne dioxins can settle on crops, which are then eaten by humans directly, or by livestock that humans later consume.


Fetuses and newborns are particularly vulnerable to dioxin exposure. Dioxins have been found in amniotic fluid and placenta samples, and breast-fed infants can have blood levels of dioxin greater than those in their mothers. Evidence indicates that exposure to dioxins during infancy may affect the development of the immune system and later make the child more susceptible to infectious diseases. Fetal exposure to dioxins may be associated with low birth weight. Early dioxin exposure may also increase cancer risk later in life.


Regulatory efforts have led to a substantial decrease in dioxin emissions. By 1995, dioxin releases were 80 percent lower than in the 1970s. Federal and state regulations in recent years have targeted municipal waste and medical waste incinerators. ARB is initiating a new evaluation of the sources of dioxin emissions.


Lead has long been associated with toxic effects in children. Low levels of lead exposure have been associated with delays in mental development; decreases in intelligence, short-term memory and visual motor functioning; and aggressive behavior in children.


Airborne lead levels have decreased dramatically in recent decades, primarily due to the ban on leaded gasoline. Deteriorating lead-based paint is now a more significant source of lead exposure for California's children than lead in the ambient air. However, lead emissions still occur from a number of industrial facilities in California. Children living close to these facilities may face an increased risk of lead-related health effects, especially if they are further exposed to lead from paint and other sources.


Polycyclic Organic Matter (POM) consists of a family of more than 100 chemicals, including benzo[a]pyrene and naphthalene. These compounds are produced by the combustion of fossil fuels, vegetable matter and other carbon-based materials. POM is present in exhaust from diesel- and gasoline-powered motor vehicles, fireplace smoke, tobacco smoke, and emissions from paper mills, industrial machinery manufacturing plants, and petroleum refineries. POM can be a significant indoor air pollutant due to smoking, wood burning, and infiltration of outdoor polluted air.


A number of POM substances have been identified as causing cancer in humans or animals. Early-in-life exposures to POM may increase the risk of cancer later in life. Transfer of POM from the mother to the fetus has been well documented, and several studies indicate POM exposure in the womb may result in low birth weight, birth defects or cancer.


ARB regulations have significantly reduced POM emissions from motor vehicles, and new ARB motor vehicle measures are expected to further reduce POM emissions. Industrial facilities with significant POM emissions are required by state law to submit health risk assessments for OEHHA's review and, if necessary, implement risk-reduction measures.


Particulate matter emissions from diesel-fueled engines are microscopic particles present in diesel exhaust. These particles can inflame the airways, enhance allergic responses and may make children more susceptible to allergies and asthma. They also contribute to overall levels of airborne particles, which have been associated with exacerbation of asthma, bronchitis, cough and wheeze in children. Diesel particles also contain a number of toxic substances, including POM.


ARB has previously approved various regulations to reduce diesel-exhaust emissions. In September 2000, ARB approved a risk reduction plan that is expected to reduce diesel particle emissions by 85% by 2020. ARB will determine whether any changes in its diesel regulatory program are needed as a result of OEHHA's identification of diesel exhaust particles.


Acrolein is present in motor vehicle exhaust, tobacco smoke, wood smoke and some industrial emissions, and is used as an herbicide in irrigation canals. It can also be formed in the atmosphere from chemical reactions involving 1,3-butadiene, another pollutant present in motor vehicle and industrial emissions. Acrolein is very difficult to measure in ambient air, but studies indicate it is routinely present in urban settings at concentrations that may affect the respiratory system.


Several studies in animals strongly suggest that acrolein may exacerbate asthma. OEHHA believes this is of special concern for children, because asthma is more prevalent among children than adults, and because asthma episodes can be more severe in children than adults due to their smaller airways.


Other toxic air contaminants that may make children and infants especially susceptible to illnesses and that will be given a high priority in future OEHHA reviews are arsenic, benzene, carbon disulfide, chlorine, formaldehyde, glycol ethers, manganese, mercury, methyl bromide, methylene chloride, polychlorinated biphenyls (PCBs) and vinyl chloride.


In other activities related to the Children's Environmental Health Protection Act, OEHHA and ARB are studying whether the state's ambient air quality standards for particulate matter and other pollutants adequately protect the health of children. For more information, please see the fact sheet, "Air Pollution and Children's Health" on OEHHA's Web site at www.oehha.ca.gov/public_info/facts/airkids.html. (updated 2/28/02)


The Office of Environmental Health Hazard Assessment is one of six entities within the California Environmental Protection Agency. OEHHA's mission is to protect and enhance public health and the environment by objective scientific evaluation of risks posed by hazardous substances.







Giardia Genome Unlocked




Giardia lamblia trophozoite. This is the form the parasite takes while living in the intestine of a human or other animal. Scanning electron micrograph, false color. (Credit: Joel Mancuso, University of California, Berkeley).


Giardia lamblia, one of the most common human parasites in the United States, causes more than 20,000 intestinal infections a year, often through contact with contaminated drinking or swimming water. In the September 28 issue of Science, an international team led by researchers at the MBL (Marine Biological Laboratory) describes the complete genome (genetic sequence) of Giardia, which could lead to the development of new drugs to combat this persistent infection, called giardiasis.


"Even though there are treatments now available, a number of people get chronic giardiasis, which is difficult to eliminate. So there is interest in new treatments," says Hilary Morrison, Ph.D., of the MBL's Josephine Bay Paul Center for Comparative Molecular Biology and Evolution, the first author on the paper.


The Giardia parasite lives in the human intestine in a swimming and feeding form called a trophozoite, which is eventually expelled through the stools. Outside the body, Giardia takes the form of a highly infectious cyst that can live for weeks in water, soil, food, or on other surfaces.


Giardiasis is most common among children, especially those who are exposed to diaper changing. Swimmers, hikers, campers and others who drink untreated water are also prone to the infection (hence the nickname "backpacker's disease" or "beaver fever"), as are international travelers. Common symptoms include diarrhea, nausea, stomach cramps and gas, and usually persist two to six weeks. Because the parasite clings to intestinal cells that absorb fats and nutrients, giardiasis can lead to severe complications such as poor nutrient absorption and weight loss.


Giardia spends one phase of its lifecycle in the environment and the other in the gut of an infected human or wild animal. To maintain this dual existence, the parasite has two radically different microscopic forms.


In water, Giardia exists as a hardy, highly infectious cyst, which can survive for months, even in fresh water devoid of all nutrients. In the gut, Giardia exists in a swimming and feeding form known as a trophozoite.


The awakening of the dormant cyst happens quickly after someone swallows contaminated water or food. After the cysts encounter the warm acidic juices in the stomach, they change into trophozoites. Within about two hours, these trophozoites will be swimming in the intestines.


Unlike many other parasites, trophozoites do not invade tissues or cells. Instead they simply attach to cells, drink in nutrients and multiply. The parasite evades the immune system and persists in the intestine by shifting the proteins it displays on its surfaces.


Existing drugs can effectively treat people with Giardia infections, the disease known as giardiasis, but most infections resolve on their own. When trophozoites detach from the intestinal wall, they may swim and reattach to new intestinal cells, or they may pass down the digestive tract and into the bowels, transform back into cysts and be passed through the stools.


"Although not life threatening, Giardia is a rather fastidious parasite and quite important from an economical viewpoint worldwide and in the United States, where it constitutes a major cause of diarrheal disease in children in daycare centers," says Mitchell Sogin, Ph.D., director of the Josephine Bay Paul Center and leader of the Giardia study.


Analysis of the Giardia genome revealed several unusual proteins that are promising drug targets, Morrison says. "These proteins are different enough from human proteins that if you affect them with a drug, it's not going to affect the human counterparts," she says. "Drugs can be devised that will interfere with the parasite's ability to replicate, or to move or bind in the small intestine, or to exist at all."


Evolutionary history



The team also investigated the evolutionary history of this ancient parasite. Giardia is a single-celled eukaryote, meaning its cell has a nucleus, as do the cells of humans and most other multicellular organisms. But the Giardia genome is compact compared to other eukaryotes, with simplified machinery for several basic processes, such as DNA replication and RNA processing.



If the Giardia genome had originally been complex and experienced gene loss over evolutionary time, Morrison says, one would expect to see parts of the machinery intact and parts missing. This, however, wasn't the case. "It looks like the genome was just simpler to begin with," she says. The authors hypothesize that Giardia diverged from other eukaryotes more than a billion years ago.


"We embarked upon this genome project because of its importance to human health and suggestions from earlier molecular analyses that Giardia represents a very early-diverging lineage in the evolutionary history of eukaryotes," Sogin says. "Giardia's genome content and architecture support these theories about the parasite's ancestral character."


Eukaryotic organisms are so diverse that they include everything from amoebae to humans. But even within such a wide range, Giardia is unusual. In its trophozoite form, it has two nuclei instead of the more usual one. When it becomes a cyst, it multiplies its genetic material into four identical nuclei. But despite having these multiple copies of its nuclei, Giardia is really a genetic minimalist. It has fewer and simpler genetic components than most other eukaryotes.


Why? According to one theory, Giardia is simple because it has lost complexity: evolutionary pressure favors parasites that shed genes coding for functions they can borrow from their infected hosts. An alternative theory holds that the parasite may have always been simple because it diverged from other eukaryotic organisms more than a billion years ago, long before the complex modern eukaryotes emerged.



Hilary Morrison, Ph.D., and Mitchell Sogin, Ph.D., of MBL, who led the study, say their findings support this latter theory. Careful analysis of the genome reveals that Giardia's molecular machinery -- even for the most basic processes usually shared by other eukaryotes -- is simple by comparison. This suggests that it has always been so. Its parasitic niche has allowed it to maintain its simple genetic makeup for billions of years -- long before it started showing up at daycare centers.


Another important finding, Sogin says, is that the genes that allow Giardia to evade the human immune response are organized differently than in other parasites. In the host intestine, Giardia eludes an immune system attack by shifting the proteins it displays on its surfaces. The genes for these surface proteins are scattered throughout the Giardia genome rather than found in clusters, as in other parasites.


Along with Drs. Morrison, Sogin and their MBL colleagues, collaborators on the project included researchers from the University of California, San Diego; the University of Texas at El Paso; the University of Arizona College of Medicine; the University of Illinois at Urbana-Champaign; Uppsala University in Sweden; the University of Zürich; Boston University Goldman School of Dental Medicine; the Swedish Institute for Infectious Disease Control; the Salk Institute for Biological Studies; and the University of Pennsylvania.


The research was funded by the National Institute of Allergy and Infectious Diseases (NIAID), one of the National Institutes of Health (NIH).










Extreme Star Cluster Bursts Into Life




The NASA/ESA Hubble Space Telescope has captured a spectacular image of NGC 3603, a giant nebula hosting one of the most prominent massive young clusters in the Milky Way. (Credit: NASA, ESA and the Hubble Heritage (STScI/AURA)-ESA/Hubble Collaboration)


The NASA/ESA Hubble Space Telescope has captured a spectacular image of NGC 3603, a giant nebula hosting one of the most prominent massive young clusters in the Milky Way, thus supplying a prime template for star formation studies.


NGC 3603 is a prominent star-forming region located in the Carina spiral arm of the Milky Way, about 20,000 light-years away from our Solar System. This latest image from the NASA/ESA Hubble Space Telescope shows a young star cluster surrounded by a vast region of dust and gas. Most of the bright stars in the image are hot blue stars whose ultraviolet radiation and violent winds have blown out an enormous cavity in the gas and dust enveloping the cluster.


The new Hubble image provides a snapshot in time of many stars with differing masses but similar ages inside the young cluster. This allows for detailed analysis of several types of stars at varying stages in their lives. Astronomers can then compare clusters of different ages with one another and determine which properties (such as temperature and brightness) change as the stars get older.


According to astronomer Dr. Jesús Maíz Apellániz from Instituto de Astrofísica de Andalucía, Spain, who is leading the Hubble investigation, the massive star cluster in NGC 3603 appears to gather the most massive stars at its core. He and his team have discovered that the distribution of different types of stars at the centre of this very dense cluster is similar to that of other young star clusters in the Milky Way.


The team has also found that the three brightest stars in the centre are apparently misleading us into believing that they are more massive objects than theoretical limits allow. These heavyweight stars may actually consist of two or maybe more individual massive stars whose light has blended together. Even with the resolution of Hubble it is not possible to separate the individual stars in each of the three systems.



This finding agrees with a recent discovery by Dr. Anthony Moffat from the Université de Montréal, Canada, who used ESO's Very Large Telescope and Hubble's infrared NICMOS camera to measure the movements of the individual stars in two of the three systems. Dr. Moffat measured the largest individual mass to be roughly 115 solar masses, which is within the acceptable limits for conventional theory.


The swirling nebula of NGC 3603 contains around 400,000 solar masses of gas. Lurking within this vast cloud are a few Bok globules (seen at the top right corner of the image), named after Bart Bok who first observed them in the 1940s. These are dark clouds of dense dust and gas with masses about ten to fifty times that of the Sun. They resemble insect cocoons and are undergoing gravitational collapse on their way to forming new stars. Bok globules appear to be some of the coldest objects in the Universe.


NGC 3603 was first discovered by Sir John Herschel in 1834. It is known to harbour a blue supergiant star called Sher 25 that can be spotted above and to the left of the densest part of the cluster. This star is believed to be near the point of exploding as a supernova and is often described as the Milky Way counterpart of the progenitor of the now famous supernova SN 1987A.










Nanotube Forests Grown On Silicon Chips For Future Computers, Electronics




Mechanical engineering doctoral student Baratunde A. Cola, from left, looks through a view port in a plasma-enhanced chemical vapor deposition instrument while postdoctoral research fellow Placidus Amama adjusts settings. The two engineers recently have shown how to grow forests of tiny cylinders called carbon nanotubes onto the surfaces of computer chips to enhance the flow of heat at a critical point where the chips connect to cooling devices called heat sinks. The carpetlike growth of nanotubes has been shown to outperform conventional "thermal interface materials." The research is based at the Birck Nanotechnology Center in Discovery Park at Purdue. (Credit: Purdue News Service photo by David Umberger)


Engineers have shown how to grow forests of tiny cylinders called carbon nanotubes onto the surfaces of computer chips to enhance the flow of heat at a critical point where the chips connect to cooling devices called heat sinks.
The carpetlike growth of nanotubes has been shown to outperform conventional "thermal interface materials." Like those materials, the nanotube layer does not require elaborate clean-room environments, representing a possible low-cost manufacturing approach to keep future chips from overheating and reduce the size of cooling systems, said Placidus B. Amama, a postdoctoral research associate at the Birck Nanotechnology Center in Purdue's Discovery Park.


Researchers are trying to develop new types of thermal interface materials that conduct heat more efficiently than conventional materials, improving overall performance and helping to meet cooling needs of future chips that will produce more heat than current microprocessors. The materials, which are sandwiched between silicon chips and the metal heat sinks, fill gaps and irregularities between the chip and metal surfaces to enhance heat flow between the two.


The method developed by the Purdue researchers enables them to create a nanotube interface that conforms to a heat sink's uneven surface, conducting heat with less resistance than comparable interface materials currently in use by industry, said doctoral student Baratunde A. Cola.


Findings were detailed in a research paper that appeared in September's issue of the journal Nanotechnology. The paper was written by Amama; Cola; Timothy D. Sands, director of the Birck Nanotechnology Center and the Basil S. Turner Professor of Materials Engineering and Electrical and Computer Engineering; and Xianfan Xu and Timothy S. Fisher, both professors of mechanical engineering.


Better thermal interface materials are needed either to test computer chips in manufacturing or to keep chips cooler during operation in commercial products.


"In a personal computer, laptop and portable electronics, the better your thermal interface material, the smaller the heat sink and overall chip-cooling systems have to be," Cola said.


Heat sinks are structures that usually contain an array of fins to increase surface contact with the air and improve heat dissipation, and a fan often also is used to blow air over the devices to cool chips.


Conventional thermal interface materials include greases, waxes and a foil made of a metal called indium. All of these materials, however, have drawbacks. The greases don't last many cycles of repeatedly testing chips on the assembly line. The indium foil doesn't make good enough contact for optimum heat transfer, Fisher said.


The Purdue researchers created templates from branching molecules called dendrimers, forming these templates on a silicon surface. Then, metal catalyst particles that are needed to grow the nanotubes were deposited inside cavities between the dendrimer branches. Heat was then applied to the silicon chip, burning away the polymer and leaving behind only the metal catalyst particles.


The engineers then placed the catalyst particle-laden silicon inside a chamber and exposed it to methane gas. Microwave energy was applied to break down the methane, which contains carbon. The catalyst particles prompted the nanotubes to assemble from carbon originating in the methane, and the tubes then grew vertically from the surface of the silicon chip.


"The dendrimer is a vehicle to deliver the cargo of catalyst particles, making it possible for us to seed the carbon nanotube growth right on the substrate," Amama said. "We are able to control the particle size - what ultimately determines the diameters of the tubes - and we also have control over the density, or the thickness of this forest of nanotubes. The density, quality and diameter are key parameters in controlling the heat-transfer properties."


The catalyst particles are made of "transition metals," such as iron, cobalt, nickel or palladium. Because the catalyst particles are about 10 nanometers in diameter, they allow the formation of tubes of similar diameter.


The branching dendrites are tipped with molecules called amines, which act as handles to stick to the silicon surface.


"This is important because for heat-transfer applications, you want the nanotubes to be well-anchored," Amama said.


Researchers usually produce carbon nanotubes separately and then attach them to the silicon chips or mix them with a polymer and then apply them as a paste.


"Our direct growth approach, however, addresses the critical heat-flow path, which is between the chip surface and the nanotubes themselves," Fisher said. "Without this direct connection, the thermal performance suffers greatly."


Because the dendrimers have a uniform composition and structure, the researchers were able to control the distribution and density of catalyst particles.


The research team also has been able to control the number of "defect sites" in the lattice of carbon atoms making up the tubes, creating tubes that are more flexible. This increased flexibility causes the nanotube forests to conform to the surface of the heat sink, making for better contact and improved heat conduction.


"The tubes bend like toothbrush bristles, and they stick into the gaps and make a lot of real contact," Cola said.


The carbon nanotubes were grown using a technique called microwave plasma chemical vapor deposition, a relatively inexpensive method for manufacturing a thermal-interface material made of carbon nanotubes, Fisher said.


"The plasma deposition approach allows us great flexibility in controlling the growth environment and has enabled us to grow carbon nanotube arrays over a broad range of substrate temperatures," Fisher said.


The research has been funded by NASA through the Institute for Nanoelectronics and Computing, based at Purdue's Discovery Park. Cola also received support through a fellowship from Intel Corp. and Purdue.






Individual Differences Caused By Shuffled Chunks Of DNA In The Human Genome




Snyder, Urban and Korbel (L-R) examine the distribution of structural variation on a map of the human genome. (Credit: Image courtesy of Yale University)



A study by Yale researchers offers a new view of what causes the greatest genetic variability among individuals -- suggesting that it is due less to single point mutations than to the presence of structural changes that cause extended segments of the human genome to be missing, rearranged, or present in extra copies.
"The focus for identifying genetic differences has traditionally been on point mutations or SNPs -- changes in single bases in individual genes," said Michael Snyder, the Cullman Professor of Molecular, Cellular & Developmental Biology and senior author of the study, which was published in Science Express. "Our study shows that a considerably greater amount of variation between individuals is due to rearrangement of big chunks of DNA."


Although the original human genome sequencing effort was comprehensive, it left regions that were poorly analyzed. Recently, investigators found that even in healthy individuals, many regions in the genome show structural variation. This study was designed to fill in the gaps in the genome sequence and to create a technology to rapidly identify structural variations between genomes at very high resolution over extended regions.


"We were surprised to find that structural variation is much more prevalent than we thought and that most of the variants have an ancient origin. Many of the alterations we found occurred before early human populations migrated out of Africa," said first author Jan Korbel, a postdoctoral fellow in the Department of Molecular Biophysics & Biochemistry at Yale.


To look at structural variants that were shared or different, DNA from two females -- one of African descent and one of European descent -- was analyzed using a novel DNA-based methodology called Paired-End Mapping (PEM). Researchers broke up the genome DNA into manageable-sized pieces about 3000 bases long; tagged and rescued the paired ends of the fragments; and then analyzed their sequence with a high-throughput, rapid-sequencing method developed by 454 Life Sciences.
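
The logic behind paired-end mapping can be sketched in a few lines: if the two ends of a roughly 3,000-base fragment map farther apart, or closer together, on the reference genome than the fragment length allows, the intervening region likely carries a deletion or an insertion. The positions, tolerance, and labels below are invented for illustration and are not the study's actual pipeline.

```python
# Minimal sketch of paired-end-mapping logic for finding structural variants.
# Coordinates, the expected insert size, and the tolerance are invented for
# illustration; this is not the study's actual analysis pipeline.

EXPECTED_INSERT = 3000   # bases, the approximate fragment length used in the study
TOLERANCE = 500          # allowed deviation before a pair is called discordant

def classify_pair(left_end, right_end):
    """Classify one mapped read pair by the distance between its ends on the reference."""
    span = right_end - left_end
    if span > EXPECTED_INSERT + TOLERANCE:
        return "possible deletion in sample"    # reference holds sequence the sample lacks
    if span < EXPECTED_INSERT - TOLERANCE:
        return "possible insertion in sample"   # sample holds sequence missing from the reference
    return "concordant"

# toy mapped pairs: (left mapping position, right mapping position)
pairs = [(10_000, 13_100), (52_000, 59_500), (80_000, 81_200)]
for left, right in pairs:
    print(left, right, "->", classify_pair(left, right))
```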


"454 Sequencing can generate hundreds of thousands of long read pairs that are unique within the human genome to quickly and accurately determine genomic variations," explained Michael Egholm, a co-author of the study and vice president of research and development at 454 Life Sciences.


"Previous work, based on point mutations estimated that there is a 0.1 percent difference between individuals, while this work points to a level of variation between two- and five-times higher," said Snyder.


"We also found 'hot spots' -- particular regions where there is a lot of variation," said Korbel. "While these regions may be still actively undergoing evolution, they are often regions associated with genetic disorder and disease."


"These results will have an impact on how people study genetic effects in disease," said Alex Eckehart Urban, a graduate student in Snyder's group, and one of the principal authors on the study. "It was previously assumed that 'landmarks,' like the SNPs mentioned earlier, were fairly evenly spread out in the genomes of different people. Now, when we are hunting for a disease gene, we have to take into account that structural variations can distort the map and differ between individual patients."


"While it may sound like a contradiction," says Snyder, "this study supports results we have previously reported about gene regulation as the primary cause of variation. Structural variation of large of spans of the genome will likely alter the regulation of individual genes within those sequences."


According to the authors, even in healthy people, there are variants in which part of a gene is deleted or sequences from two genes are fused together without destroying the cellular activity with which they are associated. They say these findings show that the "parts list" of the human genome may be more variable, and possibly more flexible, than previously thought.


Other authors from Yale in addition to primary authors Alex E Urban and Jan Korbel, who is also affiliated with the European Molecular Biology Laboratory in Heidelberg, Germany, are Fabian Grubert, Philip Kim, Dean Palejev, Nicholas Carriero, Andrea Tanzer, Eugenia Saunders, Sherman Weissman, and Mark Gerstein. The research was funded by the National Institutes of Health, a Marie Curie Fellowship, the Alexander von Humboldt Foundation, The Wellcome Trust, Roche Applied Science and the Yale High Performance Computation Center.









Hi-Def Radio Tagging: Hear It on the Air, Buy It on iTunes






Did you see this? I know this has been around for a while, but there is something significant going on here.


Apple (Nasdaq: AAPL) and a company called iBiquity Digital teamed up to provide a service to folks with digital HD radio that will let people "tag" a song they hear for purchase at the iTunes Store.


The service is called "iTunes Tagging" and it's pretty cool technology.



Find It, Keep It

Because it is digital radio, information about each song played is streamed to your receiver. If the receiver has the ability to store a small amount of data it can store the info on any song you tag. When you sync your radio with your computer, any stored song located on the iTunes Store is made available for you to buy.
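
The flow amounts to a small data model: the receiver keeps a list of tag records pulled from the HD Radio metadata stream, and a later sync step matches them against the store's catalog. The sketch below is purely hypothetical; the class names, fields, and catalog lookup are mine, not Apple's or iBiquity's actual interfaces.

```python
# Hypothetical sketch of the tag-and-sync flow described above.
# Class names, fields, and the catalog lookup are invented for illustration;
# they are not Apple's or iBiquity's real interfaces.
from dataclasses import dataclass

@dataclass
class TaggedSong:
    title: str
    artist: str
    station: str          # HD Radio station that broadcast the metadata

class Receiver:
    """Stores a small amount of tag data until the next sync."""
    def __init__(self):
        self.tags: list[TaggedSong] = []

    def tag_current_song(self, title: str, artist: str, station: str) -> None:
        self.tags.append(TaggedSong(title, artist, station))

def sync_to_store(receiver: Receiver, catalog: dict[tuple[str, str], str]) -> list[str]:
    """Return store item IDs for tagged songs found in the (hypothetical) catalog."""
    available = []
    for song in receiver.tags:
        item_id = catalog.get((song.title, song.artist))
        if item_id is not None:
            available.append(item_id)
    receiver.tags.clear()          # tags are consumed once offered for purchase
    return available

# toy usage
radio = Receiver()
radio.tag_current_song("Some New Tune", "Some Band", "WXYZ-HD1")
print(sync_to_store(radio, {("Some New Tune", "Some Band"): "itunes-item-0001"}))
```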


This answers a nagging problem with any non-subscription music service: How do average Joes and Jills find new music?


This service also gives radio in general a huge shot in the arm. I don't know about you, but the only reason I turn on a radio these days is to receive the signal from my iPod's FM transmitter, and maybe check the weather for pop-up hurricanes. (This IS Florida after all.)


I used to listen to radio all the time, but commercials started taking up more airtime. It got so that 15 to 20 minutes of every hour was taken up by loud, annoying ads that insulted my intelligence and left me deaf. Even National Public Radio (NPR) had these increasingly frequent fund raising telethons where they came just short of begging the public for money. (I think NPR should be fully funded by tax dollars as it is one of the few easily accessible public services, but that's just my opinion.)


The Feeling of Discovery

Fed up with the ad jibber-jabber, I wound up turning the radio off altogether and fired up CDs and, now, my iPod when I want music.


Hi-Def Radio could change that by offering CD quality tunes and fewer commercials, and now, iTunes Tagging. It would certainly be an attraction to me even if I only occasionally turn the radio on.


One thing I sorely miss is discovering new music, and radio, for all of its problems, was the best way to get introduced to new stuff without having to go through a lot of effort: just turn on the radio and listen.


Of course, this raises the question: How will Apple support Digital Radio?


Phone It In?


You may (or may not) know already that Apple has included hardware in iPhones that will let you receive FM radio. Will the iPhone pick up Digital FM radio?


I don't have a clue, but it does seem like a reasonable thing to do.


There you are, out in the wild, jamming to a new tune on your iPhone. Since a song rocked, you tag it. The next time you fire up iTunes on your iPhone, a list of tagged songs appears and iTunes asks you which, if any, you would like to buy. You buy them all, they get downloaded into your iPhone, you jam to new tunes that automagically get synced to your Mac or PC at home, and all is right with the world.


What's not to like about a scenario like that? All I can say is, bring it on.






Nanotechnology Surges Into Health


"nanotechnology," and geeks imagine iPhones, laptops and flash drives. But more than 60 percent of the 580 products in a newly updated inventory of nanotechnology consumer products are such "un-geeky" items as tennis racquets, clothing, and health products. An updated inventory includes Head® NanoTitanium Tennis Racquets, Eddie Bauer® Water Shorts with Nano-Dry® technology, Nano-In Foot Deodorant Powder/Spray, and Burt's Bees® sunscreen with "natural Titanium Dioxide mineral...micronized into a nano sized particle."


Since the Project on Emerging Nanotechnologies launched the world's first online inventory of manufacturer-identified nanotech goods in March 2006, the number of items has increased 175 percent--from 220 to 580 products. There are 356 products in the health and fitness category--the inventory's largest category--and 66 products in the food and beverage category. One of the largest subcategories is cosmetics with 89 products. All are available in shopping malls or over the Internet. The list includes merchandise from such well-known brands as Samsung, Chanel, Black & Decker, Wilson, L.L. Bean, Lancome and L'Oreal.


The nanomaterial of choice appears to be silver--which manufacturers claim is in 139 products or nearly 25 percent of inventory--far outstripping carbon, gold, or silica.
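
The silver share is easy to check against the inventory totals quoted above, using only the article's own numbers.

```python
# Simple check of the silver share quoted above, using the article's own figures.
total_products = 580
silver_products = 139

share = silver_products / total_products
print(f"{share:.1%}")   # ~24.0%, i.e. "nearly 25 percent" of the inventory
```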


"The use of nanotechnology and nanomaterials in consumer products and industrial applications is growing rapidly, and the products listed in the inventory are just the tip of the iceberg," said Project on Emerging Nanotechnologies science advisor Andrew Maynard. "How consumers respond to these early products--in food, electronics, health care, clothing and cars--will be a bellwether for broader market acceptance of nanotechnologies in the future. This is especially true given that the Project's recent poll shows seventy percent of the public still knows little or nothing about the technology."


Nanotechnology


Nanotechnology is the ability to measure, see, manipulate and manufacture things usually between 1 and 100 nanometers (nm). A nanometer is one billionth of a meter. A human hair is roughly 100,000 nanometers wide. The limit of the human eye's capacity to see without a microscope is about 10,000 nm. By 2014, a projected $2.6 trillion in global manufactured goods will incorporate nanotech, or about 15 percent of total output.


Full product list available at: http://www.nanotechproject.org/consumerproducts





Toshiba plans to begin selling TVs with OLED screens as soon as panels are ready.


Toshiba Corp. plans to begin selling televisions with OLED (organic light-emitting diodes) screens as soon as panels are ready, according to a company spokeswoman.

The first Toshiba OLED television sets should hit the market in 2009, said Yuko Sugahara, a company spokeswoman.

OLED screens offer higher contrast and faster response times than LCD (liquid crystal display) screens. OLED screens can also be thinner since no backlight is required. The carbon-based materials used to make OLEDs illuminate themselves when an electrical current is applied.

However, OLEDs are difficult to manufacture and degrade over time. Manufacturers are therefore working on ways to improve production yields and increase the lifespan of the screens.

Sony Corp. became the first company to introduce an OLED television on Monday, with the release of its XEL-1. The television, which goes on sale in December, has an 11-inch screen and an estimated lifespan of around 30,000 hours. That's enough time to watch eight hours of television per day for 10 years, Sony said.

While Sony was first to market with an OLED television, a lot of work remains to be done before the screens are ready for widespread adoption. The XEL-1 will be available in limited quantities, with Sony expecting to produce just 2,000 sets per month.

OLED
An organic light-emitting diode (OLED) is any light-emitting diode (LED) whose emissive electroluminescent layer comprises a film of organic compounds. The layer usually contains a polymer substance that allows suitable organic compounds to be deposited. They are deposited in rows and columns onto a flat carrier by a simple "printing" process. The resulting matrix of pixels can emit light of different colors.

Such systems can be used in television screens, computer displays, portable system screens, advertising, information and indication. OLEDs can also be used in light sources for general space illumination, and large-area light-emitting elements. OLEDs typically emit less light per area than inorganic solid-state based LEDs which are usually designed for use as point-light sources.

A great benefit of OLED displays over traditional liquid crystal displays (LCDs) is that OLEDs do not require a backlight to function. Thus they draw far less power and, when powered from a battery, can operate longer on the same charge. OLED-based display devices also can be manufactured more effectively than LCDs and plasma displays, but degradation of OLED materials has so far limited their use.

OLED technology was also called Organic Electro-Luminescence (OEL), before the term "OLED" became standard.

MIT Shows Confidence in Hometown Deal


CAMBRIDGE, MA -- MIT Real Estate is confirming that the organization has bought into several properties owned here by Alexandria Real Estate Equities, with the REIT striking ground leases bearing a total value of approximately $30 million. According to MIT real estate associate director John McQuaid, the initiative is expected to provide a financial boost and steady income stream for the institution while at the same time allowing for reinvestment into the local economy.


"MIT is a big supporter of what is going on in Cambridge," McQuaid tells GlobeSt.com. "It is a market we understand and it is one we have a lot of faith in." The ground leases have an initial term of 50 years, and carry a 25-year extension option.


First reported earlier today by GlobeSt.com, the series of four transactions came after extended discussions between the two sides. It underscores a close relationship forged last year when Alexandria paid $600 million for a 90% stake in Technology Square, McQuaid explains, with MIT previously acquiring that seven-building, 1.2-million-sf complex in the core East Cambridge submarket.


As in the case of the latest deals, the school stayed involved at Technology Square via a ground lease. That method has been used in the past by the organization, McQuaid notes, exemplified in the case of University Park at MIT, a major mixed-use complex that has helped revitalize Cambridge's Central Square since work began there in the 1980s. The main developer is Forest City Enterprises, but MIT's involvement helped clear the way for that ambitious project, one that abuts the school's main campus.


The real estate investment arm of MIT is charged with finding revenue-generating opportunities, but McQuaid says the ability to put money back into the neighborhood is also of importance. "We want to do what we can to improve the area," he says.


Based in Pasadena, CA, Alexandria secured the properties involved beginning in 2001 when it paid $35.6 million for 770, 784 and 790 Memorial Dr., the middle of which served as the long-time headquarters of Polaroid Corp. The 3.8-acre complex overlooking the Charles River was purchased by Alexandria during construction of 770 and 790 Memorial Dr., a pair of laboratory structures totaling 100,000 sf. According to McQuaid, the two new buildings are involved in MIT's deal, and not 784 Memorial Dr. itself.


Alexandria subsequently acquired 161-171 and 170-176 Sidney St. in late 2005 for $4.8 million; paid $72.7 million for the 128,000-sf 300 Third St. in March 2006; and bought property on Brookline and Erie streets for $11.1 million a year ago this month, headlined by 99 Erie St. The total price paid by Alexandria for the properties involved in the MIT deal was $124.2 million. There is about 300,000 sf of existing space in aggregate for the assets, which also include extensive parking. The biggest price paid by MIT was $16.6 million to buy into 300 Third St., while the Memorial Dr. ground lease was placed at $12.3 million.


Calls to Alexandria were not returned by GlobeSt.com's press deadline. One industry observer says the arrangement is unique in terms of the deal's extended length. In the meantime, there are no encumbrances to prevent Alexandria from developing additional space on the properties, says McQuaid. "It's up to them" to do what they want, he says.




Technorati :

Energy companies, miners weigh on ADRs: ADR Report


U.S.-listed shares of overseas companies fell on Tuesday, as lower oil and mixed metals prices weighed on energy companies and miners.


The Bank of New York's index of leading American Depository Receipts (ADRs) (.BKADR) fell 0.72 percent while the 30-share Dow Jones industrial average (.DJI) slipped 0.34 percent at midday. U.S. stocks fell as oil prices' drop below $80 a barrel hurt energy shares.


The Bank of New York's index of leading European ADRs (.BKEUR) was down 1.04 percent, while the FTSEurofirst 300 index (.FTEU3) closed up 0.32 percent. European stocks rose to a ten-week high as banks rallied on hopes that the worst of the credit crunch might be behind them, but a drop in energy shares kept gains in check.


ADRs of major oil companies took a hit as U.S. crude prices (CLc1) deepened losses to below $80 a barrel.


Royal Dutch Shell (RDSa.N) fell 2.9 percent to $80.98, Total (TOT.N) dropped 2.7 percent to $78.83 and BP (BP.N) was down 1.7 percent at $68.43.


ADRs of integrated aluminum producer Norsk Hydro (NHY.N) fell 4.7 percent to $41.98, and Australian miner BHP Billiton (BHP.N) shares fell 3.9 percent to $78.72.


The Bank of New York's index of leading Latin American ADRs (.BKLA) fell 1.2 percent. In Latin America, major names fell, including Brazil's Petrobras (PBR.N) and iron ore mining company CVRD (RIO.N).


Petrobras ADRs fell 1.8 percent to $76.70 and CVRD dropped 3 percent to $35.06.


But Asia bucked the trend, with the Bank of New York's index of leading Asian ADRs (.BKAS) up 0.40 percent. MSCI's Asia-Pacific ex-Japan index (.MIAPJ0000PUS) hit record peaks on hopes that the worst of the credit squeeze may be over.


Japan's Nikkei average rose 1.2 percent on Tuesday, closing above 17,000 for the first time in nearly eight weeks as Sony Corp (6758.T) gained after saying its financial unit had set its IPO price at the top end of its tentative range.


ADRs of Sony (SNE.N) rose 3.8 percent to $51.25. China Mobile's ADRs (CHL.N) rose 1.6 percent to $86.31.





Technorati :

Having the space program as a very challenging real-world mission to focus tech development around was tremendously inspiring and productive


"Having the space program as a very challenging real-world mission to focus tech development around was tremendously inspiring and productive." --Scott Fisher, founding director, Virtual Environment Workstation Project at NASA Ames


A half-century of space flight
In celebration of the 50th anniversary of the Sputnik launch on October 4, we take a look back at some of the ships that have helped humans explore space and some of those that might do so in the near future. Forget about the Xbox and the iPhone. This is some serious hardware.


The launch of the basketball-size satellite is widely considered the dawn of the space age, and it began the space race between the United States and the Soviet Union.


In 1955, both the United States and the Soviet Union announced plans to launch satellites into orbit as part of the International Geophysical Year, which had been established to take place from the middle of 1957 through the end of 1958.


The U.S. may have announced its plans first, but the U.S.S.R. got off the ground startlingly fast. Sputnik I launched October 4, 1957, raising fears among Americans that it gave the Soviet Union a leg up on the U.S. not only technologically, but in the ability to launch nuclear missiles.


Sputnik II, carrying a dog named Laika and a much heavier payload, soon followed, launching only a month later on November 3.


For anyone who's ever been stuck in rush-hour traffic on U.S. Highway 101 through Silicon Valley, the region's overgrowth of green-glass office buildings, ugly tech company headquarters and expensive cars is a frustrating flip side to the steady stream of world-changing innovation that has emerged there.


But if you'd visited the region in 1930, all you'd have seen was a two-lane highway cutting through acres and acres of nothing but farmland and tiny hamlets, and not even a hint of what would someday become arguably the most important commercial technology center in the world.


In December of that year, however, word came that the U.S. Navy was going to open an air station in Sunnyvale, Calif., one that would handle gigantic airships and that would need a mammoth hangar.


The result? The Sunnyvale Naval Air Station, later known as NASA Moffett Field. And today, Moffett is home to NASA's Ames Research Center, a facility that is at the heart of Silicon Valley, both geographically and figuratively. In 1930 the region didn't know what was about to arrive, but it soon realized how much change was coming.


"Industries allied to aviation will spring up like mushrooms, each bringing its own payroll," wrote the San Jose Mercury Herald in 1931, according to NASA. "It means in short that San Jose and the Bay region are on the threshold of the most glorious era of posterity in their history."


Usually, such proclamations fall short of reality, but on this the newspaper was spot on. While the projected growth was expected to be tied to aviation, not space research, the arrival in 1939 of the National Advisory Committee for Aeronautics--the precursor to NASA--and later NASA itself helped drag the Valley into the center of American industry.


Of course, Silicon Valley has grown far beyond NASA since the days when the Apollo program leaned on researchers from Stanford, nearby University of California at Berkeley and a number of small companies that started to dot the area in the 1960s. But in the crucial early years of the Valley's technology industry, government contracts played a key role.


"Several companies in what would become Silicon Valley benefited from the ambitious goals and budget largesse of the Apollo space program," said Dag Spicer, the senior curator of the Computer History Museum, in Mountain View, Calif. "The stringent quality and performance requirements of (integrated circuits) for Apollo allowed early semiconductor companies to learn at government (that is, public) expense, a technology that would soon have broad application and whose price would plummet as these companies perfected manufacturing methods."


A list of companies that emerged to take advantage of NASA's work on integrated circuits would be impossible to compile today, but there's no doubt that among the biggest winners on such a list would be Fairchild Semiconductor, and Intel, which was founded by Fairchild's Robert Noyce and Gordon Moore.


"Fairchild...was likely the largest recipient of government-related integrated circuit work," said Spicer. "The irony of these early contracts was that, while they were welcome in the early 1960s (when) semiconductor companies were learning how to make integrated circuits, by 1970, government/military work was frequently viewed as a damper on profits and innovation since it took people and resources away from research and development into newer and more profitable commercial products."


Nonetheless, the Apollo program turned out to be a fantastic source of technology that would eventually find its way into commercial products and applications. Also among the companies that would most benefit from the program was Hewlett-Packard. HP's association with the space program, in fact, pre-dates NASA, according to Measure magazine.


"HP's instrument sales force has been selling to the space program since the 1950s, before NASA was formally created," wrote Measure magazine in 1983, according to information provided by Devon Dawson, an archivist for HP spinoff Agilent Technologies. "NASA and its contractors use instruments from virtually every HP division to develop, test and support the sophisticated electronic equipment used in all NASA programs."


Specific instances of the HP-NASA alliance on the Apollo 11 program abound, Measure wrote in its September 1969 issue: The launch control facility at Cape Kennedy and Houston's Mission Control Center both utilized HP technology such as FM-AM telemetering signal generators and RF vector impedance meters. And, HP's Precision Frequency Source keyed to a cesium clock built by the company "provided the precise frequency outputs used for thousandths-of-a-second accuracy throughout the worldwide Apollo network of tracking stations and communications systems."


The relationship between HP and NASA has stayed strong, Dawson said. Among the space programs employing HP or Agilent technology are space shuttle missions, Mariner missions, Voyager 2, the 1995 docking of the Atlantis shuttle with the Mir Space Station and the Lunar Prospector in 1997.


But the impact of the Apollo program on commercial technology goes far beyond such highly specialized equipment and missions. According to Bruce Damer, founder of the DigiBarn computer museum and a frequent NASA contractor himself through his company DigitalSpace, it's possible to draw a direct evolutionary link between the simple flight simulators NASA was using for the Apollo astronauts in 1967 and 1968--what he called "one of the first highly interactive computer environments"--and some of the early commercial video games.


Similarly, NASA's work with wind tunnels at Moffett became so expensive that the agency decided to turn to supercomputers for more cost-effective simulations.


And that, in conjunction with work done at Ames on tele-operations and telepresence--research that tried to simulate the interior of the space shuttle--led to the creation of 3D graphics, head-mounted displays and early virtual reality technology, all partially funded by NASA.


"Starting in the 1960s, as the needs became more necessary...I think that drove the research on graphics tech and certainly computing in general," said Scott Fisher, chair of the interactive media division in the University of Southern California's School of Cinematic Arts, and the founding director of the Virtual Environment Workstation Project (VIEW) at NASA Ames. "When we built a real-time virtual environment system and the flow visualization guys used it to input their data, they were ecstatic that they could manipulate viewpoints into their data by just moving their head or walking around in the data as opposed to typing in a set of coordinates for each new viewpoint."


Another technology to come out of NASA and later find its way into industry was the use of audio technology in pilots' computerized interfaces, said Fisher.


"NASA did lots of work on finding the best ways to alert a pilot to some system problem," Fisher said. "Audio turned out to be very effective." Now, nearly 20 years later, the technology is making its way into video games and other off-the-shelf commercial systems, he said.


The relationship between NASA and space technologies and Silicon Valley and the companies that have blossomed there may best have been summed up by Northrop Grumman chairman and CEO Ronald Sugar in a speech he gave on September 20 at the 50th Anniversary of Space Exploration conference in Pasadena, Calif.



"Space exploration and use has created new industries that today generate billions of dollars of revenue, employ millions of people worldwide and improve the lives of virtually everyone," said Sugar. "Space, which first served as a coliseum for two grappling superpowers, now welcomes new nations to explore and utilize its potential, and in the process, draws all mankind closer together."


Of course, for those who work or worked in the space industry, the experience of being involved with such technologies and seeing how they affected the rest of the world is something that will always be special.


"Having the space program as a very challenging real-world mission to focus tech development around was tremendously inspiring and productive," said Fisher.






Technorati :

The first BlackBerry with Wi-Fi support debuts.


The exterior of the new BlackBerry 8820 may look familiar. After all, the device has the same sleek black design as the BlackBerry 8800, which was launched earlier this year. The new phone has one important upgrade under the hood, however: Wi-Fi.


This is the first BlackBerry with Wi-Fi support, and the addition is a welcome one, as it means the phone can provide wireless voice and data access over both cellular and Wi-Fi wireless networks.


Like the 8800 and the more consumer-oriented BlackBerry Curve, the 8820 will be available from AT&T. Starting tomorrow, you can get the 8820 for $300 with a two-year service contract. In addition to Wi-Fi, the quad-band GSM phone includes support for the carrier's EDGE network--but not its true 3G HSDPA service.


Easy Setup
The 8820 can connect to 802.11a/b/g networks; connecting to my 802.11g wireless network at home was a breeze. I simply launched the Wi-Fi Setup wizard, selected my network from the list of available choices (you can either scan for available networks or manually enter the name of the network to which you'd like to connect), typed in the WEP key, and was connected in seconds. You can choose to automatically connect to your favorite wireless networks when they're available, and also can store any login info that is needed to connect to public hotspots.


The device will default to a Wi-Fi connection when one is available; a small icon on the top of the screen tells you which network you're using. Surfing the Web and sending and receiving e-mail messages via Wi-Fi was speedy, especially where the EDGE network coverage was spotty--as it often is inside my house.
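
The connection behavior described here amounts to a simple preference list: use Wi-Fi whenever an access point is in range, otherwise fall back to the cellular data network. The sketch below illustrates that idea in Python; it is purely illustrative, not RIM's actual firmware logic, and every name in it is hypothetical.

```python
# Illustrative sketch (not RIM's firmware): prefer a Wi-Fi network when one
# is available, otherwise fall back to the cellular (EDGE) data bearer.
# All class and network names here are hypothetical.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Network:
    name: str
    kind: str        # "wifi" or "cellular"
    available: bool


def pick_data_bearer(networks: List[Network]) -> Optional[Network]:
    """Return the preferred available network: Wi-Fi first, then cellular."""
    for kind in ("wifi", "cellular"):
        for net in networks:
            if net.kind == kind and net.available:
                return net
    return None


if __name__ == "__main__":
    candidates = [
        Network("Home WLAN (802.11g)", "wifi", available=True),
        Network("AT&T EDGE", "cellular", available=True),
    ]
    chosen = pick_data_bearer(candidates)
    print("Using:", chosen.name if chosen else "no data connection")
```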


When the 8820 connected to my Wi-Fi network, I was able to send an e-mail message and surf the Web while on a phone call without a problem. You can't, however, make voice-over-Wi-Fi calls on the device just yet. RIM says that this feature will be offered at the discretion of the carrier; AT&T says that the Wi-Fi capabilities on the 8820 are "data only" at this time.


Other than the Wi-Fi support, the 8820 is almost identical to the 8800 (which will be replaced in AT&T's lineup by the 8820). The phone includes built-in GPS functionality (no hardware add-ons required) and comes with access to the easy-to-use TeleNav GPS service from AT&T, which is available for an extra $10 per month.


E-Mail Is Easy
Also like the 8800--and all BlackBerry devices--the 8820 is a champ when it comes to e-mail. The phone supports up to ten e-mail accounts, including POP3, IMAP, and Web-based e-mail. Corporate e-mail access is available through the BlackBerry Enterprise Server; I tested the personal e-mail capability with the BlackBerry Internet Service and a POP3 account. I simply entered my e-mail user name and password; within minutes, mail from my personal account was arriving in my hand. The inbox is neatly organized and superbly easy to read.
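
For readers curious what a basic POP3 check involves under the hood, here is a minimal sketch using Python's standard-library poplib. The host name and credentials are placeholders, not real BlackBerry Internet Service settings; on the 8820 the service performs these steps for you.

```python
# Minimal sketch of a POP3 mailbox check with Python's standard library.
# The host, user name and password below are placeholders.

import poplib

HOST = "pop.example.com"       # hypothetical POP3 server
USER = "user@example.com"
PASSWORD = "app-password"


def check_inbox() -> None:
    conn = poplib.POP3_SSL(HOST)      # most providers require SSL (port 995)
    try:
        conn.user(USER)
        conn.pass_(PASSWORD)
        count, size = conn.stat()     # message count and total mailbox size
        print(f"{count} messages, {size} bytes on the server")
        if count:
            # Peek at the headers of the newest message without deleting it.
            for line in conn.top(count, 0)[1][:10]:
                print(line.decode("utf-8", errors="replace"))
    finally:
        conn.quit()


if __name__ == "__main__":
    check_inbox()
```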


The 320-by-240 display is gorgeous, and navigating the device via the small--but very usable--QWERTY keyboard and BlackBerry Pearl-like trackball is quite comfortable. At 4.5 inches tall by 2.6 inches wide by 0.6 inch thick and 4.7 ounces, the 8820 can feel slightly boxy when held next to your ear. Call quality was quite good, though, and the included speakerphone worked well. We are currently in the process of testing the phone's talk-time battery life; we'll add that information (and a PC World rating) to this review as soon as it is available.


While the 8820 lacks the camera found on the Curve, it does include a media player for playing back audio and video files and a microSD card slot for storage. In addition, it includes AT&T's Mobile Music service, which lets you access subscription services such as eMusic and XM Satellite Radio.


The business-oriented 8820 lacks a camera and some of the other consumer-friendly applications (such as access to popular instant messaging clients) found on other BlackBerry devices. The support for Wi-Fi is an excellent tradeoff, however, and the result is a sleek cell phone that delivers speedy data service even when your cellular coverage is spotty.





Technorati :

Reader Digital Book could turn some heads among gadget lovers


www.24hoursnews.blogspot.com


The technology market is really challenging: challenging through quality, challenging through marketing.


While it may not pack the sales bang of Harry Potter and the Deathly Hallows, the latest edition of Sony Electronics Inc.'s Reader Digital Book could turn some heads among gadget lovers when it is released this month.


Sony announced the latest edition of the Reader, model PRS-505, on Tuesday, and said it will be available in the U.S. this month at Sony Style stores and on the Sony Style Web site as well as at Borders Inc. book stores. For US$300, people will get a paperback book-sized Reader in either silver or dark blue, which can hold up to 160 books.


A Sony spokesman was unable to provide a specific date for the launch of the new Reader.


To get people started on their new Reader, Sony is offering credit for 100 classic books, including the works of Shakespeare and Jane Austen, on Connect, an eBooks store set up by Sony. The site includes 20,000 eBooks, including the latest titles from many top authors and much of the New York Times best-sellers list. It doesn't include any books in J.K. Rowling's Harry Potter series.


Improvements to the Reader include nearly twice the storage space, a battery that will last around 7,500 page views, new controls redesigned to mimic page-turning and allow quicker navigation, and a USB (Universal Serial Bus) port allowing the transfer of data from a PC. The new Reader also includes slots for Memory Stick Duo and SD memory cards to increase storage capacity.


An auto-sync feature in the new edition allows users to create a folder for books and documents on their computer and automatically synchronize it with the Reader.
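
Sony has not published how the auto-sync works internally, but the basic idea of a one-way folder sync is easy to illustrate. The sketch below copies new or updated files from a local folder to the Reader's storage; the folder names and mount path are assumptions made for the example, not Sony's actual mechanism.

```python
# One-way folder sync sketch: copy new or updated files from a local
# "ReaderBooks" folder to the Reader's storage. The folder names and the
# mount path are assumptions for illustration only.

import shutil
from pathlib import Path

SOURCE = Path.home() / "ReaderBooks"     # hypothetical local folder
DEST = Path("/Volumes/READER/books")     # hypothetical mount point


def sync_books(source: Path = SOURCE, dest: Path = DEST) -> None:
    dest.mkdir(parents=True, exist_ok=True)
    for item in source.rglob("*"):
        if not item.is_file():
            continue
        target = dest / item.relative_to(source)
        target.parent.mkdir(parents=True, exist_ok=True)
        # Copy only files that are missing or older on the device.
        if not target.exists() or item.stat().st_mtime > target.stat().st_mtime:
            shutil.copy2(item, target)
            print(f"Copied {item.relative_to(source)}")


if __name__ == "__main__":
    sync_books()
```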





Technorati :

Global Chip Sales Remain Hot


Global chip sales soared in August, indicating makers of electronic devices such as PCs, iPods and mobile phones expect the holiday season to be strong this year.


Worldwide chip sales rose 4.5 percent year-over-year to US$21.6 billion in the month of August, the Semiconductor Industry Association (SIA) reported late Monday.
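
As a quick sanity check on that growth figure, a $21.6 billion August total that is 4.5 percent higher than a year earlier implies an August 2006 total of roughly $20.7 billion. The snippet below shows the arithmetic; the implied figure is derived here for illustration, not quoted from the SIA.

```python
# Back-of-the-envelope arithmetic on the reported SIA figures.
# The implied 2006 value is derived here, not an SIA-published number.

august_2007 = 21.6e9     # reported worldwide chip sales, in US dollars
yoy_growth = 0.045       # 4.5 percent year-over-year

implied_august_2006 = august_2007 / (1 + yoy_growth)
print(f"Implied August 2006 sales: ${implied_august_2006 / 1e9:.1f} billion")
# Prints: Implied August 2006 sales: $20.7 billion
```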


The increase is better than historical norms, according to John Pitzer, chip industry analyst at Credit Suisse Securities (USA) LLC. August marks the third month in a row that chip sales have been higher than seasonal norms.


August is the month electronics manufacturers buy most of the chips they need to prepare for holiday season gadget demand, the SIA said, indicating companies expect strong Christmas sales. Rising PC demand is also boosting overall chip sales, because about 40 percent of all chips go into PCs. Other goods, such as automobile engines, are also lifting global chip sales as the number of chips they use increases.


Global PC shipments grew 12.5 percent year-over-year in the second quarter, according to market researcher IDC, with the Asia/Pacific region returning the fastest growth at more than 20 percent. The brisk quarterly growth prompted Gartner Inc. to forecast PC sales for 2007 to increase 12.3 percent over last year, led by sales in emerging markets as well as rising laptop PC sales in developed countries.


A spike in prices for NAND flash memory chips, which are used to store songs, video clips and other data in devices ranging from iPods to digital cameras and mobile phones, also aided chip sales growth in August, the SIA said. NAND flash memory prices were up 48 percent compared with a year ago and 19 percent higher than in July.





Technorati :
