
Wednesday, October 17, 2007

Global Warming and the Future of Coal


Carbon Capture and Storage

Ever-rising industrial and consumer demand for more power, in tandem with cheap and abundant coal reserves across the globe, is expected to result in the construction of new coal-fired power plants producing 1,400 gigawatts of electricity by 2030, according to the International Energy Agency. In the absence of emission controls, these new plants will increase worldwide annual emissions of carbon dioxide by approximately 7.6 billion metric tons by 2030. These emissions would equal roughly 50 percent of all fossil fuel emissions over the past 250 years.


In the United States alone, about 145 gigawatts of new coal-fired generating capacity is projected to be built by 2030, resulting in CO2 emissions of 790 million metric tons per year in the absence of emission controls. By comparison, annual U.S. emissions of CO2 from all sources in 2005 were about 6 billion metric tons.
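
As a rough cross-check of those projections, the arithmetic works out from plant capacity, an assumed capacity factor, and an assumed emission intensity for coal generation; the sketch below uses illustrative round numbers rather than figures from the report.

```python
# Back-of-the-envelope check of the coal emission projections above.
# The capacity factor and emission intensity are assumed round numbers,
# not figures taken from the IEA or the underlying report.

HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.75        # assumed average utilization of a coal plant
TONS_CO2_PER_MWH = 0.83       # assumed emission intensity of uncontrolled coal

def annual_co2_megatons(gigawatts):
    """Annual CO2 emissions, in million metric tons, from a coal fleet."""
    mwh_per_year = gigawatts * 1000 * HOURS_PER_YEAR * CAPACITY_FACTOR
    return mwh_per_year * TONS_CO2_PER_MWH / 1e6

print(annual_co2_megatons(1400))  # ~7,600 Mt/yr, i.e. roughly 7.6 billion tons (global new build)
print(annual_co2_megatons(145))   # ~790 Mt/yr (projected new U.S. coal capacity)
```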


Policymakers and scientists now recognize that the current growth of greenhouse gas emissions must be reversed and that emissions must be reduced substantially in order to combat the risk of climate change. Yet a dramatic increase in coal-fired power generation threatens to overwhelm all other efforts to lower emissions and virtually guarantees that these emissions will continue to climb. This would preclude any possibility of stabilizing greenhouse gas concentrations in the atmosphere at levels that would acceptably moderate the predicted rise in global temperatures.


In China and other developing countries experiencing strong economic growth, demand for power is surging dramatically, with low-cost coal the fuel of choice for new power plants. Emissions in these countries are now rising faster than in developed economies in North America and Europe: China will soon overtake the United States as the world's number one greenhouse gas emitter. With the power sector expanding rapidly, China and India will fall further behind in controlling greenhouse gas emissions unless new coal plants adopt emission controls. Lack of progress in these countries would doom to failure global efforts to combat global warming.


The Promise of Carbon Capture and Storage


Fortunately, there is a potential pathway that would allow continued use of coal as an energy source without magnifying the risk of global warming. Technology currently exists to capture CO2 emissions from coal-fired plants before they are released into the environment and to sequester that CO2 in underground geologic formations. Energy companies boast extensive experience sequestering CO2 by injecting it into oil fields to enhance oil recovery. Although additional testing is needed, experts are optimistic this practice can be replicated in saline aquifers and other geologic formations that are likely to constitute the main storage reservoirs for CO2 emitted from power plants.


However, these so-called carbon capture and storage, or CCS, systems require modifications to existing power plant technologies. Today the prevailing coal-based generation technology in the United States is pulverized coal, with high-temperature (supercritical and ultrasupercritical) designs available to improve efficiency. It is possible to capture CO2 emissions at these pulverized coal units, but the CO2 capture technology currently has performance and cost drawbacks.


But there's a new coal-based power generation technology, Integrated Gasification Combined Cycle, or IGCC, which allows CCS systems in new plants to more efficiently capture and store CO2 because the CO2 can be removed before combustion. Motivated by this advantage, some power plant developers have announced plans to use IGCC technology but very few have committed to installing and operating CCS systems.


The great challenge is ensuring that widespread deployment of CCS systems at new IGCC and pulverized coal plants occurs on a timely basis. Despite growing recognition of the promise of carbon capture and storage, we are so far failing in that effort. The consequences of delay will be far-reaching: a new generation of coal plants could well be built without CO2 emission controls.


Barriers to the Adoption of Carbon Capture and Storage Systems


Industry experts today are projecting that only a small percentage of new coal-fired plants built during the next 25 years will use IGCC technology. IGCC plants currently cost about 20 percent to 25 percent more to build than conventional state-of-the-art coal plants using supercritical pulverized coal, or SCPC, technology. What's more, because experience with IGCC technology is limited, IGCC plants are still perceived to have reliability and efficiency drawbacks.


More importantly, IGCC plants are not likely to capture and sequester their CO2 emissions in the current regulatory environment since add-on capture technology will reduce efficiency and lower electricity output. This will increase the cost of producing electricity by 25 percent to 40 percent over plants without CCS capability.


These barriers can be partially overcome by tax credits and other financial incentives and by performance guarantees from IGCC technology vendors. Even with these measures, however, it is unlikely that IGCC plants will replace conventional coal plants in large numbers or that those plants which are built will capture and store CO2. There are two reasons for this.


First, even cost-competitive new technologies are usually not adopted rapidly, particularly in a conservative industry such as the utility sector, where the new technology is different from the conventional technology. This is the case with IGCC plants, which are indeed more like chemical plants than traditional coal-fired plants.


Second, there is now no business motivation to bear the cost of CCS systems when selecting new generation technologies even though the cost of electricity from IGCC plants is in fact lower than from SCPC plants once CCS costs are taken into account. This is because plant owners are not required to control greenhouse gas emissions and CCS systems are unnecessary for the production of power. The upshot: IGCC units (with and even without CCS capability) will lack a competitive edge over SCPC units unless all plant developers are responsible for cost-effectively abating their CO2 emissions. No such requirement exists today.


A New Policy Framework to Stimulate the Adoption of CCS Systems


This paper considers how best to change the economic calculus of power plant developers so they internalize CCS costs when selecting new generation technologies. Five policy tools are analyzed:


Establishing a greenhouse gas cap-and-trade program
Imposing carbon taxes
Defining CCS systems as a so-called Best Available Control Technology for new power plants under the Clean Air Act's New Source Review program
Developing a "low carbon portfolio" standard that requires utilities to provide an increasing proportion of power from low-carbon generation sources over time
Requiring all new coal power plants to meet an "emission performance" standard that limits CO2 emissions to levels achievable with CCS systems.
Each of these tools has advantages and drawbacks but an emission performance standard for new power plants is likely to be most effective in spurring broad-scale adoption of CCS systems.


In the current U.S. political environment, a cap-and-trade system is unlikely to result in a sufficiently high market price for CO2 (around $30 per ton) in the early years of a carbon control regime to assure that all coal plant developers adopt CCS systems. At lower carbon prices, plant developers could well conclude that it is more economical to build uncontrolled SCPC plants and then purchase credits to offset their emissions. A carbon tax that is not set at a sufficiently high level likely would have the same consequences.
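
To see why a price of roughly $30 per ton acts as the threshold, it helps to compare a developer's two options side by side. The sketch below is a stylized comparison with assumed cost figures, not numbers taken from the report.

```python
# Stylized choice facing a plant developer: build an uncontrolled SCPC plant
# and buy credits, or build a plant with CCS. All dollar figures are
# illustrative assumptions, not values from the report.

TONS_CO2_PER_MWH = 0.83      # assumed emissions of an uncontrolled coal plant
BASE_COST_PER_MWH = 55.0     # assumed cost of electricity without CCS ($/MWh)
CCS_PREMIUM_PER_MWH = 18.0   # assumed added cost of capture and storage ($/MWh)

def cheaper_option(carbon_price):
    uncontrolled = BASE_COST_PER_MWH + TONS_CO2_PER_MWH * carbon_price
    with_ccs = BASE_COST_PER_MWH + CCS_PREMIUM_PER_MWH  # residual emissions ignored
    return "build with CCS" if with_ccs < uncontrolled else "build uncontrolled, buy credits"

for price in (10, 20, 30, 40):
    print(f"${price}/ton -> {cheaper_option(price)}")
# With these assumptions the switch happens near $22/ton; a larger CCS premium
# (the 25 to 40 percent cost increase cited earlier) pushes the breakeven toward $30.
```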


A low carbon portfolio standard would be complex and difficult to implement because of the wide variations in generation mix between different regions. Moreover, unless the standard sets stringent targets for low carbon generation, it would not preclude construction of uncontrolled coal plants.


Although the recent Supreme Court decision defining CO2 as a "pollutant" has opened the door to controlling new power plant emissions under the New Source Review program, legal uncertainties may prevent the Environmental Protection Agency from defining CCS systems as the Best Available Control Technology under current law. Individual states could also reject CCS systems during permitting reviews. Moreover, the New Source Review program would not allow flexible compliance schedules for installing and operating CCS systems, nor would it provide financial incentives to offset the increased cost of electricity.


How Emission Performance Standards for New Coal Plants Would Work


In contrast to other approaches, an emission performance standard that limits new plant emissions to levels achievable with CCS systems would provide certainty that new coal plants in fact capture and store CO2. To provide a clear market signal to plant developers, this standard would apply to all new plants built after a date certain, although some flexibility would be allowed in the timing of CCS installation so that the power generation industry can gain more experience with various types of capture technology and underground CO2 storage. For example, all plants that begin construction after 2008 could be subject to the standard and would be required to implement carbon capture technology by 2013, and then to meet all sequestration requirements by 2016.


To provide additional flexibility while CCS technology is being perfected, plant developers during the first three years in which the new performance standard is in effect could have the option to construct traditional coal plants that do not capture and sequester CO2 if they offset on a one-to-one basis their CO2 emissions by taking one or more of the following steps:


Improving efficiencies and lowering CO2 emissions at existing plants
Retiring existing coal or natural gas units that generate CO2 emissions
Constructing previously unplanned renewable fuel power plants representing up to 25 percent of the generation capacity of the new coal plant.
In 2011, this alternate compliance option would sunset and all new plants subsequently entering construction would need to capture and sequester their emissions.


An emission performance standard for new coal plants should be accompanied by a cap-and-trade program for existing power plants, with the cap starting at 100 percent of emissions and progressively declining over time. A declining cap would encourage greater efficiencies in operating existing plants and incentivize the retirement of higher emitting existing plants. This would assure that an emission performance standard for new plants does not simply prolong the useful life of older plants. In addition, as the cap declines, retrofitting existing plants with CCS systems could become a viable option.


Mitigating Electricity Price Hikes


If legislation requiring an emission performance standard for new coal plants is enacted, then Congress should simultaneously take steps to offset the additional costs of installing CCS systems and provide relief from electricity price increases. This would prevent disproportionate costs from falling upon consumers who live in regions heavily dependent on coal for power generation. By reducing the financial risks and uncertainties of building power plants with CCS systems, it would also encourage investments in such plants by developers and their financial backers.


One approach would be to create a fund to "credit" utilities for all or part of the price increase that consumers would otherwise bear if they receive power from plants with CCS systems. Alternatively, financial incentives could be offered to plant developers which, in combination, offset a significant portion of the incremental costs of installing a CCS system as opposed to operating a coal-fired plant that does not control CO2 emissions. This new incentive program would replace current incentive programs for IGCC plants and other coal technologies that do not include CCS systems.


Assuming that government incentives cover 10 percent to 20 percent of total plant construction costs and that they apply to the first 80 gigawatts of new coal capacity with CCS systems built by 2030, these incentives could cost in the range of $36 billion over 18 years. Although $36 billion is a large sum, it is only a fraction of the $1.61 trillion that the International Energy Agency predicts will be invested in new power plants in the United States between now and 2030.
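
The $36 billion estimate can be reproduced approximately once a construction cost per kilowatt is assumed; the per-kilowatt figure below is an illustrative assumption, not one stated in the report.

```python
# Rough reconstruction of the incentive-cost estimate above.
# The construction cost per kilowatt is an assumed round number.

CCS_CAPACITY_GW = 80             # first 80 GW of new coal capacity with CCS by 2030
COST_PER_KW = 2500               # assumed construction cost of a coal plant with CCS, $/kW
INCENTIVE_SHARES = (0.10, 0.20)  # incentives cover 10 to 20 percent of construction cost

total_construction = CCS_CAPACITY_GW * 1e6 * COST_PER_KW          # dollars
low, high = (total_construction * share / 1e9 for share in INCENTIVE_SHARES)
print(f"incentive cost: ${low:.0f} to ${high:.0f} billion over ~18 years")
# -> roughly $20 to $40 billion, bracketing the ~$36 billion cited above
```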


Building a Technical and Regulatory Foundation for CCS Systems


Once the nation commits to a rapid timetable for requiring CCS systems at all new coal plants under an emission performance standard, then all of our regulatory and research and development efforts should be focused on implementing CCS technology as effectively as possible. This would require:


An enhanced R&D program for capture technologies at both SCPC and IGCC facilities to reduce the costs of capture as quickly as possible
An accelerated program to gain large-scale experience with sequestration for a range of geologic formations
A comprehensive national inventory of potential storage reservoirs
A new regulatory framework for evaluating, permitting, monitoring, and remediating sequestration sites and allocating liability for long-term CO2 storage.
Maintaining the Viability of Coal in a Carbon-Constrained World


Although an emission performance standard that requires CCS systems for all new coal plants would pose a daunting technological and economic challenge, it will ultimately assure coal a secure and important role in the future U.S. energy mix. Such a standard would establish a clear technological path forward for coal, preserving its viability in a carbon-constrained world and giving the utility industry confidence to invest substantial sums in new coal-fired power generation. In contrast, continued public opposition and legal uncertainties may cause investors to withhold financing for new coal plants, placing the future of coal in jeopardy.


If the United States is successful in maintaining the viability of coal as a cost-competitive power source while addressing climate concerns, our leadership position would enable U.S. industries to capture critical export opportunities to the very nations facing the largest challenges from global warming. Once our domestic marketplace adopts CCS systems as power industry standards, the opportunities to export this best-of-breed technology will grow exponentially.


This will be critical to combating the massive rise of coal-derived greenhouse gas emissions in the developing world. Boosting exports while also helping China, India, and other developing nations reduce emissions and sustain economic growth would be a win-win-win for our economy, their economies, and the global climate.






Read the full report (PDF)





YouTube Copyright Enforcement System: If you don't own me, don't use me!


It's really good to hear of action against piracy.


The technology is designed to let content owners prevent YouTube users from uploading copies of their videos, or they can have the choice of monetizing unauthorized uploads with ads.


Seven months ago, Viacom filed a copyright infringement lawsuit demanding $1 billion from Google and YouTube and charging the companies with "brazen disregard" for intellectual property laws and threatening "the economic underpinnings of one of the most important sectors of the United States economy." On Tuesday, YouTube finally launched a content identification system, YouTube Video Identification, to give copyright owners some measure of control over the presence of their content on the site.


The new service requires that content owners upload videos they wish to protect so that a "hash" -- a numeric fingerprint of sorts -- can be created. That done, content owners will be able to prevent YouTube users from uploading copies of their videos; they will also have the choice of monetizing unauthorized uploads with ads.
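
YouTube has not published the details of how Video Identification computes its fingerprints, so the sketch below only illustrates the general fingerprint-and-match idea using naive byte-level hashes of fixed-size segments; a production system would use perceptual fingerprints that survive re-encoding and editing.

```python
# Generic fingerprint-and-match sketch. This is NOT YouTube's algorithm:
# Video Identification is proprietary, and real systems use perceptual
# fingerprints that survive re-encoding, not raw byte hashes.

import hashlib

SEGMENT_BYTES = 1 << 20  # hash the file in 1 MB segments (arbitrary choice)

def fingerprint(path):
    """Return a list of per-segment hashes for a video file."""
    hashes = []
    with open(path, "rb") as f:
        while chunk := f.read(SEGMENT_BYTES):
            hashes.append(hashlib.sha256(chunk).hexdigest())
    return hashes

def overlap(reference, upload):
    """Fraction of the reference's segments that also appear in the upload."""
    ref, up = set(reference), set(upload)
    return len(ref & up) / len(ref) if ref else 0.0

# A rights holder registers fingerprint("my_show.mp4") once (hypothetical file);
# each new upload's fingerprint is then compared against the registered set, and
# the owner's policy (block, promote, or monetize) is applied when overlap is high.
```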


"Video Identification goes above and beyond our legal responsibilities," said David King, YouTube Product Manager, in a blog post. "It will help copyright holders identify their works on YouTube, and choose what they want done with their videos: whether to block, promote, or even -- if a copyright holder chooses to license their content to appear on the site -- monetize their videos."


YouTube's and Google's legal responsibilities are at issue in Viacom's copyright lawsuit.


Under the Digital Millennium Copyright Act, Google, as an Internet service provider, escapes liability for copyright infringement by its users if it responds quickly to notifications of copyright infringement.


Viacom claims that Google and YouTube "actively engage in, promote and induce this infringement," and thus shouldn't qualify for safe harbor protection.


It's not yet clear whether YouTube's new technology will prompt Viacom to drop its copyright claim. When the lawsuit was first filed, pundits observed that the lawsuit was a negotiating tactic to force concessions of some sort from Google.


In April, at the Web 2.0 Expo in San Francisco, Calif., Google CEO Eric Schmidt predicted that as Google rolls out its content protection system, "the issues in Viacom become moot."


Yet, 64 legal filings later, the case chugs along, with Viacom still apparently set on a $1 billion pay day.


In his post, King pointed out that Google already has a number of content policies and tools in place to help copyright owners. These include account terminations for repeat infringers, technical measures to prevent videos that have been removed from being re-uploaded, a 10-minute limit on the length of uploaded content, an electronic notice and takedown tool, and prominent copyright compliance tips for users.





Industrial Nanotech, Inc. Announces New International Supply Chain Expansion


The nanotech business is becoming one of the most valuable businesses in the world as demand for the sector's products and services grows.


Industrial Nanotech, Inc. (Pink Sheets:INTK), an emerging global leader in nanotechnology, announced today that the Company has severed its relationship with Mercatus & Partners Group, of Rome, Italy as joint venture partners for a manufacturing facility in Italy and is moving to expand the Company's supply chain for Europe, the Middle East, and Asia with manufacturing in Budapest, Hungary and a fulfillment center in Shanghai, China.


Stuart Burchill, CEO of Industrial Nanotech, Inc., states, "The relationship with Mercatus & Partners Group was not productive or suitable for Industrial Nanotech, Inc. and we have terminated the relationship. However, pending deals make it a priority that we ramp up our ability to provide large quantities of product on a regular basis to Europe, the Middle East, and China. We are currently in negotiations with a facility in Budapest, where one of our key coating scientists maintains professional relationships sufficient to provide day to day quality control monitoring, to provide our Company with manufacturing capabilities and we plan to utilize the services of a fulfillment center in Shanghai. This strategy represents the most cost effective way to implement an efficient supply chain to meet our needs in these regions and without a major capital expenditure and with consideration for the protection of our valuable intellectual property."


About Industrial Nanotech, Inc.


Industrial Nanotech Inc. is rapidly emerging as a global nanoscience solutions and research leader. The Company develops and commercializes new and innovative applications for nanotechnology. Additional information about the Company and its products can be found at their websites www.industrial-nanotech.com and www.nansulate.com.


About Nansulate(R)


Nansulate(R) is the Company's patented product line of specialty coatings containing a nanotechnology based material and which are well-documented to provide the combined performance qualities of thermal insulation, corrosion prevention, resistance to mold growth and lead encapsulation in an environmentally safe, water-based, coating formulation. The Nansulate(R) Product Line includes both industrial and residential coatings.


Safe Harbor Statement


Safe Harbor Statement under the Private Securities Litigation Reform Act of 1995: This release includes forward-looking statements made pursuant to the safe harbor provisions of the Private Securities Litigation Reform Act of 1995 that involve risks and uncertainties including, but not limited to, the impact of competitive products, the ability to meet customer demand, the ability to manage growth, acquisitions of technology, equipment, or human resources, the effect of economic and business conditions, and the ability to attract and retain skilled personnel. The Company is not obligated to revise or update any forward-looking statements in order to reflect events or circumstances that may arise after the date of this release.


More contacts:
bjedynak@janispr.com
lgrock@janispr.com





Blood may help us think: Can we develop our thinking?


What gives us innovative power? What makes us active? Thinking!


MIT scientists propose that blood may help us think, in addition to its well-known role as the conveyor of fuel and oxygen to brain cells.


"We hypothesize that blood actively modulates how neurons process information," Christopher Moore, a principal investigator in the McGovern Institute for Brain Research at MIT, explained in an invited review in the October issue of the Journal of Neurophysiology. "Many lines of evidence suggest that blood does something more interesting than just delivering supplies. If it does modulate how neurons relay signals, that changes how we think the brain works."


According to Moore's Hemo-Neural Hypothesis, blood is not just a physiological support system but actually helps control brain activity. Specifically, localized changes in blood flow affect the activity of nearby neurons, changing how they transmit signals to each other and hence regulating information flow throughout the brain. Ongoing studies in Moore's laboratory support this view, showing that blood flow does modulate individual neurons.


Moore's theory has implications for understanding brain diseases such as Alzheimer's, schizophrenia, multiple sclerosis and epilepsy. "Many neurological and psychiatric diseases have associated changes in the vasculature," said Moore, who is also an assistant professor in MIT's Department of Brain and Cognitive Sciences.


"Most people assume the symptoms of these diseases are a secondary consequence of damage to the neurons. But we propose that they may also be a causative factor in the disease process, and that insight suggests entirely new treatments." For example, in epilepsy people often have abnormal blood vessels in the brain region where the seizures occur, and the hypothesis suggests this abnormal flow may induce epileptic onset. If so, drugs that affect blood flow may provide an alternative to current therapies.


The hypothesis also has important implications for functional magnetic resonance imaging, or fMRI, a widely used brain scanning method that indicates local changes in blood flow. "Scientists looking at fMRI currently regard blood flow and volume changes as a secondary process that only provides read-out of neural activity," explained Rosa Cao, a graduate student in Moore's lab and co-author of the paper. "If blood flow shapes neural activity and behavior, then fMRI is actually imaging a key contributor to information processing."


Again, studies in Moore's lab support this interpretation. For example, his fMRI studies of the sensory homunculus--the brain's detailed map of body parts like fingers, toes, arms, and legs--show that when more blood flows to the area representing the fingertip, people more readily perceive a light tap on the finger. This suggests that blood affects the function of this brain region and that information about blood flow can predict future brain activity. This finding does not undermine prior studies, but adds another, richer layer to their interpretation and makes fMRI an even more useful tool than it already is.


How could blood flow affect brain activity? Blood contains diffusible factors that could leak out of vessels to affect neural activity, and changes to blood volume could affect the concentration of these factors. Also, neurons and support cells called glia may react to the mechanical forces of blood vessels expanding and contracting. In addition, blood influences the temperature of brain tissue, which affects neural activity.


To Moore's knowledge, the Hemo-Neural Hypothesis offers an entirely new way of looking at the brain. "No one ever includes blood flow in models of information processing in the brain," he said. One historical exception is the philosopher Aristotle, who thought the circulatory system was responsible for thoughts and emotions. Perhaps the ancient Greeks were on to something.


This work was funded by Thomas F. Peterson, the Mitsui Foundation and the McGovern Institute for Brain Research at MIT.





Global warming: Finding the clean tech money


The effects of global warming are not easy to face, so the question of what kind of clean tech product will thrive over the long term is a matter to evaluate seriously.


"Something that doesn't defy laws of physics, and there are plenty of those," said Rodrigo Prudencio, a partner with Nth Power LLC. The venture capital firm helped Evergreen Solar and Imperium Renewables to get off the ground.


Nobody at the AlwaysOn Going Green conference was making bold predictions about what might become the Google of green tech, but the sector is expected to continue expanding at a rapid clip.


Clean tech companies receive the third largest amount of venture capital, a staggering increase to $2.4 billion last year from $917 million in 2005, according to research by Clean Edge. Ninety percent of venture-backed, green tech companies that made initial public offerings last year are listed on the Nasdaq market.


"There will be new ways to squeeze that last bit out of a kilowatt and new ways to create that kilowatt," said Steve Eichenlaub, managing director of Cleantech Investments at Intel Capital. He and other investment experts offered these tips:


Don't burn out by shooting for every initial public offering. "You still have to be careful," said JonCarlo Mark, senior portfolio manager at CalPERS. "There will be money lost in certain technologies and investments, but there's a need to diversify from fossil fuels."
Although unglamorous, technologies that improve energy efficiency, from manufacturing plants to workplaces to homes, will be in high demand as businesses and consumers seek to reduce expenses and carbon emissions. "All companies making incremental improvements in the energy economy are gonna move the needle," said Prudencio.
Renewable sources of energy that don't lean on government subsidies or tax incentives look promising.
Think globally, far into the future. For instance, the need for water filtration and treatment will balloon as the world's population exceeds 8 billion within the next decade, and more people migrate to coastal regions.
"Climate change aside, anything that takes hazardous waste out of the market is gonna be a huge market for investment," said Keith Casto, a partner of Sedgwick, Detert, Moran & Arnold who heads the law firm's international climate change practice. Companies that use recycled components in manufacturing can save money they might otherwise spend on a dwindling supply of raw materials.





Because vascular health impacts many different diseases


Vascular health impacts many different diseases.


The finding not only offers an important insight into the development of the vascular system during embryonic development but suggests a potential target for inhibiting the blood vessels that fuel cancers, diabetic eye complications and atherosclerosis, the researchers say.


The study was conducted in the zebrafish, the tiny, blue-and-silver striped denizen of India's Ganges River and many an aquarium. A "News and Views" commentary on the paper will run in the same issue.


"We expect this finding will offer important insights into blood vessel formation in humans," says lead author Massimo Santoro, PhD, UCSF visiting postdoctoral fellow in the lab of senior author Didier Stainier, PhD, UCSF professor of biochemistry and biophysics. "The zebrafish has proven to be an important model for discovering molecules relevant to human disease."


Angiogenesis, or the growth of blood vessels, is active not only during embryonic development but throughout the life of the body, providing a source of oxygenated blood to tissues damaged by wounds.


However, it is also active in a number of disease processes, including cancer. Without a blood supply, tumors cannot grow beyond the size of a small pea. Cancerous tumors release chemical signals into their environment that stimulate healthy blood vessels to sprout new vessels that then extend into the tumors. During the last decade, scientists have identified several molecules that promote angiogenesis. A drug that inhibits these molecules is now commercially available and others are being studied in clinical trials.


Scientists are also exploring strategies for stimulating the growth of new blood vessels in patients whose clogged arteries prevent a sufficient blood supply to the heart muscle.


In the current study, the UCSF team determined that two well known signaling molecules, birc2 and TNF, are crucial to the survival of endothelial cells -- which line the blood vessels and maintain the integrity of the blood vessel wall during vascular development -- in zebrafish embryos.


"The pathway these molecules make up during vascular development has not been looked at before," says Stainier. "It offers a new target for therapeutic strategies."


The birc2 gene belongs to a family of proteins that control the balance between cell survival and cell death (apoptosis). A cell induces apoptosis when it detects that it is irreparably damaged. The integrity of the blood vessel wall is determined by a dynamic balance between endothelial cell survival and apoptosis.


The scientists started the investigation by examining zebrafish with unusual physical characteristics and working to identify the mutated genes that were responsible for the traits.


"We began with a genetic mutant that displayed vascular hemorrhage associated with vascular defects, and soon proved that the mutant had a defective birc2 gene," says Santoro. "Without the birc2 gene, hemorrhage and blood pooling occurred, resulting in vascular regression and cell death."


Next, through a series of genomic analyses and biochemical studies, the team discovered the critical role of birc2 and TNF in blood vessel health in the zebrafish embryo. They showed that birc2 is needed for the formation of the tumor necrosis factor receptor complex 1, a group of proteins and peptides that activate cell survival by initiating signals. Tumor necrosis factor promotes activation of NF-kB, a protein complex transcription factor involved in the transfer of genetic information. Further tests established a genetic link between birc2 and the NF-kB pathway and showed that this pathway is critical for vascular health and endothelial cell survival.


"Studies on vascular development are important so that we can better understand the molecular basis of some endothelial cell-related pathologies, such as cancer and [diabetic eye complications, known as] retinopathies," Santoro said. "It can also help us design new therapeutic strategies for these diseases."


The team hopes that future researchers will investigate other avenues and alternative pathways. "Because vascular health impacts many different diseases, understanding how to genetically control endothelial cell survival and apoptosis is critical to future work in these areas."





Exploring the Red Planet


To explore the Red Planet more closely, NASA on Tuesday announced it was extending for the fifth time the mission of the Mars rovers Spirit and Opportunity in their indefatigable exploration of the planet.


The two robots touched down three weeks apart on Mars in January 2004 for an expected 90-day mission that instead could stretch out to 2009, the National Aeronautics and Space Administration said on its website.


In September, Opportunity began a perilous descent into the Victoria crater, in Mars' Meridiani Planum region.


On the opposite side of the dusty planet and in the opposite direction, Spirit in early September began climbing onto the Home Plate volcanic plateau, where scientists believe the volcanic rock might contain traces of water.


"After more than three-and-a-half years, Spirit and Opportunity are showing some signs of aging, but they are in good health and capable of conducting great science," said John Callas, rover project manager at NASA's Jet Propulsion Laboratory, Pasadena, California.


The roving probes carry several sophisticated instruments to examine the geology of Mars for information about past environmental conditions.


Opportunity has returned dramatic evidence that its area of Mars stayed wet for an extended period of time long ago, with conditions that could have been suitable for sustaining microbial life, NASA said.


Spirit has found evidence in the region it is exploring that water in some form has altered the mineral composition of some soils and rocks, the space agency added.


To date, Spirit has driven 7.26 kilometers (4.51 miles) and has sent back to Earth more than 102,000 images. Opportunity has driven 11.57 kilometers (7.19 miles) and has returned more than 94,000 images.





Discovery Launch on Track, NASA Says


Space shuttle Discovery got the green light to launch next week, but the vote was not unanimous.


NASA managers spent hours Tuesday debating whether to lift off. It was all part of a standard flight review, but this one was different. Some safety workers recommended rolling Discovery back to the vehicle assembly building to replace three heat protection panels that have weakened.


NASA's chief engineer said that was enough to vote against launching. But overall, mission managers say the risk is not high enough to warrant a delay.


Liftoff is now set for next Tuesday at 11:38 in the morning.


NASA's senior managers cleared space shuttle Discovery for liftoff Tuesday, overruling a safety group that called for further studies and wing repairs, if necessary, before next week's launch.


The potential problem is with the critical thermal shielding on Discovery's wings. A new inspection method uncovered possible cracking just beneath the protective coating on three of the 44 panels that line the wings.


Engineers were evenly split on whether Discovery's flight to the international space station should be delayed, shuttle program manager Wayne Hale said. In the end, top managers concluded Tuesday night following an all-day meeting that repairs were not needed.


"There was a great deal of evidence presented today and the preponderance of evidence in my mind says that we have an acceptable risk to go fly. And let me make sure you understand that. I didn't say it's safe to go fly and I wouldn't say that. We have an acceptable risk to go fly," Hale said at a news conference.


The NASA Engineering and Safety Center - formed in the wake of the 2003 Columbia disaster - has been studying the issue since May and still does not understand why the protective coating on some of the wing panels is coming off.


It recommended additional testing, at the very least, before Discovery flies and favored replacing the three reinforced carbon panels in question. That work would have set the launch back by at least two months.


Columbia was destroyed during re-entry because of a hole in its wing.


Hale said part of what gave him confidence to proceed with the launch was the fact that two similar cases in orbit ended up being benign and the astronauts will have a repair kit in orbit for mending small damage.


In addition, "it appears that there is good analysis that says we could survive even if the worst thing happens to us during entry," he said.


The worst that could happen, said Ralph Roe, the safety center's director, is that some of the coating is lost off the front of a wing panel right before re-entry and the hot atmospheric gases burn through the entire panel.


Roe said his safety center could not get comfortable with all the uncertainties about whether the coating problem might worsen in space. The three panels in question on Discovery do not appear to have worsened over the past three flights, despite indications of possible cracks to the coating.


Engineers will continue to work to understand what is going on. "If the risk grows to an unacceptable level, we will take action," Hale said.


Unlike the sometimes brusque and hasty flight readiness reviews before the Columbia accident, "everybody got to ask questions, everybody got to give their understanding of it down to the working troop level," Hale said.







Nano-Science and Technology in the International Technology Roadmap for Semiconductors


The International Technology Roadmap for Semiconductors (ITRS) continues to provide the most up-to-date view into the semiconductor industry's technology requirements and potential solutions for those requirements.(1) As noted several years ago by Dan Hutcheson, the feature sizes for today's devices and interconnects placed the integrated circuit well into the world of nanotechnology.(2) As ITRS states, the semiconductor industry will squeeze all the performance out of existing CMOS transistor technology in an effort to extend it for as long as possible. The ITRS teams are also looking into the future beyond CMOS. Nano-science and technology play critical roles in both CMOS extension and in "Beyond CMOS". The 2007 ITRS will have sections on both Emerging Research Materials and Emerging Research Devices. These sections describe potential new device technology and some of the materials that are being researched for use in these devices. In addition, the ITRS Metrology Roadmap has a section on measurement needs and methods for Emerging Research Materials and Devices.

The industry search for a new device to replace or augment the transistor requires significant amounts of research and development in universities, national laboratories, and industry itself. Some of the concepts being explored include using spin transport instead of charge transport, molecular electronics in which changes in molecular configuration govern charge transport, and excitonic devices. New materials such as graphene, carbon nanotubes, semiconductor nanowires, metal oxides, and quantum dots are under consideration as the materials that would be used to fabricate the device.

Underpinning both CMOS extension and beyond-CMOS devices are new phenomena that manifest themselves only when materials are fabricated into nano-scale dimensions. As one can glean from experienced semiconductor technologists, shrinking dimensions has always brought on both new performance and new problems. Everything from materials strength to quantum confinement impacts properties at nanoscale dimensions. In addition, the measurement process itself is subject to new physics at these length scales.

There are countless challenges facing characterization and metrology of nano-scale materials and devices. All areas from microscopy to spectroscopy must be advanced at a more rapid pace. Although improvements in microscopy, such as aberration-corrected lens systems, have continued at unprecedented rates, electron microscopy of soft materials such as graphene and molecular electronics remains exceedingly difficult. The combination of experiment and modeling is required to accelerate progress. For example, simulation of TEM images both improves image interpretation and points to optimum imaging conditions. Theory is also necessary for interpretation of electron energy loss spectra. The ITRS is calling for measurements such as carrier spin with presently unachievable spatial localization of the information. Scanned probe methods, including BEEM (ballistic electron emission microscopy), are providing new insights into materials properties such as spin transport. Recent work shows that nanoscale phenomena in optical properties, such as quantum confinement, provide a means of extending optical methods to smaller dimensions.(3) Measurements that detect defects in nanostructures such as graphene and correlate them to resultant properties are very important. Recent work at the National Institute of Standards and Technology showed how scanning tunneling microscopy can observe such phenomena.(4)

Another set of measurement needs is for uniformity of properties of nanoscale materials. For example, what are the average and range of the dimensions of an assembly of nanodots? Numerous properties need to be characterized for such arrays. How does one measure this uniformity at the precision required for volume manufacturing control? The semiconductor industry has faced these issues for many years.
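
As a toy illustration of that uniformity question, the sketch below computes the mean, spread, and relative variation for a set of simulated nanodot diameters; the values are invented purely for illustration.

```python
# Toy illustration of characterizing uniformity across an array of nanodots.
# The diameters are invented values, purely for illustration.

import statistics

diameters_nm = [4.8, 5.1, 5.0, 4.9, 5.3, 5.2, 4.7, 5.0, 5.1, 4.9]  # assumed measurements

mean = statistics.mean(diameters_nm)
stdev = statistics.stdev(diameters_nm)
spread = max(diameters_nm) - min(diameters_nm)

print(f"mean diameter : {mean:.2f} nm")
print(f"std deviation : {stdev:.2f} nm ({100 * stdev / mean:.1f}% relative)")
print(f"range         : {spread:.2f} nm")
# Volume manufacturing control would compare the relative variation against a
# process tolerance and flag the array if it drifts outside that band.
```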

So in December when the 2007 ITRS is released, the Emerging Research Materials, Emerging Research Devices, Metrology, and other sections of the ITRS will provide us with a very useful overview of how important the semiconductor industry finds Nano-Scale Science and Technology.

1. International Technology Roadmap for Semiconductors, http://www.itrs.net

2. G. Dan Hutcheson, "The first nanochips," Scientific American, pp. 48-55, April 2004.


Source: http://www.nanotech-now.com/columns/?article=124







Predicting the future of the past tense: Mathematicians apply evolutionary models to language


This illustration is a visual representation of data by MIT and Harvard scientists on how irregular verbs regularize over time. Verb size in the image corresponds to usage frequency. Large verbs tend to stay sequestered at the top, while smaller verbs tend to fall through to the bottom. The paper predicts that 'wed' is the next verb to regularize, so it teeters on the brink.


Verbs evolve and homogenize at a rate inversely proportional to their prevalence in the English language, according to a formula developed by MIT and Harvard University mathematicians who've invoked evolutionary principles to study our language over the past 1,200 years.


The team, which reported their findings in the Oct. 11 issue of Nature, conceives of linguistic development as an essentially evolutionary scheme. Just as genes and organisms undergo natural selection, words--specifically, irregular verbs that do not take an "-ed" ending in the past tense--are subject to powerful pressure to "regularize" as the language develops.


"Mathematical analysis of this linguistic evolution reveals that irregular verb conjugations behave in an extremely regular way - one that can yield predictions and insights into the future stages of a verb's evolutionary trajectory," says Erez Lieberman, a graduate student in the Harvard-MIT Division of Health Sciences and Technology and in Harvard's School of Engineering and Applied Sciences. "We measured something no one really thought could be measured, and got a striking and beautiful result."


"We're really on the front lines of developing the mathematical tools to study evolutionary dynamics," says Jean-Baptiste Michel, a graduate student at Harvard Medical School. "Before, language was considered too messy and difficult a system for mathematical study, but now we're able to successfully quantify an aspect of how language changes and develops."


Lieberman, Michel, and colleagues built upon previous study of seven competing rules for verb conjugation in Old English, six of which have gradually faded from use over time. They found that the one surviving rule, which adds an "-ed" suffix to simple past and past-participle forms, contributes to the evolutionary decay of irregular English verbs according to a specific mathematical function: It regularizes them at a rate that is inversely proportional to the square root of their usage frequency.


In other words, a verb used 100 times less frequently will evolve 10 times as fast.
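
A short sketch of that scaling under the reported rate law; the frequencies below are arbitrary illustrative values, not data from the paper.

```python
# The reported rate law: regularization rate scales as 1 / sqrt(usage frequency),
# so a verb's "half-life" scales as sqrt(frequency). The frequencies below are
# arbitrary illustrative values, not data from the paper.

from math import sqrt

def relative_rate(frequency):
    """Regularization rate, up to a constant factor."""
    return 1 / sqrt(frequency)

common, rare = 1e-3, 1e-5  # the rare verb is used 100 times less often
print(relative_rate(rare) / relative_rate(common))  # -> 10.0: regularizes 10x as fast

def relative_half_life(frequency):
    """Half-life, up to a constant factor (inverse of the rate)."""
    return sqrt(frequency)

print(relative_half_life(common) / relative_half_life(rare))  # -> 10.0: persists 10x as long
```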


To develop this formula, the researchers tracked the status of 177 irregular verbs in Old English through linguistic changes in Middle English and then modern English. Of these 177 verbs that were irregular 1,200 years ago, 145 stayed irregular in Middle English and just 98 remain irregular today, following the regularization over the centuries of such verbs as help, laugh, reach, walk, and work.


The group computed the "half-lives" of the surviving irregular verbs to predict how long they will take to regularize. The most common ones, such as "be" and "think," have such long half-lives (38,800 years and 14,400 years, respectively) that they will effectively never become regular. Irregular verbs with lower frequencies of use--such as "shrive" and "smite," with half-lives of 300 and 700 years, respectively--are much more likely to succumb to regularization.


They project that the next word to regularize will likely be "wed."


"Now may be your last chance to be a 'newly wed'," they quip in the Nature paper. "The married couples of the future can only hope for 'wedded' bliss."


Extant irregular verbs represent the vestiges of long-abandoned rules of conjugation; new verbs entering English, such as "google," are universally regular. Although fewer than 3 percent of modern English verbs are irregular, this number includes the 10 most common verbs: be, have, do, go, say, can, will, see, take, and get. The researchers expect that some 15 of the 98 modern irregular verbs they studied--although likely none of these top 10--will regularize in the next 500 years.


Their Nature paper offers a quantitative, astonishingly precise description of something linguists have suspected for a long time: The most frequently used irregular verbs are repeated so often that they are unlikely to ever go extinct.


"Irregular verbs are fossils that reveal how linguistic rules, and perhaps social rules, are born and die," Michel says.


"If you apply the right mathematical structure to your data, you find that the math also organizes your thinking about the entire process," says Lieberman, whose unorthodox projects as a graduate student have ranged from genomics to bioastronautics. "The data hasn't changed, but suddenly you're able to make powerful predictions about the future."


Lieberman and Michel's co-authors on the Nature paper are from Harvard. The work was sponsored by the John Templeton Foundation, the National Science Foundation, and the National Institutes of Health.





Verizon Admits to Emergency Wiretapping




In an Oct. 12 letter to the House Energy and Commerce Committee, Verizon officials said they acted under the emergency provisions of FISA (the Foreign Intelligence Surveillance Act). The committee is seeking information about the country's telecom carriers' cooperation, including possible violations of U.S. privacy laws, given the Bush administration's admitted domestic wiretapping program.


AT&T, of San Antonio, Texas, and Qwest Communications, of Denver, also responded to the committee's request for information, but provided no details, pointing out that they are under a federal order to not disclose any information about their activities.


"The United States, through a sworn declaration from the director of national intelligence, has formally invoked the states secrets privilege to prevent AT&T from confirming or denying certain facts about alleged intelligence operations and activities that are central to your investigation," Wayne Watts, AT&T's general counsel, wrote to the committee.


Qwest officials wrote a similar response.


However, New York-based Verizon provided details that show the Bush administration's interest in obtaining customers' electronic communications.


"Verizon would receive a classified written notice that the attorney general has authorized the emergency surveillance, stating the time of such authorization," wrote Randal S. Milch, senior vice president of legal and external affairs at Verizon. "We would provide the assistance requested as expeditiously as possible. If we do not receive a FISA order to continue the surveillance within 72 hours of the attorney general's authorization, the surveillance would be terminated."


Verizon also noted that in 2005, it cooperated with more than 90,000 legal requests backed by subpoenas or court orders issued by local, state and federal government officials. In 2006, Verizon responded to about 88,000 such requests, and through the first nine months of 2007, it had cooperated with 61,000 requests.


Verizon, AT&T and Qwest all contend they acted legally in reliance on existing federal, state and local laws.


"Current law … provides a complete defense to any provider who in good faith relies on a statutory authorization," Verizon wrote. "If the government advises a private company that a disclosure is authorized by statute, a presumption of regularity attaches."


All three carriers are involved in what AT&T characterized as a "maelstrom" of litigation over the domestic spying program. The New York Times first broke the story of the administration's warrantless wiretapping and USA Today later added that the National Security Agency is using information provided by telephone carriers to data mine tens of millions of calling records.




AT&T said the issue of disclosing its alleged participation in the domestic spying program rests with the White House, which is also seeking immunity for carriers in the legislation before Congress.


"Our company essentially finds itself caught in the middle of an oversight dispute between the Congress and the executive branch relating to government surveillance activities," AT&T wrote.


"Applicable legal rules make clear that much of the information you seek is under control of the executive and that disputes of this kind need to be resolved through accommodation between the two political branches of the government."


House Commerce Committee Chairman John Dingell said the carriers' response proved to him that the White House, "as the sponsor of this program and the party preventing the companies from defending themselves--is the entity best able to resolve the many outstanding issues. I look forward to meeting with representatives of the administration in short order, and I am hopeful that they will be forthcoming with the information Congress needs to properly evaluate this program."






Broadcom unveils integrated 3G chip


Broadcom Corp (BRCM.O) said on Monday it had developed an integrated third-generation (3G) high-speed wireless cell phone chip ahead of bigger rivals Texas Instruments Inc (TXN.N) and Qualcomm Inc (QCOM.O), sending Broadcom shares up as much as 3 percent.


Shares of Texas Instruments and Qualcomm both fell about 2 percent after Broadcom said it developed a single chip with a baseband -- the cell phone's main processor -- and a radio receiver as well as FM radio and Bluetooth, a short-range technology used for wirelessly linking handsets to headsets.


Chipmaker Broadcom said Monday that it has developed a new processor that integrates all key 3G cellular and mobile technologies onto a single chip.


The processor, which operates at extremely low power, will enable cell phone makers to build new 3G phones in more compact form factors with very long battery lives, at a fraction of what it costs today, the company said.


The new 3G "Phone on a Chip" supports the four next-generation cellular technologies used throughout the world: HSUPA (High-Speed Uplink Packet Access), HSDPA (High-Speed Downlink Packet Access), WCDMA (Wideband Code Division Multiple Access), and EDGE (Enhanced Data rates for GSM Evolution). It also can transmit and receive FM radio for playing music on a car stereo. And it supports Bluetooth technology and processing capability for a 5-megapixel camera.


Broadcom claims it is at least a year ahead of competitors, such as Texas Instruments and Qualcomm, in terms of integrating so much functionality into a single chip. The company also said the chip is already available to a select group of Broadcom customers.


In 2006, Broadcom had only about 1.4 percent of the cell phone chip market. By contrast, TI and Qualcomm each had about 20 percent of the 2006 mobile phone chip market, according to iSuppli.


The new chip could help boost Broadcom's market share against these competitors, especially in Asia where operators are rolling out faster networks much more quickly than they are here in the U.S. market. Broadcom has been aggressively trying to get a greater share of the cell phone market for the past few years. And as a result, the company has been embroiled in a series of legal fights with rival Qualcomm.


Broadcom won an important battle earlier this year, when the U.S. government banned Qualcomm and its partners from importing devices that use Qualcomm's 3G technology, because part of the technology has been found to infringe on patents held by Broadcom.





Microsoft developing Office Communications Server (OCS) 2007





Microsoft Corp. is looking to develop its Office Communications Server.


While most enterprise IT shops today still don't know what Unified Communications really is, information systems leaders at Global Crossing in 2005 had a pretty good idea of what it was and how the company could benefit from it.


On Oct. 16, Global Crossing IS leaders will participate in the launch of Microsoft's UC platform and demonstrate how their UC implementation via Office Communications Server 2007 and Exchange Server 2007 helped improve worker productivity by streamlining exception handling.



Microsoft Corp. is expanding its work with enterprise telephony vendors to make its Office Communications Server (OCS) 2007 work more closely with office phone systems.


On Tuesday, at the launch of OCS, the company plans to unveil a formal program to certify interoperability between IP (Internet Protocol) phone systems and OCS. As part of that, Microsoft will discuss a specification to let enterprises migrate one building at a time to its software-based unified communications system and still have calls go across the organization as if on the same PBX (private branch exchange). Two models of Cisco Systems Inc.'s popular ISR (Integrated Services Router) branch-office platform will be among the products certified for this type of interoperability, according to Zig Serafin, general manager of Microsoft's Unified Communications group.


Microsoft's initiative, called the OCS 2007 Open Interoperability Program, will formalize work that has already been going on with some third parties. As that work has expanded, it's reached a point where it needs to be more organized, Serafin said. The idea is to let customers know what will work with OCS, and Microsoft will provide a table on its Web site where potential customers can check the certifications of third-party products.


Although promoted as an effort to coexist with the IP (Internet Protocol) phone systems now established or taking root in enterprises, the program also will make it easier for customers to migrate away from dedicated communications systems and phones themselves, the company acknowledges. Voice call control is new to Microsoft's unified communications system with OCS 2007, but the software giant envisions a day when separate platforms such as Cisco's CallManager won't be needed, industry analysts say.


Cisco, Avaya Inc. and other vendors have already moved the voice call-control functions of traditional circuit-switched PBXes (private branch exchanges) into server software, but they sell that software along with IP handsets and other gear. Microsoft intends OCS, together with Office Communicator 2007 client software or special OCS phones made by Polycom Inc. and LG Electronics Inc., to ultimately replace those dedicated systems.


There are three methods of interoperability that will be certified under the program.


- SIP CSTA (Computer Supported Telephony Applications) is based on a standard from Ecma International. It lets users control calls through the Office Communicator client on the PC, though in most cases still using the handset and PBX.


- OCS Coexistence lets the user pick up a call on either the existing handset or a client that uses OCS, namely Office Communicator or a special OCS phone.


- Direct SIP (Session Initiation Protocol) interoperability allows for some parts of an enterprise to use traditional or IP PBXes and others to use OCS, with transparent connections between them using gateways, according to Microsoft. SIP is the emerging standard protocol for exchanging information on voice, videoconferencing and other communications sessions.
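
For a sense of what Direct SIP signaling involves, the sketch below builds a minimal, generic SIP INVITE; it is a textbook-style illustration with made-up hosts and users, not a message captured from OCS or any particular gateway.

```python
# Minimal, generic SIP INVITE of the kind a Direct SIP trunk carries between a
# gateway or IP PBX and a SIP-based system such as OCS. Hosts, users, and tags
# are made up; this is not an OCS-specific or captured message.

CRLF = "\r\n"

def build_invite(caller, callee, gateway_host, target_host, call_id):
    lines = [
        f"INVITE sip:{callee}@{target_host} SIP/2.0",
        f"Via: SIP/2.0/TCP {gateway_host};branch=z9hG4bK-0001",
        f"From: <sip:{caller}@{gateway_host}>;tag=1234",
        f"To: <sip:{callee}@{target_host}>",
        f"Call-ID: {call_id}",
        "CSeq: 1 INVITE",
        f"Contact: <sip:{caller}@{gateway_host};transport=tcp>",
        "Content-Type: application/sdp",
        "Content-Length: 0",  # a real INVITE would carry an SDP body describing the media
    ]
    return CRLF.join(lines) + CRLF + CRLF

print(build_invite("alice", "bob", "gw.example.com", "ocs.example.com", "a84b4c76e66710"))
```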


Microsoft has already certified gateway products from five vendors for Direct SIP interoperability, Serafin said. Among them are Cisco's Integrated Services Router 2851 and 3845. In fact, all ISRs with voice capability can interoperate with OCS, according to Mike Wood, director of product marketing in Cisco's access routing group. Gateways from Dialogic Inc. also have already been certified.


As a newcomer to telephony, Microsoft will take time to displace many standalone telephony systems, so interoperability will be critical, analysts said.


Most enterprises that adopt OCS still have phones connected to PBXes and will dial through the PBX, said Brent Kelly, a senior analyst at Wainhouse Research LLC. To start, most OCS users will keep their PBXes in place and take advantage of CSTA to gain the click-to-call benefits of OCS, he said.


"Right now, OCS doesn't have a voice model that's good enough for the enterprise," Kelly said.


However, there are a number of barriers to interoperability, too, said IDC analyst Nora Freedman. While Direct SIP interoperability is a good idea, it will take a long time to really work because SIP is so new, she believes.


"We're still battling proprietary SIP extensions from all the notable vendors," Freedman said.


Meanwhile, CSTA could be a distraction for enterprises trying to make the transition to unified communications because it brings yet another standard into the picture, she said. And for now, it's hard for early adopters to get these kinds of systems put together, she added.


"Now we have a wealth of product but a drought of system-integrator experience in this," Freedman said. Resellers are working feverishly to build up their expertise, she said.


Microsoft's plan for telephony is bold, looking to eventually eliminate OCS as a separate product and make it, and telephony itself, just a set of features in applications, believes Zeus Kerravala of Yankee Group Inc. But for the time being, the job at hand is making OCS work with existing phones, he said.


"The first phase is just to get it out there,"




