
Monday, December 29, 2008

At very low temperatures, helium can be solid and a perfect liquid at the same time

How Helium Can Be Solid and a Perfect Liquid at the Same Time, Now Explained by Computer-Assisted Physics
Perfect helium crystals are normal classical crystals in which the atoms are localised at their lattice positions. At the point of a crystal defect, such as the grain boundary shown in the image, quantum mechanical effects cause the atoms to lose their exact position. They become delocalized and can flow along the defect without any friction: a "supersolid" is formed, a solid that is also a perfect liquid at the same time.


At very low temperatures, helium can be solid and a perfect liquid at the same time. Theoreticians, though, have incorrectly explained the phenomenon for a long time. Computer simulations at ETH Zurich have shown that only impurities can make this effect possible.

Matthias Troyer and his team carry out experiments at their computers. Troyer is Professor of Computational Physics at ETH Zurich’s Institute of Theoretical Physics. He simulates quantum phenomena such as “supersolid” structures. Supersolidity describes a physical phase which can occur at very low temperatures and where a material appears to be solid and “superfluid” at the same time.
Enquiries from the armed forces
However, the word can be misunderstood, as was discovered by one of Troyer’s colleagues who works on the phenomenon in the USA. The US Navy thought that “supersolid” meant “extremely hard” and so asked the physicist whether such a material could be used to armour ships or at least put into a spray can or be used to kill someone. The physicist answered “No” – because “supersolid” does not mean super-hard. After that, the armed forces showed no further interest.
The researchers carry out fundamental research and no direct applications for “supersolidity” are yet on the horizon. At the same time, a group of physicists led by Matthias Troyer has shed light on how the phenomenon occurs. Their results have been published in a series of articles in Physical Review Letters. The first author of the article is post-doctoral researcher Lode Pollet, who has since moved from ETH Zurich to the University of Massachusetts and Harvard University in the US. He is in discussions for a professorship, even though he is not yet thirty.
An incorrect explanation
Theoreticians first predicted the “supersolidity” phenomenon in 1969. Their explanation was incorrect, but this escaped notice for some time. The first evidence for “supersolidity” was measured in an experiment only in 2004. This involved attaching a disc-shaped helium crystal to a spring and rotating it to and fro. In this arrangement, the vibration frequency depends on the rotating mass. The researchers found that the frequency became higher if they cooled the apparatus down to below 0.2 kelvin – almost down to absolute zero. Part of the mass no longer participated in the rotation; it behaved as a superfluid, meaning it behaved like a friction-free liquid. In other words, it had become “supersolid”.
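As a rough illustration of why the frequency rises (a minimal sketch with made-up numbers, not the researchers' actual analysis): a torsional oscillator resonates at f = sqrt(k/I) / (2*pi), so if part of the helium decouples from the rotation, the effective moment of inertia I drops and the frequency climbs.

import math

# Hypothetical values, chosen only for illustration.
k = 2.0e-3         # torsional spring constant in N*m/rad (assumed)
I_total = 1.0e-7   # moment of inertia of cell plus helium in kg*m^2 (assumed)

def frequency(moment_of_inertia):
    # Resonant frequency of a torsional oscillator: f = sqrt(k/I) / (2*pi)
    return math.sqrt(k / moment_of_inertia) / (2 * math.pi)

f_normal = frequency(I_total)

# Suppose 1% of the helium mass becomes superfluid and stops rotating with the cell.
supersolid_fraction = 0.01
f_decoupled = frequency(I_total * (1 - supersolid_fraction))

print(f"normal: {f_normal:.2f} Hz, with 1% decoupled: {f_decoupled:.2f} Hz")
# The second value is slightly higher - the same kind of upward shift the
# 2004 experiment saw below 0.2 kelvin.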
Up to this point, the measurements were still in line with the theory, but further experiments showed that the proportion of the crystal that became supersolid increased with the number of defects in the crystal. However, the theoreticians who predicted the phenomenon had done their calculations using perfect crystals, ones totally free from defects.
No effect with perfect crystals
At this juncture, the problem became interesting for the computer-assisted physics group led by Matthias Troyer at ETH Zurich and their colleagues in the US and Canada. Although the physicists also carry out experiments, they do so on computer models rather than on the material itself. This allows them to monitor the crystal more closely. For example, they experimented with crystals free from impurities, i.e. perfect crystals of the kind that cannot be grown in the laboratory. No “supersolidity” occurred here.
However, the scientists also grew virtual crystals with defects, for example by orienting the structure of one half of the crystal in a different direction to the other half. They performed this experiment using about one hundred variations with different temperatures and orientations. The result: “supersolidity” occurred where the layers of atoms with different orientations came together, and did so only if the layers did not fit together particularly well. This meant that it depended on the defects, exactly as in the laboratory experiments.
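A minimal sketch of how such a sweep over crystal variants might be organized (the actual ETH work relied on large-scale quantum Monte Carlo simulations; the simulate_superfluid_fraction function below is only a hypothetical placeholder, and the temperatures and misorientation angles are illustrative):

import itertools

temperatures_kelvin = [0.05, 0.1, 0.2, 0.5, 1.0]
misorientations_deg = [0, 5, 10, 15, 20, 30, 45]   # 0 = perfect crystal, no grain boundary

def simulate_superfluid_fraction(temperature_k, misorientation_deg):
    # Placeholder for the real quantum Monte Carlo simulation of a helium
    # crystal containing a grain boundary with the given misorientation.
    return None   # the actual code would return a superfluid fraction here

results = {}
for temperature, angle in itertools.product(temperatures_kelvin, misorientations_deg):
    results[(temperature, angle)] = simulate_superfluid_fraction(temperature, angle)

# In the published work, non-zero superfluid fractions appeared only for
# defective crystals (angle > 0) whose layers fit together poorly, and only
# at sufficiently low temperatures.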
At US customs
Initially, these results were met with rejection from a few scientists. The fact that the phenomenon was possible only when impurities were present did not fit with the view held by the theoreticians, who usually ignore impurities in their considerations. However, the explanation has since gained wide acceptance.
Scientists are not the only people interested in the physicists’ results. When Lode Pollet arrived in the US, a customs officer asked him whether he was the man who worked on this material that was solid and liquid at the same time. Clearly, the American government has not yet lost interest completely.

Wednesday, December 24, 2008

Facebook silences Project Playlist



Social network Facebook has disabled widgets from music-sharing site Project Playlist at the behest of the music industry, several days after rival site MySpace did the same. The reason? Project Playlist's user-uploaded music, much of which doesn't have industry sanction.


"The Recording Industry Association of America (RIAA) initially contacted Facebook last summer requesting the removal of the Project Playlist application for copyright violation, and recently reopened those communications," a statement from Facebook read. "We have forwarded the RIAA's letters to Project Playlist so it can work directly with that organization and music labels on a resolution. In the meantime, the application must be removed to comply with the Facebook Platform Terms of Service. Our hope and expectation is that the parties can resolve their disagreements in a manner that satisfies the developer and copyright holder, that continues to offer a great experience to music fans, and that doesn't discourage other developers from using (Facebook's) Platform to share their creativity and test new ideas."


Project Playlist has struck a deal with Sony BMG but has outstanding lawsuits with most other big players in the music industry, including the RIAA. The fast-growing start-up--it has 40 million monthly users, per comScore--has gained most of its traction by encouraging users to embed its widgets on social networks like Facebook and MySpace, so bans from the big social networks could be a critical blow.


But ironically for Facebook, Project Playlist recently brought on Facebook's former chief operating officer, Owen Van Natta, as CEO. Part of his job, the blogosphere assumes, is to ink those crucial deals with the music industry.





Tuesday, December 23, 2008

Warner Music pulls its videos from YouTube



Warner Music Group ordered YouTube on Saturday to remove all music videos by its artists from the popular online video-sharing site after contract negotiations broke down.


The order could affect hundreds of thousands of video clips, as it covers Warner Music's recorded artists as well as the rights for songs published by its Warner/Chappell unit, which includes many artists not signed to Warner Music record labels.


The talks fell apart early on Saturday because Warner wants a bigger share of the huge revenue potential of YouTube's massive visitor traffic. There were no reports on what Warner was seeking.


"We simply cannot accept terms that fail to appropriately and fairly compensate recording artists, songwriters, labels and publishers for the value they provide," Warner said in a statement.


YouTube is hugely popular, with more than 100 million viewers in the United States alone in October, according to comScore, a Web audience measurement firm.


Warner Music, home to artists including Red Hot Chili Peppers and rapper T.I., was the first major media company to negotiate a deal with YouTube in 2006. Its executives believe that deal gave the site legitimacy in the eyes of search giant Google Inc (GOOG.O) which bought it soon after for $1.65 billion.


As part of the original 2006 negotiation, Warner, Universal Music and Sony Music all took small stakes in YouTube pre-acquisition and profited when the Google acquisition closed.


The music companies typically get paid a share of any advertising revenue associated with the video and a per-play payment for every video viewed. The per-play fee is usually a fraction of a penny, and with millions visiting YouTube every day it was all expected to add up to a substantial amount.
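To see why fractions of a penny were expected to add up, here is a back-of-the-envelope illustration with assumed figures (the real per-play rate and view counts were not disclosed):

# Assumed, illustrative numbers only.
per_play_fee_usd = 0.001        # a tenth of a cent per video view (assumed)
monthly_views = 200_000_000     # hypothetical monthly plays of one label's catalog

monthly_payout_usd = per_play_fee_usd * monthly_views
print(f"${monthly_payout_usd:,.0f} per month")   # $200,000 per month at these assumptions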


But a source familiar with Warner Music's talks said the amounts it has been receiving from YouTube were "staggeringly low".


YouTube representatives did not immediately return calls for comment.


YouTube executives have spent most of 2008 stepping up efforts to develop revenue streams on the site partly in a bid to keep content partners happy. It has been in long negotiations with Warner on how best to split revenues until things came to a head in talks on Friday.


"Despite our constant efforts, it isn't always possible to maintain their innovative agreements," YouTube said in a statement on its blog about difficulties of music licensing. "Sometimes, if we can't reach acceptable business terms, we must part ways with successful partners."


YouTube also has agreements with Vivendi's (VIV.PA) Universal Music Group, Sony Music Entertainment and EMI Music. Warner's move could see them also making tough demands for higher fees.


The demands could leave YouTube in a difficult position as it tries to balance the need to pay a reasonable fee to content partners, including TV and movie companies, and also generate enough return on the substantial investment needed to keep streaming millions of videos around the world.




Recording labels and websites in a music video tussle
Universal Music Group
Urban artist Akon, left, during the making of the "I'm So Paid" music video. Paid product placement underwrote the Universal Motown video's entire $1-million production cost.
The removal of Warner Music Group's videos from YouTube over the weekend highlights the growing tension between music labels and websites over what is becoming an important source of revenue for the beleaguered recorded-music industry: advertising and licensing fees from music videos, the foundation that built MTV but which has now largely migrated to the Internet.


The impasse comes at a time when all four major labels -- Warner, Universal Music Group, Sony BMG Music Entertainment and EMI Music -- are renegotiating their licensing deals with YouTube, the largest video site.







Sunday, December 21, 2008

Researchers have discovered our experience of pain depends on whether we think someone caused the pain intentionally.





Pain Hurts More If Person Hurting You Means It


Researchers at Harvard University have discovered that our experience of pain depends on whether we think someone caused the pain intentionally. In their study, participants who believed they were getting an electrical shock from another person on purpose, rather than accidentally, rated the very same shock as more painful. Participants seemed to get used to shocks that were delivered unintentionally, but those given on purpose had a fresh sting every time.


The research, published in the current issue of Psychological Science, was led by Kurt Gray, a graduate student in psychology, along with Daniel Wegner, professor of psychology.


It has long been known that our own mental states can alter the experience of pain, but these findings suggest that our perceptions of the mental states of others can also influence how we feel pain.


"This study shows that even if two harmful events are physically identical, the one delivered with the intention to hurt actually hurts more," says Gray. "Compare a slap from a friend as she tries to save us from a mosquito versus the same slap from a jilted lover. The first we shrug off instantly, while the second stings our cheek for the rest of the night."


The study's authors suggest that intended and unintended harm cause different amounts of pain because they differ in meaning.


"From decoding language to understanding gestures, the mind distills meaning from our social environment," says Gray. "An intended harm has a very different meaning than an accidental harm."


The study included 48 participants who were paired up with a partner who could administer to them either an audible tone or an electric shock. In the intentional condition, participants were shocked when their partner chose the shock option. In the unintentional condition, participants were shocked when their partner chose the tone option. Thus, in this condition, they only received a shock when their partner did not intend them to receive one. The computer display ensured that participants both knew their partner's choice and that a shock would be coming, to ensure the shock was not more surprising in the unintentional condition.


Despite identical shock voltage between conditions, those in the intentional condition rated the shocks as significantly more painful. Furthermore, those in the unintentional condition habituated to the pain, rating them as decreasingly painful, while those in the intentional condition continued to feel the full sting of pain.


Gray suggests that it may be evolutionarily adaptive for this difference in meaning to be represented as different amounts of pain.


"The more something hurts, the more likely we are to take notice and stop whatever is hurting us," he says. "If it's an accidental harm, chances are it's a one-time thing, and there's no need to do anything about it. If it's an intentional harm, however, it may be the first of many, so it's good to take notice and do something about it. It makes sense that our bodies and brains might amplify our experience of pain when we know that the pain could signal threats to our survival."


These findings speak to how people experience pain and negative life events. If negative events are seen as intended, they may hurt more. This helps to explain why torture is so excruciating: not only are torture techniques themselves exceptionally painful, but it's the thought that counts, and that thought makes torture hurt more than mere pain.


On the other hand, if negative events are seen as unintended, they may hurt less. This may explain, in part, why people in abusive relationships sometimes continue to stay in them. By rationalizing that an abusive partner did not intend harm, some victims may reduce their experience of pain, which could make them less likely to leave the relationship and escape the abuse.


The research was supported by the National Institute of Mental Health, the Canadian Social Sciences and Humanities Research Council and the Institute for Humane Studies.







Friday, December 19, 2008

GATES FOUNDATION INITIATIVE TO MAKE LIBRARIES BETTER NET CAFES

The Internet is the gateway to knowledge. Just a few years ago the library was the ideal place for gathering knowledge; now the scenario is different, and the cyber cafe is the library. Bill Gates and his Gates Foundation are launching an initiative to build up library infrastructure so libraries can serve as free Internet cafes.
Gates Foundation to help libraries be better free 'net cafes
Public libraries haven't been just about books for some time now, but they are finding it increasingly difficult to keep up with the costs of infrastructure, faster Internet access, and new computers. To help struggling libraries get on their 21st century feet, the Bill & Melinda Gates Foundation today announced a grant program of $6.9 million that will go toward launching a pilot broadband initiative in a handful of US states.

The seven states in the Gates Foundation's pilot grant program are Arkansas, California, Kansas, Massachusetts, New York, Texas, and Virginia, and the money has been awarded to two separate organizations. $6.1 million goes to Connected Nation, a non-profit broadband Internet advocacy group that will help these states to gather and activate various public library leaders and officials who can support broadband Internet in each state's libraries.

The rest of the funds, a hair over $850,000, will go to the American Library Association's Office for Information Technology Policy (OITP), which will help state library agencies implement sustainable broadband strategies. The organization will also perform and distribute a series of case studies that demonstrate how other public libraries can successfully manage broadband services for their patrons.

A bull market in library usage
The Gates Foundation's grant comes at a crucial time when libraries across the US are reporting spikes in patron traffic due to the economic crisis. Students, the unemployed, and those without home Internet access are increasingly making use of the fact that local libraries double as free Internet cafes. In fact, a recent 2007-2008 study by the American Library Association (ALA) shows that 73 percent of public libraries are the only source of free, public Internet access in their respective communities. Despite this demand, however, only 38.9 percent of all libraries have a T1 (1.5Mbps) connection, and among those, 51.6 percent are urban libraries, 32.1 percent are rural.

These libraries are feeling the squeeze, too. Over 57 percent of libraries (up from 52 percent in the ALA's 2006-2007 study) report that their connectivity is too slow some or all of the time, and over 82 percent report that they don't have enough workstations some or all of the time. Because of these and other constraints, over 90 percent of libraries impose time limits on public Internet workstations, with 45.7 percent using a 60-minute limit, and 35.2 percent cutting users off at just 30 minutes; hardly enough time to finish registering at Monster.com or complete that web-based art history exam.

The Gates Foundation picked the seven states for this pilot program based on a variety of factors such as their high concentrations of public libraries with Internet speeds below 1.5Mbps and public policy support to improve public library broadband access. The foundation has already invested $325 million in grants and other support for computers and staff training in libraries across all 50 US states. If this pilot grant program goes well with these first seven candidates, the Gates Foundation may expand its support to a limited number of other states.

Thursday, December 18, 2008

Just imagine: what will life be like in 2020? Humans and machines in the future.

Some experts say humans will merge with machines before the end of this century.
A group of experts from around the world will Thursday hold a first of its kind conference on global catastrophic risks.

They will discuss what should be done to prevent these risks from becoming realities that could lead to the end of human life on earth as we know it.
Speakers at the four-day event at Oxford University in Britain will talk about topics including nuclear terrorism and what to do if a large asteroid were to be on a collision course with our planet.
On the final day of the Global Catastrophic Risk Conference experts will focus on what could be the unintended consequences of new technologies, such as superintelligent machines that, if ill-conceived, might cause the demise of Homo sapiens.
"Any entity which is radically smarter than human beings would also be very powerful," said Dr. Nick Bostrom, director of Oxford's Future of Humanity Institute, host of the symposium. "If we get something wrong, you could imagine the consequences would involve the extinction of the human species."
Bostrom is a philosopher and a leading thinker of transhumanism -- a movement that advocates not only the study of the potential threats and promises that future technologies could pose to human life but also the ways in which emergent technologies could be used to make the very act of living better.
"We want to preserve the best of what it is to be human and maybe even amplify that," Bostrom told CNN.
Transhumanists, according to Bostrom, anticipate a coming era where biotechnology, molecular nanotechnologies, artificial intelligence and other new types of cognitive tools will be used to amplify our intellectual capacity, improve our physical capabilities and even enhance our emotional well-being.
The end result would be a new form of "posthuman" life with beings that possess qualities and skills so exceedingly advanced they no longer can be classified simply as humans.
"We will begin to use science and technology not just to manage the world around us but to manage our own human biology as well," Bostrom told CNN. "The changes will be faster and more profound than the very, very slow changes that would occur over tens of thousands of years as a result of natural selection and biological evolution."
Bostrom declined to try to predict an exact time frame when this revolutionary biotechnological metamorphosis might occur. "Maybe it will take eight years or 200 years," he said. "It is very hard to predict."
Other experts are already getting ready for what they say could be a radical transformation of the human race in as little as two decades.
"This will happen faster than people realize," said Dr. Ray Kurzweil, an inventor and futurist who calculates technology trends using what he calls the law of accelerating returns, a mathematical concept that measures the exponential growth of technological evolution.
In the 1980s Kurzweil predicted that a tiny handheld device would be invented sometime early in the 21st century allowing blind people to read documents from anywhere at anytime -- earlier this year such a device was publicly unveiled. He also anticipated the explosive growth of the Internet in the 1990s.
Now Kurzweil is predicting the impending arrival of something called the Singularity, which he defines in his book on the subject as "the culmination of the merger of our biological thinking and existence with our technology, resulting in a world that is still human but that transcends our biological roots."
"There will be no distinction, post-Singularity, between human and machine or between physical and virtual reality," he writes.

Singularity will approach at an accelerating rate as human-created technologies become exponentially smaller and increasingly powerful and as fields such as biology and medicine are understood more and more in terms of information processes that can be simulated with computers.
By the 2030s, Kurzweil tells CNN, humans will become more non-biological than biological, capable of uploading our minds onto the Internet, living in various virtual worlds and even avoiding aging and evading death.
In the 2040s, Kurzweil predicts non-biological intelligence will be billions of times better than the biological intelligence humans have today, possibly rendering our present brains obsolete.
"Our brains are a million times slower than electronics," said Kurzweil. "We will increasingly become software entities if you go out enough decades."
This movement towards the merger of man and machine, according to Kurzweil, is already starting to happen and is most visible in the field of biotechnology.
As scientists gain deeper insights into the genetic processes that underlie life, they are able to effectively reprogram human biology through the development of new forms of gene therapies and medications capable of turning on or off enzymes and RNA interference, or gene silencing.
"Biology and health and medicine used to be hit or miss," said Kurzweil. "It wasn't based on any coherent theory about how it works."
The emerging biotechnology revolution will lead to at least a thousand new drugs that could do anything from slow down the process of aging to reverse the onset of diseases, like heart disease and cancer, Kurzweil said.
By 2020, Kurzweil predicts a second revolution in the area of nanotechnology. According to his calculations, it is already showing signs of exponential growth as scientists begin to test first-generation nanobots that can cure Type 1 diabetes in rats or heal spinal cord injuries in mice.
One scientist is developing something called a respirocyte -- a robotic red blood cell that, if injected into the bloodstream, would allow humans to do an Olympic sprint for 15 minutes without taking a breath or sit at the bottom of a swimming pool for hours at a time.
Other researchers are developing nanoparticles that can locate tumors and one day possibly even eradicate them.
And some Parkinson's patients now have pea-sized computers implanted in their brains that replace neurons destroyed by the disease -- new software can be downloaded to the mini computers from outside the human body.
"Nanotechnology will not just be used to reprogram but to transcend biology and go beyond its limitations by merging with non-biological systems," Kurzweil told CNN. "If we rebuild biological systems with nanotechnology, we can go beyond its limits."
The final revolution leading to the advent of Singularity will be the creation of artificial intelligence, or superintelligence, which, according to Kurzweil, could be capable of solving many of our biggest threats, like environmental destruction, poverty and disease.
"A more intelligent process will inherently outcompete one that is less intelligent, making intelligence the most powerful force in the universe," writes Kurzweil.
Yet the invention of so many high-powered technologies and the possibility of merging these new technologies with humans may pose both peril and promise for the future of mankind.
"I think there are grave dangers," said Kurzweil. "Technology has always been a double-edged swor.

Do you think technology will allow humans to transcend biology in the future? Would you be comfortable with altering your biology? Should humans try to reprogram their genetics? What do you think the future looks like for mankind and machines?

Carbon Nanotubes Can Be Used to Detect Cancer Agents

The technology of nanotubes is becoming quite crucial these days as it has multiple uses in various fields.
Researchers at MIT have found that carbon nanotubes can serve as highly sensitive biological sensors for detecting single molecules in living cells in real time. The study, published online in Nature Nanotechnology is the first demonstration that nanoscale sensors can be used to detect and image multiple types of molecules in cells at the same time, at a sensitivity that far exceeds that of fluorescent dyes, the standard tool for molecular imaging. The researchers used the sensors to detect substances that damage DNA, including certain cancer drugs and toxins. The sensors could eventually be used to monitor the effectiveness of chemotherapy drugs, track molecular interactions in cells, and test for low levels of toxins in the environment.
That's not all: they can even identify chemotherapy drugs, which are very powerful DNA disruptors, and the sensors can also identify toxins and other free radicals that pose a threat to DNA. The good thing is that these sensors can be placed into living cells without much trouble and can then detect several of the agents that affect DNA.
Furthermore, these sensors could also serve as a chemotherapy analyzer, making it a lot easier to determine whether the very powerful chemotherapy drugs used to destroy cancerous cells are really doing their job.
Apart from this, the scientists are also planning to use the sensors to study the effects of various anti-oxidants on DNA and to learn how to make chemotherapy treatment far more effective than it is at present.
When asked how these sensors work, the scientists explained: "Apparently, each of the nanotubes is coated with DNA, which enables them to bind to DNA-damaging agents present in the cell (if any). Now, carbon nanotubes have the ability to emit fluorescent light when subjected to infrared radiation/light. If there is an interaction between the DNA coated on the carbon nanotube and DNA-damaging agents, the wavelength of the fluorescent light emitted by the DNA-coated carbon nanotube changes. Depending on the wavelength change, scientists would be able to determine the specific agent that caused the change - accurately identifying it in the process."
They concluded: "The nanotubes coated with DNA can be safely injected into the body, which makes their use a very easy affair. Continuous research in this field will ensure that in future, cancer treatments will be not only faster but also much more efficient."
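A minimal sketch of the read-out logic described above: excite the DNA-coated nanotube with infrared light, measure the emission wavelength, and look the shift up in a calibration table to name the agent. Both the agent labels and the shift values below are invented for illustration; in practice the calibration would come from laboratory measurements.

# Illustrative calibration: emission-wavelength shift in nanometers -> agent.
CALIBRATION_NM = [
    (-12.0, "chemotherapy drug (DNA alkylator)"),
    (-7.0, "hydrogen peroxide"),
    (-3.0, "hydroxyl radical"),
    (0.0, "no DNA-damaging agent detected"),
]

def identify_agent(measured_shift_nm, tolerance_nm=1.5):
    # Return the calibrated agent whose wavelength shift lies closest to the
    # measured shift, provided the difference is within the tolerance.
    shift, agent = min(CALIBRATION_NM, key=lambda row: abs(row[0] - measured_shift_nm))
    return agent if abs(shift - measured_shift_nm) <= tolerance_nm else "unknown agent"

print(identify_agent(-6.4))    # -> hydrogen peroxide (under these made-up calibrations)
print(identify_agent(-20.0))   # -> unknown agent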

Wednesday, December 17, 2008

Year 2020: Internet and interactivity (Pew)


By the year 2020, marketing and manipulation will have merged on the Internet, encouraging consumers to trade privacy for discounts. Copyright will be a "dead duck," virtual reality sanctuaries will provide an escape from cyberspace, and viciousness will prevail over civility.
These are some of the predictions offered by "experts" in "Future of the Internet III," a study released on Monday by the Pew Internet & American Life Project.

By 2020, the mobile phone will be the primary connection tool to the Internet, and it will be so integrated into our daily lives that it will be difficult to imagine what life was like without one, according to new research by the Pew Internet & American Life Project. In addition, 60 percent of the experts interviewed disagreed that content control through copyright-protection technology would dominate the Internet of 2012.
But the majority view appears to discount the popularity of the locked-down iPhone ecosystem. Given the extent to which Apple's competitors in the mobile arena have committed to copying the iTunes App Store model, it wouldn't be surprising if mobile customers traded freedom for the promise of phone security. That might keep copyright alive until nano-assemblers make it feasible to copy objects on an atomic level.


The Pew report expects continued blurring between work life and home life and between physical and virtual reality. Respondents were divided: 56% think that future is okay, while the rest expressed some reservations about the potential added stress of being at work all the time.
The study includes a number of quotations from those who submitted their thoughts on what's to come. Their observations make other dystopian visions of the future, as seen in the 1982 film Blade Runner, look almost rosy.
"We will enter a time of mutually assured humiliation; we all live in glass houses," said Jeff Jarvis, a blogger and professor at City University of New York Graduate School of Journalism.
"Viciousness will prevail over civility, fraternity, and tolerance as a general rule, despite the build-up of pockets or groups ruled by these virtues," said Alejandro Pisanty, ICANN and Internet Society leader and director of computer services at Universidad Nacional Autonoma de Mexico. "Software will be unable to stop deeper and more hard-hitting intrusions into intimacy and privacy, and these will continue to happen."
"By 2020, the Internet will have enabled the monitoring and manipulation of people by businesses and governments on a scale never before imaginable," said writer and blogger Nicholas Carr. "Most people will have happily traded their privacy -- consciously or unconsciously

BLADE Network Technologies CEO honored with Most Valuable Performers Award

BLADE Network Technologies' President and CEO, Vikram Mehta, honored with the technology industry's Most Valuable Performers Award.

BLADE Network Technologies, Inc. (BLADE), the trusted leader in data center networking, announced today that Network Product Guide, a world leading publication on technologies and solutions, has honored BLADE President and CEO Vikram Mehta with the information technology industry's 2008 Most Valuable Performers (MVP) recognition. This prestigious industry award recognizes senior executives from around the world with the essential characteristics of leaders that exhibit the qualities of most valuable performers.
Vikram Mehta has been at the helm of BLADE since its inception. Through his passionate commitment to customer service and product innovation, BLADE has become the trusted leader in data center networking, the industry's leading supplier of blade server switch solutions and a pioneering provider of the new breed of 10 Gigabit Ethernet data center switches. Prior to establishing BLADE as a privately held company in 2006, Mehta held leadership and executive positions at Nortel Networks, Alteon Web Systems, Ensim and HP. Mehta is an electrical engineer from the Birla Institute of Technology, Ranchi, India.
Network Products Guide also has named BLADE as a 2009 Hot Companies finalist. Selected from a global industry analysis of information technology vendors that included established large companies, mid-size and new start-ups, BLADE has advanced to the finalists stage based on the "4Ps" selection criteria -- namely Products, People, Performance, and Potential. The coveted 2009 Hot Companies award criterion encompasses companies in all areas of information technologies including security, wireless, storage, networking, software and communications.
Over half of Fortune 500 companies rely on BLADE's Ethernet switches to equip their essential data center infrastructures. BLADE has shipped 5 million Ethernet switch ports to more than 5,000 customers worldwide. Through its partnerships with HP, IBM, NEC and Verari Systems, BLADE has delivered more than 220,000 Gigabit and 10 Gigabit Ethernet switches to enterprise data centers to connect over 1.1 million servers. BLADE's market share of data center switches for blade servers now stands in excess of 48.5 percent combined on HP and IBM blade servers and 66 percent on NEC blade servers. To date, BLADE's market share and Ethernet port shipments on both IBM and HP platforms are more than 2x greater than the nearest competitor's.
"The new economy leaders are essentially those that are adapting best in the current economic environment and will emerge with higher standards," said Rake Narang, editor-in-chief, Network Products Guide. "We are proud to honor Vikram Mehta with this year's 2008 Most Valuable Performers award and recognize BLADE as a 2009 Hot Companies Finalist."
Network Products Guide 2008 MVP leaders have a clear vision and mission, have set measurable goals and objectives for themselves, are selfless and mentors to others, and most importantly demonstrate respect and trust for their staff, employees and the high-technology industry. Senior executives were honored from companies around the world which include Ingres Corporation, Cisco Systems, Inc., IBM, AppGate Network Security, Crossroads Systems, Lumeta Corporation, SECNAP Network Security Corp., Dyadem International Ltd., Permabit Technology Corporation, M-CAT Enterprises, Google, Inc., BLADE Network Technologies, CaseCentral, ONStor, SolarWinds, BlueCat Networks, Inc., Rohati Systems, Inc., VirtualPBX, IBRIX, LogMeIn, Inc., GTB Technologies, Inc., Kazeon, Riverbed Technologies, Protegrity, Everyone.net and Xiotech Corporation.
The 2009 Hot Companies winners will be announced and honored at the 2009 "World Executive Alliance Summit" in San Francisco on March 26-27, 2009. BLADE will be among other key industry players at this event. CEOs of finalists will be presenting their company's 4Ps criteria live to an audience of leading entrepreneurs, IT companies, venture capitalists, corporate strategists and media. To see the complete list of finalists please visit http://www.networkproductsguide.com/hotcompanies/
About Network Products Guide Awards
Network Products Guide, published from the heart of Silicon Valley, is a leading provider of products, technologies and vendor related research and analysis. You will discover a wealth of information and tools in this guide including the best products and services, roadmaps, industry directions, technology advancements and independent product evaluations that facilitate in making the most pertinent technology decisions impacting business and personal goals. The guide follows conscientious research methodologies developed and enhanced by industry experts. To learn more, visit www.networkproductsguide.com
About BLADE Network Technologies
BLADE Network Technologies is the leading supplier of Gigabit and 10G Ethernet network infrastructure solutions that reside in blade servers and "scale-out" server and storage racks. BLADE's new "virtual, cooler and easier" RackSwitch family demonstrates the promise of "Rackonomics" -- a revolutionary approach for scaling out data center networks to drive down total cost of ownership. The company's customers include half of the Fortune 500 across 26 industry segments, and an installed base of over 220,000 network switches representing more than 1,100,000 servers and over 5 million switch ports. For more information, visit www.bladenetwork.net.
BLADE Network Technologies and the BLADE logo are trademarks of BLADE Network Technologies. All other names or marks are property of their respective owners. CONTACTS:
Tim Shaughnessy
BLADE Network Technologies
(408) 850-8963
Email Contact
Zee Zaballos
ZNA Communications
(831) 425-1581 x201
Email Contact

Tuesday, December 16, 2008

Mine of 1,000 new species



The "dragon millipede" (pictured here) is one of more than 1,000 new species discovered around the Mekong River in Southeast Asia over the last 10 years. Scientists suggest the millipede uses its bright color to warn predators of its toxicity. According to a new report by conservation group World Wildlife Fund (WWF), between 1997 and 2007, at least 1,068 new species have been discovered in the Greater Mekong, at a rate of approximately two new species a week.



In all, roughly 25,000 species call the Mekong River basin home. On a species-per-mile basis, the region's waterways are richer in biodiversity than the Amazon, according to "First Contact in the Greater Mekong," a report released today by WWF International.
"This region is like what I read about as a child in the stories of Charles Darwin," Thomas Ziegler, curator at the Cologne Zoo in Germany, said in a news release. "It is a great feeling being in an unexplored area and to document its biodiversity for the first time ... both enigmatic and beautiful."
Nicole Frisina, communications officer for WWF's Greater Mekong Program, told me that "the rate of species discovery is quite prolific as you compare it with other areas of the world." The average works out to two new species every week - and if anything, the pace is accelerating.
From war to wonder
The Greater Mekong Program's director, Stuart Chapman, told me there are a couple of reasons for that quickening pace.

WWF map: The colored areas represent different parts of Southeast Asia's Greater Mekong region, draining into Vietnam's Mekong Delta.
First, the Greater Mekong region - which takes in areas of China's Yunnan Province as well as Cambodia, Laos, Myanmar, Thailand and Vietnam - includes some incredibly remote areas, such as the Annamite Mountains on the Lao-Vietnamese border.
Under the best of circumstances, traveling to these frontiers is difficult and expensive. And during the region's decades of conflict (including, of course, the Vietnam War and Cambodia's wars), scientific exploration was nearly unthinkable.
"In some regions, there haven't been a lot of scientific expeditions purely because there's been a lot of [unexploded] ordnance around," Chapman said.
That's all changing now: Many parts of Southeast Asia are undergoing intense economic development. Just to cite one example, more than 150 large hydroelectric dams are being planned in the region. And that raises a huge challenge for scientists scrambling to explore the Mekong's lost world.
The 'race against time'
"This poorly understood biodiversity is facing unprecedented pressure ... for scientists, this means that almost every field survey yields new diversity, but documenting it is a race against time," Raoul Bain, a biodiversity specialist from New York's American Museum of Natural History, said in today's news release.
Rising populations and greater economic development are putting wildlife habitat in danger. The World Conservation Union has already added 10 species from Vietnam to its extinction list, and another 900 species are considered threatened.
The WWF (formerly known as the World Wildlife Fund) issued today's report as part of its effort to preserve the region's biological riches even as the 320 million people living there reach for new economic riches. "You don't have to have people choose between the two," Chapman said. "You can have both, with careful planning."
The organization called on the region's six governments to work together on a conservation and management plan for 230,000 square miles (600,000 square kilometers) of transboundary and freshwater habitats. Chapman said the governments already have identified corridors of land in need of cross-border conservation.
However, he said, "having them identified on the map hasn't resulted in transboundary planning. ... That kind of thinking hasn't really taken hold yet."
Coming attractions
The biological riches could eventually yield new medicines and sustainable food sources for the region's needy populations - or perhaps new attractions for the world's eco-tourists. And for scientists at least, there are plenty of attractions out there, hiding in plain sight.

ITN's Chris Rogers reports on the Greater Mekong's biological riches.
For example, a new rat species was discovered as a delicacy in a Laotian food market - and scientists traced its evolutionary lineage back to a group of rodents that were thought to have gone totally extinct 11 million years ago. It turned out that the Laotian rock rat (listed as Kha-nyou on the menu) was the sole survivor of that ancient group.
Another previously unknown species of pit viper was first seen by scientists as it slithered through the rafters of a restaurant in Thailand's Khao Yai National Park.

Saturday, December 13, 2008

Intel urged to manufacture car batteries

Intel, the world-leading technology company, is being urged to introduce batteries for cars. Good news, and an alternative-fuel initiative as well: former chairman Andy Grove believes Intel, with its cash resources, can invest in battery technology and manufacturing to bring down the cost of car batteries, which would drive adoption of plug-in electric cars.
Intel is arguably the world's most important technology company. Andy Grove, the former Intel chairman, told Rebecca Smith and Don Clark that he has been urging the company to get into the business of making car batteries. Like so many in Silicon Valley, Grove is apparently an electric-car booster, and he has been evangelizing car batteries as a potential growth industry, one that he'd like to see Intel get in on the ground floor of.

In an interview with The Wall Street Journal published Friday, Grove said he is urging Intel to invest in battery manufacturing as a way to diversify from its core chip business.
Grove told the Journal that Intel's "strategic objective is tackling big problems and turning them into big businesses." He said Intel, with its cash resources, can invest in battery technology and manufacturing to bring down the cost of car batteries, which would drive adoption of plug-in electric cars.

Batteries are the most expensive component in plug-in electric vehicles, a market being pursued by a few U.S. companies.

General Motors' 2011 Volt is testing batteries from lithium-ion maker A123 Systems. Other U.S. companies include Ener1 and Valence Technology. Notebook battery maker Boston Power also intends to enter the auto market.

But battery makers and analysts say that U.S. manufacturers lack the financial means to meet the anticipated demand of electric cars.

"The technology exists today to put (electric drives) into an automobile," said Ener1 CEO Charles Gassenheimer at last week's Electric Drive Transportation Association's Conference & Exposition. "But it is not doable without the ability to drive down the cost of manufacturing."

Intel has invested in battery technology through its venture capital arm and other energy-related firms. Earlier this year, Intel also spun out SpectraWatt, which intends to lower the cost of manufacturing solar cells.

Grove has become an advocate for government policies that promote plug-in hybrid cars. This summer, he published a manifesto, called "Our Electric Future," in The American magazine, where he called for transitioning the American auto fleet to electricity for national security reasons.

"Because electricity is the stickiest form of energy, and because it is multi-sourced, it will give us the greatest degree of energy resilience. Our nation will be best served if we dedicate ourselves to increasing the amount of our energy that we use in the form of electricity," he wrote.

In a speech at the Plug-in 2008 conference in August, he called for a goal of putting 10 million plug-in vehicles on the road in 10 years.

Over here, or over there?
The WSJ piece leaves the reader with the impression that Grove might like to see Intel making batteries in the US. I'd love a transcript of the interview that underlies the piece, because it's not clear to me if this is the authors' takeaway, or if "let's make them in the US" was a point that Grove himself wanted to emphasize.

I bring this up because Intel doesn't actually make as many chips over here as they used to. Most of the company's sales are overseas (Asia is the biggest market), so that's where a large and growing percentage of its workforce is, as well. The company's pronounced shift in moving jobs abroad has been a sore spot for American Intel employees over the past decade, but I hear that, internally, the Intel top brass makes no bones about the fact that they have no qualms about moving the plants closer to the customers.

But regardless of where Grove wants to see these batteries made, Intel CEO Paul Otellini can't be too happy that his former chairman is exercised enough about this battery scheme that he's talking to the press about it and instigating a news cycle's worth of "should Intel make car batteries?" stories.

Wednesday, November 26, 2008

Robotic Surgery Efficiency vs. Craze Marketing



Advantages of Robotic Surgery
In today's operating rooms, you'll find two or three surgeons, an anesthesiologist and several nurses, all needed for even the simplest of surgeries. Most surgeries require nearly a dozen people in the room. As with all automation, surgical robots will eventually eliminate the need for some personnel. Taking a glimpse into the future, surgery may require only one surgeon, an anesthesiologist and one or two nurses. In this nearly empty operating room, the doctor sits at a computer console, either in or outside the operating room, using the surgical robot to accomplish what it once took a crowd of people to perform.


The use of a computer console to perform operations from a distance opens up the idea of telesurgery, which would involve a doctor performing delicate surgery miles away from the patient. If the doctor doesn't have to stand over the patient to perform the surgery, and can control the robotic arms from a computer station just a few feet away from the patient, the next step would be performing surgery from locations that are even farther away. If it were possible to use the computer console to move the robotic arms in real-time, then it would be possible for a doctor in California to operate on a patient in New York. A major obstacle in telesurgery has been latency -- the time delay between the doctor moving his or her hands to the robotic arms responding to those movements. Currently, the doctor must be in the room with the patient for robotic systems to react instantly to the doctor's hand movements.
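For a rough feel of the latency problem, here is a back-of-the-envelope sketch with assumed numbers (real telesurgery delay also includes video encoding, robot control loops, and network routing overhead):

# Assumed figures for a California-to-New York link.
fiber_path_km = 4800                    # assumed routing distance, longer than straight-line
light_speed_in_fiber_km_per_ms = 200    # light covers roughly 200 km per millisecond in glass fiber
processing_overhead_ms = 30             # assumed video encode/decode plus robot control latency

one_way_ms = fiber_path_km / light_speed_in_fiber_km_per_ms
round_trip_ms = 2 * one_way_ms + processing_overhead_ms

print(f"propagation alone: {2 * one_way_ms:.0f} ms round trip")
print(f"with processing:   {round_trip_ms:.0f} ms between a hand motion and seeing its result")
# Tens of milliseconds of lag is one reason today's consoles sit only a few feet from the patient.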


Is Robotic Surgery Better? Or Just Marketing?

Why the U.S. healthcare system (if you want to call it a system, which it isn't) is a mess is obvious. It's mostly because of bureaucratic, inefficient, denial-fixated health insurers—chop out the waste, and escalating costs will come back into line. Considering this albatross as well as various other handicaps, it's amazing that the quality of our healthcare is really good.

Myths, both. Administrative expenses are a relatively small driver of healthcare costs. And the quality of U.S. care not only fails in many respects to measure up to the care delivered in other countries but swings between extremes depending on where you live, the caregiver you see, and the hospital you use. Shannon Brownlee, a visiting scholar at the National Institutes of Health Clinical Center (and a former U.S. News colleague), and oncologist Ezekiel Emanuel, chairman of the center's bioethics department, busted those two myths and three other widespread misconceptions in a well-argued piece in Sunday's Washington Post that is well worth reading.

In their discussion of what is to blame for high and rising costs, they cite technology, among other things, meaning new drugs, new gizmos, new procedures. "Unfortunately," they write, "only a fraction of all that new stuff offers dramatically better outcomes."

That reminded me of a striking admission from Paul Levy, president of Beth Israel Deaconess Medical Center in Boston, who last Friday stated publicly on his blog that the hospital is buying a da Vinci surgical robot for marketing reasons. It costs well over $1 million, not counting its expensive annual care and feeding with new tools and software. All of the hospital's Boston competitors have the robot, and they are drawing referrals away from Beth Israel, which doesn't. "So there you have it," he wrote, his own sentiments clear. "It is an illustrative story of the healthcare system in which we operate."

I'm not sure this is a perfect allegorical example of a pricey technology purchased just because it is new and therefore represents a competitive advantage or, if a hospital doesn't have the technology, the loss of one. It is quite true that the da Vinci robot—which allows a surgeon sitting at a control station to manipulate tiny surgical tools and thus is no more of a "robot" than is a car being driven by a person—has not been shown, with the possible exception of a few specific procedures, to be clinically superior to conventional surgery.

But let's suppose an expensive gadget has been introduced that might be able to do one or more of the following: reduce deaths or complications (saving lives and money), get patients out of the hospital faster (saving money), and get patients back on their feet sooner (making them happier and reducing lost work time). Turning that hypothetical "might" into "yes, it can" or "no, it can't" requires that the gadget be put to use, doesn't it? How can a technology be evaluated without putting hands on, making comparisons with the usual ways, and so on?

Where I have a problem with the Beth Israel situation is that our system is very much driven by marketing. Referring physicians are clients, patients are customers, and every hospital competes for market share. That can mean feeling pressured to have the latest CT scanner or radio-beam therapy or surgical robot. If there aren't enough patients to keep a gadget in use enough to be profitable, get more by hyping the benefits (remember the temporary boom in whole-body scanning a few years ago) or luring patients from other hospitals. Does every hospital in Boston truly need a surgical robot system? Can't expensive technology be pooled?

RTR-4N™ Portable Digital X-Ray Inspection System


Overview
SAIC's RTR-4® X-ray imaging systems are fully portable and compact, designed to rapidly perform X-ray based inspections in the field. The RTR-4N™ configuration consists of a portable X-ray source, an integrated digital imager, and powerful notebook computer. It is used for both Explosive Ordnance Disposal and Non-Destructive Inspection applications.

RTR-4 systems are the only fully-digital portable X-ray systems with ground level imaging available to Explosive Ordnance Disposal (EOD) professionals, meeting the intended purpose of enhancing the safety margin for EOD technicians and innocent civilians. The RTR-4N imaging system with its optional integrated wireless feature is the world's most popular portable digital x-ray system, and provides the ability to quickly and efficiently search for weapons, drugs, and contraband in areas too difficult or time-consuming to search by hand.

Applications
The RTR-4N digital X-ray system is compact, rugged, and portable, which allows it to be useful in a number of scenarios. A few examples of RTR-4 applications include:

Improvised Explosive Device (IED) evaluation and disposal. Bomb technicians from a variety of law enforcement, military, and airport security organizations use RTR-4 systems to investigate suspicious packages for the presence of IEDs.
Unexploded Ordnance (UXO) disposal personnel employ the RTR-4N system with Large Area Imager to evaluate unexploded ordnance and determine fusing condition.
Mail and package evaluation in a mailroom scenario, as well as point-of-entry examination of personal belongings at special events.
Customs personnel utilize the RTR-4N system to x-ray and investigate private vehicles and other odd-shaped objects not appropriate for an x-ray baggage scanner.
Non-Destructive Evaluation/Testing/Inspection (NDE/NDT/NDI) for process control of component assembly, honeycomb aerospace structures and wood building structures.
Features
Portable Notebook Control Unit: The lightweight and powerful notebook computer possesses all the capabilities necessary to acquire and process images, enabling rapid threat assessment.
Powerful and Fast Processor: Notebook computer with Pentium® IV processor provides rapid processing of acquired data.
Large Display for Image Evaluation: The notebook computer display is large, with additional pixels to allow easy image evaluation and enhancement.
Image Analysis Software: Software includes full image analysis methods, such as smoothing, contrast stretch, subtraction, and embossing (a minimal sketch of one such method appears after this list).
High-Capacity Hard Disk, Increased Memory, Built-in CD/RW and USB Ports: Some of the many notebook features that increase the effectiveness and productivity of the user.
Single Case for Transport and Storage: All components are conveniently stored in one hardened foam-lined case for easy, safe, efficient transport and storage.
Wireless Capability: A new integrated wireless option provides a digital and encrypted wireless connection from the Control Unit to the Imager and X-Ray Source with no add-on boxes. The operator, as well as other personnel and property, remain a safe distance from the potentially dangerous item being evaluated.
Upgrades: Several available options allow easy upgrade from the RTR-4/ARS system to the RTR-4N Notebook computer based system, creating an all-in-one case design.
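As an example of the simplest of the image-analysis methods listed above, here is a minimal contrast stretch in Python. This is a generic illustration, not the RTR-4N's actual software:

import numpy as np

def contrast_stretch(image, low_percentile=2, high_percentile=98):
    # Linearly rescale pixel intensities so the chosen percentiles map to 0..255.
    lo, hi = np.percentile(image, [low_percentile, high_percentile])
    stretched = (image.astype(float) - lo) / max(hi - lo, 1e-9)
    return np.clip(stretched * 255, 0, 255).astype(np.uint8)

# Example on a synthetic low-contrast 8-bit frame standing in for an X-ray image.
frame = np.random.randint(100, 140, size=(480, 640), dtype=np.uint8)
print(frame.min(), frame.max())                                        # narrow range, about 100..139
print(contrast_stretch(frame).min(), contrast_stretch(frame).max())    # spread to roughly 0..255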
Benefits
The RTR-4N system is a small, lightweight, and durable portable x-ray imaging system that produces better image quality, less noise, and more contrast than typical analog systems. The RTR-4N system's digital transmission means no image degradation. The RTR-4N system operates without film— images are instantly displayed and can be saved in an industry-standard format.

Data is immediately available after the X-ray image is acquired and can be processed and reviewed as the inspection is being completed.

This product is available for purchase on the General Services Administration Law Enforcement and Security Equipment Contract GS-07F-0210J. Visit GSA Advantage for more information on purchasing.


More News On portable Device
A new portable X-ray system, which can generate instant images and which allows for state-of-the-art dentistry to occur anywhere, will be making the rounds at area schools this season, according to the Augusta Regional Dental Clinic.
A $2,500 gift from the Augusta Health Care (AHC) Community Health Foundation helped the regional clinic purchase the portable X-ray system. Donations from the Staunton/Augusta Rotary Club and the American Dental Association also contributed to the program, which required about $20,000 to launch.

The system will be traveling to elementary schools in Staunton, Waynesboro, and Augusta County over the next few months. Any student can be screened for oral health issues and given a dental assessment including free X-rays, sealants, and recommendations for follow-up treatment, said Margaret Hersh, executive director of the Augusta Regional Free Clinic, in a press release sent out by the AHC Community Health Foundation.

Parents at the school were given prior notice of the services available on site and asked if they wanted their child to participate, she explained.

Tuesday, November 25, 2008

34 nm Flash Chip by Intel & Micron


Intel and Micron have jointly started producing NAND flash memory chips using tiny 34-nanometer technology, the companies said Monday.
About NAND flash
NAND flash memory is used to store songs, movies and more in iPods, iPhones and a range of other consumer electronics goods.

The chip is built with a manufacturing process that enables the companies to shrink chip components in order to fit more memory in the same amount of space. The latest product fits 4 GB of memory on a core and eight cores on a layer, for 32 GB per layer and a total of 64 GB of memory on a two-layer stack within a package.
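
As a quick sanity check of those capacity figures, the arithmetic works out as follows (a trivial Python sketch; the variable names are just labels for the numbers quoted above):

gb_per_core = 4                                # 4 GB per flash core, as quoted
cores_per_layer = 8
layers = 2

gb_per_layer = gb_per_core * cores_per_layer   # 32 GB per layer
gb_per_package = gb_per_layer * layers         # 64 GB on the two-layer stack
print(gb_per_layer, gb_per_package)            # prints: 32 64
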

The technology fits into a standard 48-lead thin small-outline package (TSOP), which is a type of surface-mount integrated-circuit package found in MP3 players, mobile phones, and other devices where space is at a premium.
"The tiny 34-nm, 32-GB chip enables our customers to easily increase their NAND storage capacity for a number of consumer and computing products," Brian Shirley, VP of Micron's memory group, said in a statement.

The latest chips are manufactured on 300-mm wafers and are smaller than the size of a thumbnail. The memory is targeted at makers of digital cameras, personal music players, and digital camcorders. In addition, the new technology can be used to increase the storage capacity of solid-state drives, the companies said.


The nanometer measurement describes the size of the smallest transistors and other parts that can be manufactured on a single chip. There are about three to six atoms in a nanometer, depending on the type of atom, and there are a billion nanometers in a meter.

Chip makers such as Taiwan Semiconductor Manufacturing (TSMC) and Intel currently mass produce chips using technology as tiny as 40nm to 45nm. Generally, the more transistors on a chip and the closer they are together, the faster the chip can perform tasks.
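
A short numeric sketch of the scales mentioned above, using only the figures quoted in the article (the atom counts are the rough range given, not precise values):

nm_per_meter = 1_000_000_000      # a billion nanometers in a meter
new_node_nm = 34                  # the new Intel/Micron process
atoms_per_nm = (3, 6)             # rough range quoted above

print(new_node_nm / nm_per_meter)                 # 3.4e-08 meters
print([a * new_node_nm for a in atoms_per_nm])    # roughly 102 to 204 atoms across a 34 nm feature
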

Aside from performance, companies are working to make chips smaller and less expensive because people want ever-smaller, cheaper devices.

IM Flash is manufacturing 32 GB NAND chips the size of a thumbnail with its 34nm technology, and expects the chips to be used in small solid-state drives (SSDs) or flash memory cards aimed at products including digital cameras, digital camcorders and personal music players.

The 32 GB chips are multi-level cell (MLC) chips, which means they store more data per cell than the single-level cell (SLC) variety of NAND flash, at the cost of fewer write cycles.

Samsung Electronics, the world's largest NAND flash memory chip maker, is currently upgrading its chip factories to use 42nm technology and plans to start 30nm production next year.

The company showed off a multi-level cell 64 GB NAND flash memory chip made using 30nm manufacturing technology last year.

Tuesday, November 18, 2008

Large Hadron Collider repairs cost $21 Million


The world’s most ambitious scientific project, the Large Hadron Collider, will take over six months and $21 million to repair, after a faulty electrical connection between the accelerator’s magnets caused a helium leak on September 19 that forced the particle accelerator to be shut down.

The incident took everyone by surprise, and following a detailed investigation, the conclusion was less encouraging than CERN’s first assessments at the time. Not only will the particle accelerator not be repaired by the end of the year, but the machine apparently won’t be functional until at least June.

The investigation results, made public in late October, revealed that a fault occurred in the electrical bus connection in the region between a dipole and a quadrupole, which resulted in mechanical damage and release of helium from the magnet cold mass into the tunnel.

There was no damage to neighboring interconnections, but the investigators did find contamination by soot-like dust, which had propagated in the beam pipes over some distance, as well as damage to the multilayer insulation blankets of the cryostats.

CERN spokesman James Gillies said in an interview with the Associated Press that the Collider is estimated to be restarted by the end of June or later. “If we can do it sooner, all well and good. But I think we can do it realistically in early summer.”

Despite these difficulties, the inauguration of the Large Hadron Collider took place in Geneva on October 21. “The younger generations target their ambitions on what they experience while growing up,” said Torsten Akesson, president of CERN Council. “Science and technology need flagships that stand and catch the eye, excite fantasy and fuel curiosity. The LHC is one such flagship.”

The European Center for Nuclear Research (CERN) is the world’s largest particle physics laboratory. The LHC accelerator is capable of producing beams seven times more energetic than any other similar machine, and the beams are expected to reach their maximum intensity (30 times greater) by 2010, when the machine will reach maximum design performance.

Physicists around the world will conduct several experiments, hoping to understand more about our Universe and the principles of physics. The particle accelerator will be used to recreate the conditions after the Big Bang.

more...

Repairing the Large Hadron Collider (LHC) near Geneva will cost almost £14m ($21m), and the machine will "realistically" not be back up and running until at least next summer.

An electrical failure shut the £3.6bn ($6.6bn) machine down in September.

The European Organization for Nuclear Research (Cern) thought it would only be out of action until November but the damage was worse than expected.

It is hoped repairs will be completed by May or early June with the machine restarted at the end of June or later.

Cern spokesman James Gillies said: "If we can do it sooner, all well and good. But I think we can do it realistically (in) early summer."

Fundamental questions
The LHC was built to smash protons together at huge speeds, recreating conditions moments after the Big Bang, and scientists hope it will shed light on fundamental questions in physics.

The fault occurred just nine days after it was turned on with Cern blaming the shutdown on the failure of a single, badly soldered electrical connection in one of its super-cooled magnet sections.

The collider operates at temperatures colder than outer space for maximum efficiency and experts needed to gradually warm the damaged section to assess it.

"Now the sector is warm so they are able to go in and physically look at each of the interconnections," Mr Gillies told Associated Press.

The cost of the work will fall within Cern's existing budget.

Dr Lyn Evans, the Welsh-born project director, has called the collider "a discovery machine, the most sophisticated scientific instrument of our time."

Spiders in space aren't new.

Spiders have been spotted on the space station. These creatures are welcome guests, though one of them is missing at the moment.
Spiders in space aren't new. Two arachnids named Arabella and Anita flew to Skylab in 1973. Scientists were curious to see how the spiders would react in weightlessness – whether their webs would be different and how would they eat and sleep. Spiders in your house may send you scurrying for a shoe but spiders in space are almost hypnotic, as they struggle to weave a symmetric web in zero gravity.
Experiments with small creatures like these are an easy way for teachers across the country to get students involved in hands-on science, which is the goal of sending these two spiders, and some butterflies, into orbit on the latest shuttle mission to the space station.
Spider One, on view in his box with clear windows, was busy spinning a very tangled web, but Spider Two appears to be AWOL. Flight Director Ginger Kerrick says they are looking for him. "The way it was explained to me is that he is a backup spider and he has his own contained space on board and he had moved out of that area – we think he came out of his bedroom and is in the living room of his house."
Kirk Shireman, deputy shuttle program manager, says that while only one spider is visible, that doesn't mean the other is missing. "We don't believe he has escaped the payload – I am sure we will find him spinning a web somewhere in the next few days."
Astronaut Sandy Magnus, the newest member of the space station crew, was asked how the visible spider was doing.
Mission Control: "Is it weaving an organized looking web or is it something neat to see?"
Magnus replied: "The web is more or less three-dimensional and it looks like it is all over the inside of the box, more of a tangled, disorganized-looking web than a Charlotte's Web kind of web."
The spiders will return to Earth when the space shuttle Endeavour lands at the end of its 15-day mission later this month.

Monday, November 17, 2008

How nanowires made of silicon "nucleate"

Researchers from IBM and Purdue University have discovered that tiny structures called silicon nanowires might be ideal for manufacturing in future computers and consumer electronics.

The researchers used an instrument called a transmission electron microscope to watch how nanowires made of silicon "nucleate," or begin to form, before growing into wires, said Eric Stach, an assistant professor of materials engineering at Purdue University.
The work is based at IBM's Thomas J. Watson Research Center in Yorktown Heights, N.Y., and at Purdue's Birck Nanotechnology Center in the university's Discovery Park. The research is funded by the National Science Foundation through the NSF's Electronic and Photonic Materials Program in the Division of Materials Research.
The nucleation process can be likened to the beginning of ice forming in a pool of water placed in a freezer. The liquid undergoes a "phase transition," changing from the liquid to the solid phase.
"What's unusual about this work is that we are looking at these things on an extremely small scale," Stach said. "The three major findings are that you can see that the nucleation process on this small scale is highly repeatable, that you can measure and predict when it's going to occur, and that those two facts together give you a sense that you could confidently design systems to manufacture these nanowires for electronics."
It was the first time researchers had made such precise measurements of the nucleation process in nanowires, he said.
Findings will be detailed in a research paper appearing Friday (Nov. 14) in the journal Science. The paper was written by Purdue doctoral student Bong Joong Kim, Stach and IBM materials scientists Frances Ross, Jerry Tersoff, Suneel Kodambaka and Mark Reuter from the physical sciences department at the Watson Research Center.
The silicon nanowires begin forming from tiny gold nanoparticles ranging in size from 10 to 40 nanometers, or billionths of a meter. By comparison, a human red blood cell is more than 100 times larger than the gold particles.
The gold particles are placed in the microscope's vacuum chamber and then exposed to a gas containing silicon, and the particles act as a catalyst to liberate silicon from the gas to form into solid wires. The particles are heated to about 600 degrees Celsius, or more than 1,100 degrees Fahrenheit, causing them to melt as they fill with silicon from the gas. With increasing exposure, the liquid gold eventually contains too much silicon and is said to become "supersaturated," and the silicon precipitates as a solid, causing the nanowire to begin forming.
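
The supersaturation picture described above can be caricatured with a toy model: the gold droplet takes up silicon from the gas at some steady rate, and a wire nucleates once the silicon content crosses a solubility threshold. All numbers in this Python sketch are made up for illustration; only the qualitative mechanism comes from the article.

uptake_rate = 0.002        # hypothetical silicon fraction absorbed per second of gas exposure
solubility_limit = 0.30    # hypothetical fraction above which silicon precipitates out

si_fraction = 0.0
seconds = 0
while si_fraction <= solubility_limit:
    si_fraction += uptake_rate
    seconds += 1

print("nucleation after about", seconds, "seconds in this toy model")
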
"We found that there is a single nucleation event in each little droplet and that all of the nucleation events occur in a very controllable fashion," Stach said. "The implication is that if you are trying to create electronic devices based on these technologies, you could actually predict when things are going to start their crystal growth process. You can see that it's going to happen the same way every time, and thus that there is some potential for doing things in a repeatable fashion in electronics manufacturing."
Although the researchers studied silicon, the same findings could be applied to manufacturing nanowires made of other semiconducting materials. The electron microscope is the only instrument capable of observing the nanowire nucleation process, which would have to be a thousand times larger to be seen with a light microscope, Stach said.
Nanowires might enable engineers to solve a problem threatening to derail the electronics industry. New technologies will be needed for industry to keep pace with Moore's law, an unofficial rule stating that the number of transistors on a computer chip doubles about every 18 months, resulting in rapid progress in computers and telecommunications. Doubling the number of devices that can fit on a computer chip translates into a similar increase in performance. However, it is becoming increasingly difficult to continue shrinking electronic devices made of conventional silicon-based semiconductors.
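
Moore's law as stated above is just repeated doubling, which the following small Python sketch makes explicit for an 18-month doubling period (the starting count of one billion transistors is a hypothetical example, not a figure from the article):

def transistor_count(start, years, doubling_months=18):
    # Project a transistor count forward under periodic doubling.
    doublings = years * 12 / doubling_months
    return start * 2 ** doublings

for years in (1.5, 3.0, 6.0):
    print(years, "years:", transistor_count(1e9, years))
# 1.5 years: about 2e9, 3.0 years: about 4e9, 6.0 years: about 1.6e10
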
"In something like five to, at most, 10 years, silicon transistor dimensions will have been scaled to their limit," Stach said.
Transistors made of nanowires represent one potential way to continue the tradition of Moore's law.
"Nanowires of silicon and things like gallium arsenide, gallium nitride or indium arsenide, or other types of exotic semiconductors, are being investigated as a step toward continuing to scale electronics down," Stach said. "If you want to manufacture devices made of nanowires, make them the same way every time on a 12-inch wafer, then you need to understand the basic physics of how to start their growth, the kinetics of their continued growth, how to quantify that, how to understand it. We are looking at all steps in nucleation."
One challenge to using nanowires in electronics will be replacing gold as a catalyst with other metals that are better suited for the electronics industry, Stach said.
The gold particles are created inside the microscope chamber, but future research may use gold nanoparticles manufactured to more uniform standards using a different technology.
The research was conducted using an IBM microscope. The researchers also are extending the observations using a transmission electron microscope at the Birck Nanotechnology Center to look at smaller nanoparticles.

New action to be taken on alternatives to coal fuel

An environmental review board has shot down the EPA approval of a new coal plant, stating that the Environmental Protection Agency needs to come up with nationwide standards for dealing with carbon dioxide. The decision will lead to lengthier permitting and stricter rules, making investment in expensive coal plants substantially riskier.

Therefore, the money will go into alternative energy, like solar or wind energy. The Environmental Appeals Board’s decision to send the coal plant plans back to the EPA with instructions to come up with standards was not exactly a legal victory, but the result is practically the same. Basically, the agency’s regional office has to at least consider whether to regulate carbon dioxide emissions, before it gives a green light to build the plant located in Utah.

The decision — which responded to a Sierra Club petition to review an E.P.A. permit granted to a coal plant in Utah — does not require the E.P.A. to limit carbon dioxide emissions from power plants, something which environmentalists have long sought.

Rather, it requires the agency’s regional office to at least consider whether to regulate carbon dioxide emissions, before the agency gives a green light to build the Utah plant. On a broader scale, it will delay the building of coal-fired power plants across the country, long enough for the Obama administration to determine its policy on coal, according to David Bookbinder, chief climate counsel for the Sierra Club.

“They’re sending this permit — and effectively sending every other permit — back to square one,” he said, adding, “It’s minimum a one to two year delay for every proposed coal-fired power plant in the United States.”


The decision references the landmark Massachusetts v. E.P.A. decision last year that declared carbon dioxide a pollutant under the Clean Air Act. That ruling, however, has not yet prompted the E.P.A. to act to regulate it.

It is the latest setback for coal plants, which emit far more carbon dioxide than natural gas or other power plants. Last year Kansas state regulators denied a permit to a coal plant on the grounds of its carbon dioxide emissions.

“Although a new administration could always have reversed course, this makes it easier by providing the first prod,” said Jody Freeman, director of the environmental law program at Harvard Law School. “And it’s a heads-up to the coal industry that stationary-source regulation of CO2 is coming.”

The coal industry put its best face on the decision. The ruling “merely says what the court has said — that the E.P.A. has the authority to regulate greenhouse gases under the Clean Air Act,” said Carol Raulston, a spokeswoman for the National Mining Association, an industry group.

However, she said, before rulemaking occurs, the E.P.A. has to make an “endangerment” finding, which has not yet been done. An “endangerment” finding would involve the E.P.A. declaring that carbon dioxide is a danger to public welfare, and would lead to regulation.

“We still believe, as do many in Congress, that the Clean Air Act is not very well structured to regulate greenhouse gases, and that Congress ought to address this through legislation,” added Ms. Raulston.

Ms. Freeman said that this week’s decision was part of a larger debate going forward “over whether and how the Clean Air Act might be used to regulate greenhouse gases while we wait for new climate legislation.

“E.P.A. has the authority to impose limits on CO2 coming from sources like power plants through the normal permit process,” she continued. “And we may see this happen in the new administration.”

Friday, November 7, 2008

Computers at the headquarters of the Barack Obama and John McCain campaigns were hacked during the campaign

Obama, McCain campaigns' computers hacked for policy data
A source said the computers were hacked in mid-summer by either a foreign government or organization.
Another source, a law enforcement official familiar with the investigation, says federal investigators approached both campaigns with information the U.S. government had about the hacking, and the campaigns then hired private companies to mitigate the problem.
U.S. authorities, according to one of the sources, believe they know who the foreign entity responsible for the hacking is, but refused to identify it in any way, including what country.
The source, confirming the attacks that were first reported by Newsweek, said the sophisticated intrusions appeared aimed at gaining information about the evolution of policy positions in order to gain leverage in future dealings with whomever was elected.
The FBI is investigating, one of the sources confirmed to CNN. The FBI and Secret Service refused comment on the incidents.
The sources refused to speak on the record due to the ongoing investigation and also because it is a sensitive matter involving presidential politics.
As described by a Newsweek reporter with special access while working on a post-campaign special, workers in Obama's headquarters first detected what they thought was a computer virus that was trying to obtain users' personal information.
The next day, agents from the FBI and Secret Service came to the office and said, "You have a problem way bigger than what you understand ... you have been compromised, and a serious amount of files have been loaded off your system."


Some computers are too important to be networked

There is a common defensive computing thread in two recent stories.
In the first story, Newsweek reports that both presidential candidates had their campaign computers hacked from afar. As they put it:
The computer systems of both the Obama and McCain campaigns were victims of a sophisticated cyberattack by an unknown "foreign entity," prompting a federal investigation. Both the FBI and the Secret Service came to the campaign with an ominous warning: "You have a problem way bigger than what you understand," an agent told Obama's team. "You have been compromised, and a serious amount of files have been loaded off your system." ... Officials at the FBI and the White House told the Obama campaign that they believed a foreign entity or organization sought to gather information...
The second story involves a former Intel employee who allegedly stole trade secrets. As CNET's Stephanie Condon writes, the employee resigned, yet continued on the Intel payroll for a few weeks (perhaps working off vacation time). During this transition period, he started working for Intel rival AMD, yet he remained in possession of his Intel laptop and still had access to Intel's computer network. The FBI later found him in possession of "top secret" Intel files worth more than $1 billion in research and development costs.
The lesson is clear. If you have really valuable or sensitive files, don't make them remotely accessible. Cut the wire. Some files should never be available off-site.
If this means buying a new computer just to hold really sensitive files, it's money well spent.
A couple years ago, I heard someone from the hacker group 2600 give out this same advice on their radio show, Off The Hook. It made sense back then and makes even more sense now.
Windows passwords are easily hacked. Instead of relying on a Windows password for local physical security, set both a power-on password and, if the computer supports it, a hard disk password. Whole disk encryption is another option, but one that involves much more work to implement.
If you put sensitive files on a laptop computer, then consider storing it in a safe when not in use. If you have a small safe, get a small laptop or a Netbook.
Laptops need more than just cutting the Ethernet wire. To begin with, turn off the Wi-Fi radio (there is probably a switch or a function key for this). If the laptop has Bluetooth, physically turn that off too.
Then, turn off the networking features in the operating system.
On Windows, turn off file sharing for every network adapter and turn off every network protocol. Then, disable all the network adapters.
Finally, disable the underlying Windows services that handle networking. On Windows XP these would be: Wireless Zero Configuration, Server, Computer Browser, Workstation and SSDP Discovery. Then, since the machine will be off-line forever, there are quite a few other Windows XP services that won't be needed and can be disabled: Automatic Updates, Distributed Link Tracking Client, Distributed Transaction Coordinator, Net Logon, NetMeeting Remote Desktop Sharing, Network DDE, Network DDE DSDM, Network Location Awareness (NLA), Network Provisioning Service, Remote Desktop Help Session Manager, Remote Registry and WebClient. The laptop I'm writing this on also has an Infrared Monitor service. I don't know what it's for, but I keep it disabled.
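
As one possible way to script the service shutdown described above, the sketch below drives Windows' standard "sc config" command from Python. The service key names are the usual Windows XP names for the display names listed in the paragraph, but treat them as assumptions and verify them on your own machine first (for example with "sc query state= all"); this must be run from an administrator account.

import subprocess

# Display name -> usual Windows XP service key name (verify before running).
SERVICES = {
    "Wireless Zero Configuration": "WZCSVC",
    "Server": "lanmanserver",
    "Computer Browser": "Browser",
    "Workstation": "lanmanworkstation",
    "SSDP Discovery": "SSDPSRV",
}

for display, key in SERVICES.items():
    print("Disabling", display, "(" + key + ")")
    # "sc config <key> start= disabled" keeps the service from starting at boot;
    # note that sc's syntax requires the space after "start=".
    subprocess.call(["sc", "config", key, "start=", "disabled"])
    # Also stop it now, in case it is currently running.
    subprocess.call(["sc", "stop", key])
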
All told, this isn't much work and doesn't involve much expense. Yet, it's great insurance and can leave your sensitive files better defended than those at Intel and each presidential campaign.

more...

Foreign hackers infiltrated the networks of John McCain and Barack Obama during the US presidential campaign.

CNN and Newsweek cited sources within both camps as reporting that hackers from an undisclosed foreign location targeted each network over the summer in an attempt to acquire information.
The report did not specify which group or nation was responsible for the attacks, but the target appears to be documents outlining the candidates' policy proposals.
The information would reportedly have been used in future policy negotiations with the winning candidate.
Following the attacks both camps reportedly hired outside consultants to seal up any security flaws, and the FBI and Secret Service are both said to be investigating the incidents.
Hacking for political reasons has emerged in recent years as a companion to traditional espionage. In 2007, Chinese government officials were accused of hacking government sites in the US, France, Germany and the UK.
Russian nationalists have also been thought to use cyber-attacks to supplement their political efforts. In the midst of conflicts with Estonia and Georgia, Russian hackers were said to be masterminding attacks on government and social infrastructure sites.

Thursday, October 30, 2008

Intel in China, ice-powered air conditioners

Even with turmoil in the financial markets, venture capital is still flowing to energy-tech ventures.
Here are the latest such investments:
• Intel Capital has made its first clean-tech investment in China, the company said Tuesday.

The venture-capital arm of the chip giant put $20 million into Trony Solar Holdings, a Chinese solar thin-film cell developer. It also invested an undisclosed sum in NP Holdings, which makes large-scale energy storage systems for renewable energy and energy efficiency.
Intel Capital set up a $500 million fund for tech deals in China earlier this year, according to Reuters.
"We think innovation is the way to help companies out of this financial crisis," Cadol Cheung, head of Intel Capital in Asia Pacific told reporters Tuesday. "We have no plan of slowing down our investment pace."
• Ice Energy said Tuesday it has raised $33 million in a second round of funding. The round, led by Energy Capital Partners, also provides up to $150 million in project development financing.
Ice Energy makes rooftop air conditioners that use ice to help lower the cost of operating them.

During off-peak hours, such as the middle of the night, the machines freeze water. During the day, the ice cools the refrigerant to run the air conditioner, cutting down on the electricity it would otherwise need.
The ice storage can shift the demand to off-peak times by as much as 40 percent, according to the company. For that reason, the company is marketing its products to utilities looking for ways to reduce peak demand to avoid construction of new power plants.
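
A rough back-of-the-envelope sketch of that peak-shifting idea: part of the cooling load is moved off-peak by making ice overnight. Only the "up to 40 percent" figure comes from the article; the load number in this Python snippet is hypothetical.

peak_cooling_kw = 10.0     # hypothetical daytime electrical demand of one rooftop unit
shift_fraction = 0.40      # up to 40% shifted off-peak, per the company

on_peak_kw = peak_cooling_kw * (1 - shift_fraction)   # 6.0 kW still drawn during the day
off_peak_kw = peak_cooling_kw * shift_fraction        # 4.0 kW of cooling made as ice at night
print(on_peak_kw, off_peak_kw)
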
• Blue Source said Monday that Goldman Sachs will take an equity stake in the company and finance carbon offset projects.
Blue Source identifies and runs projects that reduce greenhouse gases, such as methane capture at landfills, and carbon capture and storage at oil wells.
Goldman Sachs will market and trade the offsets from Blue Source projects in carbon emissions trading markets, according to the companies.
• General Electric said last week it is investing $30 million in lithium-ion battery maker A123 Systems, part of a planned $102 million series E round.
GE is now the largest investor in the company with a 9 percent stake after having put in $55 million. The two companies are working on various projects, including integrating A123 Systems' batteries in the Think all-electric town car and a hybrid bus platform.
