
Thursday, October 4, 2007

Astrological aspects for spiritual growth




Recommended action around the exact moment: a half-hour meditation, or at least a short break from other activities, in order to become conscious of the positive, spiritual influences of the ongoing astrological aspects. All times are GMT (Universal Time). Please adjust the hour to fit your timezone.


October 14, 2006, 02:31 am GMT
Venus trine Neptune - This aspect refines all feelings and awakens overwhelming romance in one's soul. It energizes artistic inspiration and talent, intensifies subtle sensitivity, intuition and sensuality, and eases the reaching of a spiritual state of deep emotional understanding. It also intensifies the state of unconditional giving and the aspiration toward sublime ideals, and eases the attainment of a state of devotion.


October 15, 2006, 12:20 pm GMT
Mars sextile Pluto - This aspect awakens the power to transmute one's creative potential. It allows the spontaneous raising of energies resulting from a harmonious, elevated passion, energizes the courage to undertake positive activities of self-improvement, and intensifies spiritual transformation through the conscious choice to orient one's will toward a state of abandoning oneself before the Will of God. The aspect eases the overcoming of selfishness and also represents a chance to improve self-control.


October 16, 2006, 07:29 am GMT
Sun sextile Saturn - This aspect's influence allows accelerated spiritual growth and an awareness of the spiritual priorities in one's life, together with the chance to become detached from useless activities and bad habits. It amplifies self-worth and self-respect, noble feelings and spiritual aspiration. The aspect eases the overcoming of the ego and the experience of new, deeply spiritual feelings and states of consciousness.


October 18, 2006, 02:17 am GMT
Sun sextile Pluto - This aspect awakens the will, determines a state of intense spiritual yearning, offers a superior understanding of esoteric teachings and gives greater confidence in one's own strength. It generates a positive, deeply transforming force and eases the intuition and revelation of one's spiritual purpose in life. The aspect facilitates abandoning oneself before the omnipotent Will of God.


October 18, 2006, 07:54 pm GMT
Venus sextile Saturn - This aspect offers a state of spiritual maturity in the area of feelings and romance, along with emotional responsibility. It intensifies the manifestation of feelings, leads to overcoming superficiality, eases control over the inferior passions, and offers greater self-control of emotions and fidelity in relationships. The aspect allows one to perceive inner beauty and generates balance between the need to be extroverted and the need to be introverted, a state of harmony between the material world and the spiritual world.


October 20, 2006, 02:41 am GMT
Venus sextile Pluto - This aspect awakens and intensifies refined sensitivity and artistic emotion, energizes the creative imagination and determines elevated feelings. It spiritualizes love, intensifies the creative forces and refined erotic feelings, and eases the ascension and sublimation of subtle sexual energies into artistic creativity. The aspect awakens spiritual yearning, offers mystical emotions and states of consciousness, and helps one reach the great ideals of humankind.






Yahoo, PayPal, eBay acting for Authentication Technology


Yahoo Inc., eBay Inc. and PayPal have joined forces to protect customers against fraudulent e-mails and phishing attacks with the implementation of new authentication technology.


From today, eBay and PayPal customers worldwide using Yahoo! Mail and Yahoo!7 Mail in Australia can expect fewer fake e-mails claiming to be sent by eBay and PayPal. Yahoo! Mail is the first Web mail service to block these types of malicious messages for eBay and PayPal through the use of DomainKeys e-mail authentication technology, which is already available in Yahoo!7 Mail.


The technology upgrade will be rolled out globally over the next few weeks to users of the service.


PayPal CISO, Michael Barrett, said the adoption of digital e-mail signing technology and the aggressive collaborative stance being taken by all three companies is a significant step forward in the fight to protect customers against e-mail based crimes. "While the battle against phishing and identity theft scammers will continue to require a multi-faceted approach, today's announcement demonstrates the power of DomainKeys and the security benefits to be gained by e-mail users worldwide," Barrett said.


PayPal operates more than 153 million accounts in 190 markets and in 17 currencies around the globe.


Both eBay and PayPal are early adopters of DomainKeys technology. It provides a unique way to verify the authenticity of e-mail messages, allowing Internet Service Providers (ISPs) to determine if messages should be delivered to a customer's inbox.
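
At its core, DomainKeys is public-key signing of outbound mail: the sending domain signs each message with a private key and publishes the matching public key in DNS, so a receiving provider such as Yahoo! Mail can check that a message claiming to come from eBay or PayPal was really sent by them. The short Python sketch below illustrates only that sign-and-verify principle; it is not the DomainKeys header format, it relies on the third-party "cryptography" package, and the message text and key handling are purely illustrative assumptions.

# Conceptual sketch of the public-key idea behind DomainKeys (not the actual
# e-mail header format). Uses the third-party "cryptography" package; the
# message text and key management here are illustrative assumptions.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The sending domain's key pair; the public half would be published in DNS.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"From: billing@example.com\r\nSubject: Receipt\r\n\r\nThank you for your payment."

# Sender: sign the canonicalized message before it leaves the domain.
signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# Receiver (e.g. a Web mail provider): verify before trusting the From: domain.
try:
    public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
    print("signature valid: deliver to the inbox")
except InvalidSignature:
    print("signature invalid or forged: treat the message as suspect")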


The collaborative effort between Yahoo!7, eBay and PayPal will block unauthenticated e-mail, reducing the volume of counterfeit e-mail and lowering the risk of unauthorized account activity and identity theft due to phishing attacks. eBay CISO, Dave Cullinane, said through cooperation the industry can collectively stamp out phishing and other e-mail scams.



Why 'authentication' technology can help your business


Today, more than ever, protecting one's electronic identity has become a top priority.


And well it should. As identity thieves and computer hackers devise increasingly sophisticated methods to steal confidential data, computer users must take greater precautions to protect their vital information.


With so much at stake, more than antivirus software and system patches are needed. Authentication technologies - and you - can help. As a financial advisor to small businesses, you have a unique opportunity to educate clients about available technologies that can validate the security and integrity of their confidential information. You may also want to implement these technologies at your office to better protect your clients' information, too.


While authentication technology has been around since the early days of computing, increased awareness due to more information security threats and greater affordability are pushing the use of these technologies to the forefront. In simplest terms, authentication technologies help to ensure an individual is who they claim to be. They "authenticate" or validate an individual's identity and control access to resources in three broad categories: something you know, something you have and something you are.





Something you know: passwords


Passwords are the least expensive and most common type of authentication technology. Based on "something you know," passwords require a user to remember a string of characters and enter this information to gain access to a desired resource. Unfortunately, passwords are also one of the weakest forms of authentication technology, most often because of the users themselves. Passwords that are shared, left blank, unchanged for long periods of time, reused across multiple accounts or overly simplistic leave the user at risk to even the most novice identity thief or simple hacking tool. Ultimately, passwords should continue to play a role in user authentication, but should be used in conjunction with other technologies for adequate security.
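
On the server side, the standard complement to these habits is to store only salted, slowly derived hashes of passwords and to compare them in constant time, so a stolen password file is far less useful to an attacker. The sketch below is a minimal Python illustration using only the standard library; the iteration count and salt length are illustrative assumptions, not recommendations from the article.

# Minimal salted password hashing sketch using only Python's standard library.
# The iteration count and salt size are illustrative, not prescriptive.
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, derived_key); only these values are stored, never the password."""
    salt = salt or os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 200_000)
    return salt, key

def verify_password(password, salt, stored_key):
    """Re-derive the key from the supplied password and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored_key)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("guess123", salt, stored))                      # False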







The New History of Black Holes: 'Co-evolution' Dramatically Alters Dark Reputation




Black holes suffer a bad rap. Indicted by the press as gravity monsters, labeled highly secretive by astronomers, and long considered in theoretical circles as mere endpoints of cosmic evolution, these unseen objects are depicted as mysterious drains of destruction and death.


So it may seem odd to reconsider them as indispensable forces of creation.


Yet this is the bright new picture of black holes and their role in the evolution of the universe. Interviews with more than a half dozen experts presently involved in rewriting the slippery history of these elusive objects reveal black holes as galactic sculptors.


In this revised view, which still contains some highly debated facts, fuzzy paragraphs and sketchy initial chapters, black holes are shown to be fundamental forces in the development and ultimate shapes of galaxies and the distribution of stars in them. The new history also shows that a black hole is almost surely a product of the galaxy in which it resides. Neither, it seems, does much without the other.


The emerging theory has a nifty, Darwinist buzzword: co-evolution.


As a thought exercise, co-evolution has been around for less than a decade, or as much as 30 years, depending on who you ask. Many theorists never took it seriously, and no one had much evidence to support it. Only in the past six years or so has it gained steam. And only during the past three years have observations provided rock-solid support and turned co-evolution into the mainstream idea among the cognoscenti in both black hole development and galaxy formation.


"The emerging picture of co-evolving black holes and galaxies has turned our view of black holes on its head," says Meg Urry, an astronomer and professor of physics at Yale University. "Previously, black holes were seen as the endpoints of evolution, the final resting state of most or all of the matter in the universe. Now we believe black holes also play a critical role in the birth of galaxies."


The idea is particularly pertinent to explaining how massive galaxies developed in the first billion years of the universe. And it is so new that just last week theorists got what may be the first direct evidence that galaxies actually did form around the earliest black holes.


Chicken-and-egg question


Like archeologists, astronomers spend most of their careers looking back. They like to gather photons that have been traveling across time and space since well before Earth was born, some 4.5 billion years ago. Rogier Windhorst, an Arizona State University astronomer, has peered just about as deep into the past as anyone, to an era when the universe was roughly 5 percent of its present age.


Earlier this month, Windhorst and a colleague, Haojing Yan, released a Hubble Space Telescope image showing the most distant "normal" galaxies ever observed.


Though stretched and distorted by the technique used to spot them (an intervening galaxy cluster was used as a "gravitational lens"), the newfound galaxies, Windhorst's team assures us, resemble our own Milky Way. They are seen as they existed more than 13 billion years ago, within 1 billion years of the Big Bang.


Practically side-by-side in time, discovered in separate observations made as part of the Sloan Digital Sky Survey, are compact but bright objects known as quasars. These galaxies-to-be shine brilliantly because, researchers believe, each has a gargantuan black hole at its core, whose mass is equal to a billion suns or more, all packed into a region perhaps smaller than our solar system.


The resulting gravity pulls in nearby gas. The material is accelerated to nearly the speed of light, superheated, and swallowed. The process is not entirely efficient, and there is a byproduct: An enormous amount of energy -- radio waves, X-rays and regular light -- hyper-illuminates the whole scene.


Quasars also seem to be surrounded by halos of dark matter, a cryptic and unseen component of all galaxies. Co-existing around and amongst all this, researchers are coming to realize, is a collapsing region of stars and gas as big as or larger than our galaxy.


It was no coincidence that the announcements of the two findings -- distant quasars and normal galaxies -- were made together at a meeting of the American Astronomical Society (AAS) Jan. 9. Co-evolution was on the minds of the discoverers.


Among co-evolution's significant impacts is its ability to render mostly moot a longstanding chicken-and-egg question in astronomy: Which came first, the galaxy or the black hole?


"How about both?" Windhorst asks. "You could actually have the galaxy form simultaneously around a growing black hole."


Urry, who was not involved in either finding but was asked to analyze them, explained it this way: "We believe that galaxies and quasars are very intimately connected, that in fact quasars are a phase of galaxy evolution. In our current picture, as every galaxy forms and collapses, it has a brief quasar phase."


So when a quasar goes dormant, what's left are the things we associate with a normal galaxy -- stars and gas swirling around a central and hidden pit of matter.


Quasars are cagey characters, however. (The term is short for quasi-stellar radio source; astronomers first mistook the objects for stars within our galaxy in the early 1960s.) When one is firing, its brightness can exceed a thousand normal galaxies. The quasar outshines its entire host galaxy so significantly that scientists have not been able to see what's really causing all the commotion. That veil is lifting as you read this, however, as telescopic vision extends ever backward in time and data is fed into powerful new computer models.


Evolving idea


Demonstrations of co-evolution began to emerge in the mid-1990s when researchers found hints that the existence of a significant black hole at the center of a galaxy was related to the galaxy's shape, says Martin Haehnelt of the University of Cambridge. Only galaxies with a spherical bulge-like component appear to accommodate supermassive black holes.


Our Milky Way, if it could be viewed edge on, would display a good example of one of these galactic bulges: Imagine the profile of a stereotypical flying saucer, though with a wider and flatter disk. The Milky Way is smaller than many galaxies, however, and it has a correspondingly less massive black hole -- roughly 2.6 million suns worth. It almost surely once had a quasar phase, astronomers say.


At any rate, in the mid-1990s no one knew for sure how prevalent black holes were. Theory and some observational data pointed to the likelihood that they were ubiquitous.


Then, in the year 2000, astronomers found solid evidence that black holes lurk deep inside many and probably all galaxies that have the classic central bulge of stars. Further, an analysis showed a direct correlation between the mass in each black hole and the shape and scope of the bulge and the overall size of the galaxy.


At an AAS meeting in June of 2000, John Kormendy of the University of Texas at Austin presented evidence for 10 mammoth black holes whose masses were related to their galactic bulges. Kormendy worked on a large team of researchers led by University of Michigan astronomer Douglas Richstone. This, along with other studies in surrounding months by other teams, served as a collective turning point for co-evolution, several researchers now say, advancing it to a stable quantitative footing.


"Subsequently the idea of the co-evolution of galaxies and supermassive black holes became more widely discussed and accepted," Haehnelt says.


Evidence continues to mount. In 2001, two separate teams showed that many smaller galaxies that don't have bulges also do not seem to contain significant black holes.


Over the past six months or so, other important studies have emerged, providing independent confirmation to some of the initial work. Haehnelt: "It becomes more and more clear that supermassive black holes can significantly change the structure and evolution of galaxies."


The first large-scale scientific meeting devoted to co-evolution -- a sure sign of a theory coming into its own -- was held just three months ago, sponsored by the prestigious Carnegie Observatories.


There are many variations on the basic theory of co-evolution. Each version attempts to explain a vexing fact: In the blink of a cosmic eye -- just a half a billion years -- invisible spheres of matter were born, and several gained the mass of a billion or more suns and were driving the shape and texture of swirling agglomerations of newborn stars.


Co-evolution is not a done deal. Perhaps, some have suggested, a huge black hole simply collapses out of a pre-galactic cloud and serves as a ready-made engine to drive further galaxy development. Even staunch supporters of co-evolution say there are still viable theories, not yet refutable, putting the immense black hole in place first, and others that have the galaxy solely responsible for driving the formation of a black hole.


If black holes did grow incrementally, it is unclear whether cooperative construction reigned from the beginning, or if it kicked in after some certain amount of mass was gathered.


"I think it is still unclear whether black holes play any role in the formation of the first galaxies," said Cambridge's Sir Martin Rees, who has collaborated with Haehnelt and who long ago authored some of the first scientific papers on the question.


"Indeed," Sir Martin says, "there is a lot of debate about whether black holes can form in very small galaxies, and whether there is a link between the 'small' holes that form as the endpoint of the evolution of massive stars and the holes of above a million solar masses that exist in the centers of galaxies."



Another dark matter


Infusing itself into the equation is an utter unknown: dark matter. This as-yet-undetected stuff permeates all galaxies, researchers believe. A halo of it surrounds our Milky Way. Dark matter does not interact with light, but it does possess great gravitational prowess, acting as invisible glue to help hold galaxies together.


Dark matter is taken into account in the leading co-evolution models, but only in a general, overall sense. Some researchers, however, think dark matter, more than a black hole, is clearly connected to a galaxy's birth and development.


Just last week, the first possible direct evidence was announced for dark matter halos around early quasars. The finding, by Rennan Barkana of Tel Aviv University and Harvard astronomer Abraham Loeb, appears to be the first glimpse at the anatomy of the most distant quasars. Importantly, it supports the fundamental ideas of co-evolution, Loeb said. But it also makes it clear that dark matter will not be denied a chapter in any book about the theory.


Laura Ferrarese, a Rutgers University physicist, analyzed the new dark matter finding. She says it shows that a supermassive black hole, the stars around it, and an all-encompassing dark matter halo are working in concert to build structure.



Taken with other evidence, Ferrarese sees dark matter's role as more significant, or at least more obvious, than many theorists have considered.


"There is an observational correlation between the mass of the black hole and the mass of the dark matter halo, not necessarily the mass of the galaxy itself," she said.


Through this haze of fuzzy information and diverse thinking, theorists must work to explain a stark and staggering fact: Somewhere between 300 million and 800 million years after the Big Bang, the first black holes were born and managed to each gulp down a mass of more than 1 billion suns.


Now before you ponder how these Sumo wrestlers of the early universe must have thrown their weight around in any evolutionary wrestling match, consider this: A black hole typically holds much less than 1 percent of the overall mass of the galaxy it anchors.


Shining light on the dark ages


The early history of black holes -- what went on in the 500 million years leading up to objects observable with current technology -- is tied back to the development of the very first stars. Speculating about it requires first rewinding to the very beginning.


When the universe was born, there was nothing but hydrogen, helium and a little lithium. All this raced outward for about 300,000 years before anything significant happened. The gas was too compacted and therefore too hot to be stable. Gradually, the stuff of space expanded and cooled enough for gas to "recombine and stabilize to neutral states," as scientists put it.


The hydrogen was still too hot to form stars, so more expansion was needed. A long stretch of boring darkness ensued, during which some ripples began to ruffle the otherwise smooth fabric of space.


"For 300 million years, nothing happened," explains Windhorst, the Arizona State University astronomer. "The universe is just sitting there. Then all of a sudden the first stars began to shine."


The exact timing for first light is not known. But the ensuing 500 million years are the so-called dark ages of cosmology. Or more precisely, they represent the illuminations of the universe and the elimination of the dark ages.


"The tail end of that is what we're seeing," Windhorst says of the latest Hubble and Sloan survey observations.


The first black holes


Scientists once imagined galaxies forming by a sort of monolithic collapse, in which a giant cloud of gas suddenly fell inward. The modern view is one of "hierarchical merging," in which bits and pieces build up over time. A rough outline of how it all went down is fairly well agreed upon.



The initial ripples in space drew together into knots and filaments, locally and over broader scales. Individual clumps of gas collapsed, and stars were born.


The first stars must have been massive, perhaps 200 times the weight of our Sun or more. They would have been almost pure hydrogen -- the primary ingredient of thermonuclear fusion, which makes a star shine.


Massive stars are known to die young. Some survive just 10 million years (the Sun is 4.6 billion years old and just reaching middle age). A colossal explosion occurs, sending newly forged, heavier elements into space. Remaining material collapses. A mass equal to many stars might end up in a ball no larger than a city. The result: a stellar black hole. These objects are so dense that nothing, not even light, escapes once inside a sphere of influence known as an event horizon.
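
The "ball no larger than a city" figure follows directly from the Schwarzschild radius, r_s = 2GM/c^2, which sets the size of a black hole's event horizon for a given mass. The quick Python check below, using rounded physical constants, gives roughly 30 kilometers for a 10-solar-mass stellar black hole and roughly 20 astronomical units for the billion-solar-mass quasar engines mentioned earlier, consistent with a region smaller than our solar system.

# Back-of-the-envelope Schwarzschild radii, r_s = 2 G M / c^2 (rounded constants).
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

def schwarzschild_radius(mass_kg):
    return 2 * G * mass_kg / c**2

r_stellar = schwarzschild_radius(10 * M_SUN)    # about 3.0e4 m, i.e. ~30 km
r_quasar = schwarzschild_radius(1e9 * M_SUN)    # about 3.0e12 m, i.e. ~20 AU

print("10 solar masses : %.0f km" % (r_stellar / 1e3))
print("1e9 solar masses: %.0f AU" % (r_quasar / AU))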


Stellar gravity wells can weigh as little as a few suns. But the inaugural versions might have been 100 times as massive as the Sun or more.


During all these tens and hundreds of millions of years, more stars are being born from the detritus of the first stars. Locally denser regions of gas contract. Stars form groups of perhaps a few dozen, which might be attracted to other star clusters. Eventually, clusters of many thousands of stars develop and begin to look and behave like something that could be called a sub-galaxy. Some probably harbored growing black holes near their centers.


Here, theory struggles. Intuition might suggest that many of these huge stellar black holes simply merged until one central object attained enough mass to drive the shape and future development of its galaxy.


If that intuition is right, however, which black hole became the center?


"It may be a question of being in the right place at the right time," says Roger Blandford, a theoretical astrophysicist at Caltech. "It could be accidental."



In fact, nobody knows for sure if the first super-sized black holes developed from a series of mergers -- several dozen solar masses becomes 200, then 1,000, then 10,000, and so on -- or if they collapsed from the condensing gas cloud. "Do they start from 100 solar masses or a million solar masses? That's a good question," Blandford said. "My personal guess is that they start from a few hundred solar masses, but that's a much more speculative business."


Elusive middleweights


Galaxy birth and development is a never-ending process, and clues to early black hole evolution are spread throughout our own galaxy and around the universe. Astronomers therefore examine modern-day cosmic creatures for clues to their ancestral roots.


Black holes are everywhere, for one thing. Millions of the stellar sort could litter our galaxy alone, based on early discoveries of a few.


If the mightiest black holes indeed developed out of the garden variety, then there ought to be some evidence lying around our cosmic backyard in the form of middleweight versions, one line of thinking goes.


A handful of astronomers are convinced they have found a couple of these missing links, and in fact are arguing their case this week at a conference in California. But the case of the middleweights is among the most controversial in all of astronomy.


"The existence of middleweight black holes is one of the big unanswered questions in this field," said Cambridge's Haehnelt. "The recent claimed detections are still very controversial."


Regardless, most experts agree middleweights would represent, at best, pocket change to the fully grown black hole, something like Microsoft's initial millions in annual revenue compared to the billions that poured into its coffers during the tech boom.



Researchers on both sides of the middleweight argument mostly agree that the bulk of a jumbo black hole doesn't come through early mergers. Once a critical mass is achieved -- and this appears to coincide with a point in time prior to what astronomers can see today -- a black hole seems to gain most of its mass by swallowing gas from its environment.


Amid all the squabbling over middleweights looms the likelihood of much larger merger candidates.


Mega-mergers


Galaxy merging is almost a given. It is thought to have contributed significantly to the past growth of the Milky Way, for example. The early universe, having not yet expanded much, was incredibly crowded. Like racked billiard balls, nascent galaxies were more likely to collide.


If two galaxies merge, so should their black holes. Recent computer modeling speculates the event would be violent, unleashing tremendous light as gas is trapped between the two black holes and then rushes toward the more massive one.


Galactic mergers take millions of years, so they can't readily be observed in progress.


A recent peek into a nearby galaxy provided evidence for the scenario, however. At the heart of galaxy NGC 6240 astronomers found not one but two black holes, roughly 3,000 light-years apart and closing on an apparent merger course. The Chandra X-ray Observatory observations show that NGC 6240 is actually two galaxies that started joining forces about 30 million years ago.



Other indications of mega-mergers come from relatively nearby quasars.


Richard Larson, a Yale astronomer who studies star formation in galactic nuclei, says galaxies can go through several quasar phases during their lives. In studying quasars at more reasonable distances (which also means not so far in the past), he consistently sees signs of recent galaxy mergers or other large-scale interactions that served as triggers.


"Interactions and mergers are an excellent way to dump a lot of gas into the center of a galaxy," Larson explains. "The first thing this gas does is suddenly form huge numbers of stars."


Bursts of intense star formation seem to last about 10 million to 20 million years around a typical quasar.


Some of the gas that does not go into generating stars falls into the black hole. This violent phase of consumption is the one that is readily observed, because the castoff energy turns the incoming gas and dust into a glowing cloud. Eventually, the chaos settles and the new stars become visible. Later, the quasar itself is left naked. Finally, it goes dormant.



Larson figures this scenario for black hole feeding probably applies to the most distant quasars, too. And it supports the notion that black holes do in fact gain most of their bulk by accreting gas.


Fresh spin


To sort out the specifics of co-evolution, astronomers will need to see more of the universe and inspect it in greater detail. The prospects are good, especially toward the end of this decade.


A project called LISA (Laser Interferometer Space Antenna) would search for "gravitational waves" kicked up in the aftermath of black hole mergers, perhaps proving that such colossal collisions do occur. The NASA satellite is tentatively slated for launch in 2008.


A vastly improved understanding of dark matter is also needed. Several telescopes should contribute to this effort, but since no one knows what the stuff is, forecasting any sort of resolution is highly speculative.


And the specific mechanics of black holes must be investigated fully. For now, theorists don't even know exactly how matter is shuttled inward and consumed. Much of this work can be done by observing the nearby universe.


Roger Blandford, the Caltech theoretician, has suggested a novel way to prove that early mergers were not serious contributors to black hole growth. Blandford says two primary parameters characterize black holes. Mass is the most obvious. A more subtle measurement is spin.


Yes, black holes seem to spin. The idea only emerged from theory to relatively firm observations in May of 2001, and it remains unproven.


But if spin can be proved a universal aspect of black holes, then the rate of spin can be used to infer something very important about a black hole's history.


"If black holes grow by merging, by combinations of black holes, they should spin down quite quickly," Blandford explains. "This then becomes a fairly good argument that, if you can show that black holes really are spinning rapidly, they probably didn't grow by merging, but would have grown by accreting gas."


Most important, vision simply must be extended further back in time, beyond the quasars that are now being studied, says Karl Gebhardt, a University of Texas astronomer and a member of Richstone's team.


"They're essentially the tip of the iceberg," Gebhardt says of the objects so far observed. "We are projecting from what we see in a very special number of objects to the whole sample. That is part of the problem of the uncertainty now."


Hubble may extend current vision a bit, but the next boon in deep-space discovery will likely have to wait for the James Webb Space Telescope, planned for launch in 2010. Billed as the "first-light machine," the JWST will be Hubble on steroids, and it should muscle its way to a better view of a good portion of the cosmic dark ages.


It is ironic to think that when JWST goes up, many astronomers and cosmologists will be banking on black holes to light the way to a scientific account of the earliest epoch of the visible universe, an obscure time they have long dreamed about and can now, almost, see.







Physicist defends Einstein's theory and 'speed of gravity' measurement




Scientists have attempted to disprove Albert Einstein's theory of general relativity for the better part of a century. After testing and confirming, in 2002, Einstein's prediction that gravity moves at the speed of light, a professor at the University of Missouri-Columbia has spent the past five years defending the result, as well as his own innovative experimental techniques for measuring the speed of propagation of the tiny ripples of space-time known as gravitational waves.


Sergei Kopeikin, associate professor of physics and astronomy in the College of Arts and Science, believes that his latest article, "Gravimagnetism, causality, and aberration of gravity in the gravitational light-ray deflection experiments" published along with Edward Fomalont from the National Radio Astronomical Observatory, arrives at a consensus in the continuing debate that has divided the scientific community.


An experiment conducted by Fomalont and Kopeikin five years ago found that the gravity force of Jupiter and light travel at the same speed, which validates Einstein's suggestion that gravity and electromagnetic field properties are governed by the same principle of special relativity, with a single fundamental speed. In observing the gravitational deflection of light caused by the motion of Jupiter in space, Kopeikin concluded that mass currents cause non-stationary gravimagnetic fields to form, in accordance with Einstein's point of view. The research paper that discusses the gravimagnetic field appears in the October edition of the journal General Relativity and Gravitation.


Einstein believed that in order to measure any property of gravity, one has to use test particles. "By observing the motion of the particles under the influence of the gravity force, one can then extract properties of the gravitational field," Kopeikin said. "Particles without mass - such as photons - are particularly useful because they always propagate with the constant speed of light irrespective of the reference frame used for observations."


The property of gravity tested in the experiment with Jupiter is also called causality. Causality denotes the relationship between one event (cause) and another event (effect), which is the consequence (result) of the first. In the case of the speed-of-gravity experiment, the cause is the gravitational perturbation of a photon by Jupiter, and the effect is the detection of this gravitational perturbation by an observer. The two events are separated by a certain interval of time, which can be measured as Jupiter moves and compared with an independently measured interval of time taken by the photon to propagate from Jupiter to the observer. The experiment found that the two intervals of time, for gravity and for light, coincide to within 20 percent. Therefore, the gravitational field cannot act faster than light propagates.
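
For a sense of scale, the independently measured interval is essentially the light travel time from Jupiter to Earth, t = d / c. The short Python sketch below evaluates that number for an assumed Earth-Jupiter distance near opposition (the distance is an illustrative assumption, not a figure from the experiment) and shows the 20 percent band within which the gravity and light intervals were found to agree.

# Rough scale of the Jupiter-to-Earth light travel time, t = d / c.
# The distance is an assumed value near opposition, for illustration only.
c = 2.998e8            # speed of light, m/s
distance_m = 5.9e11    # roughly 590 million km, Earth-Jupiter near opposition

travel_time_s = distance_m / c
print("light travel time: about %.0f s (~%.0f minutes)" % (travel_time_s, travel_time_s / 60))

# The experiment found the gravity and light intervals agree to within ~20 percent.
tolerance = 0.20
low, high = travel_time_s * (1 - tolerance), travel_time_s * (1 + tolerance)
print("20 percent band: %.0f s to %.0f s" % (low, high))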


Other physicists argue that the Fomalont-Kopeikin experiment measured nothing else but the speed of light. "This point of view stems from the belief that the time-dependent perturbation of the gravitational field of a uniformly moving Jupiter is too small to detect," Kopeikin said. "However, our research article clearly demonstrates that this belief is based on insufficient mathematical exploration of the rich nature of the Einstein field equations and a misunderstanding of the physical laws of interaction of light and gravity in curved space-time."






E-mail Security, Compliance Upgraded in Google Apps




Google's acquisition of e-mail security vendor Postini is paying off for Google Apps enterprise users, who Wednesday are gaining access to extra security and compliance features.


The Google Apps Premier Edition will remain at its current price tag of US$50 per user, per year, despite significant upgrades based on Postini's software-as-a-service offerings, Google officials say. Google is also upping Gmail capacity for Premier Edition users from 10GB to 25GB.


"This is virtually unlimited storage," says Matthew Glotzbach, head of products for Google's Enterprise group.


Both moves could help Google compete against the Microsoft Office set of workplace tools, which has long dominated the business market. Google is quickly ramping up the feature set for Apps Premier Edition, says Gartner analyst Tom Austin, who says he counted 37 significant enhancements to the software-as-a-service platform between February and June.


"I would be very surprised if there isn't another major announcement from Google this month, two more in November . . . and on and on," Austin says.


Google added a presentations application last month to Apps, which already included e-mail, calendaring, instant messaging, voice chat, documents and spreadsheets.


Gmail already had spam and virus blockers, but now Google Apps Premier Edition gives businesses new configurable options, such as a whitelist and centrally managed content policy to filter messages that contain certain words or attachments. The Postini features also let administrators add footers to every outbound message, so you don't have to rely on employees adding text describing e-mail confidentiality policies, Glotzbach says.
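
As a rough illustration of what such a centrally managed content policy does (a generic sketch, not Postini's or Google's actual interface), an outbound filter checks each message against administrator-defined rules and appends the mandated footer before the mail leaves the domain. The rule values and footer text below are illustrative assumptions.

# Generic sketch of an outbound content-policy filter: hold messages that match
# banned terms or attachment types, otherwise append a compliance footer.
BANNED_TERMS = {"project falcon", "acme corp"}   # e.g. unreleased products or competitors
BANNED_ATTACHMENT_EXTS = {".xls", ".xlsx"}       # e.g. spreadsheets with financial data
FOOTER = "\n--\nThis message may contain confidential information."

def apply_policy(body, attachments):
    """Return (allowed, outgoing_body); allowed=False means the message is held for review."""
    lowered = body.lower()
    if any(term in lowered for term in BANNED_TERMS):
        return False, body
    if any(name.lower().endswith(ext) for name in attachments for ext in BANNED_ATTACHMENT_EXTS):
        return False, body
    return True, body + FOOTER

ok, outgoing = apply_policy("Quarterly numbers attached.", ["q3_forecast.xlsx"])
print(ok)   # False: the attachment rule triggered, so the message is held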


"There's a whole realm of legal compliance around tagging messages in different ways," he says.


Companies can use the new features to block outgoing messages that mention particular products, competitors or employee names, or spreadsheets that have proprietary financial data, Glotzbach says.


"This is really an enterprise class system that gives a tremendous amount of flexibility and control to IT administrators to define those content policies," he says.


Postini services fit well with Google Apps, because they are also offered in the software-as-a-service model, he says.


The Postini products now bundled with Google Apps will continue to be sold separately as well, according to Glotzbach. Some of Postini's more sophisticated offerings won't be available free to Google Apps users.


Postini sells an archiving product that helps with legal compliance and e-discovery, which Google Apps users can buy as an add-on. Without the add-on, organizations that use Apps can see all of their messages from the past 90 days with a message recovery feature, but some businesses need storage going back years to meet regulatory requirements.


The price of the archiving add-on varies dramatically but generally starts at $150 per user per year, Glotzbach says.


Postini customers will be offered a free trial of Google Apps through June 2008.


Google says it is also adding configurable mail routing to make it easier for customers to use multiple e-mail systems in addition to Apps.


Google has made a big impact on the e-mail and collaboration market, but large enterprises still gravitate more toward Microsoft Office, Austin notes.


"Big businesses today are showing a lot of interest in [Google Apps] primarily to get into a negotiating club with Microsoft, to influence Microsoft's pricing on enterprise agreements," he says.






The vertical cell based memory technology is revolutionary since it could increase the production volume of memory by 5 times without the expansion of the wafer

A vertical memory cell EPROM array uses a vertical floating-gate memory cell structure that can be fabricated with reduced cell area and channel length. The array consists of multiple rows of buried layers that are vertically stacked - a drain bitline over a source groundline, defining a channel layer in between. In each bitline row, trenches of a selected configuration are formed, extending through the drain bitline and channel layer and at least partially into the source groundline, thereby defining corresponding source, drain and channel regions adjacent to each trench. The array can be made contactless, half-contact or full contact, trading decreased access time for increased cell area.

BeSang, a US-based fabless start-up established by Korean-American engineers, has produced vertical cell based semiconductor memory, which had previously been possible only in the lab, for the first time in the world. The vertical cell based memory technology is revolutionary since it could increase the production volume of memory by 5 times without the expansion of the wafer.
BeSang and Korea's National Nanofab Center (NNFC) announced on October 2 that the two parties had cooperated to manufacture vertical cell memory and had succeeded in producing it on a 200-millimeter (8-inch) wafer in a 0.18-micron process. The pilot product will be shown to the public within this month.
BeSang had already succeeded in producing vertical cell memory through its cooperation with Stanford University, manufacturing it on a 100-millimeter (4-inch) wafer in a 0.80-micron process. The success in the 0.18-micron process should bring the application of the technology to market earlier than expected.
President Sang-yoon Lee of BeSang said, "The vertical cell memory technology will be the equivalent of operating four fabs, without additional investment. That's why the industry is paying so much attention to this technology. In particular, it could cut the vicious circle of endless investment in development of the next-generation process, which has consumed most of the income. Therefore, this will help Korean chip makers to widen the gap with latecomers."
BeSang had already shown its pilot product manufactured in the 0.80-micron process to major chip makers in Korea and the US. These companies have been considering the application of the technology proposed by BeSang.


ARM Cortex-M3

The ARM Cortex™-M3 32-bit RISC processor is the first ARM processor based on the ARMv7-M architecture and has been specifically developed to provide a high-performance, low-cost platform for a broad range of applications including microcontrollers, automotive body systems, industrial control systems and wireless networking. The Cortex-M3 processor provides outstanding computational performance and exceptional system response to interrupts while meeting low cost requirements through small core footprint, industry leading code density enabling smaller memories, reduced pin count and low power consumption.

The central core of the Cortex-M3 processor, based on a 3-stage pipeline Harvard bus architecture, incorporates advanced features including single cycle multiply and hardware divide to deliver an outstanding efficiency of 1.25 DMIPS/MHz. The Cortex-M3 processor also implements the new Thumb®-2 instruction set architecture, which when combined with features such as unaligned data storage and atomic bit manipulation delivers 32-bit performance at a cost equivalent to modern 8- and 16-bit devices.
Applications


The Cortex-M3 processor offers an excellent balance of architectural features, high performance and low costs, making it a very attractive choice for a broad range of applications, including:

Microcontrollers - 32-bit performance at 8-bit costs
Wireless networking (including Bluetooth, ZigBee and others) - low-power operation and integrated sleep modes supporting complex stacks
Automotive and industrial control systems - secure, reliable and deterministic operation
White goods - high-performance maths for complex motor algorithm support
Electronic toys - low-cost implementations for next-generation intelligent toys
Medical instrumentation - high-reliability core and tools enabling IEC 61508 and FDA approval.
Features

ARMv7-M architecture
The microcontroller profile of the ARMv7 architecture
Optimized for microcontroller and low-cost applications
Thumb-2 instruction set
Enhanced levels of performance, energy efficiency, and code density
Mixed mode capability implies no need to interwork between modes
ARM levels of performance with Thumb level code density
Hierarchical structure with tightly integrated peripherals
CM3Core
Harvard bus architecture – separate instruction and data buses
Highly efficient 3-stage pipeline with branch speculation
Nested Vectored Interrupt Controller (NVIC)
Gate efficient stack-based register model
Configurable from 1-240 physical interrupts; up to 256 levels of priority
Non-Maskable Interrupt (NMI) enables critical interrupt capabilities
Low latency through tail chaining, late arrival service & stack pop pre-emption
Nesting (stacking) of interrupts
Dynamic interrupt reprioritization
Memory Protection Unit (MPU)
Optional component for separation of processing tasks and data protection
Up to 8 regions of protection; each of which can be divided into 8 sub-regions
Region sizes from 32 bytes up to the entire 4 gigabytes of addressable memory
Embedded Trace Macrocell (ETM)
Optional component for real-time instruction trace
Data Watchpoint and Trace unit (DWT)
Implements hardware breakpoints and provides instruction execution statistics
Flash Patch and Breakpoint unit (FPB)
Implements 6 program breakpoints and 2 literal data fetch breakpoints
Debug Port (SW-DP or SWJ-DP)
Configurable debug access through Serial Wire or JTAG interface
Single cycle multiply and hardware divide instructions
32-bit multiplication in a single cycle
Signed and unsigned divide operations between 2 and 12 cycles
Preconfigured memory map
Up to 4 gigabytes of addressable memory space
Predefined addresses for code, memory, external devices, peripherals
Dedicated space for vendor specific addressability
Atomic bit manipulation with bit banding (see the worked address example after this feature list)
Direct access to single bits of data
Two 1MB bit banding regions for memory and peripherals mapping to 32MB alias regions
Atomic operation, cannot be interrupted by other bus activities
Unaligned data storage and access
Continuous storage of data requiring different byte lengths
Data access in a single core access cycle
Integrated sleep modes
Sleep Now mode for immediate transfer to low power state
Sleep on Exit mode for entry into low power state after the servicing of an interrupt
Ability to extend power savings to other system components
Fully synthesizable and highly configurable
Easily customized for broad applicability
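
The bit-banding feature listed above maps every bit of the two 1MB regions to its own word-aligned alias address, so an ordinary word write becomes an atomic single-bit update. The widely documented mapping is alias = alias_base + (byte_offset x 32) + (bit_number x 4). The Python sketch below simply evaluates that arithmetic; the region bases are architectural, but the example peripheral register address is an illustrative assumption, since actual peripheral layouts are vendor-specific.

# Cortex-M3 bit-band alias address: alias = alias_base + byte_offset*32 + bit*4.
# Region bases are architectural; the example register address is hypothetical.
PERIPH_BASE, PERIPH_ALIAS = 0x40000000, 0x42000000
SRAM_BASE, SRAM_ALIAS = 0x20000000, 0x22000000

def bitband_alias(address, bit, region_base, alias_base):
    byte_offset = address - region_base
    return alias_base + byte_offset * 32 + bit * 4

example_register = 0x40010C0C   # hypothetical peripheral register address
alias = bitband_alias(example_register, 5, PERIPH_BASE, PERIPH_ALIAS)
print(hex(alias))               # writing 1 to this word sets bit 5 of the register atomically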

Benefits

High performance
1.25 DMIPS/MHz on the Dhrystone 2.1 Benchmark
70% more efficient per MHz vs. the ARM7TDMI-S processor executing Thumb instructions
35% more efficient per MHz vs. the ARM7TDMI-S executing ARM instructions
Highly deterministic, low latency interrupt handling
Excellent data manipulation capabilities via Thumb-2 Bit Field Instructions
Low manufacturing costs
Low gate count implementations
33K gates Central Core (CM3Core)
60K gates or lower for complete standard implementation
Additional gate count reductions available through synthesis
All numbers for TSMC 0.18um G process, 50MHz target frequency
Smaller memory requirements
Up to 45% smaller code size vs. the ARM7TDMI-S executing ARM instructions
Up to 10% smaller code size vs. the ARM7TDMI-S executing Thumb instructions
Reduced pin count for lower packaging costs
Serial Wire Debug implements debug with just 2 pins
Single Wire Viewer implements single pin trace profiling
Enhanced energy efficiency
Clock gating, integrated sleep modes reduce power at no loss of performance
Power as low as 0.085 mW/MHz on the TSMC 0.13G process
Faster time to market with ease of use – system design
Fully synthesisable design
NVIC configurable to 1-240 physical interrupts with up to 256 levels of priority
Optional ETM can add trace capabilities
Optional MPU can add memory protection
Integrated debug/trace facilitate quicker debug
Faster time to market with ease of use – software development
Simplified stack-based programmer's model; simple vector-based interrupt scheme
Thumb-2 removes need for interworking required by ARM/Thumb instructions
Native bitfield manipulation, hardware division and If/Then instructions
Thumb-2 is backwards compatible with existing ARM and Thumb solutions
Thumb-2 is compatible with other members of the Cortex family
The processor implements the stack manipulation in hardware
Hence assembler wrappers for handling stack manipulation for interrupt service routines are not necessary
NVIC integrates a SysTick timer that can provide an ideal heartbeat for a real-time OS
ARM-EDA Reference Methodology deliverables significantly reduce the time to generate a specific technology implementation of the core and to generate industry standard views and models
Excellent 32-bit migration choice for 8/16 bit architecture based designs
Simplified stack-based programmer's model is compatible with the traditional ARM architecture and retains the programming simplicity of legacy 8- and 16-bit architectures
Comparison of Cortex-M3 processor with ARM7TDMI® processor

The Cortex-M3 processor offers enhanced features and performance and an easy migration path, presenting a logical upgrade for ARM7TDMI processor-based designs that need to meet the challenges of next-generation technologies. The central core offers higher efficiency, a simpler programming model and excellent deterministic interrupt behaviour, whilst the integrated peripherals offer enhanced performance at lower cost and power consumption.

Nobel Winner: Global Warming Is the New Sputnik




Physicist Leon Lederman calls climate change a "menace" that, like the Soviet satellite, will spur more science



Oh, the good old days. Most of the world may remember Sputnik as the seed of a standoff between superpowers, but for Nobel Prize-winning physicist Leon Lederman and many of his colleagues, it's a rosier memory: a time when it was cool to be a scientist.



Fifty years after the Soviet satellite initiated the space age, scientists and educators are again arguing, as they were in the years before Sputnik, that the United States is falling behind its competitors in producing scientists and engineers. Lederman, who received the 1988 Nobel Prize in physics with two colleagues at Columbia University for pioneering work on subatomic particles, has now turned his attention to finding new ways to inspire science and math teachers. He spoke with usnews.com about what it will take to get science education back on track.


Is there any hope of another Sputnik-like event to reinvigorate science education?


People have often joked, "Can we invent a Sputnik? Pull a fake Sputnik?" Frankly, we really don't have to. We have a menace hanging over our head which is, if anything, far more dangerous than Sputnik, and that is global climate change. I think it's going to take maybe oceans rising and the wiping out of some islands in sensitive areas before we realize that this is personal, that this really constitutes a menace to civilization.


You advocate a "physics first" method of teaching science, instead of the common practice of beginning with biology. So physics isn't too scary for ninth graders?


[Renowned physicist Richard] Feynman once said that if all civilization was going to fall apart and only one sentence of knowledge could be preserved, it should be: "Everything is made of atoms." Physics and chemistry are basic to modern molecular biology, and ninth grade is just perfect. Biology in ninth grade is about the stupidest thing. It's a concept subject. It's more mathematically complex than string theory. Physics starts with the simplest concepts. You have a broad sweep of how nature works. There is nothing in chemistry which, if you say, "Why does that happen?" you're not forced to go back to physics.


But won't it be harder to engage students with atoms and molecules than with organisms?


Telling stories is very important for that. You say, "Galileo drops two students off a tower, a fat one and a skinny one, and he listens for one squash or two squashes." There's a sense of wonder that you have to teach. One thing about physics that hasn't penetrated high school classrooms or popular science writing is its symmetry. What begins to emerge is a kind of simplicity and beauty. Science is not easy to teach - even physics teachers do not teach physics all that well.


How can one engage those of us who have escaped high school?


We could certainly use more communicators, like [The Elegant Universe author] Brian Greene, who's sort of a replacement for Carl Sagan. I'm working hard on that particular problem, the notion of getting the public involved in science education. My goal is to get invited on Oprah Winfrey.


Any luck?


Not yet - and I'm in Chicago. But you slog on.


So what exciting things are heading our way?


There are many discoveries to be made, maybe more in biology than in my field, especially in neuroscience. They're going to make incredible breakthroughs in how we think, in the whole human consciousness idea.


What can we do on the policy side?


We need more scientists going into Congress. There's a huge difference between being an adviser and being an elected official. [Nobel Prize-winning physicist] I. I. Rabi used to tell us that: "Some of you go to law school and run for Congress."


Last thoughts?


I worry so much about the danger that can come from fundamentalism, because we're a fragile society. Three young guys with college degrees can design a nuclear bomb.






Make hay (and a lot more) while the sun shines

MIT team aims to win annual solar house contest in Washington.A team of MIT students, faculty and volunteers has taken on the challenge of designing and building a house that relies entirely on solar energy to meet the electricity needs of a typical American family, from drying towels to cooking dinner.

Being from MIT, they also took on the challenge of being the best: For the first time, MIT has an entry in the Department of Energy's annual Solar Decathlon--a village of 20 off-grid solar homes built by college students to be assembled and open to the public on the National Mall in Washington from Oct. 12 to Oct. 20.

MIT's off-grid home, known as Solar7, is en route to the capital now. Designed and built at MIT on an asphalt lot at the corner of Albany and Portland Streets in Cambridge, Solar7 was broken into modules and sent off by flatbed truck.

About 20 MIT students and volunteers will reassemble the house in Washington and participate in the DOE contest in various roles.

Kurt Keville, faculty advisor to the Solar7 team and a research specialist at the Institute for Soldier Nanotechnologies, already has his eye on MIT's worthiest opponent. "The University of Colorado won the past two years; they're the team to beat," he said.

Each 800-square-foot solar home, once assembled in the Solar Decathlon Village, vies for points in 10 categories related to energy efficiency, design and marketability, which the DOE calls "communications."

As Corey Fucetola, student project leader and a graduate student in electrical engineering and computer science, said, "The best way to change human behavior is give people the information they need to change."

The MIT solar house team will communicate through student tour guides (the DOE expects 30,000 visitors to the contest) and an information panel in the kitchen that will give feedback on about 40 sensors monitoring light, temperature and energy use.

All Solar Decathlon entries must meet specific livability standards. Each home must retain warmth but not bake its residents. Each must have sufficient light to endure rainy days; it must provide warm water for showers; it must be handicapped-accessible; it must store enough energy to run a dishwasher and an electric car. It must use commercial building materials and available technologies--no weird science, no fresh-from-the-lab contraptions.

"You can't yank something out of the lab and throw it up on the roof. You have to use production-grade products," Keville said.

The consolations of technology
Since construction began last spring, Keville, Fucetola and construction manager Tom Pittsley had plenty of technology and new materials to keep their interest and to engage the weekend warriors managed by volunteer coordinator Arlis Reynolds.

For any passive solar home, the challenge is keeping the heat. The Solar7 team built a south-facing light wall made of 1-foot-thick square tiles. Each looks like a sandwich: Two opaque plastic squares are the "bread" for a filling of water and a layer of thermal insulating gel spread on the inside of one of the tile's "slices."

The insulating gel transfers the sun's heat from the outside, through the water, to the inside wall.

Energy-efficient windows made of three panels of glass with krypton gas as an insulator are used elsewhere in the house.

Photovoltaic cells cover the south-facing roof of Solar7 and do the heavy lifting, energy-wise. They generate about nine kilowatts of power; electricity will be stored in 24 batteries. These can hold about 70 kilowatt-hours and can power the house for about 48 hours.
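
As a rough sanity check on those figures (a hypothetical back-of-the-envelope sketch, not part of the team's published specifications), 70 kilowatt-hours spread over 48 hours implies an average household draw of roughly 1.5 kilowatts, and a nine-kilowatt array would need only about eight hours of full sun to refill the bank:

```python
# Back-of-the-envelope check of the Solar7 figures quoted above.
# The numbers come from the article; the arithmetic is illustrative only.

pv_power_kw = 9.0             # approximate output of the photovoltaic array
battery_capacity_kwh = 70.0   # total storage across the 24 batteries
autonomy_hours = 48.0         # how long the batteries can run the house

# Average load the batteries can sustain over the stated 48 hours
average_load_kw = battery_capacity_kwh / autonomy_hours

# Hours of full-output sunshine needed to recharge the bank from empty
recharge_hours = battery_capacity_kwh / pv_power_kw

print(f"Implied average household load: {average_load_kw:.2f} kW")    # ~1.46 kW
print(f"Full-sun hours to recharge the bank: {recharge_hours:.1f} h")  # ~7.8 h
```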

The batteries must also power the team's electric car--a potential deal-breaker, since the car that goes the furthest wins.

Solar7's south face also holds 60 evacuated tubes that will carry solar-heated water into the house for showers and washing and for circulation in the warmboards, a radiant heating system based on a molded subfloor that's embedded with plastic tubing.

The house contains a kitchen, full bathroom, living and dining area, and a flexible bedroom/office space, defined by opaque pocket doors. It has a wide, gracious deck and ramps for accessibility.

It takes a village
The 2007 Solar Decathlon is more than a competition: It's a community of 20 universities, 20 houses, and hundreds of students. Some DOE contests challenge both energy and labor efficiency.

For example, there's the hot shower contest: Each team must be able to heat a bucket of water to 110 degrees.

Then there's the dinner party. The Decathlon Solar Village is divided into neighborhoods of four or five houses, and each solar house team must prepare a three-course vegetarian meal for their neighbors, using their stored energy to power the kitchen appliances needed to cook and to clean up.

The MIT dinner menu is nothing short of sumptuous. It starts with pumpkin crab soup, offers meatless sausage kabobs or spinach tortellini for a main course, and sorbet and chocolate chip pudding cookies for dessert. The drink menu--minty "virgin mojitos" or fruity "Safe Sex on the Beaches"--is alcohol-free.

But dinner's a one-time thing; as in life, solar laundry is forever. The DOE requires each team to wash and fluff-dry a load of towels, just like home.

Great Glass Pumpkin Patch returns Oct. 5-6

It's a bumper crop you won't want to bump too hard: More than 1,000 hand-blown glass pumpkins, squashes and gourds in all shapes, sizes, colors and designs will be sold as part of MIT's annual Great Glass Pumpkin Patch.

The Great Glass Pumpkin Patch begins with a preview reception Friday, Oct. 5 from 5 to 8 p.m. on Kresge Oval. The following day, Oct. 6, between 10 a.m. and 3 p.m., shoppers and browsers will be able to purchase their favorite autumnal orb. Prices range from $20 to $200, depending on the piece's size and complexity. Many of the works feature vivid colors, swirls, stripes, spots, curlicues and unusual stems.

The rain date is Oct. 7, from 10 a.m. to 3 p.m.

The glass pumpkins were created by students and instructors in MIT's Glass Lab, where members of the MIT community learn and practice the art of glassblowing. Proceeds from this event benefit the lab, an art program connected with MIT's Department of Materials Science and Engineering. Pumpkin-making is overseen by glass artist Peter Houk, director of the MIT Glass Lab.

The Great Glass Pumpkin Patch came to MIT in 2001 after a residency in the Glass Lab by 14 members of the Bay Area Glass Institute (BAGI). BAGI, a nonprofit corporation located in San Jose, Calif., was founded in 1995 by San Jose State graduate Bobby Bowes and MIT alumnus Mike Binnard.

Every week or so, beginning, intermediate and advanced students work together for a few hours in teams of six or seven to produce pumpkins for the sale. Production for the October event continues steadily throughout the year in order to achieve the goal of 1,000 to 1,200 pumpkins.

For more information, including an illustrated step-by-step description of "How To Make a Pumpkin," see: web.mit.edu/glasslab/sales_pumpkin.html, or call (617) 253-5309.

MIT student turns hearing loss into knowledge gain

Brad Buran, a Harvard-MIT Division of Health Sciences and Technology (HST) graduate student, lost his hearing to pneumococcal meningitis when he was 14 months old. Today, the fifth-year doctoral candidate in HST's Speech and Hearing Biosciences and Technology program is becoming an expert in the neuroscience of speech and hearing.

Because he is immersed in an environment filled with researchers investigating hearing loss, speech therapy, linguistics, and cochlear implants, Buran sometimes becomes the subject of probing conversations. This constant scrutiny might be off-putting for some, but for Buran, it is fodder for his own musings about the way his brain works.

"It's interesting, because so many people in the program are specialists in an area that relates to me personally," says Buran.

His growing scientific expertise, combined with his personal experience, allows him to go from talking with a programmer about ways to improve cochlear implant coding strategies to discussing linguistics with a speech pathologist without missing a beat. And according to classmate Adrian "K.C." Lee, having Buran as a classmate, colleague and friend enriches his own learning experience by revealing to him the nuances of social communication, such as how people perceive accents (Lee is Australian) or pick up idioms.

Buran uses a technique called cued speech to follow lectures, conversations, and the subtle cues hearing people take for granted, such as the rustle of papers as the class flips to the next page. This technique turns sounds--from speech to sneezes and even page turns--into hand gestures to create a visual likeness of what he is unable to hear.

For instance, the word "bat" involves three sounds: "b," "a," and "t." Cued speech combines three cues--hand gestures paired with locations around the mouth and throat--to convey these distinct sounds visually. These gestures help Buran distinguish "bat" from "pat," two words that look the same to a lip reader. At the same time, they preserve similarities, allowing him to see that the words rhyme.

Cued speech has played a significant role in helping Buran reach his potential as a scientist by giving him access to the same information as his hearing peers. In this regard, cued speech differs from American Sign Language (ASL). ASL is a distinct language, not a way to represent English with the hands. Translating English into ASL requires interpretation of the meaning and then approximating it using signs.

In contrast, "Cued speech is a modality for language, like speaking or writing," says Tom Shull, a Boston-area cued speech professional. "When I cue, I'm not an interpreter. I'm a transliterator."

Transliteration transparently relays language phoneme by phoneme. "The sounds are just going into the transliterator and coming out as gestures. All the decoding happens at Brad's end," says Lee.
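
A toy sketch can make the transliteration idea concrete. The phoneme-to-cue pairings below are invented for illustration (real cued speech uses a small set of handshapes for consonants and placements around the face for vowels); the point is only that each sound maps directly to a visible cue, so "bat" and "pat" differ in a single handshape while sharing the cues that make them rhyme:

```python
# Illustrative model of cued-speech transliteration: each phoneme is relayed
# as a visible cue, with no interpretation of meaning along the way.
# The specific pairings here are made up for the example.

CUES = {
    "b": "handshape-4",
    "p": "handshape-1",
    "t": "handshape-5",
    "a": "placement-chin",   # vowels are cued by placement near the mouth
}

def transliterate(phonemes):
    """Relay a sequence of phonemes cue by cue, like a transliterator would."""
    return [CUES[p] for p in phonemes]

bat = transliterate(["b", "a", "t"])
pat = transliterate(["p", "a", "t"])

# "bat" and "pat" look identical to a lip reader, but their cue sequences
# differ in exactly one handshape, while the shared sounds stay visibly shared.
print(bat)   # ['handshape-4', 'placement-chin', 'handshape-5']
print(pat)   # ['handshape-1', 'placement-chin', 'handshape-5']
```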

Because cued speech is not a language, it is relatively easy to learn. Lee and more than a dozen of Buran's classmates learned cued speech after meeting Buran. In fact, one friend nearly mastered the technique over a weekend. Now, the group cues during meetings, classes and social events.

"He's certainly not isolated by his impairment," says Buran's advisor Charles Liberman, an HST professor and director of the Eaton-Peabody Laboratory of Auditory Physiology at Massachusetts Eye and Ear Infirmary. "He's in there, organizing [things], galvanizing his classmates to take cued speech classes or to go skiing."

Buran also has a cochlear implant and is learning to use it more effectively. He aims to communicate well on the phone, in large groups and in noisy environments. By listening to himself, he is also improving his speech and intonation. "I can be a very sarcastic person, but you have to see it in my face," he said. "I am not good at using my voice for that yet."

Meanwhile, Buran thinks constantly about his own research. Though he now studies the physiology and molecular biology of the inner ear, his real interest is in language cognition. Because he acquired language both aurally and visually and because he speaks with his hands as well as his voice, he often thinks about how this works inside his brain. "When you decide to say something, what creates the appropriate motor sequence?" How does his brain pick between moving his mouth to speak and his hands to cue?

"No one really understands how all of that works," says Buran. More than likely, someday, he will.

Green Center for Physics dedication set for Oct. 5

Event also celebrates completion of PDSI building project.
The Green Center for Physics, the dynamic cornerstone of a major building and renovation project, will officially open with a dedication ceremony at 2:30 p.m. Friday, Oct. 5.

The community event will also celebrate the completion of the overall major building and renovation program, known as PDSI, for the departments of physics and materials science and engineering, the George R. Harrison Spectroscopy Laboratory and infrastructure renewal enhancements.

Named in honor of Cecil (E.E. 1923) and Ida Green, the center was designed with the goal of fostering new research collaborations. The center occupies the fourth floor of Building 6, the first (atrium), second, third and fourth floors of 6C (a new "infill" structure replacing 6A), and the third floors of Buildings 4, 6 and 8, in which spaces have been significantly improved.

Building 6C has huge windows, dramatic interior vistas and sleek glass-walled walkways on the second, third and fourth floors that connect it to buildings 4, 6 and 8.

"The Green Center fulfills a decades-long dream--to have a place that faculty and students throughout MIT can identify with the Department of Physics," said Marc A. Kastner, dean of science and Donner Professor of Science.

The department's administrative, academic and some community functions are now in one location, along with educational labs and a reading room. The new center has also brought together MIT's theoretical physicists, housing the Condensed Matter Theory Group and the Virgil Elings Center for Theoretical Physics.

"Theoreticians thrive on explaining their ideas to colleagues and on lively discussions," said Edmund Bertschinger, department head and professor of physics. "The building's design and the new environment provide for that."

Kastner agreed that the center facilitates the kind of collaboration that will help spark new research advances.

"Faculty and students working in these fields have many common interests, and the accidental discussions that will occur because of proximity in the Green Center are sure to lead to great new insights," he said. "People will meet each other frequently when they use the undergraduate or graduate student common rooms or the variety of conference and seminar rooms."

Architect Payette Associates and program planner Imai Keller Moore worked to assure that the new facility would provide both the space and the spirit for spontaneous meetings. The MIT project managers were John Hawes and Milan Pavlinic.

The goals of PDSI, announced in 2002 when the Main Group Master Plan was formulated, included building and renovation projects as well as upgrades for life safety and building services.

The Main Group, Institute shorthand for its historic Bosworth Buildings, was designed by William Welles Bosworth and mostly completed in 1916. Its buildings are connected by the Infinite Corridor and include Buildings 1 through 10.

"We are especially grateful that the generosity of the department's friends--especially Cecil and Ida Green, Neil (E.E. 1964) and Jane Pappalardo, Virgil Elings (Ph.D. 1966) and Jim (S.B. 1953, Ph.D. 1957) and Sylvia Earl--has made the Green Center a reality," Kastner said.

The visual centerpiece of the entire PDSI project is the bold design on the U-shaped ground floor atrium, a 7,000-square-foot artwork by renowned American conceptualist Sol LeWitt (1928-2007).

"Bars of Color Within Squares (MIT) 2006," commissioned through MIT's ongoing Percent-for-Art program, consists of squares of vibrantly colored geometric shapes that reflect on the Center's windows and walls.

LeWitt was selected for the project by a committee including Kastner; Samuel Allen, professor of materials science and engineering; Washington Taylor, professor of physics; Virginia Esau, manager, physics space and renovation; Marc Jones, assistant dean, School of Science; Jim Collins of Payette Associates; and Jane Farver, director of the MIT List Visual Arts Center.

The Oct. 5 celebration will include tours of the new and renovated spaces for physics, materials science and engineering, and the Spectroscopy Laboratory.

President Susan Hockfield, Dean of Engineering Subra Suresh, Spectroscopy Laboratory Director Michael Feld, chair of the MIT Corporation Dana Mead and Kastner are scheduled to make remarks.

The day the space age began


From the beginning, we humans have been curious about everything we do not have or cannot do. That curiosity is what starts research, and research brings success.

Fifty years ago, a 184lb ball called Sputnik became the first man-made object to be launched successfully into orbit. The world was changed for ever. Rupert Cornwell looks back on an achievement that set the tone of geopolitics for a generation.

Exactly 50 years ago today, on 4 October 1957, the Russians launched Sputnik, the first ever artificial satellite to orbit the Earth. It circled our planet in roughly 96 minutes, at an altitude of about 150 miles, travelling at a speed of 18,000mph, crossing the US seven times a day. Inside the sphere of polished aluminium were two radio transmitters, and batteries.
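
Those figures hang together under basic orbital mechanics. As a rough check (a sketch using standard textbook constants and the article's round numbers, not mission data), a circular orbit 150 miles up requires a speed close to the quoted 18,000 mph; Sputnik's real orbit was elliptical and higher on average, which is why its period came out near 96 minutes rather than the roughly 90 minutes a circular orbit at that altitude would give:

```python
import math

# Rough check of the Sputnik figures quoted above, using the standard
# circular-orbit formulas. The 150-mile altitude comes from the article.

MU_EARTH = 3.986e14    # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6      # mean Earth radius, m
MILE = 1609.34         # metres per statute mile

r = R_EARTH + 150 * MILE          # orbital radius for a 150-mile altitude

v = math.sqrt(MU_EARTH / r)                        # circular-orbit speed, m/s
period = 2 * math.pi * math.sqrt(r**3 / MU_EARTH)  # orbital period, s

print(f"Circular-orbit speed:  {v * 3600 / MILE:,.0f} mph")   # roughly 17,400 mph
print(f"Circular-orbit period: {period / 60:.0f} minutes")     # roughly 89 minutes
```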

Compared to the devices that orbit the planet now, it was primitive in the extreme. Yet Sputnik was a watershed in history.

Curiously, in the Soviet Union of the time, it didn't seem that big a deal, at least initially. The country's leader Nikita Khrushchev was told of the successful launch while he was attending a meeting of party functionaries in Kiev. He was delighted, but the others only wanted to talk about the need to boost local electricity supplies. Only when pandemonium ensued in the US did Moscow realise the magnitude of its propaganda coup – technological, but in those days above all military.

Satellites, scientists understood, could be important tools for both peace and war. But what struck such dread into Americans, and provided such a strong card for their opponents, was the R-7 rocket which carried Sputnik into space. As Khrushchev's son Sergei – a 22-year-old engineering student at the time of the launch who later worked on the Soviet space programme – stresses today, the top priority of the day was to develop an intercontinental ballistic missile to deter an American nuclear first strike. Security, not space exploration, was the name of the game: "Back then, we lived in the same situation as Iran is in now," Sergei, now an American citizen and a professor at the Ivy League Brown University in Rhode Island, said this week.

With the R-7 the Soviets had achieved their goal – or so it seemed to a panicked US.

News of Sputnik came as a bombshell – rating one of those across-the-front-page, triple deck headlines that The New York Times reserves for presidential election results and events like 9/11. Ordinary Americans were stunned, caught napping as they luxuriated in their consumer comforts of cars with shiny chrome tail fins and fancy safety razors. The post-war generation had never had it so good. Now, it appeared, the country faced a threat to its very survival as a free nation.

President Dwight Eisenhower himself reacted with restraint, refusing to overdramatise events – indeed, he is said to have put in five rounds of golf in the week after Sputnik was launched – but he was about the only one. With his blue-chip military reputation, Ike could get away with so measured a response, albeit barely. Today, in the pantheon of presidents, Eisenhower stands higher than he ever has. But for decades afterwards, the Sputnik shock seemed to mark him as old, complacent and out-of-touch. His Democratic opponents were merciless. Lyndon Johnson, the then Senate majority leader, conjured up a science fiction nightmare of giant Soviet platforms in space from which they would rain down bombs on America "like kids dropping rocks on to cars from freeway overpasses".

As the Iraq war shows, hysterical over-reaction to threats is a constant of US history. But at the time few considered it an over-reaction to build fall-out shelters across the land, and drill schoolchildren on how to shelter under their classroom desks in the event of nuclear attack.

A couple of months after Sputnik, hysteria merged with outright national humiliation, when the Vanguard rocket supposed to put America's first satellite into space blew up on the launch pad, live on television, having climbed just four feet into the air. The occasion proved that Pentagon spin was as brazen then as it is now. A military spokesman denied an explosion had taken place. What the world had watched, he said, was "rapid burning".

Eventually, in February 1958, the US did successfully launch its first satellite, Explorer-1. By then, however, the Russians had already put an animate object, in the shape of a terrier called Laika, into space in a much larger Sputnik 2. The unfortunate animal is believed to have died of stress and overheating a few hours into her flight, but the headlines paid scant attention to that. "Soviets Orbit Second Artificial Moon: Communist Dog in Space," read one typical specimen.

Even more profound was the political impact. The Democrats made hay of Ike fiddling (or more exactly perfecting his short game) as Washington burnt.

In the popular mythology, America had stood by watching, mesmerised by shallow consumerism, as the Reds vaulted into tomorrow.

In truth, Sputnik was something of a bluff. The R-7 rocket was too big and too expensive to manufacture in any number, nor did the Russians then have the technology to guide an ICBM to its destination. Sputnik, too, was a bluff. The missile, Nikita Khrushchev privately acknowledged, "was only a symbolic counterthreat to the United States". Not until the 1960s, his son would later reveal, did Moscow acquire its first operational ICBMs. But at the time no one knew this – or, to be precise, no one could say so.

The legend of the "missile gap" was born, and may have proved crucial to John F Kennedy's defeat of Richard Nixon in the 1960 presidential election, as he hammered on about how the Eisenhower-Nixon team had allowed the Soviet Union to achieve strategic superiority. It was not so, as Eisenhower had known well for years, thanks to secret U-2 spy plane flights at altitudes Soviet air defences were unable to reach.

Indeed, the Sputnik programme was partly conceived as a riposte to the U-2.

But for Eisenhower to have given proof that the Democrats' claim was false, he would have had to admit the U-2's existence (which did of course become public knowledge in 1960, when the Soviets finally managed to shoot down the one flown by Gary Powers). Thus did Sputnik usher in the most dangerous phase of the Cold War, which culminated in the Cuban missile crisis of October 1962 – after which both countries in effect decided "never again", installing a hotline between their two capitals, and accepting the doctrine of MAD (mutually assured destruction).

Five decades on, it is even clearer that while the first Soviet satellite most certainly changed the world, it was not in the way expected at the time. Yes, the R-7's legacy persists, in the land- and submarine-launched ICBMs that are the basis of both sides' strategic nuclear weapons: lodged in Trident submarines silently patrolling the oceans, and the missile silos that look like unmanned power grid relay units, eerily dotted across the empty plains of North Dakota.

True, too, Sputnik might be seen as the first step towards a future militarisation of space, where one day lethal weapons may be dropped down on Earth, as LBJ so colourfully imagined. But the SDI "Star Wars" programme announced by Ronald Reagan has shrunk to an uncompleted missile defence system that causes diplomatic ructions with Moscow, is of uncertain reliability, and is of dubious strategic value.

If Sputnik was a poor guide to the military future, the giddy expectations it stirred of man's presence in space have not been met either. At the time, it seemed to prove that science fiction's claims were not fiction at all. Moreover, Nasa was set up within a few months, under intense public pressure.

In retrospect, however, the July 1969 Moon landing remains the defining moment of space exploration. The US had responded to Sputnik and showed that with its mind on the job, it was more than a technological match for the Soviet Union. But the wilder fantasies spawned by Sputnik, of human voyages to the planets and colonies in space, are little more plausible now than then.

In 2004, President George Bush set out a new vision for Nasa, vowing to complete the international space station and to establish a permanent base on the Moon by 2020, from which "human beings are headed for the cosmos". But the plans have struck no public chord. The foreseeable future extends no further than unmanned missions within our own solar system.

In short, the era that Sputnik inaugurated has been not outward-looking, but introspective, focused not on the great dark blue yonder of the universe, but on the needs and problems of our own troubled and fragile planet. Since Sputnik, 6,000-plus satellites have been put into space. Today there are perhaps 900 up there functioning, some monitoring the environment, and at least half of them for communications purposes, both civilian and military.

This is not the "Space Age" that Sputnik was meant to usher in, but an "Information Age" powered by satellites which has led to a new industrial revolution. Yet, paradoxically, it may be of greater military relevance than ever – not to annihilate an enemy power, but in the more subtle defeat of today's terrorist foes. Not with intercontinental missiles, but by intercepting their phone calls and spying on their activities from space – courtesy, ultimately, of Sputnik.

Microsoft has revamped its slow-selling Zune digital player


Microsoft Corp. Chairman Bill Gates is joined by J Allard, the company's corporate vice president for design and development.

How much of this development is for customer satisfaction, how much is for business, or is it both?

Microsoft has revamped its slow-selling Zune digital music player and created a MySpace-style social-networking site in its drive to compete with Apple's market-leading iPod player.

In large part, the Microsoft moves announced Tuesday - the introduction of a smaller, sleeker version of the Zune player and the planned Zune social Web site - reflect an attempt to build scale for a brand that so far has achieved only niche status.

Microsoft said it had sold about 1.2 million units of the original device in the last year.

"For something we pulled together in six months, we are very pleased with the satisfaction we got," Bill Gates, Microsoft's chairman, said in an interview Tuesday. "The satisfaction for the device was superhigh. The satisfaction on the software actually is where we'd expect to see a huge uptick this year. It was just so-so on the software side."

Microsoft said it had re-engineered the Zune hardware and software and the associated digital music store to make them all easier to use.

"I'm sure a year from now we'll do even better," Gates said. "But I'm blown away by what they've been able to do in a year."

Many of the changes are stylistic. The company reworked the device's navigation button and dropped one of its signature colors, brown, from the list of options. The Zune will be available in black, pink, green and red.

But one of the most striking changes had to do with Microsoft's effort to enhance what had been perhaps the most talked-about feature on the original device: the ability to share music files and other media wirelessly with other Zune owners.

Far too few people, however, purchased the player for such sharing to become commonplace, and the function held little appeal because it was crippled by usage rules negotiated with the music industry. Shared songs expired within a few days, even if the recipient did not play them. And a file acquired from one Zune user could not be shared with a third user.

Under the new rules, Microsoft said, shared songs would have no expiration date, and songs could be passed along repeatedly from one device to another. But a shared file can be played only three times on each Zune.
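
The revised rules amount to a simple per-device play counter with no expiry clock. A minimal sketch of that logic (names and structure are hypothetical, invented for illustration; this is not Microsoft's implementation):

```python
# Toy model of the revised Zune sharing rules described above: shared songs
# never expire, can be passed from device to device, but each device may
# play a given shared file only three times. Purely illustrative.

MAX_PLAYS_PER_DEVICE = 3

class SharedSong:
    def __init__(self, title):
        self.title = title
        self.plays = {}            # device id -> plays so far on that device

    def play(self, device_id):
        count = self.plays.get(device_id, 0)
        if count >= MAX_PLAYS_PER_DEVICE:
            return False           # play limit reached on this device
        self.plays[device_id] = count + 1
        return True

    def share_to(self, device_id):
        # The file itself is passed along freely with no expiration; the
        # receiving device simply starts with its own play count of zero.
        return self

song = SharedSong("Demo Track")
print([song.play("zune-A") for _ in range(4)])   # [True, True, True, False]
print(song.share_to("zune-B").play("zune-B"))    # True: a new device gets its own three plays
```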

Partly to warm up the initially tepid response, the company is creating a social-networking site, Zune Social, to encourage the sharing of samples of songs online, even for fans who do not own a Zune player. Members of the network will also be able to use a small application on their computers to display which songs they have been listening to, and that information can be posted on certain Web sites outside the network or sent by e-mail to friends.

Various social networking sites, like Facebook, already offer sharing of samples of songs.

"The whole idea behind Zune is much broader than the devices themselves," said J Allard, the Microsoft vice president who oversees design and development for consumer products like the Zune and the Xbox 360 game consoles.

"The conditioned thought is around a portable device being the center point of the experience, when in fact it's not. It really is about how do we start taking Zune beyond that device."

He said the social networking would appeal to Zune owners and people who had not bought the device.

Van Baker, an analyst at the research firm Gartner, said the Zune revisions amounted to "a much-needed line extension" for the brand.

"Is it enough to get somebody to move away from Apple to Microsoft? I don't think so," he said, "but it should help Microsoft against some of the other alternatives."

Anesthesia method blocks pain without numbness or paralysis

The world's hottest work in anesthesiology is being done at Harvard, where researchers are pouring pepper on pain.

Scientists at Harvard Medical School and Massachusetts General Hospital today described a new "targeted" approach to anesthesia that uses the active ingredient in chili peppers as part of an ingenious recipe for blocking pain neurons. Most critically, the technique doesn't cause the numbness or partial paralysis that is the unwelcome side effect of anesthesia used for surgery performed on conscious patients.

If approved for use in humans, the method could dramatically ease the trial of giving birth -- by sparing women pain while allowing them to physically participate in labor. It could also diminish the trauma of knee surgery, for instance, or the discomfort of getting one's molars drilled. Not only would there be no "ouch," there would be none of the sickening wooziness or loss of motor control that comes from standard forms of "local" anesthesia.

In time, the process might even be employed for major surgery on the heart and other organs, the researchers said. More prosaically, the work might also represent a breakthrough cure for the common itch.

The work on lab rats, described in the scientific journal Nature, breaks from the standard approach to local anesthesia, which usually involves anesthetics delivered by catheter tubes or injections that silence all neurons in a given region of the body, not just those that sense pain. Shutting down just the pain neurons means that patients could still feel a light touch and other non-hurtful sensations.

"This could really change the experience of, for example, knee surgery, tooth extractions, or childbirth," said Dr. Clifford Woolf, senior author of the study and a researcher in anesthesia and pain management at Mass. General. "The possibilities are almost endless."

Woolf collaborated with Bruce Bean, professor of neurobiology at Harvard Medical School, in research that employed surprisingly basic scientific principles as well as some unlikely ingredients -- capsaicin, the stuff that imparts "hot" to chili peppers, as well as an all-but-forgotten variation of a standard anesthesia, long dismissed as clinically useless.

"We plucked a little of this and little of that off the shelves," Bean said. "The project is really a great illustration of how basic biological principles can have very practical applications."

Indeed, scientists with no involvement in the Harvard study were most surprised by its simplicity.

"It's a really clever piece of work, based on one of those 'I wish I'd thought of that' ideas," said Dr. Stephen G. Waxman, head of the department of neurology at Yale University's School of Medicine. "This is an important piece of research."

There's also sweet historic symmetry to the discovery.

Boston, after all, is the city that invented feeling no pain -- at least in surgery.

Modern anesthesia was first successfully employed in surgery in October 1846, one of the greatest moments in medicine. In Boston's Public Garden, the second-largest statue -- after that of George Washington on his horse -- is a soaring pillar, adorned with roaring lions and bas-relief depictions of 19th Century surgeons, that celebrates the "discovery that the inhaling of ether causes insensibility to pain. First proved to the world at the Massachusetts General Hospital."

Not far away, modern Mass. General's original "ether dome" still stands, a national landmark and popular pilgrimage point for anesthesiologists from around the world.

The work undertaken by Woolf, Bean and post-doctoral researcher Alexander Binshtok exploits well-known concepts of how electrical signals in the nervous system depend on ion channels -- proteins that make passageways through the membranes of nerve cells. Pain-sensing neurons possess a unique channel protein, TRPV1, but one that is usually blocked by a molecular "gate."

Medicine for more than 150 years has relied on general and local anesthetics that penetrate and suppress sensation in all neurons, not just those nerve cells dedicated to sensing pain. That's why an epidural or a simple shot of Novocain leaves a whole region of the body numb or paralyzed, because all nerve cells are affected.

Enter the hot chili pepper, in the form of capsaicin.

Enter, too, a failed derivative of the common anesthetic lidocaine, invented in the 1940s. The derivative, known as QX-314, was deemed useless because it couldn't penetrate cell membranes to block sensation. In non-pharmaceutical terms, that's a bit like having a power shovel that can't cut earth.

In experiments, the Harvard researchers found that the chili pepper ingredient generated heat that opened the gate to pain neurons, but had no similar effect on other nerve cells. Then, when they introduced the lidocaine derivative, it charged through the open channels to block pain in those neurons, but was still unable to enter other nerve cells, such as "motor" neurons that control coordination and mobility.

Thus, in rat experiments, there appeared to be a total shutdown of pain, with no apparent numbness or paralysis.
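
The selectivity boils down to two-key logic: the lidocaine derivative can only get inside a neuron whose TRPV1 channel has been opened by capsaicin, and only pain-sensing neurons carry TRPV1. A minimal sketch of that logic (a conceptual illustration of the mechanism described above, not a biophysical simulation):

```python
# Conceptual model of the capsaicin + QX-314 combination described above.
# Only pain-sensing neurons carry the TRPV1 channel; capsaicin opens TRPV1,
# letting the membrane-impermeant QX-314 enter and silence those cells only.

class Neuron:
    def __init__(self, kind, has_trpv1):
        self.kind = kind
        self.has_trpv1 = has_trpv1
        self.trpv1_open = False
        self.blocked = False

    def apply_capsaicin(self):
        if self.has_trpv1:
            self.trpv1_open = True    # the molecular "gate" swings open

    def apply_qx314(self):
        # QX-314 cannot cross an intact membrane; it needs an open channel.
        if self.trpv1_open:
            self.blocked = True

neurons = [
    Neuron("pain", has_trpv1=True),
    Neuron("motor", has_trpv1=False),
    Neuron("touch", has_trpv1=False),
]

for n in neurons:
    n.apply_capsaicin()
    n.apply_qx314()

print({n.kind: ("blocked" if n.blocked else "active") for n in neurons})
# {'pain': 'blocked', 'motor': 'active', 'touch': 'active'}
```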

The rats received injections near nerves leading to their hind feet, and lost the ability to feel pain in their paws. But they continued to scamper about their cages normally and showed sensitivity to touch and other stimulation.

"We introduced a local anesthetic selectively into specific populations of neurons," said Bean. "Now we can block the activity of pain sensing neurons without disrupting other kinds of neurons that control movements or non-painful sensations."

Experimentation will likely move on to sheep, then humans. One problem that needs to be addressed is whether the capsaicin might cause such a burning sensation when first injected -- before the lidocaine derivative shuts down the pain -- that it may be too uncomfortable for use as an anesthetic. But the researchers are confident they can find a more practical "warming" chemical to open the gateways to the pain neurons.

"This method could really transform surgical and post-surgical analgesia. Patients could remain alert without suffering pain. But they also wouldn't have to cope with numbness or paralysis," Woolf said.

Noting that itch-sensitive neurons are similar to nerves that sense pain, he added: "We may have even found a good treatment for the common itch."
