Friday, November 23, 2007
Super-Earths Will Have Plate Tectonics, Scientists Predict
The discoveries of large Earth-like planets outside our Solar System, so-called “super-Earths,” have prompted much speculation about just how Earth-like they may be. Recently, scientists from Harvard University suggested that these planets will, like Earth, have plate tectonics.
Plate tectonics, the movement of the giant plates that make up Earth's solid outer shell, is responsible for earthquakes, volcanoes, and other major geological events. In essence, it has dominated Earth's geological history. Earth is the only known planet with plate tectonics, and this activity has been proposed as one necessary condition for the evolution of life.
However, in a paper published in The Astrophysical Journal, Harvard planetary scientist Diana Valencia and her colleagues predict that super-Earths – which are between one and ten times as massive as Earth – will fulfill one of the requirements for sustaining life by having plate tectonics.
“Some of these super-Earths may be in the 'habitable zone' of their solar systems, meaning they are at the right distance from their mother star to have liquid water, and thus life,” Valencia, the paper's corresponding author, told PhysOrg.com. “Ultimately, though, only these planets' thermal and chemical evolution will determine whether they are habitable. But these thermal and chemical properties are closely tied to plate tectonics.”
Using detailed models they developed of the interiors of massive terrestrial planets, Valencia and her group determined how the mass of a super-Earth is related to the thickness of its plates and the magnitude of the stresses the plates experience. These stresses, generated by the slow convection of the mantle, are the driving force behind the deformation and subduction (when one plate sinks below another) of the plates. For planets more massive than Earth, this driving force is larger than it is on Earth.
The group found that as planetary mass increases, there is an increase in the shear stress and a decrease in the plate thickness. Both of these factors weaken the plates and contribute to plate subduction, which is a key component of plate tectonics. Therefore, the scientists say, the conditions required for plate deformation and subduction are easily met by super-Earths. Their results show that this is particularly true for the larger super-Earths.
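To see how such a scaling argument works in practice, here is a minimal sketch in Python. The exponents are placeholders chosen only to reproduce the qualitative trend the study describes (stress rising and plate thickness falling with planetary mass); they are not the values Valencia's group derived.

```python
# Illustrative toy scaling for plate behaviour on super-Earths.
# NOTE: the exponents below are hypothetical placeholders, chosen only to
# reproduce the qualitative trend described in the article (shear stress
# rising and plate thickness falling with planetary mass); they are not
# the values derived by Valencia and her colleagues.

def plate_trends(mass_in_earth_masses, stress_exp=0.3, thickness_exp=-0.45):
    """Return (relative shear stress, relative plate thickness), Earth = 1."""
    m = mass_in_earth_masses
    return m ** stress_exp, m ** thickness_exp

if __name__ == "__main__":
    for m in (1, 2, 5, 10):
        stress, thickness = plate_trends(m)
        print(f"{m:2d} Earth masses: stress x{stress:.2f}, thickness x{thickness:.2f}")
```

Under these assumed exponents, a 10-Earth-mass planet would feel roughly twice the driving stress acting on plates only about a third as thick as Earth's, which is exactly the combination the researchers argue makes subduction easier.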
“Our work strongly suggests that super-Earths, even if they have no water, will exhibit plate tectonic behavior,” Valencia said.
In the future, it may be possible to verify these results using NASA's planned Terrestrial Planet Finder missions or the European Space Agency's Darwin project, which will consist of three telescopes to search out Earth-like planets.
Super-Earths Emerge From Snowy Conditions
Many extrasolar planets have been discovered circling other stars, a few of which are 5-15 times the mass of the Earth and are thought to be solid like our planet. Astronomers were surprised to find these planets orbiting small, cooler red dwarf stars. Researchers believe these "super-Earths" form in the chilly halo of snow, ice and frozen gases that collects around red dwarf stars as they cool. There probably isn't enough solid material to form rocky planets much larger than Mercury in such a star's habitable zone.
The 200 known planets that orbit other stars exhibit incredible variety. Among them are a handful of worlds that weigh between 5 and 15 times as much as Earth. Astronomers believe these "super-Earths" are rocky iceballs rather than gas giants like Jupiter. While theorists can explain how such worlds form around Sun-like stars, the discovery of super-Earths around tiny red dwarf stars was surprising. New research suggests that some super-Earths build up rapidly when local temperatures drop and ices condense out of the surrounding gas.
"We believe that some super-Earths form during a cosmic 'snowstorm.' Only this snowstorm envelops the whole planet and lasts millions of years," said astronomer Scott Kenyon of the Smithsonian Astrophysical Observatory.
All planets form within a disk of gas and dust surrounding a newborn star. Rocky planets form close to the star, where it is warm, while icy and gaseous planets form farther out, where it is cold. When it was young, the Sun was relatively stable, leading to a natural progression of small, rocky worlds in the hot inner solar system and large, gaseous worlds in the cold outer solar system.
In contrast, planetary systems around small red dwarf stars undergo dramatic changes in their early history. As the young star evolves, it dims. The warm inner disk starts to freeze, creating conditions where water and other volatile gases condense into snowflakes and ice pellets.
"It's like a massive cold front that sweeps inward toward the star," explained first author Grant Kennedy of Mount Stromlo Observatory in Australia. "The ices add mass to a growing planet, and also make it easier for particles to stick together. The two effects combine to produce a planet several times the size of Earth."
The disks that surround small red dwarf stars tend to contain less material than the disk that formed the solar system. Without the "snowstorms" in these smaller disks, there is not enough material to make super-Earths.
Although astronomers have discovered a few super-Earths orbiting red dwarf stars, it may be tough to find worlds hospitable to humans. All of the known super-Earths are icy worlds with no liquid water. Red dwarf stars are so dim and cool that their warm "habitable zones" are very close to the star, where there is very little planet-forming material.
"It's difficult to make anything larger than Mercury or Mars in the habitable zone of a red dwarf.
Super-Earths will have plate tectonics
Super-Earths" - rocky planets up to 10 times the mass of Earth that orbit other stars - probably have similar structures to our world, with a solid inner core surrounded by a liquid mantle and then a crust. They may even have plate tectonics, which some argue is necessary for life to evolve. Dimitar Sasselov of the Harvard-Smithsonian Center for Astrophysics and colleagues came to this conclusion after modelling geological processes on planets of various sizes. They found that as planetary mass increases, more heat is trapped and convection increases. As a result the shear stress within the crust increases too and plate thickness decreases. That means the plates are weaker and plate tectonics becomes "inevitable". Our own planet seems to lie at the threshold. If it were any less massive, it would probably not have plate tectonics. Plate tectonics may boost biodiversity by recycling chemicals and minerals through the crust.
"When it comes to habitability, super-Earths are our best destination," says Sasselov. "The idea is right," says Jack Lissauer of NASA's Ames Research Center in Moffett Field, California. "Plate tectonics is more likely on more massive planets."
Stem Cell Injection Protects Against Nerve Cell Death After Stroke, Study Suggests
Scientists have tested cellular therapy for ischemic stroke in rats. It turned out that intravenous transplantation of mesenchymal stem cells restores blood supply to the brain and protects its nerve cells from death.
Under anaesthesia, the rats’ middle cerebral artery was clamped in order to impair the blood supply to the left hemisphere. Three days later, the animals were injected intravenously with mesenchymal stem cells (MSCs) derived from bone marrow. These cells are able to differentiate into the cells of other tissues, including nerve cells.
One group of animals was sham-operated: the operation was performed, but the artery was not clamped. In the control group, the artery was clamped but no stem cells were given. The MSCs for transplantation were isolated from the femoral bone marrow of other animals of the same laboratory strain, labelled with a fluorescent dye and injected into the rats’ tail vein. The animals’ brains were examined six weeks later.
The research was carried out by specialists of the Trans-Technologies company, with the participation of the Scientific Research Institute of Experimental Medicine of the Russian Academy of Medical Sciences (St. Petersburg).
Unexpectedly, few fluorescent cells were found in the brain specimens, and they were located not in the affected cortical zone but near the brain ventricles. This is surprising, because the Trans-Technologies specialists had previously shown experimentally that stem cells introduced into the bloodstream reach damaged tissue within several days. Nevertheless, the injection of stem cells proved effective in restoring the affected brain: the damaged area in the treated rats was smaller than in the untreated animals.
Transplantation helped preserve the parts of the brain responsible for the formation of emotions and the regulation of movement. In the untreated rats these regions were noticeably damaged, and their stroke area was surrounded by an extensive zone of dying nerve cells. The stem cells nearly doubled the number of blood vessels in the injured left hemisphere, which helped restore cerebral blood supply. Interestingly, more vessels also appeared in the symmetrical, unaffected hemisphere; this phenomenon has not been described in scientific publications, so the researchers plan to investigate it separately.
Thanks to the stem cells, the treated rats successfully passed behavioural tests two to three weeks after transplantation. They became calmer, oriented themselves better in space and remembered the positions of surrounding objects. In addition, the animals regained symmetry of reactions between the left and right sides of the body and in the use of their limbs.
In the researchers’ opinion, mesenchymal stem cells are a practically ideal material for cellular therapy because they can be introduced directly into the blood. This avoids the serious surgery under general anaesthesia that would be required to inject cells directly into the brain.
Although the researchers cannot yet fully explain the MSCs’ mechanism of action, their beneficial effect on the brain after a stroke is evident. Possibly, with earlier MSC transplantation, more cells would reach the brain and the beneficial effect would be even more apparent.
Mechanisms of Nerve Cell Death
Dr. Yin has been engaged in studying the complex mechanisms of cell death after stroke, spinal cord injury and other relevant neurodegenerative disorders such as Alzheimer’s disease using cell biology, molecular biology, biochemistry, and pharmacology approaches. Currently, effective treatments of these diseases are not available and the potential benefit of either medical or surgical treatment remains debatable. His major interest is to investigate the molecular regulation of apoptotic cell death in amyloid-beta induced endothelial cell death, ischemic brain damage and traumatic spinal cord injury, focusing on delineating the role of several key regulators of apoptosis (BH3-only family members) and multiple regulatory signaling pathways that regulate these apoptosis-related genes. Both in vitro and in vivo models of Alzheimer’s disease and CNS injury are employed in these studies.
The main goal of his research is to use an understanding of these molecular mechanisms to identify critical therapeutic targets, which may lead to the development of novel neuroprotective strategies to attenuate CNS injury following cerebral ischemia and spinal cord trauma, and to prevent complications of cerebral amyloid angiopathy and related hemorrhagic stroke in the elderly.
Astronomers Observe Acidic Clouds in Other Galaxies
SRON astronomer Floris van der Tak is the first to have observed acidic clouds of gas and dust outside our own Milky Way galaxy. He did this by pointing the James Clerk Maxwell Telescope, located on Hawaii, at two nearby galaxies. Astronomers think that acidification inhibits the formation of stars and planets in these dust clouds. Now it is a case of waiting for precise measurements from the SRON-built HIFI instrument that will be launched on the Herschel space telescope next year.
The formation of stars and planets in the universe is a delicate process. Clouds of gas and matter rotate and draw together under the influence of gravity. Pressure and temperature then rise, which eventually leads to the kindling of a new star with planets potentially orbiting it. Yet why does this happen at some locations in the universe and not at others? What are the conditions for star and planet formation? How does this process start and when does it stop? Astronomers are fumbling in the dark.
‘The quantity of charged molecules in the dust cloud appears to have an inhibitory effect’, says Floris van der Tak. ‘These ensure that the magnetic fields can exert a greater influence on the cloud, as a result of which the entire cloud becomes agitated and the star-forming process is disrupted’. Observing these charged molecules directly is difficult. The ratio of acidic water molecules to ordinary water molecules is a measure of the quantity of charged molecules.
However, it is difficult to observe water molecules from under an atmosphere that itself contains a great deal of water vapour. ‘It is like looking for stars in the daylight.’ On Earth it can only be done from a high mountain where the air is rarefied. Such a spot is the 4,092-metre summit of the Hawaiian volcano Mauna Kea, where the James Clerk Maxwell Telescope is located. Van der Tak pointed this telescope at the galaxies M82 and Arp 220, where he discovered areas rich in acidic water molecules.
‘Amazingly, what causes these acidic water molecules to be present in the two galaxies is completely different’, says Van der Tak. ‘In Arp 220 they develop under the influence of X-rays in the vicinity of the central supermassive black hole. In M82, the cause is the ultraviolet radiation emitted by hot young stars in the star-forming region. Therefore, in these particular galaxies the process of star formation inhibits itself, as more and more charged molecules are created.’
The astronomer will be able to deploy even heavier equipment for his research in the not too distant future. Next year, the European Space Agency (ESA) is launching the Herschel space telescope with the SRON-built Heterodyne Instrument for the Far Infrared (HIFI) on board. And in the 5,000-metre-high, completely arid Atacama Desert in Chile, construction has begun on ALMA, an array of 66 telescopes that can together produce detailed maps of galaxies. SRON is one of the partners developing the detectors for these telescopes.
The results of the research by Floris van der Tak and his colleagues Susanne Aalto of the Chalmers University of Technology, Onsala, Sweden, and Rowen Meijerink of the University of California have been published in the scientific journal Astronomy & Astrophysics.
A Central Bar of the Milky Way Galaxy
Artist's concept of the Milky Way, according to new studies by University of Wisconsin astronomers using NASA's Spitzer Space Telescope.
Illustration credit: NASA/JPL-Caltech/R. Hurt (SSC)
From their massive survey of stars near the heart of our Milky Way Galaxy, these astronomers found evidence that the Milky Way probably has a definitive, large bar feature measuring about 27,000 light-years in length, making it look as shown in the illustration above, where the position of our solar system is indicated. The survey sampled the light from an estimated 30 million stars in the Galactic plane.
Video, interactivity could nab Web users by '10
Thanks, YouTube! Web Will Slow by 2010
The Bandwidth Demands of Increasingly Complex Web Sites and Content Will Slow Us Down
Enjoy your speedy broadband Web access while you can.
The Web will start to seem pokey as early as 2010, as use of interactive and video-intensive services overwhelms local cable, phone and wireless Internet providers, a study by business technology analysts Nemertes Research has found.
"Users will experience a slow, subtle degradation, so it's back to the bad old days of dial-up," says Nemertes President Johna Till Johnson. "The cool stuff that you'll want to do will be such a pain in the rear that you won't do it."
Nemertes says that its study is the first to project traffic growth and compare it with plans to increase capacity.
The findings were embraced by the Internet Innovation Alliance (IIA), a tech industry and public interest coalition that advocates tax and spending policies that favor investments in Web capacity.
"We're not trying to play Paul Revere and say that the Internet's going to fall," says IIA co-Chairman Larry Irving. "If we make the investments we need, then people will have the Internet experience that they want and deserve."
Nemertes says that the bottleneck will be where Internet traffic goes to the home from cable companies' coaxial cable lines and the copper wires that phone companies use for DSL.
Cable and phone companies provide broadband to 60.2 million homes, accounting for about 94% of the market, according to Leichtman Research Group.
To avoid a slowdown, these companies, and increasingly, wireless services providers in North America, must invest up to $55 billion, Nemertes says. That's almost 70% more than planned.
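Taken together, the figures quoted above imply a concrete gap. A quick back-of-the-envelope check is sketched below; the derived numbers are only what the rounded figures imply, not values reported by Nemertes.

```python
# Back-of-the-envelope check using only the rounded figures quoted above;
# the derived numbers are implied values, not figures from the study itself.
required_investment = 55e9   # "up to $55 billion" (Nemertes)
overshoot_vs_plans = 0.70    # "almost 70% more than planned"

planned_investment = required_investment / (1 + overshoot_vs_plans)
shortfall = required_investment - planned_investment

broadband_homes = 60.2e6     # homes served by cable and phone companies
market_share = 0.94          # about 94% of the market (Leichtman Research)
total_market = broadband_homes / market_share

print(f"Implied planned investment: about ${planned_investment/1e9:.0f} billion")
print(f"Implied shortfall:          about ${shortfall/1e9:.0f} billion")
print(f"Implied total broadband market: about {total_market/1e6:.0f} million homes")
```

In other words, the quoted numbers suggest providers plan to spend roughly $32 billion, leaving a shortfall on the order of $23 billion across a market of about 64 million broadband homes.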
Much of that is needed for the costly work of running new high-capacity lines. Verizon is replacing copper lines with fiber optics for its FiOS service, which has 1.3 million Internet subscribers.
Johnson says that cable operators, with 32.6 million broadband customers, also must upgrade. Most of their Internet resources now are devoted to sending data to users - not users sending data. They'll need more capacity for the latter as more people transmit homemade music, photos and videos.
Internet slowdown on horizon, study claims
A study claims that Internet performance could start to decline by 2010 due to a growing gap between access capacity and demand.
Nemertes Research estimates that up to US$55 billion needs to be spent to close that gap, or about 60 per cent to 70 per cent more than service providers intend to spend.
"The primary impact of the lack of investment will be to throttle innovation both the technical innovation that leads to increasingly newer and better applications, and the business innovation that relies on those technical innovations and applications to generate value," said the report released Tuesday.
"The next Google, YouTube, or Amazon might not arise, not because of a lack of demand, but due to an inability to fulfill that demand. Rather like osteoporosis, the underinvestment in infrastructure will painlessly and invisibly leach competitiveness out of the economy."
A University of Toronto computer science professor told CTV.ca that while he didn't analyze the report's details, he agrees with its general thrust about a looming Internet slowdown.
"This is an inevitability, whether it's 2010 or 2012," said Eugene Fiume.
"This was predictable in the 1980s," he said.
The exploding use of the Internet in emerging economies like China and India will create "hotspots" within the distributed network that is the Internet, he said.
Data will slow down in these hotspots, much like how traffic slows at a poorly designed city intersection. "You will eventually see the not-so-graceful degradation of the entire system," Fiume said.
Technology analyst Kris Abel told CTV Newsnet that the study may be making too many assumptions: it's difficult to predict the future, and new technologies could offset some of the concerns raised in the study, which have been widely known for some time.
"They look specifically at just wired services," he said.
"We're living in an age now where increasingly we are getting a lot of our internet service through wireless solutions, and in wireless solutions, you don't have the same problems."
Backbone vs. the last mile
Nemertes said it analyzed consumer demand and capacity independently.
Some say the problem isn't with the core backbone of the Internet, but where service providers provide access to consumers -- what the telecommunications sector calls "the last mile."
Fiume said that's partially true, but added, "that's what telcos want you to believe, because that's pushing the problem onto the consumer."
Internet service providers can already regulate those who hog bandwidth by engaging in extensive use of peer-to-peer file-sharing networks, as one example, he said.
The Internet's overall pipes need to be widened, along with improving the efficiency of the rules by which data "packets" are transmitted, he said.
"The telcos didn't plan well enough to deal with the explosion of information content on the Internet writ large," Fiume said, adding, "they trying to make it seem like the fact you're watching YouTube is really causing the problem. That's really very laughable."
Nemertes, which didn't make a spokesperson available to CTV.ca, said no one group funded the study and that funding for it came from its client base.
The data came from several sources:
Research data collected by academic organizations
Publicly available documents, including vendor and service provider financials
Confidential interviews with enterprise organizations, equipment vendors, service providers, and investment companies
"During the course of this project we spoke with 70-plus individuals and organizations for these interviews, and we relied on our base of several hundred IT executives who participate in our enterprise benchmarks," the report said.
However, the group said the Internet remains an exceedingly opaque environment.
"Content providers refuse to reveal their inner workings. This is often for very good reasons, but it's detrimental to the industry," it said.
The group called for industry to develop ways of better sharing data with researchers.
Comments are now closed for this story
alex
It is no surprise to see this report. Providers such as Bell and Rogers have been decreasing bandwidth allowances for the past several years. The demand for more bandwidth simply is not met in this country.
Glenn
Fully agree with first poster. The big companies are stifling competition, just like in local phone service. At the same time, they are way behind the curve in what they have promised the government and consumers. Right now we should be enjoying faster speeds of 8-10 Mbps, and that is still a year or more away for the majority of the population.
Ken
The only people that should be demanding more bandwidth are web hosts and other such providers. As a consumer, I almost never see a full 5 megabit download speed, even though that bandwidth is being allocated for me. And for what? Yes, I need high speed, I do a lot of things on the web. But do I need 5 megabit? No. Consumers need to stop demanding so much, or the next thing they will cry and moan about is the high cost of the bandwidth they DO have.
Gis Bun
Agreeing with Ken, getting what ISPs advertise for speed will never happen. Not just too much traffic but in the case of DSL users, the farther you are from a main hub, the crappier the speed.
I think there needs to be something done with ISPs who *claim* to give users 5 mbps but never do [because of conditions mostly coming from their side]. What they are doing is ignoring their side of the "contract" that an individual signs with them. We promise to pay $40 a month for 5 mbps but we aren't getting it.
John G Chicago
That is silly. These companies have an obligation to their customers that should be met. The idea that consumers are asking too much is not true, especially if their cable company contracts allow them this amount of service.
DP
US$55 billion - guess what that means? Right, your internet bill increases, what else? Those companies have been collecting (can't use the word "earning", sorry) money for years - and spending it on acquisitions, thus killing competition, and on more and more perks for their CEOs/CFOs. If it's now 70% less than needed - then it's their fault.
Jeebus
It had to happen sometime. More and more information is being sent. I think the best way to deal with the increased loads is to remove advertising or reduce the amount of ads and spam that flood the net.
We may end up paying slightly more to cover the lost revenue from advertising but in the long run it has to happen.
Francie Dennison
Technological capability is always well ahead of technological applications. Corporations only catch up when there is a potential for market share loss from failing to do so. Economics is always the driving force, but too often they fail to realize that being on the leading edge can capture a much bigger share away from their competition.
Mike
Ken, bandwidth and speed are separate things.
5mb is your download speed, not your bandwidth.
Bandwidth is measured by how much you download in, say, a month.
Like ISPs placing, say, a 20 GB cap: you can only download/upload that much in a month, or else you incur extra costs. That includes ALL your downloads and uploads, including send/receive calls from an email program or web browser.
Speed is dictated by the tier you pay for, the age of the equipment between the ISP and your computer, the speed of the server, and internet traffic.
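To make the distinction concrete, here is a small sketch using the hypothetical numbers floating around this thread (a 5 Mbps connection and a 20 GB monthly cap); both figures are illustrative only.

```python
# Speed vs. monthly cap, using the hypothetical numbers from this thread:
# a 5 Mbps connection and a 20 GB per month transfer allowance.
line_speed_mbps = 5.0      # megabits per second (hypothetical)
monthly_cap_gb = 20.0      # gigabytes per month (hypothetical)

bytes_per_second = line_speed_mbps * 1e6 / 8   # 8 bits per byte
hours_to_hit_cap = monthly_cap_gb * 1e9 / bytes_per_second / 3600

print(f"Downloading flat-out at {line_speed_mbps:.0f} Mbps would exhaust a "
      f"{monthly_cap_gb:.0f} GB monthly cap in about {hours_to_hit_cap:.1f} hours.")
```

At full speed, such a cap would be exhausted in well under a day, which is why connection speed and monthly allowance feel like very different limits to a heavy user.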
Uwe Warkholdt, Elliot Lake, Northern Ontario
It appears to me that this will happen by 2010. That gives the experts better than 2 years to resolve the problem. But, perhaps the problem is too big for them to handle. But perhaps someone else now is planning a way to get as much out of the consumer as they can. It sounds to me like someone is using the same "old story" for their advantage and never mind everyone else. Is it not how the oil companies started with their song and dance? That is only in my opinion of course.
Mike
Canadians don't even realize how far behind Canada is compared to Europe. The Internet here is way overpriced relative to speed and bandwidth limits. I think it has a lot to do with lack of competition. Unless you're willing to go with a small mom-and-pop shop, you're stuck with either Rogers or Bell.
Jim
Posters 1 & 2 are missing the point. It has nothing to do with what the telcos/cable are offering. It has more to do with the increasing amount of useless drivel (i.e. 99.9% of You Tube content) travelling the information highway.
john
OR: is it a coinqeedink, that the date specified is also the date set for the completion of project S*P*P/N*A*U
ALSO: In an article by Paul Joseph Watson of Prison*Planet*dot* com, he describes the emergence of Internet 2. "The development of "Internet 2" is also designed to create an online caste system whereby the old Internet hubs would be allowed to break down and die, forcing people to use the new taxable, censored and regulated world wide web. If you're struggling to comprehend exactly what the Internet will look like in five years unless we resist this, just look at China and their latest efforts to completely eliminate dissent and anonymity on the web." …
Brett G
In fact, service providers are going the other way, and introducing 'traffic shaping' technology which seriously cripples your bandwidth.
Shaw for example, uses this technology to slow down torrents by 90-95% of bandwidth capability.
I wrote Shaw and asked what benefit am I getting by upgrading to their Extreme-I package for an extra $10 a month, (buying more bandwidth) if they nerf my bandwidth by 90-95% on downloads? They haven't written back.
Traffic shaping technology is wrong, and I hope a class-action lawsuit shuts it down.
IT Guy
Hi, Ken. You must not work in the technology sector. I work in the field of IT and I've specifically worked at an Internet Service Provider before they were all bought out.
There was an article released in April (do a web search for Japan comparative broadband prices) which stated that Japan pays approximately $.70 per Megabit of data compared to Canadian citizens who pay approximately $10.50 per Megabit of data. Guess which country has faster Internet access??
D
Here in Regina, we enjoy rich 10 mbps access and it always seems to get better.
Edward Carlile
I think it would be nice if Internet service providers spent some money getting DSL out to the people who are still on dialup. Although two-way satellite Internet is now available... it is priced way above what a normal DSL subscription is.
Jeff
My 10 mbit connection seems fine here in Edmonton, and I can upgrade to 25 mbit with a simple phone call. Maybe the problem lies with location more than anything; smaller populations usually = older infrastructure.
jon
Out in the rest of Saskatchewan, other than Regina, Saskatoon, etc., we are so far behind we won't catch up. I wrote Sasktel about infrastructure upgrades, and they replied that it is not even foreseeable in the future. Stuck at 1.5 Mbps down, 128 kbps up, and paying too much... I need another option.
Gary
ALL Propaganda!!!! Don't get fooled into thinking they need more $$ and therefore need to run up our prices .... I know there is plenty of bandwidth but it is being throttled to make people think there are issues ....
Johnson Mapple
And the U.N. is itching to take over the running of the internet from the U.S.A. If that ever happens, a slow-down won't be the only thing we'll have to worry about
Ken
IT Guy - As a matter of fact, I do work in the Technology sector. In fact, I'm an IT Manager. Unfortunately, CTV edited some of my comments which is probably why they didn't make sense to you. Regardless, I'm saying that as a home consumer, even if I could get 5 meg down, I don't need that kind of speed. And when the actual providers begin laying down more infrastructure, the cost of high speed internet will invariably go up.
Mike - you need to look up the definition of bandwidth. What you're talking about is not bandwidth at all. Bandwidth is the speed you get, not the arbitrary caps that ISPs can put on your transmissions.