Over in Europe, scientists are getting ready to turn on a huge machine. In fact, it is the biggest machine that human beings have ever built, and one of the most expensive. The machine is called the Large Hadron Collider, or LHC, and scientists hope that it will help them unlock some of the deepest, darkest secrets of the universe.
What are some of these secrets? It turns out that there are all sorts of things that scientists don't know about the universe. For example, where does "mass" come from? We know that all things made of atoms have mass, but we don't actually know where mass comes from. And speaking of mass, why can't we see lots of it? When we try to measure the mass of the universe, it seems to be a lot heavier than it should be. There seems to be lots of matter in the universe that we can't see. What is this "dark matter", and where is it hiding? And what about black holes? Can we create tiny black holes, and if we can, how do they behave? What can we learn from them? We may be able to answer all of these questions and many more using the LHC.
What is the LHC, and how does it work? It is an incredibly complex machine. But if we start with the basics, we can understand the essence of the LHC.
We have all heard of atoms. We can make water, for example, by combining hydrogen atoms with oxygen atoms. That's easy enough. What is inside an atom? Using fairly simple experiments at the beginning of the twentieth century, scientists were able to discover electrons, protons and neutrons. By the way, protons and neutrons are known as hadrons.
The next question is obvious: What is inside a hadron? This is not so easy a question to answer. But scientists discovered that they could bash two protons together to learn what's inside. The machine that does the bashing is called a particle accelerator, also known as an atom smasher.
The earliest particle accelerators were very simple and could fit in the palm of your hand. By building bigger and bigger particle accelerators, scientists could learn more and more. The basic idea behind a particle accelerator is simple. You take a particle like a proton, and you put a group of them in a sealed tube. You take all the air out of the tube using a vacuum pump, so the protons don't have anything to run into. Then, using microwave energy (a lot like the energy used in a microwave oven), you accelerate the protons.
Most particle accelerators are shaped like rings, and they contain magnets that steer the protons around the ring and keep the protons bunched together. As the protons accelerate, their speed gets closer and closer to the speed of light.
Protons are incredibly tiny, but near the speed of light they carry a lot of energy. To understand this, think about a baseball. If a little kid throws a baseball at you, it probably won't even hurt. If a major league pitcher throws a 100 mph fastball at you, it will hurt a lot. If someone shoots a baseball out of a cannon at 500 mph and it hits you, it will kill you. A proton in a particle accelerator travels at nearly 186,000 miles per second, almost the speed of light, and it has a lot of energy despite its tiny size.
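To put rough numbers on the baseball analogy, here is a back-of-envelope sketch. The 7 TeV figure is the LHC's design energy per proton and the conversion factor is a standard physics constant; neither comes from the article itself.

```python
# Back-of-envelope energy comparison. The 7 TeV figure is the LHC's
# design energy per proton; the rest are standard constants.
JOULES_PER_EV = 1.602e-19

proton_energy_j = 7e12 * JOULES_PER_EV      # one 7 TeV proton, in joules
baseball_ke_j = 0.5 * 0.145 * 45**2         # 0.145 kg ball at ~100 mph (~45 m/s)

print(f"One LHC proton:   {proton_energy_j:.2e} J")
print(f"100 mph baseball: {baseball_ke_j:.0f} J")
```

A single proton carries only about a microjoule, far less than the baseball's roughly 150 joules, but it is some 10^26 times lighter than the ball, and an LHC beam contains hundreds of trillions of such protons.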
The Large Hadron Collider is the biggest particle accelerator ever built, and it will produce the fastest protons human beings have ever created. Its ring is over 5 miles in diameter, with a tube 17 miles long. And the LHC actually has two tubes, so that two groups of protons can accelerate in opposite directions. The scientists will then slam the two streams of protons together in the biggest head-on collision ever.
The collision will happen in an underground detector room that is as big as a warehouse. The detector is basically a gigantic, specialized movie camera that can sense all of the debris that flies out from the collision. The debris reveals the particles that make up protons, such as quarks, along with other fundamental particles, such as leptons. The only reason that we know that quarks and leptons exist is because we have particle accelerators.
Because the collisions in the LHC will be so massive, scientists are hoping that they will see new particles that no one has ever seen before. For example, scientists think there's a particle called the Higgs boson, and that this particle is the thing that gives matter its mass. But scientists have never observed a Higgs boson, so they don't know for certain whether it exists. Scientists also hope that the LHC will have enough energy to create mini black holes, which would then immediately evaporate because they are so small. And maybe scientists will find new particles that no one has ever imagined before.
Because of these possibilities, scientists all over the planet are excited about the LHC, and thousands of scientists are working on the project. With luck, they can start accelerating their first protons sometime in 2008 and begin making new discoveries. We should learn many new things about how the universe works from the LHC.
Digital Information Systems and Data
Digital information is being created at a faster pace than previously thought, and for the first time, the amount of digital information created each year has exceeded the world's available storage space, according to a report from analyst firm IDC.
"This is our first time ... where we couldn't store all the information we create even if we wanted to," states the EMC-sponsored report, titled The Diverse and Exploding Digital Universe.
The amount of information created, captured and replicated in 2007 was 281 exabytes (or 281 billion gigabytes), 10% more than IDC previously believed — and more than the 264 exabytes of the estimated available storage on hard drives, tapes, CDs, DVDs and memory. IDC revised its estimate upward after realising it had underestimated shipments of cameras and digital TVs, as well as the amount of information replication.
The 2007 total is well above that of 2006, when 161 exabytes of digital information was created or replicated.
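The report's units are easy to mix up, so here is a minimal sketch of the arithmetic, using the decimal convention IDC uses (1 exabyte = 1 billion gigabytes) and only the figures quoted above.

```python
# Unit check: IDC uses decimal units, so 1 exabyte = 10**9 gigabytes.
GB_PER_EB = 1e9

created_2007_eb = 281   # digital information created/replicated in 2007
storage_2007_eb = 264   # estimated available storage in 2007
created_2006_eb = 161   # created/replicated in 2006

shortfall_eb = created_2007_eb - storage_2007_eb
growth = created_2007_eb / created_2006_eb - 1

print(f"2007 shortfall: {shortfall_eb} EB ({shortfall_eb * GB_PER_EB:.0e} GB)")
print(f"Year-over-year growth: {growth:.0%}")
```

The 17-exabyte shortfall is what the report means by information exceeding available storage; the roughly 75 percent year-over-year jump is why IDC expects the gap to keep widening.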
The world isn't actually running out of storage space, IDC notes, because a lot of digital information doesn't need to be stored. Examples include radio and TV broadcasts consumers listen to and watch but don't record, voice call packets that aren't needed when a call is over, and surveillance video that isn't saved.
But the gap between available storage and digital information will only grow, making it that much harder for vendors and users to efficiently store information that does need to be archived.
In 2011 there will be nearly 1,800 exabytes of information created, twice the amount of available storage, IDC predicts. One long-term experiment planned for the soon-to-open Large Hadron Collider in Switzerland, the world's biggest particle accelerator, will by itself create an astonishing 300 exabytes of data per year, IDC says.
EMC's president of content management, Mark Lewis, doesn't think the world will ever hit the point where the world's available storage is exceeded by the amount of information organisations need to store. "With the price points of storage continuing to decline, I don't think we're ever going to create some kind of storage shortage," he says.
Organisations and their employees create about a third of new data, but organisations are ultimately responsible for maintaining the security, privacy and reliability of 85% of all data, according to IDC.
Information growth is placing greater importance on retaining data in lower-cost, environmentally sound ways, with lower-performance drives, archiving and powering down storage devices containing rarely accessed data, Lewis says.
About 70% of new information is created when individuals take actions, such as snapping pictures, making VoIP calls, uploading content to YouTube and sending emails. But more than half of the information related to individuals isn't directly created by them. Rather, the bulk of this digital content is a person's "digital shadow", information about individual human beings sitting in cyberspace. Digital surveillance photos, web search histories, banking and medical records and general backup data all contribute to someone's digital shadow.
Here's a quick look from IDC at how a few businesses and industries contribute to growing data volumes:
@ Wal-Mart refreshes its customer databases hourly, adding a billion new rows of data each hour to a data warehouse that already holds 600 terabytes.
@ The oil and gas industry is developing a "digital oilfield" to monitor exploration activity. Chevron's system accumulates 2 terabytes of new data each day.
@ The utility industry may develop an "intelligent grid" with millions of sensors in the distribution system and power meters.
@ Manufacturing companies are rapidly deploying digital surveillance cameras and RFID tracking.
@ YouTube's 100 million users create nearly as much digital information as all medical imaging operations.
Sunday, March 23, 2008
Sony has backed down from its decision to charge its customers $50 to remove trial software, known as “bloatware” or “trialware,” from the hard disks of new laptops.
The company has started to offer a new option called “Fresh Start,” which assures customers that their new computers will arrive free of the annoying applications and applets that try to convince them to buy various software products.
“Opt for a Fresh Start and your VAIO PC will undergo a system optimization service where specific VAIO applications, trial software and games are removed from your unit prior to shipment. Fresh Start safely scrubs your PC to free up valuable hard drive space and conserve memory and processing power while maximizing overall system performance right from the start,” is the official description of the service, according to Sony’s website.
But the controversy started soon after Engadget posted an article about Sony’s $50 Fresh Start. As customers voiced their irritation at having to pay $50 for a bloatware-free computer, Sony changed its mind and announced that it would offer the option for free. For the moment, the option is limited to the Vaio TZ2000 and Vaio TZ2500 laptops.
After causing controversy for charging US$49.99 to remove trial software from hard disks of new laptops, Sony has backtracked from imposing the fee on customers.
Starting on Saturday, Sony's Fresh Start software optimization feature will be free, the company announced.
Fresh Start is a Sony feature that lets customers buy certain laptops without so-called "bloatware," trial software that laptop makers often load onto new machines. Sony was asking buyers of the Vaio TZ2000 and Vaio TZ2500 notebooks with the Windows Vista Business OS to pay $49.99 for the removal of the extra software. Those customers already pay an additional $100 to upgrade to Windows Vista Business OS from Windows Vista Home Premium.
But after an uproar erupted online Friday in response to the Fresh Start fee, Sony has decided to offer the option for free.
"We want Vaio users to have the best experience possible with our PCs, and we believe Fresh Start will help ensure that happens right out-of-the-box," the company said in a statement.
Sony earlier justified the fee by saying it covers removal of the unwanted software before shipment.
Fresh Start is only available to buyers of the two Vaio customizable machines.
Customers opting for Sony's Fresh Start could miss out on software including Microsoft Works, bundled with a trial version of Microsoft Office; Sony's Vaio Creation Suite photo software with a Corel Paint Shop Pro trial version; the Click to Disc video editor; WinDVD; and an edition of QuickBooks Simple Start that tracks only 20 customers.
Software publishers often pay PC manufacturers to include trial versions on computers they ship. Bloatware, as it is also known, can reduce system performance, eat into available hard disk space, and tie up system resources. It can also introduce security vulnerabilities, and bloatware games can distract workers, hurting office productivity.
Dell last year offered the removal of bloatware from its Vostro line of PCs. Everex is also among a few vendors that offer PCs with the option to remove bloatware.
A cross-section of nano-crystalline bismuth antimony telluride grains, as viewed through a transmission electron microscope. Colors highlight the features of each grain of the semiconductor alloy in bulk form. A team of researchers from Boston College and MIT produced a major increase in thermoelectric efficiency after using nanotechnology to structure the material, which is commonly used in industry and research. Credit: Boston College, MIT and GMZ Inc.
Thermoelectric coolers and power generators were handed a 40-percent boost in performance recently by a nanotechnological reconstruction of a classic bulk material. The technique is suitable for mass production, according to its inventors at Boston College and the Massachusetts Institute of Technology (MIT). This makes it of use in both industrial and consumer cooling applications from semiconductors to nanoscale power generators.
Researchers at Boston College and MIT have used nanotechnology to achieve a major increase in thermoelectric efficiency, a milestone that paves the way for a new generation of products — from semiconductors and air conditioners to car exhaust systems and solar power technology — that run cleaner.
The team’s low-cost approach, details of which are published today in the online version of the journal Science, involves building tiny alloy nanostructures that can serve as micro-coolers and power generators. The researchers said that in addition to being inexpensive, their method will likely result in practical, near-term enhancements to make products consume less energy or capture energy that would otherwise be wasted.
The findings represent a key milestone in the quest to harness the thermoelectric effect, which has both enticed and frustrated scientists since its discovery in the early 19th century. The effect refers to certain materials that can convert heat into electricity and vice versa. But there has been a hitch in trying to exploit the effect: most materials that conduct electricity also conduct heat, so their temperature equalizes quickly. In order to improve efficiency, scientists have sought materials that will conduct electricity but not similarly conduct heat.
Using nanotechnology, the researchers at BC and MIT produced a big increase in the thermoelectric efficiency of bismuth antimony telluride — a semiconductor alloy that has been commonly used in commercial devices since the 1950s — in bulk form. Specifically, the team realized a 40 percent increase in the alloy’s figure of merit, a term scientists use to measure a material’s relative performance. The achievement marks the first such gain in a half-century using the cost-effective material that functions at room temperatures and up to 250 degrees Celsius. The success using the relatively inexpensive and environmentally friendly alloy means the discovery can quickly be applied to a range of uses, leading to higher cooling and power generation efficiency.
“By using nanotechnology, we have found a way to improve an old material by breaking it up and then rebuilding it in a composite of nanostructures in bulk form,” said Boston College physicist Zhifeng Ren, one of the leaders of the project. “This method is low cost and can be scaled for mass production. This represents an exciting opportunity to improve the performance of thermoelectric materials in a cost-effective manner.”
“These thermoelectric materials are already used in many applications, but this better material can have a bigger impact,” said Gang Chen, the Warren and Towneley Rohsenow Professor of Mechanical Engineering at MIT and another leader of the project.
At its core, thermoelectricity is the “hot and cool” issue of physics. Heating one end of a wire, for example, causes electrons to move to the cooler end, producing an electric current. In reverse, applying a current to the same wire will carry heat away from a hot section to a cool section. Phonons, a quantum mode of vibration, play a key role because they are the primary means by which heat conduction takes place in insulating solids.
Bismuth antimony telluride is a material commonly used in thermoelectric products, and the researchers crushed it into a nanoscopic dust and then reconstituted it in bulk form, albeit with nanoscale constituents. The grains and irregularities of the reconstituted alloy dramatically slowed the passage of phonons through the material, radically transforming the thermoelectric performance by blocking heat flow while allowing the electrical flow.
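The “figure of merit” the researchers cite is conventionally written ZT = S²σT/κ, where S is the Seebeck coefficient, σ the electrical conductivity, κ the thermal conductivity and T the absolute temperature. A toy calculation shows how blocking heat flow alone can yield a 40 percent gain; the sample values below are typical textbook numbers for bismuth telluride alloys, not the team's measurements.

```python
# Hedged illustration of the thermoelectric figure of merit ZT = S^2 * sigma * T / kappa.
# The sample values are typical textbook numbers for bismuth telluride alloys,
# not data from the Boston College / MIT study.
def figure_of_merit(seebeck, elec_conductivity, thermal_conductivity, temperature):
    """ZT: higher is better; ZT ~ 1 is the classic benchmark for Bi2Te3 alloys."""
    return seebeck**2 * elec_conductivity * temperature / thermal_conductivity

zt_bulk = figure_of_merit(2.0e-4, 1.0e5, 1.4, 300)  # classic bulk alloy at room temp
# Nanostructuring scatters phonons, lowering thermal conductivity while
# leaving the electrical properties nearly intact:
zt_nano = figure_of_merit(2.0e-4, 1.0e5, 1.0, 300)

print(f"bulk ZT ~ {zt_bulk:.2f}, nanostructured ZT ~ {zt_nano:.2f} "
      f"({(zt_nano / zt_bulk - 1) * 100:.0f}% gain)")
```

Because S and σ appear in the numerator and κ in the denominator, any technique that suppresses phonon heat transport without disturbing charge transport raises ZT directly, which is exactly the mechanism the paragraph above describes.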
In addition to Ren and six researchers at his BC lab, the international team involved MIT researchers, including Chen and Institute Professor Mildred S. Dresselhaus; research scientist Bed Poudel at GMZ Energy, Inc, a Newton, Mass.-based company formed by Ren, Chen, and CEO Mike Clary; as well as BC visiting Professor Junming Liu, a physicist from Nanjing University in China.
Thermoelectric materials have been used by NASA to generate power for far-away spacecraft, and by specialty automobile seat makers to keep drivers cool during the summer. The auto industry has been experimenting with ways to use thermoelectric materials to convert waste heat from car exhaust systems into electric current to help power vehicles.
Scientists believe that an ocean of water and ammonia may exist below the surface of Saturn's largest moon, Titan. The findings appear in the March 21 issue of the journal Science.
The evidence of an underground ocean on Titan comes from observations by the Cassini spacecraft, which has visited Titan 19 times over the past three years.
Scientists marked 50 identifiable locations on Titan from Cassini radar data, and observed that, over time, many had moved up to 19 miles from where they were predicted to be. This led to the conclusion that the moon's crust may be separated from the core by an ocean.
This graphic shows a cross-section of Titan with an ocean that could lie 62 miles below the moon's crust.
Astronomers' mental image of Titan, the solar system's second-largest moon, used to be that of a vast swimming pool. But maybe they should have imagined a water bed instead.
Last year, researchers reported that radar mapping of Titan by the Cassini spacecraft had found a peculiar shift in landmarks on the moon's surface of up to 19 miles (30 kilometers) between October 2004 and May 2007.
Now investigators say the best explanation is a moon-wide underground ocean that disconnects Titan's icy crust from its rocky interior.
"We think the structure is about 100 kilometers of ice sitting atop a global layer of water … maybe hundreds of kilometers thick," says Cassini scientist Ralph Lorenz of Johns Hopkins University Applied Physics Laboratory in Laurel, Md.
If confirmed, Titan would be the fourth moon in the solar system thought to contain such an internal water ocean, joining Jupiter's satellites Ganymede, Callisto and Europa. Researchers believe that heat from radioactivity in a moon's core or gravitational squeezing may melt a layer of frozen water.
On Titan, Ganymede and Callisto, the liquid would become sandwiched between two different forms of ice, one that floats on water and one that sinks. Astronomers believe that of the four bodies, Europa has a larger and hotter core that directly borders its ocean, which lies beneath a thin layer of ice.
A hidden water layer would add to Titan's impressive resume: Larger in diameter than both Earth's moon and the planet Mercury, Titan is the only satellite in the solar system with a true atmosphere—a dense, rotating fog of nitrogen supporting hydrocarbon clouds made of methane and ethane.
For decades researchers suspected that its frosty surface temperature of around –290 degrees Fahrenheit (–180 degrees Celsius) would cause hydrocarbons to pool on its surface in a vast ocean. But during Cassini's first flyby in October 2004, its radar instruments detected no surface-spanning ocean, only methane lakes near the moon's north pole.
The shift in Titan's geologic features is strange because the moon is locked in a synchronous orbit around Saturn, meaning it always presents the same face to the planet. "It's a little bit improbable that Titan would be rotating asynchronously," Lorenz says.
Writing in Science, he and his colleagues instead connect the geologic displacement to models in which Titan's atmosphere pushes against mountains on the surface.
The exact thickness of the crust is an important component of the group's model of Titan but is not known precisely. Based on the dimensions of the Menrva impact crater, they estimate a thickness of about 60 miles (100 kilometers).
That would make the crust thinner than those of Ganymede or Callisto, where the oceans are thought to lie below as much as 125 miles (200 kilometers) of rock and ice. For Titan's presumed ocean to remain liquid at such a distance from the hot core, the researchers argue that it must contain ammonia.
There may also be other explanations for the observed shifting. In an editorial accompanying the report, planetary scientists Christophe Sotin of the Jet Propulsion Laboratory at the California Institute of Technology (Caltech) and Gabriel Tobie of the University of Nantes in France observe that a periodic wobble in Titan's rotation or, less likely, a recent asteroid impact could also explain the finding.
The ocean interpretation is still the most plausible one, according to David Stevenson, professor of planetary science at Caltech. "This is a perfectly natural thing to do in a water–ice dominated world, provided there is enough heat," he says.
What is less clear, he adds, is the ocean's depth. The movement of the crust likely depends on additional, poorly understood factors, such as seasonal weather patterns and gravitational attraction between the crust and the core, he says.
Luckily, the group's model is testable. It predicts a quickening of Titan's rotation rate in the coming year or two followed by a slowdown—something that can be measured on succeeding Cassini flybys.
As always, the possibility of water leads to talk of potential life. Researchers have speculated that Titan may have long ago harbored life or its building blocks, catalyzed by sunlight reacting with atmospheric carbon and hydrogen.
Experts have considered Europa a better candidate, however, because of the presumed contact between ocean and core, which would provide a steady supply of heat energy.
Lorenz and his colleagues note that Titan's ocean might be stirred instead by cryovolcanism or warmer (but still cold) water welling up from below. The addition of water, Lorenz says, makes Titan "astrobiologically very appealing."
Stevenson, for one, says he still sees Europa as a better bet for life. He agrees that Titan is an attractive natural laboratory for the kind of chemistry that would lead to life, but when it comes to energy sources, sizzling rock "is much better than ice."
On March 6, Apple CEO Steve Jobs unveiled the iPhone software roadmap, released the iPhone Software Development Kit, and introduced the iPhone Enterprise Beta Program.
Apple has announced that iPhone v2.0 will debut in June with a number of new capabilities, making it a more viable enterprise choice for e-mail and unified communications. The upgraded iPhone will support Microsoft Exchange ActiveSync, which will make the platform much more palatable to corporate IT departments, since it will permit sophisticated password management, remote wipes of data, VPN capabilities and other enterprise-grade features.
PostPath also announced that it will support iPhone v2.0 in the latest upgrade of its Email Server, v3.1. PostPath is the only e-mail server, other than Exchange itself, that natively supports the Exchange and ActiveSync protocols.
Apple’s recent announcements will clearly create some uncertainty in the mobile messaging market. Apple’s support for third-party application developers to create iPhone-specific applications will likely have two impacts on the mobile messaging market:
* The iPhone’s flexible, “soft” interface will permit developers to provide a variety of new functions that other, less flexible handsets with their fixed buttons will not permit. This will create a new genre of software applications that will permit developers to be more creative in the types of applications that they develop for the iPhone.
* RIM and Microsoft are likely to follow with competitive offerings that attempt to emulate at least some of the functionality of the iPhone, resulting in a new wave of handsets driven by advances in user interface design.
Apple had projected that it would sell 10 million iPhones worldwide by late 2008, one year after the phone was introduced, which would give the company nearly 1% of the worldwide market of 1.15 billion mobile phones sold in 2007. As of the end of 2007, Apple’s market share for mobile phones was 0.6%, compared with market leader Nokia at 40.4%. Nearly 4 million iPhones had been sold through 2007, indicating that Apple’s forecast may be on target.
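A quick sanity check of the article's arithmetic, using only the figures quoted in the text (the percentage itself is computed, not quoted):

```python
# Rough check of the market-share arithmetic, using figures from the text above.
total_phones_2007 = 1.15e9   # mobile phones sold worldwide in 2007
apple_target = 10e6          # iPhones Apple hoped to sell by late 2008

target_share = apple_target / total_phones_2007
print(f"Target share: {target_share:.2%}")   # ~0.87%, i.e. "nearly 1%"
```

Against a market of that size, even 10 million units rounds to under 1 percent, which is why the article frames the target as a modest but symbolically important foothold.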