
Wednesday, November 26, 2008

Robotic Surgery Efficiency vs. Craze Marketing



Advantages of Robotic Surgery
In today's operating rooms, you'll find two or three surgeons, an anesthesiologist and several nurses, all needed for even the simplest of surgeries. Most surgeries require nearly a dozen people in the room. As with all automation, surgical robots will eventually eliminate the need for some personnel. Taking a glimpse into the future, surgery may require only one surgeon, an anesthesiologist and one or two nurses. In this nearly empty operating room, the doctor sits at a computer console, either in or outside the operating room, using the surgical robot to accomplish what it once took a crowd of people to perform.


The use of a computer console to perform operations from a distance opens up the idea of telesurgery, in which a doctor performs delicate surgery miles away from the patient. If the doctor doesn't have to stand over the patient to perform the surgery, and can control the robotic arms from a computer station just a few feet away from the patient, the next step would be performing surgery from locations that are even farther away. If it were possible to use the computer console to move the robotic arms in real time, then a doctor in California could operate on a patient in New York. A major obstacle in telesurgery has been latency -- the time delay between the moment the doctor moves his or her hands and the moment the robotic arms respond. Currently, the doctor must be in the room with the patient for robotic systems to react instantly to the doctor's hand movements.
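
To make the latency concern concrete, here is a rough back-of-the-envelope estimate in Python. The distance, the assumed signal speed in fiber, the 20 ms of processing overhead, and the roughly 150 ms comfort threshold are illustrative assumptions, not figures from the article.

# Rough telesurgery latency estimate; all numbers are illustrative assumptions.
SPEED_OF_LIGHT_KM_S = 300_000                      # vacuum, km/s
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3     # light travels at roughly 2/3 c in fiber

def round_trip_ms(distance_km, processing_ms=20.0):
    """Command out, video/force feedback back, plus processing overhead."""
    propagation_ms = (distance_km / FIBER_SPEED_KM_S) * 1000
    return 2 * propagation_ms + processing_ms

delay = round_trip_ms(4_500)                       # rough New York to California distance, km
print(f"Estimated round trip: {delay:.1f} ms")     # about 65 ms under these assumptions
# Remote manipulation is commonly said to feel sluggish once delays climb well
# past the 100-200 ms range, which is why latency is the obstacle cited above.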


Is Robotic Surgery Better? Or Just Marketing?

Why the U.S. healthcare system (if you want to call it a system, which it isn't) is a mess is obvious. It's mostly because of bureaucratic, inefficient, denial-fixated health insurers—chop out the waste, and escalating costs will come back into line. Considering this albatross as well as various other handicaps, it's amazing that the quality of our healthcare is as good as it is.

Myths, both. Administrative expenses are a relatively small driver of healthcare costs. And the quality of U.S. care not only fails in many respects to measure up to the care delivered in other countries but swings between extremes depending on where you live, the caregiver you see, and the hospital you use. Shannon Brownlee, a visiting scholar at the National Institutes of Health Clinical Center (and a former U.S. News colleague), and oncologist Ezekiel Emanuel, chairman of the center's bioethics department, busted those two myths and three other widespread misconceptions in a well-argued piece in Sunday's Washington Post that is well worth reading.

In their discussion of what is to blame for high and rising costs, they cite technology, among other things, meaning new drugs, new gizmos, new procedures. "Unfortunately," they write, "only a fraction of all that new stuff offers dramatically better outcomes."

That reminded me of a striking admission from Paul Levy, president of Beth Israel Deaconess Medical Center in Boston, who last Friday stated publicly on his blog that the hospital is buying a da Vinci surgical robot for marketing reasons. It costs well over $1 million, not counting its expensive annual care and feeding with new tools and software. All of the hospital's Boston competitors have the robot, and they are drawing referrals away from Beth Israel, which doesn't. "So there you have it," he wrote, his own sentiments clear. "It is an illustrative story of the healthcare system in which we operate."

I'm not sure this is a perfect allegorical example of a pricey technology purchased just because it is new and therefore represents a competitive advantage or, if a hospital doesn't have the technology, the loss of one. It is quite true that the da Vinci robot—which allows a surgeon sitting at a control station to manipulate tiny surgical tools and thus is no more of a "robot" than is a car being driven by a person—has not been shown, with the possible exception of a few specific procedures, to be clinically superior to conventional surgery.

But let's suppose an expensive gadget has been introduced that might be able to do one or more of the following: reduce deaths or complications (saving lives and money), get patients out of the hospital faster (saving money), and get patients back on their feet sooner (making them happier and reducing lost work time). Turning that hypothetical "might" into "yes, it can" or "no, it can't" requires that the gadget be put to use, doesn't it? How can a technology be evaluated without putting hands on, making comparisons with the usual ways, and so on?

Where I have a problem with the Beth Israel situation is that our system is very much driven by marketing. Referring physicians are clients, patients are customers, and every hospital competes for market share. That can mean feeling pressured to have the latest CT scanner or radio-beam therapy or surgical robot. If there aren't enough patients to keep a gadget in use enough to be profitable, get more by hyping the benefits (remember the temporary boom in whole-body scanning a few years ago) or luring patients from other hospitals. Does every hospital in Boston truly need a surgical robot system? Can't expensive technology be pooled?

RTR-4N™ Portable Digital X-Ray Inspection System


Overview
SAIC's RTR-4® X-ray imaging systems are fully portable and compact, designed to rapidly perform X-ray-based inspections in the field. The RTR-4N™ configuration consists of a portable X-ray source, an integrated digital imager, and a powerful notebook computer. It is used for both Explosive Ordnance Disposal and Non-Destructive Inspection applications.

RTR-4 systems are the only fully digital portable X-ray systems with ground-level imaging available to Explosive Ordnance Disposal (EOD) professionals, meeting the intended purpose of enhancing the safety margin for EOD technicians and innocent civilians. The RTR-4N imaging system, with its optional integrated wireless feature, is the world's most popular portable digital X-ray system, and provides the ability to quickly and efficiently search for weapons, drugs, and contraband in areas too difficult or time-consuming to search by hand.

Applications
The RTR-4N digital X-ray system is compact, rugged, and portable, which allows it to be useful in a number of scenarios. A few examples of RTR-4 applications include:

Improvised Explosive Device (IED) evaluation and disposal. Bomb technicians from a variety of law enforcement, military, and airport security organizations use RTR-4 systems to investigate suspicious packages for the presence of IEDs.
Unexploded Ordnance (UXO) disposal personnel employ the RTR-4N system with Large Area Imager to evaluate unexploded ordnance and determine fusing condition.
Mail and package evaluation in a mailroom scenario, as well as point-of-entry examination of personal belongings at special events.
Customs personnel utilize the RTR-4N system to x-ray and investigate private vehicles and other odd-shaped objects not appropriate for an x-ray baggage scanner.
Non-Destructive Evaluation/Testing/Inspection (NDE/NDT/NDI) for process control of component assembly, honeycomb aerospace structures and wood building structures.
Features
Portable Notebook Control Unit: The lightweight and powerful notebook computer possesses all the capabilities necessary to acquire and process images, enabling rapid threat assessment.
Powerful and Fast Processor: Notebook computer with Pentium® IV processor provides rapid processing of acquired data.
Large Display for Image Evaluation: The notebook computer has a large, high-resolution display that allows easy image evaluation and enhancement.
Image Analysis Software: Software includes full image analysis methods, such as smoothing, contrast stretch, subtracting, embossing, etc.
High-Capacity Hard Disk, Increased Memory, Built-in CD/RW and USB Ports: Some of the many notebook features that increase the effectiveness and productivity of the user.
Single Case for Transport and Storage: All components are conveniently stored in one hardened foam-lined case for easy, safe, efficient transport and storage.
Wireless Capability: A new integrated wireless option provides a digital and encrypted wireless connection from the Control Unit to the Imager and X-Ray Source with no add-on boxes. The operator, as well as other personnel and property, remain a safe distance from the potentially dangerous item being evaluated.
Upgrades: Several available options allow easy upgrade from the RTR-4/ARS system to the RTR-4N Notebook computer based system, creating an all-in-one case design.
Benefits
The RTR-4N system is a small, lightweight, and durable portable X-ray imaging system that produces better image quality, less noise, and more contrast than typical analog systems. The RTR-4N system's digital transmission means no image degradation. The RTR-4N system operates without film: images are instantly displayed and can be saved in an industry-standard format.
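
As a rough illustration of the kind of contrast enhancement the feature list above mentions, here is a generic linear contrast-stretch sketch in Python with NumPy. It is not SAIC's image-analysis software; the percentile cut-offs and the synthetic low-contrast frame are arbitrary choices for demonstration.

import numpy as np

def contrast_stretch(image, low_pct=2, high_pct=98):
    """Rescale intensities so the chosen percentiles span the full 0-255 range."""
    lo, hi = np.percentile(image, [low_pct, high_pct])
    stretched = (image.astype(float) - lo) / max(hi - lo, 1e-9)
    return np.clip(stretched * 255, 0, 255).astype(np.uint8)

# Fake low-contrast 8-bit frame standing in for an X-ray image.
rng = np.random.default_rng(0)
frame = rng.integers(100, 140, size=(256, 256), dtype=np.uint8)
enhanced = contrast_stretch(frame)
print(frame.min(), frame.max(), "->", enhanced.min(), enhanced.max())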

Data is immediately available after the X-ray image is acquired and can be processed and reviewed as the inspection is being completed.

This product is available for purchase on the General Services Administration Law Enforcement and Security Equipment Contract GS-07F-0210J. Visit GSA Advantage for more information on purchasing.


More News on Portable X-ray Devices
A new portable X-ray system, which can generate instant images and which allows for state-of-the-art dentistry to occur anywhere, will be making the rounds at area schools this season, according to the Augusta Regional Dental Clinic.
A $2,500 gift from the Augusta Health Care (AHC) Community Health Foundation helped the regional clinic purchase the portable X-ray system. Donations from the Staunton/Augusta Rotary Club and the American Dental Association also contributed to the program, which required about $20,000 to launch.

The system will be traveling to elementary schools in Staunton, Waynesboro, and Augusta County over the next few months. Any student can be screened for oral health issues and given a dental assessment including free X-rays, sealants, and recommendations for follow-up treatment, said Margaret Hersh, executive director of the Augusta Regional Free Clinic, in a press release sent out by the AHC Community Health Foundation.

Parents at the school were given prior notice of the services available on site and asked if they wanted their child to participate, she explained.

Tuesday, November 25, 2008

34 nm Flash Chip by Intel & Micron


Intel and Micron have jointly started producing NAND flash memory chips using tiny 34-nanometer technology, the companies said Monday.
About NAND flash
NAND flash memory is used to store songs, movies and more in iPods, iPhones and a range of other consumer electronics goods.

The chip is built with a manufacturing process that enables the companies to shrink chip components in order to get more memory in the same amount of space. The latest product can fit 4 GB of memory on a core and eight cores on a layer for a total of 64 GB of memory on a two-layer stack within a package.
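
As a quick check of the capacity arithmetic in the paragraph above (the figures are the ones quoted there), the numbers multiply out as follows in Python:

GB_PER_CORE = 4            # 4 GB per core, as stated above
CORES_PER_LAYER = 8        # eight cores per layer
LAYERS_PER_PACKAGE = 2     # two-layer stack per package
package_gb = GB_PER_CORE * CORES_PER_LAYER * LAYERS_PER_PACKAGE
print(package_gb, "GB per package")   # 64 GB per package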

The technology fits into a standard 48-lead thin small-outline package (TSOP), which is a type of surface-mount integrated-circuit package found in MP3 players, mobile phones, and other devices where space is at a premium.
"The tiny 34-nm, 32-GB chip enables our customers to easily increase their NAND storage capacity for a number of consumer and computing products," Brian Shirley, VP of Micron's memory group, said in a statement.

The latest chips are manufactured on 300-mm wafers and are smaller than the size of a thumbnail. The memory is targeted at makers of digital cameras, personal music players, and digital camcorders. In addition, the new technology can be used to increase the storage capacity of solid-state drives, the companies said.


The nanometer measurement describes the size of the smallest transistors and other parts that can be manufactured on a single chip. There are about three to six atoms in a nanometer, depending on the type of atom, and there are a billion nanometers in a meter.

Chip makers such as Taiwan Semiconductor Manufacturing (TSMC) and Intel currently mass-produce chips using technology as tiny as 40nm to 45nm. Generally, the more transistors on a chip and the closer they are together, the faster the chip can perform tasks.
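
For a sense of scale, here is a tiny illustrative calculation in Python. The 45 nm and 34 nm figures are the ones quoted in this post; the rest is simple unit conversion.

NANOMETERS_PER_METER = 1_000_000_000    # a billion nanometers in a meter

current_nm = 45                          # upper end of today's 40-45 nm processes
new_nm = 34                              # the new Intel/Micron process
shrink = (current_nm - new_nm) / current_nm
print(f"1 m = {NANOMETERS_PER_METER:,} nm")
print(f"Going from {current_nm} nm to {new_nm} nm shrinks the linear feature size by about {shrink:.0%}")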

Aside from performance, companies are working to make chips smaller and less expensive because people want ever-smaller, cheaper devices.

IM Flash Technologies, Intel and Micron's joint venture, is manufacturing 32 GB NAND chips the size of a thumbnail with its 34nm technology, and expects the chips to be used in small solid-state drives (SSDs) or flash memory cards aimed at products including digital cameras, digital camcorders and personal music players.

The 32 GB chips are multi-level cell (MLC) chips, which means they can store more data per cell than the single-level cell (SLC) variety of NAND flash.

Samsung Electronics, the world's largest NAND flash memory chip maker, is currently upgrading its chip factories to use 42nm technology and plans to start 30nm production next year.

The company showed off a multi-level cell 64 GB NAND flash memory chip made using 30nm manufacturing technology last year.

Tuesday, November 18, 2008

Large Hadron Collider Repairs to Cost $21 Million


The world’s most ambitious scientific project, the Large Hadron Collider, will take over six months and $21 million to repair, after a faulty electrical connection between the accelerator’s magnets on September 19 caused a helium leak that forced the particle accelerator to be shut down.

The incident took everyone by surprise, and following a detailed investigation, the conclusion was not as encouraging as the first assumptions made by CERN at the time. Not only were they unable to repair the particle accelerator by the end of the year, but the machine apparently won’t be functional until at least June.

The investigation results, made public in late October, revealed that a fault occurred in the electrical bus connection in the region between a dipole and a quadrupole, which resulted in mechanical damage and release of helium from the magnet cold mass into the tunnel.

However, there was no damage in neighboring interconnections, but the investigators did find contamination by soot-like dust which propagated in the beam pipes over some distance, as well as damage to the multilayer insulation blankets of the cryostats.

CERN spokesman James Gillies said in an interview with the Associated Press that the Collider is estimated to be restarted by the end of June or later. “If we can do it sooner, all well and good. But I think we can do it realistically in early summer.”

Despite these difficulties, the inauguration of the Large Hadron Collider took place in Geneva on October 21. “The younger generations target their ambitions on what they experience while growing up,” said Torsten Akesson, president of CERN Council. “Science and technology need flagships that stand and catch the eye, excite fantasy and fuel curiosity. The LHC is one such flagship.”

The European Center for Nuclear Research (CERN) is the world’s largest particle physics laboratory. The LHC accelerator is capable of producing beams seven times more energetic than any other similar machine, and the beams are expected to reach their maximum intensity (30 times greater) by 2010, when the machine will reach maximum design performance.

Physicists around the world will conduct several experiments, hoping to understand more about our Universe and the principles of physics. The particle accelerator will be used to recreate the conditions after the Big Bang.

more...

Repairing the Large Hadron Collider (LHC) near Geneva will cost almost £14m ($21m) and "realistically" take until at least next summer to start back up.

An electrical failure shut the £3.6bn ($6.6bn) machine down in September.

The European Organization for Nuclear Research (Cern) thought it would only be out of action until November but the damage was worse than expected.

It is hoped repairs will be completed by May or early June with the machine restarted at the end of June or later.

Cern spokesman James Gillies said: "If we can do it sooner, all well and good. But I think we can do it realistically (in) early summer."

Fundamental questions
The LHC was built to smash protons together at huge speeds, recreating conditions moments after the Big Bang, and scientists hope it will shed light on fundamental questions in physics.

The fault occurred just nine days after it was turned on with Cern blaming the shutdown on the failure of a single, badly soldered electrical connection in one of its super-cooled magnet sections.

The collider operates at temperatures colder than outer space for maximum efficiency and experts needed to gradually warm the damaged section to assess it.

"Now the sector is warm so they are able to go in and physically look at each of the interconnections," Mr Gillies told Associated Press.

The cost of the work will fall within Cern's existing budget.

Dr Lyn Evans, the Welsh-born project director, has called the collider "a discovery machine, the most sophisticated scientific instrument of our time."

Spiders in space aren't new.

Spiders have been spotted on the space station. These creatures are welcome guests, though one of them is missing at the moment.
Spiders in space aren't new. Two arachnids named Arabella and Anita flew to Skylab in 1973. Scientists were curious to see how the spiders would react in weightlessness – whether their webs would be different and how they would eat and sleep. Spiders in your house may send you scurrying for a shoe, but spiders in space are almost hypnotic, as they struggle to weave a symmetric web in zero gravity.
Experiments with insects are an easy way for teachers across the country to get students involved in hands-on science, which is the goal of sending these two spiders, and some butterflies, into orbit on the latest shuttle mission to the space station.
Spider One, on view in his box with clear windows, was busy spinning a very tangled web, but Spider Two appears to be AWOL. Flight Director Ginger Kerrick says they are looking for him. "The way it was explained to me is that he is a backup spider and he has his own contained space on board and he had moved out of that area – we think he came out of his bedroom and is in the living room of his house."
Kirk Shireman, deputy shuttle program manager, says while only one spider is visible, that doesn't mean the other is missing. "We don't believe he has escaped the payload – I am sure we will find him spinning a web somewhere in the next few days."
Astronaut Sandy Magnus, the newest member of the space station crew, was asked how the visible spider was doing.
Mission Control: "Is it weaving an organized looking web or is it something neat to see?"
Magnus replied: "The web is more or less three-dimensional and it looks like it is all over the inside of the box, more of a tangled, disorganized-looking web than a Charlotte's Web kind of web."
The spiders will return to Earth when the space shuttle Endeavour lands at the end of its 15-day mission later this month.

Monday, November 17, 2008

How Nanowires Made of Silicon Begin to Form

Researchers from IBM and Purdue University have discovered that tiny structures called silicon nanowires might be ideal for manufacturing in future computers and consumer electronics.

The researchers used an instrument called a transmission electron microscope to watch how nanowires made of silicon "nucleate," or begin to form, before growing into wires, said Eric Stach, an assistant professor of materials engineering at Purdue University.
The work is based at IBM's Thomas J. Watson Research Center in Yorktown Heights, N.Y., and at Purdue's Birck Nanotechnology Center in the university's Discovery Park. The research is funded by the National Science Foundation through the NSF's Electronic and Photonic Materials Program in the Division of Materials Research.
The nucleation process can be likened to the beginning of ice forming in a pool of water placed in a freezer. The liquid undergoes a "phase transition," changing from the liquid to the solid phase.
"What's unusual about this work is that we are looking at these things on an extremely small scale," Stach said. "The three major findings are that you can see that the nucleation process on this small scale is highly repeatable, that you can measure and predict when it's going to occur, and that those two facts together give you a sense that you could confidently design systems to manufacture these nanowires for electronics."
It was the first time researchers had made such precise measurements of the nucleation process in nanowires, he said.
Findings will be detailed in a research paper appearing Friday (Nov. 14) in the journal Science. The paper was written by Purdue doctoral student Bong Joong Kim, Stach and IBM materials scientists Frances Ross, Jerry Tersoff, Suneel Kodambaka and Mark Reuter from the physical sciences department at the Watson Research Center.
The silicon nanowires begin forming from tiny gold nanoparticles ranging in size from 10 to 40 nanometers, or billionths of a meter. By comparison, a human red blood cell is more than 100 times larger than the gold particles.
The gold particles are placed in the microscope's vacuum chamber and then exposed to a gas containing silicon, and the particles act as a catalyst to liberate silicon from the gas to form into solid wires. The particles are heated to about 600 degrees Celsius, or more than 1,100 degrees Fahrenheit, causing them to melt as they fill with silicon from the gas. With increasing exposure, the liquid gold eventually contains too much silicon and is said to become "supersaturated," and the silicon precipitates as a solid, causing the nanowire to begin forming.
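To make the supersaturation picture concrete, here is a deliberately simplified toy model in Python. All numbers are invented for illustration and are not taken from the study; the point is only that a fixed silicon uptake rate and solubility threshold give the same nucleation time every run, mirroring the repeatability described below.

# Toy model of nucleation in a gold droplet: silicon concentration rises at a
# steady uptake rate until it crosses a solubility threshold, at which point a
# single nucleation event fires. All parameters are made up for illustration.
def time_to_nucleation(uptake_rate, threshold, start=0.0):
    if uptake_rate <= 0:
        raise ValueError("uptake rate must be positive")
    return (threshold - start) / uptake_rate

for rate in (0.5, 1.0, 2.0):
    t = time_to_nucleation(uptake_rate=rate, threshold=10.0)
    print(f"uptake rate {rate}: nucleation at t = {t:.1f} (arbitrary units)")
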
"We found that there is a single nucleation event in each little droplet and that all of the nucleation events occur in a very controllable fashion," Stach said. "The implication is that if you are trying to create electronic devices based on these technologies, you could actually predict when things are going to start their crystal growth process. You can see that it's going to happen the same way every time, and thus that there is some potential for doing things in a repeatable fashion in electronics manufacturing."
Although the researchers studied silicon, the same findings could be applied to manufacturing nanowires made of other semiconducting materials. The electron microscope is the only instrument capable of observing the nanowire nucleation process, which would have to be a thousand times larger to be seen with a light microscope, Stach said.
Nanowires might enable engineers to solve a problem threatening to derail the electronics industry. New technologies will be needed for industry to keep pace with Moore's law, an unofficial rule stating that the number of transistors on a computer chip doubles about every 18 months, resulting in rapid progress in computers and telecommunications. Doubling the number of devices that can fit on a computer chip translates into a similar increase in performance. However, it is becoming increasingly difficult to continue shrinking electronic devices made of conventional silicon-based semiconductors.
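As a rough illustration of that rule of thumb, here is a short Python sketch of the 18-month doubling schedule; the starting transistor count is a hypothetical figure, not one from the article.

# Moore's law rule of thumb: transistor count doubles roughly every 18 months.
def transistors_after(years, start_count, months_per_doubling=18):
    doublings = (years * 12) / months_per_doubling
    return start_count * 2 ** doublings

start = 1e9   # hypothetical billion-transistor chip today
for years in (1.5, 3, 6):
    print(f"after {years} years: ~{transistors_after(years, start):.2e} transistors")
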
"In something like five to, at most, 10 years, silicon transistor dimensions will have been scaled to their limit," Stach said.
Transistors made of nanowires represent one potential way to continue the tradition of Moore's law.
"Nanowires of silicon and things like gallium arsenide, gallium nitride or indium arsenide, or other types of exotic semiconductors, are being investigated as a step toward continuing to scale electronics down," Stach said. "If you want to manufacture devices made of nanowires, make them the same way every time on a 12-inch wafer, then you need to understand the basic physics of how to start their growth, the kinetics of their continued growth, how to quantify that, how to understand it. We are looking at all steps in nucleation."
One challenge to using nanowires in electronics will be replacing gold as a catalyst with other metals that are better suited for the electronics industry, Stach said.
The gold particles are created inside the microscope chamber, but future research may use gold nanoparticles manufactured to more uniform standards using a different technology.
The research was conducted using an IBM microscope. The researchers also are extending the observations using a transmission electron microscope at the Birck Nanotechnology Center to look at smaller nanoparticles.

New Action May Boost Alternatives to Coal Fuel

An environmental review board has shot down the EPA's approval of a new coal plant, stating that the Environmental Protection Agency needs to come up with nationwide standards for dealing with carbon dioxide. The decision will mean lengthier reviews and stricter rules, making investment in expensive coal plants substantially riskier.

As a result, more of that money is likely to go into alternative energy, like solar or wind. The Environmental Appeals Board's decision to send the coal plans back to the EPA with instructions to come up with standards was not exactly a legal victory, but the result is practically the same. Basically, the agency's regional office has to at least consider whether to regulate carbon dioxide emissions before it gives a green light to build the plant, located in Utah.

The decision — which responded to a Sierra Club petition to review an E.P.A. permit granted to a coal plant in Utah — does not require the E.P.A. to limit carbon dioxide emissions from power plants, something which environmentalists have long sought.

Rather, it requires the agency’s regional office to at least consider whether to regulate carbon dioxide emissions, before the agency gives a green light to build the Utah plant. On a broader scale, it will delay the building of coal-fired power plants across the country, long enough for the Obama administration to determine its policy on coal, according to David Bookbinder, chief climate counsel for the Sierra Club.

“They’re sending this permit — and effectively sending every other permit — back to square one,” he said, adding, “It’s minimum a one to two year delay for every proposed coal-fired power plant in the United States.”


The decision references the landmark Massachusetts v. E.P.A. decision last year that declared carbon dioxide a pollutant under the Clean Air Act. That ruling, however, has not yet prompted the E.P.A. to act to regulate it.

It is the latest setback for coal plants, which emit far more carbon dioxide than natural gas or other power plants. Last year Kansas state regulators denied a permit to a coal plant on the grounds of its carbon dioxide emissions.

“Although a new administration could always have reversed course, this makes it easier by providing the first prod,” said Jody Freeman, director of the environmental law program at Harvard Law School. “And it’s a heads-up to the coal industry that stationary-source regulation of CO2 is coming.”

The coal industry put its best face on the decision. The ruling “merely says what the court has said — that the E.P.A. has the authority to regulate greenhouse gases under the Clean Air Act,” said Carol Raulston, a spokeswoman for the National Mining Association, an industry group.

However, she said, before rulemaking occurs, the E.P.A. has to make an “endangerment” finding, which has not yet been done. An “endangerment” finding would involve the E.P.A. declaring that carbon dioxide is a danger to public welfare, and would lead to regulation.

“We still believe, as do many in Congress, that the Clean Air Act is not very well structured to regulate greenhouse gases, and that Congress ought to address this through legislation,” added Ms. Raulston.

Ms. Freeman said that this week’s decision was part of a larger debate going forward “over whether and how the Clean Air Act might be used to regulate greenhouse gases while we wait for new climate legislation.

“E.P.A. has the authority to impose limits on CO2 coming from sources like power plants through the normal permit process,” she continued. “And we may see this happen in the new administration.”

Friday, November 7, 2008

Computers at the headquarters of the Barack Obama and John McCain campaigns were hacked

Obama, McCain campaigns' computers hacked for policy data
A source with knowledge of the investigation said the computers were hacked in mid-summer by either a foreign government or organization.
Another source, a law enforcement official familiar with the investigation, says federal investigators approached both campaigns with information the U.S. government had about the hacking, and the campaigns then hired private companies to mitigate the problem.
U.S. authorities, according to one of the sources, believe they know who the foreign entity responsible for the hacking is, but refused to identify it in any way, including what country.
The source, confirming the attacks that were first reported by Newsweek, said the sophisticated intrusions appeared aimed at gaining information about the evolution of policy positions in order to gain leverage in future dealings with whomever was elected.
The FBI is investigating, one of the sources confirmed to CNN. The FBI and Secret Service refused comment on the incidents.
The sources refused to speak on the record due to the ongoing investigation and also because it is a sensitive matter involving presidential politics.
As described by a Newsweek reporter with special access while working on a post-campaign special, workers in Obama's headquarters first detected what they thought was a computer virus that was trying to obtain users' personal information.
The next day, agents from the FBI and Secret Service came to the office and said, "You have a problem way bigger than what you understand ... you have been compromised, and a serious amount of files have been loaded off your system."


Some computers are too important to be networked

There is a common defensive computing thread in two recent stories.
In the first story, Newsweek reports that both presidential candidates had their campaign computers hacked from afar. As they put it:
The computer systems of both the Obama and McCain campaigns were victims of a sophisticated cyberattack by an unknown "foreign entity," prompting a federal investigation ... Both the FBI and the Secret Service came to the campaign with an ominous warning: "You have a problem way bigger than what you understand," an agent told Obama's team. "You have been compromised, and a serious amount of files have been loaded off your system." ... Officials at the FBI and the White House told the Obama campaign that they believed a foreign entity or organization sought to gather information ...
The second story involves a former Intel employee who allegedly stole trade secrets. As CNET's Stephanie Condon writes, the employee resigned, yet continued on the Intel payroll for a few weeks (perhaps working off vacation time). During this transition period, he started working for Intel rival AMD, yet he remained in possession of his Intel laptop and still had access to Intel's computer network. The FBI later found him in possession of "top secret" Intel files worth more than $1 billion in research and development costs.
The lesson is clear. If you have really valuable or sensitive files, don't make them remotely accessible. Cut the wire. Some files should never be available off-site.
If this means buying a new computer just to hold really sensitive files, it's money well spent.
A couple years ago, I heard someone from the hacker group 2600 give out this same advice on their radio show, Off The Hook. It made sense back then and makes even more sense now.
Windows passwords are easily hacked. Instead of relying on a Windows password for local physical security, set both a power-on password and, if the computer supports it, a hard disk password. Whole disk encryption is another option, but one that involves much more work to implement.
If you put sensitive files on a laptop computer, then consider storing it in a safe when not in use. If you have a small safe, get a small laptop or a Netbook.
Laptops need more than just cutting the Ethernet wire. To begin with, turn off the Wi-Fi radio (there is probably a switch or a function key for this). If the laptop has Bluetooth, physically turn that off too.
Then, turn off the networking features in the operating system.
On Windows, turn off file sharing for every network adapter and turn off every network protocol. Then, disable all the network adapters.
Finally, disable the underlying Windows services that handle networking. On Windows XP this would be: Wireless Zero Configuration, Server, Computer Browser, Workstation and SSDP Discovery. Then, since the machine will be off-line forever, there are quite a few other Windows XP services that won't be needed and can be disabled: Automatic Updates, Distributed Link Tracking Client, Distributed Transaction Coordinator, Net Logon, NetMeeting Remote Desktop Sharing, Network DDE, Network DDE DSDM, Network Location Awareness (NLA), Network Provisioning Service, Remote Desktop Help Session Manager, Remote Registry and WebClient. The laptop I'm writing this on also has an Infrared Monitor service. I don't know what it's for, but I keep it disabled.
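If you would rather script those steps than click through the Services console, here is a minimal sketch that shells out to the standard Windows "sc config" command. It assumes Windows XP and an administrator prompt; the short service names in the mapping are the usual XP ones but may differ on some installations, so verify them with "sc query" first and extend the mapping with the rest of the services listed above as needed.

# Sketch: disable the networking-related XP services named above by calling
# "sc config <service> start= disabled". The space after "start=" is required
# by sc's syntax. Run from an administrator command prompt.
import subprocess

# Display name (as listed above) -> usual Windows XP short service name.
SERVICES_TO_DISABLE = {
    "Wireless Zero Configuration": "WZCSVC",
    "Server": "LanmanServer",
    "Computer Browser": "Browser",
    "Workstation": "LanmanWorkstation",
    "SSDP Discovery": "SSDPSRV",
    "Automatic Updates": "wuauserv",
    "Remote Registry": "RemoteRegistry",
    "WebClient": "WebClient",
}

def disable_service(short_name):
    result = subprocess.run(["sc", "config", short_name, "start=", "disabled"],
                            capture_output=True, text=True)
    return result.returncode == 0

for display, short in SERVICES_TO_DISABLE.items():
    ok = disable_service(short)
    print(f"{display} ({short}): {'disabled' if ok else 'FAILED'}")
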
All told, this isn't much work and doesn't involve much expense. Yet, it's great insurance and can leave your sensitive files better defended than those at Intel and each presidential campaign.

more...

Foreign hackers infiltrated the networks of John McCain and Barack Obama during the US presidential campaign.

CNN and Newsweek cited sources within both camps as reporting that hackers from an undisclosed foreign location targeted each network over the summer in an attempt to acquire information.
The report did not specify which group or nation was responsible for the attacks, but the target appears to be documents outlining the candidates' policy proposals.
The information would reportedly have been used in future policy negotiations with the winning candidate.
Following the attacks both camps reportedly hired outside consultants to seal up any security flaws, and the FBI and Secret Service are both said to be investigating the incidents.
Hacking for political reasons has emerged in recent years as a companion to traditional espionage. In 2007, Chinese government officials were accused of hacking government sites in the US, France, Germany and the UK.
Russian nationalists have also been thought to use cyber-attacks to supplement their political efforts. In the midst of conflicts with Estonia and Georgia, Russian hackers were said to be masterminding attacks on government and social infrastructure sites.
