Friday, 21 December 2012

Virtual reality and robotics in neurosurgery: Promise and challenges

Dec. 20, 2012 — Robotic technologies have the potential to help neurosurgeons perform precise, technically demanding operations, while virtual reality environments help them navigate through the brain, according to researchers.

The topic is the focus of a special supplement to Neurosurgery (http://www.neurosurgery-online.com/), official journal of the Congress of Neurological Surgeons. The journal is published by Lippincott Williams & Wilkins, a part of Wolters Kluwer Health.

"Virtual Reality (VR) and robotics are two rapidly expanding fields with growing application within neurosurgery," according to an introductory article by Garnette Sutherland, MD. The 22 reviews, commentaries, and original studies in the special supplement provide an up-to-the-minute overview of "the benefits and ongoing challenges related to the latest incarnations of these technologies."

Robotics and VR in Neurosurgery -- What's Here and What's Next

Virtual reality and robotic technologies present exciting opportunities for training, planning, and actual performance of neurosurgical procedures. Robotic tools under development or already in use can provide mechanical assistance, such as steadying the surgeon's hand or "scaling" hand movements. "Current robots work in tandem with human operators to combine the advantages of human thinking with the capabilities of robots to provide data, to optimize localization on a moving subject, to operate in difficult positions, or to perform without muscle fatigue," writes Dr. Sutherland.
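The supplement describes these mechanical aids at a high level rather than as algorithms. Purely as an illustration of what hand-steadying and "scaling" can mean in code, the Python sketch below applies a motion-scaling gain plus a simple low-pass filter to a stream of hand displacements; the gain and smoothing constant are invented for the example, not taken from any system in the supplement.

```python
# Illustrative sketch of hand-motion "scaling" with simple tremor
# suppression, of the kind used in telesurgery controllers. The gain
# and filter constant below are assumptions, not values from the article.

SCALE = 0.2   # 10 mm of hand motion -> 2 mm of tool motion (assumed)
ALPHA = 0.1   # low-pass smoothing factor (assumed)

def make_motion_filter(scale=SCALE, alpha=ALPHA):
    """Return a function mapping raw hand displacements (mm) to
    scaled, smoothed tool displacements (mm)."""
    state = {"smoothed": 0.0}

    def step(hand_delta_mm):
        # An exponential moving average damps high-frequency tremor.
        state["smoothed"] = alpha * hand_delta_mm + (1 - alpha) * state["smoothed"]
        return scale * state["smoothed"]

    return step

if __name__ == "__main__":
    f = make_motion_filter()
    # A roughly 10 mm hand movement with small tremor jitter on top.
    for raw in [10.0, 10.4, 9.7, 10.2]:
        print(f"hand {raw:5.1f} mm -> tool {f(raw):5.2f} mm")
```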

Virtual reality technologies play an important role, providing "spatial orientation" between robotic instruments and the surgeon. Virtual reality environments "recreate the surgical space" in which the surgeon works, providing 3-D visual images as well as haptic (sense of touch) feedback. The ability to plan, rehearse, and "play back" operations in the brain could be particularly valuable for training neurosurgery residents -- especially since recent work hour changes have limited opportunities for operating room experience.

The special supplement to Neurosurgery presents authoritative updates by experts working in the field of surgical robotics and VR technology, drawn from a wide range of disciplines. Topics include robotic technologies already in use, such as the "neuroArm" image-guided neurosurgical robot; reviews of progress in areas such as 3-D neurosurgical planning and virtual endoscopy; and new thinking on the best approaches to development, evaluation, and clinical uses of VR and robotic technologies.

But numerous and daunting technical challenges remain to be met before robotic and VR technologies become widely used in clinical neurosurgery. For example, VR environments require extremely fast processing times to provide the surgeon with continuously updated sensory information -- equal to or faster than the brain's ability to perceive it.
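The supplement does not attach concrete figures to this requirement here; as a rough illustration only, haptic feedback loops in surgical simulators are commonly run near 1 kHz so that force updates stay ahead of perception. The Python sketch below, with an assumed target rate and a placeholder workload, shows the kind of fixed-timestep loop such systems use, counting any missed deadlines.

```python
# Minimal sketch of a fixed-rate update loop with deadline monitoring.
# The 1 kHz target is a common haptics figure, not one from the article.
import time

TARGET_HZ = 1000
PERIOD = 1.0 / TARGET_HZ

def run_loop(n_steps=2000, update=lambda: None):
    missed = 0
    next_deadline = time.perf_counter() + PERIOD
    for _ in range(n_steps):
        update()                      # recompute forces / images here
        now = time.perf_counter()
        if now > next_deadline:
            missed += 1               # sensory data would arrive late
        else:
            time.sleep(next_deadline - now)
        next_deadline += PERIOD
    return missed

if __name__ == "__main__":
    print("missed deadlines:", run_loop())
```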

Economic challenges include the high costs of developing and implementing VR and robotic technologies, especially in terms of showing that the costs are justified by benefits to the patient. Continued progress in miniaturization will play an important role both in overcoming the technical challenges and in making the technology cost-effective.

The editors of Neurosurgery hope their supplement will stimulate interest and further progress in the development and practical implementation of VR and robotic technologies for neurosurgery. Dr. Sutherland adds, "Collaboration between the fields of medicine, engineering, science, and technology will allow innovations in these fields to converge in new products that will benefit patients with neurosurgical disease."

Story Source:

The above story is reprinted from materials provided by Wolters Kluwer Health, via Newswise.


Wednesday, 19 December 2012

The jacket that talks to Facebook in an emergency

Sep. 17, 2012 — In an emergency situation, rescue crews cannot be expected to do their jobs while fumbling with a tiny mobile phone to read and send messages. Instead, scientists have created a prototype jacket that communicates with Facebook.

Collision. Fire. Accidents. Chaos. In a rescue operation, it's no use trying to communicate via a small mobile phone display. But a jacket -- now you're talking!

The EU Societies project is all about technology and communication in extreme situations, such as rescue operations following major accidents. ICT researchers at SINTEF have been working on this topic for a long time, and the idea of developing a physical user interface for social media came from seeing how limited a normal mobile phone is as an aid during a chaotic emergency situation.

Screen is a minus in an emergency

Most of our focus when we use computers is on the screen. We communicate via a display. But in an emergency, we cannot expect rescue crews to do their jobs while fumbling with a tiny mobile phone to read and send messages. It doesn't just require full concentration -- it also requires two hands.

Similarly, a firefighter called out to an emergency doesn't have time to put on anything special -- he or she just grabs a jacket and helmet, and runs. "Crews therefore need devices with a much simpler user interface. That was the basic idea behind making the jackets," says researcher Babak Farshchian of SINTEF ICT.

Bluetooth

A group of students at the Norwegian University of Science and Technology's (NTNU) Department of Computer and Information Science (IDI) decided to create a prototype jacket that could communicate with Facebook, and they have been working on the assignment for the last six months. They chose the Arduino platform to build the physical user interface with social media. Arduino is a popular system for developing physical prototypes that integrate with ICT. The platform that supports the jacket communicates with an ordinary Android mobile phone via Bluetooth, which means the user does not get tangled in cables.
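The article does not publish the jacket's message protocol. Assuming, hypothetically, that the jacket's Arduino side appears as a standard Bluetooth serial (RFCOMM) device and accepts newline-terminated text, the sending side could be as simple as the following Python sketch using the pyserial library; the device path, baud rate, and message format are all assumptions for illustration.

```python
# Hypothetical sender: forward an incoming group message to the jacket
# over a Bluetooth serial (RFCOMM) link. The device path, baud rate, and
# one-line text protocol are assumptions -- the project's actual
# protocol is not described in the article.
import serial  # pyserial

def send_to_jacket(text, port="/dev/rfcomm0", baud=9600):
    """Send one newline-terminated message for the sleeve display."""
    with serial.Serial(port, baud, timeout=2) as link:
        link.write(text.encode("utf-8") + b"\n")

if __name__ == "__main__":
    send_to_jacket("Unit 3: move to north entrance")
```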

Keyboard in the sleeve

The students bought a simple lined jacket from XXL, a popular sports retailer, and inserted the cables and sensors between the inner and outer layers. A battery-operated circuit in the pocket controls the sensors and microphone, and all the cables and electronics are concealed from the user. Instead of a telephone display, the jacket sleeve has a display sewn into it, showing a line of rolling text. A small vibrator inserted in the collar alerts the user to an incoming message, which he or she can read by lifting an arm and looking at the display.

Rescue work is often carried out in large groups, with professionals from different units and organizations who need to communicate and coordinate their actions efficiently during an operation. "By using social media technology, we can enable these groups to communicate, and this jacket, with a similar, customized user interface, makes it easy and practical to use more advanced ICT in demanding rescue work," says Farshchian.
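One simple way to produce the "line of rolling text" described above is a marquee buffer that steps a fixed-width window across the message; the sketch below assumes a hypothetical 16-character sleeve display, a detail not given in the article.

```python
# Sketch of "rolling text" for a narrow sleeve display: yield successive
# fixed-width windows of the message. Width and padding are assumptions.

def marquee(message, width=16):
    padded = " " * width + message + " " * width
    for i in range(len(padded) - width + 1):
        yield padded[i:i + width]

if __name__ == "__main__":
    for frame in marquee("Fire crew: regroup at sector B"):
        print(frame)   # on the jacket this would be written to the display
```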

Better adapted to needs

Easier access to social media is an idea that could be of interest to those with sight and hearing impairments, since these groups have problems using a screen. Being able to dictate and hear messages would not only be more user friendly, but also better adapted to their needs.

Story Source:

The above story is reprinted from materials provided by SINTEF, via AlphaGalileo.


'Liquid that thinks:' Swarm of ping-pong-ball-sized robots created

Dec. 14, 2012 — University of Colorado Boulder Assistant Professor Nikolaus Correll likes to think in multiples. If one robot can accomplish a singular task, think how much more could be accomplished if you had hundreds of them.

Correll and his computer science research team, including research associate Dustin Reishus and professional research assistant Nick Farrow, have developed a basic robotic building block, which he hopes to reproduce in large quantities to develop increasingly complex systems.

Recently the team created a swarm of 20 robots, each the size of a ping-pong ball, which they call "droplets." When the droplets swarm together, Correll said, they form a "liquid that thinks."

To accelerate the pace of innovation, he has created a lab where students can explore and develop new applications of robotics with basic, inexpensive tools.

Similar to the fictional "nanomorphs" depicted in the "Terminator" films, large swarms of intelligent robotic devices could be used for a range of tasks. Swarms of robots could be unleashed to contain an oil spill or to self-assemble into a piece of hardware after being launched separately into space, Correll said.

Correll plans to use the droplets to demonstrate self-assembly and swarm-intelligent behaviors such as pattern recognition, sensor-based motion and adaptive shape change. These behaviors could then be transferred to large swarms for water- or air-based tasks.
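The release names these behaviors without describing the droplets' actual controllers. A classic building block for swarm aggregation, shown in the generic Python sketch below, is a local rule in which each robot moves a small step toward the centroid of the neighbors it can sense; the sensing radius and step size here are invented for illustration, not taken from the CU-Boulder system.

```python
# Generic swarm-aggregation rule: each robot steps toward the centroid
# of neighbors within its sensing radius. Parameters are illustrative.
import random

RADIUS = 0.3   # sensing range (assumed units)
STEP = 0.05    # fraction of the distance to the centroid moved per update

def update(positions):
    new = []
    for i, (x, y) in enumerate(positions):
        nbrs = [(px, py) for j, (px, py) in enumerate(positions)
                if j != i and (px - x) ** 2 + (py - y) ** 2 <= RADIUS ** 2]
        if nbrs:
            cx = sum(p[0] for p in nbrs) / len(nbrs)
            cy = sum(p[1] for p in nbrs) / len(nbrs)
            x, y = x + STEP * (cx - x), y + STEP * (cy - y)
        new.append((x, y))
    return new

if __name__ == "__main__":
    pts = [(random.random(), random.random()) for _ in range(20)]
    for _ in range(200):
        pts = update(pts)
    # After many updates the swarm clusters; its spread shrinks.
    print("final spread:", max(p[0] for p in pts) - min(p[0] for p in pts))
```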

Correll hopes to create a design methodology for aggregating the droplets into more complex behaviors such as assembling parts of a large space telescope or an aircraft.

In the fall, Correll received the National Science Foundation's Faculty Early Career Development award known as "CAREER." In addition, he has received support from NSF's Early Concept Grants for Exploratory Research program, as well as NASA and the U.S. Air Force.

He also is continuing work on robotic garden technology he developed at the Massachusetts Institute of Technology in 2009. Correll has been working with Joseph Tanner in CU-Boulder's aerospace engineering sciences department to further develop the technology, involving autonomous sensors and robots that can tend gardens, in conjunction with a model of a long-term space habitat being built by students.

Correll says there is virtually no limit to what might be created through distributed intelligence systems.

"Every living organism is made from a swarm of collaborating cells," he said. "Perhaps some day, our swarms will colonize space where they will assemble habitats and lush gardens for future space explorers."

Story Source:

The above story is reprinted from materials provided by University of Colorado at Boulder.


Biology-friendly robot programming language: Training your robot the PaR-PaR way

Oct. 23, 2012 — Teaching a robot a new trick is a challenge. You can't reward it with treats and it doesn't respond to approval or disappointment in your voice. For researchers in the biological sciences, however, the future training of robots has been made much easier thanks to a new program called "PaR-PaR."

Nathan Hillson, a biochemist at the U.S. Department of Energy (DOE)'s Joint BioEnergy Institute (JBEI), led the development of PaR-PaR, which stands for Programming a Robot. PaR-PaR is a simple high-level, biology-friendly, robot-programming language that allows researchers to make better use of liquid-handling robots and thereby make possible experiments that otherwise might not have been considered.

"The syntax and compiler for PaR-PaR are based on computer science principles and a deep understanding of biological workflows," Hillson says. "After minimal training, a biologist should be able to independently write complicated protocols for a robot within an hour. With the adoption of PaR-PaR as a standard cross-platform language, hand-written or software-generated robotic protocols could easily be shared across laboratories."

Hillson, who directs JBEI's Synthetic Biology program and also holds an appointment with the Lawrence Berkeley National Laboratory (Berkeley Lab)'s Physical Biosciences Division, is the corresponding author of a paper describing PaR-PaR that appears in the American Chemical Society journal Synthetic Biology. The paper is titled "PaR-PaR Laboratory Automation Platform." Co-authors are Gregory Linshiz, Nina Stawski, Sean Poust, Changhao Bi and Jay Keasling.

Using robots to perform labor-intensive, multi-step biological tasks, such as the construction and cloning of DNA molecules, can increase research productivity and lower costs by reducing experimental error rates and providing more reliable and reproducible experimental data. To date, however, automation companies have targeted the highly repetitive industrial laboratory operations market while largely ignoring the development of flexible, easy-to-use programming tools for dynamic, non-repetitive research environments. As a consequence, researchers in the biological sciences have had to depend upon professional programmers or vendor-supplied graphical user interfaces with limited capabilities.

"Our vision was for a single protocol to be executable across different robotic platforms in different laboratories, just as a single computer software program is executable across multiple brands of computer hardware," Hillson says. "We also wanted robotics to be accessible to biologists, not just to robot specialist programmers, and for a laboratory that has a particular brand of robot to benefit from a wide variety of software and protocols."

Hillson, who earlier led the development of a unique software program called "j5" for identifying cost-effective DNA construction strategies, says that beyond enabling biologists to manually instruct robots in a time-effective manner, PaR-PaR can also amplify the utility of biological design automation software tools such as j5.

"Before PaR-PaR, j5 only outputted protocols for one single robot platform," Hillson says. "After PaR-PaR, the same protocol can now be executed on many different robot platforms."

The PaR-PaR language uses an object-oriented approach that represents physical laboratory objects -- including reagents, plastic consumables and laboratory devices -- as virtual objects. Each object has associated properties, such as a name and a physical location, and multiple objects can be grouped together to create a new composite object with its own properties.

Actions can be performed on objects and sequences of actions can be consolidated into procedures that in turn are issued as PaR-PaR commands. Collections of procedural definitions can be imported into PaR-PaR via external modules.
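PaR-PaR's real syntax is documented on the project's web server; purely to make the object-oriented structure described above concrete, the Python sketch below models named objects with locations, composite grouping, and a procedure built from a sequence of actions. The class names, the transfer action, and the deck positions are invented for illustration and are not PaR-PaR code.

```python
# Illustration (in Python, not PaR-PaR syntax) of the object model the
# article describes: lab objects with names and locations, composites,
# and procedures built from sequences of actions. All names are invented.

class LabObject:
    def __init__(self, name, location):
        self.name, self.location = name, location

class Composite(LabObject):
    """Group several objects into a new object with its own properties."""
    def __init__(self, name, location, parts):
        super().__init__(name, location)
        self.parts = parts

def transfer(volume_ul, source, dest):
    """One 'action' performed on objects, rendered as a command string."""
    return f"TRANSFER {volume_ul} uL: {source.name} -> {dest.name}"

def mix_protocol(reagents, plate):
    """A 'procedure': a reusable sequence of actions issued as commands."""
    return [transfer(50, r, plate) for r in reagents]

if __name__ == "__main__":
    buf = LabObject("buffer", "deck A1")
    enzyme = LabObject("enzyme", "deck A2")
    plate = LabObject("assay_plate", "deck B1")
    for cmd in mix_protocol([buf, enzyme], plate):
        print(cmd)
```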

"A researcher, perhaps in conjunction with biological design automation software such as j5, composes a PaR-PaR script that is parsed and sent to a database," Hillson says. "The operational flow of the commands are optimized and adapted to the configuration of a specific robotic platform. Commands are then translated from the PaR-PaR meta-language into the robotic scripting language for execution."

Hillson and his colleagues have developed PaR-PaR as open-source software freely available through its web interface on the public PaR-PaR webserver http://parpar.jbei.org.

"Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work," Hillson says. "PaR-PaR accomplishes all of these objectives and is intended to benefit a broad segment of the biological research community, including non-profits, government agencies and commercial companies."

This work was primarily supported by the DOE Office of Science.

Story Source:

The above story is reprinted from materials provided by DOE/Lawrence Berkeley National Laboratory.

Journal Reference:

Gregory Linshiz, Nina Stawski, Sean Poust, Changhao Bi, Jay D. Keasling, Nathan J. Hillson. PaR-PaR Laboratory Automation Platform. ACS Synthetic Biology, 2012. DOI: 10.1021/sb300075t


Bioinspired robot meets fish: Robotic fish research swims into new ethorobotics waters

Nov. 20, 2012 — New research is illuminating the emerging field of ethorobotics -- the study of bioinspired robots interacting with animal counterparts. Researchers studied how real-time feedback attracted or repelled live zebrafish, and found the fish were more attracted to robots whose tail motions mimicked the live fish. The researchers hope that robots may eventually steer groups of animals on land or in water away from danger.

Researchers at the Polytechnic Institute of New York University (NYU-Poly) have published findings that further illuminate the emerging field of ethorobotics -- the study of bioinspired robots interacting with live animal counterparts.

Maurizio Porfiri, associate professor of mechanical and aerospace engineering at NYU-Poly, doctoral candidates Vladislav Kopman and Jeffrey Laut and research scholar Giovanni Polverino studied the role of real-time feedback in attracting or repelling live zebrafish in the presence of a robotic fish.

Their findings, published in the Journal of the Royal Society Interface, show that zebrafish demonstrate increased attraction to robots that are able to modulate their tail motions in accordance with the live fishes' behavior.

The researchers deployed image-based tracking software to analyze the movement of the live zebrafish and provide real-time feedback to the robot. Porfiri and his colleagues found that zebrafish were most attracted to the robotic member when its tail beating motion replicated the behavior of "informed fish" attempting to lead "naive fish." When the robotic fish increased its tail beat frequency as a live fish approached, the zebrafish were likeliest to spend time near the robot.
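The paper's closed-loop controller is more sophisticated than this, but the core idea (track the live fish, then modulate the robot's tail-beat frequency with the fish's proximity) can be sketched as a simple mapping from tracked distance to commanded frequency. All distances and frequencies below are invented for illustration and are not the study's parameters.

```python
# Toy closed-loop rule in the spirit of the study: raise the robot's
# tail-beat frequency as a tracked fish approaches. Numbers invented.

BASE_HZ = 1.0      # tail-beat frequency when no fish is near (assumed)
MAX_HZ = 3.0       # frequency at closest approach (assumed)
RANGE_CM = 30.0    # distance at which modulation begins (assumed)

def tail_beat_hz(distance_cm):
    """Map a tracked fish's distance to a commanded tail-beat frequency."""
    if distance_cm >= RANGE_CM:
        return BASE_HZ
    closeness = 1.0 - distance_cm / RANGE_CM
    return BASE_HZ + closeness * (MAX_HZ - BASE_HZ)

if __name__ == "__main__":
    for d in [40, 30, 15, 5, 0]:
        print(f"fish at {d:2d} cm -> {tail_beat_hz(d):.2f} Hz")
```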

This study shows the effectiveness of real-time visual feedback in efforts to use robots to influence live animal behavior. The findings may have particular application in wildlife conservation, where robots could be used to steer groups of animals out of harm's way.

Story Source:

The above story is reprinted from materials provided by Polytechnic Institute of New York University.

Journal Reference:

V. Kopman, J. Laut, G. Polverino, M. Porfiri. Closed-loop control of zebrafish response using a bioinspired robotic-fish in a preference test. Journal of the Royal Society Interface, 2012; 10 (78): 20120540. DOI: 10.1098/rsif.2012.0540


Dragon readies for operational delivery flight

Oct. 5, 2012 — SpaceX is set to launch the first of a dozen operational missions for NASA on Oct. 7, delivering more than 1,000 pounds of supplies to the International Space Station. Launch time is 8:35 p.m. from Space Launch Complex 40 at Cape Canaveral Air Force Station in Florida, just a few miles south of the space shuttle launch pads. The spacecraft will be joined to the station three days later.

The flight, known as CRS-1, will launch and perform the same rendezvous with the station as a previous SpaceX craft.

The SpaceX Dragon capsule will ride into space on the strength of the company's Falcon 9 rocket and the booster's nine first stage kerosene- and oxygen-powered Merlin engines. The Falcon 9's second stage uses a single Merlin engine to boost the Dragon into its final orbit.

Eleven minutes after launch, when the Dragon capsule is safely in orbit, a pair of solar arrays will deploy from the sides of the Dragon and controllers on Earth will begin testing rendezvous sensors.

The mission is similar to the demonstration flight in May when a Dragon was grappled by the station's robotic arm to complete the first rendezvous and berthing by a private spacecraft at the space station.

The SpaceX craft will spend about three weeks connected to the station, then it will be released to return to Earth.

A major difference for this mission is that the Dragon will be filled with an amount of cargo suitable for an operational mission. The prior flight carried just enough items to prove the capsule would do its job as a cargo hauler. This time, the manifest will include a freezer for the station's scientific samples, a powered middeck locker with an experiment inside, and a variety of materials for the astronauts living and working on the space station.

The supply flight is part of NASA's Commercial Resupply Services contract, which is paying SpaceX for 12 cargo runs to the orbiting laboratory. The station also is serviced by Russian Progress cargo capsules, European-made and launched Automated Transfer Vehicles, or ATVs, and Japanese-produced H-II Transfer Vehicles, or HTVs. All the cargo ships operate without astronauts or crew members aboard.

Once the spacecraft arrive at the station, the astronauts and cosmonauts onboard unload them and fill them with used materials or unneeded equipment before releasing them.

Here, SpaceX again does something unique. The Dragons are built with heat shields to survive a plunge through the atmosphere and splashdown safely in the ocean under billowing parachutes. The other cargo craft do not carry heat shields, so they just burn up in the atmosphere.

On its return trip, the Dragon capsule will carry more than a ton of scientific samples collected during space station research, along with the freezer the samples have been stored in. Astronauts also will load used station hardware into the capsule for return to Earth where engineers can get a firsthand look at it.

A second kind of American cargo craft is also being developed. The Orbital Sciences' Cygnus spacecraft and Antares rocket are due to make a demonstration flight later this year.

Story Source:

The above story is reprinted from materials provided by NASA.


NASA's Iron Man-like exoskeleton could give astronauts, paraplegics improved mobility and strength

Oct. 12, 2012 — Marvel Comics' fictional superhero, Iron Man, wears a powered armor suit that gives him superhuman strength. While NASA's X1 robotic exoskeleton can't do what you see in the movies, this latest space-technology spinoff, derived from NASA's Robonaut 2 project, may someday help astronauts stay healthier in space, with the added benefit of helping paraplegics walk here on Earth.

NASA and The Florida Institute for Human and Machine Cognition (IHMC) of Pensacola, Fla., with the help of engineers from Oceaneering Space Systems of Houston, have jointly developed a robotic exoskeleton called X1. The 57-pound device is a robot that a human could wear over his or her body either to assist or inhibit movement in leg joints.

In the inhibit mode, the robotic device would be used as an in-space exercise machine to supply resistance against leg movement. The same technology could be used in reverse on the ground, potentially helping some individuals walk for the first time.
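NASA's materials do not describe X1's control laws. A standard way for an exercise device to supply resistance against leg movement is a damping rule in which motor torque opposes joint velocity; the Python sketch below is a generic illustration of that idea, with invented gains and limits, not NASA's controller.

```python
# Generic damping ("resistance") rule of the kind an exercise exoskeleton
# could use: command torque opposite to joint velocity. Gains invented.

DAMPING = 15.0        # N*m per rad/s of joint speed (assumed)
TORQUE_LIMIT = 40.0   # safety clamp in N*m (assumed)

def resistance_torque(joint_velocity_rad_s, damping=DAMPING):
    """Torque opposing motion; zero when the joint is still."""
    torque = -damping * joint_velocity_rad_s
    return max(-TORQUE_LIMIT, min(TORQUE_LIMIT, torque))

if __name__ == "__main__":
    for v in [-2.0, -0.5, 0.0, 0.5, 2.0]:
        print(f"velocity {v:+.1f} rad/s -> torque {resistance_torque(v):+.1f} N*m")
```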

"Robotics is playing a key role aboard the International Space Station and will continue to be critical as we move toward human exploration of deep space," said Michael Gazarik, director of NASA's Space Technology Program. "What's extraordinary about space technology and our work with projects like Robonaut are the unexpected possibilities space tech spinoffs may have right here on Earth. It's exciting to see a NASA-developed technology that might one day help people with serious ambulatory needs begin to walk again, or even walk for the first time. That's the sort of return on investment NASA is proud to give back to America and the world."

Worn over the legs with a harness that reaches up the back and around the shoulders, X1 has 10 degrees of freedom, or joints -- four motorized joints at the hips and the knees, and six passive joints that allow for sidestepping, turning and pointing, and flexing a foot. There also are multiple adjustment points, allowing the X1 to be used in many different ways.

X1 currently is in a research and development phase, with the primary focus on design, evaluation and improvement of the technology. NASA is examining the potential of the X1 as an exercise device to improve crew health both aboard the space station and during future long-duration missions to an asteroid or Mars. Without taking up valuable space or weight during missions, X1 could replicate common crew exercises, which are vital to keeping astronauts healthy in microgravity. In addition, the device can measure, record and stream exercise data back to flight controllers on Earth in real time, giving doctors better feedback on the impact of the crew's exercise regimen.

As the technology matures, X1 also could provide a robotic power boost to astronauts as they work on the surface of distant planetary bodies. Coupled with a spacesuit, X1 could provide additional force when needed during surface exploration, improving the ability to walk in a reduced gravity environment, providing even more bang for its small bulk.

Here on Earth, IHMC is interested in developing and using X1 as an assistive walking device. By combining NASA technology and walking algorithms developed at IHMC, X1 has the potential to produce high torques to allow for assisted walking over varied terrain, as well as stair climbing. Preliminary studies using X1 for this purpose have already started at IHMC.

"We greatly value our collaboration with NASA," said Ken Ford, IHMC's director and CEO. "The X1's high-performance capabilities will enable IHMC to continue performing cutting-edge research in mobility assistance while expanding into the field of rehabilitation."

The potential of X1 extends to other applications, including rehabilitation, gait modification and offloading large amounts of weight from the wearer. Preliminary studies by IHMC have shown X1 to be more comfortable, easier to adjust, and easier to put on than previous exoskeleton devices. Researchers plan on improving on the X1 design, adding more active joints to areas such as the ankle and hip, which will, in turn, increase the potential uses for the device.

Designed in only a few years, X1 came from technology developed for Robonaut 2 and IHMC's Mina exoskeleton.

NASA's Game Changing Development Program, part of NASA's Space Technology Program, funds the X1 work. NASA's Space Technology Program focuses on maturing advanced space technologies that may lead to entirely new approaches for space missions and solutions to significant national needs.

For additional information about IHMC, visit: http://www.ihmc.us

For information about the X1 and Robonaut, visit: http://www.nasa.gov/robonaut

Story Source:

The above story is reprinted from materials provided by NASA.
