Sneaky Robot ‘Cheats’ at Game

June 29, 2012

It might be time to start designing ethics software for robots — researchers at the Ishikawa Oku Lab at the University of Tokyo have designed a robot with sensors fast enough to ‘cheat’ at rock, paper, scissors. The robot processes its opponent’s move and makes its own so quickly that it’s impossible to tell it’s cheating — except for the fact that it wins 100% of the time.
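
For the curious, the ‘cheat’ boils down to a tight sense-then-counter loop. Below is a minimal Python sketch of that loop; the recognize_gesture() stub is purely illustrative, standing in for the lab’s millisecond-scale vision pipeline.

```python
# Minimal sketch of the "cheat" loop: classify the human's gesture from a
# high-speed camera frame, then immediately play the counter-move.
# recognize_gesture() is a hypothetical stand-in; the real system
# classifies the hand shape within about a millisecond.

COUNTER = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

def recognize_gesture(frame) -> str:
    """Stub for the lab's high-speed hand-shape classifier (illustrative)."""
    return frame  # pretend the frame is already labeled for this sketch

def play_round(frame) -> str:
    human_move = recognize_gesture(frame)
    return COUNTER[human_move]  # the winning response, every time

print(play_round("rock"))  # -> "paper"
```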

While this video is a fun demonstration, it also shows the amazing progress in robotics and machine vision technology. The more advanced the technology gets, the more applications we can discover. What could we do with a robot that can ‘see’ and react faster than humanly possible?

Source: Robot Hand Beats You at Rock, Paper, Scissors 100% Of The Time on IEEE Spectrum


Service Robot Scales Wind Towers Vertically

June 28, 2012

The robotics industry is full of futuristic-minded people who realize the benefits of using technology to do jobs that are tedious, dangerous, or otherwise difficult for people. Helical Robotics has coupled robotics with another forward-thinking industry, wind power, to come up with a new tool for maintaining alternative energy solutions.

Robots ready for outside, up-tower work
by Paul Dvorak

Access to a wind tower has traditionally required the use of cranes, bucket trucks, or rappelling teams. Engineers at Wisconsin-based Helical Robotics have designed another way, one they say streamlines the work. The approach uses remote-controlled robotic devices that can scale a wind tower. These robotic platforms can be fitted with a wide range of devices, from cameras and non-destructive testing equipment to robotic arms and lifts.

By using a service robot, the wind power industry can not only save money on expensive maintenance equipment and procedures, but also eliminate some of the need for people to complete tasks at dangerous heights. Click here to read the full article on Helical’s new robot on Windpower Engineering & Development. What other applications can you see for a vertically-gliding robot? Go to Helical Robotics’ website for more information, including the video below.


Robots Get a Feel for Things with Human-Like Hands

June 21, 2012

For a robot to be an effective tool in the workplace, it usually needs to know what sort of materials it’s working with. While it’s relatively easy to program a robot to identify a limited set of expected items, it would be helpful to push this technology further, so that robots can identify a variety of materials — and perhaps learn new ones as well.

These Robot Fingers Can Feel Objects Better than a Human Can

by Keith Wagstaff

Developed by researchers at the University of Southern California’s Viterbi School of Engineering, the BioTac features a soft skin-like material set over a liquid filling, all centered around a solid “skeleton.” The faux fingerprints help the finger pick up on vibrations as it moves across a surface, which are then detected by a hydrophone inside the “bone” and processed by a computer to figure out what the material is.

It’s actually pretty similar to how a human judges texture, except that the BioTac is even more sensitive than a human finger. The researchers had it feel 117 common materials — it correctly identified 95% of them, touching an object an average of five times. The finger can even tell which direction an object is moving and what temperature it is.
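
To make the idea concrete, here is an illustrative Python sketch of classifying a material from vibration data. The spectral features and nearest-neighbor matching are our own assumptions, not the USC team’s published pipeline.

```python
# Illustrative sketch of texture classification from vibration data,
# loosely following the BioTac idea: slide across a surface, extract
# simple spectral features, and match against known materials with a
# nearest-neighbor rule. Features and data here are assumptions.

import numpy as np

def vibration_features(signal: np.ndarray) -> np.ndarray:
    """Crude spectral summary: mean energy in four coarse frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.array([band.mean() for band in np.array_split(spectrum, 4)])

def classify(sample: np.ndarray, library: dict) -> str:
    """Match a new touch against previously 'felt' materials."""
    f = vibration_features(sample)
    return min(library, key=lambda name: np.linalg.norm(f - library[name]))

# Illustrative use: a library built from prior touches, then a new touch.
rng = np.random.default_rng(0)
library = {"denim": vibration_features(rng.normal(size=1024)),
           "glass": vibration_features(rng.normal(size=1024) * 0.1)}
print(classify(rng.normal(size=1024), library))
```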

Read the full article and watch a cool video of these robotic fingers in use at Time.com. If a robot had fingers instead of grippers, hands instead of claws, what sort of new applications could they be used for? How would that change existing processes?


MIT Researchers Teach Robots to Work with People

June 13, 2012

Last week we linked to an article about the evolving role of robotics in automation — from a tool we use to a tool we work with. One of the biggest hurdles in pushing forward is a robot’s limited adaptability: a robot requires extensive programming before it can work alongside a human. Reducing that burden is just what the researchers at MIT are working on.

MIT enables robot, human collaboration in manufacturing

by Sharon Gaudin

Researchers are using the algorithm to train robots to work with humans, according to MIT. Shah and her team are scheduled to present their findings at the Robotics: Science and Systems Conference in Sydney in July.

“It’s an interesting machine-learning human-factors problem,” Shah said. “Using this algorithm, we can significantly improve the robot’s understanding of what the person’s next likely actions are.”

Researchers used a computational model in the form of a decision tree. Each branch of the tree represents a choice that a mechanic might make. For instance, does the mechanic want to put one bolt in place and hammer it in, or does the worker want to put a row of bolts in place first and then hammer them in?

The robot is designed to learn as it works, picking up on the mechanic’s personal preferences.
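
As a toy illustration of the preference-learning idea (the branch names are made up, and MIT’s actual algorithm is not detailed in the excerpt):

```python
# Toy sketch of preference learning: watch which branch of the decision
# tree a mechanic takes (e.g., "bolt-then-hammer" vs. "row-then-hammer")
# and predict the likelier next action. Branch names are illustrative.

from collections import Counter

class PreferenceModel:
    def __init__(self):
        self.observed = Counter()

    def observe(self, branch: str):
        """Record which branch of the decision tree the mechanic took."""
        self.observed[branch] += 1

    def predict(self) -> str:
        """Return the mechanic's most frequently chosen branch so far."""
        return self.observed.most_common(1)[0][0]

model = PreferenceModel()
for branch in ["bolt_then_hammer", "bolt_then_hammer", "row_then_hammer"]:
    model.observe(branch)
print(model.predict())  # -> "bolt_then_hammer"
```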

Read the full article at Computerworld. How would a learning, adaptable robot help you in your automated system?


Machine Vision for Robot Guidance

June 7, 2012

by Bennett Brumson, Contributing Editor
Robotic Industries Association
Posted 06/04/2012

[Image: Robot using vision for small part handling or assembly, courtesy FANUC Robotics America Corp.]

Without a vision guidance system, robots would be blind, unable to present themselves to parts. The increased power of vision guidance systems eliminates the need for expensive fixtures that often must be removed or modified when manufacturers introduce new products or parts.

“The biggest change in the past five years is how vision-guided robotic systems are used and how these systems can automatically generate new frames and new tools. Increased accuracy of vision guidance systems provides for increased robotic accuracy,” says Steve Prehn, Vision Product Manager at FANUC Robotics America Corp. (Rochester Hills, Michigan).

Seeing Accurately
More powerful and accurate cameras are a boon to end-users of industrial robotics. “Vision guidance systems are able to capture very accurate three-dimensional locations with just one camera,” says Doug Erlemann, Business Development Manager with Baumer Ltd. (Southington, Connecticut). Erlemann sees more accurate software, more rugged equipment and cameras with features that alleviate lighting problems. “Cameras with automatic gain are more accurate and robust. Vision guidance systems take into account more than just vision calculations and robot calculations, but are tied together in the overall system.”

Erlemann speaks of how increased accuracy of robotic vision guidance systems facilitates welding applications. “For very fine applications such as aircraft welds, the guidance system must be perfect. Software can run a welding bead to within 10 microns. In welding applications, 10 microns is very accurate.” Typical welding applications require accuracy to within plus or minus a millimeter or two, says Erlemann, but some high-end aircraft work demands near-perfect welds.

[Image: Screen image of a 2-D vision system in a packing application, courtesy Kawasaki Robotics (USA) Inc.]

Likewise, Brian Carpenter, Software Engineer with Kawasaki Robotics (USA) Inc. (Wixom, Michigan) sees more accurate vision guidance systems for robotics. “Recently, more single-camera three-dimensional systems are available. Resolution and accuracy have improved, and these systems do not require calibration as difficult as stereoscopic systems and can accommodate different working distances.”

Stereoscopic vision guidance systems allow more precise depth measurement. Camera systems will be capable of locating objects as well as tracking and predicting their location while moving, Carpenter says. “Tracking will allow for faster, more responsive tracking.”

Prehn says vision guidance systems are utilized by end-users as a feedback device for generating very accurate frames and tools. “Robot-mounted cameras and the images they generate refine an object’s position through triangulation, providing for tremendous accuracies. Robots operating within six degrees of freedom are a perfect match with three-dimensional vision-guided solutions.”
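
For readers wondering what “refining position through triangulation” looks like, here is a sketch of textbook two-ray triangulation; this is the general geometric idea, not FANUC’s implementation.

```python
# Two-ray triangulation: a robot-mounted camera images the same feature
# from two known poses, and the feature's 3-D position is recovered as
# the least-squares midpoint of the two viewing rays p = o + t*d.

import numpy as np

def triangulate(o1, d1, o2, d2):
    """Closest point between two viewing rays (least-squares midpoint)."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 that bring the rays closest together.
    a = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return ((o1 + t1 * d1) + (o2 + t2 * d2)) / 2.0

# Example: two camera poses whose rays intersect at (1, 1, 1).
p = triangulate(np.zeros(3), np.array([1.0, 1.0, 1.0]),
                np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 1.0]))
print(p)  # -> approximately [1. 1. 1.]
```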

Prehn goes on to say, “Robust locational systems have the flexibility to quickly adapt to new parts as they are presented to the robot and provide accurate results to have them engage with new parts. Increased processing power allows integrators to go after markets that would be too difficult otherwise.”

Assembly applications on the micro- and nano-levels are among the new markets for robotics served by enhancements to vision guidance systems. “Guidance systems accurately locate very small objects or zoom in to validate positions very precisely. When looking at very small fields of view, resolution goes to the micron range. End-users use vision guidance systems to validate and correct for positional inaccuracies over the robot’s working area,” says Prehn.

Charles Ridley, Material Handling Service Manager with PAR Systems Inc. (Shoreview, Minnesota) also talks about the role of robotic vision guidance systems in micro-assembly applications. “The challenges with micro-assembly are similar to other robotic vision applications. Ensuring that the robot chosen for the application has the repeatability and accuracy to handle the tolerances that come with a micro application is key. The vision guidance system must have a higher resolution.”

Calibrated Guidance

Vision guidance systems require calibration with the robot to ensure proper positioning when that robot performs its tasks, says Greg Garmann, Technology Advancement Manager with Yaskawa America Inc.’s Motoman Robotics Division (Miamisburg, Ohio). “Calibrating multiple camera systems between the robotic space and the vision space so that the robot can understand what the vision camera sees is important. Many applications require variable focal lengths and end-users want automatic focus to determine the depth or distance the guidance camera is from objects.”
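
To see why this calibration matters, consider a minimal sketch: once the camera-to-robot transform is known, a part located in camera coordinates can be handed to the robot in its own base frame. The transform values below are assumed for illustration.

```python
# Why camera-to-robot calibration matters: a part found in the camera
# frame is only useful once mapped into the robot's base frame.
# T_base_cam (the calibration result) is assumed known here; obtaining
# it is the hand-eye calibration step Garmann describes.

import numpy as np

def to_homogeneous(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Pack rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def camera_to_base(point_cam: np.ndarray, T_base_cam: np.ndarray) -> np.ndarray:
    """Map a 3-D point from camera coordinates into robot-base coordinates."""
    return (T_base_cam @ np.append(point_cam, 1.0))[:3]

# Example: identity rotation, camera mounted 0.5 m above the base origin.
T = to_homogeneous(np.eye(3), np.array([0.0, 0.0, 0.5]))
print(camera_to_base(np.array([0.1, 0.2, 0.3]), T))  # -> [0.1 0.2 0.8]
```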

[Image: Vision-enabled robots at Automate 2011, courtesy Comau Robotics]

End-users must recalibrate the vision system occasionally, says Garmann. “When the focus is changed, that also changes the field of view and the calibration of the camera system to the robot. End-users want automatic focus so the guidance system can understand different focal lengths.”

Calibration issues are important to end-users of Comau Robotics’ (Southfield, Michigan) systems, says Process Technology Director Joe Cyrek. “With advancements in computing power, systems allow for robot guidance in six degrees of freedom with one camera and cable without calibration. That advancement is significant.” Cyrek adds, “End-users want no calibration and simplicity in vision guidance systems. A single camera, cable, a simple interface without the need for calibration equals increased mean time between failures and decreased mean time to recovery, and fast set up.”

Algorithms and their application in robot guidance solutions have changed the perception of robot guidance from “complicated” and “avoid at all costs” to “simple” and “find more ways to use it,” says Cyrek.

Calibration of robotic vision guidance systems is also on the mind of Nicholas Hunt, Automotive Technology Support Group Manager at ABB Inc. (Auburn Hills, Michigan). “I see more light with structured wavefronts coming of age for three-dimensional surface scanning applications. The result requires processing massive amounts of data very quickly. New processors provide the necessary speed to fit the demands of production throughput.” Hunt stresses the need for good calibration between the robot tool center point and the camera, or calibration between the camera and the work cell. “Vision system calibration issues might not be evident until weeks later.”

FANUC’s Prehn sums up the importance of proper calibration. “If mistakes are made in calibrating the camera system or applying positions relative to where the camera finds the object, the integrator can get into trouble.” Taking advantage of training courses offered by robot manufacturers or the Robotic Industries Association (RIA, Ann Arbor, Michigan) in vision guidance systems is a good way to avoid calibration pitfalls.

“Lighting, Lighting, Lighting”
[Image: Robot with gripper and vision guidance system ready for work, courtesy Applied Robotics Inc.]

Proper lighting is crucial for guidance systems to function properly and consistently. “Vision has always been about lighting and lensing, and guidance is no exception. Maintaining nominal lighting, whether it be raster lighting or structured lighting, is key to consistently reporting part position to the robot,” says Henry Loos, Controls and Applications Engineer with Applied Robotics Inc. (Glenville, New York). “Do not skimp on high quality lighting and use high quality lenses.”

ABB’s Nick Hunt says, “In machine vision for robotic guidance, the mantra should be ‘lighting, lighting, lighting.’ While debugging a system, some integrators spend too much time looking at the application or the processing of the information. Integrators should pay more attention to the lighting and focal lengths of the camera.”

When setting up a robotic guidance vision system, lights should be arranged to cast diffused illumination rather than direct light, Hunt suggests. “Machine vision applications should not use direct lighting, but should diffuse lighting with light boxes, light-emitting diode (LED) ring lights or other forms of structured light.” Hunt says structured lighting is easier to apply than in the recent past. “Suppliers have stepped up to make simple LED ring lights easy to implement on today’s guidance cameras. The benefits of structured LED lighting in robotic machine vision applications are quicker integration, lower costs, and minimized effects of inconsistent ambient light.”

Lighting issues are best addressed in the early planning phase of a work cell to save time, effort and money, Hunt says. “Ambient lighting is very important. Integrators should not underestimate how much difficulty end-users might run into if lighting issues are not well thought out. Vision guidance systems are easier and less expensive to implement than in the past with simple LED ring lights.”

Lighting equipment degrades slowly over time, cautions Ridley. “Lighting characteristics change over time, presenting the biggest stumbling blocks to consistent vision operation.” Similarly, Carpenter of Kawasaki Robotics says, “Lighting changes over time, even with LED lights. The intensity of the light changes over years, so end-users must budget for making changes to lighting systems or have someone on staff who plans for that.”

2, 2.5, 3-D
Robotic vision guidance systems scan objects in two or three dimensions. Unless the application requires three-dimensional vision guidance, integrators favor using two-dimensional systems. “Get the simplest vision guidance system that meets the needs of the application. If the application only needs 2.5-dimensional guidance, end-users should not bother investing in a three-dimensional system,” advises Cyrek.

“Vision guided robotics systems matured in the two-dimensional arena first,” says Adil Shafi, President of ADVENOVATION Inc. (Rochester Hills, Michigan). “Three-dimensional solutions have evolved steadily over the past decade, most notably when directed at individual objects with reduced calibration, when the geometry of the part is feature-rich or highly repeatable. This trend will continue to grow as lighting, computing power and algorithms improve. Expect more bin picking solutions in the next decade.”
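
One standard way a single camera recovers a full six-degree-of-freedom pose from a part with known, feature-rich geometry is the perspective-n-point (PnP) solution. Here is a sketch using OpenCV’s solvePnP; all points and intrinsics below are made-up illustrative values.

```python
# Sketch of single-camera 6-DOF pose recovery via perspective-n-point:
# given known feature locations on the part and where those features
# appear in the image, solvePnP returns the part's pose in the camera
# frame. Model points, pixels, and intrinsics here are illustrative.

import numpy as np
import cv2

# Four known feature locations on the part, in the part's frame (meters).
model_pts = np.array([[0, 0, 0], [0.1, 0, 0], [0.1, 0.05, 0], [0, 0.05, 0]],
                     dtype=np.float64)
# Where those features were detected in the image (pixels), assumed here.
image_pts = np.array([[320, 240], [420, 238], [421, 190], [322, 192]],
                     dtype=np.float64)
# Pinhole intrinsics from a prior camera calibration (assumed).
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(model_pts, image_pts, K, None)
print(ok, rvec.ravel(), tvec.ravel())  # part pose in the camera frame
```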

Three-dimensional guidance systems are being increasingly adopted by end-users due to growing reliability at a lower cost. “Single camera 3D systems are becoming popular due to resolution and accuracy improvements. Single camera three-dimensional systems are great because they do not require as difficult calibration as stereoscopic systems and can accommodate different working distances,” says Carpenter.

Prehn also anticipates more 3D guidance systems in robotic work cells. “Integrators have been doing three-dimensional guidance for quite some time. As guidance technology advances, end-users are able to leverage guidance systems to greater accuracy.” Increased processing power of vision systems and robot controllers allows vision-guided robots to enter new markets, Prehn says.

Watching and Learning
Integrators, robot end-users, teachers, students, newcomers and experts can learn about vision-guided robotics through a free webinar hosted by RIA and Shafi on Thursday, June 7 at 12:00 PM EDT. The Robots: Vision Guidance Technology (2D) webinar will cover basic concepts, good applications, product examples, flexible tooling, hard tooling, vacuum, vision and other related topics. “I will show application videos and a PowerPoint presentation to provide a background of two-dimensional vision guidance applications,” says Shafi. “The webinar is balanced for experts as well as those who have never used robots before, and everyone in between.”

RIA webinars typically attract 400 viewers from around the world, comprising end-users, integrators, component providers, teachers, students and market analysis professionals. “I design webinars to be easy enough for people new to the industry to understand. I build on this foundation to show new trends and expert perspectives. We are assisted by a panel of industry experts who answer questions and present varied perspectives.”

On Track
As vision guidance systems become more powerful and more compact, they will routinely incorporate tracking and traceability functions, says Erlemann. “The automotive sector will eventually go the way of pharmacy applications, where each pill and bottle is tracked through the manufacturing process. The automotive industry will track each door panel and each part of the panel with individual serial numbers, to track where all parts are put together.”

Tracking and traceability are good for failure analysis, says Erlemann. “When a particular car model is seen as a lemon, the guidance system helps track individual panels or parts. Pharmaceutical applications are required by Food and Drug Administration (FDA) regulations to track each part. The automotive industry will move towards that in the next five years.”

Seeing Guidance in Action
Robot makers, end-users and integrators will show off vision guidance systems and other robotic equipment at Automate 2013. The Automate trade show and conference is co-located with ProMat in Chicago, January 21-24, 2013, and will feature robot vision guidance systems. “Comau will once again bring our three-dimensional vision system in a user-interactive demonstration of its simplicity and flexibility. Last year’s putt-putt golf ball retrieving robot was a crowd favorite,” recalls Joe Cyrek.

Originally posted on Robotics Online.


The Changing Shape of Robots in Manufacturing

June 5, 2012

Robots in manufacturing often have a role separate from human employees, doing what a man or woman cannot: moving heavy equipment, performing repetitive tasks, working in hazardous situations. But as robotic technology develops, we’ll see a trend of robots not just working separately from people, but with them.

Robots Get a Makeover in Factories
By JAMES R. HAGERTY and MIHO INADA

Today, robots are used mostly in making cars and semiconductors or other goods produced in high volumes and requiring force or precision beyond human levels. They are good in warehouses, too. In March, Amazon.com Inc. announced it is paying $775 million to buy Kiva Systems Inc., a maker of squat, cube-shaped robots that move products around shipping centers.

Kawada’s new NextAge robot, whose sensor eyes give it a passing resemblance to the movie character WALL-E, is “capable of replacing or collaborating with humans.” The robots cost about $90,000 for the basic model.

Japanese industrial conglomerate Hitachi Ltd. introduced a NextAge robot last September to a factory outside of Tokyo that makes computer storage products. There, NextAge puts a cover over each hard-disk drive’s fan and tightens screws. This simple task, once handled by a person, shaves off nearly a minute of production per disk drive—a big time saver over the span of thousands of devices—and lets the human workers assemble other types of parts.

Another Japanese company, Glory Ltd., started using NextAge in November 2010 to install a tiny part in its money-sorting equipment for retail stores. The company found that using the robot saves labor costs and brings the defect rate near zero, “which is not possible for human workers,” a Glory spokesman said. Glory now has 10 NextAge robots in the factory north of Tokyo.

ABB is also developing a humanoid-like robot that can squeeze into small workspaces and learn new tasks quickly. The “dual-arm concept robot” will be agile enough to assemble consumer-electronic products, among other things, ABB says.

Researchers aren’t just concerned with developing robots’ fine motor skills. They’re also hoping to increase a robot’s ability to handle uncertainty, unpredictability, and decision making. How do you see such developments impacting the processes of robotic automation and manufacturing?

We’re proud to see so many RIA members on the cutting edge of robotic technology. Read the full article at The Wall Street Journal.