Pulley System Gives Robot Real Muscle

December 12, 2012

A working human body requires more than just hooking up muscles and nerves, as poor Doctor Frankenstein found out. But humanoid robotics researchers are finding that sometimes the original design is best. Swapping muscles and arteries for pulleys and motors, they’ve come up with a robot that’s not afraid to flex its muscles.

Kenshiro Robot Gets New Muscles and Bones
by Angelica Lim

Why try to mimic the human body? It turns out that getting a robot’s weight right is a tricky problem. Yuto Nakanishi, the head of the project, spoke about the weight problems of Kenzoh, Kenshiro’s tendon-driven upper-body robot ancestor. Kenzoh was a hearty 45 kg, just for the upper body. Scaling up, they projected that a full-body Kenzoh could weigh as much as 100 kg!

That was a lot of weight for a relatively small robot. So they decided to design a robot with the same weight ratios as a human. For example, a 55 kg boy would have about a 5 kg thigh and a 2.5 kg calf. Kenshiro copies that ratio, with a 4 kg thigh and a 2.76 kg calf. Balance is key.
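As a rough illustration of this scaling rule, ratio-based sizing can be sketched in a few lines of Python. The ratios below come from the article's 55 kg example; they are illustrative only, not official Kenshiro design data.

```python
# Illustrative sketch: scaling human body-segment mass ratios to a robot.
# Ratios approximate the article's example (55 kg human: ~5 kg thigh,
# ~2.5 kg calf); they are not official Kenshiro design figures.
HUMAN_MASS = 55.0
SEGMENT_RATIOS = {"thigh": 5.0 / 55.0, "calf": 2.5 / 55.0}

def segment_masses(total_mass, ratios=SEGMENT_RATIOS):
    """Return target segment masses (kg) for a robot of the given total mass."""
    return {name: round(total_mass * r, 2) for name, r in ratios.items()}

print(segment_masses(55.0))  # thigh 5.0 kg, calf 2.5 kg
```

Holding the ratios fixed while varying total mass is what keeps the robot's balance properties human-like as the design scales.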

Weight was one thing, but the researchers also tried to mimic human muscle torque and joint speeds. Kenshiro’s total power output is 5 times greater than Kojiro’s, allowing it to do things like the gymnastics-like leg lift in the video above. Kenshiro can achieve almost the same joint torque as a human, though its joint angular speed, at 70-100 degrees per second, is not quite at human level. It’s a trade-off between weight and power: bigger, stronger motors are often heavier.

Read the full article at IEEE Spectrum. What are some industrial applications that could benefit from a humanoid robot’s flexibility? On the other hand, what are some applications that robots are better at because they don’t have to adhere to a human-like design?


From the Stars to the Factory Floor – Software to Help Design

August 15, 2012

In NASA’s pioneering efforts to continually push the frontier of space exploration, they’ve turned to the robotics and automation industries for help more than once. But now NASA is returning the favor — a software program called HyperSizer, developed by Collier Research Corp., assisted them in the construction of their new Composite Crew Module (CCM), and they’re releasing the software for use by US industries.

NASA Tool Could Cross Over to Manufacturing

by Peter Alpern

The design and construction of NASA’s Composite Crew Module was optimized with the help of HyperSizer structural sizing and design analysis software.

HyperSizer is a structural sizing and design optimization software tool that works in a feedback loop with finite element analysis (FEA) to automatically search for solutions that minimize weight and maximize manufacturability. Although it can also be used on metallic structures, HyperSizer is particularly applicable to complex composite materials, providing the capability to optimize the architecture of large structures such as aircraft, railcars, or wind turbine blades ply-by-ply and element-by-element.
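The sizing-loop idea can be sketched as follows. Note this is a toy stand-in, not HyperSizer's actual algorithm or API: the `margin_of_safety` model and all numbers are invented for illustration.

```python
# Toy sketch of a structural sizing loop: search candidate designs, check
# each with an analysis stub (standing in for an FEA call), and keep the
# lightest one whose margin of safety is non-negative.
def margin_of_safety(thickness_mm, load_kn):
    """Stand-in for an FEA result: allowable/applied - 1 (fictitious model)."""
    allowable = 12.0 * thickness_mm  # made-up allowable load per mm thickness
    return allowable / load_kn - 1.0

def size_panel(load_kn, candidates_mm):
    """Return the thinnest (lightest) candidate with non-negative margin."""
    feasible = [t for t in candidates_mm if margin_of_safety(t, load_kn) >= 0.0]
    return min(feasible) if feasible else None

print(size_panel(load_kn=60.0, candidates_mm=[2.0, 4.0, 6.0, 8.0]))  # -> 6.0
```

The real tool runs this kind of loop ply-by-ply and element-by-element over thousands of candidates, with the FEA providing the margins.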

NASA’s CCM is an all-composite alternative for the flight crew module Orion, which is part of NASA’s Constellation program to return man to the moon or go to Mars. The recent tests were considered a milestone in the design of human-rated spacecraft that points toward increased use of lightweight composites in space vehicles.

Read the full article from IndustryWeek here. How could use of this software influence the way engineers design robots and other automation technology?

Machine Vision for Robot Guidance

June 7, 2012

by Bennett Brumson , Contributing Editor
Robotic Industries Association
Posted 06/04/2012

Robot using vision for small part handling or assembly, courtesy FANUC Robotics America Corp.

Without a vision guidance system, robots would be blind, unable to position themselves relative to parts. The increased power of vision guidance systems eliminates the need for expensive fixtures that often must be removed or modified when manufacturers introduce new products or parts.

“The biggest change in the past five years is how vision-guided robotic systems are used and how these systems can automatically generate new frames and new tools. Increased accuracy of vision guidance systems provides for increased robotic accuracy,” says Steve Prehn, Vision Product Manager at FANUC Robotics America Corp. (Rochester Hills, Michigan).

Seeing Accurately
More powerful and accurate cameras are a boon to end-users of industrial robotics. “Vision guidance systems are able to capture very accurate three-dimensional locations with just one camera,” says Doug Erlemann, Business Development Manager with Baumer Ltd. (Southington, Connecticut). Erlemann sees more accurate software, more rugged equipment and cameras with features that alleviate lighting problems. “Cameras with automatic gain are more accurate and robust. Vision guidance systems take into account more than just vision calculations and robot calculations, but are tied together in the overall system.”

Erlemann speaks of how increased accuracy of robotic vision guidance systems facilitates welding applications. “For very fine applications such as aircraft welds, the guidance system must be perfect. Software can run a welding bead to within 10 microns. In welding applications, 10 microns is very accurate.” Typical welding applications require accuracy to within plus or minus a millimeter or two, says Erlemann, but some high-end aircraft work demands near-perfect welds.

Screen image of a 2-D vision system in a packing application, courtesy Kawasaki Robotics (USA) Inc.

Likewise, Brian Carpenter, Software Engineer with Kawasaki Robotics (USA) Inc. (Wixom, Michigan), sees more accurate vision guidance systems for robotics. “Recently, more single-camera three-dimensional systems are available. Resolution and accuracy have improved, and these systems do not require calibration and can accommodate different working distances.”

Stereoscopic vision guidance systems allow more precise depth measurement. Camera systems will be capable of locating objects as well as tracking and predicting their location while moving, Carpenter says. “Prediction will allow for faster, more responsive tracking.”

Prehn says vision guidance systems are utilized by end-users as a feedback device for generating very accurate frames and tools. “Robot-mounted cameras and the images they generate refine an object’s position through triangulation, providing for tremendous accuracies. Robots operating within six degrees of freedom are a perfect match with three-dimensional vision-guided solutions.”

Prehn goes on to say, “Robust locational systems have the flexibility to quickly adapt to new parts as they are presented to the robot and provide accurate results to have them engage with new parts. Increased processing power allows integrators to go after markets that would be too difficult otherwise.”
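At its geometric core, the triangulation Prehn describes reduces to intersecting bearing rays to the same feature from known camera positions. Below is a minimal 2-D sketch; real systems work in 3-D with full camera models, and the positions and angles here are made up for illustration.

```python
import math

# Minimal 2-D triangulation: intersect two rays, each defined by a known
# camera position and a bearing angle to the observed feature.
def triangulate(p1, theta1, p2, theta2):
    """Intersect rays from points p1, p2 at angles theta1, theta2 (radians)."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t*d1 == p2 + s*d2 for t via the 2x2 cross-product form.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # rays are (nearly) parallel; no unique intersection
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Feature at (1, 1) seen from two camera stations at 45 and 135 degrees:
print(triangulate((0.0, 0.0), math.pi / 4, (2.0, 0.0), 3 * math.pi / 4))
```

A robot-mounted camera gets its second viewpoint for free by moving the arm between images, which is one reason the pairing with six-degree-of-freedom robots works so well.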

Assembly applications on the micro and nano-levels are among the new markets for robotics served by enhancements to vision guidance systems. “Guidance systems accurately locate very small objects or zoom in to validate positions very precisely. When looking at very small fields of view, resolution goes to the micron range. End-users use vision guidance systems to validate and correct for positional inaccuracies over the robot’s working area,” says Prehn.

Charles Ridley, Material Handling Service Manager with PAR Systems Inc. (Shoreview, Minnesota) also talks about the role of robotic vision guidance systems in micro-assembly applications. “The challenges with micro-assembly are similar to other robotic vision applications. Ensuring that the robot chosen for the application has the repeatability and accuracy to handle the tolerances that come with a micro application is key. The vision guidance system must have a higher resolution.”

Calibrated Guidance

Vision guidance systems require calibration with the robot to ensure proper positioning when that robot performs its tasks, says Greg Garmann, Technology Advancement Manager with Yaskawa America Inc.’s Motoman Robotics Division (Miamisburg, Ohio). “Calibrating multiple camera systems between the robotic space and the vision space so that the robot can understand what the vision camera sees is important. Many applications require variable focal lengths and end-users want automatic focus to determine the depth or distance the guidance camera is from objects.”

Vision-enabled robots at Automate 2011, courtesy Comau Robotics

End-users must recalibrate the vision system occasionally, says Garmann. “When the focus is changed, that also changes the field of view and the calibration of the camera system to the robot. End-users want automatic focus so the guidance system can understand different focal lengths.”
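The calibration Garmann describes amounts to establishing a transform between the camera's coordinate frame and the robot's. A minimal planar sketch, with a hand-written rotation and translation standing in for a real calibration result:

```python
import math

# Planar sketch of using a camera-to-robot calibration: once the transform
# is known, a point located in the camera frame can be expressed in robot
# coordinates. The angle and offset below are invented for illustration.
def camera_to_robot(point_cam, angle_rad, translation):
    """Apply a planar rigid transform: rotate by angle, then translate."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    x, y = point_cam
    return (c * x - s * y + translation[0], s * x + c * y + translation[1])

# Camera rotated 90 degrees and offset (1, 0) from the robot base:
print(camera_to_robot((1.0, 0.0), math.pi / 2, (1.0, 0.0)))  # approx (1.0, 1.0)
```

Changing the lens focus effectively changes the camera model, which is why refocusing forces a recalibration of this transform.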

Calibration issues are important to end-users of Comau Robotics’ systems (Southfield, Michigan) says Process Technology Director, Joe Cyrek. “With advancements in computing power, systems allow for robot guidance in six degrees of freedom with one camera and cable without calibration. That advancement is significant.” Cyrek adds, “End-users want no calibration and simplicity in vision guidance systems. A single camera, cable, a simple interface without the need for calibration equals increased mean time between failures and decreased mean time to recovery, and fast set up.”

Algorithms and their application in robot guidance solutions have changed the perception of robot guidance from “complicated” and “avoid at all costs” to “simple” and “find more ways to use it,” says Cyrek.

Calibration of robotic vision guidance systems is also on the mind of Nicholas Hunt, Automotive Technology Support Group Manager at ABB Inc. (Auburn Hills, Michigan). “I see more light with structured wavefronts coming of age for three-dimensional surface scanning applications. The result requires processing massive amounts of data very quickly. New processors provide the necessary speed to fit the demands of production throughput.” Hunt stresses the need for good calibration between the robot tool center point and the camera, or calibration between the camera and the work cell. “Vision system calibration issues might not be evident until weeks later.”

FANUC’s Prehn sums up the importance of proper calibration. “If mistakes are made in calibrating the camera system or applying positions relative to where the camera finds the object, the integrator can get into trouble.” Taking advantage of training courses offered by robot manufacturers or the Robotic Industries Association (RIA, Ann Arbor, Michigan) in vision guidance systems is a good way to avoid calibration pitfalls.

“Lighting, Lighting, Lighting”
Robot with gripper and vision guidance system ready for work, courtesy Applied Robotics Inc.

Proper lighting is crucial for guidance systems to function properly and consistently. “Vision has always been about lighting and lensing, and guidance is no exception. Maintaining nominal lighting, whether it be raster lighting or structured lighting, is key to consistently reporting part position to the robot,” says Henry Loos, Controls and Applications Engineer with Applied Robotics Inc. (Glenville, New York). “Do not skimp on high quality lighting and use high quality lenses.”

ABB’s Nick Hunt says, “In machine vision for robotic guidance, the mantra should be ‘lighting, lighting, lighting.’ While debugging a system, some integrators spend too much time looking at the application or the processing of the information. Integrators should pay more attention to the lighting and focal lengths of the camera.”

When setting up a robotic guidance vision system, lights should be arranged to cast diffused illumination rather than direct light, Hunt suggests. “Machine vision applications should not use direct lighting, but should diffuse the light with light boxes, light-emitting diode (LED) ring lights or another form of structured light.” Hunt says structured lighting is easier to apply than in the recent past. “Suppliers have stepped up to make simple LED ring lights easy to implement on today’s guidance cameras. The benefits of structured LED lighting in robotic machine vision applications are quicker integration, lower costs and minimized effects of inconsistent ambient light.”

Lighting issues are best addressed in the early planning phase of a work cell to save time, effort and money, Hunt says. “Ambient lighting is very important. Integrators should not underestimate how much difficulty end-users might run into if lighting issues are not well thought out. Vision guidance systems are easier and less expensive to implement than in the past with simple LED ring lights.”

Lighting equipment degrades slowly over time, cautions Ridley. “Lighting characteristics change over time, presenting the biggest stumbling blocks to consistent vision operation.” Similarly, Carpenter of Kawasaki Robotics says, “Lighting changes over time, even with LED lights. The intensity of the light changes over years, so end-users must budget for making changes to lighting systems or have someone on staff who plans for that.”

2, 2.5, 3-D
Robotic vision guidance systems scan objects in two or three dimensions. Unless the application requires three-dimensional vision guidance, integrators favor using two-dimensional systems. “Get the simplest vision guidance system that meets the needs of the application. If the application only needs 2.5-dimensional guidance, end-users should not bother investing in a three-dimensional system,” advises Cyrek.

“Vision guided robotics systems matured in the two-dimensional arena first,” says Adil Shafi, President of ADVENOVATION Inc. (Rochester Hills, Michigan). “Three-dimensional solutions have evolved steadily over the past decade, most notably when directed at individual objects with reduced calibration, and when the geometry of the part is feature-rich or highly repeatable. This trend will continue to grow as lighting, computing power and algorithms improve. Expect more bin picking solutions in the next decade.”

Three-dimensional guidance systems are increasingly being adopted by end-users due to growing reliability at lower cost. “Single-camera 3D systems are becoming popular due to resolution and accuracy improvements. Single-camera three-dimensional systems are great because they do not require calibration as difficult as that of stereoscopic systems and can accommodate different working distances,” says Carpenter.

Prehn also anticipates more 3D guidance systems in robotic work cells. “Integrators have been doing three-dimensional guidance for quite some time. As guidance technology advances, end-users are able to leverage guidance systems to greater accuracy.” Increased processing power of vision systems and robot controllers allows vision-guided robots to enter new markets, Prehn says.

Watching and Learning
Integrators, robot end-users, teachers, students, newcomers and experts can learn about vision-guided robotics through a free webinar hosted by RIA and Shafi on Thursday, June 7 at 12:00 PM EDT. The Robots: Vision Guidance Technology (2D) webinar will cover basic concepts, good applications, product examples, flexible tooling, hard tooling, vacuum, vision and other related topics. “I will show application videos and a PowerPoint presentation to provide a background of two-dimensional vision guidance applications,” says Shafi. “The webinar is balanced for experts as well as those who have never used robots before, and everyone in between.”

RIA webinars typically attract 400 viewers from around the world comprised of end-users, integrators, component providers, teachers, students and market analysis professionals. “I design webinars to be easy enough for people new to the industry to understand. I build on this foundation to show new trends and expert perspectives. We are assisted by a panel of industry experts who answer questions and present varied perspectives.”

On Track
As vision guidance systems become more powerful and more compact, they will routinely incorporate tracking and traceability functions, says Erlemann. “The automotive sector will eventually go the way of pharmacy applications, where each pill and bottle is tracked through the manufacturing process. The automotive industry will track each door panel and each part of the panel with individual serial numbers, to track where all parts are put together.”

Tracking and traceability are good for failure analysis, says Erlemann. “When a particular car model is seen as a lemon, the guidance system helps track individual panels or parts. Pharmaceutical applications are required by Food and Drug Administration (FDA) regulations to track each part. The automotive industry will move towards that in the next five years.”

Seeing Guidance in Action
Robot makers, end-users and integrators will show off vision guidance systems and other robotic equipment at Automate 2013. The Automate trade show and conference is collocated with ProMat in Chicago January 21-24, 2013 and will feature robot vision guidance systems. “Comau will once again bring our three-dimensional vision system in a user interactive demonstration of its simplicity and flexibility. Last year’s putt-putt golf ball retrieving robot was a crowd favorite,” recalled Joe Cyrek.

Originally posted on Robotics Online.

The White House’s Eyes are on Robotics

May 2, 2012

Charles Thorpe, the White House’s man on robotics and a speaker at the 2012 Robotics Industry Forum, and Andrew Borene recently gave an interview on the future of robotics and the White House’s actions to continue developing the robotics industry.

Robotics and more U.S. jobs

by Neal St. Anthony

Q How does this tie into job creation?

A Advanced manufacturing [companies] are making our highways safer, our brave soldiers safer [with ground robots and aerial drones], our manufacturing more competitive and our workers more productive. These are growth companies. We in the United States have a wonderful record of inventing things, but not as good a record of keeping manufacturing and jobs in the U.S. We can compete with [Chinese and other low-cost factories] with smart tools, better-educated and more-productive workers.

These are not our fathers’ manufacturing jobs that just require strong backs and effort and a good work ethic. We need to make sure today’s workforce learns the math and science and gets the training to be CNC machine operators. [Baby boomers] all had shop class. We learned to work on stuff. Today’s kids don’t get that opportunity. The cars work better. And kids don’t tinker as much. We need to re-energize invention and tinkering and the skills to play with things and invent and build prototypes.

Read more of the interview here at the Star Tribune. Do you think the White House is looking down the right roads of advanced manufacturing?

RIA to Announce First Certified Robot Integrators on May 14

April 24, 2012

RIA will announce the first group of Certified Robot Integrators on May 14, 2012. These integrators have gone through a rigorous audit to confirm that they meet the requirements of the new Certified Robot Integrator designation. The program, which was launched in January at the Robotics Industry Forum in Orlando, has received tremendous interest from integrators and end users alike.

“We are thrilled at the high level of interest in our new program and look forward to announcing our first group of certified integrators very soon,” said Jeff Burnstein, President of RIA. “Our Robotics Online website will have the news first,” Burnstein noted.

Click for more details on the Certified Robot Integrator Program.

Originally posted on Robotics Online.

Robotics in Michigan – Recharging Our Future!

April 5, 2012

Robotics Luminaries from Around the State Converge on U of M to Promote Michigan as a Global Leader in Robotics and Autonomous Vehicles

ANN ARBOR, MI – March 28, 2012. The National Center for Manufacturing Sciences (NCMS), in partnership with the University of Michigan, will host the Second Annual Michigan Robotics Day on Monday, April 9, 2012.

This event is open to anyone who wants to learn about the incredible advances made by Michigan’s robotics technology sector with a strong emphasis on companies who would like to do business in Michigan. The event also highlights Michigan’s vibrant science, technology, engineering, and mathematics (STEM) community and its outstanding research universities. Robotics industry experts will meet with University researchers and students to promote the State as a global force in robotics and autonomous vehicles.

“Michigan is positioned to lead the world in robotics innovation,” said NCMS President & CEO Rick Jarman. “Consider the massive talent and infrastructure that already exists here in the state. Design and deployment of robotics technology will ultimately depend on advanced manufacturing – exactly the kind of capability in which Michigan companies excel.”

The daylong event will highlight the promise of robotics in Michigan, and confirm the field as an economic development engine for the State. The highlight of the event will be a keynote by Professor Lawrence Burns, a noted expert on next generation vehicle technology, including autonomous vehicles, transportation energy, and connected vehicles. Professor Burns is working with Google’s autonomous vehicle project and formerly served as Vice President of R&D for General Motors.

The day will include demonstrations of autonomous vehicles from leading automakers, robots from many state research institutions, and FIRST high school robotics teams demonstrating the next generation’s commitment to the technology. Student teams will have the chance to meet leaders in the robotics world, garner feedback for their work, and begin networking within the industry.

“Robotics represents a cradle-to-career opportunity for Michigan students,” said Phil Callihan, an Executive Director at NCMS. “They can start in high school, competing with FIRST robotics teams, do cutting edge research at our universities, and then work at Michigan companies who are global leaders in their field. This is an opportunity for long-term job growth, innovation, and success.”

The event takes place on April 9, 2012, starting at 9:00 a.m. Attendance is free and the public is welcome, with lunch provided for those who register by April 2. Registration, agenda and speaker bios for this event are available at: http://www.mirobotics.org/

Read the original press release here.

Robots Start Naval Basic Training

April 3, 2012

The United States military is always at the forefront of cutting-edge technology, and its interest in robotics is true to form. Last month it opened the doors to LASR, its Laboratory for Autonomous Systems Research. The center will serve as a development site and testing ground for new autonomous units designed for ground, air, and sea.

Like robots? Then you’ll love these pics from a new military robotics lab

by Jolie O’Dell

The new $17.7 million facility got its official ribbon-cutting ceremony just two weeks ago, nearly two years after ground was initially broken on the site. The facility includes a wide range of environments for testing, from simulated deserts and rainforests to a 45-by-25-foot pool with a wave generator capable of producing directional waves.

The Navy said the number and type of research robotics projects will increase as researchers register to use the new LASR facility.

“It’s the first time that we have, under a single roof, a laboratory that captures all the domains in which our Sailors, Marines and fellow DOD service members operate,” said Rear Adm. Matthew Klunder, chief of naval research, in today’s release. “Advancing robotics and autonomy are top priorities for the Office of Naval Research. We want to reduce the time it takes to deliver capability to our warfighters performing critical missions. This innovative facility bridges the gap between traditional laboratory research and in-the-field experimentation, saving us time and money.”

Read the full article and see pictures at VentureBeat.