
Application of Stereo Vision on Determination of End-Effector Position and Orientation of Manipulators

Abstract (Original Language): 
The main idea of this paper is to present the application of a stereo vision approach to determining the position and orientation of the end-effector of mechanical manipulators. Stereo vision is a technique for inferring the depth of an object from two cameras. A stereo vision test setup consisting of two parallel CMOS cameras is built, and a USB port is used to transmit image data to the computer. Experimental tests are carried out on a 6R mechanical manipulator whose end-effector moves along a circular trajectory, and on a Scout mobile robot that follows a pre-defined optimal trajectory. The image processing algorithm, matching algorithm, and triangulation calculations are optimized to reduce the time required for the matching procedure. The results show that the measurement error is less than 1 mm. Because of the fast image processing algorithm, the proposed method can be used to detect the end-effector along dynamic trajectories. This stereo vision setup can easily be applied as a measuring unit in a closed-loop control system.
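The depth recovery by triangulation mentioned above follows the standard parallel-camera relation Z = f * B / d, where f is the focal length in pixels, B is the baseline between the two cameras, and d is the disparity of a matched point. The short Python sketch below illustrates this relation; the function name and the numeric focal length and baseline are illustrative assumptions, not values taken from the paper.

# Minimal sketch of depth-from-disparity triangulation for two parallel,
# rectified cameras. The focal length and baseline are assumed example
# values, not parameters reported in the paper.

def depth_from_disparity(x_left: float, x_right: float,
                         focal_length_px: float, baseline_mm: float) -> float:
    """Return the depth Z (mm) of a point matched at column x_left in the
    left image and x_right in the right image, using Z = f * B / d."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_length_px * baseline_mm / disparity


if __name__ == "__main__":
    # Hypothetical values: 800 px focal length, 60 mm baseline, 40 px disparity.
    z = depth_from_disparity(x_left=420.0, x_right=380.0,
                             focal_length_px=800.0, baseline_mm=60.0)
    print(f"Estimated depth: {z:.1f} mm")  # 800 * 60 / 40 = 1200.0 mm

A smaller disparity corresponds to a more distant point, so the achievable depth accuracy depends directly on how precisely the matching step localizes corresponding pixels in the two images.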

REFERENCES

L. G. Hee, “Sensor based localization of a mobile”, PhD thesis, National Univ. of Singapore, 2007.
E. Jones and S. Soatto, “Visual-inertial navigation, mapping and localization: A scalable real-time causal approach”, Int. J. of Robotics Research, vol. 30, pp. 407-430, 2010.
A. Gutierrez, A. Campo, F. C. Santos, C. Pinciroli and M. Dorigo, “Social Odometry in Populations of Autonomous Robots”, Lecture Notes in Computer Science, vol. 5217, pp. 371-378, 2008.
P. Kucsera, “Sensors for mobile robot systems”, AARMS, vol. 5(4), pp. 645-658, 2006.
T. Tsukiyama, “Mobile robot localization from landmark bearings”, XIX IMEKO World Congress Fundamental and Applied Metrology, Lisbon, Portugal, September 6-11, 2009.
J. Santolaria, D. Guillomía, C. Cajal, J. A. Albajez and J. J. Aguilar, “Modeling and Calibration Technique of Laser Triangulation Sensors for Integration in Robot Arms and Articulated Arm Coordinate Measuring Machines”, Sensors, vol. 9, pp. 7374-7396, 2009.
MIT Computer Science and Artificial Intelligence Laboratory, Cambridge, MA 02139, 2004.
Evolution Robotics, Inc., 130 W., CA 91103, NorthStar Projector Kit User Guide, 1.0 edition.
A. Ramisa, A. Tapus, R. L. Mantaras and R. Toledo, “Mobile Robot Localization using Panoramic Vision and Combinations of Feature Region Detectors”, IEEE International Conference on Robotics and Automation, Pasadena, USA, pp. 538-543, 2008.
C. K. Chang, C. Siagian, L. Itti, “Mobile Robot Vision Navigation & Localization Using Gist and Saliency”, IEEE International Conference on Intelligent Robots and Systems, Taiwan, pp. 4147-4154, 2010.
L. Spacek, C. Burbridge, “Instantaneous robot self-localization and motion estimation with omnidirectional vision”, Robotics and Autonomous Systems, vol. 55, pp. 667-674, 2007.
M. Bleyer and M. Gelautz, Computer Vision Theory and Applications, VISAPP, pp. 415-422, 2008.
M. Humenberger, T. Engelke, W. Kubinger, “A census-based stereo vision algorithm using modified Semi-Global Matching and plane fitting to improve matching quality”, IEEE Computer Society Conference CVPRW, San Francisco, CA, pp. 77–84, 2010.
Q. Yang, “Real-time global stereo matching using hierarchical belief propagation”, British Machine Vision Conference, pp. 989-998, 2006.
S. Birchfield and C. Tomasi, “Depth Discontinuities by Pixel-to-Pixel Stereo”, International Journal of Computer Vision, vol. 35, no. 3, pp. 269-293, 1999.
http://www.vision.deis.unibo.it/smatt/
A. H. Dastjerdi, “Optimizing Tracking Algorithm for Contact Probe of CMM by Non-Contact Probe Based on Image Processing”, MSc thesis, Amir Kabir Univ., Iran, 2010.
D. Scharstein and R. Szeliski, http://vision.middlebury.edu/stereo/eval/
