1. Background

Hand-eye calibration is a fundamental and critical problem in robot vision applications. Simply put, its purpose is to obtain the transformation between the robot coordinate system and the camera coordinate system, so that vision results can ultimately be expressed in the robot frame.
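For example, in the eye-to-hand case, once the camera-to-base transform ${}^{base}T_{cam}$ has been calibrated, a point $p_{cam}$ measured by the camera is expressed in the robot base frame (in homogeneous coordinates) as $p_{base} = {}^{base}T_{cam}\, p_{cam}$.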

In practice there are two hand-eye configurations, distinguished by where the camera is mounted: if the camera is fixed to the robot end-effector, the setup is called "eye-in-hand"; if the camera is mounted off the robot (e.g., on a stand or frame), it is called "eye-to-hand".
(Figure: eye-to-hand, camera mounted off the robot)
(Figure: eye-in-hand, camera mounted on the end-effector)
2. Mathematical description of the hand-eye relationship

1. Eye-in-hand: in this configuration, between any two robot motions, the pose of the calibration board relative to the robot base never changes. The quantity to solve for is the pose of the camera frame relative to the end-effector frame.



2. Eye-to-hand: in this configuration, between any two robot motions, the pose of the calibration board relative to the end-effector never changes. The quantity to solve for is the pose of the camera frame relative to the robot base frame. Both cases reduce to the same equation, as derived below.
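In both configurations, writing the invariant transform at two robot poses and eliminating it yields the hand-eye equation. Let $T_i$ be the end-effector pose in the base frame and $B_i$ the board pose in the camera frame at robot pose $i$. For eye-in-hand, with unknown $X = {}^{ee}T_{cam}$, the board pose in the base frame is constant:

$$T_1 X B_1 = T_2 X B_2 \;\Rightarrow\; \underbrace{(T_2^{-1} T_1)}_{A}\, X = X\, \underbrace{(B_2 B_1^{-1})}_{B}$$

For eye-to-hand, with unknown $X = {}^{base}T_{cam}$, the board pose in the end-effector frame is constant:

$$T_1^{-1} X B_1 = T_2^{-1} X B_2 \;\Rightarrow\; \underbrace{(T_2 T_1^{-1})}_{A}\, X = X\, \underbrace{(B_2 B_1^{-1})}_{B}$$

Either way, calibration reduces to solving $AX = XB$ for the homogeneous transform $X$.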



 

3. Solving the AX = XB problem

* Shiu Y C, Ahmad S. Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form AX = XB. IEEE Transactions on Robotics and Automation, 1989, 5(1): 16-29.
 
A good English tutorial is available online at http://math.loyola.edu/~mili/Calibration/index.html, which also provides some MATLAB code for AX = XB that can be used directly.

Several ROS packages are also available:

https://github.com/IFL-CAMP/easy_handeye

http://wiki.ros.org/handeye

http://visp-doc.inria.fr/doxygen/visp-daily/calibrateTsai_8cpp-example.html#_a0

 

4. Other references

https://blog.csdn.net/u011089570/article/details/47945733 (good figures)

https://blog.csdn.net/qq_16481211/article/details/79764730 (partial Halcon code)

https://blog.csdn.net/qq_16481211/article/details/79767100 (Halcon code)

https://blog.csdn.net/happyjume/article/details/80847822 (some theory)

https://blog.csdn.net/zhang970187013/article/details/81098175 (UR5 with easy_handeye)

The basic equation is generally solved with a "two-step" method: first the rotation is recovered from the rotational part of the equation, and the result is then substituted back to solve for the translation; see the sketch after the link below.

https://blog.csdn.net/yunlinwang/article/details/51622143
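Expanding $AX = XB$ in block form, with $A = \begin{bmatrix} R_A & t_A \\ 0 & 1 \end{bmatrix}$ and likewise for $X$ and $B$, gives the two equations behind the two-step method:

$$R_A R_X = R_X R_B, \qquad (R_A - I)\, t_X = R_X t_B - t_A$$

The rotation equation is solved first; its solution makes the translation equation linear in $t_X$. Below is a minimal MATLAB sketch of one common two-step variant (the closed form of Park and Martin, 1994); `TA` and `TB` are cell arrays of 4x4 relative motions as built in the Section 5 script, and the function name `handeye_twostep` is chosen here for illustration. This is a sketch of the idea, not the exact algorithm used by any of the packages above, and the rotation logarithm below is not robust for rotations near 180 degrees.

function X = handeye_twostep(TA, TB)
% Two-step AX = XB solve (Park & Martin 1994 style).
% TA, TB: 1xN cell arrays of 4x4 homogeneous relative motions.
N = numel(TA);
M = zeros(3, 3);
for i = 1:N
    a = rotlog(TA{i}(1:3, 1:3));   % axis-angle vector of R_A
    b = rotlog(TB{i}(1:3, 1:3));   % axis-angle vector of R_B
    M = M + b * a';                % accumulate correlation of rotation axes
end
Rx = (M' * M)^(-1/2) * M';         % closed-form least-squares rotation
C = zeros(3 * N, 3);
d = zeros(3 * N, 1);
for i = 1:N                        % stack (R_A - I) t_X = R_X t_B - t_A
    C(3*i-2:3*i, :) = TA{i}(1:3, 1:3) - eye(3);
    d(3*i-2:3*i)    = Rx * TB{i}(1:3, 4) - TA{i}(1:3, 4);
end
tx = C \ d;                        % least-squares translation
X = [Rx tx; 0 0 0 1];
end

function v = rotlog(R)
% Axis-angle vector of a rotation matrix (not robust near theta = pi).
theta = acos((trace(R) - 1) / 2);
if theta < 1e-12
    v = zeros(3, 1);
else
    v = theta / (2 * sin(theta)) * [R(3,2)-R(2,3); R(1,3)-R(3,1); R(2,1)-R(1,2)];
end
end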

5. Hand-eye calibration in MATLAB

The camera and robot are in the eye-to-hand configuration. The robot is a Kinova 6-axis arm (Canada); each robot pose gives the end-effector relative to the base as x, y, z, rx, ry, rz, in meters and radians.
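With Peter Corke's Robotics Toolbox, which the script below relies on, one pose row (x, y, z, rx, ry, rz) is converted to a 4x4 homogeneous transform as follows; note that the script applies rx, ry, rz as successive rotations about the x, y, and z axes in that fixed order:

T = transl(x, y, z) * trotx(rx) * troty(ry) * trotz(rz);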

2017.08.29Kinova_pose_all_10_1.txt
0.475732,0.0143899,0.597381,-0.186261,-0.437222,2.36416
0.351412,0.268087,0.458479,0.0520873,-0.0950319,2.38993
0.251188,0.143736,0.426332,-0.216293,0.057463,-0.931251
0.243135,0.151277,0.464429,0.00644091,-0.039015,0.248319
0.288528,0.144912,0.409375,-0.456906,0.134654,-2.23237
0.240534,0.0828331,0.455197,-0.269758,-0.404214,-0.228711
0.358995,0.19536,0.504774,0.226276,0.237398,1.94334
0.188926,0.0555841,0.4517,0.441706,-0.250536,-0.0724471
0.19245,0.143225,0.489355,0.462128,-0.135995,-0.105669
0.379911,0.047085,0.543727,0.361346,-0.141438,3.0347
Each pattern pose gives the calibration board relative to the camera as x, y, z, rx, ry, rz, in meters and radians.

2017.08.29Pattern_pose_all_10_1.txt
0.147349 , -0.064831 , 0.509528 , 0.197843 , -0.500065 , 1.792583
-0.106272 , 0.070595 , 0.633095 , 0.078063 , -0.128841 , 1.874043
0.144437 , -0.11512 , 0.602498 , 2.77556 , 2.96201 , 2.070114
-0.042621 , -0.091343 , 0.598773 , 2.95559 , -3.102323 , 0.890111
0.192328 , 0.027207 , 0.552456 , 0.097234 , 0.637267 , 0.328188
0.074878 , -0.10344 , 0.539767 , 2.598053 , 2.800433 , 1.189678
-0.100101 , 0.030685 , 0.526716 , 0.098909 , 0.176179 , 2.323831
0.11135 , -0.055337 , 0.534584 , -2.986191 , 2.769534 , 1.274791
0.018719 , -0.063078 , 0.525738 , -2.923126 , 2.874801 , 1.287745
0.193075 , 0.044841 , 0.52557 , -0.13286 , -0.198836 , 1.280157
The following MATLAB file loads both data files and solves the calibration offline:

Jaco_handeye_rewrite_10.m
%% Eye-to-hand calibration with an Ensenso N20 and a Kinova robot arm.
% input : pattern pose in the camera frame, and arm Cartesian pose in the base frame.
% output: pose of the Ensenso N20 left eye in the arm base frame.
% Requires Peter Corke's Robotics Toolbox (transl, trotx, troty, trotz, tr2rpy, trplot)
% and AX = XB solvers tsai(), shiu(), ijrr1995() (e.g., from the MATLAB code linked in Section 3).
clear; close all; clc;

% Robot poses (homogeneous) stored in cell A. ------------------- 10 poses
JacoCartesianPose = importdata('D:\\jaco\\2017.08.29Kinova_pose_all_10_1.txt');
[m, n] = size(JacoCartesianPose);   % 10 x 6
A = cell(1, m);                     % 1 x 10
for i = 1:1:m
    A{1,i} = transl(JacoCartesianPose(i,1), JacoCartesianPose(i,2), JacoCartesianPose(i,3)) ...
        * trotx(JacoCartesianPose(i,4)) * troty(JacoCartesianPose(i,5)) * trotz(JacoCartesianPose(i,6));
end

% Pattern poses (homogeneous) stored in cell B.
patternInCamPose = importdata('D:\\jaco\\2017.08.29Pattern_pose_all_10_1.txt');
[melem, nelem] = size(patternInCamPose);   % 10 x 6
B = cell(1, melem);
for x = 1:1:melem
    B{1,x} = transl(patternInCamPose(x,1), patternInCamPose(x,2), patternInCamPose(x,3)) ...
        * trotx(patternInCamPose(x,4)) * troty(patternInCamPose(x,5)) * trotz(patternInCamPose(x,6));
end

% Remember to collect the robot poses following a five-pointed-star pattern; see Tsai's paper.
n2 = m;
TA = cell(1, n2);
TB = cell(1, n2);
%--------------------- 10 -----------------------------------
for j = [1, 6]   % relative motions within each group of five poses
    TA{1,j}   = A{1,j+1} * inv(A{1,j});
    TA{1,j+1} = A{1,j+2} * inv(A{1,j+1});
    TA{1,j+2} = A{1,j+3} * inv(A{1,j+2});
    TA{1,j+3} = A{1,j+4} * inv(A{1,j+3});
    TA{1,j+4} = A{1,j}   * inv(A{1,j+4});

    TB{1,j}   = B{1,j+1} * inv(B{1,j});
    TB{1,j+1} = B{1,j+2} * inv(B{1,j+1});
    TB{1,j+2} = B{1,j+3} * inv(B{1,j+2});
    TB{1,j+3} = B{1,j+4} * inv(B{1,j+3});
    TB{1,j+4} = B{1,j}   * inv(B{1,j+4});
end
M1 = [TA{1,1} TA{1,2} TA{1,3} TA{1,4} TA{1,5} TA{1,6} TA{1,7} TA{1,8} TA{1,9} TA{1,10}];
M2 = [TB{1,1} TB{1,2} TB{1,3} TB{1,4} TB{1,5} TB{1,6} TB{1,7} TB{1,8} TB{1,9} TB{1,10}];
% M1 = [TA{1,1} TA{1,2} TA{1,3} TA{1,4} TA{1,5} TA{1,6} TA{1,7} TA{1,8} TA{1,9}];   % 9-motion variant
% M2 = [TB{1,1} TB{1,2} TB{1,3} TB{1,4} TB{1,5} TB{1,6} TB{1,7} TB{1,8} TB{1,9}];
%--------------------- 10 -----------------------------------

C_Tsai = tsai(M1, M2);
T_Tsai = (transl(C_Tsai))';
C_Tsai_rad = tr2rpy(C_Tsai);
C_Tsai_rpy_rx_ry_rz = rad2deg(C_Tsai_rad);
fprintf('Tsai(rad) = \n%f, %f, %f, %f, %f, %f\n', T_Tsai(1,1), T_Tsai(1,2), T_Tsai(1,3), ...
    C_Tsai_rad(1,1), C_Tsai_rad(1,2), C_Tsai_rad(1,3));
fprintf('Tsai(deg) = \n%f, %f, %f, %f, %f, %f\n\n', T_Tsai(1,1), T_Tsai(1,2), T_Tsai(1,3), ...
    C_Tsai_rpy_rx_ry_rz(1,1), C_Tsai_rpy_rx_ry_rz(1,2), C_Tsai_rpy_rx_ry_rz(1,3));

C_Shiu = shiu(M1, M2);
T_Shiu = [C_Shiu(1,4) C_Shiu(2,4) C_Shiu(3,4)];
C_Shiu_rad = tr2rpy(C_Shiu);
C_Shiu_rpy_rx_ry_rz = rad2deg(C_Shiu_rad);
fprintf('Shiu(rad) = \n%f, %f, %f, %f, %f, %f\n', T_Shiu(1,1), T_Shiu(1,2), T_Shiu(1,3), ...
    C_Shiu_rad(1,1), C_Shiu_rad(1,2), C_Shiu_rad(1,3));
fprintf('Shiu(deg) = \n%f, %f, %f, %f, %f, %f\n\n', T_Shiu(1,1), T_Shiu(1,2), T_Shiu(1,3), ...
    C_Shiu_rpy_rx_ry_rz(1,1), C_Shiu_rpy_rx_ry_rz(1,2), C_Shiu_rpy_rx_ry_rz(1,3));

C_Ijrr = ijrr1995(M1, M2);
T_ijrr = [C_Ijrr(1,4) C_Ijrr(2,4) C_Ijrr(3,4)];
C_ijrr_rad = tr2rpy(C_Ijrr);
C_ijrr_rpy_rx_ry_rz = rad2deg(C_ijrr_rad);
fprintf('Ijrr(rad) = \n%f, %f, %f, %f, %f, %f\n', T_ijrr(1,1), T_ijrr(1,2), T_ijrr(1,3), ...
    C_ijrr_rad(1,1), C_ijrr_rad(1,2), C_ijrr_rad(1,3));
fprintf('Ijrr(deg) = \n%f, %f, %f, %f, %f, %f\n\n', T_ijrr(1,1), T_ijrr(1,2), T_ijrr(1,3), ...
    C_ijrr_rpy_rx_ry_rz(1,1), C_ijrr_rpy_rx_ry_rz(1,2), C_ijrr_rpy_rx_ry_rz(1,3));

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Camera pose currently hard-coded in the application, for comparison.
% robotHcam = [-0.076, -0.674, 0.760631455868699, 178.7221124879378, -0.0735038591212, -11.5304192925905];
% robotHcam1 = transl(robotHcam(1,1), robotHcam(1,2), robotHcam(1,3)) * trotx(robotHcam(1,4),'deg') ...
%     * troty(robotHcam(1,5),'deg') * trotz(robotHcam(1,6),'deg') * trotx(90,'deg');   % rotx 90
robotHcam = [0.013, -0.94, 0.86, -90.0, 0.0, 0.0];
% robotHcam = [0.25, 0.22, 1.1, 180.0, 0.0, 90.0];   % bind to the stack
robotHcam1 = transl(robotHcam(1,1), robotHcam(1,2), robotHcam(1,3)) ...
    * trotx(robotHcam(1,4),'deg') * troty(robotHcam(1,5),'deg') * trotz(robotHcam(1,6),'deg');
fprintf('robotHcam used in Program(rad) = \n%f, %f, %f, %f, %f, %f\n', robotHcam(1,1), robotHcam(1,2), ...
    robotHcam(1,3), deg2rad(robotHcam(1,4)), deg2rad(robotHcam(1,5)), deg2rad(robotHcam(1,6)));
fprintf('robotHcam used in Program(deg) = \n%f, %f, %f, %f, %f, %f\n\n', robotHcam(1,1), robotHcam(1,2), ...
    robotHcam(1,3), robotHcam(1,4), robotHcam(1,5), robotHcam(1,6));

% Plot the base, current camera, and solved camera frames for a visual sanity check.
t1 = eye(4);
trplot(t1, 'frame', 'R', 'arrow', 'width', '1', 'color', 'r', ...
    'text_opts', {'FontSize', 10, 'FontWeight', 'light'}, ...
    'view', [-0.3 0.5 0.6], 'thick', 0.9, 'dispar', 0.8);   % robot base frame
hold on;
trplot(robotHcam1, 'frame', 'C', 'color', 'black');   % camera frame used in the application
trplot(C_Tsai, 'frame', 'T', 'color', 'b');           % Tsai result
trplot(C_Shiu, 'frame', 'S', 'color', 'g');           % Shiu result

%% Independent check: camera pose from a single pose pair, then a test point.
clc; clear; close all;
% D:\jaco\ConsoleApplication1/get_saveCartesian.cpp -> Kinova_pose.txt
robotAeef = [-0.0860801, -0.641813, -0.0987199, 3.13316, 0.000389122, -0.297456];
robotBeef = transl(robotAeef(1,1), robotAeef(1,2), robotAeef(1,3)) ...
    * trotx(robotAeef(1,4)) * troty(robotAeef(1,5)) * trotz(robotAeef(1,6));
% D:\jaco\C#\nxCsExamples.sln -> Pattern_pose_all.txt
camAobj = [0.011651, -0.069043, 0.857845, -3.12825, 3.137609, 3.048224];
camBobj = transl(camAobj(1,1), camAobj(1,2), camAobj(1,3)) ...
    * trotx(camAobj(1,4)) * troty(camAobj(1,5)) * trotz(camAobj(1,6));
robotAcam = robotBeef * inv(camBobj);
robotAcam_Trans0 = (transl(robotAcam))';
fprintf('robotAcam_T = \n%f, %f, %f\n', robotAcam_Trans0(1,1), robotAcam_Trans0(1,2), robotAcam_Trans0(1,3));
robotAcam_R_rad = tr2rpy(robotAcam);
fprintf('robotAcam_R(rad) = \n%f, %f, %f\n', robotAcam_R_rad(1,1), robotAcam_R_rad(1,2), robotAcam_R_rad(1,3));
R_degree0 = rad2deg(robotAcam_R_rad);
fprintf('robotAcam_R(deg) = \n%f, %f, %f\n\n', R_degree0(1,1), R_degree0(1,2), R_degree0(1,3));
% [theta, v] = tr2angvec(robotAcam)
% robotAcam = [
%     robotAcam(1,1) robotAcam(1,2) robotAcam(1,3) -0.097;
%     robotAcam(2,1) robotAcam(2,2) robotAcam(2,3) -0.695;
%     robotAcam(3,1) robotAcam(3,2) robotAcam(3,3) robotAcam(3,4);
%     0 0 0 1;
% ]
% Trans1 = (transl(robotAcam))'
% R_rad1 = tr2rpy(robotAcam)
% R_degree1 = rad2deg(R_rad1)

fprintf('===============Test point===============\n');
% camAobj2 = [0.011634, -0.068815, 0.858039, -3.124779, 3.139759, 3.046957];   % workspace home
camAobj2 = [-0.102468, -0.059197, 0.858, -3.127464, 3.136249, 3.053341];
camBobj2 = transl(camAobj2(1,1), camAobj2(1,2), camAobj2(1,3)) ...
    * trotx(camAobj2(1,4)) * troty(camAobj2(1,5)) * trotz(camAobj2(1,6));
robotAobj2 = robotAcam * camBobj2;
robotAobj2_T = (transl(robotAobj2))';
fprintf('robotAobj2_T = \n%f, %f, %f\n', robotAobj2_T(1,1), robotAobj2_T(1,2), robotAobj2_T(1,3));
robotAobj2_R_rad = tr2rpy(robotAobj2);
fprintf('robotAobj2_R(rad) = \n%f, %f, %f\n', robotAobj2_R_rad(1,1), robotAobj2_R_rad(1,2), robotAobj2_R_rad(1,3));
robotAobj2_R_degree = rad2deg(robotAobj2_R_rad);
fprintf('robotAobj2_R(deg) = \n%f, %f, %f\n', robotAobj2_R_degree(1,1), robotAobj2_R_degree(1,2), robotAobj2_R_degree(1,3));
==================== Planar nine-point calibration ====================



Write the camera coordinates and robot coordinates in homogeneous form; this gives nine unknowns, which nine (or more) pairs of corresponding points determine via a singular-value-decomposition least-squares solve, as follows.
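Concretely, under a planar affine model (which is what the code below fits), each correspondence between an image point $(u_i, v_i)$ and a robot point $(x_i, y_i)$ contributes one row of the overdetermined system

$$\begin{bmatrix} u_i & v_i & 1 \end{bmatrix} X = \begin{bmatrix} x_i & y_i & 1 \end{bmatrix}, \qquad X \in \mathbb{R}^{3 \times 3},$$

and stacking all correspondences gives $AX = B$, solved in the least-squares sense via SVD. The OpenCV C++ program below does exactly this with cv::solve and the DECOMP_SVD flag (shown here with four points rather than nine; the math is the same).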
// Solve equation: AX = b
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <iostream>

using namespace std;
using namespace cv;

int main(int argc, char** argv)
{
    printf("\nSolve equation: AX = b\n\n");
    //Mat A = (Mat_<float>(6, 3) <<
    //    480.8, 639.4, 1,
    //    227.1, 317.5, 1,
    //    292.4, 781.6, 1,
    //    597.4, 1044.1, 1,
    //    557.7, 491.6, 1,
    //    717.8, 263.7, 1);
    //Mat B = (Mat_<float>(6, 3) <<
    //    197170, 185349, 1,
    //    195830, 186789, 1,
    //    196174, 184591, 1,
    //    197787, 183176, 1,
    //    197575, 186133, 1,
    //    198466, 187335, 1);
    Mat A = (Mat_<float>(4, 3) <<
        2926.36, 2607.79, 1,
        587.093, 2616.89, 1,
        537.031, 250.311, 1,
        1160.53, 1265.21, 1);   // 4x3 image points (homogeneous)
    Mat B = (Mat_<float>(4, 3) <<
        320.389, 208.197, 1,
        247.77,  209.726, 1,
        242.809, 283.182, 1,
        263.152, 253.715, 1);   // 4x3 robot points (homogeneous)
    Mat X;
    cout << "A=" << endl << A << endl;
    cout << "B=" << endl << B << endl;
    solve(A, B, X, DECOMP_SVD);   // least-squares solve via SVD
    cout << "X=" << endl << X << endl;
    // Map a new image point through the fitted transform.
    Mat a1 = (Mat_<float>(1, 3) << 1864, 1273, 1);
    Mat b1 = a1 * X;
    cout << "b1=" << endl << b1 << endl;
    cout << "Ground truth: " << "283.265, 253.049, 1" << endl;
    getchar();
    return 0;
}

https://docs.opencv.org/3.1.0/d2/de8/group__core__array.html#ga12b43690dbd31fed96f213eefead2373

Result comparison: Halcon C# on the left, OpenCV C++ on the right, and the MATLAB result at the bottom. All three agree, so the algorithm passes the check.
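As a cross-check, here is a minimal MATLAB sketch of the same least-squares solve (MATLAB's backslash operator solves the overdetermined system; this assumes the same four calibration points as the C++ program above):

A = [2926.36 2607.79 1; 587.093 2616.89 1; 537.031 250.311 1; 1160.53 1265.21 1];
B = [320.389 208.197 1; 247.77 209.726 1; 242.809 283.182 1; 263.152 253.715 1];
X = A \ B;                      % least-squares solution of A*X = B
b1 = [1864 1273 1] * X          % expected: approximately 283.265, 253.049, 1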

