Lost target simple search

This search recipe is perfect for robots with memory leaks that always forget where they put the charger or the remote control. You can impress your host by adding this simple functionality to find the family dog, the salt in the kitchen, or even to save people in an emergency or disaster situation.

Ingredients

  • MATLAB
  • Prior information about the potential target locations

Here is the code, ready to play!
 


% SIMPLE SEARCH (1-step ahead)
% for lost 2D targets in a static world using range sensors
% Author: Pablo Lanillos ICS TUM
% If you use the code please cite
% Lanillos, P. (2013): Minimum time search of moving targets in uncertain environments. Ph.D. Dissertation, Universidad Complutense de Madrid.
% http://therobotdecision.com/papers/PhD_MinimumTimeSearch_2013_FinalElectronic_[PabloLanillos].pdf
%
% June 2016 www.therobotdecision.com

function simple_search()

fprintf('-------------------------------------------------\n');
fprintf('> Simple search 2D start (greedy 1-step ahead)...\n');

% sensor params
Pdmax = 0.9;
dmax = 2;
sigma = 0.7;
% agent params
delta = 0.5; % constant displacement
z = 10; % z is considered constant
s = [5,5,z]'; % initial location (x,y,z) 

%% discretized prior map 2D, initial belief b0, Gaussian centered in the middle of the map
sz = [20,20];
% prepare the grid to improve computational efficiency
I=repmat([1:sz(1)],sz(2),1);I=I(:)';
J = repmat([1:sz(2)],[1,sz(1)]);
COV =[2,0;0,4];
x_mu = [I-ceil(sz(1)/2);J-ceil(sz(2)/2)];
bk = zeros(sz(1)*sz(2),1);
for i = 1 : sz(1)*sz(2)
    bk(i) = exp(- 1/2 * x_mu(:,i)'/COV * x_mu(:,i));
end
bk = bk./sum(bk); % normalize the prior so it is a probability distribution

%% Movement params
nsamples = 9; % number of candidate movements
mat = [-delta,-delta; ...% 1 
       -delta, 0; ...% 2 
       -delta, delta; ...% 3 
        0,    -delta; ...% 4 
        0,     0; ...% 5 
        0,     delta; ...% 6 
        delta,-delta; ...% 7 
        delta, 0; ...% 8 
        delta, delta ];% 9

track = [s]; % visualization purpose only
cont = 0; found = 0; % found is never set in this demo, so the search runs for 100 steps
%% start search
while (~found && cont < 100)
    %% compute discrete forward states     
    dirs = randperm(nsamples); % random selection 
    fs = repmat([s(1), s(2)],nsamples,1) +  mat(dirs,:); % forward states
fs = [fs,ones(nsamples,1)*s(3)]; % height is constant
    % check bounds
    mask = fs(:,1) > 0 & fs(:,2) > 0 & fs(:,1) < sz(1) & fs(:,2) < sz(2);
    fs = fs(mask,:);    
    %% compute cost function of the potential forward states
    info = zeros(size(fs,1),1);
    for i = 1 : size(fs,1)
        cs = fs(i,:);
        dk = sqrt((I(1,:)-cs(1)).^2 + (J(1,:)-cs(2)).^2); % distance    
        obs = Pdmax.* exp((-sigma.*(dk/dmax).^2)); % detection probability of the range sensor
        info(i) = 1-sum((1-obs').*bk); % probability of detecting the target from cs (greedy utility)
    end
    [v,id] = max(info); % selection of action with max information gain
    %% simulate movement
    s = fs(id,:)';
    %%  update belief
    % observation model
    dk = sqrt((I(1,:)-s(1)).^2 + (J(1,:)-s(2)).^2); % distance    
    obs = Pdmax.* exp((-sigma.*(dk/dmax).^2)); % detection probability of the range sensor
    bk = (1-obs').*bk; % Bayesian update
    bk = bk./sum(bk); % normalization
    % No prediction step -> assuming static world
    %% visualize
    track = [track,s];    
    figure(11); clf; hold on;  grid on;  
    surf(reshape(bk, sz)');
    shading interp; colormap hot;
    height = 0.1;
    plot3(track(2,:),track(1,:),ones(size(track,2),1)*height, '-g', 'LineWidth',2);
    plot3(track(2,end),track(1,end),height, 'xg', 'LineWidth',2);
    plot3([track(2,end),track(2,end)],[track(1,end),track(1,end)],[height,0], 'o-y', 'LineWidth',2);
    xlabel('x');
    ylabel('y');
    zlabel('P');
    view(-6,62);
    pause(0.5);  
    %%
    cont = cont +1;
end

fprintf('> End search \n');
fprintf('----------------------------------------------\n');
end
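The heart of the recipe is the greedy step: evaluate each candidate move by the probability of detecting the target from there, then apply the Bayesian non-detection update. Purely as an illustration of the same idea in another language, here is a minimal Python sketch of one such step on a tiny 3x3 grid (the names `p_detect`, `candidates`, etc. are mine, not from the MATLAB code above):

```python
import math

# Sensor parameters taken from the recipe above
Pdmax, dmax, sigma = 0.9, 2.0, 0.7

def p_detect(cell, sensor):
    """Detection probability of the range sensor at a grid cell."""
    d = math.hypot(cell[0] - sensor[0], cell[1] - sensor[1])
    return Pdmax * math.exp(-sigma * (d / dmax) ** 2)

# Tiny 3x3 belief map with a uniform prior
cells = [(i, j) for i in range(3) for j in range(3)]
bk = {c: 1.0 / len(cells) for c in cells}

# Greedy 1-step lookahead: pick the candidate sensor position that
# maximizes the probability of detection, sum_c P_d(c) * b(c)
candidates = [(0, 0), (1, 1), (2, 2)]
best = max(candidates, key=lambda s: sum(p_detect(c, s) * bk[c] for c in cells))

# Bayesian non-detection update at the chosen position, then renormalize
unnorm = {c: (1.0 - p_detect(c, best)) * bk[c] for c in cells}
total = sum(unnorm.values())
bk = {c: v / total for c, v in unnorm.items()}
```

Because nothing was detected, the belief is pushed away from the sensed position and spread over the rest of the map, which is exactly why the robot keeps moving.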



Et voilà, everything is ready; now any robot can search for a lost object.

by Orbhe

src: http://www.mathworks.com/matlabcentral/fileexchange/58671-simple-lost-target-search

3D position of an object on a table

This 3D-point recipe is perfect for special occasions such as demos or testing. The preparation is really easy and the result is tasty. This is the gourmet version of computing the 3D position of an object on a table using structured-light sensors. You will be the best host when receiving friends at your table.

Ingredients

  • Structured-light sensor (Asus Xtion, Kinect, …)
  • Dense point cloud software such as OpenNI
  • The PCL library

Get the dense point cloud from the driver software, then use RANSAC to compute the most probable plane: that will be the table.
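In PCL this step is the `SACSegmentation` class with a `SACMODEL_PLANE` model (see the src link below). Purely as a language-agnostic illustration of the idea, here is a minimal RANSAC plane fit in plain Python (function names, iteration count and tolerance are my own choices):

```python
import random

def plane_from_points(p1, p2, p3):
    """Plane (a, b, c, d) with ax + by + cz + d = 0 through three points."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    # normal = u x v
    n = [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]
    norm = sum(x * x for x in n) ** 0.5
    if norm == 0:          # degenerate (collinear) sample
        return None
    a, b, c = (x / norm for x in n)
    d = -(a*p1[0] + b*p1[1] + c*p1[2])
    return (a, b, c, d)

def ransac_plane(points, iters=200, tol=0.02, seed=0):
    """Return (best_plane, inlier_indices) for the dominant plane."""
    rng = random.Random(seed)
    best_plane, best_inliers = None, []
    for _ in range(iters):
        plane = plane_from_points(*rng.sample(points, 3))
        if plane is None:
            continue
        a, b, c, d = plane
        inliers = [i for i, p in enumerate(points)
                   if abs(a*p[0] + b*p[1] + c*p[2] + d) < tol]
        if len(inliers) > len(best_inliers):
            best_plane, best_inliers = plane, inliers
    return best_plane, best_inliers
```

Everything within `tol` of the winning plane belongs to the table; the rest of the cloud is what remains for the next steps.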

Tip: You can even make the execution faster by discretizing the dense cloud into voxels.
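In PCL that discretization is the `VoxelGrid` filter; the idea, sketched in plain Python (the leaf size is an assumed parameter):

```python
from collections import defaultdict

def voxel_downsample(points, leaf=0.05):
    """Replace all points falling in the same voxel of size `leaf`
    by their centroid, shrinking the cloud for faster processing."""
    buckets = defaultdict(list)
    for p in points:
        buckets[tuple(int(c // leaf) for c in p)].append(p)
    return [tuple(sum(c) / len(ps) for c in zip(*ps))
            for ps in buckets.values()]
```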

When you have the points that belong to the table, remove them from the point cloud. You can repeat this as many times as you want to remove all the planes in the scene.

Afterwards, segment the remaining points into independent sets. Every connected set of points is a potential object.
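PCL calls this Euclidean cluster extraction (`EuclideanClusterExtraction`). A naive O(n²) Python version of the same idea, where two points are "connected" if they are closer than a radius (`radius` and `min_size` are assumed parameters):

```python
from collections import deque

def euclidean_clusters(points, radius=0.05, min_size=2):
    """Group 3D points into connected components by breadth-first search."""
    r2 = radius * radius
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            near = [j for j in unvisited
                    if sum((points[i][k] - points[j][k]) ** 2
                           for k in range(3)) < r2]
            for j in near:
                unvisited.discard(j)
            cluster.extend(near)
            queue.extend(near)
        if len(cluster) >= min_size:
            clusters.append([points[i] for i in cluster])
    return clusters
```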

By computing the centre of mass of each set of points, we obtain each object's 3D position in the camera reference frame.
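The centre of mass is just the per-axis mean of the cluster, e.g.:

```python
def centroid(points):
    """Centre of mass of a point cluster (position in the camera frame)."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))
```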

The only thing missing is to transform the centre of each object into the world reference frame.
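With a known 4x4 camera-to-world transform (from your robot's kinematics or an extrinsic calibration; in PCL, `pcl::transformPointCloud` applies one to a whole cloud), the change of frame is a single matrix-vector product. A plain-Python sketch, where the example matrix is a made-up pure translation:

```python
def transform_point(T, p):
    """Apply a 4x4 homogeneous transform (row-major nested lists) to a 3D point."""
    x, y, z = p
    return tuple(T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3]
                 for i in range(3))

# Example: camera frame displaced 1 m along world z, identity rotation
T_world_cam = [[1.0, 0.0, 0.0, 0.0],
               [0.0, 1.0, 0.0, 0.0],
               [0.0, 0.0, 1.0, 1.0],
               [0.0, 0.0, 0.0, 1.0]]
```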

Et voilà, everything is ready; now any robot like me can reach the object.

by Orbhe

src: http://pointclouds.org/documentation/tutorials/planar_segmentation.php