Free Boundary Conditions Active Contours

Welcome

Welcome to our Free Boundary Conditions Open Active Contours (FBCOAC) project page! Here you can find various demos and download all project-related materials, including the paper, supplementary materials, and the MATLAB code, which lets you experiment with the free boundary conditions active contour model through a convenient GUI. Active contours with free boundary conditions are a type of open contour model. Unlike the few previous attempts to use open active contours, which relied on either no boundary conditions or fixed boundary conditions, here we present a third type whose end points are allowed to move freely along given parametric curves – what the calculus of variations literature calls free boundary conditions. We believe this type offers benefits that can be exploited in various computer vision applications, several of which are presented below.

The animation on the right shows in blue the evolution of such an FBCOAC from an initial green state until it converges to a minimal-energy state. Here the energy of the curve is defined solely by its shape, but the scheme allows an external energy (one that depends on the image) to be integrated as well. The two (predefined) parametric boundary curves are shown in red.
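To give a concrete feel for the mechanism, here is an illustrative sketch (not the paper's exact numerical scheme; all names and the toy setup are our own) in NumPy: an open polyline evolves under a membrane-type internal energy while its endpoints slide along two vertical boundary lines. Projecting the endpoint force onto the boundary curve's tangent is the discrete analogue of the free boundary condition.

```python
import numpy as np

def evolve_open_contour(pts, tangent0, tangent1, steps=5000, dt=0.1):
    """Gradient descent on the membrane energy E = sum |p_{i+1} - p_i|^2
    of an open polyline.

    Interior points move freely; the two endpoints are restricted to slide
    along their boundary curves, implemented by projecting the endpoint
    force onto each curve's unit tangent direction -- a discrete analogue
    of free boundary conditions."""
    pts = pts.copy()
    for _ in range(steps):
        force = np.zeros_like(pts)
        # interior: discrete Laplacian (negative gradient of the membrane energy)
        force[1:-1] = pts[:-2] - 2 * pts[1:-1] + pts[2:]
        # endpoints: pulled toward their neighbour, but only along the
        # boundary curve's tangent (the free boundary condition)
        force[0] = np.dot(pts[1] - pts[0], tangent0) * tangent0
        force[-1] = np.dot(pts[-2] - pts[-1], tangent1) * tangent1
        pts += dt * force
    return pts

# boundary curves: the vertical lines x = 0 and x = 1 (unit tangent (0, 1))
t = np.linspace(0.0, 1.0, 21)
init = np.stack([t, 0.5 * np.sin(np.pi * t) + t], axis=1)  # wavy initial contour
init[0], init[-1] = (0.0, 0.0), (1.0, 1.0)                 # endpoints on the lines
final = evolve_open_contour(init, np.array([0.0, 1.0]), np.array([0.0, 1.0]))
```

With only an internal energy and straight vertical boundary curves, the contour relaxes to a horizontal segment, its endpoints sliding freely along the two lines; an image-dependent external force could be added inside the same loop.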

Download

Matlab Code

The following MATLAB code implements the parametric free boundary conditions active contour model, in which the active contour is represented explicitly as a sampled parametric curve (as described in the paper). The initial contour is created by the user and evolves until it reaches an optimal (locally minimal) energy state.

  • Download and extract the Free boundary conditions active contours GUI. Once you are done, a directory named “Free_Boundary_Conditions_Active_Countours_GUI” will include the relevant MATLAB files.
  • Launch MATLAB and set the “Current Folder” accordingly.
  • In the command window type “gui” to start the program.
  • Instructions can be found under the ‘Help’ button in the toolbar.

Coming soon: code for FBCOAC that converges to the globally optimal energy between the boundary condition curves.

Papers

Selected Demos and Applications

Boundary Detection

Open active contours with free boundary conditions are excellent for boundary detection when it is known that the sought-after boundary intersects given curves in the image plane (whether the image borders or not). Here you can see two such examples. The red curves indicate the boundary curves for the free boundary conditions. The active contour on the right is set to have a dominant internal energy, which also allows it to overcome occlusions.


Feature tracking

Naturally, the same mechanism can be used to track objects whose boundaries are confined to a region of interest or to well-defined boundaries. Road tracking, the example shown here, is a classical case: the road shoulders should be tracked within a range of interest, and therefore within boundary curves defined by the vertical margins of the image and a (virtual) horizontal curve obtained by projecting the desired distance of interest back onto the image plane. Naturally, the active contour of the previous frame can be used as the initial state for the active contour in the subsequent frame.


Seam carving

Open active contours with free boundary conditions can also be utilized for more remote applications such as seam carving, with the advantage of being more flexible and general than the classical discrete seams. As such, they are able to converge to better-optimized, continuous minima, providing more accurate seam removal for better image retargeting. An example of one such seam is shown here in blue, after it was initialized with a classical discrete seam and then evolved to a yet better minimum of the seam energy.
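The continuous seam formulation itself is described in the paper, but the classical discrete seam used for initialization can be sketched as follows (a standard dynamic-programming vertical seam in the style of Avidan and Shamir; the 5×5 toy energy map is our own example, not from the paper):

```python
import numpy as np

def minimal_vertical_seam(energy):
    """Classical discrete seam: dynamic programming for the 8-connected
    vertical path of least total energy, from the top row to the bottom
    row. Such a seam can serve as the initial state for a continuous
    free-boundary active contour."""
    h, w = energy.shape
    cost = energy.astype(float).copy()
    for i in range(1, h):
        # cheapest predecessor among upper-left, upper, upper-right
        left = np.r_[np.inf, cost[i - 1, :-1]]
        right = np.r_[cost[i - 1, 1:], np.inf]
        cost[i] += np.minimum(np.minimum(left, cost[i - 1]), right)
    # backtrack from the cheapest bottom-row pixel
    seam = np.empty(h, dtype=int)
    seam[-1] = int(np.argmin(cost[-1]))
    for i in range(h - 2, -1, -1):
        j = seam[i + 1]
        lo, hi = max(j - 1, 0), min(j + 1, w - 1)
        seam[i] = lo + int(np.argmin(cost[i, lo:hi + 1]))
    return seam

energy = np.ones((5, 5))
energy[:, 2] = 0.0          # a zero-cost column the seam should follow
seam = minimal_vertical_seam(energy)
```

The discrete seam is restricted to one pixel per row; the continuous active contour initialized from it is not, which is what allows it to descend to a better minimum of the same energy.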


Credits

Who and Where…

This research is joint work by Michal Shemesh and Ohad Ben-Shahar of the Computer Science Department, Ben-Gurion University of the Negev, Beer Sheva, Israel. It was presented at the International Symposium on Visual Computing (ISVC) 2011 and the Israel Machine Vision Conference (IMVC) 2012.

Acknowledgments

This work was funded in part by the European Commission in the 7th Framework Programme (CROPS GA no 246252). We also thank the generous support of the Frankel fund, the Paul Ivanier center for Robotics Research and the Zlotowski Center for Neuroscience at Ben-Gurion University.