REPARTI Seminars


The REPARTI Seminars at Université Laval take place on Fridays at 11:30 a.m.
Please consult the program for further details.

Master's project, PhD project, or postdoctoral internship in machine learning with Prof. Christian Gagné's team: please see the following announcement for full details: http://vision.gel.ulaval.ca/~cgagne/postes2017.html


REPARTI

MIVIM

Dec 13 2013 9:12AM

Trung Thien Tran

Automatic method for sharp feature extraction from 3D data of man-made objects



Abstract

Digital scanning devices are used in a wide range of applications. Thanks to the rapid development of scanning technologies, such devices can now collect large sets of accurate 3D points. As a result, these sensors are being adopted in more and more applications, especially in industrial manufacturing.

Among emerging problems, sharp feature extraction from scanned data of CAD models has recently received much attention from the research community. Sharp features help in understanding the structure of the underlying geometry of a surface. Sharp feature extraction is also an important issue in segmentation, surface reconstruction and resampling. For example, most manufactured objects are combinations of common geometric primitives such as planes, cylinders, spheres, cones or tori, and the intersections between these primitives can be considered sharp features. Most existing methods operate only on point clouds or only on meshes, which limits their flexibility and generality. Moreover, most of these methods require that many parameters be set manually (i.e., global threshold values chosen by the user) to decide whether a point lies on a sharp feature or not. Our study therefore proposes a novel algorithm for extracting sharp features automatically from 3D data (meshes and point clouds).
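As background for the kind of baseline the abstract criticizes, a common way to flag candidate sharp points in a point cloud is to compute the PCA "surface variation" of each point's local neighborhood (the smallest covariance eigenvalue divided by the eigenvalue sum), then apply a user-chosen global threshold. The sketch below, in Python/NumPy, is purely illustrative and is not the speaker's method; the neighborhood size `k = 16` and the threshold `0.02` are arbitrary assumptions of exactly the manual-tuning kind the talk aims to avoid.

```python
import numpy as np

def surface_variation(points, k=16):
    """For each point, run PCA on its k nearest neighbors and return
    lambda_min / (lambda_0 + lambda_1 + lambda_2): near 0 on flat
    regions, larger where the neighborhood straddles a sharp edge."""
    n = len(points)
    sv = np.empty(n)
    # Pairwise squared distances; brute force is fine for small clouds.
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    for i in range(n):
        nbrs = points[np.argsort(d2[i])[:k]]
        cov = np.cov(nbrs.T)
        eig = np.sort(np.linalg.eigvalsh(cov))
        sv[i] = eig[0] / eig.sum()
    return sv

# Synthetic test cloud: two planes meeting at a 90-degree dihedral
# edge along the y-axis (the edge is the line x = 0, z = 0).
rng = np.random.default_rng(0)
t = rng.uniform(0, 1, (200, 2))
plane_a = np.c_[t[:, 0], t[:, 1], np.zeros(200)]   # the plane z = 0
plane_b = np.c_[np.zeros(200), t[:, 1], t[:, 0]]   # the plane x = 0
cloud = np.vstack([plane_a, plane_b])

sv = surface_variation(cloud)
sharp = sv > 0.02          # illustrative global threshold (assumption)

# Points flagged as sharp should cluster near the edge x = 0, z = 0.
edge_dist = np.sqrt(cloud[:, 0] ** 2 + cloud[:, 2] ** 2)
```

On this noise-free example the flagged points are exactly those whose neighborhoods mix the two planes, which illustrates why a single global threshold is fragile on real scans: noise, anisotropic sampling and varying feature angles all shift the "right" cutoff, motivating the automatic approach of the talk.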





©2002-. Laboratoire de Vision et Systèmes Numériques. All rights reserved