SHREC 2012 - Shape Retrieval Contest based on Generic 3D Dataset
Call for Participation
Objective
The objective of this track is to evaluate the performance of 3D shape retrieval approaches on a generic 3D dataset.
Introduction
As increasing numbers of 3D models are created every day and stored in databases, effectively searching a 3D repository for shapes similar to a given 3D query model has become an important area of research. Benchmarking allows researchers to evaluate the quality of the results of different 3D shape retrieval approaches.
Task description
The task is to evaluate the dissimilarity between every pair of objects in the database described below and then output the resulting dissimilarity matrix.
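As a minimal sketch of this output, the snippet below computes an N x N dissimilarity matrix from per-model feature vectors using Euclidean distance. The descriptors themselves (here a toy 2-dimensional `feats` array) are a hypothetical stand-in: each participant's own shape descriptor and distance function would take their place.

```python
import numpy as np

def dissimilarity_matrix(features):
    """Compute an N x N matrix of pairwise Euclidean distances.

    `features` is an (N, d) array of per-model shape descriptors;
    the descriptor itself is left to each participant's method.
    """
    diff = features[:, None, :] - features[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

# toy example: 3 models with hypothetical 2-dimensional descriptors
feats = np.array([[0.0, 0.0], [3.0, 4.0], [0.0, 1.0]])
D = dissimilarity_matrix(feats)
# D is symmetric with a zero diagonal; D[0, 1] == 5.0
```

Any dissimilarity measure can be plugged in, as long as the resulting matrix is interpreted as "smaller value = more similar".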
Dataset
All the 3D models in the generic 3D dataset will be based on the combination of models from our previous generic 3D benchmarks. In this generic 3D dataset, there will be 1200 3D models, classified into 60 object categories based mainly on visual similarity. The file format used to represent the 3D models will be the ASCII Object File Format (*.off).
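Since the models are distributed in ASCII OFF, a minimal reader is easy to write. The sketch below assumes a well-formed file (header line `OFF`, then vertex/face/edge counts, then vertex coordinates and face index lists) with no comment lines; production code would want more robust handling.

```python
def read_off(path):
    """Minimal ASCII OFF reader: returns (vertices, faces).

    Assumes a well-formed file with no comment lines. Whitespace
    splitting also tolerates counts appearing on the header line.
    """
    with open(path) as f:
        tokens = f.read().split()
    if tokens[0] != "OFF":
        raise ValueError("not an OFF file")
    nv, nf = int(tokens[1]), int(tokens[2])  # tokens[3] is the (unused) edge count
    i = 4
    vertices = []
    for _ in range(nv):
        vertices.append(tuple(float(t) for t in tokens[i:i + 3]))
        i += 3
    faces = []
    for _ in range(nf):
        k = int(tokens[i])  # number of vertices in this face
        faces.append(tuple(int(t) for t in tokens[i + 1:i + 1 + k]))
        i += 1 + k
    return vertices, faces
```

For example, a single-triangle OFF file (`OFF`, `3 1 3`, three vertex lines, then `3 0 1 2`) parses to three vertices and one face `(0, 1, 2)`.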
Evaluation Methodology
We will employ the following evaluation measures: Precision-Recall curve (PR), Nearest Neighbor (NN), First-Tier (FT), Second-Tier (ST), E-Measure (E), Discounted Cumulative Gain (DCG) and Average Precision (AP).
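To illustrate three of these measures, the sketch below computes NN, FT, and ST from a dissimilarity matrix and class labels, following the standard definitions: with C the size of the query's class, FT is the fraction of the query's class found in the top C-1 results and ST in the top 2(C-1), the query itself being excluded from the ranking. This is an illustrative implementation, not the track's official evaluation code.

```python
import numpy as np

def nn_ft_st(D, labels):
    """Average Nearest Neighbor, First-Tier, and Second-Tier scores.

    D is an N x N dissimilarity matrix; labels[i] is the class of
    model i. The query itself is excluded from its own ranking.
    """
    D = np.asarray(D, dtype=float)
    labels = np.asarray(labels)
    n = len(labels)
    nn = ft = st = 0.0
    for q in range(n):
        order = np.argsort(D[q])
        order = order[order != q]          # drop the query itself
        rel = labels[order] == labels[q]   # relevance at each rank
        c = int(rel.sum())                 # class size minus one
        nn += rel[0]
        ft += rel[:c].sum() / c
        st += rel[:2 * c].sum() / c
    return nn / n, ft / n, st / n

# toy check: two well-separated classes give perfect scores
x = np.array([0.0, 1.0, 10.0, 11.0])
D = np.abs(np.subtract.outer(x, x))
scores = nn_ft_st(D, [0, 0, 1, 1])
# scores == (1.0, 1.0, 1.0)
```

PR curves, E-Measure, DCG, and AP are computed from the same ranked lists, so a single dissimilarity matrix suffices for all of the measures above.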
Procedure
The following list is a step-by-step description of the activities:
- The participants must register by sending a message to SHREC@nist.gov. Early registration is encouraged, so that we get an impression of the number of participants at an early stage.
- The test dataset will be made available via this website.
- Participants will submit the dissimilarity matrix (also called a distance matrix) for the test database. Up to 5 matrices per group may be submitted, resulting from different runs. Each run may be a different algorithm or a different parameter setting. More information on the dissimilarity matrix file format is available on the track website.
- The evaluations will be done automatically.
- The organization will release the evaluation scores of all the runs.
- The participants write a one-page description of their method, with two figures, and send their comments on the evaluation results.
- The track results are combined into a joint paper, published in the proceedings of the Eurographics Workshop on 3D Object Retrieval.
- The description of the tracks and their results are presented at the Eurographics Workshop on 3D Object Retrieval (May 13, 2012).
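For the submission step, the helper below writes a matrix as plain whitespace-separated ASCII, one row per line. This layout is only an assumption based on common SHREC practice; the authoritative file format is the one described on the track website.

```python
import numpy as np

def write_matrix(D, path):
    """Write a dissimilarity matrix as plain ASCII, one row per line.

    Hypothetical layout (whitespace-separated N x N values); the
    track website's file-format description is authoritative.
    """
    np.savetxt(path, np.asarray(D, dtype=float), fmt="%.6f")
```

Round-tripping with `np.loadtxt` is an easy sanity check before submitting.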
Schedule
January 28 - Call for participation.
February 1 - A few sample models of the test database will be available online.
February 5 - Please register before this date.
February 7 - Distribution of the whole database. Participants can start the retrieval.
February 13 - Submission of results (dissimilarity matrices) and a one-page description of the method(s). (Extended to February 15.)
February 17 - Distribution of relevance judgments and evaluation scores.
February 23 - Track is finished, and results are ready for inclusion in a track report.
February 26 - Camera-ready track papers submitted for printing.
May 13 - Eurographics Workshop on 3D Object Retrieval, including SHREC'2012.
Organizers
Bo Li, Afzal Godil - National Institute of Standards and Technology
Please cite the paper:
B. Li, A. Godil, M. Aono, X. Bai, T. Furuya, L. Li, R. Lopez-Sastre, H. Johan, R. Ohbuchi, C. Redondo-Cabrera, A. Tatsuma, T. Yanagimachi, S. Zhang: SHREC'12 Track: Generic 3D Shape Retrieval. In: M. Spagnuolo, M. Bronstein, A. Bronstein, and A. Ferreira (eds.), Eurographics Workshop on 3D Object Retrieval 2012 (3DOR 2012), 2012.