POSE-Lab/SAM-6D

Forked from JiehongLin/SAM-6D.

[CVPR2024] Code for "SAM-6D: Segment Anything Model Meets Zero-Shot 6D Object Pose Estimation".
Steps to follow for BOP evaluation

Getting Started:

Install the SAM-6D environment and download the model checkpoints:

cd SAM-6D
bash prepare.sh

Data preparation

Download the BOP datasets into the Data/ directory (a download sketch for lmo is given after the layout below).

The directory structure should be:

Data/BOP
├── lmo
│   ├── models          # object CAD models
│   ├── test            # bop19 test set
│   ...
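
As a starting point, the archives can be fetched from the BOP website. A minimal sketch for lmo, assuming the URL pattern and archive names listed on the BOP download page (https://bop.felk.cvut.cz/datasets/), which may change:

export SRC=https://bop.felk.cvut.cz/media/data/bop_datasets
mkdir -p Data/BOP && cd Data/BOP
wget $SRC/lmo_base.zip          # base archive: dataset info, camera parameters
wget $SRC/lmo_models.zip        # object CAD models
wget $SRC/lmo_test_bop19.zip    # bop19 test set
unzip lmo_base.zip              # unpacks into lmo/
unzip lmo_models.zip -d lmo     # -> lmo/models
unzip lmo_test_bop19.zip -d lmo # -> lmo/test

Repeat for the other datasets as needed.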

Render templates:

cd ../Render/
blenderproc run render_bop_templates.py --dataset_name $DATASET

The variable $DATASET can be set to lmo, icbin, itodd, hb, tless, tudl, or ycbv (a loop over all of them is sketched below). Pre-rendered templates can also be downloaded from the upstream SAM-6D repository.
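
To render templates for all seven datasets in one pass, a plain shell loop over the same command works (a sketch; rendering every dataset takes a while):

for DATASET in lmo icbin itodd hb tless tudl ycbv; do
    blenderproc run render_bop_templates.py --dataset_name $DATASET
done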

Instance Segmentation

Run Instance Segmentation with SAM or FastSAM.

cd Instance_Segmentation_Model

export CUDA_VISIBLE_DEVICES=0

# with SAM (default)
python run_inference.py dataset_name=$DATASET

# with FastSAM
python run_inference.py dataset_name=$DATASET model=ISM_fastsam

The variable $DATASET can be set to lmo, icbin, itodd, hb, tless, tudl, or ycbv.
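
To segment every dataset with one backbone, the same loop pattern applies (a sketch, shown here with FastSAM):

export CUDA_VISIBLE_DEVICES=0
for DATASET in lmo icbin itodd hb tless tudl ycbv; do
    python run_inference.py dataset_name=$DATASET model=ISM_fastsam
done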

Pose Estimation

Evaluation on BOP Datasets:

cd Pose_Estimation_Model

python test_bop.py --gpus 0 --model pose_estimation_model --config config/base.yaml --dataset $DATASET --view 42

The variable $DATASET can be set to lmo, icbin, itodd, hb, tless, tudl, or ycbv.
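
Putting the two stages together, an end-to-end evaluation pass for a single dataset might look like the following (a sketch assembled only from the commands above; paths are relative to the SAM-6D/ root):

export DATASET=lmo
export CUDA_VISIBLE_DEVICES=0

cd Instance_Segmentation_Model
python run_inference.py dataset_name=$DATASET

cd ../Pose_Estimation_Model
python test_bop.py --gpus 0 --model pose_estimation_model --config config/base.yaml --dataset $DATASET --view 42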
