The first step is to install the packages in requirements.txt inside a conda environment.
Clone the SAM repository.
Use the command below to download the suggested SAM checkpoint:
!wget https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth
For this segmentation report we used two popular pancreas datasets:
After downloading and organizing the datasets, we convert them to a specific data format (.npy); save.py is provided for this step. The variables save_dir and labels_save_dir should be modified accordingly.
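A minimal sketch of what the .npy conversion step might look like (the actual save.py may differ; the `save_pair` helper and the default directory values here are illustrative, not taken from the repo):

```python
# Hypothetical sketch of the .npy conversion step; adapt to the real save.py.
import os
import numpy as np

save_dir = "data/images_npy"          # assumption: set to your image output path
labels_save_dir = "data/labels_npy"   # assumption: set to your label output path

def save_pair(name, image, label):
    """Save one image/label pair as .npy files into the two directories."""
    os.makedirs(save_dir, exist_ok=True)
    os.makedirs(labels_save_dir, exist_ok=True)
    np.save(os.path.join(save_dir, f"{name}.npy"), image.astype(np.float32))
    np.save(os.path.join(labels_save_dir, f"{name}.npy"), label.astype(np.uint8))
```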
By default, data.py and data_loader_group.py are used by the training and inference code.
Dataset paths can be modified in args.py.
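As a rough sketch, args.py might declare the flags seen in the example training command, plus path arguments; the path argument names (`--data_dir`, `--labels_dir`) are hypothetical, and the repo's actual args.py may differ:

```python
# Illustrative parser mirroring the flags used in the example training command.
import argparse

def build_parser():
    p = argparse.ArgumentParser()
    # flags that appear in the example command
    p.add_argument("--sample_size", type=int, default=66)
    p.add_argument("--accumulative_batch_size", type=int, default=4)
    p.add_argument("--num_epochs", type=int, default=60)
    p.add_argument("--num_workers", type=int, default=8)
    p.add_argument("--batch_step_one", type=int, default=20)
    p.add_argument("--batch_step_two", type=int, default=30)
    p.add_argument("--lr", type=float, default=3e-4)
    p.add_argument("--inference", action="store_true")
    # hypothetical path arguments; edit the defaults to point at your data
    p.add_argument("--data_dir", type=str, default="data/images_npy")
    p.add_argument("--labels_dir", type=str, default="data/labels_npy")
    return p
```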
Due to the anonymous code submission, we have not shared our model weights.
To train the model, use fine_tune_good.py or fine_tune_good_unet.py; the command below is an example that starts training with some custom settings.
python3 fine_tune_good_unet.py --sample_size 66 --accumulative_batch_size 4 --num_epochs 60 --num_workers 8 --batch_step_one 20 --batch_step_two 30 --lr 3e-4 --inference
To run inference with both types of decoders, just run double_decoder_infrence.py.
To run inference with SAM individually, with or without a prompt, use Inference_individually.py.
The 3D Aggregator code is available in the kernel folder; just run the run.sh file.
Because this step opens many files at once, the open-file limit should be increased using:
ulimit -n 15000
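If you prefer not to change the shell limit, the same can be done from Python at the start of the script via the standard `resource` module (Unix only); this sketch is an optional alternative, not part of the repo:

```python
# Raise the soft open-file limit from Python (cannot exceed the hard limit).
import resource

def raise_nofile_limit(target=15000):
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    # Never lower the current soft limit; cap at the hard limit if one is set.
    new_soft = target if hard == resource.RLIM_INFINITY else min(target, hard)
    resource.setrlimit(resource.RLIMIT_NOFILE, (max(soft, new_soft), hard))
    return resource.getrlimit(resource.RLIMIT_NOFILE)[0]
```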