88 commits
0564729
Created basic structure of repository including required folders and p…
rike568 Oct 23, 2025
3a652fe
Created basic requirements file that will be used to install all requ…
rike568 Nov 5, 2025
bc029c2
Updated requirements to automatically download best version for the u…
rike568 Nov 5, 2025
fd3b810
Imported relevant dependencies and set important default constants t…
rike568 Nov 5, 2025
eff132e
Added function to adjust for differing prefixes in the provided filen…
rike568 Nov 5, 2025
0252688
Added grayscale loader to confirm images provided are in grayscale, a…
rike568 Nov 5, 2025
6aa4c3f
Added random augmentations to images
rike568 Nov 5, 2025
b1d59a6
Created class to import and read provided dataset correctly
rike568 Nov 5, 2025
77225c5
Added normalization to the provided images
rike568 Nov 5, 2025
7ccfa75
Added loaders to separate the provided train/test/validate into their…
rike568 Nov 5, 2025
219441c
Added main function to test that code is running correctly.
rike568 Nov 5, 2025
d6a737f
Added some more comments to provide a bit more detail and tidied up c…
rike568 Nov 5, 2025
d982902
Added basic convolutional block and up block as building units for imp…
rike568 Nov 5, 2025
9446bcf
Added improved UNET class and encoder/decoder modules.
rike568 Nov 5, 2025
fb070c4
Added forward pass with skip connections.
rike568 Nov 5, 2025
9c6b9cb
Added kaiming initialization for convolutional layers and applied to m…
rike568 Nov 5, 2025
502c717
Added function that will automatically create the UNET model when re…
rike568 Nov 5, 2025
6cf8040
Tidied up code and added comments to provide more details on function…
rike568 Nov 5, 2025
a7a8b72
Adjusted default number of workers to one and adjusted my 2D unet mod…
rike568 Nov 6, 2025
3391f97
Added basic imports and set_seed to allow for results to be reproduce…
rike568 Nov 6, 2025
63ba11a
Added tensor helper functions for the training loop and one hot encodi…
rike568 Nov 6, 2025
0e5293f
Added dice metrics and soft dice loss functions.
rike568 Nov 6, 2025
0dc7af1
Added class that will act as the final dice loss function.
rike568 Nov 6, 2025
d77eb6a
Added average meter to track the stats.
rike568 Nov 6, 2025
1bc857c
Added save and load checkpoint functions.
rike568 Nov 6, 2025
ac86a67
Added OASIS specific mask conversion helper function
rike568 Nov 6, 2025
6ccf02d
Added initial script structure and configuration
rike568 Nov 6, 2025
bd245e4
Implemented training for one epoch that will be reused when training…
rike568 Nov 6, 2025
c10ce7d
Added validation function that will measure the validation loss
rike568 Nov 6, 2025
6459221
Added CSV history to log results after epochs and helper functions to…
rike568 Nov 6, 2025
0f51cc5
Implemented main function that will now be used to run over all epoch…
rike568 Nov 6, 2025
bcfd268
Added initial script structure with imports and basic layout
rike568 Nov 6, 2025
b5b6f76
Added visualisation helper functions that will color the images
rike568 Nov 6, 2025
8900822
Implemented starting main function with rough skeleton and model load…
rike568 Nov 6, 2025
0725f47
Implemented test set metric evaluation
rike568 Nov 6, 2025
a306862
Added code to save images that have been generated
rike568 Nov 6, 2025
8e16447
Implemented preview grid generation for saved overlays
rike568 Nov 6, 2025
476745c
Allows the user to input a seed value of their choosing, or just uses…
rike568 Nov 7, 2025
7238e05
Added basic script structure with required imports, with global const…
rike568 Nov 7, 2025
b28a51d
Implemented constructor to load the hip dataset, in addition paired i…
rike568 Nov 7, 2025
98459de
Added loading of nifti files, normalizing the image and converting to tens…
rike568 Nov 7, 2025
66e06fa
Added class to randomly flip and rotate both the image and mask
rike568 Nov 7, 2025
0e48714
Implemented a function that would simplify the creation of the data loa…
rike568 Nov 7, 2025
ffa98a8
Implemented final check to ensure the images have been correctly load…
rike568 Nov 7, 2025
3ca96f5
Added basic script layout with imports and a helper function for weig…
rike568 Nov 7, 2025
d429fa6
Implemented the core unet building blocks using the research paper re…
rike568 Nov 7, 2025
ccdd834
Added a 1x1 convolution layer which will be used to map feature chann…
rike568 Nov 7, 2025
aa029b8
Assembles the previously created components into a full encoder-decod…
rike568 Nov 7, 2025
f046766
Implemented the logic for the forward pass which will allow for the m…
rike568 Nov 7, 2025
0fe9003
Implemented helper function that will allow for easy implementation o…
rike568 Nov 7, 2025
1d5fcb0
Just added a check to make sure the UNET model has the same output as…
rike568 Nov 7, 2025
2907004
Added initial structure to script and basic set_seed() function that w…
rike568 Nov 7, 2025
875af00
Added helper functions that will allow us to move the tensors to the …
rike568 Nov 7, 2025
c56b44a
Added dice metrics and soft dice loss function
rike568 Nov 7, 2025
41bf2a9
Added combined CEDice class that combines the previously created func…
rike568 Nov 7, 2025
324d4b8
Added a class to keep a meter to track the running statistics of the …
rike568 Nov 7, 2025
11c3ebd
Added functions that will handle the saving and loading of the model during tr…
rike568 Nov 7, 2025
20abb70
Added initial script structure, imports and config
rike568 Nov 7, 2025
2ec4a16
Implement the training of one epoch that will be used to repeatedly t…
rike568 Nov 7, 2025
29e0ff0
Implemented validate function that will determine the validation loss…
rike568 Nov 7, 2025
900999c
Added a helper function to keep track of the training history and all…
rike568 Nov 7, 2025
a8b1e1e
Implemented main function that will accept command line arguments, lo…
rike568 Nov 7, 2025
556eabd
Inside the main function implemented the training loop which will tra…
rike568 Nov 7, 2025
590ca65
Added code to evaluate the last training of the model, displaying res…
rike568 Nov 7, 2025
3aa0f98
Added initial script structure and config for predicting the results o…
rike568 Nov 7, 2025
b8e78b0
Added visualisation helper functions that will showcase the ground tr…
rike568 Nov 7, 2025
bd39bd9
Implemented main function and loading the previously trained model
rike568 Nov 7, 2025
3fd4369
Implemented test set metric evaluation and reporting
rike568 Nov 7, 2025
c4b2636
Added saving of individual prediction samples in which only 8 images …
rike568 Nov 7, 2025
2545592
Added generation of final preview grid that will display the entiret…
rike568 Nov 7, 2025
740767a
Added function comments
rike568 Nov 7, 2025
0e259a6
Added detailed function comments to all functions in this file
rike568 Nov 7, 2025
1213c3c
Conda environment file that will allow others to recreate the environ…
rike568 Nov 7, 2025
fdbcfe0
Updated conda environment file to be very basic and only install esse…
rike568 Nov 7, 2025
672fa1b
Added function and class comments
rike568 Nov 7, 2025
3cb453b
Added function and class comments
rike568 Nov 7, 2025
ce1b7b7
Added more detailed function and class comments
rike568 Nov 7, 2025
f036373
Moved conda environment file to a more appropriate location
rike568 Nov 7, 2025
9e486e6
Started adding to the readme file with basic guide on how to actually…
rike568 Nov 7, 2025
bb9941e
Added assets that will be used in the report and adjusted the readme to…
rike568 Nov 7, 2025
9317b73
Added image from research paper that guided the improved UNET that wa…
rike568 Nov 7, 2025
0d570c2
Adjusted how the predictions are saved: instead of separate files they…
rike568 Nov 7, 2025
ce04333
Removed unneeded assets, added new ones and made images appear on readme
rike568 Nov 7, 2025
98200e2
Update readme
rike568 Nov 7, 2025
28b571e
Finalised readme by adding some sentences on the algorithm that was u…
rike568 Nov 7, 2025
a843ea1
Update gitignore
rike568 Nov 23, 2025
024a576
Removed unnecessary requirements.txt as requested
rike568 Nov 23, 2025
d48cd5b
Removed project folder that is not required
rike568 Nov 23, 2025
16 changes: 16 additions & 0 deletions .gitignore
@@ -0,0 +1,16 @@
recognition/oasis2d_unet_45807321/OASIS
recognition/hipmri2d_unet_45807321/HipMRI_Study_open
recognition/predicts_hip
recognition/predicts_oasis
recognition/final_outputs
recognition/final_outputs_hip
recognition/hip_outputs
recognition/brain_outputs
recognition/oasis2d_unet_45807321/v2
recognition/oasis2d_unet_45807321/v3
recognition/oasis2d_unet_45807321/outputs
recognition/oasis2d_unet_45807321/__pycache__
recognition/hipmri2d_unet_45807321/__pycache__
settings.json
recognition/hipmri2d_unet_45807321/README.pdf
recognition/oasis2d_unet_45807321
203 changes: 203 additions & 0 deletions recognition/hipmri2d_unet_45807321/README.md
@@ -0,0 +1,203 @@
# HipMRI 2D Segmentation

## About

This repository contains the code for a 2D U-Net model for 6-class segmentation of HipMRI scans. The model uses a deep supervision architecture, and the code provides a full pipeline for training, evaluation, and visual inference.

The problem that I have been tasked with is to create a model that can perform segmentation on the provided hip MRI dataset. More specifically, we are trying to outline the prostate gland so that professionals can more easily look for signs of prostate cancer. To do this, I have created an improved 2D U-Net model, an architecture designed specifically for image segmentation.

How does this algorithm work? A standard 2D U-Net is built from an encoder, a decoder, and skip connections between them. The improved 2D U-Net follows the same pattern but refines it further: for example, instead of Batch Normalization and ReLU it uses Instance Normalization, which allows for more stable training, and deep supervision is added in the decoder. The image below shows all the components that make up the improved 2D U-Net model.

![Network Architecture](./report_assets/report_network.png)[1](#ref-1)
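
As a rough sketch of the kind of building block this describes (illustrative only, not a copy of `modules.py`; the leaky ReLU activation and the exact layer ordering follow the referenced paper and are assumptions about this implementation):

```python
import torch
import torch.nn as nn


class InstanceNormConvBlock(nn.Module):
    """Two 3x3 convolutions, each followed by Instance Normalization and a
    leaky ReLU, in the general style of the improved U-Net."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1, bias=False),
            nn.InstanceNorm2d(out_channels, affine=True),
            nn.LeakyReLU(negative_slope=0.01, inplace=True),
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1, bias=False),
            nn.InstanceNorm2d(out_channels, affine=True),
            nn.LeakyReLU(negative_slope=0.01, inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.block(x)


if __name__ == "__main__":
    # A single-channel 256x256 slice keeps its spatial size through the block.
    x = torch.randn(1, 1, 256, 256)
    print(InstanceNormConvBlock(1, 32)(x).shape)  # torch.Size([1, 32, 256, 256])
```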

## 📊 Example Results

Here are the training curves and example predictions from a 20-epoch run with the default settings.

### Training Curves

![Training Curves](./report_assets/curves.png)

The plot on the left shows the training and validation loss over the 20 epochs that were run. The plot on the right shows the Dice coefficient over the same 20 epochs.

## Outputs

Below are example images created when prediction was run with the trained model.

The predictions are not far off from the ground truth, but there is still room for improvement: the model still underestimates some regions.

### Output 1

![Example Predictions 1](./report_assets/sample_000_00_combined.png)

### Output 2

![Example Predictions 2](./report_assets/sample_000_01_combined.png)

### Output 3

![Example Predictions 3](./report_assets/sample_000_02_combined.png)

### Dice coefficients

Below are the final Dice coefficients obtained after the model was trained.

For the first four classes the Dice coefficients are excellent, probably because these structures appear more overtly in the scans. For the last two classes the model still achieves decent Dice coefficients, although not as good as the first four, probably because these structures are much smaller and therefore harder to pinpoint.

| Class | Dice |
| :------- | ----: |
| C0 | 0.982 |
| C1 | 0.984 |
| C2 | 0.942 |
| C3 | 0.970 |
| C4 | 0.876 |
| C5 | 0.839 |
| **Mean** | **0.932** |
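
For reference, per-class Dice scores of this kind can be computed roughly as follows (a minimal sketch, not the exact code from this repository; the smoothing term `eps` is an assumption):

```python
import torch


def per_class_dice(pred_labels: torch.Tensor,
                   target_labels: torch.Tensor,
                   num_classes: int = 6,
                   eps: float = 1e-6) -> torch.Tensor:
    """Dice = 2|P ∩ T| / (|P| + |T|), computed separately for each class.

    `pred_labels` and `target_labels` are integer label maps of shape (N, H, W).
    Returns a tensor with one Dice score per class.
    """
    scores = []
    for c in range(num_classes):
        pred_c = (pred_labels == c).float()
        target_c = (target_labels == c).float()
        intersection = (pred_c * target_c).sum()
        union = pred_c.sum() + target_c.sum()
        scores.append((2.0 * intersection + eps) / (union + eps))
    return torch.stack(scores)


if __name__ == "__main__":
    pred = torch.randint(0, 6, (2, 256, 128))
    target = torch.randint(0, 6, (2, 256, 128))
    dice = per_class_dice(pred, target)
    print("Per-class Dice:", [round(d, 3) for d in dice.tolist()],
          "Mean:", round(dice.mean().item(), 3))
```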

## 🚀 How to Run

Follow these steps to set up the environment, download the data, and run the code.

### 1. Get the Code

Clone the repository and navigate to the project directory:

```bash
git clone https://github.com/rike568/PatternAnalysis-2025.git
cd PatternAnalysis-2025
git checkout topic-recognition
cd recognition/hipmri2d_unet_45807321
```

### 2. Set Up the Environment

You will need Conda to replicate the environment.

1. **Install Conda:** If you don't have it, please [install Miniconda](https://docs.conda.io/en/latest/miniconda.html) for your system.

2. **Create the Environment:** Use the provided `environment.yml` file to create the conda environment. This will install all required packages.

```bash
conda env create -f environment.yml
```

3. **Activate the Environment:** Before running any scripts, you must activate the new environment:

```bash
conda activate comp3710
```

### 3. Download the Dataset

The code requires the `HipMRI_Study_open` dataset.

**Rangpur Location:**

If you are on the `rangpur` server, you can copy the data directly from the group directory:

```bash
# This copies the dataset into your current folder
cp -r /home/groups/comp3710/HipMRI_Study_open/keras_slices_data HipMRI_Study_open
```

Your final folder structure should look like this:

```
hipmri2d_unet_45807321/
├── HipMRI_Study_open/
│   ├── keras_slices_train/
│   ├── keras_slices_seg_train/
│   ├── keras_slices_validate/
│   ├── keras_slices_seg_validate/
│   ├── keras_slices_test/
│   └── keras_slices_seg_test/
├── train.py
├── predict.py
├── dataset.py
├── modules.py
├── utils.py
└── environment.yml
```
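
For context, the paired `keras_slices_*` and `keras_slices_seg_*` folders could be consumed along these lines (a minimal sketch under assumed file naming and normalization; the real `dataset.py` additionally handles filename prefixes and augmentation):

```python
import os

import nibabel as nib
import numpy as np
import torch
from torch.utils.data import Dataset


class HipMRISlices(Dataset):
    """Pairs 2D image slices with their segmentation masks by sorted filename."""

    def __init__(self, image_dir: str, seg_dir: str):
        self.image_paths = sorted(os.path.join(image_dir, f) for f in os.listdir(image_dir))
        self.seg_paths = sorted(os.path.join(seg_dir, f) for f in os.listdir(seg_dir))
        assert len(self.image_paths) == len(self.seg_paths), "image/mask count mismatch"

    def __len__(self) -> int:
        return len(self.image_paths)

    def __getitem__(self, idx: int):
        image = nib.load(self.image_paths[idx]).get_fdata().astype(np.float32)
        mask = nib.load(self.seg_paths[idx]).get_fdata().astype(np.int64)
        # Min-max normalize the image to [0, 1]; the mask stays as integer labels.
        image = (image - image.min()) / (image.max() - image.min() + 1e-8)
        return torch.from_numpy(image)[None], torch.from_numpy(mask).squeeze()


# Example:
# train_set = HipMRISlices("HipMRI_Study_open/keras_slices_train",
#                          "HipMRI_Study_open/keras_slices_seg_train")
```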

### 4. Train the Model

With the `comp3710` environment active, you can run the training script. Checkpoints and results will be saved to the `outputs/` folder.

**To train with default settings:**
This will use the defaults set in the script (e.g., seed=42, lr=0.0005).

```bash
python train.py
```

**To train with custom hyperparameters:**
You can override the default settings by providing command-line arguments.

- `--seed`: Set the random seed (e.g., `--seed 123`).
- `--lr`: Set the learning rate (e.g., `--lr 0.001`).
- `--weight_decay`: Set the Adam weight decay (e.g., `--weight_decay 1e-5`).
- `--grad_clip_norm`: Set the gradient clipping norm (e.g., `--grad_clip_norm 1.0`).

**Example of a custom run:**
This command trains with a learning rate of 0.001 and a seed of 123.

```bash
python train.py --lr 0.001 --seed 123
```
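
For reference, the flags listed above map onto a small `argparse` block along these lines (a sketch only, not the exact contents of `train.py`; the `seed` and `lr` defaults match the values stated above, while the `weight_decay` and `grad_clip_norm` defaults are assumptions):

```python
import argparse


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Train the 2D U-Net on HipMRI slices")
    parser.add_argument("--seed", type=int, default=42,
                        help="random seed for reproducibility")
    parser.add_argument("--lr", type=float, default=0.0005,
                        help="learning rate for the optimizer")
    parser.add_argument("--weight_decay", type=float, default=1e-5,
                        help="Adam weight decay (assumed default)")
    parser.add_argument("--grad_clip_norm", type=float, default=1.0,
                        help="gradient clipping norm (assumed default)")
    return parser.parse_args()


if __name__ == "__main__":
    args = parse_args()
    print(args)  # e.g. Namespace(seed=42, lr=0.0005, weight_decay=1e-05, grad_clip_norm=1.0)
```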

### 5. Run Predictions

After training, you can run inference on the test set. This will load the `best.pt` checkpoint from the `outputs/` folder, calculate final Dice scores, and save visual predictions to `outputs/predictions/`.

**To run with the default seed (42):**

```bash
python predict.py
```

**To run with a custom seed:**
You can specify a different seed for reproducibility.

```bash
python predict.py --seed 123
```
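
Loading the saved checkpoint for inference amounts to something like the following (a minimal sketch, not the actual `predict.py`; the checkpoint layout and the `build_unet()` factory mentioned in the comment are assumptions):

```python
import torch
import torch.nn as nn


def load_best_model(model: nn.Module, checkpoint_path: str = "outputs/best.pt") -> nn.Module:
    """Restore the best checkpoint into `model` and switch it to evaluation mode."""
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    checkpoint = torch.load(checkpoint_path, map_location=device)
    # Assumes the checkpoint stores weights under a "model" key; if it is a plain
    # state_dict instead, fall back to loading the whole object.
    state_dict = checkpoint.get("model", checkpoint) if isinstance(checkpoint, dict) else checkpoint
    model.load_state_dict(state_dict)
    return model.to(device).eval()


# Usage (assuming a hypothetical `build_unet()` factory in modules.py):
# model = load_best_model(build_unet(num_classes=6))
# with torch.no_grad():
#     preds = model(images.to(next(model.parameters()).device)).argmax(dim=1)
```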

## 📦 Project Dependencies

This section outlines the software environment and dependencies required for the `comp3710` environment, as defined in the provided `environment.yml` file.

---

## 🔗 Configuration Channels

The following channels are used to locate and download packages:

- **`pytorch`**: Primary channel for PyTorch-related packages, especially those built with specific CUDA versions.
- **`defaults`**: The standard set of channels used by the package manager (e.g., Anaconda/Miniconda).

---

## 🐍 Core Dependencies (Conda)

These packages are managed directly by the environment tool (Conda, in this case).

- **`python=3.10`**: Specifies the required Python version.
- **`pip`**: Ensures the `pip` package installer is available for managing secondary dependencies.

---

## ⚙️ Python Packages (Pip)

These packages are installed using `pip`, often with specific build configurations.

| Package Name | Installation Source / Note | Description |
| :---------------- | :--------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------ |
| **`torch`** | `--index-url https://download.pytorch.org/whl/cu118` | The core **PyTorch** library, explicitly compiled for **CUDA 11.8** for GPU acceleration. |
| **`torchvision`** | `--index-url https://download.pytorch.org/whl/cu118` | A package for computer vision, providing datasets, models, and image transformations, also built for **CUDA 11.8**. |
| **`torchaudio`** | `--index-url https://download.pytorch.org/whl/cu118` | A package for audio data, including data loading and transformations, also built for **CUDA 11.8**. |
| **`matplotlib`** | Standard PyPI | A comprehensive library for creating static, animated, and interactive visualizations in Python. |
| **`nibabel`** | Standard PyPI | Provides read/write access to common neuroimaging file formats (e.g., NIfTI, DICOM). |
| **`tqdm`** | Standard PyPI | A fast, extensible progress bar for loops and iterables. |

---

## 🚀 Environment Summary

This environment is configured for deep learning with PyTorch, with a strong focus on GPU acceleration (CUDA 11.8), and includes a specialized library for handling neuroimaging data (`nibabel`) plus utilities for progress reporting and plotting (`tqdm`, `matplotlib`).
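
As a quick sanity check (not part of the project scripts), the following confirms that the CUDA 11.8 build of PyTorch is actually active inside the environment:

```python
import torch

# Report whether PyTorch sees a GPU and which CUDA build it was compiled against.
print("PyTorch version:", torch.__version__)
print("CUDA available: ", torch.cuda.is_available())
print("CUDA build:     ", torch.version.cuda)  # expected to report 11.8 for the cu118 wheels
if torch.cuda.is_available():
    print("GPU:            ", torch.cuda.get_device_name(0))
```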

## References

1. <a id="ref-1"></a>Isensee, F., Kickingereder, P., Wick, W., Bendszus, M., & Maier-Hein, K. H. (2018). _Brain Tumor Segmentation and Radiomics Survival Prediction: Contribution to the BRATS 2017 Challenge_. arXiv:1802.10508. Available: [https://arxiv.org/abs/1802.10508](https://arxiv.org/abs/1802.10508)