
Remove niimath as it is no longer used
neurolabusc committed Dec 19, 2024
1 parent a88d237 commit d7ef37e
Showing 5 changed files with 13 additions and 64 deletions.
6 changes: 4 additions & 2 deletions LICENSE
@@ -1,8 +1,10 @@
MIT License

Copyright (c) 2021 neuroneural/brainchop
brainchop models Copyright (c) 2021 neuroneural/brainchop

Ported to NiiVue 2024 NiiVue developers
niivue visualization Copyright (c) 2024 Chris Rorden and Taylor Hanayik

itk-wasm visualization Copyright (c) 2024 Matt McCormick

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
16 changes: 6 additions & 10 deletions README.md
@@ -9,21 +9,18 @@ This is an extension of [brainchop](https://github.com/neuroneural/brainchop) th
1. Open the [live demo](https://niivue.github.io/brain2print/).
2. **Option 1** The web page automatically loads with a default T1 MRI scan. If you want to use this scan, go to step 5.
3. **Option 2** If your T1 MRI scan is in NIfTI format, drag and drop the file onto the web page.
4. **Option 3** If your image is in DICOM format, it may load if you drag and drop the files. If this fails, convert your images with [dcm2niix](https://github.com/rordenlab/dcm2niix).
4. **Option 3** If your image is in DICOM format, it may load if you drag and drop the files. If this fails, convert your images with [dcm2niix](https://niivue.github.io/niivue-dcm2niix/) and save the result as a NIfTI format file that brain2print can open.
5. Segment your brain scan by choosing a model from the `Segmentation Model` pull-down menu. Not all models work with all graphics cards. The `Tissue GWM (High Acc, Low Mem)` is a good starting point. Hopefully, it will accurately segment your brain into gray matter, white matter, and cerebrospinal fluid.
6. Press the `Create Mesh` button and select your preferred settings:

- ![settings dialog](Settings.png)

- You can choose `Smoothing` to make the surfaces less jagged at the expense of computation time.
- You can choose to `Simplify` to reduce the number of triangles and create smaller files.

7. Once you have set your preferences, press `Apply`.
8. You will see the mesh appear and can interactively view it. If you are unhappy with the result, repeat step 6 with different settings. If you want to print the results, press the `Save Mesh` button.

## How it Works

This web application uses some of the latest browser technologies that allow the tissue segmentation model to run on your local GPU, regardless of the type of GPU. This is possible via the `WebGPU` browser API. Additionally, we leverage `WebAssembly` to run the `niimath` [WASM wrapper](https://www.npmjs.com/package/@niivue/niimath) and [ITK-Wasm](https://wasm.itk.org) to turn the tissue segmentation into a 3D mesh. No data ever leaves your machine.
This web application uses some of the latest browser technologies that allow the tissue segmentation model to run on your local graphics card (GPU), regardless of the type of GPU. This is possible via the `WebGPU` browser API. Additionally, we leverage [ITK-Wasm](https://wasm.itk.org) to turn the tissue segmentation into a 3D mesh. No data ever leaves your machine.
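The WebGPU requirement described above can be feature-detected before a model is loaded. A minimal sketch, assuming a navigator-like object is passed in (the helper name `supportsWebGPU` is ours for illustration and not part of brain2print; `navigator.gpu` is the entry point defined by the WebGPU API):

```javascript
// Feature-detect WebGPU. Factored over a navigator-like argument so the
// logic can also run outside the browser; in the page you would pass
// window.navigator. navigator.gpu only exists when WebGPU is exposed.
function supportsWebGPU(navLike) {
  return Boolean(
    navLike &&
    navLike.gpu &&
    typeof navLike.gpu.requestAdapter === "function"
  );
}

// Usage in the browser (sketch):
//   if (!supportsWebGPU(navigator)) {
//     window.alert("This browser does not expose WebGPU");
//   }
```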

### Developers - Running a Local Live Demo

@@ -43,9 +40,8 @@ npm run build

## References

### Our group's open source software used in this application
This web page combines three packages developed by our team:

- [brainchop](https://github.com/neuroneural/brainchop)
- [niivue](https://github.com/niivue/niivue)
- [niimath](https://github.com/rordenlab/niimath)
- [ITK-Wasm](https://github.com/InsightSoftwareConsortium/ITK-Wasm)
- [brainchop](https://github.com/neuroneural/brainchop) AI models for tissue segmentation.
- [niivue](https://github.com/niivue/niivue) reading images and visualization
- [ITK-Wasm](https://github.com/InsightSoftwareConsortium/ITK-Wasm) for voxel-to-mesh and mesh processing
47 changes: 3 additions & 44 deletions main.js
@@ -1,6 +1,4 @@
import { Niivue, NVMeshUtilities } from "@niivue/niivue";
import { Niimath } from "@niivue/niimath";
// import {runInference } from './brainchop-mainthread.js'
import { inferenceModelsList, brainChopOpts } from "./brainchop-parameters.js";
import { isChrome, localSystemDetails } from "./brainchop-telemetry.js";
import MyWorker from "./brainchop-webworker.js?worker";
@@ -23,20 +21,10 @@ setCuberillePipelinesUrl(pipelinesBaseUrl)
setMeshFiltersPipelinesUrl(pipelinesBaseUrl)

async function main() {
const niimath = new Niimath();
await niimath.init();
aboutBtn.onclick = function () {
const url = "https://github.com/niivue/brain2print";
window.open(url, "_blank");
};
/*diagnosticsBtn.onclick = function () {
if (diagnosticsString.length < 1) {
window.alert('No diagnostic string generated: run a model to create diagnostics')
return
}
navigator.clipboard.writeText(diagnosticsString)
window.alert('Diagnostics copied to clipboard\n' + diagnosticsString)
}*/
opacitySlider0.oninput = function () {
nv1.setOpacity(0, opacitySlider0.value / 255);
nv1.updateGLVolume();
@@ -73,7 +61,6 @@ async function main() {
.getParameter(rendererInfo.UNMASKED_RENDERER_WEBGL)
.includes("NVIDIA");
}
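The check above inspects the renderer string reported by the `WEBGL_debug_renderer_info` extension via `UNMASKED_RENDERER_WEBGL`. The string test itself can be isolated like this (the helper name is ours, for illustration):

```javascript
// Decide whether a WebGL renderer string reports an NVIDIA GPU, mirroring
// the gl.getParameter(rendererInfo.UNMASKED_RENDERER_WEBGL) check above.
function isNvidiaRenderer(rendererString) {
  return typeof rendererString === "string" && rendererString.includes("NVIDIA");
}
```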

let opts = brainChopOpts;
opts.rootURL = location.href;
const isLocalhost = Boolean(
@@ -115,8 +102,7 @@ async function main() {
callbackUI(
event.data.message,
event.data.progressFrac,
event.data.modalMessage,
event.data.statData
event.data.modalMessage
);
}
if (cmd === "img") {
@@ -129,7 +115,6 @@ async function main() {
console.log(
"Only provided with webworker code, see main brainchop github repository for main thread code"
);
// runInference(opts, model, nv1.volumes[0].hdr, nv1.volumes[0].img, callbackImg, callbackUI)
}
};
saveBtn.onclick = function () {
@@ -178,30 +163,10 @@ async function main() {
saveBtn.disabled = false
createMeshBtn.disabled = false
}
async function reportTelemetry(statData) {
if (typeof statData === "string" || statData instanceof String) {
function strToArray(str) {
const list = JSON.parse(str);
const array = [];
for (const key in list) {
array[key] = list[key];
}
return array;
}
statData = strToArray(statData);
}
statData = await localSystemDetails(statData, nv1.gl);
diagnosticsString =
":: Diagnostics can help resolve issues https://github.com/neuroneural/brainchop/issues ::\n";
for (var key in statData) {
diagnosticsString += key + ": " + statData[key] + "\n";
}
}
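The removed `reportTelemetry` above first coerced a JSON string of stats into a keyed array with `strToArray`, then flattened the stats into one `key: value` line per entry. Both steps in isolation (`formatDiagnostics` is our name for the loop that was inlined in the removed function):

```javascript
// Parse a JSON string of stats into a keyed array, as the removed
// strToArray helper did. JSON.parse yields a plain object; its entries
// are copied onto an array so downstream code can index it by key.
function strToArray(str) {
  const list = JSON.parse(str);
  const array = [];
  for (const key in list) {
    array[key] = list[key];
  }
  return array;
}

// Render stats one "key: value" line at a time, as the removed
// diagnostics loop did.
function formatDiagnostics(statData) {
  let out = "";
  for (const key in statData) {
    out += key + ": " + statData[key] + "\n";
  }
  return out;
}
```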
function callbackUI(
message = "",
progressFrac = -1,
modalMessage = "",
statData = []
modalMessage = ""
) {
if (message !== "") {
console.log(message);
@@ -217,9 +182,6 @@ async function main() {
if (modalMessage !== "") {
window.alert(modalMessage);
}
if (Object.keys(statData).length > 0) {
reportTelemetry(statData);
}
}
function handleLocationChange(data) {
document.getElementById("location").innerHTML =
@@ -247,7 +209,6 @@ async function main() {
const img = nv1.volumes[volIdx].img;
const itkImage = nii2iwi(hdr, img, false);
itkImage.size = itkImage.size.map(Number);

const { mesh } = await antiAliasCuberille(itkImage, { noClosing: true });
meshProcessingMsg.textContent = "Generating manifold"
const { outputMesh: repairedMesh } = await repair(mesh, { maximumHoleArea: 50.0 });
@@ -260,7 +221,6 @@ async function main() {
const initialNiiMeshBuffer = NVMeshUtilities.createMZ3(initialNiiMesh.positions, initialNiiMesh.indices, false)
await nv1.loadFromArrayBuffer(initialNiiMeshBuffer, 'trefoil.mz3')
saveMeshBtn.disabled = false

meshProcessingMsg.textContent = "Smoothing and remeshing"
const smooth = parseInt(smoothSlide.value)
const shrink = parseFloat(shrinkPct.value)
@@ -297,8 +257,6 @@ async function main() {
for (let i = 0; i < pts.length; i++) pts[i] *= scale;
NVMeshUtilities.saveMesh(pts, nv1.meshes[0].tris, `mesh.${format}`, true);
};
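The export handler above rescales every vertex coordinate in place before saving the mesh. As a standalone sketch, assuming `pts` is a flat `[x0, y0, z0, x1, …]` position array (the layout the surrounding code indexes):

```javascript
// Multiply every coordinate of a flat vertex-position array in place,
// e.g. to convert the mesh from millimeters to the unit chosen for export.
function scaleVertices(pts, scale) {
  for (let i = 0; i < pts.length; i++) pts[i] *= scale;
  return pts;
}
```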

var diagnosticsString = "";
var chopWorker;
let nv1 = new Niivue(defaults);
nv1.attachToCanvas(gl1);
@@ -316,6 +274,7 @@ async function main() {
nv1.onImageLoaded = doLoadImage;
modelSelect.selectedIndex = -1;
workerCheck.checked = await isChrome(); //TODO: Safari does not yet support WebGL TFJS webworkers, test FireFox
console.log('brain2print 20241218')
// uncomment next two lines to automatically run segmentation when web page is loaded
// modelSelect.selectedIndex = 11
// modelSelect.onchange()
7 changes: 0 additions & 7 deletions package-lock.json


1 change: 0 additions & 1 deletion package.json
@@ -12,7 +12,6 @@
"@itk-wasm/cuberille": "^0.1.0",
"@itk-wasm/mesh-filters": "^0.1.0",
"@niivue/cbor-loader": "^1.1.0",
"@niivue/niimath": "^0.1.1",
"@niivue/niivue": "^0.44.2",
"@tensorflow/tfjs": "^4.19.0",
"gl-matrix": "^3.4.3"
