Updated docker context structure and initial push to dockerHub registry #47

Merged 1 commit on Oct 4, 2021
26 changes: 0 additions & 26 deletions Dockerfile_SnakemakePipeline

This file was deleted.

7 changes: 0 additions & 7 deletions Dockerfile_jupyterNotebook

This file was deleted.

8 changes: 5 additions & 3 deletions README.md
@@ -3,7 +3,7 @@
* Current members: Kicheol Kim, Junhee Yoon
* Please leave a message in the **Discussions** tab if you have any questions or requests
* Please use the docker image to analyze the data. The AWS module is ready; please ask the members for auth if AWS is needed to analyze the data.
* Our data is living in S3 bucket
* Our data is located in S3 bucket

### Goal
* Finding potential biomarkers and therapeutic target for helping multiple sclerosis patients, **reference**: [Cell type-specific transcriptomics identifies neddylation as a novel therapeutic target in multiple sclerosis](https://pubmed.ncbi.nlm.nih.gov/33374005/)
@@ -21,13 +21,15 @@
* https://openkbc.github.io/multiple_sclerosis_proj/

### Usage of docker container
* 2 images are composing up for jupyter notebook and workflow. The workflow image does not have controller currently, so user needs to get inside to control it by using docker attach.
* 4 images are needed to run the services (notebook, pipelines, celery, and redis)
* We use the Docker registry to distribute images; please refer to [here](https://hub.docker.com/repository/docker/swiri021/openkbc_msproject/general)

![overview](README_resource/overview_recent.png)

* Containers
```shell
docker-compose -f docker-compose.yaml up --build # composing up
docker-compose -f docker-compose.yaml up --build # compose up by building from the local code, or
docker-compose -f docker-compose.example.yaml up # compose up using prebuilt images from the registry
```
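The prebuilt images can also be pulled ahead of composing up. A minimal sketch that prints the pull commands for the tag names used in `docker-compose.example.yaml` (tags assumed from that file); pipe the output to `sh` to actually pull:

```shell
# Print pull commands for the project's prebuilt images.
# Tag names are taken from docker-compose.example.yaml.
REPO="swiri021/openkbc_msproject"
TAGS="notebookcontainer1 pipelinecontainer1 celerycontainer1"
for TAG in $TAGS; do
  echo "docker pull $REPO:$TAG"
done
echo "docker pull redis:alpine"  # redis uses the official image
```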

* Jupyter notebook
57 changes: 0 additions & 57 deletions aws_deployment/docker-compose.yaml

This file was deleted.

9 changes: 3 additions & 6 deletions docker-compose.AWS.yaml
@@ -2,8 +2,7 @@ version: "3"
services:
notebook: # Notebook
build:
context: .
dockerfile: Dockerfile_jupyterNotebook
context: ./notebook
volumes:
- /home/ubuntu/MSProject/multiple_sclerosis_proj/notebook/notebook_lib:/home/jovyan/work/notebook_lib
- /home/ubuntu/MSProject/multiple_sclerosis_proj/notebook/notebook_utils:/home/jovyan/work/notebook_utils
@@ -16,8 +15,7 @@ services:

pipelines: # Pipelines
build:
context: .
dockerfile: Dockerfile_SnakemakePipeline
context: ./pipelines
deploy:
resources:
limits:
@@ -43,8 +41,7 @@

celery: # celery
build:
context: .
dockerfile: Dockerfile_SnakemakePipeline
context: ./pipelines
volumes:
- /home/ubuntu/MSProject/multiple_sclerosis_proj/pipelines:/pipelines
- /home/ubuntu/MSProject/multiple_sclerosis_proj/data:/MainData
54 changes: 54 additions & 0 deletions docker-compose.example.yaml
@@ -0,0 +1,54 @@
version: "3"
services:
notebook: # Notebook
image: swiri021/openkbc_msproject:notebookcontainer1
volumes:
- your_library_path:/home/jovyan/work/notebook_lib # anything you want to import; you can use our code from GitHub
- your_utils_path:/home/jovyan/work/notebook_utils
- your_archive_path:/home/jovyan/work/notebook_archive
- your_resultFile_path:/home/jovyan/work/resultFiles
- s3data_path_in_your_local:/home/jovyan/MainData # S3 data from our bucket
ports:
- 8888:8888
container_name: notebookContainer

pipelines: # Pipelines
image: swiri021/openkbc_msproject:pipelinecontainer1
deploy:
resources:
limits:
memory: 4000m
volumes:
- pipelines_code_path:/pipelines # this code is from our GitHub (pipelines folder)
- s3data_path_in_your_local:/MainData
- your_resultFile_path:/Output # Directly connected to notebook
ports:
- 80:5000
depends_on:
- redis
container_name: pipelineContainer
working_dir: /pipelines/pipeline_controller
command: conda run -n pipeline_controller_base gunicorn --bind 0.0.0.0:5000 --workers 2 --threads 4 --worker-class gthread connector:app

redis: # redis
image: redis:alpine
command: redis-server
ports:
- 6379:6379
container_name: redisServer

celery: # celery
image: swiri021/openkbc_msproject:celerycontainer1
volumes: # celery volume paths should be the same as the pipeline volumes
- pipelines_code_path:/pipelines
- s3data_path_in_your_local:/MainData
- your_resultFile_path:/Output
working_dir: /pipelines/pipeline_controller/
command: conda run -n pipeline_controller_base celery -A app.celery worker --loglevel=info
depends_on:
- redis
- pipelines
container_name: celeryContainer
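The volume entries above use placeholder host paths (`your_library_path`, `s3data_path_in_your_local`, and so on) that must be replaced with real absolute paths before composing up. A small sketch, assuming the compose file sits in the current directory, that flags any placeholders left in place:

```shell
# Warn about placeholder volume paths still present in the example compose file.
# Placeholder names are taken from docker-compose.example.yaml.
COMPOSE_FILE="docker-compose.example.yaml"
PLACEHOLDERS="your_library_path your_utils_path your_archive_path your_resultFile_path s3data_path_in_your_local pipelines_code_path"
FOUND=0
for P in $PLACEHOLDERS; do
  if [ -f "$COMPOSE_FILE" ] && grep -q "$P" "$COMPOSE_FILE"; then
    echo "replace $P with a real host path before 'docker-compose up'"
    FOUND=1
  fi
done
```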
12 changes: 6 additions & 6 deletions docker-compose.yaml
@@ -2,8 +2,8 @@ version: "3"
services:
notebook: # Notebook
build:
context: .
dockerfile: Dockerfile_jupyterNotebook
context: ./notebook
#image: swiri021/openkbc_msproject:notebookcontainer1
volumes:
- /Users/junheeyun/OpenKBC/multiple_sclerosis_proj/notebook/notebook_lib:/home/jovyan/work/notebook_lib
- /Users/junheeyun/OpenKBC/multiple_sclerosis_proj/notebook/notebook_utils:/home/jovyan/work/notebook_utils
@@ -16,8 +16,8 @@ services:

pipelines: # Pipelines
build:
context: .
dockerfile: Dockerfile_SnakemakePipeline
context: ./pipelines
#image: swiri021/openkbc_msproject:pipelinecontainer1
deploy:
resources:
limits:
@@ -43,8 +43,8 @@

celery: # celery
build:
context: .
dockerfile: Dockerfile_SnakemakePipeline
context: ./pipelines
#image: swiri021/openkbc_msproject:celerycontainer1
volumes:
- /Users/junheeyun/OpenKBC/multiple_sclerosis_proj/pipelines:/pipelines
- /Users/junheeyun/OpenKBC/multiple_sclerosis_proj/data:/MainData
7 changes: 7 additions & 0 deletions notebook/Dockerfile
@@ -0,0 +1,7 @@
FROM jupyter/datascience-notebook

COPY installers/installer_Rpackage.R /installer_Rpackage.R
COPY installers/requirements.txt /requirements.txt

RUN Rscript /installer_Rpackage.R
RUN pip install -r /requirements.txt
12 changes: 12 additions & 0 deletions pipelines/Dockerfile
@@ -0,0 +1,12 @@
FROM continuumio/miniconda

COPY pipeline_controller/requirements.txt .
COPY pipeline_controller/installer_Rpackage.R .

RUN conda create -n pipeline_controller_base python=3.8.2 R=3.6
SHELL ["conda", "run", "-n", "pipeline_controller_base", "/bin/bash", "-c"]

RUN pip install -r requirements.txt
RUN Rscript installer_Rpackage.R

RUN python -c "import flask"
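With the restructured contexts, each service's Dockerfile now lives in its own directory, so the images can also be built individually outside compose. A sketch that prints the equivalent `docker build` commands (the image tags here are illustrative, not the project's):

```shell
# Each service now builds from its own context directory,
# so no dockerfile: override is needed.
for CTX in notebook pipelines; do
  echo "docker build -t openkbc-$CTX ./$CTX"
done
```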
