Basic setup and CI/CD addition #3

Open · wants to merge 13 commits into base: main
1 change: 1 addition & 0 deletions .env.docker
@@ -0,0 +1 @@
DATABASE_HOST=host.docker.internal
9 changes: 9 additions & 0 deletions .env.local
@@ -0,0 +1,9 @@
DJANGO_STORAGE_BACKEND=django.core.files.storage.FileSystemStorage
DATABASE_ENGINE=django.db.backends.mysql
DATABASE_NAME=pyconkr-api-v3-db
DATABASE_HOST=127.0.0.1
DATABASE_PORT=43306
DATABASE_USER=user
DATABASE_PASSWORD=password
DATABASE_ROOT_PASSWORD=root_password
DEBUG=True
37 changes: 37 additions & 0 deletions .github/workflows/lint.yaml
@@ -0,0 +1,37 @@
name: Check lint

on:
pull_request:
push:
branches:
- 'main'

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: true

jobs:
lint:
name: Run lint
runs-on: ubuntu-latest
steps:
- name: Checkout source codes
uses: actions/checkout@v4

- uses: actions/setup-python@v4
with:
python-version: '3.11'

- name: Install dependencies
run: pip install 'pre-commit'

- name: cache pre-commit repo
uses: actions/cache@v4
with:
path: ~/.cache/pre-commit
key: ${{ runner.os }}-pre-commit-${{ hashFiles('.pre-commit-config.yaml') }}
restore-keys: ${{ runner.os }}-pre-commit-

- name: Run pre-commit
id: run-pre-commit
run: pre-commit run --all-files
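
The same checks can be reproduced locally before pushing, mirroring the workflow above (a sketch; the README's `make lint` target wraps the same command):

```bash
pip install pre-commit          # or use the project's Poetry environment
pre-commit run --all-files      # run every hook from .pre-commit-config.yaml, as the workflow does
```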
171 changes: 171 additions & 0 deletions .github/workflows/release.yaml
@@ -0,0 +1,171 @@
name: Release

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: true

on:
workflow_dispatch:
push:
branches:
- 'main'

jobs:
BuildAndDeploy:
runs-on: ubuntu-latest

env:
API_STAGE: ${{ github.event_name == 'workflow_dispatch' && 'prod' || 'dev' }}
BUMP_RULE: ${{ github.event_name == 'workflow_dispatch' && 'patch' || 'prerelease' }}
AWS_ECR_REGISTRY: ${{ github.event_name == 'workflow_dispatch' && secrets.AWS_ECR_PROD_URL || secrets.AWS_ECR_DEV_URL }}

steps:
# Check out the source code
- name: Checkout source codes
uses: actions/checkout@v4
with:
fetch-depth: 0

# Set up AWS credentials, Python, Poetry, and Docker Buildx, and log in to ECR.
- name: Setup AWS Credentials
uses: aws-actions/configure-aws-credentials@master
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: ${{ vars.AWS_REGION }}

- run: pipx install poetry
- uses: actions/setup-python@v4
with:
python-version: '3.11'
cache: poetry
- run: poetry install --no-interaction --no-root --only=deployment

- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to ECR
uses: docker/login-action@v3
with:
registry: ${{ env.AWS_ECR_REGISTRY }}

- name: Get current date and repo name
id: info
run: |
echo "::set-output name=date::$(date +'%Y-%m-%d_%H:%M:%S')"
echo "::set-output name=repository_name::$(echo ${{ github.repository }} | sed -e 's/${{ github.repository_owner }}\///')"
# Create new version tag
- name: Create Release tag
id: get-new-version-tag
run: |
poetry version ${{ env.BUMP_RULE }}
echo "::set-output name=TAG::$(poetry version -s)"
# Build and Push Docker image to ECR
- name: Build and Push Docker image to ECR
uses: docker/build-push-action@v5
with:
push: true
tags: ${{ env.AWS_ECR_REGISTRY }}:${{ steps.get-new-version-tag.outputs.TAG }},${{ env.AWS_ECR_REGISTRY }}:latest
cache-from: type=gha
cache-to: type=gha,mode=max
context: .
file: ./infra/Dockerfile
platforms: linux/amd64
provenance: false
build-args: |
GIT_HASH=${{ github.sha }}
IMAGE_BUILD_DATETIME=${{ steps.info.outputs.date }}
# Commit new deployment version
- uses: EndBug/add-and-commit@v9
with:
message: "Release version ${{ steps.get-new-version-tag.outputs.TAG }}"
tag: ${{ github.event_name == 'workflow_dispatch' && steps.get-new-version-tag.outputs.TAG || '' }}
add: "pyproject.toml"
pathspec_error_handling: exitImmediately

# Checkout and import zappa config from pyconkr-secrets repo
- name: Checkout secrets repo
uses: actions/checkout@v4
with:
repository: ${{ secrets.PYCONKR_SECRET_REPOSITORY }}
ssh-key: ${{ secrets.PYCONKR_SECRET_REPOSITORY_DEPLOY_KEY }}
path: secret_envs
clean: false
sparse-checkout-cone-mode: false
sparse-checkout: |
${{ steps.info.outputs.repository_name }}/zappa_settings.json
- run: mv secret_envs/${{ steps.info.outputs.repository_name }}/zappa_settings.json ./zappa_settings.json && rm -rf secret_envs

# Zappa update
- name: Zappa Update
run: poetry run zappa update ${{ env.API_STAGE }} --docker-image-uri ${{ env.AWS_ECR_REGISTRY }}:${{ steps.get-new-version-tag.outputs.TAG }}

- name: Collect staticfiles
run: poetry run zappa manage ${{ env.API_STAGE }} "collectstatic --no-input"

# Notify Slack (failure or cancellation)
- name: Notify deployment to Slack
if: failure() || cancelled()
uses: slackapi/slack-github-action@v1.26.0
with:
channel-id: ${{ vars.SLACK_DEPLOYMENT_ALERT_CHANNEL }}
payload: |
{
"blocks": [
{
"type": "header",
"text": {
"type": "plain_text",
"text": "${{ steps.get-new-version-tag.outputs.TAG }} 버전 배포 실패 :rotating_light: (${{ job.status }})",
"emoji": true
}
},
{
"type": "section",
"text": {"type": "mrkdwn", "text": "GitHub Action 바로가기"},
"accessory": {
"type": "button",
"text": {"type": "plain_text", "text": "${{ github.run_id }}"},
"value": "github_action",
"url": "${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}",
"action_id": "button-action"
}
}
]
}
env:
SLACK_BOT_TOKEN: ${{ secrets.SLACK_BOT_TOKEN }}

# Notify Slack (success)
- name: Notify deployment to Slack
uses: slackapi/slack-github-action@v1.26.0
with:
channel-id: ${{ vars.SLACK_DEPLOYMENT_ALERT_CHANNEL }}
payload: |
{
"blocks": [
{
"type": "header",
"text": {
"type": "plain_text",
"text": "${{ steps.get-new-version-tag.outputs.TAG }} 버전 배포 성공 :tada:",
"emoji": true
}
},
{
"type": "section",
"text": {"type": "mrkdwn", "text": "GitHub Action 바로가기"},
"accessory": {
"type": "button",
"text": {"type": "plain_text", "text": "${{ github.run_id }}"},
"value": "github_action",
"url": "${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}",
"action_id": "button-action"
}
}
]
}
env:
SLACK_BOT_TOKEN: ${{ secrets.SLACK_BOT_TOKEN }}
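
Pushes to `main` deploy to the dev stage with a prerelease version bump, while production deployments (prod stage, patch bump) are triggered manually through `workflow_dispatch`. A manual run can be started, for example, with the GitHub CLI (a sketch; assumes `gh` is installed and authenticated for this repository):

```bash
# Dispatch the Release workflow against main; the workflow itself selects the prod stage
gh workflow run release.yaml --ref main
```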
3 changes: 3 additions & 0 deletions .gitignore
@@ -159,3 +159,6 @@ cython_debug/
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
.idea/
/pyconkr/.idea/workspace.xml

admin/
ninja/
4 changes: 4 additions & 0 deletions .pre-commit-config.yaml
@@ -13,6 +13,8 @@ repos:
- id: check-yaml
- id: check-added-large-files
- id: detect-aws-credentials
args:
- --allow-missing-credentials
- id: detect-private-key
- id: end-of-file-fixer
- id: mixed-line-ending
@@ -57,6 +59,8 @@ repos:
- --disallow-untyped-defs
- --disallow-incomplete-defs
- --disallow-untyped-calls
additional_dependencies:
- types-PyMySQL
- repo: https://github.com/dosisod/refurb
rev: v2.0.0
hooks:
115 changes: 108 additions & 7 deletions Makefile
@@ -1,9 +1,71 @@
# Setup development environment
setup:
poetry install
MKFILE_PATH := $(abspath $(lastword $(MAKEFILE_LIST)))
PROJECT_DIR := $(dir $(MKFILE_PATH))

# Set additional build args for docker image build using make arguments
IMAGE_NAME := pyconkr_api_v3
ifeq (docker-build,$(firstword $(MAKECMDGOALS)))
TAG_NAME := $(wordlist 2,$(words $(MAKECMDGOALS)),$(MAKECMDGOALS))
$(eval $(TAG_NAME):;@:)
endif
TAG_NAME := $(if $(TAG_NAME),$(TAG_NAME),local)
CONTAINER_NAME = $(IMAGE_NAME)_$(TAG_NAME)_container

ifeq ($(DOCKER_DEBUG),true)
DOCKER_MID_BUILD_OPTIONS = --progress=plain --no-cache
DOCKER_END_BUILD_OPTIONS = 2>&1 | tee docker-build.log
else
DOCKER_MID_BUILD_OPTIONS =
DOCKER_END_BUILD_OPTIONS =
endif

AWS_LAMBDA_READYZ_PAYLOAD = '{\
"resource": "/readyz/",\
"path": "/readyz/",\
"httpMethod": "GET",\
"requestContext": {\
"resourcePath": "/readyz/",\
"httpMethod": "GET",\
"path": "/readyz/"\
},\
"headers": {"accept": "application/json"},\
"multiValueHeaders": {"accept": ["application/json"]},\
"queryStringParameters": null,\
"multiValueQueryStringParameters": null,\
"pathParameters": null,\
"stageVariables": null,\
"body": null,\
"isBase64Encoded": false\
}'

# =============================================================================
# Local development commands

# Setup local environments
local-setup:
@poetry install --no-root --sync

# Run local development server
local-api: local-collectstatic
@ENV_PATH=.env.local poetry run python manage.py runserver 48000

# Run django collectstatic
local-collectstatic:
@ENV_PATH=.env.local poetry run python manage.py collectstatic --noinput

# Run django shell
local-shell:
@ENV_PATH=.env.local poetry run python manage.py shell

# Run django migrations
local-migrate:
@ENV_PATH=.env.local poetry run python manage.py migrate

# For developers not using Poetry
dep-export:
@poetry export --output requirements.txt --without-hashes

# Devtools
hooks-install: setup
hooks-install: local-setup
poetry run pre-commit install

hooks-upgrade:
@@ -14,7 +76,46 @@ hooks-lint:

lint: hooks-lint # alias

hooks-mypy:
poetry run pre-commit run mypy --all-files
# =============================================================================
# Docker related commands

# Docker image build
# Usage: make docker-build <tag-name:=local>
# To build in debug mode, set DOCKER_DEBUG=true
# ex) make docker-build or make docker-build some_TAG_NAME DOCKER_DEBUG=true
docker-build:
@docker build \
-f ./infra/Dockerfile -t $(IMAGE_NAME):$(TAG_NAME) \
--build-arg GIT_HASH=$(shell git rev-parse HEAD) \
--build-arg IMAGE_BUILD_DATETIME=$(shell date +%Y-%m-%d_%H:%M:%S) \
$(DOCKER_MID_BUILD_OPTIONS) $(PROJECT_DIR) $(DOCKER_END_BUILD_OPTIONS)

docker-run: docker-compose-up
@(docker stop $(CONTAINER_NAME) || true && docker rm $(CONTAINER_NAME) || true) > /dev/null 2>&1
@docker run -d --rm \
-p 48000:8080 \
--env-file .env.local --env-file .env.docker \
--name $(CONTAINER_NAME) \
$(IMAGE_NAME):$(TAG_NAME)

docker-readyz:
@curl -X POST http://localhost:48000/2015-03-31/functions/function/invocations -d $(AWS_LAMBDA_READYZ_PAYLOAD) | jq '.body | fromjson'

docker-test: docker-build docker-run docker-readyz

docker-stop:
docker stop $(CONTAINER_NAME) || true

docker-rm: docker-stop
docker rm $(CONTAINER_NAME) || true

# Docker compose setup
# The commands below are for local development only
docker-compose-up:
docker-compose --env-file .env.local -f ./infra/docker-compose.dev.yaml up -d

docker-compose-down:
docker-compose --env-file .env.local -f ./infra/docker-compose.dev.yaml down

mypy: hooks-mypy # alias
docker-compose-rm: docker-compose-down
docker-compose --env-file .env.local -f ./infra/docker-compose.dev.yaml rm
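
Taken together, a typical local smoke test with these targets could look like the following (a sketch; assumes Docker, docker-compose, and `jq` are available):

```bash
make docker-build           # build pyconkr_api_v3:local from infra/Dockerfile
make docker-run             # start the MySQL compose stack, then the Lambda container on port 48000
make docker-readyz          # invoke the Lambda runtime endpoint with the readyz payload and print the body
make docker-rm              # stop and remove the app container
make docker-compose-down    # stop the MySQL container
```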
42 changes: 35 additions & 7 deletions README.md
@@ -1,19 +1,26 @@
# PyCon Korea API Server (2024 ~)

## 1. Local development environment setup
## Local development environment setup
This project uses Python 3.11 (or later) and Poetry.
- See [here](https://python-poetry.org/docs/) for how to install Poetry.

### 1.1. Project installation
### Local infrastructure setup
#### Docker
```bash
poetry install
docker-compose --env-file .env.local -f ./infra/docker-compose.dev.yaml up -d
```
If you can use `make`, the following commands are also available.
```bash
make docker-compose-up # Start the MySQL container
make docker-compose-down # Stop the MySQL container
make docker-compose-rm # Remove the MySQL container
```

### 1.2. pre-commit hook setup
### pre-commit hook setup
This project uses [pre-commit](https://pre-commit.com/) to enforce coding conventions.
To install pre-commit, see the instructions below.

#### 1.2.1. Linux / macOS
#### Linux / macOS
```bash
# Install
make hooks-install
@@ -25,8 +32,8 @@ make lint
make mypy
```

#### 1.2.2. Windows
```bash
#### Windows
```powershell
# Install
poetry run pre-commit install
@@ -36,3 +43,24 @@ poetry run pre-commit run --all-files
# Run mypy type checks over the entire project
poetry run pre-commit run mypy --all-files
```

### Project installation
```bash
poetry install
```

## Run
You can run the server with the command below. If a `.env.local` file exists, it is used automatically.
```bash
python manage.py runserver 0.0.0.0:48000
```
If you want to specify a different .env file path, you can run the server as follows.
```bash
ENV_PATH='<path to dotenv file>' python manage.py runserver
```

In addition, if you can use `make`, the command below runs the server with the `.env.local` file.
By default, `.env.local` is configured to point at the MySQL container.
```bash
make local-api
```
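
Once the server is running, the health-check endpoints added in this PR can be used to verify the setup (a sketch; assumes the default port 48000 used by `make local-api`):

```bash
curl -i http://localhost:48000/livez/    # liveness: always 200 with an empty JSON body
curl -i http://localhost:48000/readyz/   # readiness: 200 when the database answers and no migrations are pending, 503 otherwise
```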
41 changes: 41 additions & 0 deletions infra/Dockerfile
@@ -0,0 +1,41 @@
ARG PYTHON_VERSION=3.11
FROM public.ecr.aws/lambda/python:${PYTHON_VERSION}
WORKDIR ${LAMBDA_TASK_ROOT}
SHELL [ "/bin/bash", "-euxvc"]

ENV PATH="${PATH}:/root/.local/bin:" \
TZ=Asia/Seoul \
LANG=C.UTF-8 \
LC_ALL=C.UTF-8 \
PYTHONIOENCODING=UTF-8 \
PYTHONUNBUFFERED=1

# Set the container timezone
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone

# Copy only the dependency files so they can be cached in a Docker layer
COPY --chown=nobody:nobody pyproject.toml poetry.lock ${LAMBDA_TASK_ROOT}

RUN curl -sSL https://install.python-poetry.org | python3 - \
&& poetry config virtualenvs.create false \
&& poetry config installer.max-workers 10 \
&& poetry install --only main --no-interaction --no-ansi --no-root

RUN ZAPPA_HANDLER_PATH=$(python -c 'import zappa.handler; print(zappa.handler.__file__)') \
&& echo $ZAPPA_HANDLER_PATH \
&& cp $ZAPPA_HANDLER_PATH ${LAMBDA_TASK_ROOT}

ARG GIT_HASH
ENV DEPLOYMENT_GIT_HASH=$GIT_HASH

# Make Docker always copy the app directory so that the source code is refreshed on every build.
ARG IMAGE_BUILD_DATETIME=unknown
ENV DEPLOYMENT_IMAGE_BUILD_DATETIME=$IMAGE_BUILD_DATETIME

# Copy main app and zappa settings
COPY --chown=nobody:nobody pyconkr ${LAMBDA_TASK_ROOT}/pyconkr
COPY --chown=nobody:nobody zappa_settings.py ${LAMBDA_TASK_ROOT}

# Run as the nobody user to avoid running the app as root, which would be a security risk.
USER nobody
CMD ["handler.lambda_handler"]
28 changes: 28 additions & 0 deletions infra/docker-compose.dev.yaml
@@ -0,0 +1,28 @@
version: '3.9'

name: pyconkr-v3-local

volumes:
mysql-data:
driver: local

services:
pyconkr-v3-mysql:
image: mysql:8.0.36-debian
platform: linux/amd64
container_name: pyconkr-v3-mysql
command: mysqld

restart: unless-stopped
volumes:
- mysql-data:/var/lib/mysql:rw
- mysql-data:/docker-entrypoint-initdb.d:ro
- mysql-data:/etc/mysql/mysql.conf.d:ro

environment:
MYSQL_DATABASE: ${DATABASE_NAME}
MYSQL_USER: ${DATABASE_USER}
MYSQL_PASSWORD: ${DATABASE_PASSWORD}
MYSQL_ROOT_PASSWORD: ${DATABASE_ROOT_PASSWORD}
ports:
- ${DATABASE_PORT}:3306
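
With the defaults from `.env.local`, the MySQL instance started by this compose file can be reached from the host (a sketch; assumes a local `mysql` client is installed):

```bash
# The password prompt expects the value of DATABASE_PASSWORD from .env.local
mysql -h 127.0.0.1 -P 43306 -u user -p pyconkr-api-v3-db
```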
787 changes: 785 additions & 2 deletions poetry.lock

Large diffs are not rendered by default.

13 changes: 13 additions & 0 deletions pyconkr/api.py
@@ -0,0 +1,13 @@
import ninja.main
from django.conf import settings

from pyconkr.health_check import router as health_check_route

api = ninja.main.NinjaAPI(
title="PyCon Korea API Server V3",
version="v1",
urls_namespace="api",
docs_url="/docs" if settings.DEBUG else None,
)

api.add_router(prefix="", router=health_check_route)
59 changes: 59 additions & 0 deletions pyconkr/health_check.py
@@ -0,0 +1,59 @@
from collections import defaultdict
from http import HTTPStatus
from os import getenv
from typing import Any

import ninja.router
from django.conf import settings
from django.db import DEFAULT_DB_ALIAS, DatabaseError, connections
from django.db.migrations.executor import MigrationExecutor
from django.http import HttpRequest, JsonResponse

router = ninja.router.Router(tags=["Health Check"])


def _check_databases() -> tuple[bool, dict[str, Any]]:
results: dict[str, dict[str, Any]] = {}
for alias in settings.DATABASES:
results[alias] = {"success": True, "error": None}
try:
with connections[alias].cursor() as cursor:
cursor.execute("SELECT 1")
except DatabaseError as e:
results[alias].update({"success": False, "error": str(e)})
return all(results[key]["success"] for key in results), results


def _check_django_migrations() -> tuple[bool, defaultdict[str, list[str]]]:
result: defaultdict[str, list[str]] = defaultdict(list)

executor = MigrationExecutor(connections[DEFAULT_DB_ALIAS])
migration_plan = executor.migration_plan(executor.loader.graph.leaf_nodes())
for migration_info, _ in migration_plan:
result[migration_info.app_label].append(migration_info.name)

return bool(migration_plan), result


@router.get("/readyz/", url_name="readyz")
def readyz(request: HttpRequest) -> JsonResponse:
is_dbs_ok, db_status = _check_databases()
requires_migrations, migration_status = _check_django_migrations()
response_data = (
{
"database": db_status,
"migrations": migration_status,
"git_sha": getenv("DEPLOYMENT_GIT_HASH", ""),
}
if settings.DEBUG
else {}
)
return JsonResponse(
data=response_data,
status=HTTPStatus.OK if is_dbs_ok and not requires_migrations else HTTPStatus.SERVICE_UNAVAILABLE,
)


@router.get("/livez/", url_name="livez")
def livez(request: HttpRequest) -> JsonResponse:
return JsonResponse({}, status=HTTPStatus.OK)
94 changes: 71 additions & 23 deletions pyconkr/settings.py
@@ -12,24 +12,65 @@

from pathlib import Path

import environ
import pymysql

pymysql.install_as_MySQLdb()

# Build paths inside the project like this: BASE_DIR / 'subdir'.
BASE_DIR = Path(__file__).resolve().parent.parent

env = environ.Env(DEBUG=(bool, False), LOG_LEVEL=(str, "DEBUG"))
env.read_env(env.str("ENV_PATH", default=".env.local"))

# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/5.0/howto/deployment/checklist/

# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = "django-insecure-kjtg@&v106jt2wz9tlci@b!3uqrig7eud^3zk53&!me@gw_(q@" # nosec: B105
SECRET_KEY = env("DJANGO_SECRET_KEY", default="local_secret_key")

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True

ALLOWED_HOSTS: list[str] = []
DEBUG = env("DEBUG")

# Loggers
LOG_LEVEL = env("LOG_LEVEL")
LOGGING = {
"version": 1,
"disable_existing_loggers": False,
"handlers": {
"console": {
"level": LOG_LEVEL,
"class": "logging.StreamHandler",
},
},
"loggers": {
"django.db.backends": {
"handlers": ["console"],
"level": LOG_LEVEL,
},
},
}

# Zappa Settings
API_STAGE = env("API_STAGE", default="prod")
ADDITIONAL_TEXT_MIMETYPES: list[str] = []
ASYNC_RESPONSE_TABLE = ""
AWS_BOT_EVENT_MAPPING: dict[str, str] = {}
AWS_EVENT_MAPPING: dict[str, str] = {}
BASE_PATH = None
BINARY_SUPPORT = True
COGNITO_TRIGGER_MAPPING: dict[str, str] = {}
CONTEXT_HEADER_MAPPINGS: dict[str, str] = {}
DJANGO_SETTINGS = "pyconkr.settings"
DOMAIN = None
ENVIRONMENT_VARIABLES: dict[str, str] = {}
EXCEPTION_HANDLER = None
PROJECT_NAME = "PyConKR-API-V3"

# Host settings
ALLOWED_HOSTS = ["*"]

# Application definition

INSTALLED_APPS = [
"django.contrib.admin",
"django.contrib.auth",
@@ -75,28 +116,23 @@

DATABASES = {
"default": {
"ENGINE": "django.db.backends.sqlite3",
"NAME": BASE_DIR / "db.sqlite3",
}
"ENGINE": env("DATABASE_ENGINE", default="django.db.backends.sqlite3"),
"NAME": env("DATABASE_NAME", default=str(BASE_DIR / "db.sqlite3")),
"PORT": env("DATABASE_PORT", default=None),
"HOST": env("DATABASE_HOST", default=None),
"USER": env("DATABASE_USER", default=None),
"PASSWORD": env("DATABASE_PASSWORD", default=None),
},
}


# Password validation
# https://docs.djangoproject.com/en/5.0/ref/settings/#auth-password-validators

AUTH_PASSWORD_VALIDATORS = [
{
"NAME": "django.contrib.auth.password_validation.UserAttributeSimilarityValidator",
},
{
"NAME": "django.contrib.auth.password_validation.MinimumLengthValidator",
},
{
"NAME": "django.contrib.auth.password_validation.CommonPasswordValidator",
},
{
"NAME": "django.contrib.auth.password_validation.NumericPasswordValidator",
},
{"NAME": "django.contrib.auth.password_validation.UserAttributeSimilarityValidator"},
{"NAME": "django.contrib.auth.password_validation.MinimumLengthValidator"},
{"NAME": "django.contrib.auth.password_validation.CommonPasswordValidator"},
{"NAME": "django.contrib.auth.password_validation.NumericPasswordValidator"},
]


@@ -105,7 +141,7 @@

LANGUAGE_CODE = "ko-kr"

TIME_ZONE = "UTC"
TIME_ZONE = "Asia/Seoul"

USE_I18N = True

@@ -114,9 +150,21 @@

# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/5.0/howto/static-files/

STATIC_URL = "static/"

STORAGE_BACKEND = env("DJANGO_STORAGE_BACKEND", default="storages.backends.s3.S3Storage")
STORAGE_BUCKET_NAME = f"pyconkr-api-v3-{API_STAGE}"
STORAGE_OPTIONS = (
{"bucket_name": STORAGE_BUCKET_NAME, "file_overwrite": False}
if STORAGE_BACKEND == "storages.backends.s3.S3Storage"
else {}
)

STORAGES = {
"default": {"BACKEND": STORAGE_BACKEND, "OPTIONS": STORAGE_OPTIONS},
"staticfiles": {"BACKEND": STORAGE_BACKEND, "OPTIONS": STORAGE_OPTIONS},
}

# Default primary key field type
# https://docs.djangoproject.com/en/5.0/ref/settings/#default-auto-field

10 changes: 9 additions & 1 deletion pyconkr/urls.py
@@ -15,9 +15,17 @@
2. Add a URL to urlpatterns: path('blog/', include('blog.urls'))
"""

from django.conf import settings
from django.conf.urls.static import static
from django.contrib import admin
from django.urls import path

from pyconkr.api import api as ninja_api

urlpatterns = [
path("admin/", admin.site.urls),
]
path("", ninja_api.urls),
] + static(settings.STATIC_URL, document_root=settings.STATIC_ROOT)

if settings.DEBUG:
urlpatterns += []
16 changes: 15 additions & 1 deletion pyproject.toml
@@ -2,18 +2,32 @@
name = "pyconkr-api-v3"
version = "0.1.0"
description = ""
authors = ["Your Name <you@example.com>"]
authors = ["PyConKR <pyconkr@pycon.kr>"]
readme = "README.md"

[tool.poetry.dependencies]
python = "^3.11"
cryptography = "^42.0.5"
Django = "^5.0.4"
django-cors-headers = "^4.3.1"
django-environ = "^0.11.2"
django-mysql = "^4.13.0"
django-ninja = "^1.1.0"
django-storages = {extras = ["s3"], version = "^1.14.2"}
pydantic = "^2.6.4"
pymysql = "^1.1.0"
requests = "^2.31.0"
zappa = "^0.59.0"

[tool.poetry.group.dev.dependencies]
pre-commit = "^3.7.0"

[tool.poetry.group.deployment.dependencies]
django = "^5.0.4"
django-storages = {extras = ["s3"], version = "^1.14.2"}
django-environ = "^0.11.2"
zappa = "^0.59.0"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
Binary file removed requirements.txt
Binary file not shown.
2 changes: 2 additions & 0 deletions zappa_settings.py
@@ -0,0 +1,2 @@
# We'll use django-environ instead of Zappa Settings.
from pyconkr.settings import * # noqa: F403, F401