added automatic text size selection #15

Closed · wants to merge 5 commits
12 changes: 6 additions & 6 deletions .github/workflows/linters.yml
@@ -34,9 +34,9 @@ jobs:
      - name: Format with ruff
        run: |
          ruff format src --diff
-     # - name: Lint with mypy
-     #   run: |
-     #     mypy src tests
-     # - name: Run tests
-     #   run: |
-     #     pytest
+     - name: Lint with mypy
+       run: |
+         mypy src tests
+     - name: Run tests
+       run: |
+         pytest
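The previously commented-out steps are re-enabled here, so `mypy src tests` and `pytest` run on every workflow run alongside `ruff`. Purely as an illustration of what the type-checking step catches, a hypothetical module (not from this repo) that `mypy` would reject:

```python
# Hypothetical example, not part of this repository.
def normalize_caption(caption: str) -> str:
    return caption.lower()


def build_caption(meme_id: int) -> str:
    # mypy: Argument 1 to "normalize_caption" has incompatible type "int"; expected "str"
    return normalize_caption(meme_id)
```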
685 changes: 24 additions & 661 deletions LICENSE

Large diffs are not rendered by default.

39 changes: 31 additions & 8 deletions README.md
@@ -1,6 +1,31 @@
-# Fast Food Memes
-
-➡️ https://t.me/ffmemesbot ⬅️
+# FastAPI Example Project
+Some people were searching my GitHub profile for project examples after reading the article on [FastAPI best practices](https://github.com/zhanymkanov/fastapi-best-practices).
+Unfortunately, I didn't have useful public repositories, but only my old proof-of-concept projects.
+
+Hence, I have decided to fix that and show how I start projects nowadays, after getting some real-world experience.
+This repo is kind of a template I use when starting up new FastAPI projects:
+- some configs for production
+  - gunicorn with dynamic workers configuration (stolen from [@tiangolo](https://github.com/tiangolo))
+  - Dockerfile optimized for small size and fast builds with a non-root user
+  - JSON logs
+  - sentry for deployed envs
+- easy local development
+  - environment with configured postgres and redis
+  - script to lint code with `ruff` and `ruff format`
+  - configured pytest with `async-asgi-testclient`, `pytest-env`, `pytest-asyncio`
+- SQLAlchemy with slightly configured `alembic`
+  - async SQLAlchemy engine
+  - migrations set in easy to sort format (`YYYY-MM-DD_slug`)
+- pre-installed JWT authorization
+  - short-lived access token
+  - long-lived refresh token which is stored in http-only cookies
+  - salted password storage with `bcrypt`
+- global pydantic model with
+  - explicit timezone setting during JSON export
+- and some other extras like global exceptions, sqlalchemy keys naming convention, shortcut scripts for alembic, etc.
+
+Current version of the template (with SQLAlchemy >2.0 & Pydantic >2.0) wasn't battle tested on production,
+so there might be some workarounds instead of neat solutions, but overall idea of the project structure is still the same.
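The bullet about an "explicit timezone setting during JSON export" is the least self-explanatory item in the list above. A rough pydantic v2 sketch of that idea (class and field names are illustrative, not taken from the template):

```python
from datetime import datetime, timezone

from pydantic import BaseModel, field_serializer


class CustomModel(BaseModel):
    created_at: datetime

    @field_serializer("created_at", when_used="json")
    def _serialize_created_at(self, value: datetime) -> str:
        # attach UTC explicitly so the JSON output always carries an offset
        if value.tzinfo is None:
            value = value.replace(tzinfo=timezone.utc)
        return value.isoformat()


print(CustomModel(created_at=datetime(2024, 1, 1)).model_dump_json())
# {"created_at":"2024-01-01T00:00:00+00:00"}
```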

## Local Development

@@ -9,11 +34,11 @@
2. `docker network create ffmemes_network`
3. `docker-compose up -d --build`

-Don't forget to fill the local `.env` file with all envs you need.
+Don't forget to fill in local `.env` file all envs you need.

-### Test local changes
+### Test specific python functions

-Before sending a PR you must test your new code. The easiest way is to run `ipython` shell, then import the functions you may need and test them. Note that ipython can run async functions without wrapping them with `asyncio.run(...)`.
+After `docker-compose up -d --build` use this to run a python shell inside the Docker environment. Then import your Python function and test it as you want. Note that ipython can run async functions without asyncio.run(...).

``` shell
docker compose exec app ipython
```
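Inside that shell you can import project code directly; for example, with the broadcast helper touched by this PR (assuming the postgres container is up):

```python
# run inside `docker compose exec app ipython`
from src.broadcasts.service import get_users_which_were_active_hours_ago

# ipython's autoawait allows awaiting coroutines at the top level
users = await get_users_which_were_active_hours_ago(48)
len(users), users[:3]
```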
@@ -38,7 +63,6 @@ docker compose exec app migrate
```shell
docker compose exec app downgrade -1 # or -2 or base or hash of the migration
```

### Tests
All tests are integration tests and require a DB connection.
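With the `async-asgi-testclient` / `pytest-asyncio` setup mentioned earlier in the README, a minimal integration test might look like the sketch below; the `src.main` import path and the `/health` route are assumptions for illustration, not taken from this repo:

```python
import pytest
from async_asgi_testclient import TestClient

from src.main import app  # assumed application entry point


@pytest.mark.asyncio
async def test_healthcheck() -> None:
    async with TestClient(app) as client:
        response = await client.get("/health")  # hypothetical route
        assert response.status_code == 200
```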

@@ -60,7 +84,6 @@ You can find all the shortcuts in `justfile` or run the following command to list
```shell
just --list
```
Info about installation can be found [here](https://github.com/casey/just#packages).

### Backup and Restore database
We are using `pg_dump` and `pg_restore` to backup and restore the database.
- Backup
4 changes: 3 additions & 1 deletion flow_deployments/broadcasts.py
@@ -2,13 +2,15 @@
from prefect.server.schemas.schedules import CronSchedule

from src.config import settings

from src.flows.broadcasts.meme import broadcast_memes_to_users_active_hours_ago


deployment_broadcast_hourly = Deployment.build_from_flow(
    flow=broadcast_memes_to_users_active_hours_ago,
    name="broadcast_memes_to_users_active_hours_ago",
    schedule=(CronSchedule(cron="3 * * * *", timezone="Europe/London")),
    work_pool_name=settings.ENVIRONMENT,
)

deployment_broadcast_hourly.apply()
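Each of the flow_deployments modules in this PR follows the same Prefect 2 pattern: build a deployment for a flow with a cron schedule, then register it with `.apply()`. A self-contained sketch of that pattern (the flow, schedule, and work pool name below are placeholders, not from this repo):

```python
from prefect import flow
from prefect.deployments import Deployment
from prefect.server.schemas.schedules import CronSchedule


@flow
def example_flow() -> None:
    print("hello from a scheduled run")


# "3 * * * *" fires at minute 3 of every hour; the deployment is registered
# against a work pool so a worker in that pool can pick up the scheduled runs.
deployment = Deployment.build_from_flow(
    flow=example_flow,
    name="example-deployment",
    schedule=CronSchedule(cron="3 * * * *", timezone="Europe/London"),
    work_pool_name="local",
)

if __name__ == "__main__":
    deployment.apply()
```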
2 changes: 2 additions & 0 deletions flow_deployments/parsers.py
@@ -2,9 +2,11 @@
from prefect.server.schemas.schedules import CronSchedule

from src.config import settings

from src.flows.parsers.tg import parse_telegram_sources
from src.flows.parsers.vk import parse_vk_sources


deployment_tg = Deployment.build_from_flow(
    flow=parse_telegram_sources,
    name="Parse Telegram Sources",
6 changes: 4 additions & 2 deletions flow_deployments/stats.py
@@ -2,9 +2,11 @@
from prefect.server.schemas.schedules import CronSchedule

from src.config import settings
from src.flows.stats.meme import calculate_meme_stats

from src.flows.stats.user import calculate_user_stats
from src.flows.stats.user_meme_source import calculate_user_meme_source_stats
from src.flows.stats.meme import calculate_meme_stats


deployment_user_stats = Deployment.build_from_flow(
    flow=calculate_user_stats,
@@ -33,4 +35,4 @@
schedule=(CronSchedule(cron="3,18,33,48 * * * *", timezone="Europe/London")),
)

deployment_user_stats.apply()
2 changes: 2 additions & 0 deletions flow_deployments/storage.py
@@ -4,6 +4,7 @@
from src.config import settings
from src.flows.storage.memes import ocr_uploaded_memes


deployment_ocr_uploaded_memes = Deployment.build_from_flow(
    flow=ocr_uploaded_memes,
    name="OCR Uploaded Memes",
@@ -12,3 +13,4 @@
)

deployment_ocr_uploaded_memes.apply()

5 changes: 2 additions & 3 deletions requirements/base.txt
@@ -3,7 +3,7 @@ alembic==1.13.*
psycopg2==2.9.*
python-jose==3.3.*
SQLAlchemy==2.0.23
-httpx==0.26.*
+httpx==0.25.*

pydantic[email]==2.5.*
pydantic-settings==2.1.*
@@ -18,7 +18,6 @@ prefect==2.14.*
beautifulsoup4==4.12.*
lxml==4.9.*

-python-telegram-bot==20.8
+python-telegram-bot==20.7

Pillow==10.1.*

9 changes: 3 additions & 6 deletions src/broadcasts/service.py
@@ -3,14 +3,11 @@
from src.database import fetch_all


-async def get_users_which_were_active_hours_ago(hours: int) -> list[dict]:
+async def get_users_which_were_active_hours_ago(hours=48) -> list[dict]:
    insert_query = f"""
        SELECT
            id
        FROM "user"
-        WHERE last_active_at BETWEEN
-            NOW() - INTERVAL '{hours} HOURS'
-            AND
-            NOW() - INTERVAL '{hours-1} HOURS'
+        WHERE last_active_at BETWEEN NOW() - INTERVAL '{hours} HOURS' AND NOW() - INTERVAL '{hours-1} HOURS'
    """
    return await fetch_all(text(insert_query))
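Both the old and new versions interpolate `hours` into the SQL string via an f-string. A sketch of the same one-hour window written with bound parameters instead (PostgreSQL interval arithmetic; it assumes the project's `fetch_all` helper executes the clause unchanged):

```python
from sqlalchemy import text

from src.database import fetch_all


async def get_recently_active_user_ids(hours: int = 48) -> list[dict]:
    # users whose last activity falls in the one-hour bucket
    # [NOW() - hours, NOW() - (hours - 1)]
    query = text(
        """
        SELECT id
        FROM "user"
        WHERE last_active_at BETWEEN
            NOW() - :hours * INTERVAL '1 HOUR'
            AND NOW() - (:hours - 1) * INTERVAL '1 HOUR'
        """
    ).bindparams(hours=hours)
    return await fetch_all(query)
```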