Commit 8f5e347

feat: github actions pipeline, pre-commit (#2)

1 parent 66a03ae

File tree

5 files changed: +49, −5 lines


.github/workflows/pre-commit.yaml

Lines changed: 18 additions & 0 deletions (new file)

```yaml
name: Pre-commit linter

on: push

jobs:
  run-linters:
    runs-on: ubuntu-latest
    steps:
      - name: Check out Git repository
        uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'

      - name: Set up pre-commit Cache
        uses: pre-commit/[email protected]
```

.pre-commit-config.yaml

Lines changed: 26 additions & 0 deletions (new file)

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: check-ast
      - id: check-yaml
      - id: check-added-large-files
      - id: end-of-file-fixer
      - id: trailing-whitespace

  - repo: https://github.com/asottile/reorder-python-imports
    rev: v3.12.0
    hooks:
      - id: reorder-python-imports

  - repo: https://github.com/PyCQA/flake8
    rev: 6.1.0
    hooks:
      - id: flake8
        args: [--max-line-length=88]

  - repo: https://github.com/psf/black
    rev: 23.9.1
    hooks:
      - id: black
        language_version: python3.11
```

README.md

Lines changed: 1 addition & 1 deletion (the −/+ pair reads identically; the change appears to be a whitespace-only fix from the new hooks)

```diff
@@ -1,4 +1,4 @@
 ### Otodom / Pracuj tasks repository
 This repository is going to contain the early stages of the pracuj / otodom scrapers. Here we will lay the groundwork for one big future project, into which the finished pieces will be merged.
 
-The tasks can be found in the corresponding directories.
+The tasks can be found in the corresponding directories.
```

otodom/task_1/task_1.md

Lines changed: 2 additions & 2 deletions (each −/+ pair reads identically; the changes appear to be whitespace-only fixes from the new hooks)

````diff
@@ -25,7 +25,7 @@ If something is missing, you can leave the value as an empty string.
 The bot should be able to iterate through all the listings pages. The listings should again be collected and duplicates removed.
 ### Task 2
 
-Create a **settings.json** file. It should define what the bot is going to scrape. An example may look like:
+Create a **settings.json** file. It should define what the bot is going to scrape. An example may look like:
 ```json
 {
     "base_url": "str",
@@ -40,4 +40,4 @@ Create a **settings.json** file. It should define what the bot is going to scrape.
 ```
 and so on. Please try to include anything that may be useful. The URL should somehow be generated from this data; look at how the site's URL changes according to the search parameters you apply.
 
-Create your **solutions** in the **pracuj/task1/<your_name>** file and then open a pull request.
+Create your **solutions** in the **pracuj/task1/<your_name>** file and then open a pull request.
````
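As a rough illustration of how such a settings file might drive URL generation: the sketch below is built on assumptions, since the task's example only shows a `base_url` key; the `search_params` field, its keys, and the otodom URL are all invented for illustration.

```python
import json
from urllib.parse import urlencode

# Hypothetical settings.json payload: only "base_url" appears in the task's
# example; "search_params" and its contents are made up for illustration.
settings = json.loads("""
{
    "base_url": "https://www.otodom.pl/pl/wyniki",
    "search_params": {"city": "warszawa", "page": 1}
}
""")

# Build the search URL by encoding the configured parameters as a query string.
url = f"{settings['base_url']}?{urlencode(settings['search_params'])}"
print(url)  # https://www.otodom.pl/pl/wyniki?city=warszawa&page=1
```

Keeping the parameters in a plain mapping means new search filters can be added to settings.json without touching the scraper code.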

pracuj/task_1/task_1.md

Lines changed: 2 additions & 2 deletions (each −/+ pair reads identically; the changes appear to be whitespace-only fixes from the new hooks)

````diff
@@ -20,7 +20,7 @@ If something is missing, you can leave the value as an empty string.
 The bot should be able to iterate through all the listings pages. The listings should again be collected and duplicates removed.
 ### Task 2
 
-Create a **settings.json** file. It should define what the bot is going to scrape. An example may look like:
+Create a **settings.json** file. It should define what the bot is going to scrape. An example may look like:
 ```json
 {
     "base_url": "str",
@@ -33,4 +33,4 @@ Create a **settings.json** file. It should define what the bot is going to scrape.
 ```
 and so on. Please try to include anything that may be useful. Start with the most important things. The URL should somehow be generated from this data; look at how the site's URL changes according to the search parameters you apply.
 
-Create your **solutions** in the **pracuj/task1/<your_name>** file and then open a pull request.
+Create your **solutions** in the **pracuj/task1/<your_name>** file and then open a pull request.
````
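The task asks for listings to be collected across pages with duplicates removed. One minimal way to sketch that step is below; keying on the listing URL is an assumption, since the task does not say what identifies a duplicate, and the sample data is made up.

```python
def dedupe(listings):
    """Drop repeated listings, keeping the first occurrence of each.

    Keys on the listing URL -- an assumption; the task does not specify
    what makes two listings duplicates.
    """
    seen = set()
    unique = []
    for item in listings:
        if item["url"] not in seen:
            seen.add(item["url"])
            unique.append(item)
    return unique

# Listings scraped from two consecutive result pages (made-up data);
# "/offer/2" appears on both pages and should survive only once.
page_1 = [{"url": "/offer/1"}, {"url": "/offer/2"}]
page_2 = [{"url": "/offer/2"}, {"url": "/offer/3"}]
print(len(dedupe(page_1 + page_2)))  # 3
```

A set gives O(1) membership checks, so deduplication stays linear in the number of listings even across many pages.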

0 commit comments
