
Commit 48d21d1

feat: django integration test (#564)
* chore: move flask README.md into flask/ folder
* feat(python): add basic skeleton for test django app
* feat(python): working e2e django setup

  Still needs wiring up to the integration test server, and the app needs thinning out.

* feat(python): ready the django app for integration test

  - check for existence of README_API_KEY and exit(1) if not present
  - set default port from os.getenv("PORT")

* feat(python): working e2e integration test

  Had to make the following changes to get it to pass fully:

  - Fix date format in django.py. Same fix as here: #480
  - Reduce default BUFFER_LENGTH down to 1. This mirrors what's happening in the other SDKs for now, but it would be good to increase this in future.

* feat(python): wire up django integration test to gh actions
* fix(python): bug with POST data in har generation

  We are setting `rm_content_length` on the request object here:

  - [django](https://github.com/readmeio/metrics-sdks/blob/4217fefe9417ef39130913dc51989ab021e0405c/packages/python/readme_metrics/django.py#L23)
  - [flask](https://github.com/readmeio/metrics-sdks/blob/4217fefe9417ef39130913dc51989ab021e0405c/packages/python/readme_metrics/flask_readme.py#L34)

  But in the PayloadBuilder we were trying to fetch `content_length` without the `rm_` prefix, which meant that POST bodies weren't getting picked up.
* feat(integration/python/django): add support for POST method test
* chore(integration/python/django): lint
* fix(integration/python/django): test with start date

  Fixed after df7243d#diff-8f208c3f785655a644003c8703cba81313de50ce7aef3068a6b5dd858fcf2577R20

* chore(python): ignore virtualenv folders from linter
* chore(python): lint
* fix(python): ensure django integration test is running in gh actions
* fix(python/integration): bump timeout of HTTPConnectionPool

  I think we were hitting this timeout sometimes from within the container. This makes it pass locally at least; let's see if this works on gh.

* chore: updating the django branch with HEAD (#566)
* refactor: improved integration test suite (#560)
* feat(integration): add POST method test

  Right now the hapi/fastify examples are not working properly for accepting an incoming POST body. This adds a test to expose that flaw, but does not yet fix it. I think this refactor makes the most sense to finish once #459 is merged in.

* fix(php): update postData to push into text instead of params[]
* fix: rewriting how we retrieve data out of the request
* feat: adding a heap of new unit tests to the metrics integration suite
* fix: the node SDK not capturing text/plain and +json payloads
* fix: node sdk not capturing `application/x-www-url-formencoded` payloads
* fix: optionally skipping multipart tests for sdks that don't support it
* fix: stop setting postData if there's none to set
* fix: adding missing env vars
* fix(python): adding query strings to urls, don't add postData if empty
* fix(python): handling of `x-www-form-urlencoded` requests
* fix: don't crash in node or php if a json payload is corrupted
* fix: python code standards issues
* fix: replaying test improvements i made
* fix: moving the webhook test over to using the node protocol
* fix: removing dead code
* fix: regenerating the package lockfile
* fix: broken tests
* fix: compatibility issues with fastify and hapi
* feat: fleshing out some useful makefiles
* feat: more makefiles
* fix: no longer setting an empty `text` property on urlencoded requests
* feat: makefile targets for the .net sdk
* fix: problem where dotnet request.url didn't contain querystrings
* fix: problem where .net would send nullish postData on get requests
* fix: broken php tests
* fix: downgrading the python ci tests in docker to use node 16
* ci: docker python debugging
* fix: remove unneeded python docker workarounds for node 18
* fix(integration): exit early from tests if child process didn't start
* bug: attempt to fix python ci (#565)
* Revert "fix: remove unneeded python docker workarounds for node 18"

  This reverts commit 0c9df96.

* Revert "ci: docker python debugging"

  This reverts commit 067f95f.

* Revert "fix: downgrading the python ci tests in docker to use node 16"

  This reverts commit af5c110.

* fix(python/integration): bump timeout of HTTPConnectionPool

  I think we were hitting this timeout sometimes from within the container. This makes it pass locally at least; let's see if this works on gh.

* fix(integration): disable undici on flask integration tests
* fix: try to use mocha instead of jest
* fix: add commented out experimental fetch disabling
* fix: getting tests in mocha working
* fix: re-disabling undici in node 18
* ci: attempts to get python working in docker always
* ci: disabling python flask integration in ci for now

  Co-authored-by: Jon Ursenbach <[email protected]>
  Co-authored-by: Dom Harrington <[email protected]>
  Co-authored-by: Dom Harrington <[email protected]>

* feat: getting the django tests all running locally
* chore: remove debug
* fix: code standards issues
* chore: alphabetizing the docker-compose file

  Co-authored-by: Dom Harrington <[email protected]>
  Co-authored-by: Dom Harrington <[email protected]>

* chore(integration): remove res.on('end') test finishers

  I don't think it's actually required, and everything seems to pass without it.

* chore(integration): add test watch command for mocha with nodemon
* chore(integration): turn off django in CI
* refactor(python/integration): only ask for README_API_KEY when running server
* fix(python/django): move instantiation of grouping_function (#567)

  Previously we were trying to import the grouping function in MetricsApiConfig, which in Django's case gets called from settings.py (before the models have been set up). So if your grouping function was located in a file that happened to use a model (or was a function on a model), it would error with the following:

  ```
  django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet.
  ```

  This PR moves the loading of the grouping function (if it needs to be imported) into Metrics instead of into the Config; this is late enough for Django to have started, so you can use models.

* test(python/integration): add a very lightweight model to example app

  This makes the top level integration test fail with the issue:

  ```
  django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet.
  ```

  Which helps us ensure it can't happen again.

* chore(python/integration): lint

  Co-authored-by: Jon Ursenbach <[email protected]>
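The grouping-function fix described above comes down to deferring any dotted-path import until Metrics actually runs, once Django's app registry is ready. A minimal sketch of that lazy-resolution pattern (`resolve_callable` is illustrative, not the SDK's actual API):

```python
from importlib import import_module


def resolve_callable(target):
    """Return `target` as a callable, importing it lazily if it is a dotted path.

    Resolving at call time (inside Metrics) rather than at config time means
    any models the grouping function touches are already loaded, avoiding
    `AppRegistryNotReady: Apps aren't loaded yet.`
    """
    if callable(target):
        return target
    module_path, _, attr = target.rpartition(".")
    return getattr(import_module(module_path), attr)
```

With this shape, the config can hold either a function or a dotted string, and nothing is imported until the first request is handled.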
1 parent 44be670 commit 48d21d1

32 files changed (+415, -74 lines)

.github/workflows/python.yml

Lines changed: 3 additions & 0 deletions

```diff
@@ -44,6 +44,9 @@ jobs:
       # you can run this locally with `make test-python-metrics-flask`.
       # - run: docker-compose run integration_metrics_python_flask
       - run: docker-compose run integration_webhooks_python_flask
+      # This Django suite is **extremely** flaky in Docker. Until we can figure out how to make it not
+      # you can run this locally with `make test-python-metrics-django`.
+      # - run: docker-compose run integration_metrics_python_django

       - name: Cleanup
         if: always()
```

.vscode/settings.json

Lines changed: 1 addition & 0 deletions

```diff
@@ -6,6 +6,7 @@
     "packages/php/examples/laravel/bootstrap/cache/*.php": true,
     "packages/php/examples/laravel/storage/**": true,
     "packages/python/build/**": true,
+    "packages/python/readme_metrics.egg*": true,

     // Hide test coverage directories
     "**/coverage": true,
```

Makefile

Lines changed: 21 additions & 3 deletions

```diff
@@ -1,11 +1,21 @@
 .PHONY: help

+help: ## Display this help screen
+	@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
+
+##
+## .NET
+##
 test-dotnet-metrics: ## Run metrics tests against the .NET SDK
 	EXAMPLE_SERVER="dotnet examples/net6.0/out/net6.0.dll" npm run test:integration-metrics

 test-dotnet-webhooks: ## Run webhooks tests against the .NET SDK
 	EXAMPLE_SERVER="dotnet examples/net6.0-webhook/out/net6.0-webhook.dll" npm run test:integration-webhooks

+##
+## Node
+##
+
 test-node-metrics-express: ## Run metrics tests against the Node SDK + Express
 	EXAMPLE_SERVER="node ./packages/node/examples/express/index.js" npm run test:integration-metrics

@@ -18,17 +28,25 @@ test-node-metrics-fastify: ## Run metrics tests against the Node SDK + Fastify
 test-node-metrics-hapi: ## Run metrics tests against the Node SDK + hapi
 	EXAMPLE_SERVER="node ./packages/node/examples/hapi/index.js" npm run test:integration-metrics

+##
+## PHP
+##
+
 test-php-metrics-laravel: ## Run metrics tests against the PHP SDK + Laravel
 	SUPPORTS_MULTIPART=true EXAMPLE_SERVER="php packages/php/examples/laravel/artisan serve" npm run test:integration-metrics

 test-php-webhooks-php-laravel: ## Run webhooks tests against the PHP SDK + Laravel
 	EXAMPLE_SERVER="php packages/php/examples/laravel/artisan serve" npm run test:integration-webhooks

+##
+## Python
+##
+
+test-python-metrics-django: ## Run Metrics tests against the Python SDK + Django
+	EXAMPLE_SERVER="python3 packages/python/examples/metrics_django/manage.py runserver" npm run test:integration-metrics
+
 test-python-metrics-flask: ## Run Metrics tests against the Python SDK + Flask
 	EXAMPLE_SERVER="python3 packages/python/examples/flask/app.py" npm run test:integration-metrics

 test-python-webhooks-flask: ## Run webhooks tests against the Python SDK + Flask
 	EXAMPLE_SERVER="python3 packages/python/examples/flask/webhooks.py" npm run test:integration-webhooks
-
-help: ## Display this help screen
-	@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
```

__tests__/integration-metrics.test.js

Lines changed: 11 additions & 44 deletions

```diff
@@ -149,7 +149,7 @@ describe('Metrics SDK Integration Tests', function () {
   it('should make a request to a Metrics backend with a HAR file', async function () {
     await fetch(`http://localhost:${PORT}`, { method: 'get' });

-    const [req, res] = await once(metricsServer, 'request');
+    const [req] = await once(metricsServer, 'request');
     expect(req.url).to.equal('/v1/request');
     expect(req.headers.authorization).to.equal('Basic YS1yYW5kb20tcmVhZG1lLWFwaS1rZXk6');

@@ -199,23 +199,20 @@ describe('Metrics SDK Integration Tests', function () {
     expect(request.headers).to.have.header('host', `localhost:${PORT}`);

     expect(response.status).to.equal(200);
-    expect(response.statusText).to.equal('OK');
+    expect(response.statusText).to.match(/OK|200/); // Django returns with "200"
     expect(response.headers).to.have.header('content-type', /application\/json(;\s?charset=utf-8)?/);

     // Flask prints a \n character after the JSON response
     // https://github.com/pallets/flask/issues/4635
     expect(response.content.text.replace('\n', '')).to.equal(JSON.stringify({ message: 'hello world' }));
     expect(response.content.size).to.equal(response.content.text.length);
     expect(response.content.mimeType).to.match(/application\/json(;\s?charset=utf-8)?/);
-
-    res.end();
-    return once(res, 'finish');
   });

   it('should capture query strings in a GET request', async function () {
     await fetch(`http://localhost:${PORT}?arr%5B1%5D=3&val=1`, { method: 'get' });

-    const [req, res] = await once(metricsServer, 'request');
+    const [req] = await once(metricsServer, 'request');
     const body = await getBody(req);
     const [har] = body;

@@ -237,9 +234,6 @@ describe('Metrics SDK Integration Tests', function () {
     ]);

     expect(request.postData).to.be.undefined;
-
-    res.end();
-    return once(res, 'finish');
   });

   it('should capture query strings that may be supplied in a POST request', async function () {
@@ -252,7 +246,7 @@ describe('Metrics SDK Integration Tests', function () {
       body: payload,
     });

-    const [req, res] = await once(metricsServer, 'request');
+    const [req] = await once(metricsServer, 'request');
     const body = await getBody(req);
     const [har] = body;

@@ -283,19 +277,15 @@ describe('Metrics SDK Integration Tests', function () {
     });

     expect(response.status).to.equal(200);
-
-    res.end();
-    return once(res, 'finish');
   });
-
   it('should process a POST payload with no explicit `Content-Type` header', async function () {
     const payload = JSON.stringify({ user: { email: '[email protected]' } });
     await fetch(`http://localhost:${PORT}/`, {
       method: 'post',
       body: payload,
     });

-    const [req, res] = await once(metricsServer, 'request');
+    const [req] = await once(metricsServer, 'request');
     const body = await getBody(req);
     const [har] = body;

@@ -307,11 +297,7 @@ describe('Metrics SDK Integration Tests', function () {
     expect(request.postData.mimeType).to.match(/text\/plain(;charset=UTF-8)?/);
     expect(request.postData.params).to.be.undefined;
     expect(request.postData.text).to.equal(payload);
-
-    res.end();
-    return once(res, 'finish');
   });
-
   it('should process an `application/json` POST payload', async function () {
     const payload = JSON.stringify({ user: { email: '[email protected]' } });
     await fetch(`http://localhost:${PORT}/`, {
@@ -322,7 +308,7 @@ describe('Metrics SDK Integration Tests', function () {
       body: payload,
     });

-    const [req, res] = await once(metricsServer, 'request');
+    const [req] = await once(metricsServer, 'request');
     const body = await getBody(req);
     const [har] = body;

@@ -336,9 +322,6 @@ describe('Metrics SDK Integration Tests', function () {
     });

     expect(response.status).to.equal(200);
-
-    res.end();
-    return once(res, 'finish');
   });

   /**
@@ -361,7 +344,7 @@ describe('Metrics SDK Integration Tests', function () {
       body: payload,
     });

-    const [req, res] = await once(metricsServer, 'request');
+    const [req] = await once(metricsServer, 'request');
     const body = await getBody(req);
     const [har] = body;

@@ -381,9 +364,6 @@ describe('Metrics SDK Integration Tests', function () {
       // process the payload into Metrics.
       415,
     ]);
-
-    res.end();
-    return once(res, 'finish');
   });

   it('should process an `application/x-www-url-formencoded` POST payload', async function () {
@@ -398,7 +378,7 @@ describe('Metrics SDK Integration Tests', function () {
       body: payload,
     });

-    const [req, res] = await once(metricsServer, 'request');
+    const [req] = await once(metricsServer, 'request');
     const body = await getBody(req);
     const [har] = body;

@@ -419,9 +399,6 @@ describe('Metrics SDK Integration Tests', function () {
       // that to Metrics regardless if Fastify supports it or not.
       415,
     ]);
-
-    res.end();
-    return once(res, 'finish');
   });

   it('should process a `multipart/form-data` POST payload', async function () {
@@ -443,7 +420,7 @@ describe('Metrics SDK Integration Tests', function () {
       body: Readable.from(encoder),
     });

-    const [req, res] = await once(metricsServer, 'request');
+    const [req] = await once(metricsServer, 'request');
     const body = await getBody(req);
     const [har] = body;

@@ -460,16 +437,12 @@ describe('Metrics SDK Integration Tests', function () {
     ]);

     expect(request.postData.text).to.be.undefined;
-
-    res.end();
-    return once(res, 'finish');
   });

   it('should process a `multipart/form-data` POST payload containing files', async function () {
     if (!supportsMultipart()) {
       this.skip();
     }
-
     const owlbert = await fs.readFile('./__tests__/__datasets__/owlbert.png');

     const payload = new FormData();
@@ -487,7 +460,7 @@ describe('Metrics SDK Integration Tests', function () {
       body: Readable.from(encoder),
     });

-    const [req, res] = await once(metricsServer, 'request');
+    const [req] = await once(metricsServer, 'request');
     const body = await getBody(req);
     const [har] = body;

@@ -513,9 +486,6 @@ describe('Metrics SDK Integration Tests', function () {
     ]);

     expect(request.postData.text).to.be.undefined;
-
-    res.end();
-    return once(res, 'finish');
   });

   it('should process a `text/plain` payload', async function () {
@@ -527,7 +497,7 @@ describe('Metrics SDK Integration Tests', function () {
       body: 'Hello world',
     });

-    const [req, res] = await once(metricsServer, 'request');
+    const [req] = await once(metricsServer, 'request');
     const body = await getBody(req);
     const [har] = body;

@@ -541,8 +511,5 @@ describe('Metrics SDK Integration Tests', function () {
     });

     expect(response.status).to.equal(200);
-
-    res.end();
-    return once(res, 'finish');
   });
 });
```

__tests__/integrations/python.Dockerfile

Lines changed: 3 additions & 0 deletions

```diff
@@ -13,6 +13,9 @@ RUN pip3 install --no-cache-dir -r requirements.txt
 WORKDIR /src/examples/flask
 RUN pip3 install --no-cache-dir -r requirements.txt

+WORKDIR /src/examples/metrics_django
+RUN pip3 install --no-cache-dir -r requirements.txt
+
 # Install top level dependencies
 WORKDIR /src
 COPY __tests__ /src/__tests__
```

docker-compose.yml

Lines changed: 8 additions & 0 deletions

```diff
@@ -77,6 +77,14 @@ services:
   #
   # Python
   #
+  integration_metrics_python_django:
+    build:
+      context: .
+      dockerfile: ./__tests__/integrations/python.Dockerfile
+    command: npm run test:integration-metrics
+    environment:
+      - EXAMPLE_SERVER=python3 examples/metrics_django/manage.py runserver
+
   integration_metrics_python_flask:
     build:
       context: .
```

package.json

Lines changed: 2 additions & 0 deletions

```diff
@@ -11,7 +11,9 @@
     "publish": "npx lerna publish",
     "test": "npm test --workspaces",
     "test:integration-metrics": "NODE_OPTIONS=--experimental-vm-modules npx mocha __tests__/integration-metrics.test.js",
+    "test:integration-metrics-watch": "NODE_OPTIONS=--experimental-vm-modules npx nodemon --exec mocha __tests__/integration-metrics.test.js",
     "test:integration-webhooks": "NODE_OPTIONS=--experimental-vm-modules npx mocha __tests__/integration-webhooks.test.js",
+    "test:integration-webhooks-watch": "NODE_OPTIONS=--experimental-vm-modules npx nodemon --exec mocha __tests__/integration-webhooks.test.js",
     "version": "npx conventional-changelog-cli --pkg lerna.json -i CHANGELOG.md -s && git add CHANGELOG.md"
   },
   "repository": {
```

packages/python/.gitignore

Lines changed: 1 addition & 0 deletions

```diff
@@ -1,5 +1,6 @@
 *.egg-info
 .eggs/*
+**/*.sqlite3
 __pycache__
 build/
 dist/
```

packages/python/Makefile

Lines changed: 4 additions & 1 deletion

```diff
@@ -6,12 +6,15 @@ install: # Install all dependencies
 	pip3 install -r requirements.dev.txt

 lint: ## Run code standard checks
-	pylint --output-format=colorized examples/ readme_metrics/
+	pylint --output-format=colorized examples/ readme_metrics/ --ignore=venv
 	black --check .

 lint-fix: ## Run code formatting checks
 	black .

+serve-metrics-django: ## Start the local Django server to test Metrics
+	README_API_KEY=$(API_KEY) python3 examples/metrics_django/manage.py runserver
+
 serve-metrics-flask: ## Start the local Flask server to test Metrics
 	README_API_KEY=$(API_KEY) python3 examples/flask/app.py
```
Lines changed: 15 additions & 0 deletions

```diff
@@ -0,0 +1,15 @@
+# ReadMe Metrics Django Demo
+
+To install the dependencies required for this application follow the instructions from [CONTRIBUTING.md](../CONTRIBUTING.md) for setting up dependencies in the parent folder then do the same thing in this directory to have deps in both places.
+
+## 📊 Metrics
+
+```sh
+make serve-metrics-django
+```
+
+Access your test server to demo Metrics by making a cURL request:
+
+```sh
+curl http://localhost:8000
+```
```
Lines changed: 31 additions & 0 deletions

```diff
@@ -0,0 +1,31 @@
+#!/usr/bin/env python
+"""Django's command-line utility for administrative tasks."""
+import os
+import sys
+
+from django.core.management.commands.runserver import Command as runserver
+
+if os.getenv("README_API_KEY") is None and "runserver" in sys.argv:
+    sys.stderr.write("Missing `README_API_KEY` environment variable")
+    sys.stderr.flush()
+    sys.exit(1)
+
+
+def main():
+    """Run administrative tasks."""
+    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "metrics_django.settings")
+    runserver.default_port = os.getenv("PORT") or 8000
+    try:
+        # pylint: disable=import-outside-toplevel
+        from django.core.management import execute_from_command_line
+    except ImportError as exc:
+        raise ImportError(
+            "Couldn't import Django. Are you sure it's installed and "
+            "available on your PYTHONPATH environment variable? Did you "
+            "forget to activate a virtual environment?"
+        ) from exc
+    execute_from_command_line(sys.argv)
+
+
+if __name__ == "__main__":
+    main()
```
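The Django management script above defaults the dev-server port from the `PORT` environment variable. That fallback can be read as a tiny helper (a sketch for illustration only, not part of the commit):

```python
import os


def resolve_port(default=8000):
    # Mirrors the management script: prefer the PORT env var, fall back to
    # 8000 so the integration test server and a bare `runserver` agree.
    return int(os.getenv("PORT") or default)
```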

packages/python/examples/metrics_django/metrics/__init__.py

Whitespace-only changes.

packages/python/examples/metrics_django/metrics/admin.py

Whitespace-only changes.
Lines changed: 6 additions & 0 deletions

```diff
@@ -0,0 +1,6 @@
+from django.apps import AppConfig
+
+
+class MetricsConfig(AppConfig):
+    default_auto_field = "django.db.models.BigAutoField"
+    name = "metrics"
```

packages/python/examples/metrics_django/metrics/migrations/__init__.py

Whitespace-only changes.
Lines changed: 5 additions & 0 deletions

```diff
@@ -0,0 +1,5 @@
+from django.db import models
+
+
+class Person(models.Model):
+    first_name = models.CharField(max_length=30)
```

packages/python/examples/metrics_django/metrics/tests.py

Whitespace-only changes.
