
Commit e36a08e

Merge pull request #45 from flask-dashboard/development
Development
2 parents dac31e5 + 283afb7 · commit e36a08e

22 files changed (+296 -302 lines)

README.md (+29 -12)

@@ -9,6 +9,8 @@ You can see the execution time and last access time per endpoint.
 
 Also, unit tests can be run by TravisCI and monitored.
 
+IMPORTANT: Since the implementation uses string interpolation, the minimum python version is 3.6 [PEP 498](https://www.python.org/dev/peps/pep-0498/).
+
 Installation
 ============
 To install from source, download the source code, then run this:
@@ -27,7 +29,7 @@ Adding the extension to your flask app is simple:
     import dashboard
 
     user_app = Flask(__name__)
-    dashboard.config.from_file('/<path to your config file>/config.cfg')
+    dashboard.config.init_from(file='/<path to your config file>/config.cfg')
 
     def get_session_id():
         # Implement your own function for obtaining the user variable here.
@@ -36,6 +38,11 @@ Adding the extension to your flask app is simple:
     dashboard.config.get_group_by = get_session_id
     dashboard.bind(app=user_app)
 
+Instead of having a hardcoded string containing the location of the config file in the code above, it is also possible
+to define an environment variable that specifies the location of this config file.
+The line should then be `dashboard.config.init_from(envvar='DASHBOARD_CONFIG')`. This will configure the dashboard based on the file
+provided in the environment variable called `DASHBOARD_CONFIG`.
+
 Usage
 =====
 Once the setup is done, a config file ('config.cfg') should be set next to the python file that contains the entry point of the app.
@@ -51,7 +58,6 @@ The following things can be configured:
     DATABASE=sqlite:////<path to your project>/dashboard.db
    GIT=/<path to your project>/.git/
    TEST_DIR=/<path to your project>/tests/
-    LOG_DIR=/<path to your project>/
     N=5
     SUBMIT_RESULTS_URL=http://0.0.0.0:5000/dashboard/submit-test-results
     OUTLIER_DETECTION_CONSTANT=2.5
@@ -65,23 +71,34 @@ When running your app, the dashboard van be viewed by default in the route:
 
 TravisCI unit testing
 =====================
-To enable Travis to run your unit tests and send the results to the dashboard, four steps have to be taken.
-
-First off, the file 'collect_performance.py' (which comes with the dashboard) should be copied to the directory where your '.travis.yml' file resides.
+To enable Travis to run your unit tests and send the results to the dashboard, four steps have to be taken:
 
-Secondly, your config file for the dashboard ('config.cfg') should be updated to include four additional values, TEST_DIR, LOG_DIR, SUBMIT_RESULTS_URL and N.
-The first specifies where your unit tests reside, the second where the logs should be placed, the third where Travis should upload the test results to, and the last specifies the number of times Travis should run each unit test.
-See the sample config file in the section above for an example.
+1. Update the config file ('config.cfg') to include three additional values, TEST_DIR, SUBMIT_RESULTS_URL and N.
+   - TEST_DIR specifies where the unit tests reside.
+   - SUBMIT_RESULTS_URL specifies where Travis should upload the test results to. When left out, the results will not
+     be sent anywhere, but the performance collection process will still run.
+   - N specifies the number of times Travis should run each unit test.
 
-Then, the installation requirement for the dashboard has to be added to the 'setup.py' file of your app:
+2. The installation requirement for the dashboard has to be added to the 'setup.py' file of your app:
 
+    dependency_links=["https://github.com/flask-dashboard/Flask-Monitoring-Dashboard/tarball/master#egg=flask_monitoring_dashboard"]
+
     install_requires=('flask_monitoring_dashboard')
 
-Lastly, in your '.travis.yml' file, two script commands should be added:
+3. In your '.travis.yml' file, three script commands should be added:
 
     script:
-    - export DASHBOARD_CONFIG=config.cfg
-    - python ./collect_performance.py
+    - export DASHBOARD_CONFIG=./config.cfg
+    - export DASHBOARD_LOG_DIR=./logs/
+    - python -m dashboard.collect_performance
+
+The config environment variable specifies where the performance collection process can find the config file.
+The log directory environment variable specifies where the performance collection process should place the logs it uses.
+The third command will start the actual performance collection process.
+
+4. A method that is executed after every request should be added to the blueprint of your app.
+This is done by the dashboard automatically when the blueprint is passed to the binding function like so: `dashboard.bind(app=app, blue_print=api)`.
+This extra method is needed for the logging, and without it, the unit test results cannot be grouped by endpoint that they test.
 
 Screenshots
 ===========
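For orientation, here is a minimal sketch of the setup the new README text describes: configuring the dashboard from the DASHBOARD_CONFIG environment variable and passing a blueprint to bind() so that unit test results can later be grouped by endpoint. The blueprint name `api`, the config path, and the get_session_id() stub are illustrative, not part of the repository.

    # sketch of the wiring described in the README above (names and paths are illustrative)
    import os
    from flask import Flask, Blueprint

    import dashboard

    user_app = Flask(__name__)
    api = Blueprint('api', __name__)  # the blueprint whose endpoints should be monitored

    # assumes DASHBOARD_CONFIG points at an existing config.cfg
    os.environ.setdefault('DASHBOARD_CONFIG', '/path/to/config.cfg')
    dashboard.config.init_from(envvar='DASHBOARD_CONFIG')

    def get_session_id():
        # return your own user/session identifier here
        return 'anonymous'

    dashboard.config.get_group_by = get_session_id
    dashboard.bind(app=user_app, blue_print=api)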

dashboard/__init__.py (+18 -16)

@@ -30,16 +30,32 @@ def loc():
 blueprint = Blueprint('dashboard', __name__, template_folder=loc() + 'templates')
 
 
-def bind(app):
+def bind(app, blue_print=None):
     """
     Binding the app to this object should happen before importing the routing-
     methods below. Thus, the importing statement is part of this function.
-    :param app: the app for which the performance has to be tracked
+    :param app: the app for which the performance has to be tracked
+    :param blue_print: the blueprint that contains the endpoints to be monitored
     """
     assert app is not None
     global user_app, blueprint
     user_app = app
 
+    if blue_print:
+        import os
+        import datetime
+        from flask import request
+        log_dir = os.getenv('DASHBOARD_LOG_DIR')
+
+        @blue_print.after_request
+        def after_request(response):
+            if log_dir:
+                t1 = str(datetime.datetime.now())
+                log = open(log_dir + "endpoint_hits.log", "a")
+                log.write("\"{}\",\"{}\"\n".format(t1, request.endpoint))
+                log.close()
+            return response
+
     # Add all route-functions to the blueprint
     import dashboard.routings
 
@@ -49,17 +65,3 @@ def bind(app):
 
     # register the blueprint to the app
     app.register_blueprint(blueprint, url_prefix='/' + config.link)
-
-    # search for tests if test dir specified
-    if config.test_dir:
-        from dashboard.database.tests import add_test, get_tests
-        suites = TestLoader().discover(config.test_dir, pattern="*test*.py")
-        existing_tests = get_tests()
-        tests = []
-        for t in existing_tests:
-            tests.append(t.name)
-        for suite in suites:
-            for case in suite:
-                for test in case:
-                    if str(test) not in tests:
-                        add_test(str(test))
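The new after_request hook appends one CSV row per request to endpoint_hits.log in the directory named by DASHBOARD_LOG_DIR. A small sketch of what that file looks like and how it can be read back; the timestamps, endpoint names, and the ./logs/ path are invented for illustration, while the column layout matches the header written elsewhere in this commit:

    # endpoint_hits.log is plain CSV with a "time","endpoint" header, e.g.:
    #   "time","endpoint"
    #   "2017-06-01 12:00:00.000000","api.get_users"
    import csv
    import datetime

    with open('./logs/endpoint_hits.log') as log:  # path assumes DASHBOARD_LOG_DIR=./logs/
        for row in csv.DictReader(log):
            hit_time = datetime.datetime.strptime(row["time"], "%Y-%m-%d %H:%M:%S.%f")
            print(hit_time, row["endpoint"])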

dashboard/collect_performance.py (+14 -6)

@@ -14,8 +14,16 @@
     print('Please set an environment variable \'DASHBOARD_CONFIG\' specifying the absolute path to your config file.')
     sys.exit(0)
 
+# Abort if log directory is not specified.
+log_dir = os.getenv('DASHBOARD_LOG_DIR')
+if log_dir is None:
+    print('You must specify a log directory for the dashboard to be able to use the unit test monitoring functionality.')
+    print('Please set an environment variable \'DASHBOARD_LOG_DIR\' specifying the absolute path where you want the log files to be placed.')
+    sys.exit(0)
+
 n = 1
 url = None
+sys.path.insert(0, os.getcwd())
 parser = configparser.RawConfigParser()
 try:
     parser.read(config)
@@ -26,9 +34,6 @@
     else:
         print('No test directory specified in your config file. Please do so.')
         sys.exit(0)
-    if not parser.has_option('dashboard', 'LOG_DIR'):
-        print('No log directory specified in your config file. Please do so.')
-        sys.exit(0)
     if parser.has_option('dashboard', 'SUBMIT_RESULTS_URL'):
         url = parser.get('dashboard', 'SUBMIT_RESULTS_URL')
     else:
@@ -37,7 +42,10 @@
     print("Something went wrong while parsing the configuration file:\n{}".format(e))
 
 data = {'test_runs': [], 'grouped_tests': []}
-log = open("test_runs.log", "w")
+log = open(log_dir + "endpoint_hits.log", "w")
+log.write("\"time\",\"endpoint\"\n")
+log.close()
+log = open(log_dir + "test_runs.log", "w")
 log.write("\"start_time\",\"stop_time\",\"test_name\"\n")
 
 if test_dir:
@@ -61,7 +69,7 @@
 
 # Read and parse the log containing the test runs
 runs = []
-with open('test_runs.log') as log:
+with open(log_dir + 'test_runs.log') as log:
     reader = csv.DictReader(log)
     for row in reader:
         runs.append([datetime.datetime.strptime(row["start_time"], "%Y-%m-%d %H:%M:%S.%f"),
@@ -70,7 +78,7 @@
 
 # Read and parse the log containing the endpoint hits
 hits = []
-with open('endpoint_hits.log') as log:
+with open(log_dir + 'endpoint_hits.log') as log:
     reader = csv.DictReader(log)
     for row in reader:
         hits.append([datetime.datetime.strptime(row["time"], "%Y-%m-%d %H:%M:%S.%f"),
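With this change the collection script refuses to run unless both DASHBOARD_CONFIG and DASHBOARD_LOG_DIR are set. A sketch of running it locally from Python, mirroring the three '.travis.yml' commands from the README; the ./config.cfg and ./logs/ paths are placeholders:

    # run the performance collection the same way the Travis script does
    import os
    import subprocess

    env = dict(os.environ,
               DASHBOARD_CONFIG='./config.cfg',  # path to the dashboard config file
               DASHBOARD_LOG_DIR='./logs/')      # directory where the log files will be written

    os.makedirs('./logs', exist_ok=True)
    subprocess.check_call(['python', '-m', 'dashboard.collect_performance'], env=env)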

dashboard/config.py (+24 -12)

@@ -23,12 +23,12 @@ def __init__(self):
         self.guest_password = ['guest_password']
         self.outlier_detection_constant = 2.5
         self.colors = {}
-        self.log_dir = None
+        self.security_token = 'cc83733cb0af8b884ff6577086b87909'
 
         # define a custom function to retrieve the session_id or username
         self.get_group_by = None
 
-    def from_file(self, config_file):
+    def init_from(self, file=None, envvar=None):
         """
         The config_file must at least contains the following variables in section 'dashboard':
         APP_VERSION: the version of the app that you use. Updating the version helps in
@@ -54,16 +54,28 @@ def from_file(self, config_file):
             average, extra information is logged into the database. A default value for this
             variable is 2.5, but can be changed in the config-file.
 
-        :param config_file: a string pointing to the location of the config-file
+        SECURITY_TOKEN: Used for getting the data in /get_json_data/<security_token>
+
+        :param file: a string pointing to the location of the config-file
+        :param envvar: a string specifying which environment variable holds the config file location
         """
 
-        config = os.getenv('DASHBOARD_CONFIG')
-        if config:
-            config_file = config
+        if envvar:
+            file = os.getenv(envvar)
+        if not file:
+            print("No configuration file specified. Please do so.")
+
+
+        # When collecting unit test performance results, create log file
+        log_dir = os.getenv('DASHBOARD_LOG_DIR')
+        if log_dir:
+            log = open(log_dir + "endpoint_hits.log", "w")
+            log.write("\"time\",\"endpoint\"\n")
+            log.close()
 
         parser = configparser.RawConfigParser()
         try:
-            parser.read(config_file)
+            parser.read(file)
             if parser.has_option('dashboard', 'APP_VERSION'):
                 self.version = parser.get('dashboard', 'APP_VERSION')
             if parser.has_option('dashboard', 'CUSTOM_LINK'):
@@ -72,11 +84,6 @@ def from_file(self, config_file):
                 self.database_name = parser.get('dashboard', 'DATABASE')
             if parser.has_option('dashboard', 'TEST_DIR'):
                 self.test_dir = parser.get('dashboard', 'TEST_DIR')
-            if parser.has_option('dashboard', 'LOG_DIR'):
-                self.log_dir = parser.get('dashboard', 'LOG_DIR')
-                log = open(self.log_dir + "endpoint_hits.log", "w")
-                log.write("\"time\",\"endpoint\"\n")
-                log.close()
 
             # For manually defining colors of specific endpoints
             if parser.has_option('dashboard', 'COLORS'):
@@ -113,5 +120,10 @@ def from_file(self, config_file):
             if parser.has_option('dashboard', 'OUTLIER_DETECTION_CONSTANT'):
                 self.outlier_detection_constant = ast.literal_eval(
                     parser.get('dashboard', 'OUTLIER_DETECTION_CONSTANT'))
+
+            # when a security token is provided:
+            if parser.has_option('dashboard', 'security_token'):
+                self.security_token = parser.get('dashboard', 'SECURITY_TOKEN')
+
         except configparser.Error:
             raise
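As the diff above shows, init_from() reads the config path from the named environment variable when envvar is given (its value then replaces any file argument), and keeps the built-in default security token unless the config file overrides it. A short sketch of the two call styles; the file path is a placeholder:

    import dashboard

    # explicit path to the config file
    dashboard.config.init_from(file='/path/to/config.cfg')

    # or read the path from an environment variable; when envvar is given,
    # its value replaces whatever was passed in `file`
    dashboard.config.init_from(envvar='DASHBOARD_CONFIG')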

dashboard/database/function_calls.py (+16 -5)

@@ -3,10 +3,10 @@
 """
 
 from flask import request
-from sqlalchemy import func, desc, text, asc
+from sqlalchemy import func, desc, text, asc, DateTime
 from dashboard import config
 import datetime
-from dashboard.database import session_scope, FunctionCall
+from dashboard.database import session_scope, FunctionCall, MonitorRule
 from dashboard.colors import get_color
 
 
@@ -48,19 +48,30 @@ def get_times():
     return result
 
 
-def get_data():
-    """ Returns all data in the FunctionCall table, for the export data option. """
+def get_data_from(time_from):
+    """
+    Returns all data in the FunctionCall table, for the export data option.
+    This function returns all data after the time_from date.
+    """
     with session_scope() as db_session:
         result = db_session.query(FunctionCall.endpoint,
                                    FunctionCall.execution_time,
                                    FunctionCall.time,
                                    FunctionCall.version,
                                    FunctionCall.group_by,
-                                   FunctionCall.ip).all()
+                                   FunctionCall.ip).filter(FunctionCall.time >= time_from)
        db_session.expunge_all()
         return result
 
 
+def get_data():
+    """
+    Equivalent function to get_data_from, but returns all data.
+    :return: all data from the database in the Endpoint-table.
+    """
+    return get_data_from(datetime.date(1970, 1, 1))
+
+
 def get_data_per_version(version):
     """ Returns all data in the FuctionCall table, grouped by their version. """
     with session_scope() as db_session:
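With this change, get_data() simply delegates to get_data_from() with an epoch date, so callers can also request a narrower window themselves. A small usage sketch; it assumes FunctionCall.time holds datetimes, as the >= comparison in the diff implies:

    import datetime
    from dashboard.database.function_calls import get_data, get_data_from

    everything = get_data()  # delegates to get_data_from(datetime.date(1970, 1, 1))
    last_week = get_data_from(datetime.datetime.now() - datetime.timedelta(days=7))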

dashboard/database/monitor_rules.py (+24)

@@ -12,7 +12,31 @@ def get_monitor_rules():
         return result
 
 
+def get_monitor_names():
+    """ Return all names of monitor rules that are currently being monitored"""
+    with session_scope() as db_session:
+        result = db_session.query(MonitorRule.endpoint).filter(MonitorRule.monitor).all()
+        db_session.expunge_all()
+        return result
+
+
 def reset_monitor_endpoints():
     """ Update all monitor rules in the database and set them to false. """
     with session_scope() as db_session:
         db_session.query(MonitorRule).update({MonitorRule.monitor: False})
+
+
+def get_monitor_data():
+    """
+    Returns all data in the rules-table. This table contains which endpoints are being
+    monitored and which are not.
+    :return: all data from the database in the rules-table.
+    """
+    with session_scope() as db_session:
+        result = db_session.query(MonitorRule.endpoint,
+                                   MonitorRule.last_accessed,
+                                   MonitorRule.monitor,
+                                   MonitorRule.time_added,
+                                   MonitorRule.version_added).all()
+        db_session.expunge_all()
+        return result
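Both new helpers are read-only queries against the rules table. A small usage sketch; the print handling is illustrative:

    from dashboard.database.monitor_rules import get_monitor_data, get_monitor_names

    # endpoints that are currently being monitored
    for row in get_monitor_names():
        print(row.endpoint)

    # full rule records: endpoint, last_accessed, monitor, time_added, version_added
    for rule in get_monitor_data():
        print(rule.endpoint, rule.monitor, rule.version_added)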

dashboard/main.py (+6 -1)

@@ -1,10 +1,15 @@
+"""
+This file can be executed for developing purposes. It is not used, when the flask_monitoring_dashboard is
+attached to an existing flask application.
+"""
+
 from flask import Flask, redirect, url_for
 import dashboard
 import os
 
 user_app = Flask(__name__)
 here = os.path.abspath(os.path.dirname(__file__))
-dashboard.config.from_file(here + '/config.cfg')
+dashboard.config.init_from(file=here + '/config.cfg')
 
 
 def get_session_id():

dashboard/measurement.py (-7)

@@ -55,19 +55,12 @@ def wrapper(*args, **kwargs):
             # start a thread to log the stacktrace after 'average' ms
             stack_info = StackInfo(average)
 
-        t1 = str(datetime.datetime.now())
         time1 = time.time()
         result = func(*args, **kwargs)
         time2 = time.time()
         t = (time2 - time1) * 1000
         add_function_call(time=t, endpoint=endpoint)
 
-        # Logging for grouping unit test results by endpoint
-        if config.log_dir:
-            log = open(config.log_dir + "endpoint_hits.log", "a")
-            log.write("\"{}\",\"{}\"\n".format(t1, endpoint))
-            log.close()
-
         # outlier detection
         endpoint_count[endpoint] += 1
         endpoint_sum[endpoint] += t
