An IDB project for CS 373: Software Engineering that brings together data on American political parties, representatives, and congressional districts.
- Website: swethepeople.me
- API: api.swethepeople.me
- Technical report: https://wethesweople.gitbooks.io/report/
- Python version: Python 2.7
- Clone the repository
$ git clone https://github.com/WeTheSWEople/SWEThePeople.git
- Install frontend dependencies
$ cd frontend
$ npm install
- Install backend dependencies
$ cd backend
$ virtualenv venv
$ source venv/bin/activate
$ pip2.7 install -r requirements.txt
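The steps above assume several tools are already installed. A small preflight sketch (tool names taken from the commands in this README) that reports which of them are on the PATH:

```shell
# Preflight check: report which tools assumed by the setup steps are on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1"
  else
    echo "missing: $1"
  fi
}

for tool in git npm virtualenv python2.7; do
  check_tool "$tool"
done
```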
Run the server locally using:
$ cd frontend
$ npm start
Then, access the local site by visiting localhost:3000
Run the API locally using:
$ cd backend
$ virtualenv venv
$ source venv/bin/activate
$ pip2.7 install -r requirements.txt
$ export PYTHONPATH=.:$PYTHONPATH
$ python2.7 main.py
Then, access the local API by visiting 0.0.0.0:4040
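The `export PYTHONPATH=.:$PYTHONPATH` step prepends the backend directory to Python's module search path so main.py can import its sibling modules. A minimal demonstration:

```shell
# Prepend the current directory to PYTHONPATH, as the run steps above do,
# so modules in backend/ resolve when main.py imports them.
export PYTHONPATH=.:$PYTHONPATH
echo "$PYTHONPATH"
```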
In the frontend folder, run the test script (located at frontend/tests.js)
$ cd frontend
$ npm test
Frontend acceptance tests run using the Selenium web driver.
- Install Firefox
- Install geckodriver (it should install to /usr/local/bin/)
- cd into the frontend folder
$ cd frontend
- Create Virtual Environment and Install Requirements:
$ virtualenv venv
$ source venv/bin/activate
$ pip install -r requirements.txt
- Run the test script
$ python2.7 guitests.py
In the backend folder, run the test script. You will need to export the database credentials as environment variables to run this test; the credentials are provided in the turn-in JSON.
$ cd backend
$ virtualenv venv
$ source venv/bin/activate
$ pip2.7 install -r requirements.txt
$ export PYTHONPATH=.:$PYTHONPATH
$ python2.7 tests.py
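The credentials are exported as environment variables before running tests.py. A sketch with placeholder names and values (the real variable names and values come from the turn-in JSON):

```shell
# Placeholder credential exports -- substitute the names and values from
# the turn-in JSON before running tests.py.
export DB_HOST=example-host
export DB_USER=example-user
export DB_PASSWORD=example-password
echo "credentials exported for $DB_USER@$DB_HOST"
```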
Ensure newman is installed, then run the API tests from the repository's base directory
$ npm install -g newman
$ newman run Postman.json
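Postman.json is a standard Postman collection. A minimal hypothetical fragment showing the shape newman expects (the request name and URL here are illustrative, not the real collection contents):

```json
{
  "info": {
    "name": "SWEThePeople API tests",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "item": [
    {
      "name": "hypothetical request",
      "request": {
        "method": "GET",
        "url": "http://api.swethepeople.me"
      }
    }
  ]
}
```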
The scrapers will store information in the database defined in config.json. The scrapers must be run in the order below to ensure that the data is stored properly.
From the backend folder:
$ cd backend
- Run the political parties seeder
$ virtualenv venv
$ source venv/bin/activate
$ pip2.7 install -r requirements.txt
$ export PYTHONPATH=.:$PYTHONPATH
$ python2.7 party_seed.py
- Run the representative scraper
$ virtualenv venv
$ source venv/bin/activate
$ pip2.7 install -r requirements.txt
$ export PYTHONPATH=.:$PYTHONPATH
$ python2.7 representatives_scraper.py
- Run the congressional districts scraper
$ virtualenv venv
$ source venv/bin/activate
$ pip2.7 install -r requirements.txt
$ export PYTHONPATH=.:$PYTHONPATH
$ python2.7 districts_scraper.py
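The scrapers read their database connection details from config.json. A hypothetical shape for that file (the key names are illustrative; the real keys come from the project's own config.json and the turn-in credentials):

```json
{
  "host": "example-host",
  "database": "swethepeople",
  "username": "example-user",
  "password": "example-password"
}
```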
- Right click on servers -> create server
- Fill in the name as swethepeople
- Fill in the host name/address under the Connection tab
- Save
- Right click and connect to the server
- Fill in the username and password
- Expand the server tab
- Go to Databases -> swethepeople -> Schemas -> Tables
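The same tables can also be listed from the command line. A sketch that builds the equivalent psql invocation (host and user are placeholders for the turn-in credentials; the command is echoed rather than executed so no live database is needed):

```shell
# Print the psql command equivalent to the pgAdmin steps above.
# Substitute the real host and user from the turn-in JSON, then run it.
HOST=example-host
DB_USER=example-user
echo "psql -h $HOST -U $DB_USER -d swethepeople -c '\\dt'"
```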
- SSH into the EC2 instance
- Clone the repository
$ git clone https://github.com/WeTheSWEople/SWEThePeople.git
- Install dependencies
$ virtualenv venv
$ source venv/bin/activate
$ pip2.7 install -r requirements.txt
$ export PYTHONPATH=.:$PYTHONPATH
$ npm install
- Build the project
$ cd frontend
$ npm run build
- Run the server. You can either use Gunicorn via
$ cd ..
$ gunicorn router:app -b localhost:8000
or with Flask's built-in development server via
$ cd ..
$ python router.py &
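Gunicorn's `router:app` argument means "the WSGI callable named `app` in router.py". The real module builds `app` with Flask; the stdlib-only stand-in below is a sketch of just the contract gunicorn relies on (the response body is hypothetical):

```python
# Minimal WSGI sketch of the module-level callable gunicorn serves as
# `router:app`. The real router.py builds this object with Flask; this
# stand-in only illustrates the WSGI contract: a callable named `app`
# taking (environ, start_response) and returning an iterable of bytes.

def app(environ, start_response):
    body = b"SWEThePeople backend placeholder\n"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```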