This repository was archived by the owner on Aug 31, 2019. It is now read-only.


# Development Environment Setup

Thinking of contributing to this project? Great! This document provides some help with getting a development environment set up for that work.

## Common/Shared Instructions

Most of the setup and configuration for this project is the same as for the other Java-based Blue Button projects. Accordingly, please be sure to first follow all of the instructions documented in bluebutton-parent-pom: Development Environment Setup.

## Git LFS

This project uses git-lfs to manage the large DE-SynPUF archives needed in the source tree. Install it as follows:

```
$ curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | sudo bash
$ sudo apt-get install git-lfs
$ git lfs install
```

If you had already cloned this repo before installing Git LFS, run the following to grab the LFS files:

```
$ git lfs pull
```

If you installed Git LFS before cloning, you should already be good to go.
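A quick way to tell whether LFS content was actually fetched: an un-fetched file is left behind as a small text pointer stub rather than the real binary. The sketch below demonstrates the check against a fake stub it creates itself (`demo.bin` is just a stand-in for one of the repo's DE-SynPUF archives):

```shell
#!/bin/sh
# Sketch: detect whether a file is still an un-fetched Git LFS pointer.
# demo.bin stands in for one of this repo's DE-SynPUF archives; here we
# create a fake pointer stub just to demonstrate the check.
printf 'version https://git-lfs.github.com/spec/v1\n' > demo.bin

# An un-fetched LFS file begins with the spec line above; real content won't.
if head -n 1 demo.bin | grep -q 'git-lfs.github.com/spec'; then
  echo "demo.bin is still an LFS pointer - run 'git lfs pull'"
else
  echo "demo.bin looks like real content"
fi
```

Alternatively, `git lfs ls-files` lists every path that LFS manages in the current checkout.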

## Build Dependencies

### HAPI FHIR

This project depends on the HAPI FHIR project. Releases of that project are available in the Maven Central repository, which generally makes things pretty simple: our Maven builds will pick up theirs.

Unfortunately, this project will sometimes need to depend on an interim/snapshot build of HAPI FHIR. When that's the case, developers will first need to check out and `mvn install` that interim version locally themselves. To keep this simpler, a fork of HAPI FHIR is maintained in the HHSIDEAlab/hapi-fhir repository on GitHub, which will always point to whatever version of HAPI FHIR this project depends on. You can check out and build that fork as follows:

```
$ git clone https://github.com/HHSIDEAlab/hapi-fhir.git hhsidealab-hapi-fhir.git
$ cd hhsidealab-hapi-fhir.git
$ mvn clean install -DskipITs=true -DskipTests=true
```

Once the build is done, the HAPI FHIR artifacts will be placed into your user's local Maven repository (`~/.m2/repository`), available for use by this project or others.

## AWS Credentials

Some of this project's integration tests require AWS credentials. These credentials are used to stand up and tear down AWS resources used in the tests. For example, `DataSetMonitorIT` uses the AWS API to create and manipulate the S3 buckets that it uses during the tests, removing them at the end of each test case. This test code uses the AWS Java API's `DefaultAWSCredentialsProviderChain` class, which supports multiple mechanisms for retrieving the credentials to use. Credentials must be supplied via at least one of those mechanisms, or the ITs will fail.

(The project's Jenkinsfile uses the environment variable mechanism, which the project's main Jenkins server has been configured to support.)
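For local development, one of the mechanisms the chain checks is the shared credentials file at `~/.aws/credentials`. A minimal sketch of setting that up (the key values below are placeholders, not real credentials):

```shell
#!/bin/sh
# Sketch: create the shared credentials file that the AWS credential
# provider chain can read. The key values below are placeholders only.
mkdir -p "$HOME/.aws"
cat > "$HOME/.aws/credentials" <<'EOF'
[default]
aws_access_key_id = EXAMPLE_KEY_ID
aws_secret_access_key = EXAMPLE_SECRET
EOF
```

Environment variables (`AWS_ACCESS_KEY_ID`/`AWS_SECRET_ACCESS_KEY`) take precedence over the file, which is why the Jenkins setup described above works without one.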

## Running the Benchmarks

This project's `bluebutton-data-pipeline-benchmarks` module contains the `S3ToFhirLoadAppBenchmark` class, which runs a series of (time-consuming) benchmarks of this application's performance. See the Design Decisions document for a discussion of how these benchmarks were designed.

To run the benchmarks, a number of additional parameters have to be included in the typical Maven build command, as follows:

```
$ export AWS_ACCESS_KEY_ID='foo'
$ export AWS_SECRET_ACCESS_KEY='bar'
$ mvn clean verify -DskipBenchmarks=false -Dec2KeyName=fizz -Dec2KeyFile=/somedir/buzz.pem
```

The ec2KeyName variable must identify the name of the AWS EC2 keypair entry that the benchmark systems should be created with. The ec2KeyFile variable must point to the local path to the private key PEM file for that keypair.

In addition, you may optionally specify a path to a different data collection file with an argument such as `-DbenchmarkDataFile=somedir/benchmark-data.csv`. Builds on a build server should do this to ensure that benchmark runs from all builds write to a central location.
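Putting the pieces together, a full benchmark invocation with a custom data file might look like the following (the key names, paths, and file names are the same illustrative placeholder values used above):

```
$ export AWS_ACCESS_KEY_ID='foo'
$ export AWS_SECRET_ACCESS_KEY='bar'
$ mvn clean verify -DskipBenchmarks=false \
    -Dec2KeyName=fizz \
    -Dec2KeyFile=/somedir/buzz.pem \
    -DbenchmarkDataFile=somedir/benchmark-data.csv
```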