The `force` plot type requires the user to additionally select which single data point
they want to visualize by specifying the corresponding `explanation_index`:
~~~python
insight.plot(
    "force", explanation_index=3
)  # plots the force analysis of the measurement at positional index 3
~~~

![SHAP_Force](../_static/insights/shap_force.svg)
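Since `explanation_index` refers to the positional (row-order) index of a measurement, one way to pick the right value is to look it up in the measurements table first. A minimal sketch using pandas, with a hypothetical measurements DataFrame and an illustrative `yield` column (not part of the API above):

~~~python
import pandas as pd

# Hypothetical measurements table; column names are illustrative only.
measurements = pd.DataFrame(
    {"temperature": [90, 100, 110, 120], "yield": [0.42, 0.57, 0.81, 0.66]}
)

# Positional index of the best measurement, suitable as `explanation_index`
best_label = measurements["yield"].idxmax()  # row label of the maximum
explanation_index = measurements.index.get_loc(best_label)  # positional index
print(explanation_index)  # → 2
~~~

Using `Index.get_loc` guards against the common pitfall that row labels and positional indices diverge after filtering or re-sorting the table.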
The explainer can be changed when creating the insight:
~~~python
insight = SHAPInsight.from_campaign(
    campaign, explainer_cls="KernelExplainer"
)  # default explainer
~~~

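All of these explainers approximate the same underlying quantity: the Shapley value of each feature, i.e. its marginal contribution to the prediction averaged over all feature subsets. For intuition, here is a self-contained sketch that computes exact Shapley values by brute-force subset enumeration; the toy model and inputs are purely illustrative and unrelated to any real campaign:

~~~python
import math
from itertools import combinations

def shapley_values(model, x, baseline):
    """Exact Shapley values via enumeration of all feature subsets."""
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for r in range(n):
            for subset in combinations(others, r):
                # Features in the subset take their observed value x,
                # all others are replaced by the baseline.
                with_i = [x[j] if j in subset or j == i else baseline[j] for j in range(n)]
                without_i = [x[j] if j in subset else baseline[j] for j in range(n)]
                weight = math.factorial(r) * math.factorial(n - r - 1) / math.factorial(n)
                phi[i] += weight * (model(with_i) - model(without_i))
    return phi

model = lambda z: 2 * z[0] + 3 * z[1]  # toy linear model
phi = shapley_values(model, x=[1.0, 1.0], baseline=[0.0, 0.0])
print(phi)  # → [2.0, 3.0]
~~~

For a linear model, the Shapley value of each feature reduces to its coefficient times the deviation from the baseline, which is what the printed result shows. Real explainers avoid this exponential enumeration through sampling or model-specific shortcuts.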
### Experimental and Computational Representations
In addition to SHAP-based explainers, we also support
[MAPLE](https://papers.nips.cc/paper_files/paper/2018/hash/b495ce63ede0f4efc9eec62cb947c162-Abstract.html)
variants. For example:
~~~python
insight = SHAPInsight.from_campaign(
    campaign, explainer_cls="LimeTabular", use_comp_rep=True
)
insight.plot("bar")
~~~

![SHAP_Bar_Lime](../_static/insights/shap_bar_lime.svg)
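The `bar` plot condenses the per-measurement attributions into a single importance score per feature, typically the mean absolute attribution. A plain-Python sketch of that aggregation, using a made-up attribution matrix and illustrative feature names (neither comes from the library):

~~~python
# Rows = measurements, columns = features; values are toy attributions.
attributions = [
    [0.8, -0.1, 0.3],
    [-0.5, 0.2, 0.4],
    [0.6, -0.3, -0.2],
]
feature_names = ["temperature", "concentration", "solvent"]  # illustrative

# Mean absolute attribution per feature (what a bar summary typically shows).
mean_abs = [
    sum(abs(row[j]) for row in attributions) / len(attributions)
    for j in range(len(feature_names))
]
for name, score in sorted(zip(feature_names, mean_abs), key=lambda t: -t[1]):
    print(f"{name:>13}: {score:.3f}")
~~~

Taking absolute values before averaging prevents positive and negative attributions from cancelling, so features with large but sign-varying effects still rank high.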