shapr: Prediction Explanation with Dependence-Aware Shapley Values

Complex machine learning models are often hard to interpret. However, in many situations it is crucial to understand and explain why a model made a specific prediction. Shapley values constitute the only prediction explanation framework with a solid theoretical foundation. Previously known methods for estimating Shapley values, however, assume feature independence. This package implements the method described in Aas, Jullum and Løland (2019) <doi:10.48550/arXiv.1903.10464>, which accounts for any feature dependence and thereby produces more accurate estimates of the true Shapley values.
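A minimal usage sketch, based on the 0.2.x API documented in the package README and vignette (shapr() builds an explainer from training data and a fitted model; explain() computes Shapley values for a chosen dependence-aware approach such as "empirical" or "gaussian"); the feature selection here is illustrative:

```r
library(shapr)

# Boston housing data from MASS (listed under Suggests)
data("Boston", package = "MASS")
x_var <- c("lstat", "rm", "dis", "indus")

x_train <- Boston[-(1:6), x_var]
x_test  <- Boston[1:6, x_var]

# Any natively supported model class works; a linear model keeps this minimal
model <- lm(medv ~ lstat + rm + dis + indus, data = Boston[-(1:6), ])

# Prepare the explainer from the training data and the model
explainer <- shapr(x_train, model)

# Shapley values sum to the difference between the prediction and
# prediction_zero, here the mean training response
p0 <- mean(Boston[-(1:6), "medv"])

# "empirical" is the dependence-aware default; "gaussian", "copula"
# and "ctree" are alternatives
explanation <- explain(
  x_test,
  approach = "empirical",
  explainer = explainer,
  prediction_zero = p0
)

print(explanation$dt)
```

The key difference from independence-based estimators is the approach argument, which controls how the conditional feature distributions are modelled when features are dependent.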

Version: 0.2.2
Depends: R (≥ 3.5.0)
Imports: stats, data.table, Rcpp (≥ 0.12.15), condMVNorm, mvnfast, Matrix
LinkingTo: RcppArmadillo, Rcpp
Suggests: ranger, xgboost, mgcv, testthat, knitr, rmarkdown, roxygen2, MASS, ggplot2, caret, gbm, party, partykit
Published: 2023-05-04
DOI: 10.32614/CRAN.package.shapr
Author: Nikolai Sellereite [aut], Martin Jullum [cre, aut], Annabelle Redelmeier [aut], Anders Løland [ctb], Jens Christian Wahl [ctb], Camilla Lingjærde [ctb], Norsk Regnesentral [cph, fnd]
Maintainer: Martin Jullum <Martin.Jullum at>
License: MIT + file LICENSE
NeedsCompilation: yes
Language: en-US
Materials: README NEWS
In views: MachineLearning
CRAN checks: shapr results


Reference manual: shapr.pdf
Vignettes: 'shapr': Explaining individual machine learning predictions with Shapley values


Package source: shapr_0.2.2.tar.gz
Windows binaries: r-devel:, r-release:, r-oldrel:
macOS binaries: r-release (arm64): shapr_0.2.2.tgz, r-oldrel (arm64): shapr_0.2.2.tgz, r-release (x86_64): shapr_0.2.2.tgz, r-oldrel (x86_64): shapr_0.2.2.tgz
Old sources: shapr archive

Reverse dependencies:

Reverse imports: PPtreeregViz

