Speaker Details


Gabriel Rybeck

Data Scientist at Booz Allen Hamilton


Gabriel (Gabe) Rybeck is a data scientist at Booz Allen Hamilton in the DC area. Previously, he studied Computer Science and Economics at Haverford College and worked with Dr. Friedler on exploring the unintended consequences of using black-box machine learning algorithms to make important decisions, work that led to a co-authored paper entitled “Auditing black-box models for indirect influence” (http://link.springer.com/article/10.1007/s10115-017-1116-3). His current work focuses on enabling data science capabilities for the Intelligence Community using cloud services, Docker, and many other fun technologies. Outside of work, he enjoys Ultimate Frisbee and hiking.


Gabriel’s Tech Talk


4:05 PM | TRACK #3

Techniques for Interpreting Black-box Machine Learning Models

Explainable AI (XAI) has been a growing field in Data Science in response to the pervasive development and use of black-box Machine Learning models for important decisions. While some decisions are well suited to fully explainable and transparent techniques like Linear Regression and Decision Trees, others are better served by complex models like Deep Neural Networks, which are inherently more opaque. In recent years, model-agnostic techniques have allowed us to interpret the behavior of black-box models and to determine the influence of protected attributes like race and gender. We will look at three Python-based techniques for interpreting black-box Machine Learning models: SHAP, LIME, and one that we developed called BlackBoxAuditing.
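As a rough illustration of the kind of workflow the talk covers (not code from the talk itself), the minimal sketch below applies SHAP to a tree-based model; the scikit-learn dataset and model choices here are illustrative assumptions.

import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Train a "black-box" model on a small tabular dataset.
X, y = load_diabetes(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

# SHAP attributes each individual prediction to per-feature contributions.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)  # shape: (n_samples, n_features)

# The summary plot ranks features by their overall influence on the model's output.
shap.summary_plot(shap_values, X_test)

LIME and BlackBoxAuditing support analogous workflows, treating the trained model as an opaque prediction function and probing it with perturbed inputs.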