Nok Chan
1 min read · Oct 1, 2018

Thank you for your response. I was not clear before about what you meant by global and local importance, but after running some experiments myself it is much clearer.

I still have a question about SHAP values. When I use them to interpret an individual row's prediction, it seems I can read each SHAP value as a delta from the expected (base) value, i.e. a SHAP value of +0.5 corresponds to +0.5 on the model's output y.

Following from this assumption, can I simply sum the absolute SHAP values of each feature, normalize across all features (so they sum to 1), and interpret the result as their relative importance?

Example
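Here is a minimal sketch of what I have in mind, assuming a scikit-learn regressor and shap's TreeExplainer; the data, model, and variable names are only illustrative.

```python
# A minimal sketch, assuming a fitted scikit-learn regressor and the shap
# package; the data, model, and variable names are only illustrative.
import numpy as np
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=200, n_features=5, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # shape: (n_samples, n_features)

# Additivity for one row: base value + sum of its SHAP values ~= the prediction,
# which is why a SHAP value of +0.5 shifts the output by +0.5.
row = 0
base = float(np.ravel(explainer.expected_value)[0])
print(base + shap_values[row].sum())     # reconstructed prediction
print(model.predict(X[row:row + 1])[0])  # actual prediction

# Proposed "relative importance": mean absolute SHAP value per feature,
# normalized so the importances sum to 1 across features.
mean_abs = np.abs(shap_values).mean(axis=0)
relative_importance = mean_abs / mean_abs.sum()
print(relative_importance)
```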
