Thank you for your response. I was not clear before on what you meant by global and local importance, but after running some experiments myself it is much clearer now.
I still have a few more questions about SHAP values. When I use them to interpret an individual row's prediction, it seems I can read each SHAP value as a delta from the expected value, i.e. a SHAP value of +0.5 corresponds to +0.5 on the original output y.
Following on from this assumption, can I simply take the absolute SHAP value of each feature, normalize across all features so they sum to 1, and interpret the result as their relative importance? A rough sketch of what I have in mind is below.
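
To make the question concrete, here is a minimal sketch of what I mean, using toy data, a RandomForestRegressor, and shap.TreeExplainer purely as placeholders (and assuming a recent shap version where calling the explainer returns an Explanation object):

```python
import numpy as np
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Toy regression setup, just for illustration
X, y = make_regression(n_samples=200, n_features=5, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)
explanation = explainer(X)              # Explanation object with .values and .base_values

row = 0
sv = explanation.values[row]            # one SHAP value per feature for this row
base = explanation.base_values[row]     # the expected value for this row

# The additivity property I'm relying on: expected value + sum of SHAP values
# should reproduce the model's prediction for that row.
print(base + sv.sum(), model.predict(X[[row]])[0])

# What I'd like to do: take the absolute SHAP values for the row and
# normalize them so they sum to 1, then read them as relative importance.
abs_sv = np.abs(sv)
relative_importance = abs_sv / abs_sv.sum()
print(relative_importance)              # sums to 1 across the features
```

Is `relative_importance` here a reasonable way to express each feature's relative contribution to this single prediction?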
