make_objective_churn

empulse.metrics.make_objective_churn(model, *, accept_rate=0.3, clv=200, incentive_fraction=0.05, contact_cost=15)

Create an objective function for the Expected Cost measure for customer churn.

The objective function presumes a situation where identified churners are contacted and offered an incentive to remain customers. Only a fraction of churners accepts the incentive offer. For detailed information, consult the paper [1].

Read more in the User Guide.

See also

B2BoostClassifier : Uses the instance-specific cost function as its objective function.

Parameters:
model : {‘xgboost’, ‘lightgbm’, ‘catboost’}

The model for which the objective function is created.

accept_rate : float, default=0.3

Probability of a customer responding to the retention offer (0 < accept_rate < 1).

clv : float or 1D array of shape (n_samples,), default=200

If float: constant customer lifetime value per retained customer (clv > incentive_cost). If array: individualized customer lifetime value of each customer when retained (mean(clv) > incentive_cost); see the example after this parameter list.

incentive_fraction : float, default=0.05

Cost of incentive offered to a customer, as a fraction of customer lifetime value (0 < incentive_fraction < 1).

contact_cost : float, default=15

Constant cost of contact (contact_cost > 0).
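
For example, a minimal sketch of building the objective with an individualized clv array (the array values and parameter choices below are purely illustrative; the array length must match the number of training samples):

import numpy as np
from empulse.metrics import make_objective_churn

# Hypothetical per-customer lifetime values (one entry per training sample)
clv = np.array([150.0, 320.0, 80.0, 410.0])

objective = make_objective_churn(
    model='xgboost',
    clv=clv,                   # individualized CLV per retained customer
    accept_rate=0.3,           # probability a churner accepts the offer
    incentive_fraction=0.05,   # incentive cost as a fraction of CLV
    contact_cost=15,           # fixed cost of contacting a customer
)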

Returns:
objective : Callable

A custom objective function for the chosen model (e.g., xgboost.XGBClassifier).

Notes

The instance-specific cost function for customer churn is defined as [1]:

\[C(s_i) = y_i[s_i(f-\gamma (1-\delta )CLV_i)] + (1-y_i)[s_i(\delta CLV_i + f)]\]

where s_i is the predicted churn probability of customer i, y_i the true label (1 = churn), CLV_i the customer lifetime value, f the contact cost (contact_cost), γ the probability that a contacted churner accepts the offer (accept_rate), and δ the incentive as a fraction of the customer lifetime value (incentive_fraction).

In the original formulation of the measure, the churn class is encoded as 0 and the classes are not interchangeable. This implementation, however, assumes the standard notation (‘churn’: 1, ‘no churn’: 0).
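
For example, plugging in the default parameter values (f = 15, γ = 0.3, δ = 0.05, CLV_i = 200) gives the per-customer contributions

\[y_i = 1: \quad s_i(15 - 0.3 \cdot 0.95 \cdot 200) = -42\,s_i \qquad\qquad y_i = 0: \quad s_i(0.05 \cdot 200 + 15) = 25\,s_i\]

so increasing the score s_i of an actual churner lowers the expected cost by 42 per unit of score (contacting them pays off), while increasing it for a non-churner adds a cost of 25 per unit of score.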

References

[1]

Janssens, B., Bogaert, M., Bagué, A., & Van den Poel, D. (2022). B2Boost: Instance-dependent profit-driven modelling of B2B churn. Annals of Operations Research, 1-27.

Examples

from xgboost import XGBClassifier
from empulse.metrics import make_objective_churn

objective = make_objective_churn(model='xgboost')
clf = XGBClassifier(objective=objective, n_estimators=100, max_depth=3)
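
A minimal end-to-end sketch, using synthetic data only as a stand-in for a real churn dataset (churn encoded as 1):

from sklearn.datasets import make_classification
from xgboost import XGBClassifier
from empulse.metrics import make_objective_churn

# Toy binary dataset standing in for churn data ('churn': 1, 'no churn': 0)
X, y = make_classification(n_samples=1000, random_state=42)

objective = make_objective_churn(model='xgboost')
clf = XGBClassifier(objective=objective, n_estimators=100, max_depth=3)
clf.fit(X, y)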