Changelog#
Unreleased#
…
0.9.0 (15-06-2025)#
- Feature Added optimal_threshold and optimal_rate methods to calculate the optimal threshold(s) and optimal predicted positive rate for a given metric. This is useful for determining the best decision threshold and predicted positive rate for a cost-sensitive or value-driven model (see the sketch after this list).
- Feature CSTreeClassifier, CSForestClassifier, and CSBaggingClassifier can now take a Metric instance as their criterion to optimize.
- Feature CSThresholdClassifier can now take a Metric instance to choose the optimal decision threshold.
- Feature RobustCSClassifier can now take estimators with a Metric instance as the loss function or criterion. RobustCSClassifier will treat any cost marked as outlier-sensitive accordingly; costs can be marked with the mark_outlier_sensitive method.
- Feature Allow savings metrics to be used as the objective function in CSBoostClassifier and CSLogitClassifier. Internally, the expected cost loss is used to train the model, since the expected savings score is just a transformation of the expected cost loss.
- API Change The kind argument of Metric has been replaced by strategy: the Metric class now takes a MetricStrategy instance (for example the MaxProfit strategy, formerly kind='max profit'). This change allows for more flexibility in defining the metric strategy.
- Fix Fix an error when importing Empulse without any optional dependencies installed.
- Fix Fix CSLogitClassifier not properly using the gradient when using a custom loss function from Metric.
- Fix Fix models throwing errors when differently shaped costs are passed to the fit or predict method.
- Fix Fix sympy distribution parameters not being properly translated to scipy distribution parameters when using the MaxProfit strategy (formerly kind='max profit') with the quasi-Monte Carlo integration method.
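A minimal sketch of the strategy-based Metric API together with the new optimal_threshold and optimal_rate methods. Only the names Metric, MetricStrategy, MaxProfit, optimal_threshold, and optimal_rate come from this changelog; the cost-benefit builder methods, import paths, and call signatures below are assumptions for illustration, so check the API reference for the exact interface.

    # Hypothetical sketch of the strategy-based Metric API introduced in 0.9.0.
    # The builder methods (add_tp_benefit, add_fp_cost), import paths, and call
    # signatures are assumptions for illustration only.
    import numpy as np
    import sympy as sp
    from empulse.metrics import Metric, MaxProfit  # import path assumed

    clv, contact_cost = sp.symbols("clv contact_cost")

    # Metric now takes a MetricStrategy instance instead of kind='max profit'.
    with Metric(MaxProfit()) as profit:
        profit.add_tp_benefit(clv)            # benefit of correctly targeting a churner
        profit.add_fp_cost(contact_cost)      # cost of contacting a non-churner

    y_true = np.array([0, 1, 1, 0, 1])
    y_proba = np.array([0.1, 0.8, 0.6, 0.3, 0.9])

    score = profit(y_true, y_proba, clv=200, contact_cost=10)
    threshold = profit.optimal_threshold(y_true, y_proba, clv=200, contact_cost=10)
    rate = profit.optimal_rate(y_true, y_proba, clv=200, contact_cost=10)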
0.8.0 (01-06-2025)#
- Feature CSBoostClassifier, CSLogitClassifier, and ProfLogitClassifier can now take a Metric instance as their loss function (see the sketch after this list). Internally, the metric instance is converted to the appropriate loss function for the model. For more information, read the User Guide.
- Feature Type hints are now available for all functions and classes.
- Enhancement Add support for more than one stochastic variable when building maximum profit metrics with Metric.
- Enhancement Allow Metric to be used as a context manager. This ensures the metric is always built after defining the cost-benefit elements.
- Fix Fix datasets not being properly packaged with the library.
- Fix Fix RobustCSClassifier when array-like parameters are passed to the fit method.
- Fix Fix boosting models being biased towards the positive class.
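A hedged sketch of passing a Metric instance as a classifier's loss function and of using Metric as a context manager. The changelog only states that these classifiers accept a Metric instance; the loss parameter name, the cost-benefit builder methods, and the import paths below are assumptions for illustration.

    # Hypothetical sketch: a Metric built in a context manager used as the loss
    # of CSLogitClassifier. The loss= parameter name, builder methods, and import
    # paths are assumptions; kind='max profit' was later replaced by strategy in 0.9.0.
    import numpy as np
    import sympy as sp
    from empulse.metrics import Metric
    from empulse.models import CSLogitClassifier  # import path assumed

    clv = sp.symbols("clv")

    with Metric(kind='max profit') as profit:  # metric is built when the block exits
        profit.add_tp_benefit(clv)             # benefit of retaining a churner
        profit.add_fp_cost(10)                 # fixed cost of contacting a non-churner

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = rng.integers(0, 2, size=200)

    model = CSLogitClassifier(loss=profit)     # converted internally to a loss function
    model.fit(X, y, clv=200)                   # metric parameters passed at fit time (assumed)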
0.7.0 (05-02-2025)#
- Major Feature Add CSTreeClassifier, CSForestClassifier, and CSBaggingClassifier to support cost-sensitive decision tree and ensemble models (see the sketch after this list).
- Enhancement Add support for scikit-learn 1.5.2 (previously Empulse only supported scikit-learn 1.6.0 and above).
- API Change Removed the emp_score and emp functions from the metrics module. Use the Metric class instead to define custom expected maximum profit measures. For more information, read the User Guide.
- API Change Removed numba as a dependency of Empulse. This reduces the installation time and the size of the package.
- Fix Fix Metric when defining a stochastic variable with fixed values.
- Fix Fix Metric when a stochastic variable has infinite bounds.
- Fix Fix CSThresholdClassifier when the costs of predicting the positive and negative classes are equal.
- Fix Fix documentation linking issues to scikit-learn.
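A brief sketch of fitting one of the new cost-sensitive tree models with instance-dependent costs. Only the class names come from this changelog; the empulse.models import path and the fp_cost/fn_cost fit parameters are assumptions for illustration.

    # Hypothetical sketch of the cost-sensitive tree models added in 0.7.0.
    # The empulse.models import path and the fp_cost/fn_cost fit parameters
    # are assumptions for illustration.
    import numpy as np
    from empulse.models import CSTreeClassifier

    rng = np.random.default_rng(42)
    X = rng.normal(size=(500, 4))
    y = rng.integers(0, 2, size=500)

    fn_cost = rng.uniform(50, 500, size=500)  # instance-dependent cost of missing a positive
    fp_cost = 10                              # fixed cost of acting on a false positive

    model = CSTreeClassifier()
    model.fit(X, y, fn_cost=fn_cost, fp_cost=fp_cost)
    proba = model.predict_proba(X)[:, 1]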
0.6.0 (28-01-2025)#
- Major Feature Add Metric to easily build your own value-driven and cost-sensitive metrics.
- Feature Add support for LightGBM and CatBoost models in CSBoostClassifier and B2BoostClassifier.
- API Change make_objective_churn and make_objective_acquisition now take a model argument to calculate the objective for either XGBoost, LightGBM, or CatBoost models.
- API Change XGBoost is now an optional dependency, together with LightGBM and CatBoost. To install the package with XGBoost, LightGBM, and CatBoost support, use the following command:
    pip install empulse[optional]
- API Change Renamed y_pred_baseline and y_proba_baseline to baseline in savings_score and expected_savings_score (see the sketch after this list). It now accepts the following arguments:
    - If 'zero_one', the baseline model is a naive model that predicts all zeros or all ones, depending on which is better.
    - If 'prior', the baseline model is a model that predicts the prior probability of the majority or minority class, depending on which is better (not available for the savings score).
    - If array-like, the target probabilities of the baseline model.
- Feature Add parameter validation for all models and samplers.
- API Change Make all arguments of dataset loaders keyword-only.
- Fix Update the descriptions attached to each dataset to match the information found in the user guide.
- Fix Improve type hints for functions and classes.
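A hedged sketch of the baseline argument in expected_savings_score. The 'zero_one', 'prior', and array-like options come from this changelog; the exact signature and the cost parameter names (fp_cost, fn_cost) are assumptions for illustration.

    # Hypothetical sketch of the renamed baseline argument in expected_savings_score.
    # The exact signature and the fp_cost/fn_cost parameter names are assumptions.
    import numpy as np
    from empulse.metrics import expected_savings_score

    y_true = np.array([0, 1, 1, 0, 1, 0])
    y_proba = np.array([0.2, 0.9, 0.6, 0.4, 0.7, 0.1])

    # Savings relative to the better of the all-zero / all-one naive models.
    vs_naive = expected_savings_score(y_true, y_proba, baseline='zero_one', fp_cost=1, fn_cost=5)

    # Savings relative to a model that predicts the class prior.
    vs_prior = expected_savings_score(y_true, y_proba, baseline='prior', fp_cost=1, fn_cost=5)

    # Savings relative to an existing model's predicted probabilities.
    y_proba_old = np.array([0.3, 0.8, 0.4, 0.5, 0.6, 0.2])
    vs_model = expected_savings_score(y_true, y_proba, baseline=y_proba_old, fp_cost=1, fn_cost=5)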
0.5.2 (12-01-2025)#
- Feature Allow savings_score and expected_savings_score to calculate the savings score over a baseline model instead of a naive model, by setting the y_pred_baseline and y_proba_baseline parameters, respectively (see the sketch after this list).
- Enhancement Reworked the user guide documentation to better explain the usage of value-driven and cost-sensitive models, samplers, and metrics.
- API Change CSLogitClassifier and ProfLogitClassifier by default do not perform soft-thresholding on the regression coefficients. This can be enabled by setting the soft_threshold parameter to True.
- Fix Prevent division-by-zero errors in expected_cost_loss.
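A hedged sketch of computing savings over an existing baseline model with the 0.5.2-era parameters (renamed to baseline in 0.6.0). The signatures and the fp_cost/fn_cost parameter names are assumptions for illustration.

    # Hypothetical sketch of the 0.5.2 baseline parameters (renamed to baseline in 0.6.0).
    # The signatures and fp_cost/fn_cost parameter names are assumptions.
    import numpy as np
    from empulse.metrics import savings_score, expected_savings_score

    y_true = np.array([0, 1, 1, 0, 1])
    y_pred = np.array([0, 1, 0, 0, 1])              # hard predictions of the new model
    y_proba = np.array([0.2, 0.9, 0.5, 0.3, 0.8])   # probabilities of the new model

    y_pred_old = np.array([0, 0, 1, 0, 1])          # existing model used as the baseline
    y_proba_old = np.array([0.4, 0.3, 0.7, 0.2, 0.6])

    savings = savings_score(y_true, y_pred, y_pred_baseline=y_pred_old, fp_cost=1, fn_cost=5)
    expected = expected_savings_score(y_true, y_proba, y_proba_baseline=y_proba_old, fp_cost=1, fn_cost=5)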
0.5.1 (05-01-2025)#
- Fix Fixed a documentation build issue
0.5.0 (05-01-2025)#
- Major Feature Added support for Python 3.13
- Major Feature Added cost-sensitive models
- Major Feature Added cost-sensitive metrics
- Major Feature Added the empulse.datasets module
- Feature Added CostSensitiveSampler
- Enhancement Allow all cost-sensitive models and samplers to accept cost parameters during initialization
- API Change Renamed metric arguments: those expecting target scores are renamed from y_pred to y_score, and those expecting target probabilities from y_pred to y_proba (see the sketch after this list).
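A hedged sketch of the renamed metric arguments: probability-based metrics now take y_proba (and score-based metrics y_score) instead of y_pred. The expected_cost_loss signature and the cost parameter names below are assumptions for illustration.

    # Hypothetical sketch of the 0.5.0 argument renames: probability-based metrics
    # take y_proba (and score-based metrics y_score) instead of y_pred.
    # The expected_cost_loss signature and fp_cost/fn_cost names are assumptions.
    import numpy as np
    from empulse.metrics import expected_cost_loss

    y_true = np.array([0, 1, 1, 0])
    y_proba = np.array([0.1, 0.8, 0.6, 0.4])

    # Previously (assumed): expected_cost_loss(y_true, y_pred=..., ...)
    loss = expected_cost_loss(y_true, y_proba=y_proba, fp_cost=1, fn_cost=5)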