How to optimize a utility function that contains a step function?

Operations Research Asked on January 5, 2022

I have an optimization problem with an uncommon utility: find a $\beta$ that maximizes

$$r^{T} \cdot H(X \cdot \beta)$$

where $H(\cdot)$ is the Heaviside step function,

$r$ is a vector of size 1000

$X$ is a 1000×50 "tall" matrix

$\beta$ is a vector of size 50.

I am familiar with gradient descent, which is how I usually solve an optimization problem. But the Heaviside function does not work with gradient descent: it is piecewise constant, so its gradient is zero almost everywhere. I am wondering if anyone here could shed some light on how to solve such an optimization problem.
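To make the difficulty concrete, here is a minimal sketch (with toy data standing in for the question's $r$, $X$, and $\beta$) showing that the objective is flat almost everywhere, so its gradient carries no information:

```python
import numpy as np

rng = np.random.default_rng(0)  # toy data, not from the question
n, p = 1000, 50
r = rng.random(n)
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p)

def utility(beta):
    # r^T . H(X @ beta), using the convention H(0) = 1.
    return r @ (X @ beta >= 0)

# A tiny perturbation almost never flips a sign, so the objective
# is unchanged -- a finite-difference "gradient" is exactly zero.
u0 = utility(beta)
u1 = utility(beta + 1e-12)
print(u0, u1)
```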

One Answer

You can solve the problem via integer linear programming as follows, assuming $r_i \ge 0$ for all $i$. Let $M_i$ be a (small) upper bound on $-(X \cdot \beta)_i$. Let binary decision variable $y_i$ indicate whether $(X \cdot \beta)_i \ge 0$. The problem is to maximize $$\sum_{i=1}^{1000} r_i y_i$$ subject to $$-(X \cdot \beta)_i \le M_i(1 - y_i)$$ for all $i$. This "big-M" constraint enforces $y_i = 1 \implies (X \cdot \beta)_i \ge 0$.
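A sketch of this big-M formulation using `scipy.optimize.milp` (SciPy ≥ 1.9). The box $|\beta_j| \le 1$ and the toy data are assumptions added so that finite $M_i$ exist; with that box, $M_i = \sum_j |X_{ij}|$ is a valid upper bound on $-(X\beta)_i$:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

rng = np.random.default_rng(0)  # toy data standing in for the question's r and X
n, p = 100, 5                   # smaller than 1000x50 to keep the demo fast
r = rng.random(n)               # r_i >= 0, as the answer assumes
X = rng.standard_normal((n, p))

# Assumed box |beta_j| <= 1, giving valid big-M values M_i >= -(X beta)_i.
M = np.abs(X).sum(axis=1)

# Decision vector z = [beta (continuous), y (binary)].
c = np.concatenate([np.zeros(p), -r])  # milp minimizes, so negate r

# Big-M constraint rearranged: -(X beta)_i + M_i y_i <= M_i
A = np.hstack([-X, np.diag(M)])
cons = LinearConstraint(A, -np.inf, M)

integrality = np.concatenate([np.zeros(p), np.ones(n)])  # beta continuous, y binary
bounds = Bounds(np.concatenate([-np.ones(p), np.zeros(n)]),
                np.concatenate([np.ones(p), np.ones(n)]))

res = milp(c=c, constraints=cons, integrality=integrality, bounds=bounds)
beta_opt = res.x[:p]
print("optimal utility:", -res.fun)
```

At the optimum, each $y_i = 1$ certifies $(X\beta)_i \ge 0$, so $-$`res.fun` recovers the original utility $r^T H(X\beta)$.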

Answered by RobPratt on January 5, 2022
