
How to optimize a utility function that contains step function?

Operations Research Asked on January 5, 2022

I have an optimization problem with an uncommon utility: to find a $\beta$ that maximizes

$$
r^{T} \cdot H(X \cdot \beta)
$$

where $H(\cdot)$ is the Heaviside step function as defined on Wikipedia,

$r$ is a vector of size 1000

$X$ is a 1000×50 "tall" matrix

$\beta$ is a vector of size 50

I am familiar with gradient descent, which is how I usually solve optimization problems, but the Heaviside function does not work with gradient descent: the objective is piecewise constant in $\beta$, so its gradient is zero almost everywhere. I am wondering if anyone here could shed some light on how to solve such an optimization problem.
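To make the difficulty concrete, here is a minimal NumPy sketch of the objective with random toy data standing in for the real $r$ and $X$ (the data, the seed, and the convention $H(0)=1$ are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)        # toy stand-ins for the real r and X
r = rng.random(1000)                  # size-1000 vector
X = rng.standard_normal((1000, 50))   # 1000 x 50 "tall" matrix

def utility(beta):
    """r^T H(X beta), using the convention H(0) = 1."""
    return r @ np.heaviside(X @ beta, 1.0)

beta = rng.standard_normal(50)
print(utility(beta))
# The objective is piecewise constant: any perturbation of beta too small
# to flip a sign of (X beta) leaves the value unchanged, so the gradient
# is zero almost everywhere and gives gradient descent nothing to follow.
```

Note also that the value depends only on the signs of $X\beta$, so it is invariant to positive rescaling of $\beta$.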

One Answer

You can solve the problem via integer linear programming as follows, assuming $r_i \ge 0$ for all $i$. Let $M_i$ be a (small) upper bound on $-(X \cdot \beta)_i$. Let binary decision variable $y_i$ indicate whether $(X \cdot \beta)_i \ge 0$. The problem is to maximize $$\sum_{i=1}^{1000} r_i y_i$$ subject to $$-(X \cdot \beta)_i \le M_i(1 - y_i)$$ for all $i$. This "big-M" constraint enforces $y_i = 1 \implies (X \cdot \beta)_i \ge 0$.
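A sketch of this formulation using SciPy's `milp` (HiGHS) on toy data, not part of the original answer. Two assumptions beyond the answer are flagged in comments: a box bound $|\beta_j| \le B$, which is needed to get finite values of $M_i$, and a small margin $\varepsilon$, because with the convention $H(0)=1$ the constraints as stated are satisfied trivially by $\beta = 0$ with every $y_i = 1$:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

rng = np.random.default_rng(0)
n, p = 40, 5                     # toy sizes; the real problem is 1000 x 50
r = rng.random(n)                # the formulation assumes r_i >= 0
X = rng.standard_normal((n, p))

B = 1.0                          # assumed box |beta_j| <= B, needed for finite big-M
eps = 1e-4                       # assumed margin: without it, beta = 0 sets every y_i = 1
M = np.abs(X).sum(axis=1) * B + eps   # valid upper bound on eps - (X beta)_i over the box

# Decision vector z = [beta_1..beta_p, y_1..y_n]; maximize r^T y == minimize -r^T y.
c = np.concatenate([np.zeros(p), -r])

# Big-M constraint  eps - (X beta)_i <= M_i (1 - y_i),
# rewritten as      -(X beta)_i + M_i y_i <= M_i - eps.
A = np.hstack([-X, np.diag(M)])
constraints = LinearConstraint(A, -np.inf, M - eps)

integrality = np.concatenate([np.zeros(p), np.ones(n)])   # beta continuous, y binary
bounds = Bounds(np.concatenate([-B * np.ones(p), np.zeros(n)]),
                np.concatenate([B * np.ones(p), np.ones(n)]))

res = milp(c, constraints=constraints, integrality=integrality, bounds=bounds)
best = -res.fun                  # optimal value of r^T y
print(best)
```

Any MILP modeler (PuLP, Pyomo, OR-Tools, or a solver's native API) expresses the same constraints similarly; for the full 1000-row instance, tightening each $M_i$ as much as possible matters for solve time.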

Answered by RobPratt on January 5, 2022

