Least squares regression is defined as the minimization of the sum of squared residuals, e.g., in CVXPY notation:

    Minimize(sum_squares(X * beta - y))
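For concreteness, here is a minimal runnable version of the standard problem (the data is randomly generated purely for illustration):

    import cvxpy as cp
    import numpy as np

    # Toy data just to make the example self-contained
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    y = rng.standard_normal(100)

    beta = cp.Variable(5)
    prob = cp.Problem(cp.Minimize(cp.sum_squares(X @ beta - y)))
    prob.solve()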
However, I'd like to propose a slight modification, so that we instead minimize

    Minimize(sum_modified_squares(X * beta, y))

where the loss is applied element-wise: sample i contributes 0 if sign(x_i' * beta) == sign(y_i), and (x_i' * beta - y_i)^2 otherwise. (Note the loss needs both X * beta and y as inputs, not just the residual, since sign agreement can't be determined from X * beta - y alone.)
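To pin down that definition, here is how I would evaluate the loss in plain NumPy (just the definition; not something a convex solver can take directly):

    import numpy as np

    def modified_squares(pred, y):
        # Zero loss where the signs agree, ordinary squared error where they differ
        mismatch = np.sign(pred) != np.sign(y)
        return np.sum(np.where(mismatch, (pred - y) ** 2, 0.0))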
Basically, I only want to penalize a sample when the sign of the prediction differs from the sign of the actual y. Is there any literature on this, or existing implementations? I'm trying to implement it in CVXPY but am not sure how to do it.
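The closest I have gotten is a sketch based on the squared hinge loss (the loss used in L2-SVMs). The exact loss above is discontinuous in beta, so as far as I can tell it is not DCP-representable; the squared hinge is convex, is zero exactly when the prediction has the correct sign, and grows quadratically otherwise. Note it penalizes the magnitude of the prediction rather than the full residual X * beta - y, so it is only a surrogate for what I wrote above:

    import cvxpy as cp
    import numpy as np

    # Same illustrative data as above
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    y = rng.standard_normal(100)

    sign_y = np.sign(y)  # computed from the data, so a constant under DCP rules
    beta = cp.Variable(5)
    margin = cp.multiply(sign_y, X @ beta)  # positive when prediction and y agree in sign

    # Squared hinge: zero when margin >= 0, quadratic in the violation otherwise.
    # Caveat: beta = 0 makes this objective exactly zero, so in practice a
    # margin of 1 (as in an L2-SVM) or a regularization term is usually added.
    prob = cp.Problem(cp.Minimize(cp.sum_squares(cp.pos(-margin))))
    prob.solve()

Is there a DCP-compliant formulation that keeps the (X * beta - y)^2 penalty on the sign-mismatched samples, or is a surrogate like the above the standard approach?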