
I performed a binary classification using logistic regression.

My goal is the following:

I know the coefficient vector $w$ of the hyperplane equation $y = w^\top x + b$. What I would like to do is create adversarial instances by perturbing my points so that they land just past my hyperplane.

That is to say, points classified as 0 should go to 1, and those classified as 1 should go to 0.

I would like to find the minimal perturbation that achieves this, that is, the perturbation that orthogonally projects each point slightly past the hyperplane.

  • What is your threshold for classifying as $0$ or $1$? Do you just mean moving the probability from $0.49$ to $0.51$? – Dave Jun 07 '21 at 09:38

1 Answer


In logistic regression, you're assuming $p(\text{class } y\!=\!1|x) = \sigma(w^\top\!x+b)$ for some parameters $w$, $b$ that can be estimated from the data. Typically, a point $x$ is assigned to class $y\!=\!1$ if $\sigma(w^\top\!x+b)>0.5$, otherwise $y\!=\!0$. The decision boundary is the set of those $x$ where $\sigma(w^\top\!x+b)=0.5$, or simply $w^\top\!x+b=0$ (a hyperplane orthogonal to $w$, with $w^\top\!x=-b$).
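
As a minimal sketch of this decision rule (assuming NumPy; the function names here are illustrative, not from your model):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(x, w, b):
    # Class 1 iff sigma(w^T x + b) > 0.5, equivalently iff w^T x + b > 0.
    return int(sigmoid(w @ x + b) > 0.5)
```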

Given some $x$ that the model labels $y\!=\!1$ (i.e. $w^\top\!x+b >0$), it can be projected to the closest point $x'$ on the decision boundary by subtracting some amount $\lambda$ of the vector $w$, since $w$ is orthogonal to the decision boundary. That is, $\lambda$ satisfies $0=w^\top\!x'+b =w^\top\!(x\!-\!\lambda w)+b$.

Rearranging gives $\lambda = \tfrac{w^\top\!x+b}{w^\top w} \ $, so $\ x' = x\!-\!\tfrac{w^\top\!x+b}{w^\top w} w\ $ is the projection of $x$ on the decision boundary. Picking a larger $\lambda$ gives a point past the decision boundary, which is assigned class $y\!=\!0$. (Note that $\lambda = \tfrac{w^\top\!x+b}{w^\top w}$ must be positive, since $w^\top\!x+b$ is positive for the given $x$ and $w^\top\!w$ is always positive.)

So, for $x$ labelled $y\!=\!1$, the closest point $x^*$ labelled $y\!=\!0$ (by Euclidean distance) is $\ x^* = x-\lambda^* w\ $, where $\ \lambda^*=\tfrac{w^\top\!x+b}{w^\top w}+\delta\ $, and $\delta\!>\!0$ is small.
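
As a minimal sketch of this recipe (assuming NumPy; the name `adversarial_point` is illustrative, and handling both classes at once via `np.sign` is a small extension of the $y\!=\!1$ case derived above, since $\lambda$ is negative for points labelled $y\!=\!0$):

```python
import numpy as np

def adversarial_point(x, w, b, delta=1e-6):
    # Signed projection coefficient: positive if x is labelled 1,
    # negative if x is labelled 0.
    lam = (w @ x + b) / (w @ w)
    # Nudge |lambda| by delta to step slightly past the projection
    # x' = x - lam * w, so the predicted class flips.
    lam_star = lam + np.sign(lam) * delta
    return x - lam_star * w

# Toy check: w @ x + b = 2.2 > 0, so x is labelled 1.
w = np.array([2.0, -1.0]); b = 0.5
x = np.array([1.0, 0.3])
x_star = adversarial_point(x, w, b)
print(w @ x_star + b)  # tiny negative number: x* is now labelled 0
```

The perturbation $x^* - x = -\lambda^* w$ is the smallest one in Euclidean norm (up to the slack $\delta$), precisely because it moves $x$ along the normal direction $w$ of the hyperplane.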
