$\newcommand{\ones}{\mathbf 1}$

Algorithms for unconstrained optimization

In descent methods, the particular choice of search direction does not matter so much.
  1. True.
    Incorrect.
  2. False.
    Correct! The choice of search direction matters a great deal; for example, the Newton direction can converge in a handful of iterations on problems where the gradient direction takes hundreds.
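
A quick numerical sketch of why the direction matters (the quadratic, starting point, and tolerance here are arbitrary choices): on an ill-conditioned problem, the gradient direction needs hundreds of iterations while the Newton direction finishes immediately.

```python
import numpy as np

# Minimize f(x) = 0.5 * x^T A x (minimizer x* = 0) with two search directions.
A = np.diag([1.0, 100.0])          # condition number 100
x0 = np.array([1.0, 0.01])         # a "slow" starting point for gradient descent

# Gradient direction, with exact line search (closed form for a quadratic).
x = x0.copy()
gd_iters = 0
while np.linalg.norm(A @ x) > 1e-6:
    g = A @ x
    x = x - (g @ g) / (g @ A @ g) * g
    gd_iters += 1

# Newton direction: on a quadratic, one step lands exactly on the minimizer.
x = x0.copy()
nt_iters = 0
while np.linalg.norm(A @ x) > 1e-6:
    x = x - np.linalg.solve(A, A @ x)
    nt_iters += 1

print(gd_iters, nt_iters)          # hundreds of iterations vs. just one
```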

In descent methods, the particular choice of line search does not matter so much.
  1. True.
    Correct! In practice an inexact line search (e.g., backtracking) performs nearly as well as an exact line search.
  2. False.
    Incorrect.
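
A sketch comparing two line searches applied to the same (gradient) direction; the quadratic, parameters, and tolerance are arbitrary choices. The exact and backtracking line searches need iteration counts of the same order of magnitude.

```python
import numpy as np

A = np.diag([1.0, 30.0])                   # a mildly ill-conditioned quadratic
x0 = np.array([1.0, 1.0 / 30.0])

def exact(x, g):
    # Exact minimization over the step size (closed form for a quadratic).
    return (g @ g) / (g @ A @ g)

def backtracking(x, g, alpha=0.3, beta=0.5):
    # Standard backtracking: shrink t until the sufficient-decrease test holds.
    f = lambda z: 0.5 * z @ A @ z
    t = 1.0
    while f(x - t * g) > f(x) - alpha * t * (g @ g):
        t *= beta
    return t

def grad_descent(line_search):
    x = x0.copy()
    iters = 0
    while np.linalg.norm(A @ x) > 1e-6:
        g = A @ x
        x = x - line_search(x, g) * g
        iters += 1
    return iters

n_exact = grad_descent(exact)
n_bt = grad_descent(backtracking)
print(n_exact, n_bt)                       # same order of magnitude
```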

When the gradient descent method is started from a point near the solution, it will converge very quickly.
  1. True.
    Incorrect.
  2. False.
    Correct! Gradient descent converges only linearly, at a rate governed by the conditioning of the problem, so it can be slow even when started near the solution.
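
A sketch with arbitrary problem data: even started very close to the solution, gradient descent on an ill-conditioned quadratic still takes many iterations, because its linear rate is set by the conditioning, not by the distance to the solution.

```python
import numpy as np

A = np.diag([1.0, 1000.0])             # condition number 1000; minimizer x* = 0
x = 1e-3 * np.array([1.0, 0.001])      # already very near the solution
iters = 0
while np.linalg.norm(A @ x) > 1e-9:    # ask for modest extra accuracy
    g = A @ x
    x = x - (g @ g) / (g @ A @ g) * g  # exact line search step
    iters += 1
print(iters)                           # still thousands of iterations
```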

Newton's method with step size $h=1$ always works.
  1. True.
    Incorrect.
  2. False.
    Correct! The pure Newton step can overshoot and diverge when started far from the solution, so in general a line search (damped Newton) is required.
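
A classic one-dimensional illustration (a sketch; the test function is a standard example, chosen here for convenience): for $f(x) = \sqrt{1+x^2}$ the Newton update with $h=1$ simplifies to $x \mapsto -x^3$, which converges only from $|x_0| < 1$ and diverges otherwise.

```python
import math

def newton_step(x):
    # f(x) = sqrt(1 + x^2):  f'(x) = x / sqrt(1 + x^2),  f''(x) = (1 + x^2)^(-3/2)
    fp = x / math.sqrt(1.0 + x * x)
    fpp = (1.0 + x * x) ** (-1.5)
    return x - fp / fpp                 # algebraically equal to -x**3

x = 0.5                                 # |x0| < 1: converges to the minimizer 0
for _ in range(5):
    x = newton_step(x)
good = abs(x)

y = 1.5                                 # |y0| > 1: pure Newton diverges
for _ in range(5):
    y = newton_step(y)
bad = abs(y)

print(good, bad)                        # good is ~0, bad is astronomically large
```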

When Newton's method is started from a point near the solution, it will converge very quickly.
  1. True.
    Correct! Newton's method converges quadratically once the iterates are near the solution.
  2. False.
    Incorrect.
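
A sketch of this fast local convergence (the test function is an arbitrary choice): for $f(x) = e^x + e^{-x}$ the Newton update is $x \mapsto x - \tanh x$, and near the solution $x^\star = 0$ the error at least squares on every iteration.

```python
import math

x = 1.0
errors = []
for _ in range(4):
    x = x - math.tanh(x)    # Newton step: f'(x)/f''(x) = tanh(x) for this f
    errors.append(abs(x))
print(errors)               # each error is far smaller than the square of the last
```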

Using Newton's method to minimize $f(Ty)$, where $Ty=x$ and $T$ is nonsingular, can greatly improve the convergence speed when $T$ is chosen appropriately.
  1. True.
    Incorrect.
  2. False.
    Correct! Newton's method is affine invariant: the iterates for $\tilde f(y) = f(Ty)$ are exactly $y^{(k)} = T^{-1}x^{(k)}$, where $x^{(k)}$ are the Newton iterates for $f$, so no choice of $T$ changes the convergence.
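
A numerical check of this affine invariance (the function $f$ and matrix $T$ below are arbitrary choices for the sketch): running Newton's method on $f(Ty)$ traces exactly the points $T^{-1}x^{(k)}$.

```python
import numpy as np

def grad_f(x):                 # f(x) = sum_i exp(x_i) + 0.5 * ||x||^2
    return np.exp(x) + x

def hess_f(x):
    return np.diag(np.exp(x) + 1.0)

T = np.array([[2.0, 1.0],
              [0.0, 3.0]])     # any nonsingular matrix

x = np.array([1.0, -1.0])      # Newton iterates for f, in the x variable
y = np.linalg.solve(T, x)      # matching start for g(y) = f(Ty), in y

for _ in range(3):
    x = x - np.linalg.solve(hess_f(x), grad_f(x))
    Ty = T @ y                 # chain rule: grad g(y) = T^T grad_f(Ty),
    gy = T.T @ grad_f(Ty)      #             hess g(y) = T^T hess_f(Ty) T
    Hy = T.T @ hess_f(Ty) @ T
    y = y - np.linalg.solve(Hy, gy)

print(np.linalg.norm(T @ y - x))   # ~0: the two runs visit the same points
```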

If $f$ is self-concordant, its Hessian is Lipschitz continuous.
  1. True.
    Incorrect.
  2. False.
    Correct! For example, $f(x) = -\log x$ is self-concordant, but its Hessian $1/x^2$ is not Lipschitz continuous on $(0,\infty)$.

If the Hessian of $f$ is Lipschitz continuous, then $f$ is self-concordant.
  1. True.
    Incorrect.
  2. False.
    Correct! For example, $f(x) = \sqrt{1+x^2}$ has a bounded third derivative (so its Hessian is Lipschitz continuous), but it is not self-concordant.

Newton's method should only be used to minimize self-concordant functions.
  1. True.
    Incorrect.
  2. False.
    Correct! Self-concordance yields clean complexity guarantees, but Newton's method works well on many functions that are not self-concordant.

$f(x) = \exp x$ is self-concordant.
  1. True.
    Incorrect.
  2. False.
    Correct! The inequality $|f'''(x)| \leq 2f''(x)^{3/2}$ reads $e^x \leq 2e^{3x/2}$, which fails for $x < -2\log 2$.

$f(x) = -\log x$ is self-concordant.
  1. True.
    Correct! Here $|f'''(x)| = 2/x^3 = 2f''(x)^{3/2}$ for $x > 0$, so the self-concordance inequality holds with equality.
  2. False.
    Incorrect.
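
A numeric spot-check of the defining inequality $|f'''(x)| \leq 2f''(x)^{3/2}$ for these two examples (the sample points are arbitrary choices):

```python
import math

def sc_ratio_exp(x):
    # f(x) = exp(x):  f''(x) = f'''(x) = e^x
    return math.exp(x) / (2.0 * math.exp(x) ** 1.5)

def sc_ratio_neglog(x):
    # f(x) = -log(x), x > 0:  f''(x) = 1/x^2,  |f'''(x)| = 2/x^3
    return (2.0 / x**3) / (2.0 * (1.0 / x**2) ** 1.5)

# exp violates the inequality (ratio > 1) once x < -2 log 2 ...
print(sc_ratio_exp(-5.0))                            # about 6.1
# ... while -log x meets it with equality at every x > 0.
print(sc_ratio_neglog(0.1), sc_ratio_neglog(7.0))    # both 1.0 (up to rounding)
```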

Consider the problem of minimizing \[ f(x) = (c^Tx)^4 + \sum_{i=1}^n w_i \exp x_i, \] over $x \in \mathbf{R}^n$, where $w \succ 0$.

Newton's method would probably require fewer iterations than the gradient method, but each iteration would be much more costly.
  1. True.
    Incorrect.
  2. False.
    Correct! The Hessian $\nabla^2 f(x) = 12(c^Tx)^2cc^T + \mathbf{diag}(w_1e^{x_1},\ldots,w_ne^{x_n})$ is diagonal plus rank one, so the Newton step can be computed in $O(n)$ flops via the matrix inversion lemma, about the same cost as a gradient step.
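
A sketch of why the Newton step is cheap here (the data are random, and the Sherman-Morrison identity is used purely for illustration): the Newton system with the diagonal-plus-rank-one Hessian can be solved with a few vector operations, matching the generic dense solve.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
c = rng.standard_normal(n) / n
w = rng.uniform(0.5, 1.5, n)             # w > 0, as required
x = 0.1 * rng.standard_normal(n)

s = c @ x                                # c^T x
d = w * np.exp(x)                        # diagonal of D = diag(w_i e^{x_i})
g = 4.0 * s**3 * c + d                   # gradient of f
alpha = 12.0 * s**2                      # Hessian H = D + alpha * c c^T

# O(n) Newton step via the Sherman-Morrison formula:
Dinv_g = g / d
Dinv_c = c / d
v = Dinv_g - (alpha * (c @ Dinv_g) / (1.0 + alpha * (c @ Dinv_c))) * Dinv_c

# Compare with the generic dense O(n^3) solve:
H = np.diag(d) + alpha * np.outer(c, c)
v_dense = np.linalg.solve(H, g)
print(np.linalg.norm(v - v_dense))       # ~0: the cheap step matches
```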

Newton's method is seldom used in machine learning because
  1. common loss functions are not self-concordant
    Incorrect. While this is true, it is not the reason Newton's method isn't used.
  2. Newton's method does not work well on noisy data
    Incorrect. This statement doesn't even make sense.
  3. machine learning researchers don't really understand linear algebra
    Incorrect. It is known that at least some machine learning researchers do know linear algebra.
  4. it is generally not practical to form or store the Hessian in such problems, due to large problem size
    Correct!