Optimization is the process of minimizing or maximizing an objective (aka target) function. You can do this analytically, numerically (with numerical methods), or with a combination of the two, depending on your target function. Gradient descent is a numerical method for minimizing scalar-valued multivariate functions.
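As a sketch of the idea, here is gradient descent on a made-up scalar-valued function of two variables, f(x, y) = (x - 3)² + (y + 1)². The function, learning rate, and step count are illustrative choices, not prescribed values:

```python
# Gradient descent: repeatedly step opposite the gradient to shrink f.
# Example function: f(x, y) = (x - 3)^2 + (y + 1)^2, minimized at (3, -1).

def grad(point):
    """Gradient of f at (x, y): (df/dx, df/dy)."""
    x, y = point
    return (2 * (x - 3), 2 * (y + 1))

def gradient_descent(start, lr=0.1, steps=100):
    x, y = start
    for _ in range(steps):
        gx, gy = grad((x, y))
        x -= lr * gx  # move against the gradient
        y -= lr * gy
    return x, y

x, y = gradient_descent((0.0, 0.0))
# (x, y) converges toward the minimum at (3, -1)
```

Each iteration shrinks the distance to the minimum by a constant factor here, since the objective is a simple quadratic bowl; for less well-behaved functions, the learning rate matters much more.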

Regression is when you have some input-output pairs and you want to find the function (within a class of functions) that best fits them. LMS (least mean squares) is a common way to define what “best fits” means when you are doing regression: pick the function that minimizes the mean of the squared errors.
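For the simplest case, fitting a line y = a·x + b, the squared-error criterion has a closed-form solution. A minimal sketch, with made-up data points for illustration:

```python
# Least-squares line fit: choose slope a and intercept b minimizing
# the sum (equivalently, mean) of squared errors over the data.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form solution: a = cov(x, y) / var(x), b = mean_y - a * mean_x
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # these points lie exactly on y = 2x + 1
a, b = fit_line(xs, ys)
# recovers a ≈ 2, b ≈ 1
```

When the model class is richer than a line, or there is no closed form, you can minimize the same squared-error objective numerically, which is where gradient descent from the previous paragraph comes in.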

Machine learning is a field with many methods for machines to “learn” (improve with experience). One of its subfields is supervised learning, also known as function approximation. Regression is one method of function approximation.