Thinking about #machinelearning? It helps to understand a few numerical computation concepts that affect #ML algorithms.

You might not interact with these directly, but you can certainly feel their effects. The things to think about are:

1. Overflow and underflow - think of them as rounding (up or down) errors that shift function values just enough; compounded across iterations, they can be devastating. They also make it easy to end up dividing by zero. (See the softmax sketch after this list.)

2. Poor conditioning - essentially, how much the output can move for a small change in the input data. You want this to be small. (In cryptography you want the opposite: small input changes producing large output changes.) The condition-number sketch below shows the effect.

3. Gradient-based optimization - there will be some optimization happening in the algorithm; the question is how it handles the various critical points on the curve: local minima, saddle points, and local maxima. Generally speaking, it is about optimizing over continuous spaces (see the gradient-descent sketch below).

Some algorithms take this a step further by measuring a second derivative (think of it as the derivative of a derivative - the curvature of the function); a curvature sketch also follows the list.

4. Constrained optimization - sometimes we only want to search over a subset of points, so we impose constraints that keep the solution in that set (see the projection sketch below).
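
On (1), here is a minimal NumPy sketch (function names are my own) of overflow turning a softmax into NaNs, and the usual max-subtraction fix:

```python
import numpy as np

def softmax_naive(x):
    # exp() overflows to inf for large inputs, which turns the ratio into NaN
    e = np.exp(x)
    return e / e.sum()

def softmax_stable(x):
    # subtracting the max keeps exp() in a safe range without changing the result
    e = np.exp(x - np.max(x))
    return e / e.sum()

x = np.array([1000.0, 1001.0, 1002.0])
print(softmax_naive(x))   # [nan nan nan] after an overflow warning
print(softmax_stable(x))  # [0.09003057 0.24472847 0.66524096]
```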
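
On (2), a quick sketch of poor conditioning: a nearly singular 2x2 system where a tiny nudge to the input moves the solution dramatically.

```python
import numpy as np

# The rows of A are nearly parallel, so the system is poorly conditioned.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
b = np.array([2.0, 2.0001])

x = np.linalg.solve(A, b)
x_nudged = np.linalg.solve(A, b + np.array([0.0, 0.0001]))  # tiny change in the input

print(np.linalg.cond(A))  # ~4e4: a large condition number
print(x)                  # [1. 1.]
print(x_nudged)           # roughly [0. 2.]: a 1e-4 input change moved the output by ~1
```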
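
On (3), a rough sketch of plain gradient descent on f(x, y) = x^2 - y^2, which has a saddle point at the origin (a minimum along x, a maximum along y):

```python
import numpy as np

def grad_f(p):
    # gradient of f(x, y) = x^2 - y^2
    x, y = p
    return np.array([2.0 * x, -2.0 * y])

def gradient_descent(p, lr=0.1, steps=200):
    for _ in range(steps):
        p = p - lr * grad_f(p)
    return p

print(gradient_descent(np.array([1.0, 0.0])))   # converges to the saddle [0, 0] and stays there
print(gradient_descent(np.array([1.0, 1e-8])))  # a tiny nudge off the axis and it escapes along y
```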
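
On the second-derivative point, one common use of curvature (my example, not the only approach) is checking the signs of the Hessian's eigenvalues to classify a critical point:

```python
import numpy as np

# For f(x, y) = x^2 - y^2 the Hessian (matrix of second derivatives) is constant:
H = np.array([[2.0,  0.0],
              [0.0, -2.0]])

eigvals = np.linalg.eigvalsh(H)
print(eigvals)  # [-2.  2.]

# All positive eigenvalues -> local minimum
# All negative eigenvalues -> local maximum
# Mixed signs              -> saddle point (this case)
if np.all(eigvals > 0):
    print("local minimum")
elif np.all(eigvals < 0):
    print("local maximum")
else:
    print("saddle point")
```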
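
On (4), a small projected-gradient-descent sketch (one common way to handle constraints, chosen here purely as an illustration): minimize the distance to a target point while staying inside the unit ball.

```python
import numpy as np

c = np.array([3.0, 4.0])  # unconstrained minimizer of f(x) = ||x - c||^2

def grad(x):
    return 2.0 * (x - c)

def project_to_unit_ball(x):
    # If a step leaves the feasible set ||x|| <= 1, pull it back to the boundary.
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

x = np.zeros(2)
for _ in range(100):
    x = project_to_unit_ball(x - 0.1 * grad(x))  # gradient step, then projection

print(x)  # ~[0.6 0.8]: the closest feasible point to c
```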

All of these come into play in some way, directly or indirectly, and having a basic understanding of them, and some guardrails around them, will go a long way.