In mathematics, a trend is a pattern in a set of data points. Knowing the trend allows a mathematical model to predict outcomes. Estimating the trend of data uses a technique known as regression, or curve fitting.
As a conceptual example, if two plotted sets of values x and y are such that y increases as x increases, the data show a positive trend. To go beyond this qualitative description, a curve fit defines a model with explicit mathematical equations. The least squares method finds the curve with the minimum total deviation from all data points. Consider a set of data points (x1, y1), (x2, y2), ..., (xn, yn), where x is the independent variable and y is the dependent variable. The fitting curve f(x) has deviations (errors) d1 = y1 - f(x1), d2 = y2 - f(x2), ..., dn = yn - f(xn). The best-fitting curve is the one for which the sum of squared deviations d1^2 + d2^2 + ... + dn^2 is a minimum.
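The least-squares criterion above can be sketched in code for the simplest case, fitting a straight line f(x) = a*x + b. This minimal Python example uses the closed-form slope and intercept formulas; the function name and the sample data are assumptions chosen for illustration, not part of the original text.

```python
# Minimal sketch: least-squares fit of a line f(x) = a*x + b.
# The slope and intercept below are the closed-form values that
# minimize d1^2 + d2^2 + ... + dn^2, where d_i = y_i - f(x_i).

def least_squares_line(xs, ys):
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x.
    a = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) \
        / sum((x - x_mean) ** 2 for x in xs)
    # Intercept: forces the fitted line through the mean point.
    b = y_mean - a * x_mean
    return a, b

# Sample data with a roughly linear positive trend (hypothetical values).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

a, b = least_squares_line(xs, ys)
deviations = [y - (a * x + b) for x, y in zip(xs, ys)]
print(a, b, sum(d ** 2 for d in deviations))
```

Any other candidate line would give a larger sum of squared deviations on the same data, which is exactly the "best fitting curve" property stated above.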