Linear regression attempts to predict the value of a dependent variable from the value of an independent variable. Regression curves are useful in science because they allow scientists to predict results for conditions that have not actually been tested. This is helpful because laboratory equipment sometimes lacks the capacity to measure certain values directly. For example, absolute zero has never been reached under experimental conditions; instead, it has been estimated by extrapolating a regression line.

Regression curves can take several shapes, including linear, parabolic, exponential, and sigmoid. A linear regression line is a straight line characterized by a slope and a y-intercept. The slope describes the relationship between the dependent and independent variables, while the y-intercept often carries information of its own, such as a baseline value or a systematic offset in the experiment.

The correlation coefficient, r, measures how well the data points fit the line. At least two data points are needed to define a line, but the more data points there are, the more reliable the regression line. Ideally, r would be exactly 1, meaning every point lies on the line and the line predicts the data perfectly. In practice, a value of about 0.95 is often treated as the minimum acceptable threshold; an r of 0.95 corresponds to r² of about 0.90, meaning that roughly 90% of the variation in the dependent variable is explained by the linear model. If r is too low, it is worth trying a different type of regression curve. Although the shape of the cluster of points suggests which type of curve to use, a different curve may yield a higher r and therefore better predictions.

See http://www.stat.yale.edu/Courses/1997-98/101/linreg.htm or http://illuminations.nctm.org/activitydetail.aspx?id=82
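The ideas above can be sketched in a short program: an ordinary least-squares fit produces a slope, a y-intercept, and a correlation coefficient r, and extrapolating the fitted line to zero volume gives an estimate of absolute zero, much like the historical gas-law experiments. The temperature and volume readings below are made up for illustration, not real measurements.

```python
import math

def linear_fit(xs, ys):
    """Ordinary least-squares fit y = m*x + b; also returns Pearson r."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)          # spread of x
    syy = sum((y - mean_y) ** 2 for y in ys)          # spread of y
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(xs, ys))                # co-variation
    m = sxy / sxx                                     # slope
    b = mean_y - m * mean_x                           # y-intercept
    r = sxy / math.sqrt(sxx * syy)                    # correlation coefficient
    return m, b, r

# Hypothetical gas-volume readings (Charles's law style): volume in mL
# versus temperature in degrees Celsius. These numbers are invented.
temps = [0.0, 25.0, 50.0, 75.0, 100.0]
vols = [27.32, 29.80, 32.33, 34.80, 37.31]

m, b, r = linear_fit(temps, vols)

# Extrapolate the line back to zero volume: an estimate of absolute zero,
# a value the equipment could never measure directly.
t_abs_zero = -b / m

print(f"slope = {m:.4f} mL/degC, intercept = {b:.2f} mL, r = {r:.5f}")
print(f"extrapolated absolute zero: {t_abs_zero:.1f} degC")
```

With this near-perfectly linear data, r comes out very close to 1 and the extrapolated temperature lands near the accepted value of -273.15 °C, illustrating how a regression line can predict a value no experiment has reached.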