Linear and logistic regression learning algorithms can also fit non-linear data, by learning curved surfaces (for regression) or curved decision boundaries (for classification).

This is done by adding extra polynomial terms (quadratic, cubic, etc.) to the dataset, computed from the existing features. With quadratic terms, for example, the approximated surfaces can be circles, ellipses, parabolas, and hyperbolas.
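As a minimal sketch of this idea (assuming scikit-learn is available; the toy dataset and feature names are made up for illustration), the snippet below expands two features into their quadratic terms and fits a logistic regression on them, so the linear classifier can learn a circular decision boundary:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LogisticRegression

# Toy 2-feature dataset: points inside the unit circle are class 1.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(500, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 1).astype(int)

# Add quadratic terms: x1, x2 -> x1, x2, x1^2, x1*x2, x2^2
poly = PolynomialFeatures(degree=2, include_bias=False)
X_poly = poly.fit_transform(X)

# A linear classifier on the expanded features yields a circular boundary.
clf = LogisticRegression().fit(X_poly, y)
print(clf.score(X_poly, y))
```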

However, (1) it is difficult to approximate very complex non-linear surfaces this way, and (2) when there are many features it becomes very computationally expensive to compute all the polynomial terms (e.g., a 100×100 grayscale image has 10,000 pixel features, which already yields about 50 million quadratic terms).
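To make limitation (2) concrete, the count of quadratic terms x_i·x_j with i ≤ j for n raw features is n(n+1)/2; a quick back-of-the-envelope check (a sketch of the arithmetic, nothing more):

```python
# Number of quadratic terms x_i * x_j (with i <= j) for n raw features.
def num_quadratic_terms(n: int) -> int:
    return n * (n + 1) // 2

# 100x100 grayscale image -> 10,000 pixel features.
print(num_quadratic_terms(10_000))  # 50005000, i.e. about 50 million terms
```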

Due to limitations (1) and (2), we usually either perform dimensionality reduction to reduce the number of features (e.g., eigenfaces for face recognition) or use more expressive non-linear approximators such as Support Vector Machines and Neural Networks.
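As an illustration of both routes (a hedged sketch with placeholder data standing in for flattened images, not a prescribed recipe), the snippet below first applies PCA before a linear model, and separately fits an RBF-kernel SVM that captures the non-linearity without any explicit polynomial expansion:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Placeholder high-dimensional data standing in for flattened 100x100 images.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10_000))
y = rng.integers(0, 2, size=200)

# Route 1: dimensionality reduction (PCA, akin to eigenfaces) + linear model.
pca_model = make_pipeline(PCA(n_components=50), LogisticRegression(max_iter=1000))
pca_model.fit(X, y)

# Route 2: a kernelized SVM models the non-linearity directly,
# avoiding the explicit quadratic expansion of 10,000 raw features.
svm_model = SVC(kernel="rbf").fit(X, y)
```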