
Of course, inverse trig functions like to behave badly, so this might not work. Instead, you could think back to calculus and recall that $\sin$ is a continuous, and indeed differentiable, function. A differentiable function is well approximated by a linear function near any given point. So you could compute the derivative of your function and assume that your data is locally approximated by many linear functions.

Another way is to note that $\sin$ has a Taylor series expansion, and that sufficiently many terms give a polynomial that is quite close to $\sin$ on some domain. So you could perform an $n$-term Taylor series expansion and do ordinary linear polynomial regression on the result.

If you think your function is a series of sines, you could write a Fourier series expansion and perform a least-squares fit on the Fourier series coefficients. (Actually, this might not work without pre-multiplying each sine term by, say, a Gaussian or a top-hat function.) If that fails, you could configure a neural network to give you the result as a series of sines.
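
To make the Fourier-coefficient idea concrete, here is a minimal sketch, assuming the base period of the series is known so the problem stays linear in the coefficients; the function name `fit_fourier` and the synthetic data are purely illustrative:

```python
import numpy as np

def fit_fourier(x, y, n_terms, period):
    """Least-squares fit of Fourier coefficients for a known period.

    Builds a design matrix of sine/cosine columns and solves the
    resulting *linear* least-squares problem, so no iterative search
    over nonlinear parameters is needed.
    """
    omega = 2 * np.pi / period
    cols = [np.ones_like(x)]  # constant term
    for k in range(1, n_terms + 1):
        cols.append(np.sin(k * omega * x))
        cols.append(np.cos(k * omega * x))
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs, A @ coeffs  # coefficients and fitted values

# Example: noisy sum of two sines with a shared period of 2*pi
x = np.linspace(0, 4 * np.pi, 200)
y = 1.5 * np.sin(x) + 0.5 * np.sin(3 * x) + 0.1 * np.random.randn(x.size)
coeffs, y_hat = fit_fourier(x, y, n_terms=4, period=2 * np.pi)
```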

Essentially, all regression problems are search problems: one searches for the parameters that best shape the target function. Consequently, any search algorithm will work, but not all will work well. For example, you could use an evolutionary algorithm to perform a stochastic search in the parameter space, using the cost function as your survival criterion.

Finally, you could employ any number of non-linear least-squares algorithms to estimate the parameters of your fit. Levenberg-Marquardt is a commonly used algorithm for this kind of problem; it is a gradient-based search, and each step effectively performs a local linearization of the regression function. More directly, given $m$ data points $(x_i, y_i)$ and a model with $n$ parameters $\vec\beta = (\beta_1, \ldots, \beta_n)$, the Gauss-Newton algorithm deals with exactly this type of problem.
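
As a rough illustration of the Gauss-Newton idea, the sketch below repeatedly updates $\vec\beta \leftarrow \vec\beta + \Delta$, where $\Delta$ solves the linearized least-squares problem $J\,\Delta \approx -r$; the model $y = \beta_1 \sin(\beta_2 x)$, the data, and all names here are hypothetical stand-ins:

```python
import numpy as np

def gauss_newton(residual, jacobian, beta0, n_iter=50, tol=1e-10):
    """Minimal Gauss-Newton iteration for nonlinear least squares.

    `residual(beta)` returns the m-vector r_i = f(x_i, beta) - y_i and
    `jacobian(beta)` its m-by-n Jacobian with respect to the parameters.
    """
    beta = np.asarray(beta0, dtype=float)
    for _ in range(n_iter):
        r = residual(beta)
        J = jacobian(beta)
        # Gauss-Newton step: solve the linearized problem J @ step = -r
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        beta = beta + step
        if np.linalg.norm(step) < tol:
            break
    return beta

# Example: fit y = beta1 * sin(beta2 * x) to synthetic data
x = np.linspace(0, 2 * np.pi, 100)
y = 2.0 * np.sin(1.3 * x) + 0.05 * np.random.randn(x.size)
resid = lambda b: b[0] * np.sin(b[1] * x) - y
jac = lambda b: np.column_stack([np.sin(b[1] * x), b[0] * x * np.cos(b[1] * x)])
beta_hat = gauss_newton(resid, jac, beta0=[1.0, 1.0])
```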

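In practice you would usually lean on a library routine rather than hand-rolling the iteration. Assuming SciPy is available, `scipy.optimize.curve_fit` (which uses a Levenberg-Marquardt routine for unconstrained problems) is one way to do the same kind of fit; the single-sine model and starting guess below are again only illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, omega, phi):
    # Single-sine model; extend with more terms for a series of sines.
    return a * np.sin(omega * x + phi)

x = np.linspace(0, 2 * np.pi, 100)
y = 2.0 * np.sin(1.3 * x + 0.4) + 0.05 * np.random.randn(x.size)

# A reasonable initial guess matters: the cost surface has many local minima.
params, cov = curve_fit(model, x, y, p0=[1.0, 1.0, 0.0])
```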