What does a "closed-form solution" mean?

I have come across the term "closed-form solution" quite often. What does a closed-form solution mean? How does one determine whether a closed-form solution exists for a given problem? Searching online, I found some information, but nothing in the context of developing a statistical or probabilistic model / solution. I understand regression very well, so if anyone can explain the concept with reference to regression or model-fitting, it will be easy to consume. :)

asked Sep 23, 2013 at 23:31

4 Answers

"An equation is said to be a closed-form solution if it solves a given problem in terms of functions and mathematical operations from a given generally accepted set. For example, an infinite sum would generally not be considered closed-form. However, the choice of what to call closed-form and what not is rather arbitrary since a new "closed-form" function could simply be defined in terms of the infinite sum." --Wolfram Alpha

"In mathematics, an expression is said to be a closed-form expression if it can be expressed analytically in terms of a finite number of certain "well-known" functions. Typically, these well-known functions are defined to be elementary functions—constants, one variable x, elementary operations of arithmetic (+ − × ÷), nth roots, exponent and logarithm (which thus also include trigonometric functions and inverse trigonometric functions). Often problems are said to be tractable if they can be solved in terms of a closed-form expression." -- Wikipedia

An example of a closed-form solution in linear regression would be the least-squares estimator, $\hat{\beta} = (X^{\top}X)^{-1}X^{\top}y$.
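As a concrete illustration (not from the original answer), here is a minimal NumPy sketch that evaluates that formula directly; the data, coefficients, and sample size are made up for this example:

```python
import numpy as np

# Made-up data for illustration only
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=100)])  # intercept + one regressor
y = X @ np.array([2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# Closed-form OLS estimate: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # recovers something close to [2.0, 0.5]
```

No iterative search is involved: the estimate is a single exact formula evaluated in a fixed number of operations.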

answered Sep 23, 2013 at 23:45 user25658

Considering that all regression scenarios can be cast as a problem of solving a system of equations, when would there not be a closed-form solution? An ill-posed or sparse problem will require an approximate solution, so is that the case where a closed-form solution does not exist? How about when one uses conjugate gradient descent with regularization?

Commented Sep 25, 2013 at 18:44

I found this discussion helpful - "Solving for regression parameters in closed-form vs gradient descent" link

Commented Sep 25, 2013 at 18:53

@arjsgh21 do you still need further clarification on what it means to be a closed-form solution? Your new question seems to be about when there are closed-form solutions (or not) in regression problems, which is an entirely new topic and should be asked as a new question, in my opinion.

Commented Sep 25, 2013 at 18:57

Thanks BabakP. I think I get it now, with reference to regression and also otherwise.

Commented Sep 25, 2013 at 19:06

P.S. Here you can see the closed-form solution for Ridge (L2-regularized linear regression).

Commented Mar 31 at 12:28

I think that this website provides a simple intuition, an excerpt of which is:

A closed-form solution (or closed form expression) is any formula that can be evaluated in a finite number of standard operations. A numerical solution is any approximation that can be evaluated in a finite number of standard operations. Closed form solutions and numerical solutions are similar in that they both can be evaluated with a finite number of standard operations. They differ in that a closed-form solution is exact whereas a numerical solution is only approximate.
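To make that exact-versus-approximate distinction concrete, here is a small sketch of my own (not from the linked site): the closed form for a square root versus a numerical iteration that only approximates it. The starting guess and iteration count are arbitrary:

```python
import math

# Closed-form solution of x^2 = 2: one formula, evaluated directly
exact = math.sqrt(2)

# Numerical solution: a few Newton steps for f(x) = x^2 - 2
x = 1.0  # arbitrary starting guess
for _ in range(3):
    x = x - (x * x - 2) / (2 * x)

print(exact, x)  # exact is about 1.4142136; the iterate is about 1.4142157
```

Running more iterations shrinks the gap, but the numerical answer remains an approximation produced by a stopping rule, not an exact formula.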

answered Nov 24, 2014 at 11:16 Luca Bertinetto

While only providing a link, this is definitely the most helpful answer.

Commented Dec 18, 2016 at 22:18

Wayne's inclusion of a quote from the link quite definitely improved the answer.

Commented Jul 7, 2017 at 2:24

Wikipedia adds more clarity: "Unlike the broader analytic expressions, the closed-form expressions do not include infinite series or continued fractions; neither includes integrals or limits."

Commented Apr 7 at 8:09

Most estimation procedures involve finding parameters that minimize (or maximize) some objective function. For example, with OLS, we minimize the sum of squared residuals. With Maximum Likelihood Estimation, we maximize the log-likelihood function. The difference is trivial: minimization can be converted to maximization by using the negative of the objective function.
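Since most numerical software ships minimizers rather than maximizers, that negation trick is exactly what you use in practice. A hedged sketch (the Poisson likelihood and the data are my own illustration, not part of this answer):

```python
import numpy as np
from scipy.optimize import minimize_scalar

y = np.array([3, 1, 4, 1, 5, 9, 2, 6])  # made-up count data

# Maximize the Poisson log-likelihood by minimizing its negative
# (the additive log(y!) constant is dropped; it does not affect the argmax)
neg_loglik = lambda lam: -np.sum(y * np.log(lam) - lam)
lam_hat = minimize_scalar(neg_loglik, bounds=(0.01, 20), method='bounded').x
print(lam_hat, y.mean())  # the Poisson MLE is the sample mean, so these agree
```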

Sometimes this problem can be solved algebraically, producing a closed-form solution. With OLS, you solve the system of first-order conditions and get the familiar formula (though you still probably need a computer to evaluate the answer). In other cases, this is not mathematically possible and you need to search for parameter values using a computer. In this case, the computer and the algorithm play a bigger role. Nonlinear least squares is one example: you don't get an explicit formula, only a recipe that you need a computer to implement. The recipe might be: start with an initial guess of what the parameters might be and how they might vary, then try various combinations of parameters and see which one gives you the lowest/highest objective function value. This is the brute-force approach, and it takes a long time. For example, with 5 parameters with 10 possible values each, you need to try $10^5$ combinations, and that merely puts you in the neighborhood of the right answer if you're lucky. This approach is called grid search.
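Here is what that brute-force recipe might look like in code; a toy sketch of my own, with a two-parameter least-squares problem, a deliberately coarse grid, and made-up data:

```python
import itertools
import numpy as np

# Made-up data from a line with true parameters a = 1.3, b = 2.7
rng = np.random.default_rng(1)
x = rng.normal(size=50)
y = 1.3 + 2.7 * x + rng.normal(scale=0.1, size=50)

def sse(a, b):
    """Objective: sum of squared residuals for candidate parameters."""
    return np.sum((y - (a + b * x)) ** 2)

# Brute force: try every combination on a coarse grid, keep the best
grid = np.linspace(-5, 5, 21)  # step 0.5, so the truth is not on the grid
best = min(itertools.product(grid, grid), key=lambda p: sse(*p))
print(best)  # only lands in the neighborhood of (1.3, 2.7)
```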

Or you might start with a guess, and refine that guess in some direction until the improvement in the objective function is smaller than some value. These are usually called gradient methods (though there are others that do not use the gradient to pick which direction to go in, like genetic algorithms and simulated annealing). Some problems like this guarantee that you find the right answer quickly (quadratic objective functions). Others give no such guarantee: you might worry that you've gotten stuck at a local, rather than a global, optimum, so you try a range of initial guesses. You might find that wildly different parameters give you the same value of the objective function, so you don't know which set to pick.
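A minimal version of that "start with a guess and refine it" idea, again as my own sketch: plain gradient descent on the same kind of problem, with an arbitrary step size and stopping tolerance:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=50)
y = 1.3 + 2.7 * x + rng.normal(scale=0.1, size=50)

a, b = 0.0, 0.0        # initial guess
lr, tol = 0.1, 1e-12   # step size and stopping tolerance (arbitrary choices)
prev = np.inf
while True:
    r = y - (a + b * x)          # residuals at the current guess
    mse = np.mean(r ** 2)        # objective being minimized
    if prev - mse < tol:         # stop once the improvement is negligible
        break
    prev = mse
    a += lr * 2 * np.mean(r)     # step along the negative gradient
    b += lr * 2 * np.mean(r * x)
print(a, b)  # ends near (1.3, 2.7); this objective is convex, so no local traps
```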

Here's a nice way to get the intuition. Suppose you had a simple exponential regression model where the only regressor is the intercept: $$E[y]=\exp\{\alpha\}$$

The objective function is $$Q_N(\alpha)=-\frac{1}{N}\sum_{i=1}^N \left( y_i - \exp\{\alpha\} \right)^2$$

With this simple problem, both approaches are feasible. The closed-form solution, obtained by taking the derivative and setting it to zero (which gives $\exp\{\alpha\} = \bar y$), is $\alpha^* = \ln \bar y$. You can also verify that anything else gives you a lower value of the objective function by plugging in $\ln (\bar y + k)$ instead. If you had some regressors, the analytical solution would go out the window.
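A quick numerical check of that example (my own sketch; the data and scipy's bounded minimizer are my choices, not the answer's): compare the closed form $\alpha^* = \ln \bar y$ against a computer search over the same objective.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
y = rng.exponential(scale=3.0, size=200)  # any positive made-up data works

# Closed form: maximizing Q_N means minimizing sum (y_i - exp(alpha))^2,
# and the first-order condition gives exp(alpha) = y-bar
alpha_closed = np.log(y.mean())

# Numerical search over the same objective, no formula required
obj = lambda a: np.sum((y - np.exp(a)) ** 2)
alpha_numeric = minimize_scalar(obj, bounds=(-10, 10), method='bounded').x

print(alpha_closed, alpha_numeric)  # agree to several decimal places
```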