Many concepts in neuroscience can be condensed into a (series of) mathematical equation(s). A prominent example is ordinary least-squares (OLS) multiple regression, whose solution relies on inverting the covariance matrix of the design:
The data (y) is modeled as the product of the design matrix (X) and the regression weights (the betas), plus a residual error term (epsilon):

y = X\beta + \varepsilon

After solving the normal equations, which enforce the condition that the sum of squared errors (SSE) is minimized, the result is:

\hat{\beta} = (X^T X)^{-1} X^T y
Of course, this is just an example! :)
W.r.t. the formulas on this page, here is the code that was used to generate the images above:
y = X\beta ~ + ~ \varepsilon
\hat{\beta} = (X^T X)^{-1} X^T y
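To make the formulas concrete, here is a minimal NumPy sketch that simulates data according to y = X\beta + \varepsilon and recovers \hat{\beta} via the normal equation. The design size, weights, and noise level are arbitrary illustrative choices, and np.linalg.lstsq is included only as the numerically safer alternative to an explicit matrix inversion.

# Minimal sketch of the formulas above: simulate y = X @ beta + eps,
# then recover beta_hat = (X^T X)^{-1} X^T y via the normal equation.
# The sizes, weights, and noise level are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_regressors = 100, 3
X = np.column_stack([np.ones(n_samples),                       # intercept column
                     rng.normal(size=(n_samples, n_regressors - 1))])
beta_true = np.array([2.0, 0.5, -1.5])                         # "ground-truth" weights
epsilon = rng.normal(scale=0.1, size=n_samples)                # residual error term
y = X @ beta_true + epsilon                                    # the data

# Closed-form OLS estimate: beta_hat = (X^T X)^{-1} X^T y
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y

# In practice, a least-squares solver is numerically preferable to an
# explicit inversion, but it yields the same estimate here.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta_hat)      # close to [2.0, 0.5, -1.5]
print(beta_lstsq)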