You can use OLS (ordinary least squares) from statsmodels, as done here:
# imports
import numpy as np
import statsmodels.api as sm
# foo, bar, and spam are assumed to be equal-length lists of predictor values
X = [foo, bar, spam]
# turn it into a NumPy array with one column per predictor
X = np.array(X).T
# add a constant (intercept) column
X = sm.add_constant(X)
This gives the input matrix X:
array([[   1.,    1.,   50.,  -10.],
       [   1.,    2.,   60.,  -20.],
       [   1.,    3.,   70.,  -30.],
       [   1.,    4.,   80.,  -40.],
       [   1.,    5.,   90.,  -50.],
       [   1.,    6.,  100.,  -60.]])
And now you can fit each desired output vector:
resFoo = sm.OLS(endog=foofoo, exog=X).fit()
resBar = sm.OLS(endog=barbar, exog=X).fit()
resSpam = sm.OLS(endog=spamspam, exog=X).fit()
resHam = sm.OLS(endog=hamham, exog=X).fit()
The result gives you the coefficients (one for the constant column, and one each for the foo, bar, and spam columns):
>>> resFoo.params
array([-0.00063323,  0.0035345 ,  0.01001583, -0.035345  ])
You can now check it against the input (np.matrix is deprecated; plain matrix multiplication does the same):
>>> X @ resFoo.params
array([0.85714286, 1.31428571, 1.77142857, 2.22857143, 2.68571429,
       3.14285714])
This is close to the desired output foofoo.
See this question for different ways to do the regression: Multiple linear regression in Python
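One of those alternatives needs no statsmodels at all: np.linalg.lstsq solves the same least-squares problem on the design matrix. A sketch with fabricated numbers:

```python
import numpy as np

# fabricated data: y is roughly x plus noise
x = np.array([1., 2., 3., 4., 5., 6.])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1, 5.9])

# design matrix with an intercept column, as with sm.add_constant
X = np.column_stack([np.ones_like(x), x])

# least-squares fit: coef[0] is the intercept, coef[1] the slope
coef, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # intercept near 0, slope near 1
```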