Commit 51d5bb5

langmore authored and josef-pkt committed

Updated examples

1 parent 6a66123 commit 51d5bb5

File tree

3 files changed: +53 -46 lines


README.txt (-43 lines)

@@ -1,46 +1,3 @@
-What the l1 addition is
-=======================
-A slight modification that allows l1 regularized LikelihoodModel.
-
-Regularization is handled by a fit_regularized method.
-
-Main Files
-==========
-
-l1_demo/demo.py
-$ python demo.py --get_l1_slsqp_results logit
-does a quick demo of the regularization using logistic regression.
-
-l1_demo/sklearn_compare.py
-$ python sklearn_compare.py
-Plots a comparison of regularization paths. Modify the source to use
-different datasets.
-
-statsmodels/base/l1_cvxopt.py
-fit_l1_cvxopt_cp()
-Fit likelihood model using l1 regularization. Uses the CVXOPT package.
-Lots of small functions supporting fit_l1_cvxopt_cp
-
-statsmodels/base/l1_slsqp.py
-fit_l1_slsqp()
-Fit likelihood model using l1 regularization. Uses scipy.optimize.
-Lots of small functions supporting fit_l1_slsqp
-
-statsmodels/base/l1_solvers_common.py
-Common methods used by l1 solvers
-
-statsmodels/base/model.py
-LikelihoodModel.fit()
-3 lines modified to allow for importing and calling of l1 fitting functions
-
-statsmodels/discrete/discrete_model.py
-L1MultinomialResults class
-Child of MultinomialResults
-MultinomialModel.fit()
-3 lines re-directing l1 fit results to the L1MultinomialResults class
-
-
 What Statsmodels is
 ===================
 What it is

README_l1.txt (+40 lines)

@@ -0,0 +1,40 @@
+What the l1 addition is
+=======================
+A slight modification that allows l1 regularized LikelihoodModel.
+
+Regularization is handled by a fit_regularized method.
+
+Main Files
+==========
+
+l1_demo/demo.py
+$ python demo.py --get_l1_slsqp_results logit
+does a quick demo of the regularization using logistic regression.
+
+l1_demo/sklearn_compare.py
+$ python sklearn_compare.py
+Plots a comparison of regularization paths. Modify the source to use
+different datasets.
+
+statsmodels/base/l1_cvxopt.py
+fit_l1_cvxopt_cp()
+Fit likelihood model using l1 regularization. Uses the CVXOPT package.
+Lots of small functions supporting fit_l1_cvxopt_cp
+
+statsmodels/base/l1_slsqp.py
+fit_l1_slsqp()
+Fit likelihood model using l1 regularization. Uses scipy.optimize.
+Lots of small functions supporting fit_l1_slsqp
+
+statsmodels/base/l1_solvers_common.py
+Common methods used by l1 solvers
+
+statsmodels/base/model.py
+LikelihoodModel.fit()
+3 lines modified to allow for importing and calling of l1 fitting functions
+
+statsmodels/discrete/discrete_model.py
+L1MultinomialResults class
+Child of MultinomialResults
+MultinomialModel.fit()
+3 lines re-directing l1 fit results to the L1MultinomialResults class
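The README above says fit_l1_slsqp fits an l1-regularized likelihood model through scipy.optimize. As a hedged illustration of how such a solver can be built (a minimal sketch assuming only numpy and scipy; `neg_loglike` and `fit_l1_logit` are illustrative names, not the statsmodels API), the non-smooth penalty can be handled with the standard reformulation of |beta| via auxiliary bound variables:

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglike(beta, X, y):
    """Negative log-likelihood of a logit model, computed stably."""
    xb = X @ beta
    return np.sum(np.logaddexp(0.0, xb) - y * xb)

def fit_l1_logit(X, y, alpha):
    """Minimize -loglike(beta) + sum(alpha * |beta|) with SLSQP.

    The non-smooth |beta_j| terms are replaced by auxiliary variables
    u_j >= |beta_j|: minimize -loglike(beta) + alpha @ u subject to
    -u <= beta <= u, which is a smooth, constrained problem.
    """
    K = X.shape[1]
    alpha = np.broadcast_to(np.asarray(alpha, dtype=float), (K,))

    def objective(params):
        beta, u = params[:K], params[K:]
        return neg_loglike(beta, X, y) + alpha @ u

    constraints = [
        {"type": "ineq", "fun": lambda p: p[K:] - p[:K]},  # u - beta >= 0
        {"type": "ineq", "fun": lambda p: p[K:] + p[:K]},  # u + beta >= 0
    ]
    res = minimize(objective, np.zeros(2 * K), method="SLSQP",
                   constraints=constraints)
    return res.x[:K]
```

With alpha = 0 this reduces to plain maximum likelihood; larger alpha shrinks coefficients toward zero, which is the behavior the demo scripts below exercise.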

l1_demo/short_demo.py (+13 -3 lines)

@@ -11,9 +11,6 @@
 
 The l1 Solvers
 --------------
-The solvers are slower than standard Newton, and sometimes have
-convergence issues. Nonetheless, the final solution makes sense and
-is often better than the ML solution.
 The standard l1 solver is fmin_slsqp and is included with scipy. It
 sometimes has trouble verifying convergence when the data size is
 large.
@@ -36,14 +33,19 @@
 logit_mod = sm.Logit(spector_data.endog, spector_data.exog)
 ## Standard logistic regression
 logit_res = logit_mod.fit()
+
 ## Regularized regression
+
 # Set the regularization parameter to something reasonable
 alpha = 0.05 * N * np.ones(K)
+
 # Use l1, which solves via a built-in (scipy.optimize) solver
 logit_l1_res = logit_mod.fit_regularized(method='l1', alpha=alpha, acc=1e-6)
+
 # Use l1_cvxopt_cp, which solves with a CVXOPT solver
 logit_l1_cvxopt_res = logit_mod.fit_regularized(
     method='l1_cvxopt_cp', alpha=alpha)
+
 ## Print results
 print "============ Results for Logit ================="
 print "ML results"
@@ -58,32 +60,39 @@
 anes_exog = sm.add_constant(anes_exog, prepend=False)
 mlogit_mod = sm.MNLogit(anes_data.endog, anes_exog)
 mlogit_res = mlogit_mod.fit()
+
 ## Set the regularization parameter.
 alpha = 10 * np.ones((mlogit_mod.J - 1, mlogit_mod.K))
+
 # Don't regularize the constant
 alpha[-1,:] = 0
 mlogit_l1_res = mlogit_mod.fit_regularized(method='l1', alpha=alpha)
 print mlogit_l1_res.params
+
 #mlogit_l1_res = mlogit_mod.fit_regularized(
 #    method='l1_cvxopt_cp', alpha=alpha, abstol=1e-10, trim_tol=1e-6)
 #print mlogit_l1_res.params
+
 ## Print results
 print "============ Results for MNLogit ================="
 print "ML results"
 print mlogit_res.summary()
 print "l1 results"
 print mlogit_l1_res.summary()
 #
+#
 #### Logit example with many params, sweeping alpha
 spector_data = sm.datasets.spector.load()
 X = spector_data.exog
 Y = spector_data.endog
+
 ## Fit
 N = 50  # number of points to solve at
 K = X.shape[1]
 logit_mod = sm.Logit(Y, X)
 coeff = np.zeros((N, K))  # Holds the coefficients
 alphas = 1 / np.logspace(-0.5, 2, N)
+
 ## Sweep alpha and store the coefficients
 # QC check doesn't always pass with the default options.
 # Use the options QC_verbose=True and disp=True
@@ -94,6 +103,7 @@
     method='l1', alpha=alpha, trim_mode='off', QC_tol=0.1, disp=False,
     QC_verbose=True, acc=1e-15)
 coeff[n,:] = logit_res.params
+
 ## Plot
 plt.figure(1);plt.clf();plt.grid()
 plt.title('Regularization Path');
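The last hunk sweeps alpha and stores the coefficients at each value to trace a regularization path. The same idea can be illustrated without statsmodels or CVXOPT installed; the sketch below deliberately substitutes l1-penalized least squares (the lasso) for the logit model, solved by cyclic coordinate descent, so the example stays self-contained. `soft_threshold` and `lasso_cd` are illustrative helpers, not statsmodels functions:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding, the proximal operator of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, alpha, n_sweeps=200):
    """Minimize 0.5*||y - X b||^2 + alpha*sum(|b|) by coordinate descent."""
    b = np.zeros(X.shape[1])
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_sweeps):
        for j in range(X.shape[1]):
            # Partial residual with coordinate j's contribution removed
            r_j = y - X @ b + X[:, j] * b[j]
            b[j] = soft_threshold(X[:, j] @ r_j, alpha) / col_sq[j]
    return b

# Sweep alpha and store the coefficients, as short_demo.py does for Logit
rng = np.random.RandomState(0)
X = rng.randn(80, 4)
beta_true = np.array([3.0, -2.0, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.randn(80)
alphas = np.logspace(-1, 3, 20)  # small -> large penalty
path = np.array([lasso_cd(X, y, a) for a in alphas])
```

Plotting `path` against `alphas` reproduces the "Regularization Path" picture: coefficients start near their unpenalized values at small alpha and are driven to exactly zero, one by one, as alpha grows.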

0 commit comments
