Optimal Regularized Dual Averaging Methods for Stochastic Optimization

Authors: Xi Chen, Qihang Lin, Javier Pena
Published in: Advances in Neural Information Processing Systems 25 (NIPS 2012), pp. 395-403
Editors: F. Pereira, C.J.C. Burges, L. Bottou, K.Q. Weinberger
Publisher: Curran Associates
Conference: Neural Information Processing Systems (http://nips.cc/)

Abstract: This paper considers a wide spectrum of regularized stochastic optimization problems where both the loss function and regularizer can be non-smooth. We develop a novel algorithm based on the regularized dual averaging (RDA) method that can simultaneously achieve the optimal convergence rates for both convex and strongly convex loss. In particular, for strongly convex loss, it achieves the optimal rate of $O(\frac{1}{N}+\frac{1}{N^2})$ for $N$ iterations, which improves the best known rate $O(\frac{\log N}{N})$ of previous stochastic dual averaging algorithms. In addition, our method constructs the final solution directly from the proximal mapping instead of averaging all previous iterates. For widely used sparsity-inducing regularizers (e.g., the $\ell_1$-norm), it has the advantage of encouraging sparser solutions. We further develop a multi-stage extension using the proposed algorithm as a subroutine, which achieves the uniformly-optimal rate $O(\frac{1}{N}+\exp\{-N\})$ for strongly convex loss.
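The abstract's point about taking the final solution from the proximal mapping can be illustrated with the standard $\ell_1$-RDA update. This is a minimal sketch of one such step, not the paper's multi-stage algorithm; the name `gamma` and the $\sqrt{t}$ scaling schedule are assumptions chosen for the illustration:

```python
import math

def rda_l1_step(g_bar, t, lam, gamma):
    """One l1-regularized dual averaging (RDA) step.

    g_bar : averaged (sub)gradients (1/t) * sum_{s<=t} g_s
    t     : iteration counter (t >= 1)
    lam   : l1 regularization weight
    gamma : scaling of the quadratic prox-function (assumed schedule)

    Returns the closed-form minimizer of
        <g_bar, w> + lam*||w||_1 + (gamma/sqrt(t)) * ||w||^2 / 2,
    i.e. entrywise soft-thresholding of -(sqrt(t)/gamma) * g_bar.
    """
    scale = math.sqrt(t) / gamma
    return [-scale * math.copysign(max(abs(g) - lam, 0.0), g) for g in g_bar]

# Components of g_bar whose magnitude is below lam are mapped exactly to
# zero, which is why a prox-mapped solution tends to be sparser than an
# average of all previous iterates.
w = rda_l1_step([0.5, -0.05, -0.8], t=4, lam=0.1, gamma=1.0)
# w is approximately [-0.8, 0.0, 1.4]; the middle coordinate is zeroed out.
```

The soft-thresholding form is what makes the prox step cheap for the $\ell_1$-norm: it is a closed-form, coordinate-wise operation, so each iteration costs the same as a plain gradient step.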