JB> <span class="Apple-style-span" style="font-family: arial, sans-serif; font-size: 13px; border-collapse: collapse; color: rgb(68, 68, 68); ">Lots of use of Support Vector Machines</span><div><span class="Apple-style-span" style="font-size: 13px; "></span><font class="Apple-style-span" color="#444444" face="arial, sans-serif"><span class="Apple-style-span" style="border-collapse: collapse;">Certainly, there will be a return on the investment.</span></font></div>
<div><font class="Apple-style-span" color="#444444" face="arial, sans-serif"><span class="Apple-style-span" style="border-collapse: collapse;"><br></span></font></div><div><font class="Apple-style-span" color="#444444" face="arial, sans-serif"><span class="Apple-style-span" style="border-collapse: collapse;">JB> <span class="Apple-style-span" style="border-collapse: separate; color: rgb(0, 0, 0); font-family: arial; ">arguably</span></span></font></div>
<div><font class="Apple-style-span" color="#444444" face="arial, sans-serif"><span class="Apple-style-span" style="border-collapse: collapse;"><span class="Apple-style-span" style="font-family: arial; "></span><font class="Apple-style-span" color="#000000"><span class="Apple-style-span" style="border-collapse: separate;">Indeed.</span></font></span></font></div>
<div><font class="Apple-style-span" face="arial, sans-serif"><br></font></div><div><font class="Apple-style-span" face="arial, sans-serif">> <span class="Apple-style-span" style="font-size: 13px; border-collapse: collapse; color: rgb(68, 68, 68); ">SVMs are arguably quite consistent with the ideas of robust statistics, since they are fairly resistant to outliers and certain kinds of model violations.</span></font></div>
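<div><font class="Apple-style-span" color="#444444" face="arial, sans-serif"><span class="Apple-style-span" style="border-collapse: collapse;">To make that concrete, here is a minimal sketch of my own (not something John posted) of the usual explanation: the hinge loss an SVM minimizes grows only linearly with a margin violation, while a squared loss grows quadratically, so one grossly misclassified outlier pulls an SVM fit far less than a least-squares fit.</span></font></div>

```python
# My own illustration, not from the thread: compare how hard a single
# outlier pulls on the hinge loss (used by SVMs) versus squared loss.
def hinge(margin):
    """SVM hinge loss max(0, 1 - y*f(x)), where margin = y*f(x)."""
    return max(0.0, 1.0 - margin)

def squared(margin):
    """Squared loss on the same margin, for comparison."""
    return (1.0 - margin) ** 2

# A badly misclassified outlier has a large negative margin.
for m in (1.0, -10.0, -100.0):
    print(m, hinge(m), squared(m))
```

<div><font class="Apple-style-span" color="#444444" face="arial, sans-serif"><span class="Apple-style-span" style="border-collapse: collapse;">At margin -100 the hinge loss is 101 but the squared loss is 10201; equivalently, the hinge gradient per example is bounded, so an outlier's influence on the fit is bounded too, which is the robust-statistics flavour John alludes to.</span></font></div>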
<div><font class="Apple-style-span" color="#444444" face="arial, sans-serif"><span class="Apple-style-span" style="border-collapse: collapse;">But now, back on topic.</span></font></div><div><font class="Apple-style-span" color="#444444" face="arial, sans-serif"><span class="Apple-style-span" style="border-collapse: collapse;"><br>
</span></font></div><div><font class="Apple-style-span" color="#444444" face="arial, sans-serif"><span class="Apple-style-span" style="border-collapse: collapse;">Rainer</span></font></div><div><font class="Apple-style-span" color="#444444" face="arial, sans-serif"><span class="Apple-style-span" style="border-collapse: collapse;"><font class="Apple-style-span" color="#000000"><span class="Apple-style-span" style="border-collapse: separate;"><br>
</span></font></span></font><br><div class="gmail_quote">On 26 March 2010 16:30, John Burger <span dir="ltr"><<a href="mailto:john@mitre.org">john@mitre.org</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<div class="im">Rainer Ottmueller wrote:<br>
<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
My first impression is that Robust statistics is not used in all NLP.<br>
Please confirm or deny.<br>
</blockquote>
<br></div>
Lots of use of Support Vector Machines, especially in topic identification, etc. SVMs are arguably quite consistent with the ideas of robust statistics, since they are fairly resistant to outliers and certain kinds of model violations.</blockquote>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<br>
- John D. Burger<br>
MITRE<div><div></div><div class="h5"><br>
<br>
<br>
_______________________________________________<br>
Corpora mailing list<br>
<a href="mailto:Corpora@uib.no" target="_blank">Corpora@uib.no</a><br>
<a href="http://mailman.uib.no/listinfo/corpora" target="_blank">http://mailman.uib.no/listinfo/corpora</a><br>
</div></div></blockquote></div><br></div>