Show simple item record

dc.contributor.author     Wild, Edward
dc.contributor.author     Mangasarian, Olvi
dc.date.accessioned       2013-01-17T18:07:51Z
dc.date.available         2013-01-17T18:07:51Z
dc.date.issued            2006
dc.identifier.citation    06-04    en
dc.identifier.uri         http://digital.library.wisc.edu/1793/64338
dc.description.abstract   Prior knowledge over general nonlinear sets is incorporated into nonlinear kernel classification problems as linear constraints in a linear program. The key tool in this incorporation is a theorem of the alternative for convex functions that converts nonlinear prior knowledge implications into linear inequalities without the need to kernelize these implications. Effectiveness of the proposed formulation is demonstrated on three publicly available classification datasets, including a cancer prognosis dataset. Nonlinear kernel classifiers for these datasets exhibit marked improvements upon the introduction of nonlinear prior knowledge compared to nonlinear kernel classifiers that do not utilize such knowledge.    en
dc.subject                theorem of the alternative    en
dc.subject                linear programming    en
dc.subject                kernel classification    en
dc.subject                prior knowledge    en
dc.title                  Nonlinear Knowledge-Based Classification    en
dc.type                   Technical Report    en
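
The abstract above describes turning nonlinear prior-knowledge implications into linear constraints of a linear program for kernel classification. The sketch below is a minimal illustration only, not the report's formulation: it sets up a 1-norm Gaussian-kernel classifier as a linear program (solved with scipy.optimize.linprog) and imposes the prior knowledge that a given region belongs to class +1 by requiring K(b, A')u - gamma >= 1 at a few sampled points b of that region, a direct discretization rather than the theorem-of-the-alternative conversion the report describes. All names and parameters here (gaussian_kernel, knowledge_points, nu, mu) are illustrative assumptions, not taken from the source.

# Hypothetical sketch: 1-norm kernel classification as a linear program with
# prior knowledge imposed at sampled points of a known class +1 region.
# A simplification of the kind of formulation the abstract describes; not the
# report's theorem-of-the-alternative construction.
import numpy as np
from scipy.optimize import linprog

def gaussian_kernel(X, Y, mu=1.0):
    # K[i, j] = exp(-mu * ||X_i - Y_j||^2)
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-mu * sq)

def knowledge_based_lp(A, d, knowledge_points, nu=1.0, mu=1.0):
    # Classifier f(x) = K(x, A')u - gamma.  Minimize nu*e'y + e's subject to
    #   D(K(A, A')u - e*gamma) + y >= e   (soft margin, slack y >= 0)
    #   -s <= u <= s                      (s linearizes the 1-norm of u)
    #   K(b, A')u - gamma >= 1            (prior knowledge at sampled points b)
    n = A.shape[0]
    K = gaussian_kernel(A, A, mu)
    Kb = gaussian_kernel(knowledge_points, A, mu)
    m = Kb.shape[0]
    D = np.diag(d.astype(float))

    # Decision vector z = [u (n), gamma (1), y (n), s (n)]
    c = np.concatenate([np.zeros(n), [0.0], nu * np.ones(n), np.ones(n)])

    I, Z = np.eye(n), np.zeros((n, n))
    col = np.zeros((n, 1))
    A_ub = np.vstack([
        # -D K u + gamma*d - y <= -e   (margin constraints)
        np.hstack([-D @ K, d.reshape(-1, 1).astype(float), -I, Z]),
        # u - s <= 0 and -u - s <= 0   (1-norm linearization)
        np.hstack([I, col, Z, -I]),
        np.hstack([-I, col, Z, -I]),
        # -K(b, A')u + gamma <= -1     (knowledge constraints)
        np.hstack([-Kb, np.ones((m, 1)), np.zeros((m, n)), np.zeros((m, n))]),
    ])
    b_ub = np.concatenate([-np.ones(n), np.zeros(2 * n), -np.ones(m)])

    bounds = ([(None, None)] * (n + 1)        # u and gamma are free
              + [(0, None)] * (2 * n))        # y and s are nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    u, gamma = res.x[:n], res.x[n]
    return u, gamma

A fitted pair (u, gamma) would then classify a new point x by the sign of gaussian_kernel(x.reshape(1, -1), A, mu) @ u - gamma; the knowledge constraints bias that decision function toward +1 on the sampled region even where training data are sparse.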


This item appears in the following Collection(s)

  • DMI Technical Reports
    DMI Technical Reports Archive for the Department of Computer Sciences at the University of Wisconsin-Madison
