<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.jstacs.de/index.php?action=history&amp;feed=atom&amp;title=AUC-PR</id>
	<title>AUC-PR - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://www.jstacs.de/index.php?action=history&amp;feed=atom&amp;title=AUC-PR"/>
	<link rel="alternate" type="text/html" href="https://www.jstacs.de/index.php?title=AUC-PR&amp;action=history"/>
	<updated>2026-04-04T12:24:50Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.38.2</generator>
	<entry>
		<id>https://www.jstacs.de/index.php?title=AUC-PR&amp;diff=625&amp;oldid=prev</id>
		<title>Grau: Created page with &quot;== Area under ROC and PR curves for weighted and unweighted data == by Jens Keilwagen, Ivo Grosse, and Jan Grau  Precision-recall and ROC curves are highly informative about the ...&quot;</title>
		<link rel="alternate" type="text/html" href="https://www.jstacs.de/index.php?title=AUC-PR&amp;diff=625&amp;oldid=prev"/>
		<updated>2013-04-22T19:55:52Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;== Area under ROC and PR curves for weighted and unweighted data == by Jens Keilwagen, Ivo Grosse, and Jan Grau  Precision-recall and ROC curves are highly informative about the ...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;== Area under ROC and PR curves for weighted and unweighted data ==&lt;br /&gt;
by Jens Keilwagen, Ivo Grosse, and Jan Grau&lt;br /&gt;
&lt;br /&gt;
Precision-recall and ROC curves are highly informative about the performance of binary classifiers, and the area under these curves is a popular scalar performance measure for comparing different classifiers.&lt;br /&gt;
For many applications, class labels are not provided with absolute certainty, but with some degree of confidence, often reflected by weights or soft labels assigned to the data points.&lt;br /&gt;
Here, we provide a command line program that computes precision-recall curves (and ROC curves) using an interpolation that is also applicable to weighted test data.&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
After downloading [http://www.jstacs.de/download.php?which=AUC AUC.jar], you can compute the area under the precision-recall and ROC curve from lists of scores provided in one (weighted data) or two (unweighted data) files.&lt;br /&gt;
&lt;br /&gt;
For unweighted data, please use:&lt;br /&gt;
 java -jar AUC.jar &amp;lt;fg&amp;gt; &amp;lt;bg&amp;gt;&lt;br /&gt;
where &amp;lt;fg&amp;gt; and &amp;lt;bg&amp;gt; are files with one classification score per line for the positive (fg) and negative (bg) class, respectively.&lt;br /&gt;
&lt;br /&gt;
For weighted data, please use:&lt;br /&gt;
 java -jar AUC.jar &amp;lt;weighted&amp;gt;&lt;br /&gt;
where &amp;lt;weighted&amp;gt; is a tab-delimited file containing, per line, one classification score followed by the weights for the fg (positive) and bg (negative) class.&lt;/div&gt;</summary>
		<author><name>Grau</name></author>
	</entry>
</feed>