The Journal of Instruction-Level Parallelism
Championship Branch Prediction
The 1st JILP Championship Branch Prediction Competition (CBP-1)
Sponsored by: Intel MRL, IEEE TC-uARCH
in conjunction with:
MICRO-37  http://www.microarch.org/micro37/


Championship Branch Prediction (CBP) is a branch predictor competition.  Contestants will be given a fixed storage budget to implement their best branch prediction algorithms on a common evaluation framework distributed by the CBP steering committee.
Objective: The goal of the Championship Branch Prediction competition is to evaluate and compare branch prediction algorithms in a common framework.  The competition’s simple and transparent evaluation process enables dissemination of results and techniques to the larger computer design community and allows independent verification of the competition’s results.  The performance and cost metrics are selected to be as simple and quantitative as possible.
Predictors must be implemented within a fixed storage budget and will be judged on performance.  Finalists will be selected in the first round of the competition on the basis of predictor performance on a distributed benchmark set.  In the second round of the competition, a second benchmark set (not distributed) will be used to select the champion from the finalists.
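For concreteness, the sketch below shows one way a predictor might be organized under a fixed storage budget: a simple gshare-style predictor whose table size is chosen to fit an assumed 64-kilobit budget.  The predict/update interface, the budget, and all names here are illustrative assumptions only, not the contest's actual framework API or rules.

```cpp
// Illustrative only: a gshare-style predictor sized to an assumed 64-Kbit budget
// (2^15 two-bit counters = 8 KB). The predict/update interface is a placeholder,
// not the CBP framework's actual API.
#include <cstdint>
#include <vector>

class GsharePredictor {
    static const int LOG_TABLE_SIZE = 15;   // 2^15 counters * 2 bits = 64 Kbit
    std::vector<uint8_t> counters;          // 2-bit saturating counters (0..3)
    uint32_t history;                       // global branch history register

public:
    GsharePredictor() : counters(1u << LOG_TABLE_SIZE, 2), history(0) {}

    // Predict taken/not-taken for a conditional branch at address 'pc'.
    bool predict(uint32_t pc) const {
        uint32_t index = (pc ^ history) & ((1u << LOG_TABLE_SIZE) - 1);
        return counters[index] >= 2;
    }

    // Update the selected counter and the global history with the actual outcome.
    void update(uint32_t pc, bool taken) {
        uint32_t index = (pc ^ history) & ((1u << LOG_TABLE_SIZE) - 1);
        if (taken  && counters[index] < 3) counters[index]++;
        if (!taken && counters[index] > 0) counters[index]--;
        history = (history << 1) | (taken ? 1u : 0u);
    }
};
```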
Prizes: Writeups from selected finalists (including the champion) will be published in a special issue of the Journal of Instruction-Level Parallelism (JILP).  All finalists' source code, predictor writeups, and performance results will be made publicly available through the CBP website.  The champion will receive a trophy (details not yet determined).
Submission requirements: See http://www.jilp.org/cbp/rules.htm for additional information and requirements.  Information on how and where to send submissions can be found by clicking here.  Submissions must be a single tar file containing the following (writeup, results, and code):

  • Abstract: must be submitted before Friday, October 15, 2004, 9pm PST (one week before the predictor submission deadline).

  • Writeup: a description of the prediction algorithm, including references to published work directly relevant to the implemented algorithm.

  • Performance results: a table giving performance for the distributed trace list.

  • Branch predictor code: the predictor code, which must fit into the framework.

To prevent submission of minor variations of the same predictor, no person may be part of more than one submission.  The final submission must be submitted in its entirety before Friday, October 22, 2004, 9pm PST.  No extensions will be granted.  Submission issues should be directed to the mailing list at cbp@lists.ncsu.edu.

Mailing List: We request that everyone who downloads the framework sign up for the CBP mailing list.  We will use the mailing list to announce updates to the framework and to answer questions.  To sign up, send mail to mj2@lists.ncsu.edu with "subscribe cbp" (without quotes) in the message body.  Use cbp@lists.ncsu.edu to post to the list.

Common Evaluation Framework, Version 3: The third, and final, version of the framework is now available.  Submissions must use this version of the framework.  This version provides both branch and instruction information, including static information about the instructions, data values, and memory addresses.  Traces with data values and memory addresses are very large, totaling 550 megabytes, so we have created two sets of traces: one without data values and memory addresses, and one with them.  The framework and the traces without data values and memory addresses are packaged in a single tarball; click here to download it.  If you need the traces with data values and memory addresses, you can download them individually by clicking here.  After downloading the framework, please take a moment to sign up for the mailing list.
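As a rough illustration of the kinds of information such a trace might carry, the hypothetical record below pairs branch outcomes with static instruction information, data values, and memory addresses.  The field names and layout are assumptions for illustration only; they are not the actual CBP trace format, which is defined by the distributed framework.

```cpp
// Purely illustrative: one possible shape for a trace record carrying the kinds
// of information described above. Field names and layout are assumptions, not
// the actual CBP trace format.
#include <cstdint>

struct TraceRecord {
    uint32_t pc;             // instruction address
    uint32_t next_pc;        // fall-through address or branch target
    uint8_t  opcode_class;   // static instruction information (e.g., branch, load, store, ALU)
    bool     is_branch;      // conditional-branch flag
    bool     taken;          // branch outcome, if this is a branch
    uint64_t data_value;     // data value produced or consumed, if available
    uint64_t mem_address;    // effective memory address, for loads and stores
};
```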

New!!! Workshop Agenda and Results: The workshop agenda and results can be accessed by clicking here.  The agenda includes links to all the presentations and to all the finalists' writeups and code.  The results include the performance of the finalists on both the distributed and undistributed trace lists.

Important dates:
Competition formally announced at ISCA: June 19, 2004
Evaluation framework available: July 29, 2004
Abstract submission: October 15, 2004 (9pm Pacific Time, USA)
Predictor submission: October 22, 2004 (9pm Pacific Time, USA)
Finalists selected/notified: November 5, 2004
Champion selected from finalists at MICRO: December 5, 2004

Steering Committee
Dan Connors, Univ. of Colorado
Tom Conte, North Carolina State Univ.
Konrad Lai, MRL, Intel
Yale Patt, Univ. of Texas at Austin
Jim Smith, Univ. of Wisconsin
Jared Stark, MRL, Intel
Mateo Valero, Univ. Politecnica Catalunya
Chris Wilkerson, MRL, Intel

Selection Committee
Dan Connors, Univ. of Colorado
Tom Conte, North Carolina State Univ.
Phil Emma, IBM Research
Konrad Lai, MRL, Intel
Scott McFarling, Microsoft
Chuck Moore, AMD
Yale Patt, Univ. of Texas at Austin
Jim Smith, Univ. of Wisconsin
Jared Stark, MRL, Intel
Mateo Valero, Univ. Politecnica Catalunya
Chris Wilkerson, MRL, Intel