Championship Branch Prediction (CBP), sponsored by the Journal of Instruction-Level Parallelism, is a branch predictor competition. Contestants will be given a fixed storage budget to implement their best branch prediction algorithms on a common evaluation framework distributed by the CBP organizers.
To prevent minor variations of the same predictor from being entered multiple times, no person may be part of more than one submission. The final submission must be submitted in its entirety before Friday, October 22, 2004, 9pm PST. No extensions will be granted. Questions about submissions should be directed to the mailing list at firstname.lastname@example.org.
Mailing List: We request that everyone who downloads the framework sign up for the CBP mailing list. We will use the list to announce updates to the framework and to answer questions. To sign up, send mail to email@example.com with "subscribe cbp" (without quotes) in the message body. Use firstname.lastname@example.org to post to the list.
Common Evaluation Framework, Version 3: The third, and final, version of the framework is now available; submissions must use this version. It contains both branch and instruction information, including static information about the instructions, data values, and memory addresses. Traces with data values and memory addresses are very large, totaling 550 megabytes, so we have created two sets of traces: one without data values and memory addresses, and one with them. The framework and the traces without data values and memory addresses are packaged in a single tarball; click here to download it. If you need the traces with data values and memory addresses, you can download them individually by clicking here. After downloading the framework, please take a moment to sign up for the mailing list.
New!!! Workshop Agenda and Results: The workshop agenda and results can be accessed by clicking here. The agenda includes links to all the presentations and to all the finalists' writeups and code. The results include the performance of the finalists on both the distributed and undistributed trace lists.
Important Dates:
Evaluation framework available: July 29, 2004
Abstract submission: October 15, 2004, (9pm Pacific Time, USA)
Predictor submission: October 22, 2004, (9pm Pacific Time, USA)
Finalists selected/notified: November 5, 2004
Champion selected from finalists at MICRO: December 5, 2004