The Graphics Performance Characterization Project Group Rules
Version 2.13
Last Updated: 12/16/2010

Overview
          
Rules Inheritance

All rules declared in "The Graphics and Workstation Performance Group (SPEC/GWPG): Rules For Project Groups" document (known hereinafter as the GWPG Project Groups Ruleset) shall apply, unless specifically overruled by a rule in this document. Rules declared in this document shall apply in addition to the rules declared in the GWPG Project Groups Ruleset.
General Philosophy

- The Graphics Performance Characterization Project of SPEC/GWPG (henceforth abbreviated as SPECgpc) believes the user community will benefit from an objective series of tests, which can serve as a common reference and be considered as part of an evaluation process.
- SPECgpc seeks to develop benchmarks for generating accurate graphics performance measures in an open, accessible and well-publicized manner.
- SPECgpc wishes to contribute to the coherence of the field of graphics performance measurement and evaluation, so that vendors will be better able to present well-defined performance measures and customers will be better able to compare and evaluate vendors' products and environments.
- SPECgpc will provide formal beta software to members and final software releases to the public in a timely fashion.
- Hardware and software used to run the SPECgpc benchmarks must provide a suitable environment for running typical graphics programs.
- SPECgpc reserves the right to adapt its benchmarks as it deems necessary to preserve its goal of fair and useful benchmarking (e.g. removing a benchmark, or modifying benchmark code or data). If a change is made to the suite, SPECgpc will notify the appropriate parties (i.e. SPECgpc members and users of the benchmark) and will re-designate the metrics (e.g. changing the metric from UGNX-01 composite to UGNX-02 composite). In the case that a benchmark is removed in whole or in part, SPECgpc reserves the right to republish, in summary form, "adapted" results for previously published systems, converted to the new metric. In the case of other changes, such a republication may necessitate re-testing and may require support from the original test sponsor.
Overview of Optimizations

SPECgpc is aware of the importance of optimizations in producing the best system performance. SPECgpc is also aware that it is sometimes hard to draw an exact line between legitimate optimizations that happen to benefit SPECgpc benchmarks and optimizations that specifically target SPECgpc benchmarks. With the list below, however, SPECgpc wants to increase the awareness of implementers and end users of unwanted benchmark-specific optimizations that would be incompatible with SPECgpc's goal of fair benchmarking.

To ensure that results are relevant to end users, SPECgpc expects that the hardware and software implementations used for running SPECgpc benchmarks adhere to a set of general rules for optimizations.
General Rules for Optimization

- Optimizations must generate correct images. Correct images are those deemed by the SPECgpc committee to be sufficiently adherent to the respective graphics API specification for the targeted end-user community.
- Optimizations must not have an adverse effect on system stability. A published SPECgpc result carries an implicit claim that the performance methods employed are more than just "prototype", "experimental" or "research" methods; it is a claim that there is a certain level of maturity and general applicability in its methods.
- Optimizations affecting Viewperf must also improve performance for at least one commonly available application.
- It is not permitted to detect Viewperf in order to invoke an optimization.
- The OpenGL API stream from the Viewperf binary must not be modified.
- OpenGL implementations must fully process Viewperf's OpenGL and related API streams affecting the frame buffer and GL state.
- Differences to the frame buffer between immediate and display-list modes must not exceed 0.01% of the number of pixels in the window (see the comparison sketch after this list).
- Optimizations must be generally available and supported by the providing vendor.
- Where it appears the guidelines in this document have not been followed, SPECgpc may investigate such a claim and request that the optimization in question (e.g. one using SPECgpc benchmark-specific pattern matching) be removed and the results resubmitted. Alternatively, SPECgpc may request that the vendor correct the deficiency (e.g. make the optimization more general purpose or correct problems with image generation) before submitting results based on the optimization.
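To illustrate the 0.01% frame-buffer tolerance above, the sketch below compares two same-sized screen grabs, one rendered in immediate mode and one from display lists, and checks the fraction of differing pixels. This is a minimal example under stated assumptions (tightly packed, 8-bit-per-channel RGB buffers), not part of the official benchmark tooling.

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Returns true if the two RGB8 frame buffers differ in no more than
    // 0.01% of the pixels in the window, per the rule above.
    // Assumption: both buffers are width * height * 3 bytes, tightly packed.
    bool withinFrameBufferTolerance(const std::vector<std::uint8_t>& immediate,
                                    const std::vector<std::uint8_t>& displayList,
                                    std::size_t width, std::size_t height)
    {
        const std::size_t pixelCount = width * height;
        if (immediate.size() != pixelCount * 3 || displayList.size() != pixelCount * 3)
            return false;  // grabs of different sizes cannot be compared

        std::size_t differing = 0;
        for (std::size_t p = 0; p < pixelCount; ++p) {
            const std::size_t i = p * 3;
            if (immediate[i] != displayList[i] ||
                immediate[i + 1] != displayList[i + 1] ||
                immediate[i + 2] != displayList[i + 2])
                ++differing;
        }
        return static_cast<double>(differing) <= 0.0001 * static_cast<double>(pixelCount);
    }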
 
Benchmarks

Benchmark Definition

Benchmark components are defined as:

- code sets (e.g. SPECviewperf®)
- designated executables, where included in the downloadable package
- run rules
- the launch framework/GUI and associated benchmark definition file
- data sets (e.g. viewsets)
Benchmark Acceptance

- New or modified benchmark components require a 2/3-majority vote of the SPECgpc electorate to be accepted for publication.
- A minimum 3-week review period is required for new or significantly modified benchmark components.
- At the end of the review period a vote will be called to approve the proposed changes.
- An amendment to a benchmark component during the review period must be unanimously accepted. If not, the review period shall be restarted.
- It is the option of any future SPECviewperf viewset author(s) to require passing of selected conformance tests prior to submission of results for that viewset.

Benchmark Code Versioning

Benchmark code is defined as the set of source code required to build and run a benchmark executable (e.g. SPECviewperf).

SPECviewperf benchmark code uses the following version coding: M.m.p (e.g. 8.0.1), where M is the major release number, m is the minor release number and p is the patch level (see the sketch after this list).

- The major release number is only incremented when large amounts of code are changed and the scripting language is dramatically changed as a result; backward compatibility is highly unlikely when moving scripts or data sets between major releases (e.g. running v2 scripts on a v3 executable would almost certainly fail).
- The minor release number is incremented if some small set of code is replaced or removed, but the standard, unchanged scripts and data sets, as a whole, must run on the new version (perhaps with different performance).
- Patch releases can contain additions of new properties and additions of new attributes to existing properties, but cannot change or remove any existing properties, attributes or functionality. These are typically used for bug fixes, small enhancements and so forth.
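As a minimal sketch of how the M.m.p scheme above can be interpreted, the code below parses a version string and applies the compatibility expectation stated in the rules: scripts and data sets are only expected to run on an executable with the same major release number. The struct and helper names are illustrative assumptions, not part of the benchmark code.

    #include <cstdio>

    struct ViewperfVersion {
        int major;  // M: incremented for large, script-breaking changes
        int minor;  // m: code replaced or removed, standard scripts still run
        int patch;  // p: additive bug fixes and small enhancements
    };

    // Parse a version string of the form "M.m.p", e.g. "8.0.1".
    bool parseVersion(const char* text, ViewperfVersion& v)
    {
        return std::sscanf(text, "%d.%d.%d", &v.major, &v.minor, &v.patch) == 3;
    }

    // Scripts written for one major release are not expected to run
    // on an executable from a different major release.
    bool scriptsExpectedToRun(const ViewperfVersion& scriptTarget,
                              const ViewperfVersion& executable)
    {
        return scriptTarget.major == executable.major;
    }

For example, standard scripts for 8.0.1 are expected to run unchanged on 8.1.0 (possibly with different performance), while v2 scripts run against a v3 executable would almost certainly fail.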
SPECviewperf Viewset Versioning

The version of a SPECviewperf viewset should be incremented if:

- changes to SPECviewperf affect the performance of the viewset,
- changes to the viewset script affect performance,
- the viewset data changes, or
- rule changes affect the acceptance criteria.

New results for the previous version of a viewset will no longer be published.
SPECviewperf Benchmark Release

On the release date of a new benchmark, it replaces the previous benchmark on the public website. Submissions for the previous benchmark will no longer be accepted.
 
Benchmark Run Rules

- The system under test must perform all of the respective graphics API's functionality requested by the benchmark, with the exception that the system does not have to support dithering.
- The system under test must be OpenGL conformant for the pixel format or visual used by the benchmark.
- Settings for environment variables, registry variables and hints must not disable compliant behavior.
- No interaction is allowed with the system under test during the benchmark, unless required by the benchmark.
- The system under test cannot skip frames during the benchmark run.
- It is not permissible to change the system configuration during the running of a given benchmark. For example, one cannot power off the system, make some changes, then power back on and run the rest of the benchmark.
- Screen grabs for SPECviewperf will be full window size.
- The color depth used must be at least 24 bits (true color), with at least 8 bits of red, 8 bits of green and 8 bits of blue (see the sketch after this list).
- If a depth buffer is requested, it must have at least 24 bits of resolution.
- The display raster resolution must be at least 1920 pixels by 1080 pixels.
- The monitor must support the stated resolution and refresh rate and must fully display all of the benchmark tests being submitted.
- The screen resolution must be large enough to run the individual tests at their requested window size, with no reduction or clipping of the test window.
- Results to be made public must be generated by the official benchmark, which may not be changed.
- Recompilation of the Viewperf executable is permitted, provided all non-default compile parameters are documented in the submission and the binary and any dependencies are made available upon request during the review period.
- Tests may be run with or without a desktop/window manager, but must be run on some native windowing system.
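As an illustration of the color-depth and depth-buffer minimums above, the sketch below requests a framebuffer with at least 8 bits per color channel and a 24-bit depth buffer. GLFW is used here only as a convenient example of requesting such a pixel format; the run rules do not mandate any particular toolkit, and the window size shown is illustrative.

    #include <GLFW/glfw3.h>
    #include <cstdio>

    int main()
    {
        if (!glfwInit())
            return 1;

        // At least 24-bit true color: 8 bits each of red, green and blue.
        glfwWindowHint(GLFW_RED_BITS, 8);
        glfwWindowHint(GLFW_GREEN_BITS, 8);
        glfwWindowHint(GLFW_BLUE_BITS, 8);
        // If a depth buffer is requested, it must have at least 24 bits.
        glfwWindowHint(GLFW_DEPTH_BITS, 24);

        // The display itself must offer at least 1920 x 1080; individual
        // tests are run at their own requested window sizes.
        GLFWwindow* window = glfwCreateWindow(1920, 1080, "pixel-format check",
                                              nullptr, nullptr);
        if (!window) {
            std::fprintf(stderr, "Requested pixel format is not available\n");
            glfwTerminate();
            return 1;
        }

        glfwMakeContextCurrent(window);
        /* ... run the test here ... */

        glfwDestroyWindow(window);
        glfwTerminate();
        return 0;
    }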
 
Submission and Review Rules

Submission Content Rules

- These rules are specific to SPECgpc and shall apply in addition to the Submission Content Rules in the GWPG Project Groups Ruleset.
- A SPECviewperf submission can be for one or more viewsets per configuration.
- A SPECviewperf submission must be run on a 64-bit operating system.
- The SPECviewperf submission upload file must have the structure defined in Figure 1.

  [Figure 1: structure of the SPECviewperf submission upload file]

- Submitters are not required to submit depth images with a submission. Submitters must provide depth images upon request by any committee member during the review period. After the review period, submitters are not required to retain depth images.
Submission Process Rules

- These rules are specific to SPECgpc and shall apply in addition to the Submission Process Rules in the GWPG Project Groups Ruleset.
- Submission file names must contain gpc_v for SPECviewperf, contain all lower-case letters and not contain '.' except prior to the zip or tar file extension (e.g. intel_gpc_v_jun10_v0.zip). The file version is denoted prior to the file extension; the initial file version is v0, and resubmitted files must increment the version number (see the sketch after this list).
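The sketch below checks a submission file name against the naming rule above. Only the lower-case requirement, the gpc_v marker, the single '.' before the zip or tar extension and the _vN version suffix come from the rule; the exact character set allowed elsewhere in the name is an assumption made for this example.

    #include <iostream>
    #include <regex>
    #include <string>

    // Validates a SPECviewperf submission file name against the naming rule:
    // all lower case, contains "gpc_v", ends in _vN before a .zip or .tar
    // extension, and contains no other '.' characters.
    bool isValidSubmissionName(const std::string& name)
    {
        // Assumed character set between markers: lower-case letters, digits, '_'.
        static const std::regex pattern(
            R"(^[a-z0-9_]*gpc_v[a-z0-9_]*_v[0-9]+\.(zip|tar)$)");
        return std::regex_match(name, pattern);
    }

    int main()
    {
        std::cout << isValidSubmissionName("intel_gpc_v_jun10_v0.zip") << "\n";  // 1: valid
        std::cout << isValidSubmissionName("Intel_gpc_v_jun10_v0.zip") << "\n";  // 0: upper case
        std::cout << isValidSubmissionName("intel_gpc_v_jun10.v0.zip") << "\n";  // 0: extra '.'
        return 0;
    }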
Review Period Rules

- These rules are specific to SPECgpc and shall apply in addition to the Review Period Rules in the GWPG Project Groups Ruleset.
- Reviewers will decide whether the image quality of the submission is sufficiently adherent to the respective graphics API's specification to satisfy the intended end user's expectations. If a reviewer rejects the quality of an image for a stated reason, the submitter can ask for a vote of the full SPECgpc electorate. In case of a tie, the submission is rejected.
- System configurations submitted for the SPECgpc benchmark suite must be able to run the corresponding SPECapc application benchmarks, if applicable. If this criterion is not met, the submission will be rejected.
Adoption

Changes for version 1.1 adopted June 10, 1999
Changes for version 1.2 adopted January 12, 2000
V1.4 changes -- 5.02 (d) added
V1.5 changes -- 4.01.i.2(2), 4.01.i.2(4), 5.02.w
V1.6 changes -- 1.03.c, 5.04.o - Adopted by the SPECopc on January 23, 2003
V1.17 Adopted by the SPECopc on August 13, 2004
V1.18 Adopted by the SPECopc on October 21, 2004
V1.19 Adopted by the SPECopc on April 19, 2005
V1.20 Adopted by the SPECopc on October 20, 2005
V2.00 Adopted by the SPECopc on January 26, 2006
V2.10: Adopted on 13 September 2007 (reflects SPECopc->SPECgpc name change and wider API charter scope)
V2.11: Adopted on 20 April 2010 - Added section II.4 and rule II.4.a.
V2.12: Adopted on 23 June 2010 - Rules updated for SPECviewperf 11
V2.13: Adopted on 16 December 2010 - Optimization rules updated (section I.4)