  
    
The Application Performance Characterization Project Committee Rules
V1.26, Last Updated: 8/06/2005

Overview
  
    General Philosophy 
    
      Within SPEC's Graphics Performance Characterization (GPC) Group there 
      was a strong belief that it is important to benchmark graphics performance 
      based on actual applications. Application-level benchmarks exist, but they 
      are not standardized and they do not cover a wide range of application 
      areas. Thus, the Application Performance Characterization 
      (SPECapc(SM)) Project was formed within the GPC to create 
      a broad-ranging set of standardized benchmarks for graphics-intensive 
      applications. 
      The SPECapc seeks to develop benchmarks for generating accurate 
      application-level graphics performance measures in an open, accessible and 
      well-publicized manner. 
      The SPECapc wishes to contribute to the coherence of the field of 
      application performance measurement and evaluation so that vendors will be 
      better able to present well-defined performance measures and customers 
      will be better able to compare and evaluate vendors' products and 
      environments. 
      The SPECapc will provide formal beta benchmarks to members and final 
      benchmark releases to the public in a timely fashion. 
      Hardware and software used to run the SPECapc benchmarks must provide 
      a suitable environment for running typical (not just benchmark) workloads 
      for the applications in question. 
      SPECapc reserves the right to adapt its benchmarks as it deems 
      necessary to preserve its goal of fair and useful benchmarking (e.g. 
      removing a benchmark, or modifying benchmark code or data). If a change is 
      made to the suite, SPECapc will notify the appropriate parties (i.e. SPECapc 
      members and users of the benchmark) and SPECapc will re-designate the 
      benchmark by changing its name and/or version. In the case that a 
      benchmark is removed in whole or in part, SPECapc reserves the right to 
      republish in summary form "adapted" results for previously published 
      systems, converted to the new metric. In the case of other changes, such a 
      republication may necessitate re-testing and may require support from the 
      original test sponsor. 
    Overview of Optimizations 
    
      SPECapc is aware of the importance of optimizations in producing the 
      best system performance. SPECapc is also aware that it is sometimes hard 
      to draw an exact line between legitimate optimizations that happen to 
      benefit SPECapc benchmarks and optimizations that specifically target 
      SPECapc benchmarks. With the list below, however, SPECapc wants to 
      raise awareness among implementers and end-users of unwanted 
      benchmark-specific optimizations that would be incompatible with SPECapc's 
      goal of fair benchmarking. 
      To ensure that results are relevant to end-users, SPECapc expects that 
      the hardware and software implementations used for running SPECapc 
      benchmarks adhere to a set of general rules for optimizations. 
    General Rules for Optimization 
    
      Optimizations must generate correct images and results for the 
      application under test, for both the benchmark case and similar cases. 
      Correct images and results are those deemed by the majority of the SPECapc 
      electorate, potentially with input from the associated independent 
      software vendor (ISV) and/or end-users, to be sufficiently adherent to the 
      intent behind the application. 
      Optimizations must improve performance for a class of workloads where 
      the class of workloads must be larger than a single SPECapc benchmark or 
      SPECapc benchmark suite. 
      For any given optimization a system should generate correct images 
      with and without said optimization. An optimization should not reduce 
      system stability. 
      The vendor encourages the implementation for general use (not just for 
      running a single SPECapc benchmark or SPECapc benchmark suite). 
      The implementation is generally available, documented and supported by 
      the providing vendor. 
      In the case where it appears that the above guidelines have not been 
      followed, SPECapc may investigate such a claim and request that the 
      optimization in question (e.g. one using SPECapc benchmark-specific 
      pattern matching) be removed and the results resubmitted. Or, SPECapc may 
      request that the vendor correct the deficiency (e.g. make the optimization 
      more general purpose or correct problems with image generation) before 
      submitting results based on the optimization. 
      It is expected that system vendors would endorse the general use of 
      these optimizations by customers who seek to achieve good application 
      performance. 
      No pre-computed (e.g. driver-cached) images, geometric data, or state 
      may be substituted within a SPECapc benchmark on the basis of detecting 
      that said benchmark is running (e.g. pattern matching of the command stream 
      or recognition of the benchmark's name). 
  Membership 
  
    Membership 
    
      Membership in the SPECapc is open to any organization that has a 
      direct and/or material interest in graphics-focused application 
      benchmarking. 
      Members are expected, but not required, to be active participants in 
      developing and improving SPECapc benchmarks. 
      Members are entitled to secure access to development code. 
      Members are entitled to unlimited publication rights. 
      New members become eligible to vote at their 2nd 
      consecutive qualified meeting. The first qualified meeting may have been 
      attended prior to becoming a member. 
      A member maintains voting rights by attending 1 out of the last 3 
      qualified meetings. A member loses their voting rights upon missing 3 
      consecutive qualified meetings. 
      A member regains voting rights on attending a second consecutive 
      qualified meeting. 
    Associate Status 
    
      Associate status is available to non-profit organizations. 
      All SPECapc, GPC and SPEC rights and rules apply to Associates unless 
      specifically stated otherwise. 
      Associates are entitled to secure access to development code. 
      Associates do not have voting rights. 
    Officers and Elections 
    
      On an annual basis the SPECapc will elect from its membership the 
      following officers: 
      
        Chairperson 
        Vice Chairperson 
        Secretary-Treasurer 
      The Chairperson's responsibilities are to: 
      
        conduct meetings, 
        send out the agenda on time, 
        conduct votes on time, 
        deal with outside organizations such as the press, 
        police the submission, review and appeal processes. 
      The Vice Chairperson's responsibility is to do the Chairperson's job when 
      the Chairperson is not available. 
      The Secretary-Treasurer's responsibilities are to: 
      
        record minutes, 
        maintain the rules document, 
        keep a history of email, 
        track finances and interact with the GPC and SPEC Board in that 
        regard. 
    Meetings 
    
      The SPECapc has three types of meetings (not including sub-committee 
      meetings): 
      
        Regular quarterly meetings 
        Special SPECapc face-to-face meetings for the full membership 
        Conference call meetings 
      Meetings which qualify for attendance only include: 
      
        Face-to-face meetings scheduled one month in advance 
        Conference call meetings scheduled at least two weeks in advance 
        which are indicated as qualified at least two weeks in advance. 
      Membership Dues and Billing 
    
      Dues for the SPECapc will be set annually by the SPEC Board of 
      Directors with input from the SPECapc. Once set, the dues amount will be 
      recorded in the SPEC minutes and communicated to the SPECapc by the SPEC 
      office. 
      Payment of dues for a given calendar year must be received at the SPEC 
      office by March 1st of that year. Alternately, a Letter of Intent to join 
      the SPECapc must be received by the SPEC office by March 1st of that year 
      with a subsequent dues payment by May 1st of that year. Failure to meet 
      these deadlines will result in loss of membership and voting rights which 
      will be reinstated when full payment is received at the SPEC office. 
    Non-Member Publication 
    
      The SPECapc will accept submissions from non-members for review and 
      publication on the SPEC public website. 
      Non-member submissions must follow the same rules and procedures as 
      member submissions. 
      Non-members are not eligible to participate in reviewing results. 
      Non-members will be charged per system configuration for their 
      submissions. Any change in hardware or software constitutes a new 
      configuration. 
      On an annual basis the SPECapc will establish the pricing and periods 
      for non-member publication. These will be recorded in the SPECapc minutes 
      and published on the GPC web site.
      The SPECapc project group may remove currently published results due to 
      benchmark revision. In this case, the submitter will be given notice 
      by the project group and may, at no charge, resubmit the identical 
      configuration for the revised benchmark. 
  Benchmarks 
  
    Benchmark Acceptance 
    
      Benchmark components are defined as: 
      
        specific revision of an application, 
        run rules, 
        scripts and associated data sets. 
      New or modified benchmark components require a 2/3-majority vote to be 
      accepted for publication. Selection of datecode versions of a specific 
      revision of an application is by majority vote. 
      A minimum 3-week review period is required for new or significantly 
      modified benchmark components. 
      At the end of the review period a vote will be called to approve the 
      proposed changes. 
      An amendment to a benchmark component during the review period must be 
      unanimously accepted. If not, the review period shall be restarted. 
    Benchmark Code Versioning 
    
      Benchmarks use the following version coding: M.m (e.g. 
      SPECapc(SM) for Pro/ENGINEER™ 20.0 v1.1), where M is the major release 
      number and m is the minor release number. 
      The major release number is only incremented when large amounts of 
      code are changed and the scripting language is dramatically changed as a 
      result -- backward compatibility is highly unlikely when moving scripts or 
      data sets between major releases (e.g. running v2 scripts on a v3 
      executable would almost certainly fail). 
      The minor release number is incremented if some small set of code is 
      replaced or removed - but the standard, unchanged scripts and data sets, 
      as a whole, must run on the new version (perhaps with different 
      performance). 
      When there is a new major release of a benchmark, submissions using 
      the previous release will be accepted for at least one submission cycle. 
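      As a rough illustration of this M.m convention (ours, not part of the 
      rules), the sketch below encodes the compatibility expectations stated 
      above; the helper names are hypothetical. 

      # Illustrative sketch only; the rules define the M.m convention, not this code.
      def parse_version(tag):
          """Split a version string like 'v2.1' into (major, minor) integers."""
          major, minor = tag.lstrip("v").split(".")
          return int(major), int(minor)

      def scripts_expected_to_run(script_version, executable_version):
          """Per the rules: standard scripts and data sets must keep running
          across minor releases (perhaps with different performance), but
          moving them across major releases is highly unlikely to work."""
          return parse_version(script_version)[0] == parse_version(executable_version)[0]

      # e.g. running v2 scripts on a v3 executable would almost certainly fail:
      assert not scripts_expected_to_run("v2.0", "v3.0")
      assert scripts_expected_to_run("v1.0", "v1.1")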
      Submission, Review and Publication 
  
    Submission Preparation Rules 
    
      The rules for the submission and review cycle to be used are those 
      posted on the SPECapc web site two weeks prior to the submission deadline. 
      The benchmark and application versions to be used for a submission are 
      those posted on the SPECapc web site two weeks prior to the submission 
      deadline. 
      All benchmark sources for a submission must be the same as that posted 
      on the SPECapc web site two weeks prior to the submission deadline. 
      Members who wish not to review the submission of other specific 
      members due to conflict of interest must submit that list to the SPEC 
      office prior to the submission deadline. The SPEC office will hold the 
      conflict of interest list in confidence from other members. 
    General Benchmark Run Rules 
    
      The system under test must correctly perform all of the operations 
      being requested by the application during the benchmark. 
      No changes to any files associated with the benchmark are permitted 
      except as noted in the benchmark-specific rules (section 5 of this 
      document). 
      The entire display raster must be available for use by the application 
      being benchmarked. 
      It is not permissible to override the intended behavior of the 
      application through any means including, but not limited to, registry 
      settings or environment variables. 
      No interaction is allowed with the system under test during the 
      benchmark, unless required by the benchmark. 
      The system under test cannot skip frames during the benchmark run. 
      It is not permissible to change the system configuration during the 
      running of a given benchmark. That is, one can't power off the system, 
      make some changes, then power back on and run the rest of the benchmark. 
      Results submitted must be obtained using the scripts, models, and 
      application revisions which are specified for that submission cycle by the 
      SPECapc. 
      The monitor used in the benchmark must support the stated resolution 
      and refresh rate. 
      The benchmark must successfully obtain all requested window sizes, 
      with no reduction or clipping of any benchmark-related windows.  
      Windows created by the benchmark must not be obscured on the screen by 
      anything other than other elements created by the benchmark. 
      Tests may be run with or without a desktop/window manager if the 
      application allows this, but must be run on some native windowing system. 
      General Submission Content Rules 
    
      The information supplied should reflect the SYSTEM AS TESTED. 
      All fields in the result files must be supplied by the 
      submitter unless they are marked as "opt.", indicating an optional field. 
      Submitters must specify a date for 'General Availability' that is 
      accurate for the entire system - hardware, software, O/S, drivers, etc. 
      The "Comments" area of the results page must describe how the system 
      may be acquired. 
      Date fields should always contain a valid date. "Now" is not valid in 
      a date field - the field should instead indicate the earliest date of 
      availability. 
      Price includes system and monitor as tested. 
      The color depth used must be at least 24 bits (true color), with at 
      least 8 bits of red, 8 bits of green and 8 bits of blue. 
      If a depth buffer is requested, it must have at least 16 bits of 
      resolution. 
      The display raster resolution must be at least 1280 pixels by 1024 
      pixels. 
      The monitor refresh rate must be at least 75Hz. This requirement does 
      not apply to digital flat panel displays. 
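      For orientation only, here is one way a submitter might sanity-check 
      the color-depth and depth-buffer minimums above before a run. This 
      sketch is not part of the rules; it assumes Python with PyOpenGL and a 
      working GLUT, and the helper name is ours. 

      # Hypothetical pre-run sanity check; not mandated by the rules.
      import sys
      from OpenGL.GL import (glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS,
                             GL_BLUE_BITS, GL_DEPTH_BITS)
      from OpenGL.GLUT import (glutInit, glutInitDisplayMode, glutCreateWindow,
                               GLUT_RGBA, GLUT_DEPTH)

      def check_framebuffer_minimums():
          glutInit(sys.argv)
          glutInitDisplayMode(GLUT_RGBA | GLUT_DEPTH)
          glutCreateWindow(b"apc-check")  # a current GL context is needed for glGet*
          red, green, blue = (glGetIntegerv(bits) for bits in
                              (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS))
          depth = glGetIntegerv(GL_DEPTH_BITS)
          assert min(red, green, blue) >= 8, "true color requires >= 8 bits per channel"
          assert depth >= 16, "a requested depth buffer must have >= 16 bits"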
      A price may be submitted in a currency other than the US dollar; such 
      submissions will sort separately on the summary pages for Price and 
      Price/Performance. 
      The submitter is required to declare sufficient information to 
      reproduce the performance claimed. This includes but is not limited to: 
      
        non-default environment variables, 
        non-default registry variables, 
        hints, 
        changes to the standard makefiles. 
      Any information required to be reported, such as non-default 
      environment variables, registry variables or hints, that does not have a 
      predefined field must be documented in the "Comments" or "API Extensions" 
      areas of the results page. 
      Valid submissions must include screen captures if required by the 
      benchmark. 
      Results previously published for a system can be resubmitted. 
      Previously published results being re-submitted can only have price 
      changes. 
      The submission upload file structures are defined in the 
      benchmark-specific section below. 
      Each member company should ensure that the upload file contains data 
      for all the new configurations and existing published configurations they 
      wish to continue publishing. 
      Standardized cache nomenclature is as follows: 
      
        (D+I) denotes a unified instruction and data cache. 
        (D/I) denotes separate instruction and data caches. 
        A number followed by KB or MB can be used to describe the size of 
        the cache. 
        Caches dedicated to a processor are listed as per-processor cache 
        size. 
        Caches shared by multiple processors are listed by total size. 
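      To make the nomenclature concrete, the short sketch below (ours, purely 
      illustrative) renders cache descriptions following these conventions. 

      # Illustrative formatter; the function and argument names are ours,
      # not defined by the rules.
      def format_cache(size_kb, unified, per_processor):
          """Render e.g. '512KB (D+I) per processor' or '4MB (D/I) total'."""
          size = f"{size_kb // 1024}MB" if size_kb >= 1024 else f"{size_kb}KB"
          kind = "(D+I)" if unified else "(D/I)"  # unified vs. separate I/D caches
          scope = "per processor" if per_processor else "total"
          return f"{size} {kind} {scope}"

      print(format_cache(512, unified=True, per_processor=True))    # 512KB (D+I) per processor
      print(format_cache(4096, unified=False, per_processor=False)) # 4MB (D/I) total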
      Each component of the submitted software configuration (including the 
      graphics driver) shall be: 
      
        uniquely identified, 
        available to SPECapc members, upon demand, by the submission 
        deadline and for the duration of the review process, 
        verifiably available to the public by the publication date, with continued 
        availability at least through the next submission deadline, and with 
        sufficient information in the comment field to enable users to directly 
        obtain this component. 
      On or before the date of publication, the platform as described in the 
      submission shall be available for purchase by the public, for the 
      specified price or less, with a firm delivery date of 60 days or less. 
      Submissions will be categorized as either “Single Supplier” or “Parts 
      Built”, where “Single Supplier” is defined as a configuration where all 
      hardware and drivers are sold and supported by a single supplier. 
      “Supported” is defined as providing hardware and driver warranty for a 
      defined period. “Parts Built” is defined as a configuration built and 
      supported by multiple suppliers. 
      	“Parts built” system pricing must include enough detail to reproduce 
      all aspects of the submission, including performance and price, and 
      include all hardware and O/S costs necessary to run the benchmark. 
      	Any change to or replacement of, subsequent to publication, any of the 
      elements of the submitted software configuration that results in more than 
      a 5% degradation in any of the benchmark results for that submitted system 
      will be cause for removal of the results for that system from the SPEC 
      public website. 
    Submission Process Rules 
    
      Each benchmark is considered a separate submission. 
      Submission of each benchmark's results (e.g., Pro/ENGINEER™, 
      Solidworks 2003™) will be in separate tar/zip files. 
      The submission file names are detailed below under the 
      benchmark-specific rules. 
      A submitter of SPECapc benchmark results must upload their submission 
      to the proper location by the submission deadline. 
      After a submission is uploaded to the server, and prior to the 
      submission deadline, the submitter must notify the SPEC office and 
      provide contact information for questions about the submission. 
      The submitter must contact the SPEC office if they have attempted to 
      upload their submission and were not successful. 
      The SPEC office will not disclose who has submitted results until the 
      submission deadline has passed. 
      Submissions will not be accepted after the submission deadline. 
      The upload directory will be set to write-only until the submission 
      deadline has passed; after the deadline it is set to allow reading but 
      not modification. 
      If a submitter is notified that their submission format is incorrect, 
      they must re-send their submission in proper format within 3 business days 
      of notification. 
      Abuse of the resubmission allowance is grounds for rejection of a 
      submission. 
    Review Period Rules 
    
      SPECapc members shall keep all submitted results confidential until 
      those results appear on the public SPEC website, or until they become 
      public through some other means. SPECapc members are free to make their 
      own submitted results public at any time. 
      The SPEC office pairs reviewers with submitters. 
      The various SPECapc benchmark review pools will be independent of each 
      other. The SPEC office will send the list of contact information for the 
      submissions under review. 
      All members will have access to all benchmark submissions once the 
      review period begins. 
      There will be a 5 calendar-day review period on submissions. 
      Submissions cannot be withdrawn after the submission deadline. 
      If a primary reviewer has a question about a submission they must pose 
      the question to the submitter first. 
      Any reviewer/member who has any question about a submission must do 
      one of the following: 
      
        Pose any questions to the submitter and cc the primary reviewer. 
        Pose any questions to the primary reviewer. The primary reviewer 
        must then pose the question(s) to the submitter. 
        Pose any questions to an officer of the SPECapc. The officer of the 
        SPECapc must then pose the question(s) to the submitter and cc the 
        primary reviewer. 
      The submitter can request that their submission be rejected on stated 
      technical grounds. 
      With the permission of the primary reviewer, the submitter may 
      resubmit their submission. 
      The submitter must provide the primary reviewer access to the system 
      under test at the submitter's facilities if requested by the reviewer 
      during the review period. The reviewer must state prior to the visit what 
      part of the submission is going to be verified. Travel expenses are the 
      responsibility of the reviewer. 
      Previously published results being re-submitted can only be reviewed 
      for consistency with the previous submission and price changes. 
      Price can be challenged. If so, the submitter must provide 
      documentation that the system can be purchased for the price quoted. Price 
      must be valid for two submission cycles from date of first publication. 
      Quantity 1 pricing must be used. 
      Reviewers will decide if the image quality and results of the 
      submission are sufficiently correct with respect to the intent of the ISV 
      to satisfy the intended end-users' expectations. 
      The primary reviewer of a submission must either approve the 
      submission without comment, approve the submission with comment or reject 
      the submission with comment by the end of the review period. If the 
      primary reviewer fails to do this, the submission will be automatically 
      accepted. The submitter may appeal a rejection as described in "Review 
      Appeal Rules" below. 
      Any comments for rejection of a submission received after the end of 
      the review period will not affect or delay publication of the submission. 
      Review Appeal Rules 
    
      There will be a 2-week appeal period following the review period. 
      During the appeal process, any submitters of rejected submissions can 
      make their case to the SPECapc via email. 
      At the end of the appeal period, if there is no resolution the Chair 
      of the SPECapc will call for a vote to approve or reject the submission. 
      The whole SPECapc electorate votes on approval or rejection of an 
      appealed submission. A simple majority of the SPECapc electorate is 
      required to approve or reject the appeal. In case of a tie the submission 
      is rejected. 
    Challenging Approved Results 
    
      Any member may challenge approved results at any time. This includes 
      
        archived results, 
        currently published results and 
        re-submitted results not subject to the regular submission review 
        process. 
      The burden of proof that the result should be modified is on the 
      member who is challenging the result. 
      The challenge must be ratified by a majority vote of the SPECapc 
      electorate. 
      The Chair of the SPECapc will call a special review cycle for a 
      resubmission in the event that a current submission is successfully 
      challenged. 
      Successful challenges of archived results can only result in 
      annotation, not removal or modification. Annotation is determined by the 
      majority of the SPECapc electorate. 
  SPECapc Benchmark-Specific Rules and Procedures 
  
    Pro/Engineer* 2001 
    
      The benchmark must be run using the datecode version of Pro/ENGINEER 
		2001 specified on the SPECapc website two weeks prior to the submission 
		deadline. 
      The config.pro file must be used as-is and may not be modified or 
		overridden. 
      The script files utilities\runbench.bat or utilities/runbench.csh 
		may be modified as necessary to enable execution of the benchmark on the 
		system being tested.  If modified, the modified version must be 
		included in the benchmark submission.
      The color depth in the 3D graphics windows used by Pro/ENGINEER must 
		be at least 24 bits (true color). 
      The displayed raster resolution must be at least 1280 pixels by 1024 
		pixels. 
      The monitor refresh rate must be at least 75Hz. This requirement does 
		not apply to digital flat panel displays. 
      The border width of the windows created during the benchmark shall not 
		exceed 10 pixels. 
      The submission must contain the proe_result.txt file (as generated by 
		the proescore program) as well as the corresponding trail.txt file 
		generated from running the benchmark. Both of these files may be found 
		in the "results" directory after a successful run of the benchmark.   
		The first section of the proe_result.txt file may be edited to reflect 
		the system configuration.
      The directory structure of the submission must be as follows: 
      .../<Company-name>/<system_1>/proe2001/proe_result.txt 
      .../<Company-name>/<system_1>/proe2001/trail.txt (may be compressed) 
      .../<Company-name>/<system_1>/proe2001/runbench.bat (or runbench.csh, if 
      modified, as required by Rule 5.15.c) 
      .../<Company-name>/<system_2>/proe2001/proe_result.txt 
      .../<Company-name>/<system_2>/proe2001/trail.txt (may be compressed) 
      .../<Company-name>/<system_2>/proe2001/runbench.bat (or runbench.csh, if 
      modified, as required by Rule 5.15.c) 
      etc... 
      Compression may be accomplished using UN*X compress(1), tar -Z or zip. 
      The reviewer may ask the submitter to supply an uncompressed version of 
      the trail file(s). 
      The submission file must be named company_apc_proe2001_vN.zip or 
      company_apc_proe2001_vN.tar.z, where company is the member company or 
      organization name in lower case and vN is the file version (e.g. 
      sgi_apc_proe2001_v0.tar.z and intel_apc_proe2001_v0.zip). The initial 
      submission is v0. Resubmitted files must have the version number 
      incremented. 
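      The naming and layout rules above are mechanical enough to script. The 
      sketch below (ours; not a SPEC-provided tool) builds a correctly named 
      zip submission with the required directory structure; the function name 
      and argument layout are hypothetical. 

      # Illustrative packaging helper; not a SPEC-supplied tool.
      import re
      import zipfile

      NAME_RE = re.compile(r"^[a-z0-9]+_apc_proe2001_v\d+\.zip$")

      def package_proe2001(company, version, systems):
          """systems maps a system name to the list of result files for it."""
          archive = f"{company.lower()}_apc_proe2001_v{version}.zip"
          assert NAME_RE.match(archive), "name must follow the submission rules"
          with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
              for system, files in systems.items():
                  for path in files:
                      # e.g. sgi/system_1/proe2001/proe_result.txt
                      zf.write(path, f"{company}/{system}/proe2001/{path.split('/')[-1]}")
          return archive

      # The initial submission is v0; a resubmission increments to v1, v2, ...
      # package_proe2001("sgi", 0, {"system_1": ["results/proe_result.txt",
      #                                          "results/trail.txt"]})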
    
    Solid Edge V14 
    
      The benchmark must be run using Solid Edge V14.00.00.70. 
      The color depth in the 3D graphics windows used by Solid Edge must be 
      at least 24 bits (true color). 
      The displayed raster resolution must be at least 1280 pixels by 1024 
      pixels. 
      The application can be run with Tools-Options-View Graphics Display 
      set to Graphics Card Driven, Software Driven or Backing Store. The choice 
      must be documented in the notes portion of the results. 
      Application settings must not be changed from the defaults set by the 
      benchmark installation.  
      Settings that must not be changed include, but are not limited to: 
      
        Culling 
        Wireframe display in Move Part command 
        Arc smoothness (3) 
      The monitor refresh rate must be at least 75Hz. This requirement does 
      not apply to digital flat panel displays. 
      The border width of the windows created during the benchmark shall not 
      exceed 10 pixels. 
      The submission must contain the file result.txt that is generated 
      during the benchmark run. 
      The directory structure of the submission must be as follows: 
      .../<Company-name>/<system_1>/SEV14/result.txt 
      .../<Company-name>/<system_2>/SEV14/result.txt 
      etc... 
      The submission file must be named company_apc_SolidEdgeV14_vN.zip, 
      where company is the member company or organization name in lower case 
      and vN is the file version (e.g. hp_apc_SolidEdgeV14_v0.zip). The 
      initial submission is v0. Resubmitted files must have the version number 
      incremented. 
    3dsmax6 
    
      The benchmark must be run using 3dsmax version 6, service pack 1. 
      The color depth in the 3D graphics window used by 3dsmax must be at 
      least 24 bits (true color). 
      The displayed raster resolution must be at least 1280 pixels by 1024 
      pixels. 
      The monitor refresh rate must be at least 75Hz. This requirement does 
      not apply to digital flat panel displays. 
      The border width of the windows created during the benchmark shall not 
      exceed 10 pixels. 
      The application windows will be visible as specified: 
      
        The "3D Studio Max R6" window is maximized and fills the screen (the 
        window is not occluded by any other windows). 
        The command panels are visible and docked on the right. 
      Drivers and custom drivers must be configured (explicitly or 
      implicitly) to use the settings below (see the sketch after this list): 
      
        GL_TEXTURE_MIN_FILTER = GL_LINEAR_MIPMAP_LINEAR 
        GL_TEXTURE_MAG_FILTER = GL_LINEAR 
        Textures cannot be modified, i.e. reduced in size or depth. 
        Wireframe objects must NOT be drawn using triangle strips. 
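      In OpenGL terms, the required filtering corresponds to the texture 
      parameters below. This is a sketch of the equivalent state expressed 
      via Python and PyOpenGL; an actual 3dsmax display driver sets the 
      corresponding state internally and need not use this code. 

      # Sketch of the mandated texture filtering state; illustration only.
      from OpenGL.GL import (glTexParameteri, GL_TEXTURE_2D,
                             GL_TEXTURE_MIN_FILTER, GL_TEXTURE_MAG_FILTER,
                             GL_LINEAR_MIPMAP_LINEAR, GL_LINEAR)

      def apply_required_texture_filters():
          # Trilinear minification: linear filtering within and between mipmap levels.
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR)
          # Bilinear magnification.
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)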
      The following Viewport parameters must be checked in the Viewports tab 
      found in the 3dsmax 6 Customize->Preferences menu: 
      
        Backface cull on object creation 
        Mask viewport to safe region 
        Update background while playing 
        Display world axis 
      The following Viewport parameters must NOT be checked in the 3dsmax 6 
      Customize->Preferences menu: 
      
        Attenuate lights 
        Filter environment backgrounds 
      The benchmark can be run using either OpenGL or Direct 3D. For OpenGL 
      submissions, a custom driver may be used, but the custom driver must 
      produce a pixel-exact match to the supplied 3dsmax OpenGL driver. 
      
        If the custom driver does not match, the submitter may apply for a 
        waiver based on documented errors in the 3dsmax OpenGL driver. 
      The submission must contain the files result.txt and result.xls that 
      are generated during the benchmark run. 
      The directory structure of the submission must be as follows: 
      .../<Company-name>/<system_1>/3dsmax6/result.txt 
      .../<Company-name>/<system_1>/3dsmax6/result.xls 
      .../<Company-name>/<system_2>/3dsmax6/result.txt 
      etc... 
      The submission file must be named company_apc_3dsmax6_vN.zip, where 
      company is the member company or organization name in lower case and 
      vN is the file version (e.g. hp_apc_3dsmax6_v0.zip). The initial 
      submission is v0. Resubmitted files must have the version number 
      incremented. 
    Solidworks 2005 
      The benchmark must be run using Solidworks 2005™ service pack 0. 
      The color depth in the 3D graphics windows used by Solidworks 2005™ 
      must be at least 24 bits (true color). 
      The displayed raster resolution must be at least 1280 pixels by 1024 
      pixels. 
      The monitor refresh rate must be at least 75Hz. This requirement does 
      not apply to digital flat panel displays. 
      The border width of the windows created during the benchmark shall not 
      exceed 10 pixels. 
      The application window size must not be changed from its initial 
      size. 
      The submission must contain a results.txt file generated by running 
      the benchmark. Its contents are extracted from the file apcresultsN.txt, 
      generated by the benchmark GUI, where N is the number of the 
      benchmark test run. 
      The submission will be derived from the best composite generated by 
      the default 5 benchmark runs, as controlled and reported by the 
      benchmark GUI. 
      The appearance of the application's Quick Tips / Dynamic Help box, or 
      other pop-ups that do not dismiss themselves, will cause the benchmark 
      run to be invalid. The benchmark FAQ has information on how to prevent 
      this. 
      The directory structure of the submission must be as follows: 
      .../<Company-name>/<system_1>/sw2005/results.txt 
      .../<Company-name>/<system_2>/sw2005/results.txt 
      etc... 
      The submission file must be named company_apc_sw2005_vN.zip, where 
      company is the member company or organization name in lower case and 
      vN is the file version (e.g. ibm_apc_sw2005_v0.zip). The initial 
      submission is v0. Resubmitted files must have the version number 
      incremented. 
    Maya 6.0 
    
      The benchmark must be run using Maya 6.0. 
      The color depth in the 3D graphics windows used by Maya 6.0 must be at 
      least 24 bits (true color). 
      The displayed raster resolution must be at least 1280 pixels by 1024 
      pixels. 
      The monitor refresh rate must be at least 75Hz. This requirement does 
      not apply to digital flat panel displays. 
      The border width of the windows created during the benchmark shall not 
      exceed 10 pixels. 
      The benchmark script must be run with the command 'mayaTest(3)', where 
      '3' is the number of runs. 
      The submission must contain the results.txt submission file as well as 
      the scoring spreadsheet used to calculate the result. 
      The directory structure of the submission must be as follows: 
      .../<Company-name>/<system_1>/maya60/results.txt 
      .../<Company-name>/<system_1>/maya60/MayaResults.xls 
      etc... 
      The submission file must be named company_apc_maya60_vN.zip, where 
      company is the member company or organization name in lower case and 
      vN is the file version (e.g. ibm_apc_maya60_v0.zip). The initial 
      submission is v0. Resubmitted files must have the version number 
      incremented. 
    3dsmax7 
    
      The benchmark must be run using 3ds max version 7, service pack 0 
      (i.e. unpatched). 
      The color depth in the 3D graphics window used by 3dsmax must be at 
      least 24 bits (true color). 
      The displayed raster resolution must be at least 1280 pixels by 1024 
      pixels. 
      The monitor refresh rate must be at least 75Hz. This requirement does 
      not apply to digital flat panel displays. 
      The border width of the windows created during the benchmark shall not 
      exceed 10 pixels. 
      The application windows will be visible as specified: 
      
        The "3ds max 7" window is maximized and fills the screen (the window 
        is not occluded by any other windows). Note: task-bar auto-hide 
        should be enabled. 
      Drivers and custom drivers must be configured (explicitly or 
      implicitly) to use: 
      
        GL_TEXTURE_MIN_FILTER = GL_LINEAR_MIPMAP_LINEAR 
        GL_TEXTURE_MAG_FILTER = GL_LINEAR 
        Textures cannot be modified, i.e. reduced in size or depth. 
        Wireframe objects must NOT be drawn using triangle strips. 
      The following Viewport parameters must be checked (i.e. enabled) in 
      the Viewports tab of the 3ds max 7 Customize->Preferences menu: 
      
        Backface cull on object creation 
        Mask viewport to safe region 
        Update background while playing 
        Display world axis 
      The following Viewport parameters must NOT be checked in the 3ds max 7 
      Customize->Preferences menu (Viewports tab): 
      
        Attenuate lights 
        Filter environment backgrounds 
      The benchmark can be run using either OpenGL or Direct 3D. For OpenGL 
      submissions, a custom driver may be used, but the custom driver's 
      images must not differ from those of the supplied 3ds max OpenGL 
      driver by more than 5%. If the custom driver's images do not match, 
      the submitter may apply for a waiver based on documented errors in the 
      3ds max OpenGL driver. 
      The submission must contain the files result.txt and result.xls that 
      are generated during the benchmark run. 
      The directory structure of the submission must be as follows: 
      .../<Company-name>/<system_1>/3dsmax7/result.txt 
      .../<Company-name>/<system_1>/3dsmax7/result.xls 
      .../<Company-name>/<system_2>/3dsmax7/result.txt 
      etc... 
      The submission file must be named company_apc_3dsmax7_vN.zip, where 
      company is the member company or organization name in lower case and 
      vN is the file version (e.g. hp_apc_3dsmax7_v0.zip). The initial 
      submission is v0. Resubmitted files must have the version number 
      incremented. 
      Submitters are not required to include pixel-comparison images. 
      A committee member raising a challenge to a submission can employ 
      pixel-comparison images to support the challenge. However, reference 
      and grabbed images must be generated on a system with identical GPU, 
      graphics and custom drivers and settings to the submitted 
      configuration being challenged. 
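      The rules do not prescribe a specific difference metric for the 5% 
      criterion or for challenge comparisons. As one plausible, purely 
      illustrative measure, the sketch below counts the percentage of pixels 
      that differ in any channel between a reference image and a grabbed 
      image; it assumes Python with Pillow and NumPy, and the function name 
      is ours. 

      # Illustrative comparison only; the committee does not mandate this metric.
      import numpy as np
      from PIL import Image

      def percent_differing_pixels(reference_path, grabbed_path):
          """Return the percentage of pixels whose RGB values differ at all."""
          ref = np.asarray(Image.open(reference_path).convert("RGB"))
          test = np.asarray(Image.open(grabbed_path).convert("RGB"))
          if ref.shape != test.shape:
              raise ValueError("reference and grabbed images must match in size")
          differing = np.any(ref != test, axis=-1)  # True where any channel differs
          return 100.0 * differing.mean()

      # A custom OpenGL driver's images would need, for example:
      # percent_differing_pixels("ref.png", "grab.png") <= 5.0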
    Maya 6.5 
    
      The benchmark must be run using Maya 6.5. 
      The color depth in the 3D graphics windows used by Maya 6.5 must be at 
      least 24 bits (true color). 
      The displayed raster resolution must be at least 1280 pixels by 1024 
      pixels. 
      The monitor refresh rate must be at least 75Hz. This requirement does 
      not apply to digital flat panel displays. 
      The border width of the windows created during the benchmark shall not 
      exceed 10 pixels. 
      The benchmark script must be run with the command 'mayaTest(3)', where 
      '3' is the number of runs. 
      The submission must contain the results.txt submission file as well as 
      the scoring spreadsheet used to calculate the result. 
      The directory structure of the submission must be as follows: 
      .../<Company-name>/<system_1>/maya65/results.txt 
      .../<Company-name>/<system_1>/maya65/MayaResults.xls 
      etc... 
      The submission file must be named company_apc_maya65_vN.zip, where 
      company is the member company or organization name in lower case and 
      vN is the file version (e.g. ibm_apc_maya65_v0.zip). The initial 
      submission is v0. Resubmitted files must have the version number 
      incremented. 

Adopted 6 August 2005