The Application Performance Characterization Project Committee Rules
	Version 3.0
		Last Updated: 13 July 2018
	
		- Overview
 
 
	- General Philosophy

		- All rules declared in "The Graphics and Workstation Performance Group (SPEC/GWPG): Rules For Project Groups" document (known hereafter as the GWPG Project Groups Ruleset) apply, unless specifically overruled by a rule in this document.
		- Rules declared in this document shall apply in addition to the rules declared in the GWPG Project Groups Ruleset.
		- The Application Performance Characterization Project of SPEC/GWPG (henceforth abbreviated as SPECapc) believes the user community will benefit from an objective series of application-based tests that can serve as a common reference and be considered as part of an evaluation process.
		- SPECapc seeks to develop benchmarks for generating accurate application-level graphics performance measures in an open, accessible and well-publicized manner.
		- SPECapc wishes to contribute to the coherence of the field of application performance measurement and evaluation, so that vendors will be better able to present well-defined performance measures and customers will be better able to compare and evaluate vendors' products and environments.
		- SPECapc will provide formal beta benchmarks to members and final benchmark releases to the public in a timely fashion.
		- Hardware and software used to run the SPECapc benchmarks must provide a suitable environment for running typical workloads for the applications for which benchmarks are provided.
		- SPECapc reserves the right to adapt its benchmarks as it deems necessary to preserve its goal of fair and useful benchmarking (e.g., removing a benchmark or modifying benchmark code or data). If a change is made to the suite, SPECapc will notify the appropriate parties (i.e., SPECapc members and users of the benchmark) and re-designate the benchmark by changing its name and/or version. If a benchmark is removed in whole or in part, SPECapc reserves the right to republish, in summary form, "adapted" results for previously published systems, converted to the new metric. For other changes, such republication may necessitate re-testing and may require support from the original test sponsor.
 
 
 
	- Overview of Optimizations

		- SPECapc is aware of the importance of optimizations in producing the best system performance, and that it is sometimes hard to draw an exact line between legitimate optimizations that happen to benefit SPECapc benchmarks and optimizations that specifically target SPECapc benchmarks. With the list below, however, SPECapc wants to make implementers and end users more aware of unwanted benchmark-specific optimizations that would be incompatible with SPECapc's goal of fair benchmarking.
		- To ensure that results are relevant to end users, SPECapc expects the hardware and software implementations used for running SPECapc benchmarks to adhere to a set of general rules for optimizations.
 
 
 
	- General Rules for Optimization

		- Optimizations must generate correct images for the application under test, for both the benchmark case and similar cases. Correct images are those deemed by the SPECapc committee to be sufficiently adherent to the respective graphics API specification for the targeted application.
		- Optimizations must not have an adverse effect on system stability. A published SPECapc result carries an implicit claim that the performance methods employed are more than just "prototype", "experimental" or "research" methods; it is a claim that the methods have a certain level of maturity and general applicability.
		- Optimizations must improve performance for a class of workloads, where the class of workloads must be larger than a single SPECapc benchmark or SPECapc benchmark suite.
		- No pre-computed (e.g., driver-cached) images, geometric data, or state may be substituted within a SPECapc benchmark as a result of detecting the benchmark (e.g., by pattern matching of the command stream or recognition of the benchmark's name).
		- Optimizations must be generally available and supported by the providing vendor.
		- Where it appears that the guidelines in this document have not been followed, SPECapc members will follow the SPEC Violations Determination, Penalties and Remedies Document guidelines to bring resolution to the matter.
 
 
 
 
- Benchmarks
 
 
	- Benchmark Definition

		- Benchmark components are defined as:
			- Specific version of the application
			- Application models and/or scenes
			- Benchmark scripts and framework
			- Launch GUI and associated benchmark definition files
			- Benchmark run rules
 
 
 
 
	- Benchmark Acceptance

		- New or modified benchmark components require a 2/3-majority vote of the SPECapc electorate to be accepted for publication.
		- The specific version of the application used for a benchmark (datecode, service pack, etc.) is chosen by majority vote.
		- A minimum 3-week review period is required for new or significantly modified benchmark components.
		- At the end of the review period a vote will be called to approve the proposed changes.
		- An amendment to a benchmark component during the review period must be unanimously accepted; if it is not, the review period shall be restarted.
 
 
 
	- Benchmark Code Versioning

		- SPECapc benchmarks use the following version coding: M.m.p (e.g., SPECapc℠ for Creo 3.0 v1.0.1), where M is the major release number, m is the minor release number and p is the patch level. (A hypothetical illustration follows this list.)
			- The major release number is only incremented when large amounts of code are changed and the scripting language is dramatically changed as a result; backward compatibility is highly unlikely when moving scripts or data sets between major releases (e.g., running v2 scripts on a v3 executable would almost certainly fail).
			- The minor release number is incremented if some small set of code is replaced or removed, but the standard, unchanged scripts and data sets, as a whole, must run on the new version (perhaps with different performance).
			- Patch releases can contain additions of new properties, attributes or functionality. These are typically used for bug fixes, small enhancements and so forth.
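
		The sketch below is a hypothetical illustration of the M.m.p scheme above; it is not part of any SPECapc benchmark or tool. It parses a version string and applies the compatibility rule that unchanged scripts and data sets are expected to run across minor and patch releases, but not across major releases.

 # Hypothetical illustration of the M.m.p compatibility rule;
 # not part of any SPECapc benchmark or tool.
 from typing import NamedTuple

 class Version(NamedTuple):
     major: int  # M: large, compatibility-breaking changes
     minor: int  # m: code replaced/removed, but unchanged scripts still run
     patch: int  # p: bug fixes and small enhancements

 def parse_version(text: str) -> Version:
     """Parse a version string such as 'v1.0.1' or '1.0.1'."""
     major, minor, patch = text.lstrip("v").split(".")
     return Version(int(major), int(minor), int(patch))

 def scripts_compatible(script_ver: str, exe_ver: str) -> bool:
     """Unchanged scripts/data sets are expected to run on any executable
     that shares their major release number."""
     return parse_version(script_ver).major == parse_version(exe_ver).major

 # e.g., v2 scripts on a v3 executable would almost certainly fail:
 assert scripts_compatible("v1.0.1", "v1.2.0")
 assert not scripts_compatible("v2.0.0", "v3.0.0")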
 
 
 
 
	- Benchmark Release

		- When there is a new major release of a benchmark, submissions using the previous release will be accepted for at least one submission cycle.
		- When a superseded benchmark is retired, the associated run rules and results will be archived along with the benchmark information page.
 
 
 
 
- Benchmark Run Rules 
 
 
	- General Benchmark Run Rules

		- The system under test must correctly perform all of the operations requested of the application by the benchmark.
		- No changes to any files associated with the benchmark are permitted.
		- The entire display must be available for use by the application and benchmark.
		- The color depth used must be at least 24 bits (true color), with at least 8 bits of red, 8 bits of green and 8 bits of blue.
		- The display resolution must be at least 1920 pixels by 1080 pixels.
		- Screen resolution must be large enough to run the individual tests at their requested window size, with no reduction or clipping of the test window.
		- Windows created by the benchmark must not be obscured by anything other than other elements created by the benchmark.
		- Screen DPI scaling must be set to 100% on Microsoft Windows, and no UI or other window elements, including the task bar, may occlude the viewport rendering context.
		- It is not permissible to override the intended behavior of the application through any means including, but not limited to, registry settings or environment variables.
		- No interaction is allowed with the system under test during the benchmark.
		- The system under test cannot skip frames during the benchmark run.
		- It is not permissible to change the system configuration during the running of a given benchmark. For example, one cannot power off the system, make some changes, then power back on and run the rest of the benchmark.
		- Results submitted must be obtained using the scripts, models, and application revisions which are specified for that submission cycle by SPECapc.
		- Tests may be run with or without a desktop/window manager if the application allows this, but must be run on some native windowing system.
		- Virtualized configurations, defined as any operating system configuration running on a hypervisor or virtualization layer of any kind, must include the word "virtualized" in the comment field of the config.txt file. This information must be populated before the execution of the benchmark to ensure the results reflect this attribute.
		- Virtualized configurations as defined above must also include a declaration of the transport layer name and version used by the virtualization software, in parentheses, after the graphics accelerator listed in the Graphics Accelerator field of the Graphics Hardware Configuration section of the result file. (A hypothetical check of these declarations is sketched after this list.)
		- Virtualized configurations as defined above must also include a declaration of the hypervisor name and version used by the virtualization software, in parentheses, after the model listed in the Model field of the System Hardware Configuration section of the result file.
		- All configurations shall include a link (URL) to a publicly accessible device driver package used to generate the submission.
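
		The following is a minimal sketch of how the three virtualization declarations above might be checked. The field names follow the rules above, but the record format and all example device, transport and hypervisor names are invented for illustration; this is not a SPECapc tool.

 # Hypothetical check of the virtualization declarations above; the record
 # format and all example values are invented for illustration.
 def check_virtualized_declarations(result: dict) -> list:
     """Return a list of rule violations for a virtualized configuration."""
     problems = []
     if "virtualized" not in result.get("comment", "").lower():
         problems.append('comment field must contain "virtualized"')
     # Transport layer name and version in parentheses after the accelerator:
     if not result.get("graphics_accelerator", "").rstrip().endswith(")"):
         problems.append("Graphics Accelerator field lacks a parenthesized "
                         "transport layer name and version")
     # Hypervisor name and version in parentheses after the model:
     if not result.get("model", "").rstrip().endswith(")"):
         problems.append("Model field lacks a parenthesized hypervisor "
                         "name and version")
     return problems

 # Example with invented values:
 ok = {"comment": "virtualized",
       "graphics_accelerator": "ExampleGPU 9000 (ExampleTransport 2.1)",
       "model": "ExampleServer X1 (ExampleHypervisor 6.5)"}
 assert check_virtualized_declarations(ok) == []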
 
 
 
 
- Submission and Review Rules
 
 
	- Submission Content Rules

		- These rules are specific to SPECapc and shall apply in addition to the Submission Content Rules in the GWPG Project Groups Ruleset.
		- The submission upload is defined in the benchmark-specific section for each specific SPECapc benchmark.
 
 
 
	- Submission Process Rules

		- These rules are specific to SPECapc and shall apply in addition to the Submission Process Rules in the GWPG Project Groups Ruleset.
		- The submission zip file name must contain apc_<application benchmark name>, contain all lower-case letters and not contain "." except prior to the zip file extension. The initial file version is v0; resubmitted files must increment the version number. Example: <company>_apc_solidworks2017_v0.zip. (A hypothetical name check is sketched below.)
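
		The regular expression below is one possible reading of this naming rule; it is an interpretation, not an official SPECapc checker, and the exact set of characters allowed besides lower-case letters and digits is an assumption.

 # Hypothetical interpretation of the submission naming rule above;
 # not an official SPECapc checker.
 import re

 # <company>_apc_<benchmark>_v<N>.zip, lower case, no "." before ".zip"
 NAME_RE = re.compile(r"^[a-z0-9-]+_apc_[a-z0-9]+_v\d+\.zip$")

 def valid_submission_name(filename: str) -> bool:
     return NAME_RE.fullmatch(filename) is not None

 assert valid_submission_name("dell_apc_solidworks2017_v0.zip")
 assert not valid_submission_name("Dell_apc_solidworks2017_v0.zip")  # upper case
 assert not valid_submission_name("dell.apc.solidworks2017.v0.zip")  # stray dots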
 
 
 
	- Review Period Rules

		- These rules are specific to SPECapc and shall apply in addition to the Review Period Rules in the GWPG Project Groups Ruleset.
		- Reviewers will decide whether the image quality and results of the submission are sufficiently correct, with respect to the intent of the application provider, to satisfy the intended application use.
 
 
 
 
- SPECapc Benchmark-Specific Rules and Procedures
 
 
	- Autodesk 3ds Max 2015

		- The benchmark must be run using Autodesk 3ds Max 2015 with Service Pack 1 (SP1) applied. Do not install the 3ds Max Subscription Advantage Pack, as it will interfere with the operation and results of the benchmark.
		- The default 3ds Max 2015 application settings must be used.
		- The benchmark may only be run using the Nitrous DX11 display driver.
		- The display resolution must be at least 1920 pixels by 1080 pixels. If the system has an integrated display which cannot achieve 1920x1080 resolution, e.g., a notebook, the system's maximum possible resolution must be used.
		- Submissions can optionally be made at 4K resolutions.
		- Submissions can optionally be made with AA modes greater than the default of 0.
		- This benchmark is only supported on systems running the Microsoft Windows 7 64-bit operating system.
		- The 3ds Max 2015 benchmark should be run with at least 16GB of system memory installed.
		- The application window must be fully visible and not be occluded by any other windows. Task-bar auto-hide must be enabled. Any windows on top of the application rendering window may interfere with the performance. The benchmark status window is an exception to this rule, provided it does not intersect the graphics rendering window.
		- The results.txt file must be generated by GwpgSystemInfo.exe, found in the ./submissions directory.
		- After completion of the benchmark the submitter should create populated specAPC2015.xlsm and results.txt files by following these steps:
			- Copy <3dsmax 2015 install directory>/maxtest/submissions/GwpgSystemInfo.exe to the newly created TestResults folder.
			- Copy <3dsmax 2015 install directory>/maxtest/submissions/specAPC2015.xlsm to the newly created TestResults folder.
			- Open specAPC2015.xlsm with Office Excel 2007 or greater. In the spreadsheet, press the Enable Content and Select Test Results Folder buttons.
			- Create a results.txt by running GwpgSystemInfo.exe.
			- Verify all prepopulated values in results.txt.
			- Copy the scores from the specAPC2015.xlsm Overview sheet into the results.txt file.
			- Add appropriate values to the Submitter fields.
		- The directory structure of the submission must be as follows (a hypothetical layout check is sketched at the end of this section):

 .../company-name/system_1/3dsmax2015/results.txt
 .../company-name/system_1/3dsmax2015/results1.xml
 .../company-name/system_1/3dsmax2015/results2.xml
 .../company-name/system_1/3dsmax2015/results3.xml
 .../company-name/system_1/3dsmax2015/testrenders/*.jpeg
 .../company-name/system_1/3dsmax2015/specAPC2015.xlsm
 .../company-name/system_2/3dsmax2015/results.txt
 .../company-name/system_2/3dsmax2015/results1.xml
 .../company-name/system_2/3dsmax2015/results2.xml
 .../company-name/system_2/3dsmax2015/results3.xml
 .../company-name/system_2/3dsmax2015/testrenders/*.jpeg
 .../company-name/system_2/3dsmax2015/specAPC2015.xlsm
 etc...
		- The submission file must be named company_apc_3dsmax2015_vN.zip, where company is the member company or organization name in lower case and vN is the file version (e.g., dell_apc_3dsmax2015_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
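
		The sketch below is one possible layout check for the submission structure above, with the expected file list transcribed from the listing; the checker itself is a hypothetical aid, not an official SPECapc tool.

 # Hypothetical layout check for the 3ds Max 2015 submission structure above;
 # not an official SPECapc tool.
 import glob
 import os

 EXPECTED = ["results.txt", "results1.xml", "results2.xml",
             "results3.xml", "specAPC2015.xlsm"]

 def check_system_dir(system_dir):
     """Return missing items for one .../company-name/system_N directory."""
     bench_dir = os.path.join(system_dir, "3dsmax2015")
     missing = [name for name in EXPECTED
                if not os.path.isfile(os.path.join(bench_dir, name))]
     if not glob.glob(os.path.join(bench_dir, "testrenders", "*.jpeg")):
         missing.append("testrenders/*.jpeg")
     return missing

 # Example: check every system folder in a submission tree.
 for system_dir in sorted(glob.glob("company-name/system_*")):
     problems = check_system_dir(system_dir)
     print(system_dir, "OK" if not problems else "missing: %s" % problems)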
 
 
 
	- Autodesk Maya 2017
 
						- The benchmark must be run using Autodesk Maya 2017 with Update 4 applied.
- The default Maya 2017 application settings must be used.
- The benchmark may only be run using the default display driver.
- The display resolution must be at least 1920 pixels by 1080 pixels. If the system has an integrated display which cannot achieve 1920x1080 resolution, e.g., a notebook, the system’s maximum possible resolution must be used.
- Submissions can optionally be made at 4K resolutions.
- Windows Display scaling in Windows 10 and DPI scaling in Windows 7 must be set to 100%.
- This benchmark is only supported on systems running Microsoft Windows 7 or Windows 10 64-bit operating systems.
- The Maya 2017 benchmark should be run with at least 16GB of system memory installed.
- It is recommended that the system be rebooted before running the benchmark. The application window must be fully visible and not be occluded by any other windows. Task-bar auto-hide must be enabled. Any windows on top of the application rendering window may interfere with the performance. The benchmark status window is an exception to this rule, provided it does not intersect the graphics rendering window.
- The submission must contain the entire results folder generated from running the benchmark. The results folder will be named results_[MSAA|noMSAA]_timestamp after a successful run of the benchmark. All fields in the "Submission Info" panel must be filled out prior to running the benchmark.
- The directory structure of the submission must be as follows:
 
 .../company-name/system_1/maya2017/results_[MSAA|noMSAA]_timestamp/
 .../company-name/system_2/maya2017/results_[MSAA|noMSAA]_timestamp/
 etc...
 
- The submission file must be named company_apc_maya2017_vN.zip, where company is the member company or organization name in lower case and vN is the file version (e.g., dell_apc_maya2017_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
 
 
 
	- PTC Creo 3.0

		- The benchmark must be run using Creo 3.0 build M010.
		- The Creo 3.0 benchmark is only supported on systems running the Microsoft Windows 7 64-bit operating system.
		- Only submissions run on the Microsoft Windows 7 64-bit operating system and with graphics cards supporting a minimum of 8X MSAA and Order Independent Transparency will be accepted for review.
		- The display resolution must be at least 1920 pixels by 1080 pixels. If the system has an integrated display which cannot achieve 1920x1080 resolution, e.g., a notebook, the system's maximum possible resolution must be used.
		- It is recommended that the Creo 3.0 benchmark be run with at least 8GB of RAM installed.
		- The All_V25.txt, config.pro and apc.dat files must be used as-is and may not be modified or overridden.
		- The benchmark may not use any other config.pro, config.sup, config.win, menu_def.pro, or protk.dat files. These files must be removed from the application install location and user folder before running the benchmark.
		- The graphics card used must support a minimum of 8X MSAA and Order Independent Transparency for comparable results.
		- The script file utilities\runbench.bat may be modified to accommodate an alternate install location for Creo 3.0. If modified, the modified version of runbench.bat must be included in the benchmark submission.
		- The utilities\config.txt file must be edited to reflect the system configuration before running the benchmark.
		- The submission must contain the result.txt, trail.txt, score-steps.txt, score.csv, apc.dat and *.tif files. These files may be found in the results directory after a successful run of the benchmark.
		- There must be no license-related warnings in the trail.txt file.
		- The directory structure of the submission must be as follows:

 .../company-name/system_1/creo3/result.txt
 .../company-name/system_1/creo3/trail.txt
 .../company-name/system_1/creo3/score-steps.txt
 .../company-name/system_1/creo3/score.csv
 .../company-name/system_1/creo3/apc.dat
 .../company-name/system_1/creo3/*.tif
 .../company-name/system_1/creo3/runbench.bat (if modified, as required by the runbench.bat rule above)
 .../company-name/system_2/creo3/result.txt
 .../company-name/system_2/creo3/trail.txt
 .../company-name/system_2/creo3/score-steps.txt
 .../company-name/system_2/creo3/score.csv
 .../company-name/system_2/creo3/apc.dat
 .../company-name/system_2/creo3/*.tif
 .../company-name/system_2/creo3/runbench.bat (if modified, as required by the runbench.bat rule above)
 etc...
		- The submission file must be named company-name_apc_creo3_vN.zip, where company-name is the member company or organization name in lower case and vN is the file version (e.g., dell_apc_creo3_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
 
 
 
	- Dassault Systemes SolidWorks 2017
 
 
	- Siemens PLM NX 9.0 and 10.0

		- The benchmark must be run using Siemens PLM NX 9.0.3.4 (MaintPack5) and/or NX 10.0.2.6 (MaintPack1).
		- The NX 9 and 10 license must include the NX Shape Studio or equivalent bundle for Advanced Studio rendering, and be properly enabled by disabling all checkboxes except Disable Dynamic Update Shadows and Disable Dynamic Soft Shadows in the Studio Views section of the Visualization Performance Preferences General Graphics tab.
		- The benchmark is only supported on systems running the Microsoft Windows 7 64-bit operating system.
		- The display resolution must be at least 1920 pixels by 1080 pixels. If the system has an integrated display which cannot achieve 1920x1080 resolution, e.g., a notebook, the system's maximum possible resolution must be used.
		- All application settings must be the default settings, outside of those documented in the two rules immediately below and the benchmark setup guide. See the document SPECapc_NX9-10_BenchmarkSetup, included with the benchmark, for the full details of the benchmark setup.
		- The "Fixed Frame Rate" setting must be globally disabled.
		- The "Fit View to Stage" setting must be unchecked in the Advanced Studio Scene Editor Stage tab.
		- Any application setting that is not the default or specified value must be documented in the Comments field of the "apcNxResults.html" file.
		- Submissions run on Microsoft Windows 7 64-bit will be accepted for review. A submission must include a run with FSAA enabled and may optionally include a run with FSAA disabled.
		- The submission must contain the entire results folder generated from running the benchmark. The results folder will be named results_sanx_nx[9|10]_[noaa|fsaa]_timestamp after a successful run of the benchmark. All fields in the "Submission Info" panel must be filled out prior to running the benchmark.
		- The directory structure of the submission must be as follows:

 .../company-name/system_1/nx[9|10]/results_sanx_nx[9|10]_[noaa|fsaa]_timestamp/
 .../company-name/system_2/nx[9|10]/results_sanx_nx[9|10]_[noaa|fsaa]_timestamp/
 etc...

		- The submission file must be named company-name_apc_nx[9|10]_vN.zip, where company-name is the member company or organization name in lower case and vN is the file version (e.g., hp_apc_nx9_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
 
 
 
 
 
	 
	Adoption
		v3.0: Adopted 13 July 2018 (updates rules to current GWPG practices and adds SolidWorks 2017 benchmark)
		v2.23: Adopted 27 September (adds Maya 2017 and deletes retired Maya 2012)
		v2.22: Adopted 21 June 2016 (adds NX 9.0-10.0 and deletes retired NX 8.5)
		v2.21: Adopted 30 June 2015 (adds SolidWorks 2015 and deletes retired SolidWorks 2013)
		v2.20: Adopted 11 Dec 2014 (adds Creo 3.0 and deletes retired Creo 2.0 and Lightwave 9.6 benchmark)
		v2.19: Adopted 28 July 2014 (adds 3ds Max 2015 and deletes retired 3ds Max 2011 benchmark)
		v2.17: Adopted 20 February 2013 (adds SolidWorks 2013 and deletes retired SolidWorks 2007 benchmark)
		v2.16: Adopted 6 August 2012 (PTC Creo 2 rule a. change, adds Maya 2012 and deletes retired Maya 2009 benchmark)
		v2.15: Adopted 22 May 2012 (adds PTC Creo 2 and deletes retired WF2 benchmark)
		v2.14: Adopted 30 March 2012 (adds Siemens NX 6 and deletes retired benchmark)
		v2.13: Adopted 30 June 2011 (adds Autodesk 3ds Max 2011 rules and deletes retired benchmarks)
		v2.12: Adopted 24 June 2009 (adds Autodesk Maya 2009 rules)
		v2.11: Adopted 12 March 2009 (adds Lightwave 9.6 rules, updates NX4 rules, adds non-interaction rule)
		v2.10: Adopted 13 September 2007 (reflects GPC-to-GWPG transition, and adds NX4 rules)
		v2.06: Adopted 13 July 2007
		v2.05: Adopted 18 April 2007
		v2.04: Adopted 06 February 2007
		v2.03: Adopted 20 July 2006
		v2.02: Adopted 27 April 2006 for 01 May 2006 publication
		v2.01: Adopted 26 January 2006
		v2.00: Adopted 26 January 2006