
SPECjAppServer2001
Frequently Asked Questions

Version 1.08
Last modified: Thursday, May 22, 2003

Q1: What is SPECjAppServer2001?
Q2: Why did you name the new benchmark SPECjAppServer2001 instead of SPECjAppServer2002?
Q3: Why are you releasing two incomparable benchmarks in the span of three months?
Q4: Does this benchmark obsolete SPECjvm98 or SPECjbb2000?
Q5: What is the performance metric for SPECjAppServer2001?
Q6: Where can I find published results for SPECjAppServer2001?
Q7: Who developed SPECjAppServer2001?
Q8: Why did Sun give the benchmark to SPEC?
Q9: How much control will the JCP have on the SPECjAppServer2001 benchmark?

Q10: Other SPEC benchmarks do not have a price/performance metric. Can you explain why SPECjAppServer2001 has a price/performance metric?
Q11: Can I compare SPECjAppServer2001 results with ECperf 1.0 results?
Q12: Can I compare SPECjAppServer2001 results with ECperf 1.1 results?
Q13: Can I compare SPECjAppServer2001 results with TPC-C results or TPC-W results?
Q14: Can I compare SPECjAppServer2001 results to results from other SPEC benchmarks?
Q15: Can I compare SPECjAppServer2001 results in different categories?
Q16: Do you permit benchmark results to be estimated or extrapolated from existing results?

Q17: What does SPECjAppServer2001 actually test?
Q18: What are the significant influences on the performance of the SPECjAppServer2001 benchmark?
Q19: Does this benchmark aim to stress the J2EE server or the database server?
Q20: Can you describe the workload?
Q21: Can I use SPECjAppServer2001 to determine the size of the server I need?
Q22: What hardware is required to run the benchmark?
Q23: What is the minimum configuration necessary to test this benchmark?
Q24: What additional software is required to run the benchmark?
Q25: Do you provide source code for the benchmark?

Q26: Is there a web layer in the SPECjAppServer2001 benchmark?
Q27: Why did you not address SSL (Secure Socket Layer)?
Q28: Can I report results on a large partitioned system?
Q29: Is the benchmark cluster scalable?
Q30: How scalable is this benchmark?
Q31: Can I report with vendor A hardware, a vendor B J2EE Server, and vendor C database software?
Q32: Can I use Microsoft SQL Server for the database?
Q33: I am using public domain software; can I report results?
Q34: Are the results independently audited?
Q35: Can I announce my results before they are reviewed by the SPEC Java Subcommittee?

Q36: How can I publish SPECjAppServer2001 results?
Q37: How do I obtain the SPECjAppServer2001 benchmark?
Q38: How much does the SPECjAppServer2001 benchmark cost?
Q39: How much does it cost to publish results?
Q40: What if I have questions about running the SPECjAppServer2001 benchmark?
Q41: Where can I go for more information?


Q1: What is SPECjAppServer2001?

SPECjAppServer2001 is an industry-standard benchmark designed to measure the performance of J2EE application servers. The benchmark was derived from ECperf, which was developed under the Java Community Process (JCP).

Q2: Why did you name the new benchmark SPECjAppServer2001 instead of SPECjAppServer2002?

There will be two SPECjAppServer benchmark releases in 2002: SPECjAppServer2001, which is based on the EJB 1.1 specification, and SPECjAppServer2002, which is based on the EJB 2.0 specification (see Q3).

Q3: Why are you releasing two incomparable benchmarks in the span of three months?

The two benchmarks are very similar except for the EJB specification used. Although the EJB 2.0 specification is complete, some vendors are not yet able to publish benchmark results using it, while other vendors will not be able to publish results using the EJB 1.1 specification. To allow vendors in either situation to publish results, it was decided to release two benchmarks, one supporting each specification. The two benchmarks are incomparable because the two EJB specifications present different optimization opportunities and constraints.

Q4: Does this benchmark obsolete SPECjvm98 or SPECjbb2000?

No. SPECjvm98 is a client JVM benchmark. SPECjbb2000 is a server JVM benchmark. SPECjAppServer2001 is a J2EE Application Server benchmark.

Q5: What is the performance metric for SPECjAppServer2001?

SPECjAppServer2001 expresses performance in terms of two metrics: TOPS (Total Operations Per Second), which measures throughput, and Price/TOPS, which measures price/performance (see Q10).
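As a worked illustration using hypothetical numbers (not taken from any published result): a configuration with a total priced-system cost of $150,000 that sustains 300 TOPS would report a price/performance of $150,000 / 300 TOPS = $500 per TOPS.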

Q6: Where can I find published results for SPECjAppServer2001?

SPECjAppServer2001 results are available via SPEC’s Web site: http://www.spec.org/.

Q7: Who developed SPECjAppServer2001?

SPECjAppServer2001 was originally developed as ECperf by the JCP as JSR 4. The JSR (Java Specification Request) for ECperf was approved in March 1999, and ECperf 1.0 was first released in June 2001. JSR 131 was set up immediately afterward to create a maintenance release (ECperf 1.1) and to maintain the expert group that evaluated and accepted ECperf 1.0 results. The JCP Expert Groups consisted of:

JSR 4: Art Technology Group (ATG), BEA Systems, Borland Software Corporation, Hewlett-Packard, IBM, Informix Software, IONA Technologies PLC, iPlanet, Oracle, Sun Microsystems, Inc., Sybase

JSR 131: Art Technology Group (ATG), BEA Systems, Borland Software Corporation, Compaq Computer Corporation, Fujitsu Limited, Hewlett-Packard, Hitachi, Ltd., IBM, William Kayser, Macromedia, Inc., Oracle, Persistence Software Inc., Pramati Technologies, Silverstream Software, Sun Microsystems, Inc., Sybase

ECperf was licensed to the Standard Performance Evaluation Corporation (SPEC) by Sun Microsystems in May 2001. SPEC is a consortium of leading hardware vendors, software vendors, and university representatives whose purpose is "...to develop suites of benchmark programs that are effective and fair in comparing the performance of high performance computing systems..." (quoted from SPEC's policy document). The SPEC OSG Java subcommittee is responsible for accepting and reviewing any submitted results, fixing bugs and enhancing the benchmark for future releases. The SPEC OSG Java subcommittee includes participants from major J2EE application server vendors, several of whom participated in the ECperf JSRs.

Q8: Why did Sun give the benchmark to SPEC?

The JCP is a process, not an organization. Although a JCP expert group was established in the short term to accept, review, and publish ECperf 1.0 results, this was not viable in the long term. The expert group felt that result review, publication, and challenges are best handled by an organization such as SPEC, which specializes in benchmarks.

Q9: How much control will the JCP have on the SPECjAppServer2001 benchmark?

None. However, several members of the JSR expert groups are members of SPEC and will continue to develop and support the benchmark effort. SPEC membership is open to anyone who wishes to participate in the benchmark effort.


Q10: Other SPEC benchmarks do not have a price/performance metric. Can you explain why SPECjAppServer2001 has a price/performance metric?

SPECjAppServer2001 is descended from ECperf, which was developed under the JCP and included a price/performance metric. SPEC committees debated the inclusion of this metric in SPECjAppServer2001. When the benchmark was released, SPEC decided to retain the metric on an experimental basis, with the experiment set to expire at the conclusion of the review cycle ending on 05/03/2003.

At a SPEC OSSC meeting on 04/08/2003, the OSSC voted that the SPECjAppServer2001 benchmark would not be automatically retired on 05/03/2003, but would instead continue until six months after the release of the follow-on benchmark (SPECjAppServer2003).

Q11: Can I compare SPECjAppServer2001 results with ECperf 1.0 results?

No. Major changes were made to the benchmark. One major change was the definition of specific data consistency requirements. These additional requirements changed the performance characteristics of the benchmark such that comparisons are inappropriate. See section 2.10.4 of the SPECjAppServer2001 Run and Reporting Rules for more information on the consistency requirements.

Q12: Can I compare SPECjAppServer2001 results with ECperf 1.1 results?

No, for two reasons. First, ECperf 1.1 results cannot be announced publicly. Second, while SPECjAppServer2001 uses the same workload as ECperf 1.1, it produces a different metric, so a direct comparison of results is not appropriate.

Q13: Can I compare SPECjAppServer2001 results with TPC-C results or TPC-W results?

No, absolutely not. SPECjAppServer2001 uses completely different data-set sizes and workload mixes, and has its own run and reporting rules, measure of throughput, and metrics.

Q14: Can I compare SPECjAppServer2001 results to results from other SPEC benchmarks?

No. There is no logical way to translate results from one benchmark to another.

Q15: Can I compare SPECjAppServer2001 results in different categories?

No. The Centralized categories (Single Node System, Dual Node System, and Multiple Node System) were established to prevent comparisons between dissimilar hardware configurations. The Distributed category has different performance characteristics from the Centralized categories because it uses multiple resource managers.

Q16: Do you permit benchmark results to be estimated or extrapolated from existing results?

No.


Q17: What does SPECjAppServer2001 actually test?

SPECjAppServer2001 mainly tests the Enterprise JavaBeans (EJB) container in a J2EE 1.2 compatible server. It does not exercise all components of J2EE 1.2. See section 1.1 of the Design Document for more information.
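To make concrete what an EJB container hosts, here is a minimal EJB 1.1 stateless session bean of the kind a J2EE 1.2 server deploys and manages. This is an illustrative sketch only, not code from the benchmark; all names are hypothetical, and in a real deployment each type would be public and in its own source file.

    import java.rmi.RemoteException;
    import javax.ejb.CreateException;
    import javax.ejb.EJBHome;
    import javax.ejb.EJBObject;
    import javax.ejb.SessionBean;
    import javax.ejb.SessionContext;

    // Remote interface: the view that clients call through the container.
    public interface PriceQuote extends EJBObject {
        double quote(int itemId, int quantity) throws RemoteException;
    }

    // Home interface: clients look this up via JNDI to create bean instances.
    interface PriceQuoteHome extends EJBHome {
        PriceQuote create() throws CreateException, RemoteException;
    }

    // Bean class: the container invokes the life-cycle callbacks below.
    class PriceQuoteBean implements SessionBean {
        public double quote(int itemId, int quantity) {
            // Hypothetical business logic; a real bean would consult the database.
            return quantity * 9.99;
        }

        public void ejbCreate() {}
        public void ejbRemove() {}
        public void ejbActivate() {}
        public void ejbPassivate() {}
        public void setSessionContext(SessionContext ctx) {}
    }

The container, not the application, supplies the implementations of PriceQuote and PriceQuoteHome; how efficiently the container performs this mediation is a large part of what the benchmark exercises.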

Q18: What are the significant influences on the performance of the SPECjAppServer2001 benchmark?

The most significant influences on the performance of the benchmark are the J2EE application server software (in particular, its EJB container), the Java Virtual Machine it runs on, the database server software and its JDBC driver, and the hardware (CPUs, memory, disks, and network) supporting both the J2EE server and the database server.

Q19: Does this benchmark aim to stress the J2EE server or the database server?

This benchmark was designed to stress the J2EE server. However, because it is a solution-based benchmark, other components (such as the database server) are stressed as well.

Q20: Can you describe the workload?

The benchmark emulates a manufacturing, supply chain management (SCM) and order/inventory system. For additional details see the SPECjAppServer2001 Design Document.

Q21: Can I use SPECjAppServer2001 to determine the size of the server I need?

SPECjAppServer2001 should not be used to size a J2EE 1.2 server configuration, because it is based on a specific workload that makes numerous assumptions, which may or may not apply to other user applications. SPECjAppServer2001 is a tool that provides a level playing field for comparing J2EE 1.2 compatible server products. The benchmark can also be used for internal stress testing, with the understanding that the test results are for internal use only.

Q22: What hardware is required to run the benchmark?

In addition to the hardware for the System Under Test (SUT), one or more client machines are required as well as the network equipment to connect the clients to the SUT. The number and size of client machines required by the benchmark will depend on the injection rate to be applied to the workload.
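The load scales with the injection rate, which is the rate at which the driver injects business transactions into the SUT. The following sketch is not the actual SPECjAppServer2001 driver (its structure and names are hypothetical); it only illustrates the general idea of a fixed-rate injector thread:

    // Hypothetical fixed-rate load injector: issues one request roughly every
    // (1000 / injectionRate) milliseconds. Not the actual benchmark driver.
    public class Injector implements Runnable {
        private final int injectionRate; // requests per second

        public Injector(int injectionRate) {
            this.injectionRate = injectionRate;
        }

        public void run() {
            long intervalMillis = 1000L / injectionRate;
            while (!Thread.currentThread().isInterrupted()) {
                long start = System.currentTimeMillis();
                sendRequest();
                long sleepMillis = intervalMillis - (System.currentTimeMillis() - start);
                if (sleepMillis > 0) {
                    try {
                        Thread.sleep(sleepMillis);
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            }
        }

        private void sendRequest() {
            // Placeholder: the real driver invokes EJB business methods on the SUT.
        }
    }

Higher injection rates mean more concurrent work, which is why the number and size of the client machines must grow with the target rate.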

Q23: What is the minimum configuration necessary to test this benchmark?

A member of SPEC has run the benchmark on a Pentium III 1GHz laptop system with 1024MB of RAM and a 30GB hard drive. The benchmark completed successfully with an injection rate of 5.

Note: This is not a configuration that you can use to report results, as it does not meet the durability requirements of the benchmark.

Q24: What additional software is required to run the benchmark?

SPECjAppServer2001 requires a J2EE 1.2 compatible server as well as a database server. See section 2.2 in the SPECjAppServer2001 Run and Reporting Rules for details on all the products. Also, a Java Runtime Environment (JRE) version 1.3 or later must be installed on the client machines.

Q25: Do you provide source code for the benchmark?

Yes, but if you are publishing results you are required to run with the files provided in the benchmark kit. As a general rule, modifying the source code is not allowed, although specific items (for example, the Load Program) may be modified to port the application to your environment. The areas where changes are allowed are listed in the SPECjAppServer2001 Run and Reporting Rules, and any changes made must be disclosed in the submission file when submitting results.


Q26: Is there a web layer in the SPECjAppServer2001 benchmark?

No. We will be adding a web layer in the SPECjAppServer2003 benchmark.

Q27: Why did you not address SSL (Secure Socket Layer)?

SSL is addressed in the SPECweb99_SSL benchmark.

Q28: Can I report results on a large partitioned system?

Yes.

Q29: Is the benchmark cluster scalable?

Yes.

Q30: How scalable is this benchmark?

In our initial tests we have seen good scalability with three 4-CPU systems (two for the J2EE server and one for the database server), and we did not explicitly restrict scalability in the benchmark.

Q31: Can I report with vendor A hardware, a vendor B J2EE Server, and vendor C database software?

The SPECjAppServer2001 Run and Reporting Rules do not preclude third-party submission of benchmark results, but result submitters must abide by the licensing restrictions of all the products used in the benchmark; SPEC is not responsible for vendor (hardware or software) licensing issues. Note that many products restrict the publication of benchmark results without the express written permission of the vendor.

Q32: Can I use Microsoft SQL Server for the database?

Yes. You can use any relational database that is accessible by JDBC and satisfies the SPECjAppServer2001 Run and Reporting Rules.
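As an illustration of what "accessible by JDBC" means, the following minimal connectivity check is a sketch only; the driver class name, URL, user, and password are hypothetical placeholders to be replaced with your database vendor's values:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Minimal JDBC connectivity check; all connection details are placeholders.
    public class JdbcCheck {
        public static void main(String[] args) throws Exception {
            Class.forName("com.example.jdbc.Driver"); // load the vendor's driver
            Connection conn = DriverManager.getConnection(
                    "jdbc:example://dbhost:1234/specdb", "user", "password");
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT 1"); // probe query; syntax varies by database
            while (rs.next()) {
                System.out.println("Database reachable: " + rs.getInt(1));
            }
            rs.close();
            stmt.close();
            conn.close();
        }
    }

Any database that supports this kind of JDBC access, and that meets the requirements in the Run and Reporting Rules, is eligible.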

Q33: I am using public domain software; can I report results?

Yes, as long as the product satisfies the SPECjAppServer2001 Run and Reporting Rules.

Q34: Are the results independently audited?

No.

Q35: Can I announce my results before they are reviewed by the SPEC Java Subcommittee?

No.


Q36: How can I publish SPECjAppServer2001 results?

Only SPECjAppServer2001 licensees can publish results. All results are subject to review by SPEC prior to publication.

For more information about submitting results, please contact SPEC.

Q37: How do I obtain the SPECjAppServer2001 benchmark?

To place an order, use the on-line order form or contact SPEC at http://www.spec.org/spec/contact.html.

The ECperf 1.1 benchmark is available for free from http://java.sun.com/j2ee/ecperf/. This benchmark can only be used for internal performance testing and tuning (see clause 8 of the ECperf 1.1 specification).

Q38: How much does the SPECjAppServer2001 benchmark cost?

Current pricing for all the SPEC benchmarks is available from the SPEC on-line order form. SPEC members receive the benchmark at no charge.

Q39: How much does it cost to publish results?

Contact SPEC at http://www.spec.org/spec/contact.html to learn the current cost to publish SPECjAppServer2001 results. SPEC members can submit results free of charge.

Q40: What if I have questions about running the SPECjAppServer2001 benchmark?

The procedures for installing and running the benchmark are contained in the SPECjAppServer2001 User Guide, which is included in the kit and is also available from the SPEC web site. If your questions are not answered there, contact SPEC at http://www.spec.org/spec/contact.html.

Q41: Where can I go for more information?

SPECjAppServer2001 documentation consists mainly of four documents: User Guide, Design Document, Run and Reporting Rules, and this FAQ. The documents can be found in the benchmark kit or on SPEC’s Web site: http://www.spec.org/.


Java, J2EE and ECperf are trademarks of Sun Microsystems, Inc.

TPC-C and TPC-W are trademarks of the Transaction Processing Performance Council.

SQL Server is a trademark of Microsoft Corp.