Is a workstation faster than a high-end PC of similar price? Manufacturers make varying claims. We decided to find out by comparing an entry-level workstation with a reasonably high-end PC. For the entry-level workstation, we chose the Sun SPARCstation 5 with a 70-MHz clock and 32 MB of memory, which cost $5664. For the PC, we selected a Pentium-based system with an internal 90-MHz clock and a frequency divider that reduced the external clock speed to 60 MHz; it had 32 MB of memory and cost $3989. To supplement the original hard-disk drives, we installed 4-GB Seagate drives in both machines. The database, users' home directories, and data files used for the tests resided solely on the Seagate drives.
We tried to use the same operating system (Solaris 2.4) on both machines, but some variation was inevitable because the two chips execute different instruction sets.
We configured each machine with 32 MB of memory, enough to run most of our tests with up to 32 simulated users. Because of time constraints, we ran most of the tests using only 1, 4, 8, 12, 16, 20, 24, 28, and 32 users.
We used the Neal Nelson Remote Terminal Emulator (RTE) to generate the commands and to measure the response times. For each test, the RTE emulated the desired number of users, started them all simultaneously, and measured every response time.
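The RTE's measurement loop is conceptually simple. As a rough illustration (our own sketch in Python, not Neal Nelson's code; the command string is hypothetical), each emulated user just wraps every command in a wall-clock timer:

    import subprocess
    import time

    def timed_run(command):
        """Run one shell command and return its wall-clock response time."""
        start = time.monotonic()
        subprocess.run(command, shell=True, check=True)
        return time.monotonic() - start

    # Hypothetical script step: sort a test file and discard the output.
    print("response time: %.2f s" % timed_run("sort /etc/services > /dev/null"))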
To compare these two machines (see Figure 1), we ran several tests using a selection of real-world software.
While this selection should allow you to pick and choose the tests that most closely approximate your situation, in this article, we'll focus on the results of tests that use software-development commands. For results from the other tests, please contact us.
We conducted the test suites "categorically," which means two things: every user executes the same RTE script, and all users execute the same instruction at the same time. We used Solaris's default kernel configuration on both machines.
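To make this lockstep scheme concrete, here is a minimal sketch (again ours, not the real RTE, and with hypothetical commands) in which a barrier releases every simulated user into each step at the same moment:

    import subprocess
    import threading
    import time

    NUM_USERS = 4                      # one thread per simulated user
    COMMANDS = [                       # hypothetical script steps
        "grep -c tcp /etc/services > /dev/null",
        "sort /etc/services > /dev/null",
    ]

    barrier = threading.Barrier(NUM_USERS)
    results = {}                       # (user, step) -> response time

    def user(uid):
        for step, cmd in enumerate(COMMANDS):
            barrier.wait()             # all users begin this step simultaneously
            start = time.monotonic()
            subprocess.run(cmd, shell=True)
            results[(uid, step)] = time.monotonic() - start

    threads = [threading.Thread(target=user, args=(u,)) for u in range(NUM_USERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()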
In the software-development-command portion of the tests, the scripts edited a C program containing 11,229 lines (333,146 bytes), ran the stream editor (sed) and program checker (lint), examined the disk space and files in a directory structure with 3016 files in 64 directories, and converted, compressed, and compared files containing 8000 lines and 400 KB of data.
Figure 1: Configuration details of the two test systems.

                         (a) Sun                       (b) Intel
Manufacturer:            Sun Microsystems              Intel
Model:                   SPARCstation 5                Plato motherboard
Price we paid:           $5664.00                      $3989.00
CPU:                     microSPARC II                 Pentium
CPU speed:               70 MHz                        90 MHz (internal),
                                                       60 MHz (external)
RAM:                     32 MB                         32 MB
Number of CPUs:          1                             1
Level 1 cache:           24 KB                         16 KB
Level 2 cache:           0 KB                          256 KB
Ext. disk-drive model:   Seagate ST-15150N             Seagate ST-15150N
  Capacity:              4.0 GB                        4.0 GB
  Rotation speed:        7200 rpm                      7200 rpm
I/O controller:          Sun Microsystems              BusLogic
  Model:                 internal to SPARCstation 5    BT-956C PCI
                         motherboard
Interface type:          SCSI-II                       SCSI-II
Operating system:        SunSoft Solaris 2.4           SunSoft Solaris 2.4 for x86
To analyze the data points, we could have taken averages, but an average doesn't tell the whole story and can be very misleading. Instead, we analyzed each measurement category separately: we plotted all the associated performance ratios in ascending order, and that immediately uncovered invalid data points.
One problem was that, for many of the tests, the RTE measured response times in "clock ticks" of a quarter second. Very short measurements were therefore imprecise: a half-second event spans only two ticks, so quantization error alone could be a large fraction of the reading. So we eliminated any test whose result measured less than one second.
But there were other invalid data points. For example, while a database runs, the database manager occasionally writes its log file. The user usually has no control over logging, and if it happens during a performance test, that test is invalid. Similar problems may occur with other applications. We therefore eliminated the best and worst 5 percent of the remaining data from each category. Still, one category showed a few badly skewed performance ratios. For that category, we eliminated the best and worst 10 percent of the data.
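In code, the whole cleanup amounts to a filter, a sort, and a trim. Here is a minimal sketch (our own reconstruction with made-up example numbers, assuming a test is dropped when either machine's time falls below one second):

    def clean_ratios(pairs, trim=0.05, min_seconds=1.0):
        """pairs: (sun_time, intel_time) response-time pairs for one category."""
        # Drop measurements too short for quarter-second clock ticks to resolve.
        kept = [(s, i) for s, i in pairs if s >= min_seconds and i >= min_seconds]
        # Performance ratio as in Table 1: above 1.0 means the Pentium was faster.
        ratios = sorted(s / i for s, i in kept)
        # Trim the best and worst 5 percent (10 percent for one skewed category).
        cut = int(len(ratios) * trim)
        return ratios[cut:len(ratios) - cut] if cut else ratios

    # The half-second pair is discarded; the rest print in ascending order.
    print(clean_ratios([(150.04, 80.43), (2.37, 2.37), (0.50, 0.40)]))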
Figure 2 is a graph of the performance ratios for the software-development-commands category, with the outlying data deleted. The vertical axis shows the performance ratio; the horizontal axis shows the test number. Table 1 presents the ten highest and ten lowest performance ratios we graphed for the software-development category, with a brief description of each test.
Table 1: The ten lowest and ten highest Intel/Sun performance ratios for the software-development commands; response times are in seconds.

Script  Meas.                                    No. of  Sun     Intel   Ratio:
No.     No.    Description                       Users   Resp.   Resp.   Intel/Sun
343     3      Copy two files to a directory     4       2.37    2.37    1.00
343     5      Split an 8000-line file           32      17.75   17.75   1.00
341     3      vi: put 1000 lines                4       9.18    9.06    1.01
343     4      sum two files (800,000 char.)     28      10.83   10.58   1.02
341     3      vi: put 1000 lines                32      67.75   65.82   1.03
343     10     grep two files                    28      12.16   11.74   1.04
343     10     grep two files                    8       3.43    3.31    1.04
343     4      sum two files (800,000 char.)     16      6.50    6.23    1.04
343     4      sum two files (800,000 char.)     20      7.96    7.62    1.04
343     3      Copy two files to a directory     28      17.04   16.25   1.05
344     4      List directory to file            1       8.75    4.75    1.84
342     1      lint 333K program                 12      150.04  80.43   1.87
341     4      vi: delete 2000 lines             32      4.28    2.19    1.95
344     4      List directory to file            4       32.87   16.81   1.96
341     4      vi: delete 2000 lines             16      2.78    1.42    1.96
344     3      Sort file                         28      568.12  280.11  2.03
343     8      Character dump of file            16      166.06  68.71   2.42
344     3      Sort file                         24      279.29  115.41  2.42
341     4      vi: delete 2000 lines             24      4.25    1.75    2.43
341     4      vi: delete 2000 lines             20      3.51    1.40    2.51
Figure 2 presents the performance ratios less the fastest and slowest 5 percent. Here the Pentium was never slower than the SPARC 5. Table 1 shows that the Pentium's advantage was smallest on the simple commands and largest on the more-complex commands, particularly sort.