I remember playing Doom 3 on my PC at the time, and it ran at maybe 20 fps. I also remember PC-part marketing back then featuring stickers like "Runs Doom 3".
When it came out on Xbox, one of my friends marveled at how good it looked on console, but was annoyed at all the PC comparisons a second friend and I kept making.
Doom 3 is one of the landmark games for GPU-based shading: it made heavy use of GPU stencilling (for its shadow volumes), multiple texture units, and computation in shaders, and it massively advanced the state of the art in forward rendering. So much so that any modern GPU is very well optimised for 'the things that Doom 3 wants to do', because that's what every game that wants advanced lighting wants to do. The problem with using it for any kind of benchmarking is that basically any modern card will output Doom 3 at 200 fps at 4K while still in power-saving mode. It would be like trying to stress-test a CPU with Wolfenstein 3D: the state of the art has long moved past it, so the results can't tell cards apart any more.
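For anyone who hasn't dug into it, the signature technique was stencil shadow volumes: an ambient/depth pre-pass, then shadow volumes rasterized into the stencil buffer, then additive per-light passes masked by the stencil. Here's a minimal sketch of that pass structure, assuming an OpenGL 2.0+ context with glStencilOpSeparate available via your loader; the draw_* callbacks are hypothetical stand-ins for an engine's draw calls, not Doom 3's actual code:

```c
#include <GL/gl.h>

/* Hypothetical engine callbacks -- stand-ins, not id's API. */
void draw_scene_ambient(void);   /* geometry with ambient-only shading */
void draw_shadow_volumes(void);  /* extruded shadow volume geometry    */
void draw_scene_lit(void);       /* per-light diffuse/specular pass    */

void render_frame(void)
{
    /* Pass 1: fill the depth buffer with ambient-only shading. */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);
    draw_scene_ambient();

    /* Pass 2: rasterize shadow volumes into the stencil buffer only.
     * Depth-fail ("Carmack's reverse"): back faces increment and front
     * faces decrement on depth-test failure, leaving the stencil
     * non-zero exactly where the scene is in shadow. */
    glDepthMask(GL_FALSE);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glEnable(GL_STENCIL_TEST);
    glDisable(GL_CULL_FACE);  /* both windings in a single pass */
    glStencilFunc(GL_ALWAYS, 0, ~0u);
    glStencilOpSeparate(GL_BACK,  GL_KEEP, GL_INCR_WRAP, GL_KEEP);
    glStencilOpSeparate(GL_FRONT, GL_KEEP, GL_DECR_WRAP, GL_KEEP);
    draw_shadow_volumes();

    /* Pass 3: additive lighting, masked to unshadowed (stencil == 0)
     * pixels; depth test GL_EQUAL re-shades only visible surfaces. */
    glEnable(GL_CULL_FACE);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 0, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    glDepthFunc(GL_EQUAL);
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE);  /* one additive pass per light */
    draw_scene_lit();

    /* Restore default-ish state. */
    glDisable(GL_BLEND);
    glDisable(GL_STENCIL_TEST);
    glDepthFunc(GL_LESS);
    glDepthMask(GL_TRUE);
}
```

In the real engine, passes 2 and 3 repeat for every light touching the view, which is exactly why stencil fill rate mattered so much on 2004 hardware and why modern cards chew through it without breaking a sweat.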
Trying to get Doom 3 to render at 16:9 resolutions, though, rather than 4:3? Now that's stressful...
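(For the curious: as shipped, the game's FOV math assumed a 4:3 display, so running widescreen without a cropped view meant working out a corrected Hor+ field of view yourself. A minimal sketch of the standard conversion, assuming the FOV cvar is the 4:3 horizontal angle in degrees; widescreen_fov is a hypothetical helper, not an engine function:)

```c
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Hypothetical helper: convert a horizontal FOV specified for 4:3 into
 * the Hor+ equivalent for a wider aspect ratio, preserving the
 * vertical field of view instead of cropping it. */
double widescreen_fov(double fov_4_3_deg, double aspect_w, double aspect_h)
{
    double half = fov_4_3_deg * M_PI / 360.0;  /* half-angle in radians */
    double wide = atan(tan(half) * (aspect_w / aspect_h) / (4.0 / 3.0));
    return wide * 360.0 / M_PI;                /* back to full degrees */
}

int main(void)
{
    /* The default 90-degree FOV works out to about 106.26 at 16:9. */
    printf("%.2f\n", widescreen_fov(90.0, 16.0, 9.0));
    return 0;
}
```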