Right now I'm doing some GPU testing with the ATI Catalyst driver on Linux kernel 3.0.1, which I built yesterday. To get a working FGLRX driver, I just ran these steps in order:
1. Prepare the kernel source as described here: http://ftp.paudni.kemdiknas.go.id/slackware/slackware-current/source/k/README.TXT
# cd /usr/src/linux-$YOUR_KERNEL_VERSION
# cat config-$YOUR_KERNEL_VERSION > .config
# make oldconfig
# make bzImage
# make clean
# make prepare
# rm .version
2. Build the Slackware package of ATI Catalyst driver or FGLRX:
# /bin/sh ati-driver-installer-11-7-x86.x86_64.run --buildpkg Slackware/Slackware
3. Install the package
# installpkg fglrx-$VERSION-$ARCH-1.tgz
4. Configure your xorg.conf using the command: aticonfig --initial
5. Reboot the system to load the fglrx kernel module.
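For reference, the five steps above can be collected into a single script. This is only a sketch: the kernel version, installer filename, and package name are the ones from this post and will differ on your box.

```shell
#!/bin/sh
# Sketch of the whole procedure as one function. Run as root.
# Version numbers and filenames below are examples from this post.
install_fglrx() {
    set -e  # stop at the first failing step

    # Step 1: prepare the kernel source tree
    cd /usr/src/linux-3.0.1
    cat config-3.0.1 > .config
    make oldconfig
    make bzImage
    make clean
    make prepare
    rm -f .version

    # Step 2: build the Slackware package from the ATI installer
    /bin/sh ati-driver-installer-11-7-x86.x86_64.run \
        --buildpkg Slackware/Slackware

    # Step 3: install the resulting package (name varies by version/arch)
    installpkg fglrx-*.tgz

    # Step 4: generate an xorg.conf section for fglrx
    aticonfig --initial
}

# Step 5 is the reboot, so typically:
# install_fglrx && reboot
```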
If at some point you have to remove the ATI Catalyst package by running aticonfig --uninstall, I recommend reinstalling the xorg-server package, as some of its files were renamed by the ATI Catalyst installer to avoid conflicts with the fglrx driver and libraries.
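The removal plus cleanup could look like the sketch below. It assumes slackpkg is configured on your system; you can just as well reinstall xorg-server manually with installpkg from your Slackware media.

```shell
# Sketch: remove Catalyst and restore the stock X.Org files it renamed.
# Assumes a configured slackpkg; adjust to installpkg if you prefer.
restore_xorg() {
    aticonfig --uninstall           # remove the Catalyst driver files
    slackpkg reinstall xorg-server  # put back the renamed X.Org files
}
```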
Catalyst Version: 11.7

hape@cklg:~$ glxinfo | more
name of display: :0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: ATI
server glx version string: 1.4
OpenGL vendor string: ATI Technologies Inc.
OpenGL renderer string: ATI Radeon 3000 Graphics
OpenGL version string: 3.3.10907 Compatibility Profile Context
OpenGL shading language version string: 3.30

hape@cklg:~$ glxgears
7540 frames in 5.0 seconds = 1507.473 FPS
8034 frames in 5.0 seconds = 1606.597 FPS
7881 frames in 5.0 seconds = 1576.125 FPS
8105 frames in 5.0 seconds = 1620.836 FPS
7979 frames in 5.0 seconds = 1595.738 FPS
I'm currently running KDE 4.7.0 (Alien build) with the desktop effects feature enabled. So far it's smooth: no flicker, and the CPU load is far lower than with Mesa 7.11 as the 3D accelerator. I also tested MPlayer playing an H.264 movie using ATI Radeon AVIVO, which has some nice GPU acceleration features (hardware and software); there was no performance penalty and CPU usage was lower, as you can see in the screenshots below.
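If you want to repeat the MPlayer test yourself, a rough way to watch CPU usage during playback is sketched below. The sample filename is hypothetical; -vo xv is a standard MPlayer output that offloads scaling and colorspace conversion to the GPU.

```shell
# Sketch: play a clip and snapshot mplayer's CPU usage a few times.
# sample-h264.mkv is a placeholder filename.
play_and_watch() {
    mplayer -vo xv sample-h264.mkv &
    top -b -n 3 -d 2 | grep mplayer  # three batch-mode snapshots, 2s apart
    wait
}
```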
Basically, you will see high CPU usage when playing H.264-compressed video, because decoding it is expensive by nature and the player will take advantage of GPU features when they are available. If your GPU lacks those features, the whole decoding job falls on your CPU, and as a result you get higher CPU usage and more heat in your box.
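One rough way to check whether GPU-assisted H.264 decoding is available is sketched below. vainfo (from libva) lists VA-API decode profiles; note that fglrx may expose decoding through its own path instead, so a miss here is inconclusive rather than definitive.

```shell
# Sketch: look for a VA-API H.264 decode profile via vainfo (libva).
# A missing vainfo or empty result does not rule out vendor-specific paths.
check_h264_decode() {
    if command -v vainfo >/dev/null 2>&1 &&
       vainfo 2>/dev/null | grep -q H264; then
        echo "VA-API H.264 decode profile found"
    else
        echo "no VA-API H.264 profile reported; decoding may fall back to the CPU"
    fi
}
```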
So happy slacking...