Thanks for the response. The university itself has a gigabit line, but we're on a segment that's at Fast Ethernet; actual performance is generally closer to 90 Mbps down / 60 up, which is fine for our purposes. My real question, I guess, is what's generating that traffic: is it created by players on the server, or by stats streamed to and from the server constantly? The first people I asked were friends who work in the game server hosting industry, and that's the answer they gave right after hearing it was an ArmA II mod. They could be biased, though; I know they think that anything not coded natively for Linux, or not done by either Valve or id, inherently isn't coded well in the first place.

We rebuilt our server last February with a new 3.2 GHz quad-core Sandy Bridge. CPU utilization hasn't been an issue with the new build so far, but then again it wasn't on our older Core 2 Quad server either. My concern came from experiences like Minecraft, which until just a few months ago had memory leaks that would eventually consume the system's entire 16 GB.

We use Windows Server's native VM service but, again, only for temporary purposes while testing stability and the like. So far we've never kept a program in the VM sandbox for more than a month; ideally we wouldn't need one at all. I'm interested, do you have an example?
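For what it's worth, in most multiplayer games the bulk of the traffic is the server rebroadcasting world-state updates to every connected client, so it scales roughly with player count × update rate × update size, plus protocol overhead. Here's a rough back-of-envelope sketch; every number in it is an illustrative assumption, not a measured ArmA II figure:

```python
# Rough estimate of a game server's upstream bandwidth.
# All parameter values below are illustrative assumptions,
# NOT measured ArmA II / mod values.

def upstream_mbps(players, tick_rate_hz, bytes_per_update, overhead=1.2):
    """Estimate server upstream in Mbps.

    Each tick, every client receives one state-update packet;
    `overhead` is a fudge factor for UDP/IP headers and framing.
    """
    bits_per_sec = players * tick_rate_hz * bytes_per_update * 8 * overhead
    return bits_per_sec / 1_000_000

# Hypothetical example: 50 players, 15 Hz updates, ~600-byte snapshots
print(round(upstream_mbps(50, 15, 600), 1))  # a few Mbps, not tens
```

The point is that mods which sync a lot of entities (loot, AI, vehicles) inflate `bytes_per_update`, which is probably what your hosting friends were getting at. Measuring at the NIC while player count varies would tell you far more than any estimate like this.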