How can you come to an understanding of what it costs to use old computing equipment? There are the obvious costs that are easy to see and quantify, such as tech support and hardware failure replacement. Harder to quantify are the costs of lost productivity: viruses and malware, OS obsolescence, slow-loading applications, slow switching between applications, conversion of files between outdated versions of software, and so on.
In 2013, Techaisle surveyed 736 small and medium businesses in an attempt to quantify these costs. The results indicate that computing hardware more than 4 years old requires repairs 150% more frequently than hardware less than 4 years old, and accounts for an average of 21 additional lost productive hours per year per machine compared with newer machines. How often have you or someone you know said they boot up their computer, go get a coffee, and 20 minutes later the machine is ready to use? Did you know that modern machines with solid state drives and ample RAM typically boot in under 1 minute? How many working days can be saved with modern equipment in just this boot scenario? 260 working days per year × 19 minutes (boot time differential) = 4,940 minutes = 82.33 hours = 10.3 days of lost productivity. Even at minimum wage that would be a loss of roughly $600 per year, but what is worse is the lost revenue that the employee could have generated in that time.
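The boot-time arithmetic above can be sketched as a short calculation. This is a minimal illustration, assuming a standard 8-hour working day and the 2013 US federal minimum wage of $7.25/hour; the 260 days and 19-minute differential come from the figures in the text.

```python
# Annual cost of the boot-time differential between old and new machines.
WORK_DAYS_PER_YEAR = 260
BOOT_DIFF_MINUTES = 19      # ~20 min on old hardware vs ~1 min on new
HOURS_PER_WORK_DAY = 8      # assumption: standard 8-hour working day
MIN_WAGE_PER_HOUR = 7.25    # assumption: 2013 US federal minimum wage

lost_minutes = WORK_DAYS_PER_YEAR * BOOT_DIFF_MINUTES
lost_hours = lost_minutes / 60
lost_days = lost_hours / HOURS_PER_WORK_DAY
wage_cost = lost_hours * MIN_WAGE_PER_HOUR

print(f"{lost_minutes} minutes = {lost_hours:.2f} hours = {lost_days:.1f} working days")
print(f"Wage cost at minimum wage: ${wage_cost:.2f}")
```

Swapping in an employee's actual hourly rate in place of the minimum wage gives a more realistic figure for a given business.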
New hardware and software have never been better priced than they are now.