SSD and programming [productivity]


I am trying to put together a business case to get every developer an SSD drive.

The main codebase contains approximately 400,000 lines of code. My theory is that, since the code is spread across roughly 1,500 files, an SSD would be significantly faster for compiles; the argument is that many small reads really punish the seek-time bottleneck of a conventional hard drive.
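To make that concrete, here is a back-of-the-envelope sketch in Python; the per-file access times are assumed typical figures for illustration, not measurements from any specific drive:

```python
# Rough estimate of pure access latency for a cold read of ~1,500 source
# files. The per-access figures are assumptions (typical published
# numbers), not benchmark results.
NUM_FILES = 1_500

ACCESS_MS = {
    "HDD": 9.0,   # assumed average seek + rotational latency, 7200 rpm drive
    "SSD": 0.15,  # assumed random-read latency for a consumer SSD
}

for drive, per_file_ms in ACCESS_MS.items():
    total_s = NUM_FILES * per_file_ms / 1000.0
    print(f"{drive}: ~{total_s:.2f} s spent just locating {NUM_FILES} files")
# Roughly: HDD ~13.5 s vs SSD ~0.2 s of seek overhead per cold build
```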

Am I right? Is the cost of an SSD justified by the productivity gain from reduced edit/compile cycle times?

How long does a compile take now? Buy one SSD, benchmark it, and weigh the measured gains against the still very expensive per-GB cost of SSDs to see whether it is worth it.
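A minimal benchmarking sketch, assuming the same checkout exists on both drives and the build is scriptable; the paths and build command below are placeholders to replace with your real ones:

```python
# Time a clean build of the same source tree on each drive. Repeat several
# runs and clear the OS file cache between them (e.g. by rebooting, or
# "echo 3 > /proc/sys/vm/drop_caches" on Linux) so cached reads don't
# flatter either drive.
import subprocess
import time

BUILD_CMD = ["make", "-j4", "all"]   # placeholder build command
TREES = {
    "HDD": "/mnt/hdd/project",       # hypothetical checkout on the hard drive
    "SSD": "/mnt/ssd/project",       # hypothetical checkout on the SSD
}

for drive, path in TREES.items():
    start = time.perf_counter()
    subprocess.run(BUILD_CMD, cwd=path, check=True)
    print(f"{drive}: clean build took {time.perf_counter() - start:.1f} s")
```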

~~The main business case for SSDs used to be that they have no moving parts, which is exactly what you need in a laptop; for raw bandwidth you could do better by striping multiple conventional drives than by using a single SSD.~~

Update, April 2011: the main business case for SSDs is no longer just that they have no moving parts, and they are no longer easily beaten by many spinning platters. The attributes that make SSDs attractive now are their incredibly low latency (many are < 0.1-0.2 ms), very high bandwidth (read/write speeds of 200-700 MB/s), and very high random IOPS rates (10,000 to 120,000 random 4K-aligned IOPS). The low latency in particular may improve compile times, depending on your exact environment. Also consider the improvement to every operation on the developer machine beyond compiling the project (such as source-control checkouts). The advice to benchmark the improvements and build a business case from them still holds.
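Once you have benchmark numbers, the business case itself is simple arithmetic. A payback sketch follows; every figure in it is an assumption to replace with your own measurements and local rates:

```python
# Payback calculation for the business case. All inputs are assumed
# example values, not data from the question.
SSD_COST = 250.0                  # assumed cost per drive
SECONDS_SAVED_PER_BUILD = 30.0    # from your HDD-vs-SSD benchmark
BUILDS_PER_DAY = 40               # assumed edit/compile cycles per developer
DEV_COST_PER_HOUR = 60.0          # assumed fully loaded developer cost

hours_saved_per_day = SECONDS_SAVED_PER_BUILD * BUILDS_PER_DAY / 3600.0
value_saved_per_day = hours_saved_per_day * DEV_COST_PER_HOUR
payback_days = SSD_COST / value_saved_per_day

print(f"~{hours_saved_per_day:.2f} h saved per developer per day; "
      f"the drive pays for itself in ~{payback_days:.0f} working days")
```

With these example inputs the drive pays for itself in about 12-13 working days; plug in your own numbers to see whether the case holds for your team.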

As someone else has said, if compile time is the bottleneck, you may have a bigger problem than just I/O time.

