I was baffled at the visceral reaction to what I (and others) thought was a reasonable question. Everyone knows that the purchase price of any tool is loose change compared to the cost of using it. What's the problem with discussing it?
Consider this: A software development tools company needed a source control system, but it balked at paying $250,000 to buy one. Instead, the development team decided to "eat their own dog food" and use their own tools to build the source control system. Everyone thought this was a clever idea: they already had the developers, their own tools were free, and they could customize everything to their liking.
Had they spent the $250,000 in the beginning, their fixed costs would have been $45,000 in support fees plus $150,000 for one administrator. Instead, by the time they were done, they had spent three times the original purchase price and were stuck with one and a half times those fixed costs going forward, forever.
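The arithmetic here is worth making explicit. A quick sketch using the column's own figures (treating the support and administrator costs as annual recurring expenses is my assumption; the column only says they continue "forever"):

```python
# Buy option: figures from the column.
buy_upfront = 250_000            # purchase price of the source control system
buy_fixed = 45_000 + 150_000     # support fees + one administrator (recurring)

# Build option: the column says they spent three times the original cost
# and carried one and a half times the fixed costs.
build_upfront = 3 * buy_upfront
build_fixed = 1.5 * buy_fixed

print(f"Buy:   ${buy_upfront:,} up front, ${buy_fixed:,.0f} recurring")
print(f"Build: ${build_upfront:,} up front, ${build_fixed:,.0f} recurring")
```

Run it and the "free" in-house project comes out at $750,000 up front with $292,500 in recurring costs -- worse than buying on both axes.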
The trap they fell into was the idea that if you don't have to add incremental expense -- write a check or hire another person -- the work is somehow free. But for most companies, people are the biggest cost component; people are expensive, and whatever they spend their time on costs money.
Don't get me wrong: much has been accomplished by a renegade programmer whipping up a really cool tool. And I'm not against making the best use of what is freely -- or at least cheaply -- available.
What I am in favor of, though, is doing a cost/benefit analysis first. Someone who would balk at signing a $5,000 check may not blink at spending two weeks of development effort -- yet both cost the same. We also seem to forget that the cost is ongoing -- software never stops, and neither do the support costs.
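The check-versus-effort equivalence is easy to verify. A minimal sketch, where the loaded hourly rate is a hypothetical figure (real loaded rates -- salary plus benefits, office space, and overhead -- vary, but $60-plus an hour is not unusual):

```python
# Hypothetical fully loaded cost of one developer-hour (assumption, not from the column).
loaded_rate = 62.50

# Two weeks of one developer's time, at 40 hours per week.
effort_hours = 2 * 40
effort_cost = effort_hours * loaded_rate

check = 5_000
print(f"Two weeks of effort: ${effort_cost:,.0f} vs. a ${check:,} check")
```

At that rate the two weeks of "free" internal effort costs exactly as much as the check nobody wanted to sign.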