Following a post on Ocsigen's benchmark, I was curious how much the increased memory usage of Ruby- and Python-based web app stacks affects the bottom line. Plotting Amazon EC2 instance cost against instance memory shows that cost scales roughly linearly with the memory allowance; the least-squares fit is y(x) = 0.028*x + 0.079. Linear scaling means that when Ruby/Rails takes 10x the memory (roughly the difference between OCaml/Ocsigen and Ruby/Rails), it costs close to 10x as much in the long run, once you have scaled up enough that the constant term is negligible. For smaller instances the difference is smaller, because the fixed 0.079 offset makes up a larger share of the cost. Of course, this doesn't take the productivity difference into account. But although Ruby and Rails are clearly more productive than C/CGI, it is not clear that they are more productive than other high-level languages with better memory use and performance characteristics.
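To make the scaling effect concrete, here is a small OCaml sketch that evaluates the fitted line. It assumes x is instance memory in GB and y is hourly cost in dollars, and the 1.7 GB / 17 GB figures are purely illustrative, not taken from the plot:

```ocaml
(* Hourly EC2 cost as a function of instance memory, using the
   least-squares fit quoted above: cost = 0.028 * mem + 0.079.
   Units (GB, $/hr) are assumed; coefficients are approximate. *)
let cost mem_gb = 0.028 *. mem_gb +. 0.079

let () =
  (* Hypothetical comparison: an app that fits in 1.7 GB versus
     one needing 10x that memory. *)
  let small = cost 1.7 and large = cost 17.0 in
  Printf.printf "small: $%.3f/hr, large: $%.3f/hr, ratio: %.1fx\n"
    small large (large /. small)
```

Running this gives a ratio of about 4.4x at these sizes; the ratio only approaches 10x as the memory allowance grows and the 0.079 offset becomes negligible, which is the "in the long run" caveat above.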