Most people have heard of Moore's law. It basically states that the processing power of computers doubles roughly every 18 months thanks to technological development.
However, there must be another law, with an even steeper curve, describing how much slower software gets. Despite the impressive gains in computer speed, some software seems to run slower and slower. Here's an example of that: http://www.excelforum.com/the-water-...013-sucks.html but Excel is just one example of many. This is not about Excel (I run 2010 and it's fast enough for all my needs); it's about the general idea that working with a piece of software becomes slower and more painful every year, despite regular upgrades and despite computers following Moore's law.
When I'm trying to draw a straight line in AutoCAD and connect it to another line, it's a really painful process. It lags A LOT, really struggling to calculate where that line will go. Why? We are talking about a straight line, for crying out loud! My laptop has four cores and a clock speed measured in gigahertz, more raw processing power than I can grasp. Any mention of graphics cards really just adds to my case here.
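Just to put some numbers behind the straight-line point, here's a rough back-of-the-envelope sketch (not AutoCAD's actual code, just the textbook 2D line-intersection formula written in Python) showing that working out where two lines meet is a handful of multiplications, something a modern CPU can do millions of times per second:

import random
import time

def intersect(p1, p2, p3, p4):
    """Intersection point of the infinite lines p1-p2 and p3-p4, or None if parallel."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None          # parallel (or coincident) lines
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Reuse a small pool of random points so memory stays tiny
pool = [(random.random(), random.random()) for _ in range(1000)]

start = time.perf_counter()
for i in range(1_000_000):
    intersect(pool[i % 1000], pool[(i + 1) % 1000],
              pool[(i + 2) % 1000], pool[(i + 3) % 1000])
print(f"1,000,000 line intersections: {time.perf_counter() - start:.2f} s")

Even in plain interpreted Python that should finish in a second or two on an ordinary laptop, so whatever is lagging, it isn't the geometry itself.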
I came across an old computer a while ago running Excel 2002 (again, Excel is just an example), and I couldn't believe how fast it started. It was virtually instant.
Why is this happening? Is it because optimizing code is really expensive? Or is all code full of bloat from crude recycling of code and objects that don't really fit?
Computer games, on the other hand, seem to do just fine. Ok, so they take advantage of really powerful graphics cards, but still (it's ok to mention graphics cards now). Calculating the trajectory of a projectile in a 3D game is certainly much more demanding than calculating a straight line in 2D. Ok, I'm done ranting now.
Any thoughts?