In my last post I shared the story of improving performance while using .NET and COM.
While an improvement from a few minutes to a few seconds sounds like a great success (if such huge time ratios even count*), it still isn't my personal best.
The biggest improvement I've ever made was from several hours to a few seconds.
Background
In my first workplace there was a C4I application that showed a vector map. No standards were used, as it was first developed 15 years earlier, before any standards had evolved. In order for the map to run efficiently, its data was split into tiles. First the tiles in view were presented, and then, in the background, the other tiles were loaded.
I want to talk about the tool that cut the one big map into tiles.
That tool was given a map, the number of tiles along and across, an output folder, and several hours to run.
The Problem
It was only used when the map was replaced, which was very rare, so no one took the time to improve it. However, that had changed, and I hated waiting those hours.
I looked at the code and found that it ran a simple XY loop over the map, slicing one tile at a time and saving it to a file. Even for a dead tile with nothing in it, it had to go over all the vector data in the map and test it against the four walls of the tile.
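To make the cost concrete, here is a rough sketch of that structure. This is my reconstruction, not the original code, and Rect, Feature and the saveTile callback are hypothetical stand-ins for the real map types:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical stand-ins for the real vector-map data.
public record Rect(double X, double Y, double Width, double Height)
{
    public bool Intersects(Rect other) =>
        X <= other.X + other.Width && other.X <= X + Width &&
        Y <= other.Y + other.Height && other.Y <= Y + Height;
}

public record Feature(Rect Bounds);

public static class NaiveTiler
{
    // The original structure: for every tile, scan *all* the vector data.
    // Cost is O(tilesX * tilesY * features), even for tiles that end up empty.
    public static void Slice(Rect map, List<Feature> features,
                             int tilesX, int tilesY,
                             Action<Rect, List<Feature>> saveTile)
    {
        double tileW = map.Width / tilesX, tileH = map.Height / tilesY;
        for (int ty = 0; ty < tilesY; ty++)
        {
            for (int tx = 0; tx < tilesX; tx++)
            {
                var tile = new Rect(map.X + tx * tileW, map.Y + ty * tileH, tileW, tileH);
                saveTile(tile, features.Where(f => f.Bounds.Intersects(tile)).ToList());
            }
        }
    }
}
```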
The Solution
I decided to replace that mechanism with a smarter one. I figured that instead of running over all the data for every tile, I would slice the data recursively: split the map in half, hand each half only the data that falls inside it, and repeat, leaving less and less work for each round, until reaching a single tile, which is then stored to disk.
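A minimal sketch of that recursive version, reusing the Rect and Feature stand-ins from above (again my reconstruction, not the original code):

```csharp
public static class RecursiveTiler
{
    // Split the extent in half along its longer tile axis, hand each half only the
    // features that intersect it, and recurse until the extent is a single tile.
    public static void Slice(Rect extent, List<Feature> features,
                             int tilesX, int tilesY,
                             Action<Rect, List<Feature>> saveTile)
    {
        if (tilesX == 1 && tilesY == 1)
        {
            saveTile(extent, features);          // leaf: exactly one tile's worth of data
            return;
        }

        if (tilesX >= tilesY)                    // split into left/right halves
        {
            int leftCols = tilesX / 2;
            double w = extent.Width * leftCols / tilesX;
            var left  = new Rect(extent.X, extent.Y, w, extent.Height);
            var right = new Rect(extent.X + w, extent.Y, extent.Width - w, extent.Height);
            Slice(left,  features.Where(f => f.Bounds.Intersects(left)).ToList(),  leftCols,          tilesY, saveTile);
            Slice(right, features.Where(f => f.Bounds.Intersects(right)).ToList(), tilesX - leftCols, tilesY, saveTile);
        }
        else                                     // split into bottom/top halves
        {
            int bottomRows = tilesY / 2;
            double h = extent.Height * bottomRows / tilesY;
            var bottom = new Rect(extent.X, extent.Y, extent.Width, h);
            var top    = new Rect(extent.X, extent.Y + h, extent.Width, extent.Height - h);
            Slice(bottom, features.Where(f => f.Bounds.Intersects(bottom)).ToList(), tilesX, bottomRows,          saveTile);
            Slice(top,    features.Where(f => f.Bounds.Intersects(top)).ToList(),    tilesX, tilesY - bottomRows, saveTile);
        }
    }
}
```

Each feature now gets tested against a handful of ever-shrinking halves instead of against every single tile, and the vector data for an empty region is filtered out once near the top instead of being rescanned for every one of that region's tiles.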
What do you know – it worked! Only a few seconds, after years of waiting for hours, of leaving the tool running during lunch or overnight, hoping no mistake had been made and that no power outage would ruin the chance for a result.
Conclusion
This story shows that sometimes replacing the wrapping algorithm (in this case, swapping a simple XY loop for a slightly more complicated recursion) is enough to gain major results.
* Endnote
Optimizing code should be about reducing a method's runtime from milliseconds to microseconds, not from hours or minutes to seconds. If something takes that long, something is clearly very wrong with it and must be taken care of right away. Acceptable initial runtimes are the basis for real optimization.
It reminds me of the following saying:
"A customer walks in with a cell phone and says 'this thing needs a bigger antenna.' We have to ask ourselves: does he want a bigger antenna, or better cell phone reception?"