High-Performance Distributed JavaScript

Performance is not always critical for web applications, but adequate performance is essential. This article shows that designing and testing for performance can save money, development time, and anguish in other parts of the SDLC.

When working with web clients I have always felt that designing for a 1/2-second average worst-case response time was needed to make users happy. I have not always met this goal, but I still find the 8- and 9-second pages from premium retail sites cumbersome and irritating. As part of the Bayes Analytic system we needed to graph multiple years of data while providing instant zoom and pan. One area of this followed the standard JSON best practices and resulted in a 13-second page load, but with some creative changes we were able to bring this down to under 1/2 second. I think this is critical because studies have shown that at 4 to 5 seconds you start hemorrhaging users.

One of the things we needed to support when graphing stock results in the JavaScript client is graphing multiple years of data at 1-minute and 10-minute granularity. This can require parsing over 95,000 individual bars of data per year (roughly 252 trading days times 390 trading minutes per day). I wanted instant local zoom and pan because I felt this was essential to help our human users see the necessary patterns. Providing instant zoom is a lot easier when all the data is local.

When we first implemented this using the natural JSON approach, passing and parsing the JSON data in the client required over 13 seconds on a 1-gigabit LAN with a fast client (4.2 GHz CPU and 8 GB of RAM). Using a standard JSON library it required over 30 seconds, and the XML parser was even slower at over 27 seconds. We converted the data to a less complex format similar to a CSV, wrote a custom JavaScript parser, and parse time dropped to under 300 ms for 5 megabytes of data. Full page display to fully interactive is under 1 second on fast computers and still under 2 seconds even on slow devices like my Android phone and my wife's iPad.
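For readers who want to try this themselves, here is a minimal sketch of the kind of CSV-style parser I am describing. The field layout (time, open, high, low, close, volume) and the function name are my illustration rather than the exact production code, but the technique is the same: one pass over the string, one split per line, and column-oriented arrays instead of one object per bar, which is where most of the generic JSON parsing cost goes.

```javascript
// Minimal sketch of a CSV-style bar parser. Illustrative field layout:
// time,open,high,low,close,volume - one bar per line.
function parseBars(text) {
  var bars = { time: [], open: [], high: [], low: [], close: [], volume: [] };
  var lines = text.split("\n");
  for (var i = 0; i < lines.length; i++) {
    var line = lines[i];
    if (line.length === 0) continue;   // skip blank lines
    var f = line.split(",");
    bars.time.push(+f[0]);             // unary + is a cheap number parse
    bars.open.push(+f[1]);
    bars.high.push(+f[2]);
    bars.low.push(+f[3]);
    bars.close.push(+f[4]);
    bars.volume.push(+f[5]);
  }
  return bars;
}
```

Column-oriented arrays also help keep the zoom and pan instant, because redrawing a window of the chart becomes a loop over a contiguous run of numbers rather than a walk over tens of thousands of objects.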

This is a good example of why timing every major operation is critical, even in JavaScript clients. I had the timing built in, so I was able to isolate the performance problem to the data load and, more specifically, to the local parsing layer. After that it was back to basics to see how fast I could possibly make the parsing. As it turned out we were able to deliver a 43-times (4,300%) increase in speed without changing the language, the run-time, or any of the hardware. The entire CSV parser is under 100 lines of code and has been re-used multiple times.
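The built-in timing does not need to be elaborate. Something along these lines (a sketch, using only Date.now and console.log) is enough to assign blame to the right layer:

```javascript
// Tiny timing helper: run a named operation and log how long it took.
// Date.now() is used instead of performance.now() because the latter
// was not available in older browsers such as IE9.
function timed(label, fn) {
  var start = Date.now();
  var result = fn();
  console.log(label + ": " + (Date.now() - start) + " ms");
  return result;
}

// Time each major stage separately so a slow page points at the right
// layer (network fetch vs. parse vs. render), e.g.:
// var bars = timed("parse", function () { return parseBars(csvText); });
```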

This is an example of knowing when to violate the industry-standard best practice to meet business and performance goals at the lowest feasible cost. With very little extra work it allowed us to shift from a very frustrating slow response to a very fast immediate response that users love.

I could have taken another approach of feeding down summary data so the first graph could be viewed, then kept feeding more data down in the background, but this would have increased the complexity and code volume several times over. I could also have just given up on the instant response, but this would not have met my users' needs. The other solutions would have increased cost and complexity, quite probably by over 10X, and still would not have delivered as good a result.
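For a sense of where that extra complexity comes from, here is a bare-bones sketch of the paging approach (the chunked URL scheme is hypothetical, my invention for illustration only). Even this toy version needs per-chunk bookkeeping and repeated re-renders, and it still omits error handling, retries, cancellation, and merging partial ranges into the chart state:

```javascript
// Bare-bones sketch of the rejected progressive-loading design
// (hypothetical chunked URL scheme, for illustration only).
function loadProgressively(url, chunkCount, render) {
  var chunks = [];                           // detail received so far
  function pull(i) {
    if (i >= chunkCount) return;             // all detail loaded
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url + "?chunk=" + i, true);
    xhr.onload = function () {
      chunks[i] = xhr.responseText;
      render(chunks);                        // redraw with more detail
      pull(i + 1);                           // keep filling in the background
    };
    xhr.send();
  }
  pull(0);                                   // coarse first chunk, then the rest
}
```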

A USA engineering tragedy

The tragedy in this instance is that I ran the same problem past several dozen talented web engineers, including a few principals, and all but one of them failed to find the faster solution. Most of them wanted to implement a much more complex, larger, and more cumbersome paging system to solve the problem, which would quite likely have ended up with over 15 times more code. This seems to be an endemic problem in the current generation of web programmers. The good side is that for those of us who enjoy this kind of challenge it means there are always projects in trouble due to performance problems, and we get well paid to solve them. A better solution would be to figure out how to get our colleges to teach that a faster CPU will not always compensate for sloppy engineering. Of course, you also need to be able to recognize and retain those engineers capable of delivering the non-obvious solutions.

For engineering managers and recruiters:

When you hire web developers at rock-bottom rates you will very likely wind up with the more complex and cumbersome solution. This has a profound impact on the total ROI for your business. In this instance the more complex solution could easily have required a team of 5 of the less expensive engineers and an extra 3 months of work to deliver a robust version. I know this is a valid estimate because I have been involved with many of these projects and have been forced to accept such results from some of the engineering teams I was responsible for.

In contrast, I found, fixed, and implemented the solution in 4 days, including the data conversion. My consulting rate in 1995 was already $180 per hour, while normal senior web developers are selling at between $70 and $100 per hour in 2013. I cost roughly 3 times what the lower end of those consultants costs, but as you will see below, this higher cost also delivers a much higher ROI.

Optimize for ROI, not cost, when hiring engineers

If you retained 5 engineers for the extra 3 months it would cost $168,000 (5 engineers × roughly 480 hours × $70 per hour), plus you would need to retain one or two of them to maintain the more complex solution at a cost of up to $288,400 per year. My cost at the $180 per hour rate was $5,760 (4 days × 8 hours × $180); even if it had taken me 4 times longer, or 16 days, my cost would still have been about $23,000. But my solution ended up simple, with very little code and low maintenance overhead. My solution was 29 times less expensive up front, and if you add in maintenance costs over a 3-year period, then even at 4 times my actual cost my solution would be 45 times less expensive.

This kind of decision is a critical ROI enabler for every IT and product development organization building code. Senior managers must understand that by setting policies that force first- and second-tier managers to buy from the lowest tier, they are quite likely multiplying their costs.

Not every web page needs this kind of performance, but in many applications there are at least some pages and some systems where performance will be critical to keeping your users happy. The art of engineering management is retaining enough of the very senior performance gurus so you have them available for your critical components. Ensure they have enough time to help the less senior people find the critical flaw areas before those people go off and invest man-years solving what could have been delivered faster and less expensively. Absolutely back their decisions, because the less senior people will resist.

The longer you wait, the more expensive it gets

The further down the path these projects go before you bring in the right caliber of people, the more expensive it becomes to solve the problem, because they are faced with the choice of ripping out a lot of work to reach the best solution or kludging what is in place to try to fix it. Budget and time constraints almost always force selection of the kludge, which quite often ends up costing 10X what it should have.

You will not hear this from HR (Human Resources)

I know this is not what your HR representatives are telling you, and even your technical engineering managers trained by HR may not tell you. I can promise you that I have been building and delivering these systems for over 25 years, from very small companies through the Fortune 50. In all these projects I have never seen HR held accountable for the ROI delivered by the engineering department. HR is a cost center, and it is a truly rare HR representative who thinks in ROI rather than cost. You absolutely must think in ROI if you want to best serve your stockholders.

Note: The numbers varied between browsers, but the relative improvements stayed roughly constant. We tested Chrome, IE9, Safari, Opera, and Firefox.
