Saturday, July 11, 2009

Performance remains an uphill battle

Firefox 3.5 has arrived, and according to Mozilla Foundation developers, its major advantage is speed. The new version of the open source Web browser is the first generally available release to include the TraceMonkey accelerated JavaScript engine, which previously had been found only in the 3.1 betas.

This move is the latest volley in the rejuvenated browser wars, as browser vendors shift their focus toward improving the performance of Web-based applications. Google set the pace when it shipped Chrome with a high-performance JavaScript engine last year. Since then, Opera and Apple have both announced new JavaScript engines for their respective browsers, and even Microsoft has grudgingly worked to optimize IE8.


But browser performance isn't everything. Users experienced delays browsing major news sites in the wake of the death of pop star Michael Jackson last week, but the problem there wasn't slow browsers or even overloaded servers. According to Web monitoring company Keynote Systems, in many cases site slowdowns were caused by ad networks and other third-party content providers, whose own networks couldn't handle the increased traffic.

This incident underscores an issue of growing concern to Web developers. Modern Web apps typically draw from multiple content sources, data stores, and services, and growing interest in cloud computing will only accelerate this trend. But given all these interdependencies, can Web developers really guarantee fast, responsive user experiences? Or, as the complexity of our applications continues to grow, is application performance gradually slipping through our fingers? Are we all just throwing ourselves on the mercy of the Internet?

Web developers' cloud conundrum
Making Web pages is easy, but building efficient Web applications can be deceptively tricky. Desktop software is tangible; as a developer, you can get your hands around it. To optimize its performance, you do things like eliminating memory leaks and improving the efficiency of disk access. None of this applies to Web apps, however, where developers rely on browsers to handle local resources efficiently.

Instead, Web developers are confronted with the vagaries of the network. If a user accesses a Web page that draws images from a third-party provider, the overall user experience depends on the user's browser, the user's data connection, the outgoing pipe from the Web server, the Web application software, the pipe between the Web server and the image provider, and the image provider's server software. A Web application developer is in a position to optimize only one of these.
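To see how little of that chain is in your hands, it helps to measure just one link of it. The sketch below, in plain browser JavaScript, times how long a single third-party image takes to arrive; the provider URL is a made-up stand-in, and the number it reports lumps together DNS lookup, network transfer, and the provider's server time, almost none of which the page's own developer can tune.

    // Rough sketch: time a single third-party image from the browser.
    // The provider URL below is a hypothetical stand-in.
    var start = new Date().getTime();
    var img = new Image();
    img.onload = function () {
        var elapsed = new Date().getTime() - start;
        // elapsed lumps together DNS lookup, network transfer, and the
        // provider's server time; only the provider can improve most of it.
        if (window.console) {
            console.log('Third-party image took ' + elapsed + ' ms');
        }
    };
    img.src = 'http://images.example-partner.com/photo.jpg';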

Consider what else developers take for granted in this distributed, cloud-based model. How can you be sure that the third-party image provider takes security seriously? How can you be sure that its systems are sufficiently redundant, and that it makes backups regularly, so you won't be blindsided by any unexpected outage? You can ask, of course.

A more immediate problem lies in the ways in which external services integrate with Web pages. Most of them rely on external JavaScript, iframes, or both. Either of these techniques can block a page's onLoad event, a major factor in the user's perception of site slowness. This bottleneck occurs before the JavaScript code ever executes, so the speed of the browser's JavaScript engine makes little difference. Combine this design with an overburdened network, and it's not just the third-party content but your entire application that suffers.
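To see why, consider what a typical third-party snippet actually does. A common pattern, sketched here with a made-up ad-network URL, is a small inline script that document.write()s a blocking external script into the page:

    // Typical ad/widget include: document.write() injects a blocking external
    // script (the closing tag is split so the HTML parser doesn't end the
    // enclosing script block early). The browser cannot continue parsing the
    // page, and onLoad cannot fire, until that script has downloaded and run,
    // so a slow ad network stalls the entire page.
    document.write('<scr' + 'ipt type="text/javascript" ' +
                   'src="http://ads.example-network.com/show_ad.js"></scr' + 'ipt>');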

Increased complexity leads to increased risk for Web apps
The Web community is working on ways to mitigate these problems. For example, modern browsers download other page elements in parallel while they wait for scripts, and developers have come up with various clever techniques to eliminate script bottlenecks (one is sketched below). But these one-sided optimizations can only get you so far, and they're difficult to do right.
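One of the better-known techniques is to load third-party scripts dynamically rather than with a static script tag, so the download no longer blocks the HTML parser. A minimal sketch follows, again with a hypothetical URL; note that, depending on the browser and on when it runs, the page's onLoad event may still wait for the script, which is part of why such workarounds only go so far.

    // Insert the script element from JavaScript so its download doesn't block
    // the parser while the rest of the page renders.
    function loadScriptAsync(url) {
        var script = document.createElement('script');
        script.type = 'text/javascript';
        script.src = url;
        document.getElementsByTagName('head')[0].appendChild(script);
    }

    loadScriptAsync('http://ads.example-network.com/show_ad.js');  // hypothetical ad script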

"Think about it," says Steve Souter, a performance evangelist at Google and author of the books "High Performance Web Sites" and "Even Faster Web Sites." He adds, "We're taking a chunk of HTML that might also include CSS, JavaScript, and Flash, and stuffing it into another page ... It's not surprising that they can, and do, significantly degrade the performance of Web pages, and in some cases can cause a Web site to fail entirely."

Part of the problem lies in the fact that such content-integration efforts often lack cohesive management and oversight. "Integrating third-party content into a Web page would be a complex project to pull off for two teams working in the same company," Souders says. "In the case of ads, the two teams work at two different companies. In fact, the developers creating the ad probably never interact with the team building the main Web site."

That's not to say the blame for site slowdowns will be shared equally, however. Rest assured that when site performance degrades, the user will place the blame squarely on the site's own brand; the external content providers will remain virtually anonymous.

Baby steps toward the Web as a first-class app platform
For now, Web application developers and architects should be sure to educate themselves about the potential bottlenecks and other pitfalls inherent in distributed, cloud-like Web applications. Souders' books are a good place to start, and Google has recently launched a Web site dedicated to developing best practices for JavaScript performance.

In the long run, however, Web services providers and consumers will need to work together to develop standards of practice for the cloud-based Internet. The Interactive Advertising Bureau has formed a working group and offers best practices for ad providers to improve load times. This is a good start, but clearly there's still much more work to be done.

One of the more troubling aspects of the current situation is that it tends to favor larger customers. Wal-Mart or a major sports league might be in a position to demand comprehensive SLAs and developer accountability from their external content providers, but a struggling newspaper publisher might not -- to say nothing of even smaller clients.

That's why it's of critical importance that the Web community work to increase not just browser performance, but the performance of cross-organizational Web development teams. As sites and services become increasingly interconnected, we need to come up with new ways to communicate, collaborate, and cooperate to make distributed, cross-site development efforts run more smoothly. Only in this way will the cloud-based Web flourish into a reliable, first-class application development platform.

From: infoworld.com
