JavaScript execution war speeds on

Thrax 🐌 Austin, TX Icrontian
edited December 2008 in Science & Tech

Comments

  • Khaos New Hampshire
    edited December 2008
    All the browsers execute JavaScript so quickly these days that it's not really relevant. It's a difference of microseconds. The browser devs are just in a pissing match because Google is marketing their browser as being faster than the other guys.

    Of course, if one loads a few million web pages, all that extra JS execution time does add up to a few seconds that they'll never get back.
  • Thrax 🐌 Austin, TX Icrontian
    edited December 2008
    The point of execution time has gone way, way beyond your standard webpage. We're talking full-fledged cloud applications... JavaScript is still so slow for these actual applications that Adobe and Google have developed plugins and runtimes to harness local resources to execute real code inside a browser. Adobe has Flex and AIR, and Google's Native Client just entered beta testing. Flex can run Quake 3 in a browser and Native Client has already been used to run Quake 1 in a browser; both of these are resource-intensive tasks that JS could only dream of.

    This all points to the fact that JavaScript still needs a lot of work in spite of all the speed it has gained as of late.
  • Khaos New Hampshire
    edited December 2008
    JavaScript is the wrong base platform for rich content, that's all. It is the programmatic glue between the presentation layer and the processing layer, much in the same way that Visual Basic provides some GUI-glue between an interface and a C++ framework.

    There need to be secure run-time platforms such as NC that are capable of running compiled, native executable code inside a browser process. JavaScript augments these platforms, and it doesn't make sense to try to make it do more than just that, in the same way that it never made sense to try to develop Visual Basic to the point where it could replace all need for C++.

    By the same token, a compute-intensive application's execution speed was rarely determined by the raw execution speed of Visual Basic. After all, it takes very few instructions to handle a button click and call a library method.

    What it all boils down to is the role of various platforms in the general cloud computing scheme. Clearly, the role of interpreted JavaScript is not to handle compute-intensive functions for cloud applications. It never will be. In the role of GUI-glue, on the other hand, JS performs quite nicely -- and it is plenty fast to serve that purpose as it stands.

    Look, I'm not saying that improvement in execution speed is a bad thing. All I'm saying is that its importance is being overstated by the marketing guys at Mozilla and Google because they need something to bark about.

    The truly important stuff is not JS execution speed, but rather the interoperability between JS and compiled, native, browser-contained run-times. In other words, how seamlessly these run-times mesh with JS.
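To make the "glue" idea concrete, here is a minimal sketch of what that interop looks like from the JS side. The `nativeRuntime` object is a hypothetical stand-in for a compiled, browser-hosted run-time (say, a plugin's scripting interface exposed to the page); the point is how little work the JS layer actually does.

```javascript
// Hypothetical stand-in for a compiled, browser-hosted run-time
// (e.g. a plugin's scripting interface exposed to the page).
// In a real page this would be an embedded native object, not JS.
var nativeRuntime = {
  renderFrame: function (sceneId) {
    // Pretend this is heavy native code; here it just tags the scene.
    return "frame:" + sceneId;
  }
};

// The JS "glue": a handful of instructions that wire a UI event to
// the heavy lifting done by the run-time. The execution speed of
// this layer barely matters -- it runs a few times per user click.
function onRenderClick(sceneId) {
  var result = nativeRuntime.renderFrame(sceneId);
  return "Rendered " + result;
}
```

However fast or slow the interpreter is, the glue above is a rounding error next to the work the run-time does; the seam between the two layers is what matters.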

    On that score, C# 4.0 (soon to be released) is actually light-years ahead of Google's NC platform with its new support for dynamic type resolution... But Microsoft has made itself unpopular with Web 2.0 developers, so it isn't getting a lot of buzz. Now I'm getting dangerously close to <strike>meaningless</strike> convoluted jargon, so I'm gonna hush.
  • MiracleManS Chambersburg, PA Icrontian
    edited December 2008
    The problem is the communication of information that cloud programs require. While you're right that for EXTREMELY CPU-intensive processes (like the Quake examples) you're better off with a run-time platform, the true problem is the fact that the web is <em>stateless</em>. JavaScript (and other things, including .NET) allows things to SEEM to have state and behave like desktop clients. Google Docs with all its Ajax beauty is a great example. That's why JS is needed -- not for rich media content, but for communication and dynamic presentation.
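A minimal sketch of that Ajax pattern: the page keeps local state and merges server-sent changes into it, so a stateless protocol feels like a stateful desktop app. The `/doc/sync` endpoint and the delta format here are hypothetical, purely for illustration.

```javascript
// Pure part: merge a server-sent delta into the client-side state.
// This is what makes the page LOOK stateful between requests.
function applyServerDelta(state, delta) {
  for (var key in delta) {
    if (delta.hasOwnProperty(key)) {
      state[key] = delta[key];
    }
  }
  return state;
}

// Browser part: poll the server with XMLHttpRequest (the era-typical
// Ajax API). The /doc/sync endpoint is a made-up example.
function sync(state) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/doc/sync", true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      applyServerDelta(state, JSON.parse(xhr.responseText));
    }
  };
  xhr.send(null);
}
```

Note that the merge itself is a few property assignments; the perceived responsiveness of something like Google Docs rides on the network round-trip, not on how fast the interpreter runs this loop.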

    I think you guys are comparing apples and oranges. JavaScript is being used (à la Ajax, etc.) for seamless presentation, not for the kind of rich content delivered through NC, ActiveX, Flex, AIR, and Java applets on the web. Please don't start thinking JavaScript and Java are even in the same realm; they aren't the same technology, and they aren't even from the same people.
  • Khaos New Hampshire
    edited December 2008
    MM - I don't disagree. JS is most definitely needed as the glue between the presentation layer and the run-time layer. What I'm saying is that the architecture of n-tiered web applications is specifically intended to minimize the impact of the performance of interpreted scripting languages, while providing for maximum flexibility in interoperability between the interpreted presentation layer and the compiled run-times, both client-side and server-side. It's not a strict comparison that I'm making. In this case, JS and Java/.NET represent different tiers in a single architecture. What I am saying is that the nature of web application architecture makes the relative execution speed of JS irrelevant.

    With current technologies, JS can already:
    - Post to server-side run-times.
    - Call into client-side run-times.
    - Load content dynamically.
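Each of those three capabilities can be sketched with the browser APIs of the day. Everything named here is illustrative: the `/compute` endpoint, the `nativeClientModule` element id, and the `encodeForm` helper are all made up for the example.

```javascript
// Pure helper (ours, not a browser API): URL-encode a params
// object into an application/x-www-form-urlencoded POST body.
function encodeForm(params) {
  var pairs = [];
  for (var key in params) {
    if (params.hasOwnProperty(key)) {
      pairs.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
    }
  }
  return pairs.join("&");
}

// 1. Post to a server-side run-time (hypothetical /compute endpoint).
function postToServer(params) {
  var xhr = new XMLHttpRequest();
  xhr.open("POST", "/compute", true);
  xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
  xhr.send(encodeForm(params));
}

// 2. Call into a client-side run-time through its scripting
// interface -- an <embed>/<object> plugin exposed on the DOM.
// The element id and method are hypothetical.
function callPlugin(message) {
  var plugin = document.getElementById("nativeClientModule");
  return plugin.postMessage(message);
}

// 3. Load content dynamically by injecting a script tag.
function loadScript(url) {
  var el = document.createElement("script");
  el.src = url;
  document.getElementsByTagName("head")[0].appendChild(el);
}
```

In each case the JS is a thin dispatcher: a few lines to hand work off to a server, a plugin, or the network, which is exactly the glue role described above.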

    If anything, the greatest limiting factor right now in rich content performance on the web is the availability and performance of broadband connections. For these applications to really perform seamlessly, we need more people to have access to cable/FiOS-like connections that are low-latency and high-bandwidth. Client-side run-time libraries and dynamically loaded content depend heavily on that availability.

    Until such time as fast broadband is as ubiquitous as the PC, cloud computing will continue to be at a significant disadvantage. No matter how fast JS executes.