
JavaScript Frameworks Suck

There's been an interesting discussion off and on around the office these past couple of weeks about JavaScript frameworks, specifically which framework is the best so we can standardize on one. Of course I have to be difficult... my answer is none of them.

As a general rule of thumb, "frameworks are evil." There are exceptions, but frameworks tend to cause a lot of unnecessary bloat, make tasks difficult to accomplish if they fall outside the intended scope of the framework, create obstacles to efficient debugging, and adversely affect page load times, making the application appear slow and sluggish. The real question ought not be which framework is best, but rather what exactly you are trying to accomplish with client-side scripting in the first place.

Creating a rich user experience with standard JavaScript is not difficult. Many of the niceties the frameworks provide aren't magic... for example, $() is just function $(x) { return document.getElementById(x); }. And AJAX is easy if you forgo XML in favor of JSON as your transfer format. If you don't understand your goals, you might as well just shoot yourself in the foot.
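To make that concrete, here's a sketch of the JSON half of that claim. The getJSON helper name, the URL, and the sample payload are all made up for illustration; XMLHttpRequest is assumed to be available in the browser:

```javascript
// A minimal JSON-over-XHR helper -- no framework required.
// (Hypothetical helper; the URL and callback are placeholders.)
function getJSON(url, callback) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            // The whole trick: a JSON response parses straight into
            // native objects and arrays. No XML tree-walking.
            callback(JSON.parse(xhr.responseText));
        }
    };
    xhr.send(null);
}

// What the callback would receive for a sample response body:
var response = JSON.parse('{"user": "alice", "roles": ["admin", "editor"]}');
console.log(response.user);      // "alice"
console.log(response.roles[0]);  // "admin"
```

Compare that to walking a responseXML DOM tree node by node, and it's easy to see why JSON took over as the transfer format of choice.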

Once you have identified exactly what your needs are, and if those needs suggest you use a framework, then those needs will also dictate which framework would be suitable. If you need widgets, for example, then YUI would stand out as the best choice. If you need modularity and flexibility instead, then MooTools might be the way to go.

To further illustrate my point to a coworker that frameworks don't always make things easier, I implemented a basic Accordion widget in MooTools and in straight-up plain old JavaScript. The development time in JavaScript proper was half that of developing with MooTools because I didn't have to learn any special APIs, scrounge the documentation for a list of dependency files, etc. My implementation weighs in at 50 lines vs. MooTools' 3,100+ lines and 21 dependency files.
// Return every element whose class attribute contains className.
function $$(className) {
    var classElements = [];
    var els = document.getElementsByTagName("*");
    var pattern = new RegExp('(^|\\s)' + className + '(\\s|$)');

    for (var i = 0; i < els.length; i++) {
        if (pattern.test(els[i].className)) {
            classElements.push(els[i]);
        }
    }

    return classElements;
}

// Build an accordion from headers and panels matched by class name.
// Clicking a header shows its panel and hides all the others.
function Accordion(headerClass, panelClass, showIndex) {
    this.headers = $$(headerClass);
    this.panels  = $$(panelClass);

    for (var i = 0; i < this.headers.length; i++) {
        // Stash each header's index and the shared node lists on the
        // element itself so the click handler can reach them via `this`.
        this.headers[i].args = {
            index: i,
            headers: this.headers,
            panels : this.panels
        };
        this.headers[i].onclick = function () {
            var a = this.args;
            for (var i = 0; i < a.panels.length; i++) {
                a.panels[i].style.display = (i == a.index) ?
                    "" : "none";
            }
        };
    }

    // Open the requested panel (or the first one) on startup.
    (showIndex === undefined ? this.headers[0] :
        this.headers[showIndex]).onclick();
}

window.onload = function() {
    new Accordion("a_header", "a_body");
};
Sure, it's not as "feature rich" as MooTools' Accordion, but any additional features can easily be added when the time comes, and they certainly wouldn't require 3,050 more lines of code.
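For instance, a "click the open header again to collapse it" feature is only a few lines. Here's a sketch of just that logic, with plain objects standing in for the DOM panels so it can be shown on its own (togglePanels is a hypothetical name; in the code above, this loop would replace the body of the onclick handler):

```javascript
// Show the clicked panel and hide the rest -- unless the clicked panel
// is already open, in which case collapse it too.
function togglePanels(panels, clickedIndex) {
    var isOpen = panels[clickedIndex].style.display !== "none";
    for (var i = 0; i < panels.length; i++) {
        panels[i].style.display =
            (i === clickedIndex && !isOpen) ? "" : "none";
    }
}

// Plain objects stand in for DOM elements here, purely for illustration.
var panels = [
    { style: { display: "" } },      // panel 0 starts open
    { style: { display: "none" } },
    { style: { display: "none" } }
];

togglePanels(panels, 0);              // clicking the open header...
console.log(panels[0].style.display); // "none" -- it collapses

togglePanels(panels, 1);              // clicking a closed header...
console.log(panels[1].style.display); // "" -- it opens, the rest stay closed
```

Five lines of logic, not another dependency file.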

People want fast front-ends. They're impatient and don't want to wait. Bloated code slows down the front-end and gives the impression of a slow back-end. Half the development time and 98.4% less code? Now that sounds good to me!

Comments

  1. The availability of already-built components in frameworks can be as much of a boon as a hindrance, I think.

    One advantage they do provide is logic to handle cross-browser compatibility issues, rather than requiring you to do research and pull your hair out trying to figure out why something works in one browser and not another.

    Assuming caching is properly used, bloat past the first request should be a non-issue. I will admit that JS frameworks could stand to be more flexible, though that sometimes comes with additional verbosity.

    Unfortunately, due to the environment in which JS is generally used (i.e. the client side of web applications), we don't have the same include-on-demand advantage that languages like PHP provide us, at least not to the extent that it doesn't pose a problem with caching.

    I will admit that sometimes rolling your own solution makes more sense, though I don't think that's always the case.

