Front End Development: IN SPAAAAAACE!

Thanks again to the Toledo Web Professionals for allowing me to give this presentation, and to everyone at Hanson that contributed material and/or helped out. As promised, below are the notes and links from my talk.

I got most of the images for this presentation from James Vaughan (x-ray_delta_one) on Flickr. He’s done a fantastic job of collecting mid-century advertising and art and making it available under Creative Commons. I’d like to also thank ManosArt on DeviantArt for the robot illustration.

Slide 3: Welcome to the Next Frontier

There’s never been a more complex time to be in web development. The term “front end developer” didn’t exist a few years ago, and many of us probably started with a title like “engineer”, “designer”, or maybe even “webmaster”.

Front-end used to just be about knowing HTML and CSS, and maybe a little bit of JavaScript. These days, HTML and CSS have gotten more powerful and correspondingly more complex. Web development is also starting to look more like programming.

Slide 4: Beware of Killer Robots

The web is getting more powerful, natively. We’re doing things in the browser we wouldn’t have dreamed of 10 years ago. Things are moving very fast. It feels like you could spend 40 hours a week just keeping up. There are killer robots around every turn.

Slide 5: DON’T PANIC!

I’m here to tell you not to panic! Your skills are as valuable as ever. You don’t have to know everything about every technology you plan to use. Which is good, because there’s no way anyone could. There are giant shoulders you can stand on. That’s the new world of front-end development. Find some solid tools to build on, use them and customize them to meet your needs.

Slide 6: Embrace the Command Line

You don’t have to be a programmer to be a successful front-end developer, but you should be comfortable using the command line and editing config files.

If your command line skills are rusty, here are a couple of links to help you get up to speed:

Slide 8: Harness the Power of Robots

Some day, the robots will rise up and kill all humans. Until then, don’t waste your time. Use robot power to help you work faster by automating common tasks.

When I say robots, I mean scripts, small utility programs that do one thing very well. You can combine these in surprising and powerful ways. Some tasks you can use robot power for include:

  • Downloading code libraries and assets
  • Generating boilerplate structure and markup
  • Compiling things
  • Testing
  • Optimizing

Slide 10: Tools


Git is a source control system. It’s distributed, which makes sharing and merging code from multiple contributors pretty easy. Most of the libraries you’ll be using are stored in Git repos. GitHub is the de facto standard for publishing and accessing JavaScript code. If you don’t have a GitHub account, sign up for one.


Node.js is a standalone JavaScript runtime based on Google’s V8 engine. It brings JavaScript out of the browser and makes it a platform for all kinds of development.


NPM is a package manager for node. A package manager is a client for managing code libraries. It provides a way to search for libraries and handles downloading, installing, and managing dependencies and versions of that code.

Slide 11: A note for Windows users

The community is very Mac/Unix-centric. Lots of the libraries and code examples you’ll see assume you have access to Unix and some of its tools. It’s little things, like cp instead of copy, ls instead of dir. It’s 100% possible to use all these tools on Windows. Expect occasional frustration, but it’s generally not a problem.

Git installs some basic Unix command line utilities. If you need to follow complex examples it might be easier to install GNUWin32 or Cygwin than to dig around for Windows equivalents.

Slide 12: Using the NPM Registry

From this point in the presentation you’ll see a lot of example commands. The pointy bracket at the start of a line means that what follows is something you should type.

npm init is a command you run to prepare a project for use with NPM libraries for the first time. It will ask you some questions and set up a package.json file. package.json stores information about your project and its dependencies in case you publish it later.
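Once you’ve answered the prompts, you end up with a file along these lines (the project name and values here are made-up placeholders, not from the talk):

```
{
  "name": "space-project",
  "version": "0.0.1",
  "description": "Front-end demo project",
  "dependencies": {},
  "devDependencies": {}
}
```

The dependencies sections start out empty; they get filled in as you install packages with the --save and --save-dev flags.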

You can search NPM from the command line, or use the search on the NPM website. If you get involved with the community, you’ll see new projects announced on Twitter and on various mailing lists and blogs. Most of these are in NPM, or in Bower, which we’ll see in a bit.

It’s important to note that NPM installs packages to the current folder. It assumes you want to use the package on a per-project basis. You can use the -g (global) flag to install tools so you can access them from anywhere. This is the case with lots of the libraries we’ll install today.

Slide 15: Bower


From its website: “Bower is a package manager for the web. It offers a generic, unopinionated solution to the problem of front-end package management.”

Bower can be used to add code libraries to your site. 99% of what you want is already in Bower. You can install it from NPM: npm install -g bower

Slide 16: The Bower Way

We saw how to install packages the hard way. Here’s the Bower way. bower init sets up a little file in the root of your project called bower.json, which keeps a list of what you have installed.

bower search works the same way as npm search. You can also search the registry on the Bower website. Sometimes you won’t even need to search: popular libraries are registered with Bower under the name you already know, so if you guess the project name you’ll be right a lot of the time.

You can install more than one package at a time, space-separated. The --save flag tells Bower to write your packages to the bower.json file. That’s important later.
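For example, after something like bower install jquery --save, your bower.json will contain roughly this (the project name and version number are illustrative):

```
{
  "name": "space-project",
  "dependencies": {
    "jquery": "~2.1.1"
  }
}
```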

The old process had 8 steps…times 12. Here we’ve installed a bunch of libraries with three commands. Let’s call those steps 1 through 3.

Step 4 is to insert your script references. Bower puts everything it downloads into a folder called bower_components by default. This may or may not be useful to you. You can either reference files straight from there, or you can move them wherever you want. Then you’d insert script tags, stylesheet references, or whatever’s appropriate for this particular library.

Later on when we look at Grunt, we’ll look at a way to simplify this process so you only have to insert one reference for all your Bower dependencies.

That was step 4. Step 5 is take a nap in your space suit. Because you’re done.

Slide 17: In Space, No one can hear you schlep

Schlep is a Yiddish word meaning “to carry something heavy or awkward”. A lot of everyday tasks during development are just schlepping. Think about it. You schlep some LESS or SASS to the tool that makes it CSS. You schlep JavaScript to the tool that lints it, or minifies it. When those things are done, you schlep the output someplace else.

You don’t need to be doing those things. What do you need, ulcers? That’s robot work. Some things are schlepping that you might not even consider. How about adding prefixes to styles in a stylesheet? What about reloading your web browser? Your cycle goes like this: Edit file. Save. Alt-tab. F5. See if it worked. That’s a lot of needless schlepping.

Slide 20: Grunt and Gulp

We’ll talk about two task runners today, Grunt and Gulp. Both live in NPM: npm install -g grunt-cli, npm install -g gulp. See the getting-started guide on either site for more details. Both have similar capabilities.

Both need instruction files and plugins to help them help you. Grunt has gruntfile.js. Gulp has gulpfile.js. Gruntfiles provide configuration in the form of deep objects. The action happens inside the plugins. Gulpfiles contain little procedural programs that chain together the output of many single-purpose plugins.
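To make the contrast concrete, here’s a minimal sketch of each. The LESS plugins and file paths are just examples I’ve picked for illustration, not code from the talk:

```javascript
// Gruntfile.js -- declarative configuration; the plugin does the work.
module.exports = function (grunt) {
  grunt.initConfig({
    less: {
      dist: { files: { "css/site.css": "less/site.less" } }
    }
  });
  grunt.loadNpmTasks("grunt-contrib-less");
  grunt.registerTask("default", ["less"]);
};

// gulpfile.js -- a small program that pipes streams between plugins.
var gulp = require("gulp");
var less = require("gulp-less");

gulp.task("default", function () {
  return gulp.src("less/site.less")
    .pipe(less())
    .pipe(gulp.dest("css"));
});
```

Same job either way: compile LESS to CSS when you run grunt or gulp with no arguments.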

Which one should you choose? Doesn’t matter. Different approaches but both are good. Grunt is older and has a larger ecosystem, but Gulp is catching up fast.

Slide 21: Sharing the Wealth

Everything’s better when you share. There’s a debate about whether you should check NPM and Bower libraries into source control. It’s a lot of files, and it’s not strictly necessary.

Remember the --save and --save-dev flags? As long as you’re using them, all your dependencies get written to your package.json and bower.json files. The next person that comes along just has to run two commands: npm install to install Grunt plugins, and bower install to install code libraries.

Slide 22: Robots in Action

Here we demonstrated a few Grunt tasks in action. The IDE I used in the presentation with Grunt support built in was WebStorm 8.

There’s also a Grunt plugin, grunt-bower-concat, that can take all your Bower dependencies and concatenate them into a single file, so you don’t need to add a reference for every new library you add to your project!
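A rough sketch of what that looks like inside a Gruntfile — check the plugin’s README for the exact options, and note the destination path here is made up:

```javascript
grunt.initConfig({
  bower_concat: {
    all: {
      dest: "build/_bower.js"  // one script tag now covers every Bower library
    }
  }
});
```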

Slide 23: Build and operate your own spacecraft for fun and profit

Now that you’re saving a ton of time by farming your boring work out to robots, you have the time to build something new and exciting and explore the cosmos. In this section we’ll talk about some tools that will help you take your projects to the next level.

Earlier I mentioned starting from a foundation. We’ll look at a tool to help scaffold different kinds of projects using battle-tested patterns and practices. We’ll also look at tools to help you manage dependencies, build powerful clients with HTML templating, and generate documentation to keep the whole thing maintainable.

Slide 24: Yeoman


Yeoman is a project scaffolding system. It can give you a fresh, clean project structure for all kinds of apps. The theory behind Yeoman is that you should not need to re-architect every project. That’s robot work.

You can find Yeoman in NPM: npm install -g yo

Slide 25: Yeoman generators

To use Yeoman, you install generators. A generator is a template for creating an application. Popular generators include WordPress, Angular, jQuery plugin, Bootstrap, and Zurb Foundation (zf5).

There are more than 600 generators available. If you don’t find one you like, it’s easy to download and modify a generator, or make your own! There is even a generator for making generators.

One of the powerful features of Yeoman is the interactive install process. Yeoman can ask questions and use your answers to select different libraries, or to customize the files and folders it outputs. Yeoman has a powerful templating system as well.

For really simple setup, try the h5bp or assemble generators.

For something more powerful (but more complicated) try webapp.

Most Yeoman generators have Bower and Grunt integration already built in.

Slide 26: Assembling the Fleet – Dependency Management

I used to be an ActionScript developer. ActionScript is JavaScript’s more respectable cousin. It had a lot of things that made it feel like a serious programming language: classes, inheritance, structure, and above all, great, simple dependency management.

So what does JavaScript have? Script tags. Script tags are not a dependency management strategy. You, the developer, are 100% responsible for ensuring that all your dependencies are loaded, at the right time, in the right order.

This encourages some people to put all their JavaScript into one file. Technically it works, but it has a way of breaking down when you have multiple developers working together.

Discrete classes mean looser coupling, better maintainability, and encapsulation. But the more classes you have, the more likely you are to run into out-of-order dependencies.

Slide 27: What’s better than script tags?

CommonJS modules

The CommonJS module format is popular in Node; a popular implementation for the web is Browserify. CommonJS’s weakness is that it’s synchronous.

Asynchronous Module Definition (AMD)

AMD was developed to address shortcomings in CommonJS modules. It’s asynchronous and well-suited for web use. AMD also provides some nice structure. Definitions are analogous to classes. We’ve used AMD successfully on many projects.

Slide 28: Require.js


Require.js is a JavaScript file and module loader. It helps you manage dependencies so you don’t have to worry about the order in which scripts are loaded.

Require.js has a powerful build process so you can deploy pre-built versions of your scripts to production (Browserify also has this). You can install it with Bower: bower install requirejs

Slide 29: require() and define()

Require.js has two basic methods, require() and define(). require() loads scripts asynchronously given a slash-separated path: require("APP/controllers/main");

define() exports JavaScript modules in such a way that Require can keep track of them. Require keeps a registry of modules it’s already loaded, so if you need a class more than once it isn’t loaded multiple times.

In the example we see a module definition that has an array of dependencies. Require will (asynchronously) load all those dependencies and then run the callback. Dependencies can be aliased to a local variable so you can refer to loaded modules by class name.
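A definition along those lines might look like this (the module paths and names here are illustrative, not the slide’s exact code):

```javascript
define([
  "jquery",                     // aliased to $
  "APP/models/AstronautModel"   // aliased to AstronautModel
], function ($, AstronautModel) {
  // This callback runs only after both dependencies have loaded.
  function MainController() {
    this.model = new AstronautModel();
  }
  return MainController;  // what other modules receive when they require this one
});
```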

Slide 30: What about non-AMD libraries?

No problem. Require.js’s config file allows you to shim non-AMD JavaScript. The slide has an example of some shimmed libraries.

If I require('lib/jqueryui') in my classes, Require will make sure all its dependencies are loaded, and will then track it like any other AMD module.
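A shim configuration along these lines (the paths, and the SpaceLib global, are hypothetical examples):

```javascript
require.config({
  paths: {
    jquery: "lib/jquery"
  },
  shim: {
    // jQuery UI isn't an AMD module, so declare its dependency;
    // Require.js will make sure jQuery loads first.
    "lib/jqueryui": { deps: ["jquery"] },
    // For a library that only sets a global, tell Require what to export.
    "lib/spacelib": { exports: "SpaceLib" }
  }
});
```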

Slide 31: Building and optimizing Require.js

The Require.js optimizer combines related scripts together into build layers and minifies them with UglifyJS or Closure Compiler. It scans your definitions, loads all the dependencies and inlines them into the top of the file. Then it minifies everything and outputs it to a directory of your choosing (we prefer js-built).

This makes it trivial to select the /content/js directory (for dev) or /content/js-built (for staging and production). Nothing else needs to change in your code or your config!
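A build profile for the optimizer might look roughly like this, following the js/js-built convention mentioned above (the directory names and module name are illustrative):

```javascript
// build.js -- run with: node r.js -o build.js
({
  baseUrl: "content/js",
  mainConfigFile: "content/js/main.js",
  dir: "content/js-built",
  optimize: "uglify2",
  modules: [
    { name: "main" }  // a build layer: main.js plus everything it depends on
  ]
})
```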

Loader plugins

Require.js also supports loader plugins to support dependencies that are not JS files, for example: text files, JSON, CSS, images, strings for i18n. There’s a full list of plugins on the require.js wiki

Slide 32: Speed improvements

Building your Require app before deploying it has huge benefits!

The Require.js optimizer can be run as a build task with node or with Rhino for Java. Node is much, much faster. On one project, the Require.js build takes ~60 seconds locally with Node, but over 10 minutes(!!) when run as part of the build using Rhino.

To improve performance, set skipDirOptimize: true, and minify your third-party JS libraries some other way.

Slide 33: Client side templating – the old way

Why do we need a templating engine? Because of code like this. This sort of thing makes maintenance difficult. Any kind of logic (repeaters, conditionals) means lots of procedural code. It mixes disciplines, which gets confusing: there’s an HTML file structure, but also HTML in your JavaScript. Also, it just feels ugly and wrong.

Slide 34: Handlebars.js


Handlebars is a templating engine for JavaScript. You can get it from Bower: bower install handlebars. Web services are a giant step forward in terms of data portability and reusability. A big part of that is having lightweight back ends and powerful front ends. We’re using Handlebars to build dynamic interfaces on the client side.

Slide 35: The Handlebars way

You could put this example template in a hidden script tag with a type of x-handlebars-template (to stop the browser from trying to parse it as JavaScript). But it’s better to put it in an external file (with the .hbs extension) and load it in using Ajax. Require.js has a loader plugin that will load and parse Handlebars for you!

Slide 36: Binding data

Here’s an example of how you bind data to a Handlebars template. We’ve already loaded the template from somewhere as a string. We ask Handlebars to compile() the string into a template. What this does is turn it into a function, which takes a JavaScript object as its model and returns HTML with the {{mustaches}} replaced by values from the model.
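The compile-then-bind flow can be sketched with a toy stand-in. To be clear, this is not the real Handlebars library — just a few lines of plain JavaScript that mimic the compile()/template() shape so you can see what’s happening:

```javascript
// Toy illustration only: real Handlebars handles helpers, conditionals,
// repeaters, and escaping. This stand-in just swaps simple {{name}} tokens.
function compile(source) {
  // Like Handlebars.compile(), returns a template function.
  return function (model) {
    return source.replace(/\{\{(\w+)\}\}/g, function (match, key) {
      return model[key] !== undefined ? model[key] : "";
    });
  };
}

var template = compile("<h1>{{title}}</h1><p>{{body}}</p>");
var html = template({ title: "Hello", body: "From space" });
// html is "<h1>Hello</h1><p>From space</p>"
```

The real library works the same way from the caller’s perspective: compile once, then call the resulting function with as many different models as you like.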

Slide 38: Owens Corning Design EyeQ®

We reviewed an example Handlebars interface, Owens Corning Design EyeQ®, along with the alternate Lowe’s version.

Slide 41: JSDoc


JSDoc is a tool for generating HTML documentation from structured JavaScript comments. You want version 3.3, which runs in Node.js. Older versions require Mozilla Rhino.

You can install JSDoc with npm: npm install -g jsdoc

Once installed, navigate to the folder where your JS files are located in a terminal and run jsdoc -r to document your entire project.

Slide 42: JSDoc tags

JSDoc syntax is very similar to Javadoc. Any comment beginning with /** is interpreted as a JSDoc comment. Comments beginning with // or /* are not picked up.

Not every comment needs to be part of your documentation anyway. Start thinking about the API you’re creating and document the things that make sense. Think about your code modules as classes and consider the public methods and properties each class exposes.

You can add meaning to your JSDoc with tags. The slide contains a few examples – see the tag dictionary for a full list and examples of some common patterns. Our HanBootStrap Yeoman generator also comes with some documented example classes.
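For instance, a small hypothetical class documented with a few common tags might look like this:

```javascript
/**
 * A point in 2D space.
 * @constructor
 * @param {number} x - The horizontal coordinate.
 * @param {number} y - The vertical coordinate.
 */
function Point(x, y) {
  this.x = x;
  this.y = y;
}

/**
 * Distance of this point from the origin.
 * @returns {number} The Euclidean distance.
 */
Point.prototype.length = function () {
  return Math.sqrt(this.x * this.x + this.y * this.y);
};
```

Running jsdoc over a file like this produces an HTML page listing the constructor, its parameters, and the documented method with its return type.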

Slide 43: The JSDoc Default Template

The default template has good information about classes, but it’s severely lacking in navigation and organization.

Slide 45: DocStrap template

Fortunately we have an alternative. JSDoc supports publishing templates; there aren’t many, but there is a very good one called DocStrap. It’s in NPM: npm install ink-docstrap

DocStrap includes a nice pulldown top menu, search, and a fixed navigation bar in every class and module.

Slide 48: Thanks for watching

My contact info is as follows:

