How Concrete and Zencoder brought high quality video to HitRECord

Thanks to Zencoder and HitRECord for the kind words. Encoding high quality video was a snap with their Ruby on Rails cloud technology.

Reposted from:

With infinite “airtime” and consumer demand going through the roof, a consistent theme in 2011 and now for 2012 is that high quality content is essential for entertaining and engaging viewers. To fill the pipes, big companies are pouring big bucks into producing original content for online distribution. Google is investing $100M in original content for YouTube. We all rejoiced as Netflix resurrected “Arrested Development”, and Yahoo landed a Hollywood whale in Tom Hanks and his animated series.

In recognition of the importance of content in advancing the online video industry, we’re highlighting our partner Concrete Interactive and the amazing product that they’ve built for HitRECord. HitRECord, founded by actor Joseph Gordon-Levitt, not only facilitates the creation of high quality Internet video but is also unique in that it’s democratizing, and bringing real innovation to, the process of content creation.

Founded in 2005, HitRECord has established itself as a unique destination for video artists, filmmakers, writers, animators, musicians, and videographers to collaborate and interact with Gordon-Levitt and others on a wide variety of creative endeavors. HitRECord’s library has grown to over 20,000 complete videos. Almost 1,000 contributors have used HitRECord to create films and videos, which are available in “TheRecord Store”. Thousands more have contributed to films shown at HitRECord’s live shows. When a production makes money, there’s a 50/50 revenue split between HitRECord and the co-creators.

Even though we didn’t get to hobnob with the stars in Park City, we were excited to hear of the successful showcase of HitRECord’s works, which were featured last week at the prestigious Eccles Theater as part of the 2012 Sundance Film Festival. Joseph Gordon-Levitt emceed a sold-out screening, presenting films that were developed on HitRECord.

A few factors are making this explosion of Internet content possible. First, it’s getting cheaper and cheaper to create content. The cost of HD cameras and powerful editing suites have fallen rapidly, making it possible to create professional content on a small budget. On the distribution end, highly efficient software and cloud-based infrastructure make it possible to rapidly deploy Internet video applications with a relatively small upfront investment.

To build HitRECord, Concrete Interactive, a San Francisco-based boutique web application development firm, used technologies such as Ruby on Rails, Heroku, Amazon S3, jQuery, New Relic, Splunk, Airbrake, and Kissmetrics. They use Zencoder, and relied on our industry-leading performance to encode HitRECord’s library. They were able to convert tens of thousands of videos into web and mobile outputs overnight, for playback across a variety of devices.

Most importantly, the end product is a very high performance platform that facilitates the creative output of its users. Joseph Gordon-Levitt, or “RegularJoe” on HitRECord, said, “Since working with Concrete Interactive and Zencoder, HitRECord’s video upload process has been smoother, the quality is excellent and processing times are really quick.”

Software Quality

What is Software Quality?

When you speak of software quality, what do you mean?  Product managers generally mean they want their features to work as designed across all target platforms.  Project managers want software to be completed on time and on budget.  Executives want customers to pay good money for a good experience.  And software developers want to build efficient code that gets deployed to their user audience.  In discussions with my clients, that’s usually about as far as it goes: “We want high quality software.”

In this article, I discuss ways to dissect software quality into six relevant areas: ruggedness, architecture, performance, scale, security, and process.  Each of these aspects of quality can then be prioritized, and following each area I highlight actionable ways to improve (or take shortcuts) in your software project.


Time. Features. Quality:  Pick Two.

When asked, “What is most important to you for this project?  Time, features, or quality: pick two,” almost everyone these days will actively choose to have a software project come in on time, and be of high quality, while accepting a somewhat more limited feature set.

But maybe this software adage has become as outmoded as the waterfall model. After all, the time we apply to a software project can be shorter or longer.  The features can be many or few.  So how can we match software quality, in its manifold aspects and degrees, to the goals of a project?

First it is necessary to realize that there is no single thing called “high quality software.”  Quality, when it comes to software, is a way of saying how well a program solves its goal problem.  Let’s break that down into practical areas, since almost all software projects must face choices for budget allocation.


Ruggedness

Software ruggedness is most commonly, and mistakenly, viewed as overall software quality.  In short, rugged software is written correctly, without bugs or user interface flaws, and works well across all its target platforms.  Building rugged software is probably the most widely understood software development practice.  Developing rugged software typically involves labor-intensive approaches such as manual quality assurance testing and bug tracking systems (e.g. FogBugz, Lighthouse, Bugzilla, Jira, Trac).

To ensure a software project becomes rugged, most software development teams “put it to the test.”  An adversarial practice is promoted between QA testers and developers.  This is often viewed as healthy, since many developers tend to label a feature as completed and move on to the next ticket in their queue before engaging in the arduous task of cross-platform and stress testing.  Automated testing, using tools such as Selenium, Fake, or custom scripts, aids the manual process, but it does not reduce the total effort required: it shifts the human testing burden onto a new burden of constructing automated tests, which may in turn have inaccuracies of their own.
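What an automated test encodes can be sketched in miniature.  This is a hedged example: the `slugify` function and its requirements are invented for illustration, not drawn from any project mentioned above.  Each test function turns a manual QA check into a repeatable script.

```python
import re

def slugify(title):
    """Convert a title into a URL-safe slug (hypothetical example)."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

# pytest-style tests: each one encodes a manual QA check as a repeatable script.
def test_basic_title():
    assert slugify("Hello World") == "hello-world"

def test_punctuation_is_collapsed():
    assert slugify("RE: Mix #2!") == "re-mix-2"

def test_no_leading_or_trailing_dashes():
    assert slugify("---") == ""
```

Run under a test runner such as pytest, these checks execute on every build, which is exactly how the automated-test construction burden pays itself back over time.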


Architecture

The single most important aspect of producing high quality software is matching software architecture to its problem domain (read Domain Driven Design).  At times the ultimate domain is not known, such as when a start-up company tries a new concept in the marketplace.  Often the iterative nature of market feedback, combined with an agile software development process, results in patchwork architecture ill-suited to the adapted domain.  Refactoring is the process by which a new software architecture is put in place of an existing one, containing the same feature set but with increased capacity to solve problems in the new domain.

Refactoring is often seen by product managers as a costly and time-consuming enterprise that developers want for “code cleanliness” but with dire consequences for schedule and budget.  Yet without refactoring, a “dishes in the sink” approach can become endemic to the team mindset: another dish, or in this case a poorly architected feature, gets added to the pile until a major reckoning is required.

Agile methodologies reduce the need for major refactoring by dissolving it into a relatively continuous process done along with the tasks of each sprint.  By shortening milestone deadlines into sprints, say three or four weeks, and reconciling the architecture improvements needed for each milestone, a clean household is maintained, and software architecture is built up continually to match the next few milestones.

  • Enforce use of Design Patterns: have developers do brown bag lunches, put this book on every developer’s desk.
  • Code up architecture for the current release.  Draw out architecture for the next two releases.
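A brown-bag-lunch example of what “enforce use of Design Patterns” can look like in practice: below is a minimal sketch of refactoring a growing conditional into the Strategy pattern.  The pricing rules and class names are invented for illustration.

```python
# Before: a conditional that must be edited for every new case.
def price_before(plan, base):
    if plan == "free":
        return 0.0
    elif plan == "pro":
        return base * 0.8
    else:
        return base

# After: the Strategy pattern. Each pricing rule is an interchangeable
# object, so supporting a new plan means adding a class, not editing
# an ever-growing if/elif chain.
class FreePlan:
    def price(self, base):
        return 0.0

class ProPlan:
    def price(self, base):
        return base * 0.8  # 20% discount, invented for illustration

class StandardPlan:
    def price(self, base):
        return base

PLANS = {"free": FreePlan(), "pro": ProPlan(), "standard": StandardPlan()}

def price_after(plan, base):
    return PLANS[plan].price(base)
```

The feature set is identical before and after; what changes is the architecture’s capacity to absorb the next requirement, which is exactly the point of the continuous refactoring described above.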


Performance

When it comes to massive database crunching, 3D game development, or scientific computing, careful attention is normally paid to algorithmic performance (as opposed to application scaling, addressed below).  However, in the course of building web applications, 2D Flash games, or mobile apps, performance is usually examined only when it becomes an issue.  This could be when the number of sprites in a Flash game slows rendering, when mobile memory overflow causes crashes, or when web database queries become unacceptably slow.  Code performance is another area where attention to quality at the right level should be built in as solutions are developed.
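Building performance in at the right level often just means choosing the right data structure before profiling forces the issue.  A minimal, hypothetical sketch: the same membership check against a list (a linear scan on every lookup) versus a set (an average constant-time hash lookup).

```python
import random
import timeit

ids = list(range(50_000))
wanted_list = ids          # O(n): scans the list on every membership test
wanted_set = set(ids)      # O(1) average: hash lookup

probe = random.randrange(50_000)

# Same answer either way; very different cost curves as the data grows.
slow = timeit.timeit(lambda: probe in wanted_list, number=200)
fast = timeit.timeit(lambda: probe in wanted_set, number=200)

assert (probe in wanted_list) == (probe in wanted_set)
print(f"list: {slow:.4f}s  set: {fast:.4f}s")
```

On a list of fifty thousand items the difference is already visible; at the scale of a production database table it is the difference between a responsive page and an unacceptably slow query.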


Scale

In contrast to performance, scale is often considered at the outset of web application projects, where eager management teams assert the need for servicing hundreds of thousands of concurrent users a la Twitter.  Technologies such as Ruby on Rails, Heroku, Google App Engine, virtualization, and Amazon EC2 make scaling web applications easier than ever, and there is very little development cost to throwing more hardware at many common problems.  Of course, database design, server APIs, and application server caching schemes go a long way toward reducing those hardware costs by using the machines more efficiently.
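Caching is one of the cheapest of those efficiency wins.  A minimal sketch, assuming a hypothetical expensive lookup (`fetch_video_count` and its cost model are invented for illustration):

```python
import functools

CALLS = {"count": 0}

@functools.lru_cache(maxsize=1024)
def fetch_video_count(user_id):
    """Stand-in for an expensive database query."""
    CALLS["count"] += 1       # track how often the "database" is really hit
    return user_id * 3        # fake result for illustration

# One hundred identical requests hit the cache, not the database.
for _ in range(100):
    fetch_video_count(42)

assert CALLS["count"] == 1    # a single real query served all 100 requests
```

In a real application the same idea appears as page caching, fragment caching, or a memcached/Redis layer in front of the database; the principle is identical: serve repeated reads without repeating the work.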


Security

Probably the most overlooked area of emphasis when developing a new software product or web service is security.  Frequently security breaches are recoverable, except for the damage done to customer opinion (take the recent Sony debacle).  Life usually does go on after getting hacked or having a Distributed Denial of Service (DDoS) attack take down your site or application for a time; scrambling to plug holes is the normal route.  But as with all other areas of software quality, a few simple steps at the beginning or middle of the development cycle can prevent the vast majority of infiltrations.  Spending a day or two on threat modeling, updating security patches, and reading documentation about how to defend against common attacks is the equivalent of locking your door when you leave the house: not impenetrable, but likely to encourage the attacker to move along to the next property.
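One of those door-locking steps, sketched below: parameterized queries, the standard defense against SQL injection, one of the most common attacks.  The schema and data here are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 1), ('bob', 0)")

attacker_input = "x' OR '1'='1"

# Vulnerable: string interpolation lets the input rewrite the query itself.
unsafe = conn.execute(
    f"SELECT name FROM users WHERE name = '{attacker_input}'"
).fetchall()

# Safe: the driver binds the value, so it can never become SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (attacker_input,)
).fetchall()

assert unsafe == [("alice",), ("bob",)]  # injection leaked every row
assert safe == []                        # bound parameter matched nothing
```

The fix costs nothing at development time, which is exactly the point: a day or two of such habits up front beats weeks of scrambling after a breach.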


Process

Light enough not to be burdensome, yet sophisticated enough to fit the tasks at hand, a software development process should fit the team, the company, and the product.  Easier said than done.  Modern processes such as Agile overturn decades of doctrine built on heavy process, such as lengthy code comments and prolonged design cycles, in favor of iteration and, well, agility.  The best software processes are fun, and since development teams are human, it is easier to convince them to do fun things than arduous ones.  The danger is that a methodology such as Agile, or a goal definition such as minimum viable product, gets used as an excuse for slipshod project management and coding practices.  Finding balance in this realm means focusing on just what is important and having the right toolset to support just that narrowest of procedures: daily scrum meetings, hand-drawn feature sketches, paper prototypes presented to potential customers, continuous deployment tools like Hudson and Capistrano.

  • Inculcate Test Driven Development into your process.
  • Use fast, analog approaches where possible: whiteboard, stickies, physical calendar on the wall.
  • Draw paper prototypes by hand (or with a ruler) and iterate frequently.
  • Tell your project manager to scan each design iteration and post to a wiki page.
  • Go to a cafe and buy strangers lunch in exchange for spending 5 minutes playing with your product on a laptop or phone.
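The first bullet above, test-driven development, can be sketched in miniature: write a failing test that states the requirement, then write just enough code to make it pass.  The `word_count` function and its requirements are hypothetical.

```python
# Step 1: the test comes first and states the requirement.
def test_word_count():
    assert word_count("") == 0
    assert word_count("hit record") == 2
    assert word_count("  spaced   out  ") == 2

# Step 2: the simplest implementation that makes the test pass.
def word_count(text):
    return len(text.split())

test_word_count()  # green: the cycle ends here, then refactor and repeat
```

The red-green-refactor rhythm is what makes TDD stick as a process: it is short, it gives constant feedback, and (per the point above about human teams) it is genuinely fun.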


It is my hope that, by developing a deeper working definition and understanding of these six aspects of software quality, teams will avoid many quality-related pitfalls and make better decisions about how to allocate effort.  It isn’t necessary to spend time on every one of these areas, but it is necessary to consciously decide whether or not to.