Visual Regression Testing Part 2: Extending Grunt-PhantomCSS for Multiple Environments

Earlier this month, I explored testing dynamic content with PhantomCSS in the first post of a multi-part blog series on visual regression testing. Today, I dive a little deeper into extending grunt-phantomcss for multiple environments.

Automation is the name of the game, and so it was with our visual regression test suite. The set of visual regression tests that we created in part 1 of this series allowed us to test for cosmetic changes throughout the Department of Energy platform, but it was inconvenient to fit into our workflow. The GruntJS task runner and the Jenkins continuous integration tool were the answer to our needs. Here, in part 2, I’ll walk through how we set up Grunt and the updates we contributed to the open source grunt-phantomcss plugin.

Grunt is widely used nowadays, allowing developers to script repetitive tasks, and with the increasingly large repository of plugins it would be crazy not to take advantage of it. Once installed, typically through NPM, we set up a Gruntfile containing the list of tasks available to developers to execute through the command line. Our primary task would not only be in charge of running all of the visual regression tests, but also allow us to specify the environment we wished to run our tests against. With DOE, we have four such environments: local development environments, an integration environment, a staging environment, and our production environment. We needed the ability to maintain multiple sets of baselines and test results, one for each environment. Achieving this required an extension of Micah Godbolt’s fork of grunt-phantomcss.

The grunt-phantomcss plugin establishes a PhantomCSS task to run the test suite. In the original plugin, all baselines and results are stored in two top-level directories, which is less than ideal because it works against modularity. Micah Godbolt’s fork stores each test’s baseline(s) and result(s) in the directory of the test file itself, keeping the baselines, results, and tests together in a modular directory structure with less coupling between the tests. This made Micah’s fork a great starting point for us to build upon. Adding it to our repo was as easy as adding it to our package.json and running npm install.
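In package.json, pointing npm at a GitHub fork instead of the registry package is a one-line entry; the repository path below is illustrative:

```json
{
  "devDependencies": {
    "grunt-phantomcss": "micahgodbolt/grunt-phantomcss"
  }
}
```

npm resolves the `user/repo` shorthand to a GitHub tarball, so `npm install` pulls the fork just like any other dependency.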

After mocking up our Gruntfile based on the grunt-phantomcss documentation, we needed a way to specify the environment to run our visual regression test suite against: the ability to pass a parameter to Grunt through the command line that selects the target environment.
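The sort of invocation we had in mind (the task and environment names here are placeholders):

```
# Run the full visual regression suite against the staging environment
grunt phantom:staging
```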

First, we needed to establish the URL associated with each environment. Rather than hard-coding this into the Gruntfile, we created a small Gruntconfig JSON file of key-value pairs matching each environment to its URL. This allows other developers to easily change a URL to match their own environment.
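Our Gruntconfig file ended up as a small key-value map along these lines; the URLs are placeholders rather than the real DOE addresses:

```json
{
  "local": "http://localhost:8080",
  "integration": "http://integration.example.com",
  "staging": "http://staging.example.com",
  "production": "http://www.example.com"
}
```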

Importing the key-value pairs from JSON into our Gruntfile was as easy as a single readJSON function call.
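In the Gruntfile, that single call (file name assumed) looks like:

```javascript
// Load the environment-to-URL map once, near the top of the Gruntfile
var environments = grunt.file.readJSON('Gruntconfig.json');
```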

Next, we needed a Grunt task that would accept an environment parameter and pass it through to grunt-phantomcss. This way CasperJS could store these baselines and results in a particular directory for the environment specified. We achieved this by creating a parent task, “phantom,” that would accept the env parameter, set the grunt-phantomcss results and baselines options, as well as a new rootUrl option, and then call the “phantom-css” task.

The rootUrl option is eventually passed through to CasperJS in each test file, where it is prepended to the relative URL of each page we visit.
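A stripped-down sketch of what our parent task computes for each environment; the environment names, URLs, and directory layout here are illustrative, not the real DOE values:

```javascript
// Sketch of the logic behind the "phantom" parent task. In a real Gruntfile
// these values would feed grunt.config.set(...) before running 'phantomcss'.
var urls = {
  local: 'http://localhost:8080',
  integration: 'http://integration.example.com',
  staging: 'http://staging.example.com',
  production: 'http://www.example.com'
};

function phantomOptions(env) {
  if (!urls[env]) {
    throw new Error('Unknown environment: ' + env);
  }
  return {
    rootUrl: urls[env],                   // passed through to CasperJS
    baselines: 'tests/baselines/' + env,  // per-environment baselines
    results: 'tests/results/' + env       // per-environment results
  };
}
```

Keeping the environment a plain lookup key means adding a fifth environment is a one-line config change rather than a new task.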

Extending Grunt-PhantomCSS

Now that the Gruntfile was set up for multiple environments, we just needed to update the grunt-phantomcss plugin. With Micah’s collaboration we added a rootUrl variable to the PhantomJS test runner that would accept the rootUrl option from our Gruntfile and pass it to each test.

We made sure to maintain backwards compatibility by keeping the rootUrl option optional, so existing integrations of the grunt-phantomcss plugin would not be adversely affected by our updates. The final step was to update our tests to account for the new rootUrl variable, prepending the rootUrl to each now-relative page URL.
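One way to keep rootUrl optional (sketched here as a plain helper, not the plugin’s actual code) is to fall back to treating the page URL as absolute when no root is configured, which is how older integrations already behave:

```javascript
// Hedged sketch: resolve the URL a test should visit.
function resolveUrl(rootUrl, pageUrl) {
  if (!rootUrl) {
    return pageUrl; // old behaviour: pageUrl is already absolute
  }
  return rootUrl.replace(/\/+$/, '') + pageUrl; // prepend, avoiding '//'
}
```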

With the grunt-phantomcss plugin updated, we were able to run our visual regression tests against multiple environments, depending on our needs. We could run tests after pull requests were merged into development. We could run tests after every deployment to staging and production. We could even run tests locally against our development environments as desired.

Bonus: Tests for Mobile

After all our success thus far, we wanted to add the ability to specify the viewport to our Gruntfile. We have particular tests for each of our four breakpoints: full-size for desktops, mid-size for large tablets, narrow for “phablets”, and tiny for phones. This was an easy lift, just requiring a few more tweaks to our Gruntfile.

Here we set up four sub-tasks within the “phantomcss” task, one for each breakpoint. Each subtask specifies the viewport size and the location of the associated test files. Then we updated our parent task “phantom” to take two arguments: an environment parameter and a breakpoint parameter. Both also needed defaults in case either argument was not specified.
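The two-argument signature with defaults can be sketched as a small helper; the breakpoint names and viewport sizes are placeholders, not DOE’s actual values:

```javascript
// Sketch of argument handling for a phantom:env:breakpoint style task.
var viewports = {
  full: [1280, 1024],   // desktops
  mid: [1024, 768],     // large tablets
  narrow: [768, 1024],  // "phablets"
  tiny: [320, 568]      // phones
};

function phantomArgs(env, breakpoint) {
  env = env || 'local';              // default environment
  breakpoint = breakpoint || 'full'; // default breakpoint
  return { env: env, breakpoint: breakpoint, viewport: viewports[breakpoint] };
}
```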

Additionally, we didn’t want a single test failing to halt the execution of the rest of our tests, so we added the grunt-continue plugin to our package.json. Grunt-continue essentially allows all tests to run regardless of errors, but will still cause the overall “phantom” task to fail in the end if a single test fails. Here is what our new “phantom” task looks like:
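A sketch of such a registration, with the config lookup inlined so it stands alone; the paths and defaults are assumptions, and the continue:on / continue:off / continue:fail-on-warning task names are our recollection of the grunt-continue plugin’s API, so verify them against its README:

```javascript
// Hedged sketch of the parent "phantom" task wiring.
var environments = grunt.file.readJSON('Gruntconfig.json'); // env -> base URL

grunt.registerTask('phantom', function (env, breakpoint) {
  env = env || 'local';
  breakpoint = breakpoint || 'full';

  grunt.config.set('phantomcss.options.rootUrl', environments[env]);

  grunt.task.run([
    'continue:on',              // record failures instead of halting
    'phantomcss:' + breakpoint, // run the breakpoint's sub-task
    'continue:off',
    'continue:fail-on-warning'  // fail the overall task if any test failed
  ]);
});
```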

It was a success! Through the power and versatility of Grunt and the various open source plugins tailored for it, we were able to achieve significant automation of our visual regression tests. We were happy with our new ability to test across a range of environments, combating regressions and ensuring our environments are kept in a stable state.

But we hadn’t reached our full potential yet. The workflow wasn’t fully automatic; we still had to manually kick off these visual regression tests periodically, and that’s no fun. The final piece of the puzzle would be the Jenkins continuous integration tool, which I will be discussing in the final part of this Department of Energy Visual Regression Testing series.

Subscribe to the Phase2 mailing list to learn when the next post in the visual regression test series goes live!

Phase2 Takes Los Angeles: Watch All Our DrupalCon Sessions!

What a Week!

It’s hard to believe DrupalCon 2015 has already come and gone. As always, it was an event jam-packed with knowledge sharing, learning, sprinting, and of course a healthy dose of fun as we celebrated with the Drupal community. Add some virtual reality and 360 degree videos into the mix, and it’s safe to say we had a fantastically geeky time in Los Angeles (just the way we like it!).


Catch All the Recorded Phase2 Sessions

If you weren’t able to attend all the Phase2 sessions you were looking forward to, never fear! Thirteen Phase2 experts presented at this year’s DrupalCon, and each of their sessions is already available online. Catch up on them here:



Plus – Drupal 8 Sessions!

After launching one of the first enterprise Drupal 8 sites in the United States earlier this month with Memorial Sloan Kettering Cancer Center, we were excited to share what we’ve learned at DrupalCon. Watch our Drupal 8 sessions now!



Want to see more photos of the Phase2 team at DrupalCon Los Angeles? Visit our Flickr account!

Driving Drupal 8 Forward Through Early Adoption

Last week, we were proud to announce the launch of Memorial Sloan Kettering Cancer Center’s enterprise Drupal 8 site, one of the first major Drupal 8 implementations in the U.S. One of the awesome benefits of working on this project was the opportunity to move Drupal 8 forward from beta to official release. Phase2 has been instrumental in accelerating Drupal 8, and we were excited that Memorial Sloan Kettering was equally invested in giving back to the community.

Benefits of starting early

Getting started during the beta phase of Drupal 8 meant that it wasn’t too late to get bugs fixed and tasks completed. Even feature requests can make their way in if the benefits outweigh the necessary changes to core.

Similarly, if other agencies and shops starting to use Drupal 8 are going through many of the same issues, there is more of an opportunity for collaboration (both on core issues and on contrib upgrades) than on a typical Drupal 7 project.

MSK, first Drupal 8 site launched

By the numbers

As of this writing, 57 patches have been directly contributed and committed to Drupal 8 as part of this project. Additionally, nearly 100 issues have been reviewed, marked RTBC, and committed. Hundreds of old and long neglected issues have been reviewed and moved closer to being ready.

Often, to take a break on a particularly tricky issue, I’d switch to “Issue Queue Triage” mode, and dive into some of the oldest, most neglected corners of the queue. This work brought the oldest Needs Review bugs from ~4 years to less than 4 months (the oldest crept back up to 6 months once I started circling back on myself).

This activity is a great way to learn about all the various parts of Drupal 8. Older issues stuck at Needs Review usually need, at minimum, a substantial reroll. I found that once I tagged something with Needs Reroll, legions of folks swooped in and did just that, increasing activity on most issues and eventually getting many committed.

One of my favorite but as-yet-uncommitted patches adds Views integration for the Date module. It’s still marked Needs Review, so go forth and review! Another patch, which is too late for 8.0.0, adds a very basic draft/moderation workflow to core. This patch is another amazing example of how powerful core has become: it is essentially just UI work on top of APIs already in Drupal 8.

Brad Wade, Phase2 Developer at DrupalCon Los Angeles

Porting contrib modules to Drupal 8

This project has contributed patches and pull requests for Drupal 8 versions of Redirect, Global Redirect, Login Security, Masquerade, Diff, Redis, Memcache, and Node Order.

One of the remarkable things about this project, and a testament to the power of Drupal 8, is how few contributed modules were needed: some 114 contrib modules on the Drupal 6 site compared to only 10 on the Drupal 8 site.

Considering Drupal 8 for your organization? Sign up for a complimentary Drupal 8 consultation with the Phase2 Drupal 8 experts.


Evan Liebman Talks Drupal 8 and the Importance of Community at MSK

Over the past several months, our team has had the pleasure of helping to build Memorial Sloan Kettering Cancer Center’s (MSK) new sites on Drupal 8. Evan Liebman, Director of Online Communications and Technology at MSK, shares his experience with Phase2, Drupal 8, and everything in between.

Q: How did the culture of innovation and leadership at MSK play into the decision to adopt Drupal 8?

A: When we were evaluating our CMS options, what drew us to Drupal 8 was its clear alignment with several of MSK’s strategic pillars. First, innovation. We have researchers and clinicians at MSK who regularly push boundaries to innovate and generate new knowledge. We are inspired by their relentless efforts and are driven to do the same in our space. Second, sustainability. Because we were migrating from a Drupal 6 site, we had to choose between upgrading to Drupal 7 and quickly following a launch with a move to Drupal 8 or making the leap to Drupal 8, which was still a beta. We saw more of a long-term future with Drupal 8. Third, talent recruitment. The use of Symfony and Object Oriented Programming in Drupal 8 means that Drupal is becoming more accessible to more developers. In essence, their inclusion is a signal that the Drupal community will only continue to grow, Drupal and MSK growing with it.

Q: What was the most unexpected thing about building in Drupal 8?

A: To be honest, the most surprising part was how easy it was! From the things we’d heard, we thought it was going to be extremely difficult, and there was a learning curve involved. But once we’d gotten past that, Drupal 8 wasn’t as challenging as expected. The tools built into Drupal 8 core really helped to speed up the process. For instance, we went from 40 custom modules on Drupal 6 to 10 on Drupal 8 because more functionality was included in core.

Q: What is the biggest benefit of building in Drupal 8?

A: The most beneficial thing about Drupal 8 is the community effort that surrounds it. This is really important to us at MSK. Our clinicians and researchers work together across departments and specialties to give our patients the best care possible. It’s a multi-stakeholder effort. So the opportunity to be a part of the Drupal community, giving to others and knowing that in some ways it comes back to us — that was a major benefit of Drupal 8. Then, of course, there’s the added benefit of having the most up-to-date technology, which is important for being on the cutting edge in the healthcare industry.

Q: What are the most important Drupal 8 modules/code Memorial Sloan Kettering contributed back to the community?

A: One of the most exciting contributions is an alternative to Web Forms we created using YAML, thanks in large part to Jonathan Hedstrom of Phase2 and Jacob Rockowitz of Big Blue House. We’ll be sharing more details on that one at DrupalCon LA. But a big part of our contributions related to un-blocking issues and fixing bugs in core. As of now, 53 patches have been directly contributed and committed to Drupal 8 as part of this project, and nearly 100 issues have been reviewed. All of this work has kept Drupal 8 moving forward toward an official release.

Check out Jonathan Hedstrom’s blog post for details on specific Drupal 8 patches, issues, and modules!

Q: So we’ll be seeing Memorial Sloan Kettering at DrupalCon?

A: Yes, you will! This was a really unique project in that it brought together three different organizations — MSK, Phase2, and DigitasLBi — as we all collaborated to learn the ropes on new technology. We feel it is important that we tell our story together, especially because there is a lot to tell! So you can look for us on stage at the business showcase at 10:45am on Wednesday. I will also be joining Frank Febbraro and Michael Le Du of Phase2 to discuss Drupal 8 for enterprise organizations on a panel called “Drupal 8, Don’t Be Late.” In addition, MSK will participate in several BOFs throughout DrupalCon, including one focused on the YAML Forms module — so stay tuned for more information on those!

Q: What are some ways in which the MSK team collaborated with Phase2 and Digitas to move the project forward?

A: Before we could effectively collaborate with our partners, we developed a core internal team to help us navigate the project. Then, it was very important that each organization had a seat at the table from the beginning, so everyone could see the roadmap from the start. Equally crucial was keeping open lines of communication. MSK really prioritized internal and cross-organizational communication, and that paid off during the later stages of the project.

Q: What advice would you give to other enterprises embarking on a Drupal 8 project?

A: Take the plunge! There’s a sense in the community that Drupal 8 is daunting, and although that may be true in the beginning, your velocity picks up quickly. There’s definitely a learning curve, but it lasts for a relatively short period of time. So if you’re on the fence, go for it! But make sure you choose wisely when selecting your partners. We chose Phase2 because of their experience as early adopters of previous versions of Drupal, and that expertise served us well.

Learn more about Phase2’s work with Memorial Sloan Kettering here.

8 Must-See Sessions at DrupalCon Los Angeles

DrupalCon Los Angeles is just around the corner and there are a ton of awesome sessions to attend. Every year, the top minds in the Drupal community present their thoughts on the tools and processes that will shape the future of Drupal. As always, the number and quality of the sessions available for consumption at DrupalCon is immense.


Having been part of the DrupalCon planning committee, I’ve been privileged to help shape the track selection criteria, review session submissions, and provide support to other committee members. To help others narrow their options for session attendance, I’ve created a short list of sessions that have me excited to attend.

Disclosure – While I’m excited that several sessions were submitted by the fine folks at Phase2, in the interest of neutrality I’ve deliberately excluded sessions submitted by my co-workers. You can read up on Phase2’s sessions here.

What are the trends?

One advantage of having to review all the sessions on the Coding & Development track is quickly becoming acquainted with the patterns emerging in the community. This year, of course, there was a plethora of Drupal 8 sessions – which makes sense given that D8 is in beta.

Like many in the Drupal community, I’m excited to see the new features and improvements to Drupal core. However, the move to a more Object-Oriented philosophy means that the old procedural ways of doing things are shifting. I’m looking forward to the Symfony and D8 sessions addressing this.

The other trend that has been taking the Drupal world by storm is Headless Drupal. As far as I can tell, the common thread here is Drupal’s front-end theming layer being replaced by JavaScript applications such as Angular, React, or Ember. Drupal’s role in the process is one of a database UI to curate content and manage non-theme layer configuration, such as editorial workflow and third party content aggregation.


8 must-see DrupalCon sessions:

Drupal 8: The crash course

DrupalCon without Larry Garfield (Crell) would be like spring in the Northwest without daffodils. I routinely enjoy his talks and blog posts, and he excels at breaking a complex subject into a series of easy-to-understand bites that clarify even the thorniest of topics. His session this year is an introduction to the new systems in Drupal 8, which assumes (but does not require) some knowledge of Drupal 7. This is one of many Drupal 8 sessions, but unless you’ve already been working in D8, it wouldn’t be a bad idea to target this one in particular.

Drupal 8′s render pipeline

This session focuses on the new ways that Drupal renders page content in version 8 – specifically the new caching regime that caches entities instead of just nodes, improved cache invalidation (cache tags, & bubbling FTW!), and so forth. Cache tags mean that individual cache components can be reset on certain events (expiring a blog post feed’s cache when there’s a new post, caching the latest version of the page on node save rather than after a fixed two hours). This is a huge win for performance nerds, and will have a significant effect on Drupal’s performance as a standalone application, as well as as part of a larger web stack.

What’s Next for Updates on Strategic Initiatives

Josh Mitchell, the new CTO of the Drupal Association, has come up with a set of strategic initiatives to improve D.O. as a resource for all members of the Drupal community. This is something I feel has been a long time coming, and I for one am excited to hear his ideas for the future of D.O.

Issue Workspaces: A better way to collaborate in issue queues

Interacting directly with the code base on D.O. is something of a challenge for people new to the community: it requires fairly advanced knowledge and can be a big barrier to new folks contributing on issue queues. The Drupal issue queue needs to modernize to mimic best-of-breed code repository tools such as Bitbucket and GitHub. It’s exciting to see how is evolving to support a more git-friendly workflow.

We need revisions and CRAP everywhere in core

Dick Olsson (Dixon_), maintainer of the Deploy and UUID modules, posits that while content staging will never be in core, it should be easy enough to implement a Create Read Archive Purge model of content workflow. I believe this session will extend his previous sessions from Austin and Amsterdam, focusing on what needs to be done to extend this functionality out from core using contrib modules. This session also has the added benefit of having a related sprint on Friday.

What Panels can teach us about Web Components

Drupal often blurs the line between data and display layers in an application, as anyone who has written a custom theme function or a template file can attest. The Panels module is an effective way to decouple display and data layers. Anyone who has been involved with the Panels module knows its immense power. Therefore, this could be an interesting session to preview potential improvements to Drupal core (which seems to have been unaffected by the recent trend towards Headless Drupal).


To the Pattern Lab! Collaboration Using Molecular Design Principles

For the uninitiated, Pattern Lab is a dynamic prototyping system that focuses on breaking down a page into small, self-contained blocks of content. These blocks can be combined into multiple configurations without needing to rebuild everything from scratch. Furthermore, since the prototype is being viewed in the browser, elements are styled using CSS, and the markup can be edited to mimic Drupal’s native markup structure. As a result, the prototype’s style closely imitates the styling of the Drupal site, reducing duplicated effort in the theme creation and prototyping phases. As a bonus, because the system uses the web stack, the site can be designed as responsive from the beginning.

Making Content Strategic before “Content Strategy” Happens

Content Strategy can be defined as the process of planning content so as to maximize its effect for users. I’m excited to hear that people in the community are also interested in creating content that is engaging, compelling, and interesting.

So many sessions, so little time

Below are some interesting simultaneous sessions, so you will have to choose which one you’d rather attend. But fear not, all the sessions should be recorded to view at a later time.

If I were a themer or coder, and a fan of fast demos, I’d go to: 0 to MVP in 40 minutes: Coder and Themer get rich quick in silicon valley. If I were a SysOps aficionado and interested in hearing from a couple of Drupal community heavyweights I’d go to: You Are A Golden God: Automate Your Workflow for Fun and Profit.


This next one is a toughie. This time slot is occupied by three sessions.

For the Content Strategy nerd in me, nothing makes me happier than seeing the DA take steps to create model content that helps communicate Drupal’s mission: Content Strategy for For the front-end storytelling nerd in me, there’s: Styles of Storytelling: Cultivating Compelling Long-form Content. Finally, Steve Persch (stevector) makes the case for extracting Drupal from the business of generating markup for a webpage: Rendering HTML with Drupal: Past, Present and Future.

If I were a junior dev looking to level up into a more senior dev role, I’d attend: De-mystifying client discovery. If you haven’t already been to a Headless Drupal session, or you’re a fan of Amitai Burstein’s colorful presentation style, go to: Decoupled Drupal: When, Why, and How.

Todd Nienkerk’s talk on company culture was very warmly received at DrupalCon Latin America this spring, and I’m excited to hear this in person: Creating a Culture of Empowerment. Another session I’ve had on my radar ever since Ryan submitted: Routes, controllers and responses: The Basic Lifecycle of a D8 Request. He’s a must-see presenter.

If you’d like to get better acquainted with the D8 plugin system, An Overview of the Drupal 8 Plugin System would be right up your alley. Larry Garfield (Crell) is also presenting a workshop on what shouldn’t be the focus of Drupal core: No.

Clearly, DrupalCon LA will be a really exciting opportunity to grow skills, both as a developer and as a community member. As always I’m looking forward to attending many of these sessions, and also for the opportunity to network and contribute to the success of the Drupal project.

Transforming Enterprises with Drupal 8

As we’ve said before, enabling organizations to transform digitally is at the heart of Phase2’s focus on content, collaboration, and experience. A key element of effective transformation is the combination of adaptability and foresight – in other words, the ability to see what new technologies are coming, understand their potential, and harness their power for your benefit.

In the world of open source CMS solutions, that imminent technology is Drupal 8. Although a long time coming, Drupal 8 is still an unknown quantity for many organizations. The way we see it, companies’ willingness to pick it up and run with it (strategically!) will play a major role in their success in the coming years.

MSK & Drupal 8, A Commitment to Innovation

Last year, Phase2 teamed up with Memorial Sloan Kettering Cancer Center to act as the organization’s Drupal technology partner after they had made the innovative decision to be Drupal 8 pioneers. The MSK team had more than a simple migration in mind: they endeavored to build one of the very first enterprise-scale Drupal 8 sites, despite the fact that D8 only existed in a beta form at the time. This decision reflected the center’s ability to see the big picture and boldly pursue innovation. In everything from patient care to biomedical research, MSK constantly seeks new ways to advance beyond what was previously thought possible, and their attitude towards digital transformation was no different.


Major Perks of D8

In addition to the power in core, which allowed the team to build with only about ten contributed modules, there were vast improvements in extensibility, testing, templating, and configuration management.


Extensibility

The ability to extend plugins and services is far greater in Drupal 8; instead of struggling with yet-to-be-ported contrib modules, our team was free to create custom code specifically fitted to MSK’s needs. The idea of inheritance also made custom code easier to manage.

Object-Oriented Programming

One of the trickiest learning curves of Drupal 8 was also the catalyst for a lot of saved time. Object-oriented programming forced us to take a highly structured approach, isolating our business logic into objects. This results in separate pieces that can move forward independently of one another: you can run your migration without knowing how things are going to be themed, and you can theme things without knowing how all content will be structured.

Testing

The level of testing integrated directly into Drupal 8 core makes it significantly easier to confidently maintain MSK’s site functionality as Drupal 8 continues to evolve. The existence of self-documenting tests, which weren’t available in Drupal 6, was a great positive change for the MSK team.

External Libraries

Drupal 8’s incorporation of Twig accelerated the theming process. In addition to Twig, the inclusion of external libraries (JavaScript libraries like Backbone, plus PHPUnit, Guzzle, and Symfony’s YAML, Routing, and Dependency Injection components, to name a few) created a great framework for our developers to work in.

Don’t Miss the D8 Train

We fully believe Drupal 8 (even as a beta) is a valuable alternative to Drupal 6 and 7, especially for enterprise organizations that can combine the core with extended custom code. What’s more, the community needs more organizations to take the leap to Drupal 8 to facilitate improvements and provide influential feedback to the community. Phase2 and MSK were able to contribute a significant amount of code back to the project. To move Drupal 8 closer to an official release, more organizations need to invest in its creation through projects of this kind – and Drupal vendors need to be ready to support them.


Drupal 8 is a win-win for enterprises and the Drupal community alike. Are you and your organization ready to transform with Drupal 8? Take the first step by attending our DrupalCon session with Memorial Sloan Kettering (or our session on Drupal 8 for enterprise organizations!). You can learn from the challenges we faced and come away with a list of hints, tricks, and best practices for beginning your own Drupal 8 project. In the meantime, stay tuned to our blog for more on our adventures in Drupal 8.


Visual Regression Testing: How to Test Dynamic Content with PhantomCSS

Visual regression testing is an incredibly useful and growing area of testing in the software development sphere. Launching off Micah Godbolt’s recent blog post, CSS Testing with PhantomCSS, PhantomJS, CasperJS and Grunt, we worked with the Department of Energy to expand their existing suite of tests to include visual regression testing. Then we went one step further, automating the process with continuous integration via Grunt and Jenkins.

This blog post represents part 1 of a 3 part blog series on visual regression testing. Today I’ll dive into how we visually tested the Department of Energy platform without interference from dynamic content.

The Department of Energy platform has long had a suite of Behat tests that run prior to every deployment, enabling our combined team to test key functionality throughout the site. While a great asset to the platform, Behat unfortunately doesn’t address cosmetic changes. This could be anything from minor font-size changes to major contextual changes. However, with the integration of the PhantomCSS library, we have been able to supplement Behat with a new suite of visual regression tests. These new tests allow us to visually compare pages or elements of a page against baseline screenshots, confirming that no visual changes have occurred.

Because the platform includes over 60 offices, each with its own subset of constantly updated pages and customizations, we had to take a unique approach to the granularity of our tests. Typically, you’d use PhantomCSS to target only specific elements within a site. But taking snapshots of individual elements whose content could change at any time would give us unreliable baselines, so we instead used PhantomCSS to test our layouts and select global theming components. To avoid a flood of false positives, we needed a mechanism to exclude dynamic user content and temporary styles from our screenshots.

Better Buildings Residential Network

First, we identified the dynamic content and targeted it.

Then, prior to every test of the UI, we have CasperJS evaluate several snippets of jQuery.

We first hide the dynamic content within our pages: text, images, videos, and so on. Doing so ensures that swapping out one piece of content for another will not cause our tests to fail. Note that we only hide the content rather than removing it, so we don’t alter the structure of the page. Next, we set a fixed height on each parent element so that adding or removing a piece of content within a container will not cause our tests to fail from the container growing or shrinking. This leaves us with only fixed outlines of the containers on each page, allowing us to test the layout and structure of pages within the platform.
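The evaluated snippets might look roughly like the following; the selectors are illustrative stand-ins for the platform’s real classes:

```javascript
// Run inside the page context before each screenshot
casper.evaluate(function () {
  // Hide (don't remove) dynamic content so the page structure is unchanged
  jQuery('.field-content, .dynamic-block, .video-embed').css('visibility', 'hidden');

  // Freeze each container's height so added/removed content can't resize it
  jQuery('.region, .block').each(function () {
    jQuery(this).css('height', jQuery(this).height() + 'px');
  });
});
```

Using visibility rather than display keeps each element’s box in the layout, which is exactly what lets the structural outlines remain testable.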

Now, within a Drupal context, we are able to test how many blocks are present on a page and if regions of our contexts are properly laid out. Furthermore, a user simply adding a paragraph to an article or replacing one image with another won’t cause our tests to fail, falsely indicating the platform has broken. Below are two examples of baseline screenshots that our test suite generates.

Better Buildings Neighborhood Program Baseline

Public Services


Testing Static Elements

For a few significant, unchanging elements on the platform, we tested the element as a whole instead of concealing its theming and content. These elements include the header, the navigation, and the footer. For each, we simply target the element's ID and let PhantomCSS do its thing.
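
In test code this is nearly a one-liner. The sketch below runs under CasperJS with PhantomCSS initialized (not under plain Node); the URL and element ID are illustrative.

```javascript
// Runs inside a CasperJS test file with phantomcss already require()d
// and initialized; URL and selector are illustrative.
casper.start('http://energy.gov/').then(function () {
  // Capture just this element; PhantomCSS diffs it against the stored baseline.
  phantomcss.screenshot('#footer', 'doe-footer');
});

casper.then(function () {
  // Compare every screenshot taken in this run against its baseline.
  phantomcss.compareAll();
});
```

On the first run this produces the baseline image; on subsequent runs it produces a result image and, if they differ, a highlighted diff like the one shown below.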

Department of Energy Footer

What happens when a test fails?

When a test fails, the screenshot diff is shown to us, highlighting any and all changes. For example, when the Department of Energy's footer is altered, the following test result is generated.

Department of Energy Footer Failed Test

When a block is mistakenly removed from a page we receive the following failing test result.

Better Buildings Neighborhood Program Failed Test

This was a great first step. After creating these tests for many of the pages on the Department of Energy platform, we were able to properly test the layout of our pages – but it left us wanting more. We needed an improved workflow to automatically run our visual regression tests after pull requests were merged to development, after every deployment to staging and production, and locally as desired. We needed an update to the open-source grunt-phantomcss module and the introduction of Jenkins, both of which will be covered in subsequent blog posts.

Subscribe to the Phase2 mailing list to learn when the next post in the visual regression test series goes live!

Announcing Features for Drupal 8


The first Alpha version of the Features module for Drupal 8 is now available! “But wait!” you say. “I thought we didn’t need Features anymore now that we have the Configuration Management Initiative (CMI) in Drupal 8?”

This article will explain why some sites will still need Features, how Features works with CMI, and how Drupal 8 Features differs from the Drupal 7 version.

Features and CMI

Built into Drupal 8 core, CMI cleanly and consistently separates “content” (articles, blogs, etc.) from “configuration” (views, content types, fields, etc.). Rather than forcing each module to create its own import and export format, CMI uses YAML (.yml) files to store configuration data. CMI also provides mechanisms for deploying configuration between sites, including development, test, and production environments.
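
For example, an image style exported by CMI is just a YAML file. The file below is a hypothetical, slightly simplified illustration (real exports key each effect by UUID):

```yaml
# image.style.gallery_thumbnail.yml - a hypothetical image style as CMI
# would store it (effect UUIDs omitted for brevity).
langcode: en
status: true
name: gallery_thumbnail
label: 'Gallery thumbnail'
effects:
  scale:
    id: image_scale
    weight: 0
    data:
      width: 100
      height: 100
      upscale: false
```

Every piece of configuration on the site, from content types to views, is stored in this same uniform format.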

Drupal 7 didn’t have CMI, and because the Features module could export and import configuration data as code modules, Features was often used for configuration management and deployment. But Features wasn’t designed to be a full configuration management system; it was initially written to “bundle reusable functionality”. The classic example was creating a Photo Gallery: a bundle of a content type, some fields, a view, and an image style.


To create a Photo Gallery module in Drupal 8 without Features, you would use CMI’s “single export” option to export your content type, then your fields (and field storage), then your view, then your image style, and then hope that you haven’t missed any other important settings. This very manual process is error-prone. Luckily, the Features module in Drupal 8 allows you to pick and choose which configuration data you want to add to your custom module, simplifying the process.
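
To make the manual effort concrete, a hand-assembled Photo Gallery module would need to collect every one of those exports under its config/install directory. The file names here are hypothetical, but the prefixes follow the standard Drupal 8 config naming scheme:

```
modules/custom/photo_gallery/
├── photo_gallery.info.yml
└── config/
    └── install/
        ├── node.type.photo_gallery.yml
        ├── field.storage.node.field_gallery_images.yml
        ├── field.field.node.photo_gallery.field_gallery_images.yml
        ├── views.view.photo_gallery.yml
        └── image.style.gallery_thumbnail.yml
```

Features automates assembling exactly this kind of file set, including the dependencies you would otherwise have to track by hand.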

Features in Drupal 8 uses CMI as a consistent storage mechanism for importing and exporting configuration. Both the user interface and the drush command-line support are similar to the Drupal 7 version many developers are accustomed to.

New Features in Features

Rather than rewriting Features from scratch for D8, we started with the excellent “config_packager” module from nedjo. His module was created to export “packages” of configuration in D8, which is really the intended use case of Features. Over the past few months I have worked closely with nedjo to merge his config_packager concepts into the new Features module, giving users some very cool new functionality. It’s been a great example of how open source collaboration often produces better results than either individual effort.

Assignment Plugins

One awesome idea from config_packager was plugins that can automatically assign existing site configuration into packages. When you first install Features and go to the listing page within the Configuration Management section of your site, you will see that it auto-detects your content types (article, page) and creates Features for exporting these content types, automatically including any fields, views, or other configuration assigned to that content type.

You can control which assignment plugins are enabled and the order in which they run. Each plugin can have its own configuration. For example, the “Base” assignment plugin lets you choose which component should be the “base” for organizing configuration. By default, the base is “content type.” If you change the base to “views,” your site configuration will be organized into Features based upon each view on your site.

These assignment plugins solve the problem of organizing and modularizing your configuration. They also better modularize functionality within Features, such as auto-detecting dependencies, adding profile configuration, or excluding configuration that has already been exported to a module.


Assignment plugins are a great concept, but we wanted to support multiple plugin settings to enable easy switching between “show config organized by content type” and “show config organized by views.” In addition, we wanted to add “namespaces” to Features to better isolate configuration exported to different profiles. For example, “namespaces” make it easier to support sites running multiple distributions, such as Open Atrium running on Panopoly, where some Features are prefixed by oa_* and other Features are prefixed by panopoly_*.

In D8 Features we implemented “bundles” to specify the namespace (prefix) of Features modules, similar to the “package” tabs along the left side of the Features listing in D7. These bundles can be assigned to a specific profile, such as Panopoly or Open Atrium, and will filter the Features listing to show only modules within the bundle’s namespace. Assignment plugin settings are stored within the bundle, allowing different bundles to organize configuration in different ways. You can also easily export configuration to a different bundle, finally making it easy to copy your Features from one namespace to another.

Features is for Developers

Drupal 7 didn’t have CMI; support for configuration management was added via the contributed Features module. If a custom module contained configuration exported by Features, you needed to have the Features module enabled (in most cases), even on production, to use that module. In Drupal 8 we wanted to remove the dependency on the Features module. When you create a module using Features D8, the module will work on any other D8 site even without Features being installed!

In Drupal 8, configuration is owned by the site and not by modules. This is a very important design decision behind CMI and how your site configuration is managed and deployed. Configuration provided by a module is imported into the site only when the module is enabled. Once the configuration is imported, any changes made to the config files in the module are ignored by the site. The only way to re-import the configuration is to uninstall the module and then reinstall it.

Drupal 8 also does not allow you to enable a module that contains configuration that already exists on the site. This can cause problems with updating configuration provided by a module. For example, if the Photo Gallery view within our Gallery module needs to be changed, you normally need to uninstall the module and then reinstall it. However, some configuration is not removed when you uninstall a module. Thus, you can have a situation where you uninstall the module and then cannot reinstall it because the configuration already exists. You must manually delete the view provided by the module before you can re-enable the module.

These hurdles and restrictions can make developing modules that contain configuration difficult. As a developer, Features provides functionality to help with these issues. When the Features module is active, it allows you to enable a module that contains configuration already present on your site. Once a module is enabled, Features allows you to import changes from that module into your site without first uninstalling the module. This prevents the situations of a module that cannot be re-enabled, or a module that cannot be uninstalled because of dependencies, and makes it much easier to update the configuration stored in the module during development.

You only need to install the Features module in your development environment. Once you have updated your feature module and imported the configuration changes into your dev environment, you then use normal Drupal 8 CMI to deploy configuration to staging and production. If you still want to keep the Features module enabled on staging or production for doing config imports, you can disable the Features UI submodule and only use the drush interface.
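
On the command line, the split between environments might look something like the following sketch. The Features drush command names were still settling during the 3.x alpha, so treat them as assumptions and check `drush help` for your version; `config-export`/`config-import` are core CMI commands via drush.

```shell
# Development environment: Features installed; re-import a module's updated
# config into the site without uninstalling the module first.
drush en features -y
drush features-import photo_gallery -y

# After exporting the updated config back into the feature module,
# deploy with plain Drupal 8 CMI (no Features needed on the target):
drush config-export -y    # on dev: write active config to the sync directory
drush config-import -y    # on staging/production: apply the synced config
```

The point of the workflow is that staging and production only ever see standard CMI operations; Features stays a development-time tool.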

Overriding Configuration

If you change configuration in Drupal 8 that was imported from a module, Features will detect and display this change. If you import the new module configuration, it will overwrite the active site configuration. If you export the changes you will update the config data stored in your module. CMI controls the granularity of this configuration. For example, an entire View is a single configuration entity. Thus, there is no way to only import a simple change, such as the title of the view. Importing the view will overwrite the entire view, replacing any other changes that might have been made.

So far there isn’t any way to manage partial configuration changes in Drupal 8. CMI was specifically designed not to handle partial configuration. Currently a module that needs to update partial configuration needs to implement an update hook. However, the code to update a partial configuration object is very specific to the configuration being changed. There is no overall consistent way to “merge” configuration between the module and the active site. The config_synchronizer module is the beginning of some good ideas on monitoring site config changes to determine whether it is safe to import new config from a module, and it might be the start of a better Features Override module in the future.

My Thoughts on Drupal 8

Rewriting Features for D8 was a very interesting project for me. I started with little knowledge of the guts of Drupal 8. I’m actually amazed by the amount of work that has gone into Drupal 8 to make it look and feel like Drupal (Views, Content Types, etc) and yet have a completely different architecture based on Symfony.

Working on Features for Drupal 8 was actually a joy.

Once I got the hang of some of the new concepts, such as services, routing, and plugins, it was actually fun to create Features. Sure, there are still some rough edges, and some core config formats changed even during the Drupal 8 beta, which broke a few things in Features.

Ultimately, I challenge developers to give Drupal 8 a chance and start “kicking the tires” to give it a spin. In only four weeks I was able to learn D8 and produce a usable module for something as complex as Features (while also learning config_packager and doing a lot of refactoring). It’s not nearly as onerous to develop in D8 as some bloggers had led me to believe. I’m very excited for this new chapter of the Drupal story, and also happy that Features will be playing a much smaller part in that story than ever before.


If you want to learn more about Features and see a demo, come to my session at DrupalCon LA next month: Features for Drupal 8. If you want to help, go to the Features project page, download the latest 3.x version, and start playing with it. Post your bugs and suggestions to the issue queue. We could really use some help writing automated tests in order to release a beta version. I am looking forward to our bright future in Drupal 8, where we no longer need to curse about Features and can focus back on building reusable functionality.

Announcing Phase2’s 2015 DrupalCon Lineup

It’s about that time of year again. DrupalCon 2015 is fast approaching, and all of us at Phase2 are busily planning events, sessions, and of course travel to Los Angeles! We are excited to announce that the Phase2 team will be represented on several session tracks this year. Here’s a rundown of what we’ll be presenting in LA.

DrupalCon LA logo

Business & Strategy

As CEO Jeff Walpole discussed in his blog post earlier this week, we’re really excited about the power and potential of D8 for enterprise organizations. At DrupalCon, CTO Frank Febbraro and Head of Strategic Accounts Michael Le Du will be joined by our client Evan Liebman of Memorial Sloan Kettering to discuss why the cancer center decided to adopt D8 while it was still in its beta state. They’ll examine MSK’s experience implementing one of the first major D8 platforms in the U.S. and explain why all enterprises should seriously be considering a move to D8. Don’t miss: Drupal 8, Don’t Be Late (Enterprise Orgs, We’re Looking at You).

Site Building

Software Architect Mike Potter will demo the latest version of the Features module in his session Features for Drupal 8. He’ll delve into the architecture of Features in D8, how Features integrates with D8 CMI and other modules, and when and how to override Features. Learn how, in D8, Features returns to its original purpose of bundling functionality rather than just managing configuration.


The popular Drupal is Not Your Website session returns with Software Architect Tobby Hagler, who explores how to build a Drupal website that will interoperate with other web components, live behind CDNs, and make heavy use of caching layers, yet still maintain a positive custom user experience. In essence, the Drupal CMS you are building is not actually the “thing” that web users are directly looking at; Drupal is not your website. Check out this primer for a sneak peek.


Want to use Drupal Grunt Tasks on your next project? Stop by Director of Engineering Joe Turgeon’s session, Using Grunt to Manage Drupal Build and Testing Tools. Grunt is a well-supported and flexible JavaScript-based task runner. Building on his 2014 BADCamp talk, Joe will demonstrate the free and open-source Grunt Drupal Tasks project, which provides a set of common tasks related to building and testing Drupal sites.

User Experience Design

At Phase2, we do things a little differently when it comes to design. Join Senior Designer Joey Groh, Developer Evan Lovely, and Developer Micah Godbolt of the Sass Bites podcast to discover The New Design Workflow. The panel, representing some of Phase2’s most experienced designers and front-end devs, will provide an inside look at our best practices, tips and tricks, as well as an overview of some of our most successful co-design projects. Plus, hear us weigh in on how Drupal 8 will interface with your favorite front-end tools like PatternLab.


Coding & Development

As part of the Coding & Development track, Senior Developer Joshua Turton will present How, When & Why to Patch a Module. Ever find a module that does 98% of what you need it to do, and there’s no way to make it do that last 2%? Ever need to fix a bug in a module you’ve run into, but you’re the only one who’s ever had this problem? You need a patch – and to attend this session.

Plus, another must-see session: Jose Eduardo García Torres, CTO at Anexus IT (a Phase2 partner), will give a talk called Speeding Up Drupal 8 Development Using Drupal Console.

Can’t wait to see you all in Los Angeles!

Subscribe here for Phase2 email to get all the latest news, announcements, and updates!

State of the Public Platform: Our Insights on Digital Government

The past year has seen some unprecedented and exciting advancements for Phase2’s work in the digital government space. Not only did we embark on several major projects with local, state, and federal partners, we also partnered with IBM to complete work on the first federal agency-wide, cloud-based platform built on OpenPublic, our content management system tailored for government.


It’s been a busy twelve months, and we have certainly learned a lot – some of which we’ve shared in our digital strategy white paper series. As our Director of Government Practice, I’d like to share some of the lessons these projects have taught us – and what you can expect in 2015 and beyond for digital government.

The Value of Consolidation & Collaboration

Working in partnership with Phase2 and IBM, the Department of the Interior became the first federal agency to implement the OpenPublic platform hosted on IBM’s SoftLayer cloud. This department-wide, enterprise-scale platform is a major milestone not just for Interior, but for all federal agencies. I fully anticipate that the coming year will see many other agencies follow in the DOI’s footsteps. Why? It’s all about the value of consolidation and collaboration.

Before implementing the OpenPublic platform on the SoftLayer cloud, the DOI had to contend with the cost and complexity of hundreds of individual sites. Managed separately, these sites incurred high fixed operational costs, including infrastructure costs (servers, storage, and load balancers) as well as services costs (disaster recovery, patch management, performance tuning, etc.). Not only did the fixed price structure limit the department’s flexibility in responding to fluctuations in end-user traffic, it made third-party integrations and mobile applications expensive and difficult to support. This is an issue most federal agencies grapple with.


Our solution was to establish an agency-wide Drupal platform onto which all of Interior’s hundreds of sites could be migrated and consolidated. The platform is hosted in the IBM SoftLayer cloud and built using OpenPublic, our Drupal distribution pre-configured to meet government web requirements. The collaboration between Phase2 and IBM was a crucial element of the platform’s success: OpenPublic provided key functionality, while the SoftLayer cloud provided an enterprise-scalable PaaS infrastructure. Each site migrated to the platform retains its domain, separate database table structure, and site design, and each can be administered and updated separately by the agency team currently managing it. As a result, site administrators retained control of their sites, which are now hosted in a reasonably priced, elastic, enterprise-level cloud environment.

The Value of Apps

Within our Drupal distribution for government, OpenPublic, we simplified content management in the enterprise public sector space by “appifying” all the distribution’s functionality and content types. We encapsulated what was once a wilderness of modules and configuration in a clean collection of Apps, making all OpenPublic’s out-of-the-box functionality simple to configure for non-technical site administrators. This new App strategy made it easier and cheaper for governments to implement an OpenPublic CMS solution.


The introduction of Apps into OpenPublic was the culmination of years of developing government systems, most recently San Mateo County’s multi-site platform. This project really drove home these organizations’ need to turn features on and off without affecting other parts of the platform. As Bethany Thames of San Mateo County elaborated, “Each department’s identity and requirements are tied to their lines of business and the community they serve. The County departments wanted to maintain their unique identities within the overall County brand. OpenPublic allowed the County to maintain a strong central brand while meeting user demand for autonomy and flexibility.”

Apps like the Services App, Media Room App, Security App, and Workflow App allow those departments to pick and choose which distinct segments of functionality make the most sense for them. The “appification” of all functionality in this manner is truly knocking down traditional barriers to Drupal site maintenance and scalability.

The Value of the Static CMS

The single biggest point of failure of a Drupal site is Drupal. If any part of Drupal goes down, the site goes down. Wouldn’t it be great to build a site with Drupal but have a static public-facing site, instead of a constantly generating dynamic site? We took this approach with the U.S. Patent & Trademark Office, implementing what we referred to as the “Sleep at Night CMS.”

Static websites are ideal for all government agencies: no performance or security issues, no worries about redundant failover, no getting woken up in the middle of the night because something has gone wrong. Instead, by implementing Drupal’s Static module, we allowed USPTO to take advantage of Drupal’s powerful content management capabilities while delivering the site to the public as static HTML. In effect, this approach mitigates the majority of performance, security, and redundant failover risks. When changes and additions are made within Drupal, any affected generated pages are regenerated as well. Drupal itself is never exposed to end users, eliminating reliability issues; if Drupal has a problem, it simply stops updating the generated site until it is fixed.


How did we do it? Get the details from Senior Developer Randall Knutson.

The Value of Open

Phase2 uses open technology in everything we do, but this choice is particularly salient when it comes to government work. Government agencies strive to reduce unnecessary costs for their taxpayers, and avoiding the recurring licensing fees of proprietary software is a major benefit of open source solutions. Bypassing proprietary vendor lock-in allows government to leverage the sustainable innovation of an open community working collaboratively to create, improve, and extend functionality, in addition to utilizing the community’s best practices for development. And because open technology is in the public domain, any agency can download, test drive, and learn about potential content management systems before choosing a provider.

San Mateo County recognized the value of openness in government technology and opened their code to the GitHub community. We were ecstatic that one of our clients embraced the open practices that are not only inherent in our work but also laid the foundation for the development of OpenPublic. By making the innovative technology that went into building San Mateo County’s platform available for wider use, the County contributed to government’s broader objective to foster openness, lower costs, and enhance service delivery. The “Open San Mateo” project demonstrates the power of open source to improve not just one government agency, but hundreds simultaneously, by making the code available to other governments.


Similarly, web CMS and open data have become increasingly integrated over the past year. Government open data portals, like this example in New York City, build on open source tools to integrate external data into traditional website pages – in the best cases, using proven content strategy and UX design techniques to connect data sets and CMS content. This allows visitors to government websites to access raw data sets and interactive visualizations in a context-sensitive way.

As an example, we are partnered with Socrata and have built a Drupal module that provides an integration point for Socrata within Drupal via Socrata’s SODA2 interface. This is significant because it will allow ordinary people, not just web developers, to use open data to solve everyday problems. For more, here are my thoughts on a less bifurcated user experience.

What’s Next…

Over the coming months, we expect to see a continuation of the trends mentioned above. In particular, keep an eye out for:

  • More consolidated CMS platforms for large federal agencies

  • Open source continuing to disrupt the SharePoint behemoth for internal collaboration and communication

  • Further integration of web CMS and open data

  • Increased code sharing across agencies

As for Phase2, we are hard at work on several government projects, including an overhaul of North Carolina’s web platform. You guessed it – they are moving to a Drupal multi-site platform to unify all state agencies and provide efficiency across all state departments. Also in the works is the migration of an agency website onto its new OpenPublic platform.

What trends have you seen in the digital government space recently? Do you agree that many agencies are moving towards larger, more consolidated platforms built on open technology? Let us know in the comments below! For more updates on Phase2’s work in government, join our email mailing list! Finally, if you are interested in how to create an informed digital strategy for government, be sure to check out our white paper series for information on technology selection, platform models, and how to leverage a Drupal CMS to achieve your mission.