A collaborative effort between Mediacurrent Senior Front End Developer Clair Smith and Massachusetts Digital Service Product Manager Joe Galluccio, this post explores how we used Google Optimize to A/B test website navigation options for Mass.gov.
How do you know that the decisions you’ve made about your site are the best decisions? You’ve spent all that time building personas, writing your content, and designing your UI, but do you know you’re providing your users with the best possible experience? In monitoring feedback from Mass.gov visitors, we observed a general frustration with navigation. This supported observations from content specialists that it was sometimes difficult to navigate from any one page to a related page without a specific link.
To tailor a wicked awesome user experience for the constituents of Massachusetts, Mass.gov and Mediacurrent took a data-informed approach. We implemented Google Optimize to help developers and content specialists test and gather user feedback, steering site navigation and layout decisions.
We chose to use A/B testing, a method of website optimization in which multiple versions of a page are published and compared using live traffic. Site visitors are randomly shown one of the versions and the way they interact with the page is tracked. Developers can see how real users are using the pages, and choose the option that best fits their users’ behavior.
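The random-assignment step can be sketched in a few lines. This is illustrative only, not Optimize's actual algorithm: A/B tools typically hash a stable visitor identifier into a bucket so that the same visitor keeps seeing the same variant across visits. The visitor id and variant names here are made up for the example.

```javascript
// Illustrative sketch of sticky A/B bucketing (not Optimize's real internals):
// hash a stable visitor id and use it to pick a variant deterministically.
function assignVariant(visitorId, variants) {
  var hash = 0;
  for (var i = 0; i < visitorId.length; i++) {
    // Simple 31-based rolling hash, kept as an unsigned 32-bit integer.
    hash = (hash * 31 + visitorId.charCodeAt(i)) >>> 0;
  }
  return variants[hash % variants.length];
}

// The same visitor always lands in the same bucket:
console.log(assignVariant('visitor-123', ['original', 'hamburger-menu']));
```

Because the assignment is a pure function of the visitor id, interactions tracked later can be attributed back to the variant that visitor saw.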
On Mass.gov we used Optimize to test a few header variations, measuring the user interaction with a new contextual pages menu and top-level site menu layout.
We started with moderated tests with user panels: groups of volunteers who signed up to regularly help test new features on the site. The panel’s feedback helped us refine the design and make decisions about what we wanted to show our public users.
To test this on our public website and measure its effectiveness, we created A/B testing variants with four questions in mind:
- Would people use the contextual links more than the global navigation?
- Would hiding the global navigation, so that users relied on the site search function instead, be just as effective?
- Would people be more likely to engage with contextual links pointing to pages within the same service, or to pages related to the service?
- Would people who experienced the test variants be more satisfied and have a better overall experience with their site visit?
As part of our contextual link tests we also decided to test moving the global navigation to a “hamburger” menu for desktop users. This took the most time during development, as the user interface had to be consistent across all browsers, making it a fairly large change.
The default menu looks like this:
After a few simple changes made in the Optimize editor our new menu variation looks like this:
We’re going to take a quick tour of the process and discuss some tips and gotchas we found along the way.
Some notes before we begin
There are a few things you should know about Optimize compatibility and use. You’ll need Google Chrome and the Optimize extension to edit and build your experiments. The tests will run in most mobile browsers, but you can’t currently create or edit Optimize experiments in a mobile browser.
Optimize also requires a browser that supports CSS3 selectors for the variants to render correctly. If a user visits your page from an unsupported browser, they’ll simply see the original page and be excluded from your tests. All the major browsers support this in their most recent versions, although we did come across a few issues in IE10 and IE11. For example, in IE11 you can’t use the developer tools to inspect elements in your variant, because the asynchronous loading of the CSS breaks the developer tools window completely.
Your targeted page in the Optimize editor is broken out into selectable sections based on the markup in the page and can be manipulated with a contextual menu or options modal.
Optimize also lets you make any changes, additions, or deletions you’d like to the markup in a textbox area tied to the element you have in focus.
Changing the markup
The order in which you make your changes can also be important. One of the strategic decisions we made was simply to add a class to the header wrapper so we could use it to overwrite the existing styling to move the menu behind the hamburger for the variant. The DOM will load in the order you see in the changes panel. Here you can see the styling for the contextual navigation being added, followed by the markup and scripting changes for the new main navigation layout.
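That class-adding change can be sketched as a small helper. The `.ma__header` wrapper selector and the `ma__header--test-variant` class name here are hypothetical stand-ins; the real Mass.gov selectors may differ.

```javascript
// Minimal sketch: append a variant-specific class to an element's class
// string so variant CSS can override the original styles. The class name
// 'ma__header--test-variant' is an assumption for illustration.
function addVariantClass(className, variantClass) {
  var classes = className.split(/\s+/).filter(Boolean);
  if (classes.indexOf(variantClass) === -1) {
    classes.push(variantClass); // only add it once
  }
  return classes.join(' ');
}

// In the Optimize editor this would run as a JavaScript change against the DOM:
// var header = document.querySelector('.ma__header');
// header.className = addVariantClass(header.className, 'ma__header--test-variant');

console.log(addVariantClass('ma__header', 'ma__header--test-variant'));
// → "ma__header ma__header--test-variant"
```

Keying every variant style off one added class keeps the test changes isolated: removing that single class restores the original page.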
Changing the CSS
The DOM sections can be styled using either the options modal for each element or by writing CSS into the text window provided.
Sass and Less need to be compiled down to CSS before being pasted in. Media queries are about as complicated as the editor will allow.
The original CSS is still attached to the DOM, and in some cases will need to be overwritten. This is where the strategic changes to the markup can best be applied. We chose to add classes to the markup specific to the Optimize testing as routes for updating the CSS already present.
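A rough sketch of that override approach, again using the hypothetical `ma__header--test-variant` class and made-up menu selectors (not the actual Mass.gov stylesheet). In Optimize this CSS goes straight into the CSS text window; injecting a `<style>` tag from a script shows the same effect.

```javascript
// Sketch of variant CSS keyed off the added class, so it overrides the
// original stylesheet without editing it. All selectors are assumptions.
var variantCss = [
  '/* hide the original horizontal menu for the variant only */',
  '.ma__header--test-variant .ma__main-nav { display: none; }',
  '/* reveal the hamburger toggle that the original styles hide */',
  '.ma__header--test-variant .ma__header__hamburger { display: block; }'
].join('\n');

// In a browser this could be applied with:
// var style = document.createElement('style');
// style.textContent = variantCss;
// document.head.appendChild(style);

console.log(variantCss);
```

Because every rule is scoped under the variant class, the original page is untouched for visitors outside the test.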
Changing the JS
We also found that this asynchronous loading means that jQuery handlers don’t always fire on elements already in the page that have been changed in the editor. For example, the swinging open of the menu on the live site is built with jQuery. We had to rebuild and reattach it in vanilla JS for the event to fire correctly and consistently in the Optimize variant.
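A stripped-down sketch of that rebuild, with assumed element roles and class names rather than the site's actual jQuery animation: bind a plain `addEventListener` click handler that toggles an open state. The stub objects at the bottom only exist so the sketch can be exercised outside a browser.

```javascript
// Sketch: reattach the menu-toggle behavior in vanilla JS so the listener
// works after Optimize rewrites the DOM. 'is-open' is an assumed class name.
function bindMenuToggle(button, nav) {
  button.addEventListener('click', function () {
    // classList.toggle returns true when the class was added.
    var open = nav.classList.toggle('is-open');
    button.setAttribute('aria-expanded', String(open));
  });
}

// Minimal DOM stand-ins so the sketch runs outside a browser.
function stubEl() {
  var classes = {};
  var handlers = {};
  return {
    attrs: {},
    classList: {
      toggle: function (c) {
        if (classes[c]) { delete classes[c]; return false; }
        classes[c] = true; return true;
      },
      contains: function (c) { return !!classes[c]; }
    },
    addEventListener: function (type, fn) { handlers[type] = fn; },
    setAttribute: function (k, v) { this.attrs[k] = v; },
    click: function () { handlers.click(); }
  };
}

var button = stubEl();
var nav = stubEl();
bindMenuToggle(button, nav);
button.click();
console.log(nav.classList.contains('is-open'), button.attrs['aria-expanded']);
// → true 'true'
```

In the real variant, `button` and `nav` would come from `document.querySelector` calls after Optimize has applied its markup changes, which is what makes the plain listener reliable.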
In the end, we concluded that while contextual links were definitely used more than the global navigation links, it wasn’t clear that they resulted in a better user experience for visitors. We decided not to implement contextual links as tested, but we are still considering other modifications, including moving the global navigation to the hamburger menu. We’re also examining modifying navigation elements to help visitors know where they are on the site and how they can get to related information.
A lot of work goes into designing and developing improvements for the website and testing them. Sometimes a test doesn’t give you the result you expected, and that can be a good thing. Contained and focused tests such as this can help us save valuable time and resources by showing us what not to develop and build, and steer us towards finding options that could result in a better user experience for everyone.