Compliance & Ethics Hub

Balancing User Needs With Stakeholder Expectations
UX research ■ UX strategy ■ information architecture ■ wireframes ■ content strategy ■ technical writing

Project Summary

Duration

January 2022 - ongoing  

Overview

This project was initiated against the backdrop of challenges faced by a leading global tech company, including negative publicity, reputation damage, and significant fines imposed by regulatory authorities. Federal mandates required the company to take decisive steps to enhance its compliance with various laws, driving the need for a comprehensive solution to improve policy awareness and adherence among employees. This case study explores the development of a centralized knowledge hub and action portal designed to address these requirements and mitigate future risks.

This project is significant because it exemplifies navigating complex trade-offs: stakeholder concerns about legal risks and future maintenance burdens conflicted with the imperative for user-centered design.

Through iterative design and usability testing, we demonstrated that the benefits of aligning with UX best practices outweighed the costs. This process not only enhanced the platform's user experience but also built trust with stakeholders, leading them to adjust their initial requirements.

The final outcome was a more user-friendly solution that thoughtfully balanced the trade-offs between user needs, legal risks and long-term manageability, highlighting the strategic application of UX in resolving complex design challenges while managing stakeholder apprehensions.

My role

As the project lead, my responsibilities included:

  • Planning and facilitating research
  • Defining the structure, navigation model and information architecture of the site
  • Creating the content strategy and style guide
  • Developing advanced expertise in local and global compliance laws and over 90 company policies

I worked at all stages of the process, collaborating with a visual designer and technical project manager, overseeing a small team of technical writers, and defining functionality requirements for the development team.

V1: MVP

The goal of our first iteration was to facilitate employee access to relevant, high-quality documentation on key compliance topics.

Challenges

Content quality. Company resources consisted of thousands of documents spread across numerous platforms; some information was conflicting, some out of date. Employees struggled to know what to look for, to find relevant information, and to trust that what they found was accurate and current, all while working at the pace expected by a company that valued swift product rollouts.

Hard constraints. Our solution was limited by strict parameters set by internal stakeholders, driven by legal requirements around how content could be written, the regional laws and restrictions of the countries in which the company operated, and concerns about the cost of ongoing content maintenance.

In this phase, we were given access to 7 compliance teams and their resources. Our access to end users was limited, and we were not permitted to rewrite content or duplicate specific content within the portal.

Approach

Within these constraints, this iteration was primarily a challenge of information architecture, platform architecture, and content management strategy. We focused on organizing, curating, and surfacing existing resources rather than rewriting them.

My contributions
  • Resource compilation and quality audit
  • Resource categorization and labeling by ownership, primary topic(s), linked topics, and source of truth
  • Review of (previously conducted) generative research and user personas to find key insights
  • Development of site architecture and navigation system
  • Content strategy, style guide and technical writing for 7 overview pages and ~40 articles
Content strategy development began with a thorough content audit and a series of resource and topic mapping exercises.
Key insights from generative research review
  • Compliance awareness was low. Employees did not know what they did not know, thus did not go looking for answers.
  • Legal and Compliance were not trusted partners. Employees avoided engaging with these teams for fear of being blamed for issues and/or slowing down the project timeline.
  • The needs and mental models of employees varied greatly between different roles across the company. A linear, one-size-fits-all approach would not serve the majority of employees.
  • Regional laws and cultural differences were often unknown or unaccounted for.
  • Employee confidence was low concerning existing documentation. This resulted in employees making assumptions and relying on their "own best judgement".
Solution

To match the mental models of employees who did not necessarily know about compliance issues or the structure of internal teams, resources were detached from the teams that owned them and recategorized by their relation to key activities in which employees were likely to engage.

To serve the needs of users with differing levels of compliance awareness, we offered a standard search experience as well as a more guided experience. The guided, home page experience directed employees to the appropriate information via selection of an activity. For each activity, a subsequent “overview” page showcased the primary compliance considerations, with links to more specific articles and important resources.

Every article was discoverable via multiple pathways, to surface important compliance information regardless of the employee's particular focus. For example, What types of information can I share externally? was findable via Hiring & Recruiting, Doing Stuff on My Own Time, and Contracting With Third Parties pathways, in addition to External Communications.
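As a rough illustration of this tagging model (a minimal sketch only; the names, fields, and pathway labels are hypothetical, not the Hub's actual data structure), an article carries every activity pathway that should surface it, and each overview page simply gathers the articles tagged with its pathway:

```typescript
// Minimal sketch of multi-pathway categorization. Names, fields, and pathway
// labels are illustrative only, not the Hub's real data model.

type Pathway =
  | "External Communications"
  | "Hiring & Recruiting"
  | "Doing Stuff on My Own Time"
  | "Contracting With Third Parties";

interface Article {
  title: string;
  owner: string;        // compliance team that owns the source of truth
  pathways: Pathway[];  // every activity from which this article is discoverable
}

const articles: Article[] = [
  {
    title: "What types of information can I share externally?",
    owner: "External Communications",
    pathways: [
      "External Communications",
      "Hiring & Recruiting",
      "Doing Stuff on My Own Time",
      "Contracting With Third Parties",
    ],
  },
];

// Each overview page gathers every article tagged with its pathway,
// so a single article is reachable from multiple entry points.
function articlesForPathway(pathway: Pathway): Article[] {
  return articles.filter((article) => article.pathways.includes(pathway));
}

console.log(articlesForPathway("Hiring & Recruiting").map((a) => a.title));
```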

Home page guided experience. Tiles flip upon hover to reveal descriptions.
View larger wireframes here.
Site map, v1.
Outcomes
SUCCESSES
  • Content quality. Within the given topic parameters, the Hub presented a curated, highly accurate and up-to-date collection of resources delivering actionable guidance.
  • Discoverability. Resources were discoverable via multiple pathways, to match the varying mental models of employees working in different roles across the company.
OPPORTUNITIES FOR IMPROVEMENT
  • Illusory boundaries. The limited range of topics left employees hanging when their needs extended beyond the given parameters. This presented the risk of misleading employees into believing no further guidance was necessary.
  • Article density. Due to low stakeholder buy-in and heavy constraints, the end-result “articles” were text-heavy, essentially directories of links to external resources, full of legalese - not a desirable user experience.
  • Missed opportunities. Usability testing was not in budget at this stage, which made it difficult to measure our successes and gather invaluable feedback that would help us better advocate for user needs.
  • Low awareness. Initial site analytics indicated low usage by employees, which was attributed to a lack of internal marketing.

V2: Expansion

In spite of its limitations, the launch of our original solution was deemed a tentative success, which helped us gain the attention and buy-in of other teams across the organization.

In our second iteration, our goals were to:

  • Expand the knowledge library, working with over 30 teams to represent almost all compliance, legal, and ethics programs across the company
  • Improve the user experience by replacing all legalese with simple, culture-neutral language
  • Provide clear avenues for reporting concerns and getting additional support.
My contributions
  • Stakeholder interviews and discovery workshops
  • Followup content strategy exercises with each team
  • Redesign of home page guided flow to include pathway for employees with concerns
  • Flow creation for a “single front door” support channel
  • Rebranding to align with new company content guidelines
  • Technical writing for 127 new and revised articles
  • Usability testing
Stakeholder interviews and content strategy exercises conducted remotely using Figjam.
Mapping out content for overview pages - an iterative and highly collaborative process.
New home page featuring double front door for guidance-seeking and concern-reporting
Support ops intake form / process flow
Outcomes
SUCCESSES
  • Increased visibility. Simple internal promotion tactics had improved awareness of our tool, marked by a jump in usage to an average of 2989 views per month - a huge increase from approximately 200 views per month, pre-promotion.
  • Increased adoption. Use of the Hub stabilized at around 1.5K visitors per month, a great improvement from 117 visitors per month, prior to our second launch.
OPPORTUNITIES FOR IMPROVEMENT
  • Article density. Expanding the breadth of knowledge areas supported by the Hub only made our articles denser.
  • Competing demands. The requirement to link to documentation instead of providing answers on the page created friction with the end users’ need for information at their fingertips.

Usability Testing

Methods

We conducted hour-long qualitative interviews and task assessments, using a think-aloud protocol to capture participants’ thought processes as they explored the Hub and worked to find the information needed to complete our test tasks. Each participant was sent a followup survey immediately after the study session, which we used to capture quantitative data and final thoughts.

My contributions
  • Authored discussion guide and followup survey
  • Facilitated interviews
  • Analyzed raw data
  • Presented findings and recommendations to client stakeholders
Key findings
SUCCESSES
  • Intuitive site architecture. All participants found the layout self-explanatory and easy to navigate, and successfully reached a relevant page for each task. Participants who started down an incorrect pathway recognized their error and course-corrected in under 15 seconds.
  • Streamlined support process. All participants agreed that support felt accessible, were able to submit a case successfully, and felt confident about the followup process.
  • Perceived value. 7/9 participants stated they would definitely use the tool in the future.

Is this live? My team needs this!

OPPORTUNITIES FOR IMPROVEMENT
  • Cognitive overload. Participants were overwhelmed by the amount of information on each page and missed key information as a result.

Is there a grid or a more digestible way to present this content?

  • Time to completion. On average, participants took over 4 minutes to complete each task. This suggests that, outside of the testing environment, our users would likely become frustrated and abandon their tasks before completion.

There’s no way. I would just email somebody. There’s no way I would figure this out.

  • Below-the-fold blindness. FAQs and support information were ignored by 8/9 participants due to their placement lower on the page.
  • Terminology clarity. 5/9 participants were unclear about the distinction between “policy”, “guidelines”, and other documentation types, and could not discern which was the source of truth.
  • Task failure. 9/9 participants felt frustrated by the lack of specific information within the tool, and didn’t take the time to find answers buried in long external policies. This resulted in unintentional non-compliance by 8/9 participants during our test tasks.

I’d just keep my gift under $200. [incorrect]

V3: Optimization

Although many of our findings were unsurprising, the usability insights were invaluable because they persuaded our client stakeholders to reconsider our constraints in order to better align with UX best practices. In particular, the widespread unintentional non-compliance during task completion helped our stakeholders weigh the risks of non-compliance against the risks of simplifying language and reducing content, and against the additional maintenance effort that duplicating information on the page would require. It was determined that meeting the needs of end users would ultimately improve compliance outcomes and therefore must be prioritized.

In light of this pivot, our new goals were to:

  • Prevent cognitive overload by reducing the amount of content presented at one time
  • Provide direct answers and action plans, while keeping the need for content maintenance to a minimum
  • Extend the guided experience initiated by the home page, to better lead employees to the complete information needed for their use case
My contributions
  • Developed templates for reusable components
  • Created flows for decision tree components (see example below)
  • Pressure-tested revised solutions against 7 articles and 3 overview pages (which were determined to be representative of most needs across the site)
Solution
Minimalist aesthetic.

We drastically reduced visual overload by cutting all introductions, unnecessary descriptions, and non-actionable information. We replaced blocks of text with bullet points, charts, checklists, and other more-digestible information formats. We used nested accordions to reveal only the information selected by the user. Where relevant, we introduced the use of interactive decision trees to guide users to the most specific answers possible for their use cases. 
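To sketch the decision-tree idea (the structure, questions, and guidance below are placeholders for illustration and do not reflect actual policy), each node is either a question with labeled branches or a final answer linked back to its source of truth; the interface reveals only one node at a time until the user reaches an answer:

```typescript
// Hypothetical decision-tree component model; not the Hub's actual implementation.
// Each node is a question with branches, or an answer linked to a source of truth.

type TreeNode =
  | { kind: "question"; prompt: string; options: { label: string; next: TreeNode }[] }
  | { kind: "answer"; guidance: string; sourceOfTruth: string };

// Illustrative tree only; the questions and guidance text are placeholders.
const giftGuidance: TreeNode = {
  kind: "question",
  prompt: "Is the gift from a government official?",
  options: [
    {
      label: "Yes",
      next: {
        kind: "answer",
        guidance: "Placeholder guidance for gifts from government officials.",
        sourceOfTruth: "Relevant anti-corruption policy",
      },
    },
    {
      label: "No",
      next: {
        kind: "question",
        prompt: "Is the gift's value above the reporting threshold?",
        options: [
          {
            label: "Yes",
            next: {
              kind: "answer",
              guidance: "Placeholder guidance for gifts above the threshold.",
              sourceOfTruth: "Relevant gifts & entertainment policy",
            },
          },
          {
            label: "No",
            next: {
              kind: "answer",
              guidance: "Placeholder guidance for gifts below the threshold.",
              sourceOfTruth: "Relevant gifts & entertainment policy",
            },
          },
        ],
      },
    },
  ],
};
```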

Answers at one’s fingertips.

We replaced the directory-style article pages with streamlined articles providing direct answers in simple language. For each chunk of information, we provided a clear link to the source of truth, for those who desired additional explanation. We presented FAQs and other supplementary information as-needed, to prevent users from missing relevant information buried in lower sections.

Minimized maintenance effort.

As noted previously, our stakeholders were concerned about duplicating content in any way that would require frequent review and revision each time a law or policy changed. To address that concern and make updates as easy as possible, we employed a two-pronged strategy. Unique content on the Hub would be limited to broad guidelines and perspectives unlikely to change in the near future. For more specific information, we introduced reusable components - templated, discrete chunks of information (do’s and don’ts, how-to’s, policy excerpts, media, etc.) that could be dropped onto multiple pages yet updated in a single location. Each component was linked to a policy or other source of truth, and changes to that resource would trigger a maintenance alert for each corresponding component.
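A minimal sketch of how such components and maintenance alerts might be modeled (the schema and field names below are assumptions for illustration, not the actual CMS): each component references one source of truth and records when it was last reviewed, so a revision to the source flags every dependent component for review.

```typescript
// Illustrative sketch of the reusable-component idea; field names and types
// are hypothetical, not the actual content management schema.

interface SourceOfTruth {
  id: string;
  title: string;        // e.g. a policy document
  lastRevised: Date;
}

interface ReusableComponent {
  id: string;
  kind: "dos-and-donts" | "how-to" | "policy-excerpt" | "media";
  body: string;
  sourceId: string;      // the policy or document this chunk is derived from
  lastReviewed: Date;
  usedOnPages: string[]; // every page that embeds this component
}

// A component needs review when its source was revised after its last review.
function needsMaintenance(
  component: ReusableComponent,
  sources: Map<string, SourceOfTruth>,
): boolean {
  const source = sources.get(component.sourceId);
  return source !== undefined &&
    source.lastRevised.getTime() > component.lastReviewed.getTime();
}

// Editors update a flagged component once; every page that embeds it inherits
// the change, so a policy revision triggers a single edit rather than dozens.
function maintenanceQueue(
  components: ReusableComponent[],
  sources: Map<string, SourceOfTruth>,
): ReusableComponent[] {
  return components.filter((c) => needsMaintenance(c, sources));
}
```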

The use of this new format allowed us to consolidate dozens of articles and greatly reduce redundancy of content throughout the Hub.

Reflections & Learnings

This project underscored the importance of iterative design and the value of usability testing in creating effective digital solutions. Balancing the needs of a diverse user base with the stringent requirements of almost 100 legal and compliance stakeholders was challenging but ultimately rewarding. Each phase of the project introduced additional needs and new challenges as we strived to transform complex information into accessible and actionable knowledge. Ultimately, usability testing had the greatest impact on our approach, allowing us to make great progress in our third iteration, likely saving countless development hours and hundreds of thousands of dollars.

By continuously engaging with our users and stakeholders, we were able to navigate the nuances of compliance and legal information, creating a scalable solution that not only meets the company's needs but also sets a new benchmark in user experience for complex knowledge platforms.

Looking ahead

With a clear strategy in place, writers and developers are working on bringing our latest solution to fruition. So far, we’ve received positive feedback from stakeholders who were initially concerned about the risk of providing compliance and legal advice outside of written policy. Next, we’ll need to test our revised solution with end users, with the new goal of keeping task completion times under 76 seconds (the average Google search session time).