Optimizing search to power self-service

Turning content chaos into clarity


Challenge

Axway’s support site was the result of multiple acquisitions, each with its own documentation systems, formats, and taxonomies.
Customers trying to install, upgrade, or troubleshoot their software often couldn’t find what they needed — forcing support engineers to email files manually. Search returned irrelevant or outdated results, lacked analytics, and provided no insight into failure patterns.

There was no synonym mapping, no standardized metadata, and no measurement of precision or recall. Every department owned a slice of the content, but no one owned the overall search experience.

Approach

I introduced search analytics as part of the UX process, combining quantitative log analysis with qualitative user research. The goal: transform search from a blind retrieval system into a measurable, continuously improving self-service engine.

1. Establish baselines and benchmark performance

  • Identified top tasks (install, upgrade, troubleshoot) by interviewing support engineers and analyzing logs.
  • Defined the top 14 and top 42 queries, together representing ~20% of all search activity.
  • Created strict, loose, and permissive precision tests to measure the relevance of the top 5 results per query (see the sketch below).
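
To make these tests repeatable, each of the top 5 results for a query was hand-graded and then scored. Below is a minimal sketch of that scoring in Python; the queries, grades, and the exact credit given to partially relevant results are hypothetical, and the strict/loose/permissive split is one plausible reading of Rosenfeld's approach of counting fully relevant, partially relevant, and irrelevant hits differently.

```python
# Hand-graded top-5 results for two hypothetical benchmark queries:
# "r" = relevant, "p" = partially relevant, "n" = not relevant.
graded_results = {
    "install api gateway": ["r", "p", "n", "r", "p"],
    "upgrade securetransport": ["p", "n", "n", "r", "n"],
}

# Credit per grade under each test: strict counts only fully relevant
# hits, loose gives partial hits half credit, permissive full credit.
CREDIT = {
    "strict":     {"r": 1.0, "p": 0.0, "n": 0.0},
    "loose":      {"r": 1.0, "p": 0.5, "n": 0.0},
    "permissive": {"r": 1.0, "p": 1.0, "n": 0.0},
}

def precision_at_5(grades, test):
    """Precision over the top 5 results under the given test."""
    top5 = grades[:5]
    return sum(CREDIT[test][g] for g in top5) / len(top5)

for query, grades in graded_results.items():
    scores = {t: round(precision_at_5(grades, t), 2) for t in CREDIT}
    print(query, scores)
```

Re-running the same graded tests after each tuning pass is what made the later improvement numbers comparable over time.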

Storyboarding the problem to gain stakeholder buy-in

As you know, a picture is worth a thousand words, so I storyboarded the main use cases: installing, upgrading, and troubleshooting. This helped win stakeholder buy-in, as they could follow the user's emotional ups and downs from sign-in through finding the right documentation to updating a system or troubleshooting a problem.

Sketching out the customer journey: At this point, users never received an email telling them to upgrade or where to find the updated docs, and no one had thought about the customer journey across touchpoints in the ecosystem.

With the increase in effort and cognitive burden, the user's trust plummets…

Resulting in irate users asking the same questions, taking up both their valuable time and that of our support agents, reducing retention and stickiness, and driving up support costs…

Understanding how search results are returned

I became an expert in Elasticsearch: creating a synonyms file to group related terms, working with developers to boost important metadata, and much more.

This is an excerpt from that analysis, from which I derived the rules for handling documentation in two different formats, HTML and PDF, depending on which acquired company (based in France, Germany, or the US) had produced it.
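
To give a flavor of the synonym and field-boost work, here is a rough Python sketch using the official elasticsearch-py client. The index name, field names, boost weights, and synonym pairs are all illustrative, not the production configuration.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical cluster address

# Analysis settings defining a synonym filter (Solr synonym format),
# so "SSL" and "Secure Sockets Layer" resolve to the same terms.
settings = {
    "analysis": {
        "filter": {
            "support_synonyms": {
                "type": "synonym",
                "synonyms": [
                    "ssl, secure sockets layer",
                    "api gw, api gateway",
                ],
            }
        },
        "analyzer": {
            "doc_analyzer": {
                "tokenizer": "standard",
                "filter": ["lowercase", "support_synonyms"],
            }
        },
    }
}

# Apply the analyzer to the searchable text fields.
mappings = {
    "properties": {
        "title": {"type": "text", "analyzer": "doc_analyzer"},
        "body": {"type": "text", "analyzer": "doc_analyzer"},
        "product": {"type": "keyword"},
    }
}

es.indices.create(index="support-docs", settings=settings, mappings=mappings)

# Field boosts: a match in the title counts three times as much as one
# in the body, and a match on the product field twice as much.
results = es.search(
    index="support-docs",
    query={
        "multi_match": {
            "query": "install api gw",
            "fields": ["title^3", "product^2", "body"],
        }
    },
)
```

With the synonym filter in place, a query for "api gw" also matches documents that only say "API Gateway".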

Relevancy testing

With the help of an excellent book, Lou Rosenfeld's Search Analytics for Your Site, I tested the top 5 matches for our most important products and tracked improvement over time. I worked with stakeholders to identify the best matches for each product.

This enabled me to quickly show that, even for our flagship products, search was returning old, irrelevant content.

I tested each product, charting incorrect and exact items as well as related-product matches. This also enabled me to eliminate duplicates and weed out older content, ensuring that the most recent and relevant documents appeared in the top 5 search results.

Even for our brand-new products (which I had helped design!), we were still far from where we needed to be. For our older, but still flagship, products, the gap was even wider.

Benchmarking

I charted the current top 5 results vs the expected top 5 results for each product when installing, upgrading, and troubleshooting issues.
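
Scripting that comparison makes the gap easy to quantify and cheap to re-run after each tuning pass. A minimal sketch, with entirely hypothetical product, task, and document names:

```python
# Compare live top-5 results against the stakeholder-agreed expected
# top 5 for each product/task pair. All names here are hypothetical.
benchmarks = {
    ("API Gateway", "install"): {
        "expected": ["install-guide-7.5", "prereqs-7.5", "quickstart",
                     "license-setup", "upgrade-notes-7.5"],
        "actual": ["install-guide-6.2", "quickstart", "faq-general",
                   "install-guide-7.5", "marketing-overview"],
    },
}

for (product, task), top5 in benchmarks.items():
    expected = set(top5["expected"])
    actual = top5["actual"]
    hits = [doc for doc in actual if doc in expected]
    missing = sorted(expected - set(actual))
    print(f"{product}/{task}: {len(hits)}/5 expected docs in the "
          f"live top 5 (missing: {', '.join(missing)})")
```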

I focused on our API management suite in particular, since that was a new area we were breaking into, with stiff competition.

2. Apply UX research to inform optimization

  • Storyboarded real search sessions to show the emotional “pain curve” to executives and gain buy-in.
  • Grouped synonymous and mistyped terms (e.g., “SSL,” “Secure Sockets Layer”) using an Elasticsearch synonyms file.
  • Boosted metadata weighting for document freshness, product relevance, and version accuracy (one approach is sketched after this list).
  • Worked with tech writers to standardize titles, metadata, and acronyms across hundreds of legacy docs.
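
For the freshness boost in particular, one common Elasticsearch technique is a function_score query with a date decay function, so stale documents sink without disappearing. The field name and decay values below are illustrative, not the production settings:

```python
# Passed as the query body of a search call; "last_updated" and the
# 180-day decay scale are assumptions for the sake of the sketch.
freshness_query = {
    "function_score": {
        "query": {
            "multi_match": {
                "query": "upgrade api gateway",
                "fields": ["title^3", "body"],
            }
        },
        "functions": [
            {
                # Recently updated documents keep their full score;
                # the multiplier decays as last_updated gets older.
                "gauss": {
                    "last_updated": {"origin": "now", "scale": "180d"}
                }
            }
        ],
        "boost_mode": "multiply",
    }
}
```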

3. Continuous tuning and feedback loops

  • Implemented zero-result tracking to identify missing content and improve documentation coverage (see the sketch after this list).
  • Conducted ongoing precision testing to monitor accuracy improvements over time.
  • Created cluster analysis and “best bets” rules to prioritize the most clicked low-rank results.
  • Built dashboards to monitor trending queries, search exits, and performance by product line.
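
To show the spirit of the zero-result tracking, here is a small sketch that ranks the most frequent zero-result queries from a search log export. The file name and its columns ("query", "result_count") are assumptions, not the actual log schema:

```python
import csv
from collections import Counter

# Rank the most frequent zero-result queries in a search log export.
def zero_result_report(log_path, top_n=20):
    misses = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["result_count"]) == 0:
                misses[row["query"].strip().lower()] += 1
    # Frequent zero-result queries point at missing or mislabeled
    # content and feed directly into the documentation backlog.
    return misses.most_common(top_n)

for query, count in zero_result_report("search_log.csv"):
    print(f"{count:5d}  {query}")
```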

Implementation highlights

  • Integrated analytics directly into the content pipeline for the first time.
  • Developed a repeatable search-optimization process: analyze → test → optimize → measure → repeat.
  • Aligned terminology between customer queries and documentation, improving findability and SEO relevance.
  • Unified content from acquired companies under a single searchable structure.

Impact

Quantitative:
  • +60% increase in relevancy across the top 14 benchmark queries.
  • Zero-result searches reduced by ≈70%.
  • Support tickets about missing docs reduced by ≈40%.
Qualitative:
  • Search analytics became a standard UX metric used in roadmap prioritization.
  • Search optimization cut customer time-to-resolution and agent workload.
  • Stakeholders gained visibility into real user intent through dashboards and data-driven storytelling.

Can't get enough of SEO?

No problem! I take search optimization very seriously, as it's the hallmark of great self-service and a key lever for reducing Customer Acquisition Cost (CAC). If you're interested, here's much more I've written on optimizing search.