Passport Creative

Destination Intelligence: Sri Lanka

A National Tourism Policy Built on What Visitors Actually Said

Most tourism policies are shaped by what stakeholders believe. What happens when you build one from what hundreds of thousands of visitors actually said?

The Problem With How Tourism Policies Usually Get Made

The standard approach to national tourism policy development follows a familiar pattern. Consult industry stakeholders, review arrival data, benchmark against regional competitors, synthesize findings, produce recommendations. It's a defensible process. But it tends to produce policies that reflect what the tourism sector believes about itself — which doesn't always match what visitors are actually experiencing.

For Sri Lanka's National Tourism Policy, developed with UNDP following a period of significant disruption to the sector, a different approach was warranted. The country had a large and growing base of publicly available visitor feedback — tens of thousands of detailed reviews across every region and every major category of tourism product. Using that data as the foundation for the policy process, rather than an afterthought, changed what the evidence base looked like from the start.

Building the Evidence Base

The research phase began with structured analysis of a large corpus of visitor reviews, processed using IRAMUTEQ — a text analysis tool that maps language patterns across large datasets, identifies recurring themes, and produces visual representations of how feedback clusters by category and region.

What emerged was a layered picture of Sri Lanka's tourism product as visitors experienced it: which regions generated the most positive and most consistent feedback, what language visitors used to describe their best and worst experiences, where expectations and reality were misaligned, and how patterns differed across accommodation, restaurants, and attraction categories.
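The general shape of this kind of analysis — surfacing the terms that distinguish one region's feedback from the corpus as a whole — can be sketched in a few lines. This is not IRAMUTEQ itself, just a minimal illustrative sketch using a simple relative-frequency ratio; the region names and review texts below are invented placeholders, not data from the actual study.

```python
from collections import Counter

# Hypothetical mini-corpus of (region, review_text) pairs standing in
# for the real dataset of visitor reviews. All values are illustrative.
REVIEWS = [
    ("south_coast", "beautiful beach and great surfing, friendly hosts"),
    ("south_coast", "the beach was clean and the seafood excellent"),
    ("hill_country", "tea plantations and cool scenic train rides"),
    ("hill_country", "scenic hikes, tea estates, misty mornings"),
    ("cultural_triangle", "ancient temples and impressive ruins"),
    ("cultural_triangle", "the temples were crowded but the ruins stunning"),
]

STOPWORDS = {"the", "and", "was", "were", "a", "but"}

def tokenize(text):
    # Lowercase, strip basic punctuation, drop stopwords.
    words = (w.strip(",.") for w in text.lower().split())
    return [w for w in words if w and w not in STOPWORDS]

def distinctive_terms(reviews, top_n=3):
    """For each region, rank terms by how much more often they occur
    there than in the corpus overall (relative-frequency ratio)."""
    per_region = {}
    overall = Counter()
    for region, text in reviews:
        tokens = tokenize(text)
        per_region.setdefault(region, Counter()).update(tokens)
        overall.update(tokens)
    total = sum(overall.values())
    results = {}
    for region, counts in per_region.items():
        region_total = sum(counts.values())
        scored = {
            term: (count / region_total) / (overall[term] / total)
            for term, count in counts.items()
        }
        ranked = sorted(scored, key=lambda t: -scored[t])
        results[region] = ranked[:top_n]
    return results

themes = distinctive_terms(REVIEWS)
for region, terms in themes.items():
    print(region, terms)
```

Real tools like IRAMUTEQ go considerably further (lexical correspondence analysis, Reinert-style descending hierarchical clustering), but the core idea — letting distinctive vocabulary surface the themes visitors themselves emphasise per region — is the same.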

This wasn't conducted as a supplementary exercise. It was the starting point — the evidence that guided what questions got asked in stakeholder consultation, and the benchmark against which stakeholder claims could be tested.

Alongside the review analysis, the team produced an Issues Paper: a working document covering Sri Lanka's tourism landscape, regional market comparisons, growth scenarios, and critical sector challenges. Crucially, it was designed to evolve. Every round of stakeholder input updated the document — so by the time the policy process was complete, the Issues Paper reflected both the data and the accumulated intelligence from every consultation.

The Stakeholder Engagement Structure

The consultation process was designed in two phases, structured so that findings from early engagement shaped what was asked in later rounds.

The first phase identified where stakeholder perspectives aligned with the visitor data, and where they diverged. The second phase returned to those divergence points — testing proposed policy directions against the evidence, building consensus where it existed, and making the trade-offs explicit where it didn't.

The use of data visualization throughout changed the dynamic in consultation rooms. Rather than stakeholders debating their own assessments of the tourism product, the discussion could anchor to a shared representation of what visitors had said. That's a different kind of conversation. It doesn't eliminate disagreement, but it shifts its basis — from institutional interest to a common factual starting point.

What the Policy Was Designed to Actually Do

The output wasn't a vision document. It was a policy framework with specific KPIs for measuring progress, a defined implementation timeline, clear responsibility assignments for each workstream, and a monitoring mechanism that didn't depend on future consultancy engagement to operate.

The insistence on this structure wasn't procedural. Tourism development initiatives consistently fail at the point where strategy meets implementation — where a well-researched set of recommendations becomes someone's responsibility to execute. Building implementation architecture into the policy itself, rather than treating it as a later-phase problem, was the most direct way to address that failure mode.

The goal was a national policy that the Sri Lanka Tourism Authority and its sector partners could own — not just receive.

Key Findings & Learnings from Sri Lanka

Key findings

  • Visitor Data and Stakeholder Assumptions Didn't Always Match

    Structured analysis of visitor reviews revealed regional perception patterns and product gaps that weren't consistently reflected in stakeholder priorities. The evidence base created a more grounded starting point for policy discussions.

  • Visualization Made Complex Patterns Discussable

    Using text analysis tools to map the language visitors used — across regions, property types, and experience categories — turned a large dataset into something a diverse group of stakeholders could engage with and respond to.

  • An Evolving Issues Paper Changed the Consultation Dynamic

    Rather than presenting a fixed draft for stakeholder comment, the process used a living document that was updated continuously as consultation progressed. Stakeholders were shaping the analysis, not reacting to conclusions already drawn.

Key learnings

  • Evidence Changes the Conversation Before Recommendations Are Made

    Starting with a shared, data-grounded picture of visitor reality shifted the discussion away from competing institutional interests and toward a common understanding of what the tourism product actually looked like to the people it was supposed to attract.

  • Stakeholder Engagement Works Better When It's Iterative

    The two-phase questionnaire approach — where findings from early consultation shaped the questions asked in later rounds — produced more specific and useful input than a conventional single-round process would have.

  • Policy Without Implementation Architecture Is Just a Document

    The most important structural decision was insisting on specific KPIs, assigned responsibilities, and a monitoring framework as part of the policy itself — not as a separate implementation phase that might or might not follow.