Passport Creative

Destination Intelligence: Eswatini

A Tourism Authority That Can't Monitor Its Own Performance Can't Improve It

A tourism authority that can't monitor its own digital performance can't improve it — and can't make the case for investment. How do you build that capability from scratch, for both public institutions and the private sector businesses they support?

The Starting Point

Eswatini Tourism Authority was operating without a systematic view of its own digital performance. There was no regular monitoring of how the destination appeared in search results relative to regional competitors, no tracking of what potential visitors were actually searching for, and no consolidated picture of how Eswatini's tourism businesses were visible — or invisible — to international audiences.

This is a more common situation than it should be. Digital intelligence tools exist and are accessible. But tools without a defined framework, trained users, and a rhythm for producing and reviewing outputs don't get used consistently. The result is episodic attention to digital performance — checked when something seems wrong, ignored otherwise.

The brief was to fix this systemically: build the analysis framework, implement the tools, and train the people who would need to use them — in a way that would continue functioning after the project ended.

Two Different Audiences, Two Different Programs

The project separated public and private sector training from the start, because the intelligence needs and starting points were fundamentally different.

For the Tourism Authority, the relevant questions were strategic: How does Eswatini rank against competitors in the source markets that matter? What search terms are potential visitors using, and how visible is the destination for them? What is the trend in online sentiment about Eswatini as a destination? Answering these questions required competitive analysis tools, structured search data analysis, and social monitoring — and required building the internal processes to produce regular reports that fed into planning decisions.

For private tourism businesses, the relevant questions were operational: Is my business findable by the travelers I want to reach? How are my reviews performing? What can I do, with limited budget, to improve my position? The training for this audience was built around free and low-cost tools, practical application during sessions, and guidance on the highest-return actions available to businesses at different levels of digital maturity.

The Framework Design

The analysis framework built for the Tourism Authority was designed around three intelligence functions:

Competitive monitoring — tracking how Eswatini's search visibility for relevant terms compared with that of comparable destinations, and identifying where competitive gaps were growing or closing.

Market demand analysis — using search data to understand what potential visitors in priority markets were looking for, and whether Eswatini's digital content was aligned with those interests.

Sentiment tracking — monitoring visitor reviews and social conversation to identify emerging issues or strengths in the destination's perceived product, before they became visible in arrival statistics.

Each function had assigned ownership within the Tourism Authority, a defined reporting cadence, and a template for summarising findings. The system was designed to run on the tools that had been implemented — not to require specialist skills or additional external support.
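A framework like this can be made explicit in a small machine-readable form. The sketch below is purely illustrative — the function names, owner roles, and template filenames are assumptions, not the Tourism Authority's actual assignments — but it shows the shape of the design: each intelligence function carries an owner, a reporting cadence, and an output template, so nothing depends on any one person remembering to check.

```python
from dataclasses import dataclass

# Hypothetical sketch of the three-function framework. Each function has an
# owner role, a reporting cadence, and a findings template; all names here
# are illustrative, not the Tourism Authority's actual roles or files.
@dataclass(frozen=True)
class IntelligenceFunction:
    name: str
    owner: str          # role responsible for producing the report
    cadence_days: int   # how often a report is due
    template: str       # summary template used for findings

FRAMEWORK = [
    IntelligenceFunction("competitive_monitoring", "digital_officer", 30,
                         "competitor_gap_summary.md"),
    IntelligenceFunction("market_demand_analysis", "research_officer", 90,
                         "search_demand_summary.md"),
    IntelligenceFunction("sentiment_tracking", "communications_officer", 30,
                         "sentiment_summary.md"),
]

def reports_due(day: int) -> list[str]:
    """Return the functions whose report falls due on a given day number."""
    return [f.name for f in FRAMEWORK if day % f.cadence_days == 0]
```

The point of writing it down this way is that the cadence and ownership survive staff changes: `reports_due(30)` names the monthly reports, `reports_due(90)` adds the quarterly market analysis, and a gap in the outputs is immediately attributable to a specific role.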

What Sustainable Looks Like

The test for any intelligence capability isn't how well it works when the consultant is in the room. It's whether the team is still using it twelve months later.

This project closed with a monitoring system that the Eswatini Tourism Authority team had built and used themselves — not one handed over at the end of an engagement. The training was designed to produce capability, not familiarity. Private sector participants left sessions with active, optimised accounts and a working understanding of the tools they'd need to maintain them.

The goal was a Tourism Authority that could see its own digital performance, track its competitive position over time, and make the internal case for digital investment on the basis of data it had produced — without needing to commission a study to answer questions it should be able to answer itself.

Key Findings & Learnings from Eswatini

Key findings

  • Digital Blind Spots Were Structural, Not Incidental

    The absence of monitoring wasn't a resource problem — it was an architecture problem. Without defined tools, assigned roles, and reporting rhythms, digital intelligence wouldn't get produced regardless of budget. The solution was system design, not just skill development.

  • Public and Private Sector Needed Different Tools and Different Conversations

    The Tourism Authority needed competitive intelligence and market trend analysis. Individual tourism businesses needed practical, low-cost tools for managing their own visibility. Training both groups from the same curriculum would have served neither well — the program was differentiated from the start.

  • Monitoring Only Has Value If It's Sustained

    A one-time digital audit produces a snapshot. A monitoring system produces a trend line — which is what makes it possible to see whether investments are working and whether competitive position is improving. The design priority was sustainability: tools the team could use independently, reports simple enough to produce regularly, processes that survived staff changes.
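The snapshot-versus-trend-line distinction can be made concrete. With monthly rank observations for one target search term (the numbers below are invented for illustration, not Eswatini data), the latest value tells you where you stand; only the series tells you whether position is improving:

```python
# Illustrative monthly search-rank snapshots for one target term
# (lower rank = better position; the data is invented for this sketch).
monthly_ranks = [18, 17, 15, 16, 13, 12]

def rank_trend(ranks: list[int]) -> float:
    """Average month-on-month change in rank; negative means improving."""
    deltas = [later - earlier for earlier, later in zip(ranks, ranks[1:])]
    return sum(deltas) / len(deltas)

# A single audit reports only monthly_ranks[-1]; the trend is what lets a
# team say whether the investment behind that position is actually working.
```

A one-time audit reports only the last number; the sustained system produces the whole series, which is the evidence base the internal investment case rests on.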

Key learnings

  • Intelligence Tools Require Intelligence Questions

    Setting up SEMrush and Google Analytics is not the same as having a functioning intelligence capability. The more important work was defining the specific questions the Tourism Authority needed to answer — about competitor positioning, market search behavior, visitor sentiment — and building the monitoring framework around answering them.

  • Private Sector Training Has a Different Success Condition

    For tourism businesses, the goal wasn't digital marketing expertise — it was practical capability with tools they would actually keep using. Sessions built around real accounts, real content, and immediate application produced better retention than workshops presenting general principles.

  • Competitive Benchmarking Changes the Internal Conversation

    When a destination authority can show, with current data, exactly where it ranks against competitors for key search terms, the conversation about digital investment changes. The case for resources becomes evidence-based rather than aspirational.