BLOG

A collection of articles and ideas that help Smart Marketers become Smarter

Laurent Bouty

Marketing Canvas - Step 2 - Set Your Goals

In the Marketing Canvas Process, after finalising your assessment, you should discuss potential scenarios that will help you achieve your goal(s). An interesting perspective for this phase is to use the scenarios proposed by Tiffani Bova in her book Growth IQ.

The Marketing Canvas, developed by Laurent Bouty, is a powerful tool that provides a structured approach to crafting a robust marketing strategy. It's a co-creation method that intersects your environment (where you will play), your goals (what you would like to achieve), and your actions (what you will do). This article focuses on the second step of the Marketing Canvas Process - setting your goals. This step is vital as it serves as the reference point for the assessment phase.

Three Strategies for Growing Your Revenue:

In the Marketing Canvas Process, three strategies are highlighted for growing your revenue: GET, KEEP, and STIMULATE/MORE. These strategies focus on different aspects of customer interaction and are designed to help businesses increase their revenue.

  1. GET: This strategy is all about customer acquisition. The primary idea is that your business can grow by attracting new customers. Tactics that can be employed include acquisition campaigns (welcome offers), channel incentives for new customers, "bring a friend" campaigns, and freemium models. For instance, a new restaurant might offer a "buy one get one free" deal to attract new customers.

  2. KEEP: The second strategy emphasizes customer retention. The main idea here is that your business can grow by retaining existing customers. This strategy might seem defensive, but it is the cornerstone of customer experience and is essential for all businesses, including startups. Tactics include churn management, loyalty programs, brand and customer experience reinforcement, Net Promoter Score (NPS) programs for detractors, and below-the-line retention campaigns. For example, a software-as-a-service (SaaS) company might implement a loyalty program that offers exclusive features or discounts to long-term subscribers.

  3. STIMULATE/MORE: The third strategy focuses on customer stimulation. The primary idea is that your business can grow by encouraging your customers to spend more and/or more often. Tactics include cross-selling, upselling, promotion campaigns for usage stimulation, bundling, upgrade programs, and premium features. For instance, a telecom company might offer a bundle that includes internet, cable, and phone services at a discounted rate, encouraging customers to spend more.

Green Clean Use Case:

To illustrate these strategies, let's consider a hypothetical company, Green Clean, a startup offering eco-friendly cleaning services.

For the GET strategy, Green Clean could offer a discounted first cleaning service to attract new customers. They could also implement a referral program where existing customers get a discount for each new customer they bring in.

For the KEEP strategy, Green Clean could develop a loyalty program where customers get a free cleaning service for every ten services purchased. They could also focus on providing excellent customer service to ensure customer satisfaction and reduce churn.

For the STIMULATE/MORE strategy, Green Clean could offer additional services like deep carpet cleaning or window cleaning, encouraging existing customers to spend more. They could also offer a premium subscription service that includes regular cleaning and maintenance services.

Conclusion

Setting your goals is a crucial step in the Marketing Canvas Process. It provides a clear direction for your marketing efforts and serves as a reference point for assessing your progress. The three strategies - GET, KEEP, and STIMULATE/MORE - offer different approaches to growing your revenue. By understanding these strategies and how to apply them, businesses can create a robust marketing strategy that drives growth and success.

Remember, the Marketing Canvas is a dynamic tool. As your business environment changes, you should revisit your goals and strategies to ensure they remain relevant and effective. Regular review and adaptation are key to maintaining a successful marketing strategy.

Whether you're a non-marketer, an entrepreneur, or a marketer looking to learn something new, the Marketing Canvas offers a structured yet flexible approach to developing a marketing strategy. It breaks down complex marketing concepts into manageable steps, making the process more accessible and less intimidating.

The Marketing Canvas is not just a tool, but a journey. It's a process of discovery, assessment, and reinforcement. It's about understanding your market, setting clear goals, and determining the actions you need to take to achieve those goals.

So, are you ready to embark on this journey? Are you ready to set your goals and grow your business? Remember, the journey of a thousand miles begins with a single step. In the case of the Marketing Canvas, that step is setting your goals.

Read More
marketingcanvas.net Laurent Bouty

Marketing Canvas - Listening

Most companies listen reactively — processing complaints, running annual surveys, reading reviews when they arrive. The Marketing Canvas demands proactive listening. Dimension 510 explains the difference, why it is a Fatal Brake for Pivot Pioneers, and the most expensive sentence in marketing.

About the Marketing Canvas Method

This article covers dimension 510 — Listening, part of the Conversation meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Listening (dimension 510) is the Voice of the Customer (VOC) infrastructure — not a single survey, but a system that captures everything customers say across every channel, translates it into data, and feeds it into strategic decisions.

The distinction that defines this dimension: listening without action is surveillance. Listening with action is strategy.

Most organisations believe they listen to customers. Most are listening reactively — processing complaints when they arrive, running annual satisfaction surveys, reading reviews when a notification appears. The method demands more: proactive listening that generates data before it is needed, feeds it into decisions before problems compound, and closes the loop between what customers say and what the company does.

In the Marketing Canvas, Listening sits within the Conversation meta-category alongside Stories (520), Media (530), and Influencers (540). It is the first of the four Conversation dimensions — and it comes first deliberately. The meta-category header says it plainly: listening comes before stories, before media, before influencers. You cannot communicate effectively with people you haven't systematically understood.

Reactive vs. proactive: the canonical distinction

This is the distinction that separates a company with VOC processes from a company with a VOC system.

Reactive listening processes information when it arrives. Customer complains — the complaint is logged. Customer writes a review — someone reads it. Annual survey goes out — results are compiled. NPS score is reported quarterly. Each of these is listening. None of them is proactive. The information arrives at the company's pace, on the company's schedule, filtered through the customers who bothered to respond.

Proactive listening generates information continuously, systematically, and before it is urgently needed. Ongoing customer interviews on a regular cadence — not just when there is a problem to investigate. Social listening infrastructure monitoring what is said about the brand, the category, and competitors across platforms. Support ticket analysis that extracts pattern data from thousands of micro-interactions. Behavioural data from digital touchpoints that reveals what customers actually do, not just what they say. Structured feedback loops at defined journey stages that close the circle between hearing a concern and confirming the fix.

The gap between reactive and proactive is the gap between responding to problems and preventing them. Between knowing what customers said last quarter and knowing what they are saying now. Between confirming assumptions and challenging them.

The canonical test: if the company stopped sending surveys tomorrow, would customer understanding continue to improve? If yes, the listening system is proactive. If no — if surveys are the primary input — the system is reactive, and dimension 510 cannot score above +1.


The most expensive sentence in marketing

"We know what customers want."

This sentence costs more than any misaligned campaign, any failed product launch, or any churned enterprise account. It is the signal that internal assumptions have been allowed to substitute for external evidence — that the listening loop has been closed not by data but by conviction.

The canonical position of the Marketing Canvas on this: if the data contradicts the assumption, the assumption must yield. Not the data. Not the interpretation. The assumption.

This sounds obvious. It is routinely violated. Teams that have operated in a category for years develop a fluency with their customers that feels like understanding but is actually pattern recognition. They know what last year's customers said about last year's product. They extrapolate. The market moves. The extrapolation drifts.

The VOC system exists to correct the drift before it becomes a strategy gap. It is the institutional mechanism that keeps the company's model of its customers honest — continuously updated, data-grounded, and resistant to the internal assumptions that are far more comfortable to rely on.

The four properties of an effective VOC system

The Marketing Canvas scores Listening against four properties. Together they describe not just whether a company has listening tools, but whether those tools form a functioning system:

Capture scope (511) — does the VOC system hear everything customers are saying? Not everything worth hearing — everything. The signal that matters is often not in the formal feedback. It is in the support ticket that uses unusual language. The social media comment that frames the category differently. The customer interview that introduces a word the team has never used. A VOC system with limited capture scope is a VOC system with systematic blind spots.

Data discipline (512) — is the VOC process entirely data-driven, with no point where assumptions substitute for evidence? The failure mode here is not fraudulent data. It is filtered data — interview questions that lead to expected answers, survey scales that cluster around mid-range because respondents are conflict-averse, analysis that confirms the hypothesis the team walked in with. Data discipline means designing the listening system to surface inconvenient truths, not just validate comfortable ones.

Journey integration (513) — does the VOC process map to the customer lifecycle? Listening at only one stage of the journey is like taking a patient's temperature once and declaring the health of their entire year. The research that matters for acquisition decisions is different from the research that matters for retention decisions. A journey-integrated VOC system has different listening mechanisms at different stages — capturing the before-purchase research experience, the onboarding moment, the ongoing use patterns, and the renewal conversation separately, because each reveals different strategic information.

Methodological breadth (514) — are multiple research techniques used together? Each technique has a different blind spot. Surveys capture stated preferences but miss revealed behaviour. Interviews surface nuance but are prone to social desirability bias. Behavioural analytics reveal what customers do but not why. Support ticket analysis captures the most frustrated customers but underweights the quietly satisfied ones. No single technique is sufficient. The system that combines four or more creates a triangulated picture that is harder to misread.

Validation discipline — does the company run a JTBD check at the customer level before committing capital to a direction implied by a market signal? A strong market trend is not the same as a validated consumer job. A company can detect a trend correctly and still deploy capital in a direction its specific customer does not need, because it never ran the validation step between signal and decision. This failure is harder to catch because the company genuinely believes it is being data-driven. The tell: VOC data is being used to confirm a direction already chosen, rather than to test it before capital is committed. Volume of consumer data does not protect against this failure. Only validation discipline does.

This failure is the mirror of the reactive one. Reactive companies filter data through assumptions. The trend-driven company — harder to detect because it is dressed in data — mistakes market signal intake for customer listening: it tracks macro trends attentively but never validates them at the individual customer level. "The market is moving toward X" does not mean your specific customer's job has changed. Listening without validation is still surveillance, just at a more sophisticated level.

Listening in the Marketing Canvas

The canonical question

Do you systematically capture, analyse, and act on what customers are saying about your brand, products, and market?

Listening is a Fatal Brake for A5 (Pivot Pioneer) — the most strategically consequential placement of any Conversation dimension.

The rationale is direct: you cannot pivot successfully if you don't know where the market is going and whether your specific customer is moving with it. Listening is how you find out both — and the second question matters more than the first.

The Fujifilm and Kodak cases provide the sharpest possible contrast. Both companies faced the same crisis in the early 2000s: digital technology was destroying the photographic film market. Both had data. Kodak had commissioned research in 1981 predicting film's decline — and then calculated how many years they could milk film revenue before needing to act. They listened, and then filtered the listening through their assumption that they had more time. Fujifilm conducted an 18-month technology audit — described in the canonical case library as "the most sophisticated VOC exercise in the book" — mapping every capability they had against every market need they could identify. They listened, and then let the data direct the strategy. Fujifilm still exists. Kodak destroyed over €100B in value.

For A5, Listening is a Fatal Brake because the pivot direction is unknown until the market reveals it. An A5 company that is listening well will identify the new job before competitors do. An A5 company that is listening reactively will discover it in competitors' press releases.

Listening is also a Growth Driver for A9 (Category Creator) — the dimension through which category language is discovered. Green Clean's voice-of-customer language mining is the canonical example: extracting the exact phrases customers used to describe the indoor health protection job and feeding those phrases directly into marketing copy. Customers teach you the vocabulary of the category they are joining. Listening is how you learn it.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. You have set up a VOC system that captures everything customers are saying about your brand and your value proposition.

  2. Your entire VOC process is data-driven — at no point are you making assumptions that substitute for evidence.

  3. Your VOC process is based on an in-depth knowledge of your user's journey and customer lifecycle.

  4. You are using different techniques together to ensure you are getting the most from your research.

  5. Your VOC system captures your customers' views on sustainability.

(Statements 1–5 correspond to sub-dimensions 511–515 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."
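The Detailed Track averaging rule above can be sketched in a few lines of Python. This is a minimal illustration, not part of the method itself: the function name and the range checks are my own, and only the round-zero-to-−1 behaviour comes from the text.

```python
def dimension_score(sub_scores):
    """Average sub-question scores for one dimension on the -3..+3 scale.

    Illustrative sketch of the Detailed Track rule described above:
    the scale has no neutral position, so a mathematical average of
    exactly zero is rounded down to -1.
    """
    if not sub_scores:
        raise ValueError("at least one sub-question score is required")
    if any(not -3 <= s <= 3 for s in sub_scores):
        raise ValueError("scores must lie between -3 and +3")
    avg = sum(sub_scores) / len(sub_scores)
    return -1 if avg == 0 else avg

# A split dimension (e.g. sub-scores +2, +1, -1, -2) averages to zero,
# which the method treats as -1: "not clearly helping" needs investigation.
print(dimension_score([2, 1, -1, -2]))  # -1
print(dimension_score([2, 3, 1, 2]))    # 2.0
```

The deliberate asymmetry — zero falls to −1, not up to +1 — encodes the method's position that an ambiguous dimension should be investigated as if it were hurting the goal.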

Interpreting your scores

Negative scores (−1 to −3): Customer understanding relies on assumptions, single-source data, or reactive feedback that arrives too late to be strategic. "We know what customers want" is the operating assumption. The likely result: strategy decisions are made on the basis of internal conviction rather than external evidence. Problems compound before they are detected. For A5, this score is existential — a pivot built on assumed market direction is a rebrand, not a transformation.

Positive scores (+1 to +3): Multiple listening channels feed a structured process that visibly influences product, marketing, and service decisions. Every significant strategy decision can be traced back to a specific customer insight from a specific source. The VOC system generates evidence before it is urgently needed, corrects internal assumptions when data contradicts them, and closes the loop between what customers say and what the company does.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak)
Green Clean's listening consists of a post-service satisfaction email sent to every customer after each visit. The response rate is 19%. The four questions (overall satisfaction, cleaner performance, product quality, likelihood to recommend) produce scores the team reviews monthly. No action has been taken based on these scores in the past six months — they are tracked but not acted on. Customer interviews have never been conducted. Social media is monitored by the founder personally, approximately once a week, without a systematic process for capturing or analysing what is found. Support tickets are answered and then closed, with no aggregation or pattern analysis. "We know what our customers want" is the informal position of the team. The VOC system exists in form. It does not function as strategy.

Score: +1 to +2 (Developing)
Green Clean has introduced quarterly customer feedback sessions — 45-minute conversations with a rotating group of 8–10 customers focused on the full service journey. The sessions are structured but not scripted: customers describe specific moments rather than rate abstract attributes. Two rounds of sessions have already produced one significant insight: customers consistently describe the moment they realise the Family Health Report is personalised to their specific home as the point when they first trusted the brand. This insight was not available from the satisfaction survey. The team has started acting on it: the first Health Report for new customers is now delivered with a phone call rather than an email, specifically to confirm the personalisation in conversation. Social listening is now monitored daily using a basic tool. Support ticket language is being reviewed weekly for recurring patterns. Proactive listening is forming. It is not yet systematic.

Score: +2 to +3 (Strong)
Green Clean's VOC system operates at four levels simultaneously. Satisfaction data (post-service NPS) provides the quantitative baseline. Quarterly customer interviews provide the qualitative depth, including specific language analysis — the team has documented the exact phrases health-conscious parents use to describe the indoor health protection job and has fed those phrases directly into website copy, sales conversations, and the Family Health Report narrative. Social listening captures every mention of Green Clean and its category terms in the region, updated daily. Support ticket analysis is reviewed weekly and produces a monthly "friction report" — specific interaction patterns that indicate friction in the journey. Each of these data streams feeds into monthly strategy reviews where at least one decision is required to trace back to VOC evidence. The system has produced three product changes and two messaging updates in the past twelve months. When the team states what customers want, they can cite the specific data source, the sample size, and the date the insight was captured.

Connected dimensions

Listening does not operate in isolation. Five dimensions connect most directly:

  • 110 — JTBD: Listening enables the initial evidence base for the job definition — and, more critically, maintains its accuracy over time. A company can define the job well in year one and then watch it silently decay if no VOC system is actively testing whether the definition still holds. Without 510, a correct 110 ages in amber while the customer's actual job evolves. 510 is how you build 110. It is also how you keep it honest.

  • 130 — Pains & Gains: VOC validates pain mapping. The pains identified in journey research (dimension 130) are hypotheses until the VOC system confirms them with data across a sufficient sample. Pains that appear in one customer interview may be individual; pains that appear in twelve are systemic. Listening is how the difference is established.

  • 140 — Engagement: VOC systems feed engagement data. The promoter/detractor ratios that dimension 140 scores are produced by the listening infrastructure. Without a functioning VOC system, Engagement can only be measured by satisfaction surveys — which, as noted in dimension 140, is not the same as measuring engagement.

  • 420 — Experience: Listening reveals what the experience actually feels like from the customer side. A team that believes the onboarding experience is +2 on Experience may discover through customer interviews that the specific moment the substitute cleaner arrives without prior notice is scoring −2 in the customer's head. Without the listening system, the Experience score is a self-assessment. With it, it becomes evidence-based.

  • 520 — Stories: Listening provides the customer language that makes stories resonate. The most effective content uses the words customers use to describe their own problems — not the words the marketing team uses to describe the product. VOC language mining is the process that produces the raw material for story strategy.

Conclusion

Listening is the first Conversation dimension because it is the prerequisite for all the others. A brand cannot tell credible stories without knowing what customers actually experience. It cannot design effective media without knowing which messages resonate. It cannot identify the right influencers without knowing which voices customers trust.

The strategic test is not whether the company has feedback mechanisms. It is whether those mechanisms are proactive, multi-technique, journey-integrated, and action-connected. A company that sends satisfaction surveys and reads the results is listening. A company that conducts ongoing interviews, monitors social conversation, analyses support ticket patterns, tracks behavioural data, and ties every decision to a specific customer insight is listening strategically.

The difference between those two companies is not tools. It is discipline — the discipline of requiring data to yield when it contradicts assumption, rather than requiring assumption to explain away inconvenient data.

Sources

  1. Harvard Business Review, "Everyone Says They Listen to Their Customers — Here's How to Really Do It", October 2015 — hbr.org

  2. McKinsey & Company, "Are You Really Listening to What Your Customers Are Saying?", McKinsey Quarterly — mckinsey.com

  3. Marketing Canvas Method, Appendix E — Dimension 510: Listening (VOC), Laurent Bouty, 2026

About this dimension

Dimension 510 — Listening (VOC) is part of the Conversation meta-category (500) in the Marketing Canvas Method. The Conversation meta-category contains four dimensions: Listening (510), Stories (520), Media (530), and Influencers (540).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Conversation - Listening

Read More
marketingcanvas.net Laurent Bouty

Marketing Canvas - Magic

Satisfaction keeps customers. Magic turns them into advocates. Dimension 440 of the Marketing Canvas scores four components — effortless, stress-free, sensory pleasure, and social pleasure — and explains why exceeding expectations on something the customer doesn't care about isn't magic, it's waste.

About the Marketing Canvas Method

This article covers dimension 440 — Magic, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Magic (dimension 440) scores whether your brand exceeds expectations in ways customers didn't anticipate. Not satisfaction — that is delivering what was promised. Not quality — that is consistency. Magic is the surprise that transforms a satisfied customer into an active advocate.

The most important design principle: exceeding expectations on something the customer doesn't care about isn't magic. It's waste. Magic requires knowing what the customer expects — and then strategically exceeding it at the moment that matters most.

In the Marketing Canvas, Magic sits within the Journey meta-category alongside Moments (410), Experience (420), and Channels (430). It is the peak layer — the dimension that elevates a reliable experience into one customers feel compelled to describe to others. Experience (420) sets the baseline. Magic (440) creates the highs above it.

Magic vs. Experience: the critical distinction

This is the most important conceptual clarification in dimension 440, and the one most commonly missed in workshops.

Experience (420) scores the consistent baseline — whether every customer, in every interaction, receives a response that is intentional, reliable, and meets expectations. Consistency is the standard. A strong Experience score means: nothing is left to chance, the brand's promise is defended at every touchpoint.

Magic (440) scores the peaks — the unexpected moments that exceed what the customer anticipated and produce the emotional response that generates advocacy. Magic is not consistent by definition. It is strategic and selective — designed to occur at the specific moments where the surprise will have the highest impact.

The sequencing rule: fix Experience before investing in Magic. A brand with a −1 on Experience that invests in Magic initiatives is adding peaks to an unreliable baseline. Customers who encounter magic at one touchpoint and inconsistency at another do not become advocates. They become confused — and confusion precedes churn, not advocacy.

Score negative if the customer journey is functional but unremarkable, or if it creates friction the company hasn't noticed. Score positive when specific moments are designed to exceed expectations and customers spontaneously share those moments with others.

The four components of Magic

The Marketing Canvas breaks Magic into four scored components — each addressing a different dimension of the unexpected experience:

Effortless (441) — obstacles removed. The customer expects friction; they encounter none. The booking that takes 30 seconds when they budgeted 5 minutes. The form that pre-fills from their previous interaction. The return process that requires no explanation because the system already knows why. Effortlessness is the absence of friction the customer had learned to expect. It is magical precisely because the absence is unexpected — the category has trained customers to tolerate effort, and the brand has made it disappear.

Stress-free (442) — confusion, uncertainty, and anxiety eliminated. The customer expects to worry about something; they find there is nothing to worry about. The ambiguous delivery window that turns into real-time location tracking. The ingredient claim that is accompanied by independent verification rather than asking the customer to trust. The post-service question that is answered before it was asked. Stress-free magic is the proactive removal of cognitive load — the brand doing the worrying so the customer does not have to.

Sensory pleasure (443) — delight through sight, touch, sound, taste, or smell. In consumer markets this is the Apple unboxing, the Hermès ribbon, the hotel that remembers a pillow preference. The experience engages the senses in a way that exceeds the purely functional expectation. In service contexts, sensory pleasure appears in the aesthetics of a delivered report, the warmth of an unexpected handwritten note, the packaging that communicates care before a word is read.

Social pleasure (444) — status elevation. The customer encounters the brand in a way that makes them feel recognised, celebrated, or elevated in front of others. The loyalty recognition at a hotel check-in that happens in front of other guests. The personalised annual impact report that the customer shows to friends because it makes them look like someone who has made a difference. The referral confirmation that acknowledges the customer as a trusted advisor to their network. Social pleasure magic is the brand giving the customer a story they want to tell.

B2B Magic: cognitive, not sensory

In consumer markets, Magic is often sensory — the unboxing, the ribbon, the pillow preference. In B2B, Magic is cognitive: the insight the client didn't ask for, the risk flagged before it became a problem, the deliverable completed three weeks early without explanation.

The NTT Data case illustrates the distinction. B2B Magic isn't about delight in the consumer sense. It is about demonstrating competence so completely and proactively that the client forms the belief: "this is a genuine partner, not just a vendor." That belief is the B2B equivalent of advocacy — the CTO who mentions the vendor by name at an industry conference, the COO who recommends the firm without being asked, the procurement lead who shortcuts the RFP process because they already know who they want.

The B2B Magic design question: where in this engagement does the client expect reasonable competence — and where could we deliver something so far ahead of expectation that it changes the nature of the relationship?

Spotify's Discover Weekly is the canonical example of consumer-facing Magic that operates on a cognitive principle: the algorithm's ability to surface music the user didn't know they wanted, at the moment they most want it. Not sensory delight. Cognitive surprise. The user's reaction — "how does it know?" — is the Magic response. It drove measurable retention improvement, which is the commercial test of whether Magic is working.

Magic in the Marketing Canvas

The canonical question

Where do you exceed expectations in ways customers didn't see coming?

Magic appears in the Vital 8 of five archetypes — spanning the full range of strategic roles:

Fatal Brake for A7 (Scale-Up Guardian): Hypergrowth tends to destroy the exceptional experiences that created growth in the first place. The early customers of a high-growth brand experienced something that felt personal, attentive, and unexpectedly good — because the team was small, the founder was involved, and every interaction was high-touch. As the company scales, processes replace people, automation replaces attention, and the magic that converted early adopters into evangelists disappears into a standardised service. For A7, Magic is a Fatal Brake because losing it is the mechanism through which growth erodes the advocacy that funded growth. It must reach ≥+2 before hypergrowth investment can be sustained.

Primary Accelerator for A2 (Efficiency Machine): For the Efficiency Machine, Magic means the customer barely notices the transaction happened. The 25-minute Ryanair turnaround. The Amazon checkout that requires one click. The banking app that reconciles the account before the customer closes the browser. In A2, operational magic is not sensory delight — it is the complete removal of the customer's effort. The customer doesn't tell a story about the experience; they tell a story about the absence of one. "I barely had to do anything" is the A2 Magic response.

Secondary Brake for A6 (Value Harvester): A Value Harvester extracting maximum cash flow from an existing base must maintain enough magic to prevent the churn that would otherwise accelerate as the product matures. Magic maintenance for A6 is defensive — enough unexpected value to remind customers why they stay, even as the brand optimises for margin rather than growth.

Secondary Accelerator for A4 (Stagnant Leader): For a stagnant leader fighting churn, Magic initiatives provide the proof of renewal that keeps the existing base engaged while Experience (420) and Features (310) are being rebuilt. A single well-designed magical moment — the AI-powered feature that anticipates the user's next action, the proactive support contact that prevents a problem before it occurs — signals that the brand is still invested in the relationship.

Growth Driver for A6: For the Value Harvester, Magic initiatives that generate advocacy are a low-cost acquisition mechanism that complements the margin extraction strategy. Existing customers who experience unexpected delight become the most credible referral source for the next customer cohort.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. You have identified obstacles across your customer journey and reduced them (effortless).

  2. You have eliminated confusion, uncertainty, and anxiety across your customer journey (stress-free).

  3. You have delighted the senses of your customers — all of them look for sensory pleasure (sensory pleasure).

  4. You have provided a customer experience that elevates your customers' status (social pleasure).

  5. You have reduced the social and environmental impact while making sustainable moments magical.

(Dimensions 441–444 + 445 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."
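The rounding rule is simple enough to state in code. A minimal sketch (illustrative only — the method specifies the zero-rounds-to-−1 rule; rounding behaviour for other fractional averages is an assumption here):

```python
def dimension_score(sub_scores: list[int]) -> int:
    """Detailed Track scoring: average the sub-question scores (-3..+3, no zero).

    Per the method, a mathematical zero rounds to -1: a split score means the
    dimension is not clearly helping the goal, which warrants the same
    investigation as a negative score.
    """
    avg = sum(sub_scores) / len(sub_scores)
    if avg == 0:
        return -1
    # Assumption: other fractional averages round to the nearest integer.
    return round(avg)
```

For example, sub-scores of +2 and −2 average to a mathematical zero and therefore score −1, not 0.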

Interpreting your scores

Negative scores (−1 to −3): The customer journey is functional but unremarkable. There are no designed moments of unexpected delight. Customers are satisfied but not moved to advocate. Worse: friction and anxiety may exist that the team hasn't noticed because nobody has mapped the journey from the customer's perspective. For A7, a negative score here explains why growth is eroding the advocacy that created it.

Positive scores (+1 to +3): Specific moments are designed to exceed expectations across one or more of the four components. Customers spontaneously share those moments with others — in conversation, in reviews, in referrals. Magic is functioning as the advocacy generation mechanism: not all customers experience it, but the ones who do become the brand's most effective acquisition channel.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak) Green Clean's customer journey is functional and unremarkable. The booking works. The cleaner arrives. The cleaning is done. But nothing about the interaction exceeds what a customer would expect from a competent cleaning service. There are no designed moments of effortlessness — the booking process requires four steps that could be two. There is no stress removal — customers who want to verify what products were used have to ask, and the answer varies by team member. There is no sensory pleasure — the cleaner leaves without any communication, and the invoice arrives two days later as a plain-text email. There is no social pleasure — the service produces no story the customer would want to share. When existing customers describe the service, they use words like "reliable" and "good" — the language of satisfied, disengaged customers rather than active advocates.

Score: +1 to +2 (Developing) Green Clean has introduced two designed Magic moments. First: the Family Health Report arrives within 6 hours of service completion — a specific, data-rich document that no competitor provides and that customers describe as "not what I expected" when they receive it for the first time. This addresses the stress-free component: customers who would have worried about whether the claims are real now have evidence without asking for it. Second: on the third service, customers receive a personalised summary of their cumulative impact — how many service visits, how many households protected from chemical exposure, how much waste has not been generated. This addresses social pleasure: customers who care about environmental responsibility have a number they can share. These two moments are working — the referral rate has started to climb. But the effortless and sensory pleasure components remain undesigned.

Score: +2 to +3 (Strong) Green Clean has designed Magic moments across all four components. Effortless: the booking takes 90 seconds on mobile, with address pre-filled and service preferences remembered. Scheduling confirmation and reminder are automatic. Stress-free: the Family Health Report arrives within 6 hours with a plain-language explanation of what was found and eliminated. Customers never have to ask. Sensory pleasure: the cleaner leaves a handwritten note summarising what was done in this specific home, with one personalised observation (a comment on the kitchen herbs, a note about the child's artwork visible from the bathroom). The note costs 3 minutes and generates more customer responses than any other touchpoint. Social pleasure: the annual impact statement — "Your household prevented 42kg of chemical exposure in 2024" — is designed as a shareable card with Green Clean's visual identity. 23% of customers share it on social media or forward it to friends. The referral rate reached 35% by 2024. Customers do not describe the service as "good." They describe specific moments that changed how they think about what a cleaning service can be.

Connected dimensions

Magic does not operate in isolation. Four dimensions connect most directly:

  • 130 — Pains & Gains: Magic eliminates pains and creates unexpected gains. The pain map is the source material for effortless and stress-free Magic design. When a pain is eliminated so completely that the customer barely registers its absence, that is effortless Magic. When a gain exceeds what the customer expected, that is the raw material of the sensory and social pleasure components.

  • 420 — Experience: Magic elevates experience beyond consistency. Experience (420) sets the reliable baseline. Magic (440) creates the moments above it. The two dimensions work in sequence: without a consistent Experience baseline, Magic investments are undermined by the inconsistency that surrounds them.

  • 320 — Emotions: Magic creates peak emotional moments. The surprise that generates advocacy is an emotional event — the "I didn't expect that" feeling that produces the story worth telling. Magic moments are the designed delivery mechanism for peak emotional benefits.

  • 140 — Engagement: Magic drives engagement and advocacy. A customer who has experienced a designed Magic moment is more likely to be a promoter on the NPS scale, more likely to refer, and more likely to provide feedback. Magic is the upstream cause; Engagement (140) measures the downstream effect.

Conclusion

Magic is the dimension that answers the question most brands cannot: why do some customers become advocates when others merely stay?

The answer is not product quality. Quality is expected. It is not service consistency. Consistency is the baseline. It is the specific, unexpected moment that exceeds what the customer had learned to anticipate — the report that arrives before they asked, the note that references their home specifically, the status recognition that makes them feel seen.

The design principle that separates effective Magic from wasted investment: it must exceed expectations on something the customer actually cares about. The hotel that remembers a pillow preference is Magic because sleep quality matters. The hotel that provides a turndown chocolate to a customer who explicitly avoids sugar has produced an interaction, not a magic moment.

Knowing what customers expect — and where exceeding it will produce the highest advocacy response — is the work. The four components (effortless, stress-free, sensory pleasure, social pleasure) provide the framework. The Moments map (410) and the Pains & Gains research (130) provide the evidence. Together, they produce the design brief for Magic initiatives that convert satisfied customers into advocates.

Sources

  1. Chip Heath, Dan Heath, The Power of Moments: Why Certain Experiences Have Extraordinary Impact, Simon & Schuster, 2017

  2. Matt Watkinson, The Ten Principles Behind Great Customer Experiences, FT Publishing, 2013

  3. Marketing Canvas Method, Appendix E — Dimension 440: Magic, Laurent Bouty, 2026

About this dimension

Dimension 440 — Magic is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Journey - Magic


Marketing Canvas - Channels

Most companies have channels. Few have orchestrated channels. Dimension 430 of the Marketing Canvas scores the difference — and explains why a brand with three connected channels outperforms one with eight siloed ones.

About the Marketing Canvas Method

This article covers dimension 430 — Channels, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Channels (dimension 430) scores how customers interact with your brand — physical and digital, owned and third-party — and whether those interactions form a seamless, coherent experience across all of them.

The canonical distinction that defines this dimension: most companies have channels. Few have orchestrated channels. The score measures orchestration, not presence.

A brand with a website, a mobile app, a social media presence, a phone line, and a field team is not necessarily scoring well on dimension 430. The question is whether those channels work together without silos — whether a customer who starts research on one channel can complete the journey seamlessly on another, and whether the company can track and serve that customer across the transition.

In the Marketing Canvas, Channels sits within the Journey meta-category alongside Moments (410), Experience (420), and Magic (440). It is the delivery infrastructure — the system that ensures every moment designed in 410 is actually accessible to the customer in the format that serves them best.

Presence vs. orchestration: the canonical distinction

Every company has channels. Most companies have more channels than they have resources to maintain well. The channel list is not the dimension. The orchestration of that list is.

The test is a single customer journey across multiple channels. A customer who discovers Green Clean through a health parenting blog, visits the website to research the formula, emails a question about ingredient safety, books a service via the app, receives the Family Health Report by email, and calls to ask about a recurring subscription — has touched five channels. If the experience is continuous (the phone call picks up where the booking left off; the subscription question doesn't require re-explaining the service model), the channels are orchestrated. If each channel treats the customer as a stranger, the channels exist but are not orchestrated.

The canonical four properties that define orchestrated channels:

Context (431) — can customers use the most relevant channel for their specific situation at each moment? A customer researching a service in the evening needs findable, credible content on the web. A customer mid-service with a question needs an immediate human response. A customer reviewing their health report at midnight needs a digital self-service interface. The same channel cannot serve all three moments well.

Interaction quality (432) — do channels provide clear, personalised, seamless interactions? Quality here means the interaction is adapted to the customer's identity and context — not generic, not one-size-fits-all, not a copy-paste template.

Information consistency (433) — is data consistent and real-time across channels? A customer who updates their household profile in the app should not have to re-state it on the phone. A booking made on the website should be visible to the cleaner on their route app. Inconsistency in data across channels is the most common channel orchestration failure — and the most invisible to the teams building the channels, who each see only their own system.

Orchestration (434) — are channels connected so customers can navigate seamlessly between them with no silos? This is the composite test: does the company have a joined-up view of the customer's journey, or does each channel operate as a separate interaction with no shared memory?
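The four properties converge on a single architectural requirement: every channel reads from and writes to one shared customer record. A minimal, hypothetical sketch (the class and field names are illustrative, not part of the Marketing Canvas Method):

```python
from dataclasses import dataclass, field

@dataclass
class CustomerRecord:
    """One record shared by every channel — the 'shared memory' behind 433/434."""
    customer_id: str
    profile: dict = field(default_factory=dict)       # preferences, household data
    interactions: list = field(default_factory=list)  # (channel, event) history

    def log(self, channel: str, event: str) -> None:
        # Every channel writes to the same history (information consistency, 433).
        self.interactions.append((channel, event))

    def context_for(self, channel: str) -> dict:
        # Every channel reads the full cross-channel history (orchestration, 434),
        # so no interaction starts from scratch.
        return {"profile": self.profile, "history": list(self.interactions)}
```

With such a record, a phone agent calling `context_for("phone")` sees the website booking and the earlier email question — the continuity that the single-customer-journey test checks for. Without it, each channel treats the customer as a stranger.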

Digital, physical, and moment-driven channel design

The channel strategy question is not "should we be digital or physical?" Every customer journey involves both. The question is: which channel serves each moment best?

A purely digital company that ignores physical moments — the cleaner arriving at the door, the unboxing experience, the in-person explanation of a result — misses the touchpoints where trust is built or lost at the highest intensity. Physical moments carry emotional weight that digital channels cannot replicate.

A traditional service business that treats digital as a secondary channel — the website as an online brochure, the email as a support afterthought — loses the pre-purchase research phase entirely. Customers research digitally before they commit physically. Winning the digital research moment is often what determines whether the physical visit ever happens.

The best channel strategies design each moment to use the channel that serves the customer best:

  • The research moment needs findable, credible digital content

  • The booking moment needs a frictionless digital transaction

  • The service delivery moment needs a reliable physical interaction

  • The result delivery moment needs a clear digital report with optional human follow-up

  • The renewal moment needs a proactive, low-friction digital prompt

Designing channels from moments is the inversion of the default approach (designing moments around the channels that already exist). The default produces a channel strategy. The inversion produces an orchestrated journey.

Channels in the Marketing Canvas

The canonical question

Can customers interact with your brand through the channels they prefer, with a seamless experience across all of them?

Channels appears in the Vital 8 of two archetypes — in notably different roles:

Secondary Brake for A1 (Disruptive Newcomer): A disruptor's survival depends on being noticed and understood immediately. Features and positioning may be compelling, but if the channels through which the target customer discovers and evaluates the brand are wrong or incomplete, the disruption never reaches beyond the early-adopter bubble. Channel failure for A1 is quiet: the product is ready, the message is sharp, but the distribution infrastructure isn't present where the customers are. As a Secondary Brake, the dimension must reach ≥+1 — below that threshold, channel failure quietly limits the reach of the disruption.

Secondary Accelerator for A5 (Pivot Pioneer): A company executing a strategic pivot may find that its existing channels were optimised for the old positioning and the old customer segment. The new direction — new JTBD, new lead segment, new positioning — may require new channels entirely. Legacy channels that served the old strategy are not neutral for the pivot; they actively signal the old identity to customers encountering the brand for the first time in the new context. For A5, channel strategy is part of the repositioning work, not a downstream execution decision.

A note on Fatal Brakes: Channels does not appear as a Fatal Brake in any archetype. But channel failure can block the dimensions that are Fatal. If Acquisition (610) is a Fatal Brake and channel orchestration failures are increasing CAC, the channel problem is a Fatal Brake problem in disguise. If Experience (420) is a Fatal Brake and channel inconsistency is producing the experience variance, the same applies. Channels is the infrastructure. Infrastructure failures propagate upward.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. Your customers can use the most relevant channel for their specific context at each moment.

  2. Your channels are physical and digital — you provide clear, personalised, and seamless interactions, anywhere, anytime.

  3. Information captured or shared in your channels is consistent, real-time, personalised, useful, and accurate.

  4. You have orchestrated all your channels — there is no silo between them, and customers can navigate seamlessly through them at each moment.

  5. You optimise the social and environmental impact of your physical and digital channels.

(Dimensions 431–434 + 435 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."

Interpreting your scores

Negative scores (−1 to −3): Channels operate in silos. Customers who cross channel boundaries encounter a brand that does not recognise them. Orchestration is absent or incomplete. The likely downstream effect: acquisition costs are higher than they need to be (research-to-booking friction), experience scores are lower than designed (channel handoff failures), and engagement data is fragmented (no joined-up view of customer behaviour).

Positive scores (+1 to +3): Channels are orchestrated. Customers move between channels without friction. Data is consistent and real-time across the full journey. Each channel is designed for the specific moment it serves. The company can track the customer journey across touchpoints and improve each channel based on measured performance.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method. Green Clean sells a residential service — cleaners visit customer homes — not packaged products. Their relevant channels are: website, booking flow, email, in-home service visit, Family Health Report (digital delivery), phone/chat support, and referral mechanics.

Score: −2 to −1 (Weak) Green Clean's channels are independent systems that do not share data or context. The website takes booking requests but is not connected to the cleaner's scheduling app — bookings are manually transferred by the founder. The Family Health Report is generated as a PDF by one team member and emailed by another, introducing a 24–72 hour delay that varies unpredictably. When a customer calls with a question about their report, the support team does not have access to the customer's service history or their specific report data — every call starts from scratch. A customer who books through the website and follows up by email is treated as two separate interactions. No channel knows what the others have communicated. The silos are invisible to the teams but immediately apparent to any customer who crosses a channel boundary.

Score: +1 to +2 (Developing) Green Clean has connected the booking system to the cleaner's route app — scheduling is now automated. The Family Health Report is generated and emailed automatically within 6 hours of service completion. A customer CRM has been introduced: all booking, service, and communication history is accessible to the support team when a customer calls. But the research channel (website) still operates independently — prospects who spend time researching on the website and then book are not identified as the same person until after the booking is made, meaning the website-to-booking conversion cannot be tracked and the research journey cannot be improved with data. The referral mechanic is manual — the team asks existing customers to refer but has no digital system to track referrals or reward them efficiently. Orchestration has improved significantly but is not yet complete.

Score: +2 to +3 (Strong) Green Clean's channels are fully orchestrated around the customer journey, not around internal team structures. The website research behaviour is tracked — customers who read the formula science page before booking convert at a higher rate, so that content is featured prominently in the booking flow. Booking, service, health report, follow-up communication, and subscription renewal are all automated and connected through a single customer record. Support staff see full service history, report data, and communication history before responding to any contact. The referral mechanic is digital — existing customers receive a referral link after every service and can track whether their referrals booked. Channel performance is measured per moment: website conversion rate, booking completion rate, Health Report open rate, support resolution time, referral conversion rate. Each metric corresponds to a specific channel at a specific journey stage. The orchestration is visible in the data: channel handoffs produce no drop-off in conversion that would indicate a silo.

Connected dimensions

Channels does not operate in isolation. Four dimensions connect most directly:

  • 240 — Visual Identity: Channels must carry visual identity consistently. A customer encountering the brand on Instagram, the website, the booking confirmation email, and the physical cleaner's uniform should see a coherent identity at every touchpoint. Channel proliferation without visual governance produces brand fragmentation.

  • 410 — Moments: Channels serve specific moments. The channel strategy is only as good as the moments map underneath it. Without knowing which moments require which types of interaction, channel decisions are made by habit (we've always had a phone line) rather than by design (this moment requires human contact).

  • 420 — Experience: Experience quality depends on channel execution. Channel inconsistency is one of the most common causes of experience variance — customers receive different responses from different channels because the channels are not coordinated. A +2 on Experience requires channel orchestration as a prerequisite.

  • 530 — Media: Media and channels overlap in digital contexts. Paid media, social media, email, and owned content all function as channels at the research and awareness stages. The boundary between Media (530) and Channels (430) is context: Media drives reach and awareness; Channels deliver the interaction and transaction. They share infrastructure and must be planned together.

Conclusion

Channels is the infrastructure dimension of the Journey meta-category. It does not generate the value proposition, design the experience, or create the magic. It delivers all of those things to the customer — or fails to.

The distinction that matters for scoring is not how many channels the brand has. It is whether those channels form a coherent system. A well-orchestrated system of three channels outscores a fragmented system of eight. The customer's perspective is binary: either the journey is seamless across channels, or it is not.

Channel failure is rarely dramatic. It does not produce a single terrible interaction. It produces accumulating friction — the customer who has to re-explain their situation to every channel they touch, the research that doesn't convert because the booking flow is on a different system, the report that arrives three days late because two teams aren't connected. Each incident is minor. The cumulative effect on acquisition, experience, and retention is material.

Sources

  1. Forrester Research, "The State of Omnichannel Commerce", Forrester, 2024 — forrester.com

  2. McKinsey & Company, "The value of getting personalisation right — or wrong — is multiplying", McKinsey, 2021 — mckinsey.com

  3. Marketing Canvas Method, Appendix E — Dimension 430: Channels, Laurent Bouty, 2026

About this dimension

Dimension 430 — Channels is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Journey - Channels by Laurent Bouty


Marketing Canvas - Experience

Experience is a Fatal Brake for three archetypes. In every case the mechanism is the same: experience failure is the proximate cause of churn. Dimension 420 of the Marketing Canvas scores consistency — not brilliance — and explains why "leaving nothing to chance" is a scored criterion, not an aspiration.

About the Marketing Canvas Method

This article covers dimension 420 — Experience, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Experience (dimension 420) scores the brand's answer to every moment in the customer journey. Where Moments (410) maps what the customer thinks, feels, and does, Experience scores how well the company responds. Does the response reflect the customer's identity? Does it help them achieve their objectives? Is it consistent across time and space? Does it meet the expectations it sets?

The canonical question is not "do we create exceptional experiences?" It is: what is it actually like to be your customer?

In the Marketing Canvas, Experience sits within the Journey meta-category alongside Moments (410), Channels (430), and Magic (440). It is the most frequent Fatal Brake in the method — tied with Positioning (220) and Features (310) at three archetypes each. In every case, the mechanism is the same: experience failure is the proximate cause of churn.

Consistency over brilliance: the canonical insight

The most common Experience scoring error in workshops is confusing it with Magic (440). Experience is not about peak moments or memorable impressions. It is about baseline consistency.

A single brilliant experience surrounded by mediocre ones creates more frustration than consistent adequacy. The customer remembers the gap between the peak and the norm. A hotel that provides an extraordinary check-in and then loses the luggage has not delivered a good experience — it has demonstrated that brilliance is accidental and failure is structural.

Experience design is less about creating memorable highs than about eliminating the lows and ensuring reliability. Every touchpoint should be intentional. Every response should be consistent. The design question is not "how do we create moments that wow?" — that is Magic. The design question is "how do we ensure that every single interaction reflects the promise, regardless of which team member delivers it, which channel it occurs on, or which day of the week it is?"

This is why sub-question 423 scores: "For each moment, your brand answer is consistent in time and space, leaving nothing to chance." Leaving nothing to chance is not a phrase about aspiration. It is a scored criterion. Every undesigned moment is a moment where the brand's promise is undefended — delivered differently by different people, interpreted differently by different teams, experienced differently by different customers.

Score negative if customer experience varies unpredictably across touchpoints, teams, or time. Score positive when experience design is intentional, documented, trained, and measured — and when customers describe the experience using the same words the brand intends.

Experience vs. Magic: the critical distinction

These two dimensions are adjacent and routinely conflated. The confusion produces inflated Experience scores and underinvested Magic strategies.

Experience (420) scores the consistent baseline. Does every customer, in every interaction, receive a response that reflects their identity, serves their goals, and meets the expectations that were set? Consistency is the standard. A score of +2 on Experience means: every moment has a designed response, that response is reliably delivered, and customers confirm it matches their expectations.

Magic (440) scores the peaks. Does the brand exceed expectations in ways customers didn't anticipate? Magic is the surprise that converts a satisfied customer into an advocate. It is scored separately because it requires a different design discipline — not reliability engineering but expectation mapping and strategic over-delivery.

The sequencing principle: fix Experience before investing in Magic. A brand with a −1 on Experience that invests in Magic initiatives is adding peaks to an unreliable baseline. Customers who encounter magic in one interaction and inconsistency in the next do not become advocates. They become confused — and confusion is the precursor to churn.

B2B Experience: the seams are felt

In B2C, Experience failure is visible and dramatic: the wrong product delivered, the rude support call, the website that crashes at checkout. In B2B, Experience failure is quieter and more expensive.

NTT Data's Experience challenge was not a single bad project. It was organisational inconsistency across post-merger engagement models. Different teams, acquired through different M&A paths, delivering different service standards under the same brand name. The client could feel the seams — the inconsistency between what the sales team promised and what the delivery team delivered, between what one regional office did and what another understood the engagement model to be.

B2B clients do not churn after one bad interaction. They churn after accumulating evidence that the inconsistency is structural rather than situational. The moment a client forms the belief "this isn't a bad week, this is how they operate" — the renewal conversation has already been lost. The revenue metric confirms it six months later.

For B2B service businesses, Experience design means: what does a client encounter at every stage of the engagement, regardless of which team member they are talking to? The standard is not the best delivery manager on staff. It is the minimum consistent standard that can be trained, documented, and reliably reproduced.

Experience in the Marketing Canvas

The canonical question

What is it actually like to be your customer?

Experience is a Fatal Brake for three archetypes — tied with Positioning and Features for the most Fatal Brake appearances of any single dimension:

Fatal Brake for A4 (Stagnant Leader): Experience failure is the proximate cause of stagnation. The canonical A4 pattern: churn rises, leadership reaches for Acquisition to refill the bucket. The method says fix the leak first. For Sage in 2019, fragmented UX across dozens of legacy SKUs and desktop-era screens was driving customers to Xero and QuickBooks before the retention team even knew they were at risk. No acquisition investment can compensate for an experience that is actively driving customers away. Experience must reach ≥+2 before any other A4 investment makes strategic sense.

Fatal Brake for A6 (Value Harvester): A company extracting maximum cash flow from an existing base depends entirely on retention. Every 1% of churn that Experience failure generates is a permanent reduction in the cash extraction potential. For A6, Experience is not a growth lever — it is a defensive necessity. The floor below which the strategy collapses.

Fatal Brake for A7 (Scale-Up Guardian): Hypergrowth destroys experience consistency. Teams grow faster than onboarding can standardise behaviour. Processes built for 50 customers break at 500. The individual attention that defined early relationships becomes structurally impossible at scale. The Scale-Up Guardian's primary Experience challenge is not improving the experience — it is preserving the experience as headcount and customer volume compound. Every month of growth without experience governance is a month of promise dilution.

Secondary Brake for A2 (Efficiency Machine): For the Efficiency Machine, Experience operates at the operational level. Magic (440) is the adjacent dimension that eliminates friction entirely; Experience sets the floor below which efficiency becomes indistinguishable from indifference. A cost-leader that delivers a genuinely frictionless experience retains customers. A cost-leader that delivers a degraded experience loses them to whichever competitor can match the price with marginally better service.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. For each moment, your brand answer has been adapted to your customers' identity.

  2. For each moment, your brand answer has helped customers to achieve their goals.

  3. For each moment, your brand answer is consistent in time and space, leaving nothing to chance.

  4. For each moment, your brand answer has clear expectations and delivers them consistently.

  5. For each moment, your brand answer is compatible with the concept of sustainability.

(Dimensions 421–425 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."
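The rounding rule can be sketched as a small helper. This is a hypothetical illustration of the rule as described, not official Marketing Canvas tooling; the function name is my own:

```python
def dimension_score(sub_scores):
    """Average sub-question scores on the -3..+3 scale.

    The Marketing Canvas forbids a zero: a mathematically neutral
    average is rounded to -1, because a split score still requires
    the same investigation as a negative one.
    """
    if not sub_scores:
        raise ValueError("at least one sub-question score is required")
    avg = sum(sub_scores) / len(sub_scores)
    rounded = round(avg)
    return -1 if rounded == 0 else rounded

# Two sub-questions at +2 and -2 average to zero: the method calls that -1.
print(dimension_score([2, -2]))    # -1
print(dimension_score([2, 2, 1]))  # 2
```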

Interpreting your scores

Negative scores (−1 to −3): Experience varies unpredictably across touchpoints, teams, or time. The brand promise is undefended in at least some interactions. For archetypes where Experience is a Fatal Brake, this score explains why churn is rising and retention investment is not working. The leaky bucket cannot be fixed by adding more acquisition — it must be fixed at the experience level first.

Positive scores (+1 to +3): Experience is intentional, documented, trained, and measured. Every moment has a designed response. Customers describe the experience in consistent language that matches the brand's intended positioning. The baseline is reliable. Magic (440) initiatives can now be layered on top of a consistent foundation rather than compensating for an inconsistent one.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak) Green Clean's experience varies significantly by team member and visit. The two full-time cleaners operate consistently. The three part-time contractors, hired during a growth period, have had no structured onboarding and no shared standard for what a Green Clean visit should look and feel like. Some customers receive a verbal explanation of the formula used; others do not. Some receive the Family Health Report within 24 hours; others wait three days or receive it after a follow-up request. When a customer calls to ask about an ingredient, the response depends on which team member picks up. The experience is sometimes excellent and frequently adequate — but it is never reliably consistent. When the founder asks customers how the experience compares to EcoPure, the feedback is mixed: "better sometimes, comparable usually." That is a −1: experience is not reliably reflecting the positioning.

Score: +1 to +2 (Developing) Green Clean has identified the three highest-variance touchpoints from customer research: the onboarding call, the first-service visit, and the Family Health Report delivery. For each, a standard has been designed and documented. Contractors are trained on the first-service protocol. The Health Report is now automated — delivered within 6 hours of every service completion without requiring manual action. The onboarding call has a structured agenda that ensures the health-first positioning is explained consistently regardless of who conducts it. Variance has reduced but not eliminated — the support interaction (what happens when a customer reports a concern) remains undesigned and inconsistent. Positive customer descriptions of the experience are converging on consistent language: "professional," "trustworthy," "actually explains what they're doing." The experience baseline is improving. It is not yet reliable enough to score +2.

Score: +2 to +3 (Strong) Every Green Clean customer touchpoint has a designed response, documented standard, and trained delivery. The experience is consistent whether the customer is in their first month or their third year, whether they call on a Monday or a Saturday, whether their regular cleaner is available or a substitute is deployed. When a substitute is required, the customer receives a proactive message explaining the change and confirming the substitute has been briefed on the household profile. Support interactions follow a structured resolution protocol — concern acknowledged within 2 hours, resolution proposed within 24 hours, follow-up confirmed within 48 hours. Customer descriptions of the experience use consistent language unprompted: "they always explain what they've done," "I never have to chase anything," "it's the same standard every time." The NPS promoter cohort grew from 38% to 62% between 2021 and 2024 — a direct consequence of experience consistency, not product change.

Connected dimensions

Experience does not operate in isolation. Four dimensions connect most directly:

  • 410 — Moments: Experience responds to moments. Every Experience initiative traces back to a specific mapped moment where the current response is inadequate. Without a complete Moments map, Experience improvements are directional guesses — improving the wrong touchpoints while leaving the highest-variance ones unaddressed.

  • 130 — Pains & Gains: Experience design eliminates pains. The specific pains identified in journey research — the ones that accumulate into churn — are the Experience design brief. A pain at the research phase is an Experience problem in the before stage. A pain at the support interaction is an Experience problem in the after stage.

  • 440 — Magic: Magic elevates experience beyond consistency. Once the baseline is reliable, Magic creates the peaks that generate advocacy. The sequencing is fixed: fix Experience first, then invest in Magic. A +2 on Experience is the prerequisite for Magic initiatives to work as intended.

  • 630 — Lifetime: Experience quality predicts customer lifetime. The most reliable predictor of whether a customer will still be a customer in 12 months is whether their ongoing experience is consistently meeting the promise. Experience is not just a satisfaction metric — it is the leading indicator of lifetime value.

Conclusion

Experience is tied as the most frequent Fatal Brake in the Marketing Canvas Method for a straightforward reason: it is the dimension that most directly connects to churn. Customers do not leave because of a single terrible interaction. They leave because the cumulative experience does not consistently reflect the promise that acquired them.

The strategic diagnostic is not "how good is our best experience?" — teams consistently overrate on this question because they remember peaks and discount inconsistency. The question is: "what does every customer encounter, every time, regardless of team member, channel, or day of the week?"

If the honest answer is "it depends" — dimension 420 is the initiative queue.

Sources

  1. Matt Watkinson, The Ten Principles Behind Great Customer Experiences, FT Publishing, 2013

  2. Bain & Company, "Closing the Delivery Gap", 2005 — bain.com (the foundational 80/8 gap research: 80% of companies believe they deliver a superior experience; 8% of customers agree)

  3. Marketing Canvas Method, Appendix E — Dimension 420: Experience, Laurent Bouty, 2026

About this dimension

Dimension 420 — Experience is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Journey - Experience

Laurent Bouty

Marketing Canvas - Moments

Most companies over-invest in the "during" phase of the customer journey and under-invest in "before" and "after" — which is precisely where both acquisition and retention are won or lost. Dimension 410 of the Marketing Canvas explains how to map moments correctly, and why the most valuable output is the seams it reveals between departments.

About the Marketing Canvas Method

This article covers dimension 410 — Moments, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Moments (dimension 410) maps the complete customer journey as a sequence of interactions seen through the customer's eyes. For each moment — before, during, and after purchase — three questions: what does the customer think? What do they feel? What do they do?

The discipline that makes this strategic rather than descriptive: moments must be built from customer observations and interviews, not from internal assumptions about how the journey should work. Every organisation believes it knows its customer journey. The map built from actual customer research almost always looks different from the one built internally.

In the Marketing Canvas, Moments sits within the Journey meta-category alongside Experience (420), Channels (430), and Magic (440). It is the discovery layer — the research input that makes every other Journey dimension scoreable with evidence rather than assumption.

The seams between departments

The most powerful diagnostic purpose of Moments mapping is one that most companies never anticipate: it reveals the seams between internal departments, and those seams are where the customer experience fails.

Marketing owns "before" — awareness, research, consideration. Sales owns "during" — the purchase conversation, onboarding, first use. Support owns "after" — ongoing use, queries, renewal, advocacy. Each team does their part reasonably well, measured on their own terms.

But the customer experiences one continuous journey.

When a customer moves from "before" to "during" — from the website to the first sales conversation — they often encounter a brand that seems to know nothing about what they read, what concerns they formed, or what decision criteria they brought to that conversation. The seam is visible to the customer; it is invisible to the organisation because no single team owns the transition.

Moments mapping forces the organisation to adopt the customer's timeline rather than its own. When the full map is laid out — every touchpoint from first awareness to advocacy, with what the customer thinks, feels, and does at each stage — the seams appear as blank spaces or contradictory experiences. Those gaps are the strategic agenda.

Score negative if the journey map was built from internal assumptions or if the "after purchase" phase is unmapped. Score positive when moments are customer-researched, granular, and actively used to design specific touchpoints.

Where companies systematically fail: the "during" trap

Most companies over-invest in the "during" phase of the journey — the purchase moment, onboarding, first use — and under-invest in "before" and "after." This is where both acquisition and retention are won or lost, making the imbalance strategically costly.

Before purchase is where acquisition happens or fails. A customer who feels confused during research — overwhelmed by competing eco-friendly claims, unable to find independent verification, uncertain which product fits their specific situation — will not convert, regardless of how good the product is. The pre-purchase experience is entirely within the brand's control, and almost entirely unmapped by most organisations. The website, the content, the comparison experience, the social proof — these are designed by teams who know the product, not by teams who have watched confused prospects try to make a decision.

After purchase is where retention happens or fails. A customer who feels abandoned after the transaction — no structured follow-up, no proactive communication about what to expect, no mechanism to give feedback — begins the churn journey immediately. Engagement does not decline suddenly. It begins its decline at the first moment the customer feels the relationship ended at the point of purchase.

The diagnostic test: map your last twelve months of customer-facing initiatives. What percentage addressed the before phase? The during phase? The after phase? The imbalance is almost always striking — and it predicts where the strategic gaps are before a single score is calculated.
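One way to run this audit is a simple tally. The initiative data below is hypothetical, purely to show the shape of the exercise:

```python
from collections import Counter

# Hypothetical audit: each customer-facing initiative from the last
# twelve months, tagged with the journey phase it addressed.
initiatives = [
    "during", "during", "during", "during", "during",
    "during", "during", "before", "after", "during",
]

counts = Counter(initiatives)
total = len(initiatives)
for phase in ("before", "during", "after"):
    share = 100 * counts[phase] / total
    print(f"{phase:>6}: {share:.0f}%")
```

An output heavily skewed toward "during" is the imbalance the diagnostic is designed to surface.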

Mental Models - Moments in the Marketing Canvas

The three questions at every moment

For each moment in the journey, the Marketing Canvas requires three specific answers — all drawn from customer research, not internal assumption:

What does the customer think? The cognitive content of the moment. What information are they processing? What comparisons are they making? What questions are unanswered? What beliefs — accurate or not — are shaping their interpretation of this interaction? For Green Clean's "first service visit" moment: "I hope this is genuinely different from the eco-cleaning service I tried before. I want to see something that proves the health claim."

What does the customer feel? The emotional state at this moment. Anxiety, anticipation, confusion, trust, pride, disappointment. This is not the emotional job (what they want to feel in their lives) — it is the actual emotional state at this specific interaction. Accurately mapping current feelings is the prerequisite for designing better ones. If the customer feels sceptical at the booking stage, no amount of warm onboarding email copy will resolve it.

What does the customer do? The observable behaviour. Searches. Clicks. Calls. Compares. Reads reviews. Asks a friend. Abandons the checkout. These actions are often more revealing than stated opinions because they reflect actual behaviour under actual conditions, not hypothetical responses to survey questions.
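The think/feel/do record for each moment can be captured in a simple data structure. This is a sketch, not part of the method itself; the field names and the Green Clean entry are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Moment:
    """One customer-journey moment, captured from customer research."""
    name: str
    phase: str  # "before", "during", or "after"
    think: str  # cognitive content at this moment
    feel: str   # the actual emotional state, not the desired one
    do: str     # observable behaviour

first_visit = Moment(
    name="First service visit",
    phase="during",
    think="I hope this is genuinely different from the eco-cleaner I tried before.",
    feel="hopeful but sceptical",
    do="watches the cleaner work, asks about the formula",
)

print(first_visit.phase)  # during
```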

Moments in the Marketing Canvas

The canonical question

Have you identified the critical touchpoints where customers interact with your brand, and do you understand what they think, feel, and do at each one?

Strategic role: foundational for most, existential for one

Moments has an unusual Vital 8 profile — it appears formally in only one archetype: it is a Secondary Brake for A9 (Category Creator).

The reason is specific: in a new category, the customer journey doesn't exist yet. There are no established research behaviours, no familiar comparison frameworks, no prior experience of the product category that shapes customer expectations. Every moment must be designed from scratch — the customer has no mental model to bring to the first interaction. Green Clean in 2021 could not assume customers knew how to evaluate "indoor health protection" because the category had not been defined. The first-clean teaching moment — the onboarding experience that explained what health-first meant in practice — was not a nice-to-have. It was the foundational category education that made everything downstream possible.

For all other archetypes, Moments functions like Pains & Gains (130): it is a research input that feeds the scored dimensions above it, particularly Experience (420) and Channels (430). A company that cannot score Experience honestly — because it does not know what customers actually experience at each touchpoint — almost certainly has an unmapped or assumption-built Moments layer underneath. Improving the Moments map improves the reliability of every Journey dimension score.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. Your moments have been defined based on customer observations and interviews — they reflect the customer's actual identity and experience, not internal assumptions.

  2. You have identified all moments before, during, and after buying your value proposition.

  3. For each moment, you have clearly identified what your customers think, feel, and do.

  4. For each moment, you have clearly identified what the customer objectives are.

(Dimensions 411–414 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."

Interpreting your scores

Negative scores (−1 to −3): The journey map is absent or built from internal assumptions rather than customer research. The before and/or after phases are unmapped. The seams between departments are invisible because nobody owns the transitions. Experience (420), Channels (430), and Magic (440) scores cannot be reliably set because the evidence base doesn't exist.

Positive scores (+1 to +3): The journey map is built from customer research, covers all three phases, captures think/feel/do at each moment, and actively identifies where seams between departments are creating experience failures. The map is used — it feeds Experience design, Channels decisions, and Magic moment identification — rather than filed as a project deliverable.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak) Green Clean's journey map was assembled by the founding team in a two-hour internal session. It covers the booking process (during) and a brief post-service survey (after). The before phase is entirely unmapped: no research has been done on how health-conscious parents discover cleaning services, what search terms they use, which comparison triggers they apply, or what objections form during the research phase. The after phase map stops at the thank-you email. No moment beyond the first three months of service has been researched. When the team describes the customer journey, they describe what they intended to build, not what customers actually experience. The seam between the website (marketing) and the first sales conversation (founder-led) is the most visible gap — customers arrive with questions formed during research that the founder does not know they have.

Score: +1 to +2 (Developing) Green Clean has conducted eight customer interviews specifically focused on journey mapping. The before phase now has three defined moments: the initial search ("what is the difference between eco-cleaning and health-first cleaning?"), the comparison visit (landing on the Green Clean website and trying to find independent validation), and the booking decision (the moment of commitment and what makes it happen or not). For each, the team has documented what customers think, feel, and do based on interview evidence rather than assumption. The during and early-after phases are mapped. The seam between website and onboarding call has been identified — customers arrive uncertain whether the health claim is substantiated. The team has not yet designed a solution to the seam. But the seam is now named.

Score: +2 to +3 (Strong) Green Clean's journey map covers all phases, built from twenty-two customer interviews and three observed service visits. The before phase is mapped in five moments, each with specific documented think/feel/do data. The seam between website and first contact has been designed out: a structured pre-booking sequence sends the university formula summary and B-Corp certification to every prospect before the first call, so the call begins with the health claim validated rather than questioned. The "First-Clean Teaching Moment" — a structured onboarding experience at the first service visit — explains in plain language what health-first means in practice, shows the before/after air quality data, and delivers the first Family Health Report within 24 hours. The after phase is mapped through the 12-month relationship, with specific moments designed at months 1, 3, 6, and 12 that correspond to the highest churn risk periods identified through customer research. The journey map is reviewed quarterly and updated as research produces new evidence.

Connected dimensions

Moments does not operate in isolation. Four dimensions connect directly as the downstream beneficiaries of good journey mapping:

  • 130 — Pains & Gains: Pains and gains map to specific moments. The pain of "I can't find independent verification" belongs to the before-phase research moment. The gain of "the Family Health Report made me feel like I finally know the truth" belongs to the first-service after moment. Without Moments mapping, Pains & Gains is a list. With it, it becomes a journey-anchored strategy.

  • 420 — Experience: Experience is designed moment by moment. Every Experience initiative traces back to a specific moment in the journey where the current response is inadequate. Without a complete Moments map, Experience improvements are based on internal opinion rather than evidence about where the customer actually struggles.

  • 430 — Channels: Channels serve specific moments. The question "which channels should we be present on?" cannot be answered without knowing which moments require which types of interaction. A customer in the research moment needs findable, credible content. A customer in the post-service moment needs a proactive, low-friction feedback mechanism. The channel follows the moment.

  • 440 — Magic: Magic happens at peak moments. The unexpected delight that converts a satisfied customer into an active advocate occurs at a specific moment in the journey — often one that companies hadn't designed for at all. Without a complete Moments map, Magic cannot be placed. The map reveals where the peaks and troughs are; Magic strategy addresses the peaks.

Conclusion

Moments is the dimension that makes the Journey meta-category honest. Without it, Experience is opinion, Channels is habit, and Magic is accident.

The strategic value is not the map itself — it is what the map reveals. The over-investment in "during" at the expense of "before" and "after." The seams between marketing, sales, and support that the customer feels as a fragmented experience. The moments that are assumed to be satisfactory because nobody has actually asked a customer what they think, feel, and do at that point.

For Category Creators building a journey from scratch, the Moments map is the architectural blueprint — without it, every other Journey dimension is being built without knowing the structure it needs to serve. For all other archetypes, it is the evidence base that makes every Journey dimension score credible rather than flattering.

Sources

  1. Chip Heath, Dan Heath, The Power of Moments: Why Certain Experiences Have Extraordinary Impact, Simon & Schuster, 2017

  2. Forrester Research, "Customer Journey Mapping Best Practices", Forrester, 2024 — forrester.com

  3. Marketing Canvas Method, Appendix E — Dimension 410: Moments, Laurent Bouty, 2026

About this dimension

Dimension 410 — Moments is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Journey - Moments

Laurent Bouty

Resources for Course on Customer Experience

A list of resources (books, articles, videos, websites) that I recommend if you are interested in the Customer Experience topic. I use these resources in my classes @SolvayBrusselsSchool and during workshops.

In this post, you will find a collection of resources that I use and maintain for my classes and workshops on this topic. I couldn't list everything I am reading or watching, so I have selected a vital few that might inspire you. It is also a good starting point if you are interested in this topic. The list contains websites, books, articles, and videos.

Cheers

Laurent 

Recommended Websites

Recommended Books

Easy to read and a good start if you are curious about Customer Experience from a marketing perspective. A lot of good tools and a powerful process.

Customers are powerful. They have a loud voice, a wealth of choice and their expectations are higher than ever.

This book covers ten principles you can use to make real world improvements to your customers’ experiences, whatever your business does and whoever you are. 

One step further on this subject, with the notion of Moments of Truth. Brian Solis is a thought leader on Digital Transformation.

In his book X: The Experience When Business Meets Design, bestselling author Brian Solis shares why great products are no longer good enough to win with customers, and why even creative marketing and delightful customer service are not enough to succeed. In X, he shares why the future of business is experiential and how to create and cultivate meaningful experiences.


Recommended Articles

1998 - Harvard Business Review - Welcome to the Experience Economy

First there was agriculture, then manufactured goods, and eventually services. Each change represented a step up in economic value--a way for producers to distinguish their products from increasingly undifferentiated competitive offerings. Now, as services are in their turn becoming commoditized, companies are looking for the next higher value in an economic offering. Leading-edge companies are finding that it lies in staging experiences. To reach this higher level of competition, companies will have to learn how to design, sell, and deliver experiences that customers will readily pay for. An experience occurs when a company uses services as the stage--and goods as props--for engaging individuals in a way that creates a memorable event. And while experiences have always been at the heart of the entertainment business, any company stages an experience when it engages customers in a personal, memorable way. The lessons of pioneering experience providers, including the Walt Disney Company, can help companies learn how to compete in the experience economy. The authors offer five design principles that drive the creation of memorable experiences. First, create a consistent theme, one that resonates throughout the entire experience. Second, layer the theme with positive cues--for example, easy-to-follow signs. Third, eliminate negative cues, those visual or aural messages that distract or contradict the theme. Fourth, offer memorabilia that commemorate the experience for the user. Finally, engage all five senses--through sights, sounds, and so on--to heighten the experience and thus make it more memorable. 

Read on HBR here


2002 - Harvard Business Review - The One Number You Need to Grow

Companies spend lots of time and money on complex tools to assess customer satisfaction. But they're measuring the wrong thing. The best predictor of top-line growth can usually be captured in a single survey question: Would you recommend this company to a friend? This finding is based on two years of research in which a variety of survey questions were tested by linking the responses with actual customer behavior--purchasing patterns and referrals--and ultimately with company growth. Surprisingly, the most effective question wasn't about customer satisfaction or even loyalty per se. In most of the industries studied, the percentage of customers enthusiastic enough about a company to refer it to a friend or colleague directly correlated with growth rates among competitors. Willingness to talk up a company or product to friends, family, and colleagues is one of the best indicators of loyalty because of the customer's sacrifice in making the recommendation. When customers act as references, they do more than indicate they've received good economic value from a company; they put their own reputations on the line. And they will risk their reputations only if they feel intense loyalty. The findings point to a new, simpler approach to customer research, one directly linked to a company's results. By substituting a single question--blunt tool though it may appear to be--for the complex black box of the customer satisfaction survey, companies can actually put consumer survey results to use and focus employees on the task of stimulating growth. 

Read on HBR here


2007 - Harvard Business Review - Understanding Customer Experience

The article discusses the importance of monitoring customer experience. Several examples are presented demonstrating customer dissatisfaction in a variety of situations. Customer experience is defined, and several methods for measuring it are discussed. The results of a recent Bain & Company survey of customers of 362 companies are presented. Methods of collecting customer data at "touch points," instances of direct contact either with the product or service itself or with representations of it, are detailed.

Read on HBR here


2016 - McKinsey - Customer Experiences

A collection of ideas, articles, thoughts and interviews about customer experience. It currently comprises two full collections examining how companies can create competitive advantage by putting customers first and managing their journeys.

Read on McKinsey here


2016 - PWC - 10 Principles of Customer Strategy

It’s no longer enough to target your chosen customers. To stay ahead, you need to create distinctive value and experiences for them.

Read on Strategy-Business here


2017 - Altimeter - The Customer Experience of AI

This report explores the impact of AI on the customer experience, lays out a set of operating principles, and includes insight from technology users, developers, academics, designers, and other experts on how to design customer-centric experiences in the age of AI. More than anything, business leaders today should begin to treat AI as fundamental to the customer experience. This means thinking about the values it perpetuates as an essential, and eventually indistinguishable, expression of products, services, and the brand experience.

Read on Altimeter here


Recommended Videos

Joe Pine introduces the Progression of Economic Value, the foundational model for understanding the role of Experiences in the history of economics.
Brian Solis, award-winning author, prominent blogger/writer and principal analyst at Altimeter Group, helps people understand and define the role we play in the evolution of technology and its impact. In this first talk of the session The Wild Promises of the Digital Customer Experience at Lift16, Brian Solis shows us how brands are focusing their designs on customer experience and why it matters, especially in the digital world.

This is a full keynote based on the story of Steven's latest book, 'When Digital Becomes Human', presented at the biggest retail conference in Istanbul. Steven is an expert in customer focus in a digital world.

Read More
Laurent Bouty

A New Whole Brain Customer Experience

Reposted article from https://www.spencerstuart.com/research-and-insight/whole-brain-marketing

Author Sid McGrath, Chief Strategy Officer, Karmarama, discusses the importance of customer experience for brands.

Sid McGrath, Chief Strategy Officer, Karmarama

A consequence experience

The customer experience for brands is driven by consequence: when customers have a good experience, they continue to engage with the brand; when the experience is bad, they disengage, often telling others about their disappointment and spreading a message of general discontent.

This makes for some pretty precarious brand relationships. However, the issue that so far no-one seems to be addressing is that the very notion of the customer experience is fundamentally flawed.

A disconnected, transactional experience

Marketing leaders see customer experience as their number one priority, but they are rarely in control of all of it, or even enough of it to make a difference. Recent focus on using digital technology to influence customer purchasing decisions is causing some companies to concentrate too narrowly on the customer’s interaction with a brand at the moment of sale. These ‘experiences’ can be relentlessly sales-focused and annoyingly interruptive. Organisations calling themselves customer experience experts encourage companies to increase the number of transactional messages, but is this really leading to better, worthwhile and relevant experiences for the customer? The fact is that global use of adblockers is rising while trust in brands is rapidly declining.

Reducing a person’s relationship to a brand solely to that of a ‘customer’ demonstrates a lack of understanding about the role that brands actually play in our lives. A transactional focus also shows a brand’s hand: their audience is perceived as a wallet ready to be picked or a purse ready to be opened, rather than a person to be understood, respected and served.

A human experience

What then is the answer? To start with, people must be respected as human beings with fairly low thresholds for unwanted buying messaging. This doesn’t mean no messaging; it means messaging that is empathetic to the individual and to the context. With this in mind the customer experience can then be reimagined as the human experience, from CX to HX, where a brand’s pathway into people’s lives is fully understood and delivered with relevance rather than persistence.

The transactional experience, previously locked into consumption and category, gives way to one that connects with culture and allows for meaningful, useful and relevant communication, with selling reserved for the right place and the right time.

A fully-connected experience

If the human experience is the answer, how do we get there? Again, it’s about understanding how humans, and more specifically, how our brains, work.

The brain is an astonishingly connected piece of hardware. As much as we may try and separate it into left and right hemisphere, or occipital and frontal lobes, or neocortex and limbic system, every part of the human brain is connected to another part to improve its understanding and response towards any situation. This connection ensures an integrated response, a mix of logical and emotional consideration, instinct and intelligence.

The interconnectedness of the brain serves as a model for understanding how to create better, balanced and truly human experiences for brands. Approaching any experience with a whole-brain mentality means finding a way to connect everything with everything, from consumption to category to culture. This is how humans see their world — fully connected — so it stands to reason that it’s also how they should engage with their brands and how brands should engage with them.

Now consider once again the classic customer experience — an experience that ushers customers through the consumption and category phases of their relationship with a brand, but stops short of connecting to the culture of the wider life they lead.

Without the insight and intelligence required to understand the implications — the consequences — of the brand experience, the experience itself breaks or, worse, is biased towards buying rather than being. This is the fundamental reason why customer experiences are disconnected.

A meaningful experience

Once a brand is able to connect to a person’s wider life, understand and respect them as a human rather than a data point or part of an algorithm, and can connect that back to the category and consumption phase of the relationship, there emerges a new type of powerful, meaningful, connected human experience — one that people will actually want rather than one that will frustrate them.

So, paradoxically, we don’t live in the age of the customer; they are not “king”, “queen” or “the answer”. We need to move to the age of human, to human-centricity where what the human wants and needs can be fully, relevantly connected to the relationship that brands want to have.

Read More