BLOG

A collection of articles and ideas that help smart marketers become smarter

marketingcanvas.net · Laurent Bouty

Marketing Canvas - Magic

Satisfaction keeps customers. Magic turns them into advocates. Dimension 440 of the Marketing Canvas scores four components — effortless, stress-free, sensory pleasure, and social pleasure — and explains why exceeding expectations on something the customer doesn't care about isn't magic, it's waste.

About the Marketing Canvas Method

This article covers dimension 440 — Magic, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Magic (dimension 440) scores whether your brand exceeds expectations in ways customers didn't anticipate. Not satisfaction — that is delivering what was promised. Not quality — that is consistency. Magic is the surprise that transforms a satisfied customer into an active advocate.

The most important design principle: exceeding expectations on something the customer doesn't care about isn't magic. It's waste. Magic requires knowing what the customer expects — and then strategically exceeding it at the moment that matters most.

In the Marketing Canvas, Magic sits within the Journey meta-category alongside Moments (410), Experience (420), and Channels (430). It is the peak layer — the dimension that elevates a reliable experience into one customers feel compelled to describe to others. Experience (420) sets the baseline. Magic (440) creates the highs above it.

Magic vs. Experience: the critical distinction

This is the most important conceptual clarification in dimension 440, and the one most commonly missed in workshops.

Experience (420) scores the consistent baseline — whether every customer, in every interaction, receives a response that is intentional, reliable, and meets expectations. Consistency is the standard. A strong Experience score means: nothing is left to chance, the brand's promise is defended at every touchpoint.

Magic (440) scores the peaks — the unexpected moments that exceed what the customer anticipated and produce the emotional response that generates advocacy. Magic is not consistent by definition. It is strategic and selective — designed to occur at the specific moments where the surprise will have the highest impact.

The sequencing rule: fix Experience before investing in Magic. A brand with a −1 on Experience that invests in Magic initiatives is adding peaks to an unreliable baseline. Customers who encounter magic at one touchpoint and inconsistency at another do not become advocates. They become confused — and confusion precedes churn, not advocacy.

Score negative if the customer journey is functional but unremarkable, or if it creates friction the company hasn't noticed. Score positive when specific moments are designed to exceed expectations and customers spontaneously share those moments with others.

The four components of Magic

The Marketing Canvas breaks Magic into four scored components — each addressing a different dimension of the unexpected experience:

Effortless (441) — obstacles removed. The customer expects friction; they encounter none. The booking that takes 30 seconds when they budgeted 5 minutes. The form that pre-fills from their previous interaction. The return process that requires no explanation because the system already knows why. Effortlessness is the absence of friction the customer had learned to expect. It is magical precisely because the absence is unexpected — the category has trained customers to tolerate effort, and the brand has made it disappear.

Stress-free (442) — confusion, uncertainty, and anxiety eliminated. The customer expects to worry about something; they find there is nothing to worry about. The ambiguous delivery window that turns into real-time location tracking. The ingredient claim that is accompanied by independent verification rather than asking the customer to trust. The post-service question that is answered before it was asked. Stress-free magic is the proactive removal of cognitive load — the brand doing the worrying so the customer does not have to.

Sensory pleasure (443) — delight through sight, touch, sound, taste, or smell. In consumer markets this is the Apple unboxing, the Hermès ribbon, the hotel that remembers a pillow preference. The experience engages the senses in a way that exceeds the purely functional expectation. In service contexts, sensory pleasure appears in the aesthetics of a delivered report, the warmth of an unexpected handwritten note, the packaging that communicates care before a word is read.

Social pleasure (444) — status elevation. The customer encounters the brand in a way that makes them feel recognised, celebrated, or elevated in front of others. The loyalty recognition at a hotel check-in that happens in front of other guests. The personalised annual impact report that the customer shows to friends because it makes them look like someone who has made a difference. The referral confirmation that acknowledges the customer as a trusted advisor to their network. Social pleasure magic is the brand giving the customer a story they want to tell.

B2B Magic: cognitive, not sensory

In consumer markets, Magic is often sensory — the unboxing, the ribbon, the pillow preference. In B2B, Magic is cognitive: the insight the client didn't ask for, the risk flagged before it became a problem, the deliverable completed three weeks early without explanation.

The NTT Data case illustrates the distinction. B2B Magic isn't about delight in the consumer sense. It is about demonstrating competence so completely and proactively that the client forms the belief: "this is a genuine partner, not just a vendor." That belief is the B2B equivalent of advocacy — the CTO who mentions the vendor by name at an industry conference, the COO who recommends the firm without being asked, the procurement lead who shortcuts the RFP process because they already know who they want.

The B2B Magic design question: where in this engagement does the client expect reasonable competence — and where could we deliver something so far ahead of expectation that it changes the nature of the relationship?

Spotify's Discover Weekly is the canonical example of consumer-facing Magic that operates on a cognitive principle: the algorithm's ability to surface music the user didn't know they wanted, at the moment they most want it. Not sensory delight. Cognitive surprise. The user's reaction — "how does it know?" — is the Magic response. It drove measurable retention improvement, which is the commercial test of whether Magic is working.

Magic in the Marketing Canvas

The canonical question

Where do you exceed expectations in ways customers didn't see coming?

Magic appears in the Vital 8 of four archetypes, filling five distinct roles that span the full range of strategic positions:

Fatal Brake for A7 (Scale-Up Guardian): Hypergrowth tends to destroy the exceptional experiences that created growth in the first place. The early customers of a high-growth brand experienced something that felt personal, attentive, and unexpectedly good — because the team was small, the founder was involved, and every interaction was high-touch. As the company scales, processes replace people, automation replaces attention, and the magic that converted early adopters into evangelists disappears into a standardised service. For A7, Magic is a Fatal Brake because losing it is the mechanism through which growth erodes the advocacy that funded growth. It must reach ≥+2 before hypergrowth investment can be sustained.

Primary Accelerator for A2 (Efficiency Machine): For the Efficiency Machine, Magic means the customer barely notices the transaction happened. The 25-minute Ryanair turnaround. The Amazon checkout that requires one click. The banking app that reconciles the account before the customer closes the browser. In A2, operational magic is not sensory delight — it is the complete removal of the customer's effort. The customer doesn't tell a story about the experience; they tell a story about the absence of one. "I barely had to do anything" is the A2 Magic response.

Secondary Brake for A6 (Value Harvester): A Value Harvester extracting maximum cash flow from an existing base must maintain enough magic to prevent the churn that would otherwise accelerate as the product matures. Magic maintenance for A6 is defensive — enough unexpected value to remind customers why they stay, even as the brand optimises for margin rather than growth.

Secondary Accelerator for A4 (Stagnant Leader): For a stagnant leader fighting churn, Magic initiatives provide the proof of renewal that keeps the existing base engaged while Experience (420) and Features (310) are being rebuilt. A single well-designed magical moment — the AI-powered feature that anticipates the user's next action, the proactive support contact that prevents a problem before it occurs — signals that the brand is still invested in the relationship.

Growth Driver for A6: For the Value Harvester, Magic initiatives that generate advocacy are a low-cost acquisition mechanism that complements the margin extraction strategy. Existing customers who experience unexpected delight become the most credible referral source for the next customer cohort.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

MCM Self-Assessment — Magic (441–445)

Select your level of agreement for each statement. The dimension score is the average of the five sub-scores, rounded to the nearest whole number.

  • 441. You have identified obstacles across your customer journey and reduced them (effortless).

  • 442. You have delighted the senses of your customers — they all look for sensory pleasure (sensory pleasure). You have eliminated confusion, uncertainty, and anxiety across your customer journey (stress-free).

  • 443. You have delighted the senses of your customers — they all look for sensory pleasure (sensory pleasure).

  • 444. You have provided a customer experience that elevates your customers' status (social pleasure).

  • 445. You have reduced the social and environmental impact while making sustainable moments magical.

Brake verdict (dimension 440): No, my customer journey lacks designed moments of unexpected delight across the four components. It is not helping me achieve my goals.

Accelerator verdict (dimension 440): Yes, I have designed effortless, stress-free, sensory, and social pleasure moments that convert satisfied customers into active advocates. It is helping me achieve my goals.

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."
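The rounding convention can be made concrete with a short sketch: a hypothetical `dimension_score` helper (not part of the method itself), assuming that any average which rounds to zero takes the −1 convention described above.

```python
def dimension_score(sub_scores):
    """Average five sub-scores and round to the nearest whole number.

    Sub-scores run from -3 to +3 with no zero allowed. A result that
    rounds to zero is treated as -1: a split score is a Brake until
    investigated, never a neutral.
    """
    if len(sub_scores) != 5:
        raise ValueError("a dimension has exactly five sub-questions")
    if any(s == 0 or not -3 <= s <= 3 for s in sub_scores):
        raise ValueError("each sub-score must be in -3..+3, zero excluded")
    avg = sum(sub_scores) / len(sub_scores)
    score = round(avg)
    # "Not clearly helping" gets the same treatment as "hurting".
    return -1 if score == 0 else score
```

For example, sub-scores of (+2, −2, +1, −1, +1) average to 0.2, which rounds to zero and therefore lands at −1: the dimension is not clearly helping the goal.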

Interpreting your scores

Negative scores (−1 to −3): The customer journey is functional but unremarkable. There are no designed moments of unexpected delight. Customers are satisfied but not moved to advocate. Worse: friction and anxiety may exist that the team hasn't noticed because nobody has mapped the journey from the customer's perspective. For A7, a negative score here explains why growth is eroding the advocacy that created it.

Positive scores (+1 to +3): Specific moments are designed to exceed expectations across one or more of the four components. Customers spontaneously share those moments with others — in conversation, in reviews, in referrals. Magic is functioning as the advocacy generation mechanism: not all customers experience it, but the ones who do become the brand's most effective acquisition channel.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak)
Green Clean's customer journey is functional and unremarkable. The booking works. The cleaner arrives. The cleaning is done. But nothing about the interaction exceeds what a customer would expect from a competent cleaning service. There are no designed moments of effortlessness — the booking process requires four steps that could be two. There is no stress removal — customers who want to verify what products were used have to ask, and the answer varies by team member. There is no sensory pleasure — the cleaner leaves without any communication, the invoice arrives two days later as a plain text email. There is no social pleasure — the service produces no story the customer would want to share. When existing customers describe the service, they use words like "reliable" and "good" — the language of satisfied, disengaged customers rather than active advocates.

Score: +1 to +2 (Developing)
Green Clean has introduced two designed Magic moments. First: the Family Health Report arrives within 6 hours of service completion — a specific, data-rich document that no competitor provides and that customers describe as "not what I expected" when they receive it for the first time. This addresses the stress-free component: customers who would have worried about whether the claims are real now have evidence without asking for it. Second: on the third service, customers receive a personalised summary of their cumulative impact — how many service visits, how many households protected from chemical exposure, how much waste has not been generated. This addresses social pleasure: customers who care about environmental responsibility have a number they can share. These two moments are working — the referral rate has started to climb. But the effortless and sensory pleasure components remain undesigned.

Score: +2 to +3 (Strong)
Green Clean has designed Magic moments across all four components. Effortless: the booking takes 90 seconds on mobile, with address pre-filled and service preferences remembered. Scheduling confirmation and reminder are automatic. Stress-free: the Family Health Report arrives within 6 hours with a plain-language explanation of what was found and eliminated. Customers never have to ask. Sensory pleasure: the cleaner leaves a handwritten note summarising what was done in this specific home, with one personalised observation (a comment on the kitchen herbs, a note about the child's artwork visible from the bathroom). The note costs 3 minutes and generates more customer responses than any other touchpoint. Social pleasure: the annual impact statement — "Your household prevented 42kg of chemical exposure in 2024" — is designed as a shareable card with Green Clean's visual identity. 23% of customers share it on social media or forward it to friends. The referral rate reached 35% by 2024. Customers do not describe the service as "good." They describe specific moments that changed how they think about what a cleaning service can be.

Connected dimensions

Magic does not operate in isolation. Four dimensions connect most directly:

  • 130 — Pains & Gains: Magic eliminates pains and creates unexpected gains. The pain map is the source material for effortless and stress-free Magic design. When a pain is eliminated so completely that the customer barely registers its absence, that is effortless Magic. When a gain exceeds what the customer expected, that is the raw material of the sensory and social pleasure components.

  • 420 — Experience: Magic elevates experience beyond consistency. Experience (420) sets the reliable baseline. Magic (440) creates the moments above it. The two dimensions work in sequence: without a consistent Experience baseline, Magic investments are undermined by the inconsistency that surrounds them.

  • 320 — Emotions: Magic creates peak emotional moments. The surprise that generates advocacy is an emotional event — the "I didn't expect that" feeling that produces the story worth telling. Magic moments are the designed delivery mechanism for peak emotional benefits.

  • 140 — Engagement: Magic drives engagement and advocacy. A customer who has experienced a designed Magic moment is more likely to be a promoter on the NPS scale, more likely to refer, and more likely to provide feedback. Magic is the upstream cause; Engagement (140) measures the downstream effect.

Conclusion

Magic is the dimension that answers the question most brands cannot: why do some customers become advocates when others merely stay?

The answer is not product quality. Quality is expected. It is not service consistency. Consistency is the baseline. It is the specific, unexpected moment that exceeds what the customer had learned to anticipate — the report that arrives before they asked, the note that references their home specifically, the status recognition that makes them feel seen.

The design principle that separates effective Magic from wasted investment: it must exceed expectations on something the customer actually cares about. The hotel that remembers a pillow preference is Magic because sleep quality matters. The hotel that provides a turndown chocolate to a customer who explicitly avoids sugar has produced an interaction, not a magic moment.

Knowing what customers expect — and where exceeding it will produce the highest advocacy response — is the work. The four components (effortless, stress-free, sensory pleasure, social pleasure) provide the framework. The Moments map (410) and the Pains & Gains research (130) provide the evidence. Together, they produce the design brief for Magic initiatives that convert satisfied customers into advocates.


About this dimension

Dimension 440 — Magic is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.


Marketing Canvas - Channels

Most companies have channels. Few have orchestrated channels. Dimension 430 of the Marketing Canvas scores the difference — and explains why a brand with three connected channels outperforms one with eight siloed ones.

About the Marketing Canvas Method

This article covers dimension 430 — Channels, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Channels (dimension 430) scores how customers interact with your brand — physical and digital, owned and third-party — and whether those interactions form a seamless, coherent experience across all of them.

The canonical distinction that defines this dimension: most companies have channels. Few have orchestrated channels. The score measures orchestration, not presence.

A brand with a website, a mobile app, a social media presence, a phone line, and a field team is not necessarily scoring well on dimension 430. The question is whether those channels work together without silos — whether a customer who starts research on one channel can complete the journey seamlessly on another, and whether the company can track and serve that customer across the transition.

In the Marketing Canvas, Channels sits within the Journey meta-category alongside Moments (410), Experience (420), and Magic (440). It is the delivery infrastructure — the system that ensures every moment designed in 410 is actually accessible to the customer in the format that serves them best.

Presence vs. orchestration: the canonical distinction

Every company has channels. Most companies have more channels than they have resources to maintain well. The channel list is not the dimension. The orchestration of that list is.

The test is a single customer journey across multiple channels. A customer who discovers Green Clean through a health parenting blog, visits the website to research the formula, emails a question about ingredient safety, books a service via the app, receives the Family Health Report by email, and calls to ask about a recurring subscription — has touched five channels. If the experience is continuous (the phone call picks up where the booking left off; the subscription question doesn't require re-explaining the service model), the channels are orchestrated. If each channel treats the customer as a stranger, the channels exist but are not orchestrated.

The canonical four properties that define orchestrated channels:

Context (431) — can customers use the most relevant channel for their specific situation at each moment? A customer researching a service in the evening needs findable, credible content on the web. A customer mid-service with a question needs an immediate human response. A customer reviewing their health report at midnight needs a digital self-service interface. The same channel cannot serve all three moments well.

Interaction quality (432) — do channels provide clear, personalised, seamless interactions? Quality here means the interaction is adapted to the customer's identity and context — not generic, not one-size-fits-all, not a copy-paste template.

Information consistency (433) — is data consistent and real-time across channels? A customer who updates their household profile in the app should not have to re-state it on the phone. A booking made on the website should be visible to the cleaner on their route app. Inconsistency in data across channels is the most common channel orchestration failure — and the most invisible to the teams building the channels, who each see only their own system.

Orchestration (434) — are channels connected so customers can navigate seamlessly between them with no silos? This is the composite test: does the company have a joined-up view of the customer's journey, or does each channel operate as a separate interaction with no shared memory?
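The "shared memory" test in (434) can be sketched as a single customer record that every channel reads and appends to. The class and field names below are illustrative assumptions, not part of the method — a deliberately minimal model of why no interaction should start from scratch:

```python
from dataclasses import dataclass, field

@dataclass
class Touch:
    channel: str   # e.g. "web", "app", "email", "phone", "field"
    moment: str    # journey stage, e.g. "research", "booking", "support"
    note: str      # what was communicated or decided

@dataclass
class CustomerRecord:
    customer_id: str
    touches: list = field(default_factory=list)

    def log(self, channel, moment, note):
        # Every channel appends to the same shared history.
        self.touches.append(Touch(channel, moment, note))

    def context_for(self, channel):
        # Any channel can see what every other channel already knows,
        # so the customer is never treated as a stranger.
        return [t for t in self.touches if t.channel != channel]

# A phone agent picking up a journey that started on the web:
rec = CustomerRecord("C-001")
rec.log("web", "research", "read ingredient-safety page")
rec.log("app", "booking", "booked first service")
prior = rec.context_for("phone")  # both earlier touches are visible
```

The design point is the single record, not the specific fields: siloed channels are precisely the architecture in which each channel keeps its own `touches` list and `context_for` cannot be implemented.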

Digital, physical, and moment-driven channel design

The channel strategy question is not "should we be digital or physical?" Every customer journey involves both. The question is: which channel serves each moment best?

A purely digital company that ignores physical moments — the cleaner arriving at the door, the unboxing experience, the in-person explanation of a result — misses the touchpoints where trust is built or lost at the highest intensity. Physical moments carry emotional weight that digital channels cannot replicate.

A traditional service business that treats digital as a secondary channel — the website as an online brochure, the email as a support afterthought — loses the pre-purchase research phase entirely. Customers research digitally before they commit physically. Winning the digital research moment is often what determines whether the physical visit ever happens.

The best channel strategies design each moment to use the channel that serves the customer best:

  • The research moment needs findable, credible digital content

  • The booking moment needs a frictionless digital transaction

  • The service delivery moment needs a reliable physical interaction

  • The result delivery moment needs a clear digital report with optional human follow-up

  • The renewal moment needs a proactive, low-friction digital prompt

Designing channels from moments is the inversion of the default approach (designing moments around the channels that already exist). The default produces a channel strategy. The inversion produces an orchestrated journey.
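The inversion can be expressed as a simple lookup built from the moments map rather than from the channel list. The moment names mirror the bullets above; the structure is an illustrative sketch under that assumption, not a prescribed implementation:

```python
# Design direction: moment -> channel, never channel -> moment.
MOMENT_TO_CHANNEL = {
    "research":         "web (findable, credible content)",
    "booking":          "digital (frictionless transaction)",
    "service_delivery": "physical (reliable in-person interaction)",
    "result_delivery":  "digital report with optional human follow-up",
    "renewal":          "proactive, low-friction digital prompt",
}

def channel_for(moment):
    # Fail loudly when a moment has no designed channel: an unmapped
    # moment is a gap in the journey, not a case for a default channel.
    if moment not in MOMENT_TO_CHANNEL:
        raise KeyError(f"no channel designed for moment '{moment}'")
    return MOMENT_TO_CHANNEL[moment]
```

The default approach inverts this table — it starts from the channels that already exist and asks which moments they can absorb — which is exactly how habit-driven channel strategies are produced.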

Channels in the Marketing Canvas

The canonical question

Can customers interact with your brand through the channels they prefer, with a seamless experience across all of them?

Channels appears in the Vital 8 of two archetypes — in notably different roles:

Secondary Brake for A1 (Disruptive Newcomer): A disruptor's survival depends on being noticed and understood immediately. Features and positioning may be compelling, but if the channels through which the target customer discovers and evaluates the brand are wrong or incomplete, the disruption never reaches beyond the early-adopter bubble. Channel failure for A1 is quiet: the product is ready, the message is sharp, but the distribution infrastructure isn't present where the customers are. As a Secondary Brake, Channels must reach at least +1; below that threshold, channel failure keeps limiting the reach of the disruption.

Secondary Accelerator for A5 (Pivot Pioneer): A company executing a strategic pivot may find that its existing channels were optimised for the old positioning and the old customer segment. The new direction — new JTBD, new lead segment, new positioning — may require new channels entirely. Legacy channels that served the old strategy are not neutral for the pivot; they actively signal the old identity to customers encountering the brand for the first time in the new context. For A5, channel strategy is part of the repositioning work, not a downstream execution decision.

A note on Fatal Brakes: Channels does not appear as a Fatal Brake in any archetype. But channel failure can block the dimensions that are Fatal. If Acquisition (610) is a Fatal Brake and channel orchestration failures are increasing CAC, the channel problem is a Fatal Brake problem in disguise. If Experience (420) is a Fatal Brake and channel inconsistency is producing the experience variance, the same applies. Channels is the infrastructure. Infrastructure failures propagate upward.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

MCM Self-Assessment — Channels (431–435)

Select your level of agreement for each statement. The dimension score is the average of the five sub-scores, rounded to the nearest whole number.

  • 431. Your customers can use the most relevant channel for their specific context at each moment.

  • 432. Your channels are physical and digital — you provide clear, personalised, and seamless interactions, anywhere, anytime.

  • 433. Information captured or shared in your channels is consistent, real-time, personalised, useful, and accurate.

  • 434. You have orchestrated all your channels — there are no silos between them, and customers can navigate seamlessly through them at each moment.

  • 435. You optimise the social and environmental impact of your physical and digital channels.

Brake verdict (dimension 430): No, my channels operate in silos without orchestration or consistent data. They are not helping me achieve my goals.

Accelerator verdict (dimension 430): Yes, my channels are fully orchestrated — data-consistent, context-appropriate, and seamlessly navigable by customers. They are helping me achieve my goals.

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."

Interpreting your scores

Negative scores (−1 to −3): Channels operate in silos. Customers who cross channel boundaries encounter a brand that does not recognise them. Orchestration is absent or incomplete. The likely downstream effect: acquisition costs are higher than they need to be (research-to-booking friction), experience scores are lower than designed (channel handoff failures), and engagement data is fragmented (no joined-up view of customer behaviour).

Positive scores (+1 to +3): Channels are orchestrated. Customers move between channels without friction. Data is consistent and real-time across the full journey. Each channel is designed for the specific moment it serves. The company can track the customer journey across touchpoints and improve each channel based on measured performance.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method. Green Clean sells a residential service — cleaners visit customer homes — not packaged products. Their relevant channels are: website, booking flow, email, in-home service visit, Family Health Report (digital delivery), phone/chat support, and referral mechanics.

Score: −2 to −1 (Weak)
Green Clean's channels are independent systems that do not share data or context. The website takes booking requests but is not connected to the cleaner's scheduling app — bookings are manually transferred by the founder. The Family Health Report is generated as a PDF by one team member and emailed by another, introducing a 24–72 hour delay that varies unpredictably. When a customer calls with a question about their report, the support team does not have access to the customer's service history or their specific report data — every call starts from scratch. A customer who books through the website and follows up by email is treated as two separate interactions. No channel knows what the others have communicated. The silos are invisible to the teams but immediately apparent to any customer who crosses a channel boundary.

Score: +1 to +2 (Developing)
Green Clean has connected the booking system to the cleaner's route app — scheduling is now automated. The Family Health Report is generated and emailed automatically within 6 hours of service completion. A customer CRM has been introduced: all booking, service, and communication history is accessible to the support team when a customer calls. But the research channel (website) still operates independently — prospects who spend time researching on the website and then book are not identified as the same person until after the booking is made, meaning the website-to-booking conversion cannot be tracked and the research journey cannot be improved with data. The referral mechanic is manual — the team asks existing customers to refer but has no digital system to track referrals or reward them efficiently. Orchestration has improved significantly but is not yet complete.

Score: +2 to +3 (Strong)
Green Clean's channels are fully orchestrated around the customer journey, not around internal team structures. The website research behaviour is tracked — customers who read the formula science page before booking convert at a higher rate, so that content is featured prominently in the booking flow. Booking, service, health report, follow-up communication, and subscription renewal are all automated and connected through a single customer record. Support staff see full service history, report data, and communication history before responding to any contact. The referral mechanic is digital — existing customers receive a referral link after every service and can track whether their referrals booked. Channel performance is measured per moment: website conversion rate, booking completion rate, Health Report open rate, support resolution time, referral conversion rate. Each metric corresponds to a specific channel at a specific journey stage. The orchestration is visible in the data: channel handoffs produce no drop-off in conversion that would indicate a silo.

Connected dimensions

Channels does not operate in isolation. Four dimensions connect most directly:

  • 240 — Visual Identity: Channels must carry visual identity consistently. A customer encountering the brand on Instagram, the website, the booking confirmation email, and the physical cleaner's uniform should see a coherent identity at every touchpoint. Channel proliferation without visual governance produces brand fragmentation.

  • 410 — Moments: Channels serve specific moments. The channel strategy is only as good as the moments map underneath it. Without knowing which moments require which types of interaction, channel decisions are made by habit (we've always had a phone line) rather than by design (this moment requires human contact).

  • 420 — Experience: Experience quality depends on channel execution. Channel inconsistency is one of the most common causes of experience variance — customers receive different responses from different channels because the channels are not coordinated. A +2 on Experience requires channel orchestration as a prerequisite.

  • 530 — Media: Media and channels overlap in digital contexts. Paid media, social media, email, and owned content all function as channels at the research and awareness stages. The boundary between Media (530) and Channels (430) is context: Media drives reach and awareness; Channels deliver the interaction and transaction. They share infrastructure and must be planned together.

Conclusion

Channels is the infrastructure dimension of the Journey meta-category. It does not generate the value proposition, design the experience, or create the magic. It delivers all of those things to the customer — or fails to.

The distinction that matters for scoring is not how many channels the brand has. It is whether those channels form a coherent system. A well-orchestrated system of three channels outscores a fragmented system of eight. The customer's perspective is binary: either the journey is seamless across channels, or it is not.

Channel failure is rarely dramatic. It does not produce a single terrible interaction. It produces accumulating friction — the customer who has to re-explain their situation to every channel they touch, the research that doesn't convert because the booking flow is on a different system, the report that arrives three days late because two teams aren't connected. Each incident is minor. The cumulative effect on acquisition, experience, and retention is material.

Sources

  1. Forrester Research, "The State of Omnichannel Commerce", Forrester, 2024 — forrester.com

  2. McKinsey & Company, "The value of getting personalisation right — or wrong — is multiplying", McKinsey, 2021 — mckinsey.com

  3. Marketing Canvas Method, Appendix E — Dimension 430: Channels, Laurent Bouty, 2026

About this dimension

Dimension 430 — Channels is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Journey - Channels by Laurent Bouty


Marketing Canvas - Experience

Experience is a Fatal Brake for three archetypes. In every case the mechanism is the same: experience failure is the proximate cause of churn. Dimension 420 of the Marketing Canvas scores consistency — not brilliance — and explains why "leaving nothing to chance" is a scored criterion, not an aspiration.

About the Marketing Canvas Method

This article covers dimension 420 — Experience, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Experience (dimension 420) scores the brand's answer to every moment in the customer journey. Where Moments (410) maps what the customer thinks, feels, and does, Experience scores how well the company responds. Does the response reflect the customer's identity? Does it help them achieve their objectives? Is it consistent across time and space? Does it meet the expectations it sets?

The canonical question is not "do we create exceptional experiences?" It is: what is it actually like to be your customer?

In the Marketing Canvas, Experience sits within the Journey meta-category alongside Moments (410), Channels (430), and Magic (440). It is the most frequent Fatal Brake in the method — tied with Positioning (220) and Features (310) at three archetypes each. In every case, the mechanism is the same: experience failure is the proximate cause of churn.

Consistency over brilliance: the canonical insight

The most common Experience scoring error in workshops is confusing it with Magic (440). Experience is not about peak moments or memorable impressions. It is about baseline consistency.

A single brilliant experience surrounded by mediocre ones creates more frustration than consistent adequacy. The customer remembers the gap between the peak and the norm. A hotel that provides an extraordinary check-in and then loses the luggage has not delivered a good experience — it has demonstrated that brilliance is accidental and failure is structural.

Experience design is less about creating memorable highs than about eliminating the lows and ensuring reliability. Every touchpoint should be intentional. Every response should be consistent. The design question is not "how do we create moments that wow?" — that is Magic. The design question is "how do we ensure that every single interaction reflects the promise, regardless of which team member delivers it, which channel it occurs on, or which day of the week it is?"

This is why sub-question 423 scores: "For each moment, your brand answer is consistent in time and space, leaving nothing to chance." Leaving nothing to chance is not a phrase about aspiration. It is a scored criterion. Every undesigned moment is a moment where the brand's promise is undefended — delivered differently by different people, interpreted differently by different teams, experienced differently by different customers.

Score negative if customer experience varies unpredictably across touchpoints, teams, or time. Score positive when experience design is intentional, documented, trained, and measured — and when customers describe the experience using the same words the brand intends.

Experience vs. Magic: the critical distinction

These two dimensions are adjacent and routinely conflated. The confusion produces inflated Experience scores and underinvested Magic strategies.

Experience (420) scores the consistent baseline. Does every customer, in every interaction, receive a response that reflects their identity, serves their goals, and meets the expectations that were set? Consistency is the standard. A score of +2 on Experience means: every moment has a designed response, that response is reliably delivered, and customers confirm it matches their expectations.

Magic (440) scores the peaks. Does the brand exceed expectations in ways customers didn't anticipate? Magic is the surprise that converts a satisfied customer into an advocate. It is scored separately because it requires a different design discipline — not reliability engineering but expectation mapping and strategic over-delivery.

The sequencing principle: fix Experience before investing in Magic. A brand with a −1 on Experience that invests in Magic initiatives is adding peaks to an unreliable baseline. Customers who encounter magic in one interaction and inconsistency in the next do not become advocates. They become confused — and confusion is the precursor to churn.

B2B Experience: the seams are felt

In B2C, Experience failure is visible and dramatic: the wrong product delivered, the rude support call, the website that crashes at checkout. In B2B, Experience failure is quieter and more expensive.

NTT Data's Experience challenge was not a single bad project. It was organisational inconsistency across post-merger engagement models. Different teams, acquired through different M&A paths, delivering different service standards under the same brand name. The client could feel the seams — the inconsistency between what the sales team promised and what the delivery team delivered, between what one regional office did and what another understood the engagement model to be.

B2B clients do not churn after one bad interaction. They churn after accumulating evidence that the inconsistency is structural rather than situational. The moment a client forms the belief "this isn't a bad week, this is how they operate" — the renewal conversation has already been lost. The revenue metric confirms it six months later.

For B2B service businesses, Experience design means: what does a client encounter at every stage of the engagement, regardless of which team member they are talking to? The standard is not the best delivery manager on staff. It is the minimum consistent standard that can be trained, documented, and reliably reproduced.

Experience in the Marketing Canvas

The canonical question

What is it actually like to be your customer?

Experience is a Fatal Brake for three archetypes — tied with Positioning (220) and Features (310) for the most Fatal Brake appearances of any single dimension:

Fatal Brake for A4 (Stagnant Leader): Experience failure is the proximate cause of stagnation. The canonical A4 pattern: churn rises, leadership reaches for Acquisition to refill the bucket. The method says fix the leak first. For Sage in 2019, fragmented UX across dozens of legacy SKUs and desktop-era screens was driving customers to Xero and QuickBooks before the retention team even knew they were at risk. No acquisition investment can compensate for an experience that is actively driving customers away. Experience must reach ≥+2 before any other A4 investment makes strategic sense.

Fatal Brake for A6 (Value Harvester): A company extracting maximum cash flow from an existing base depends entirely on retention. Every 1% of churn that Experience failure generates is a permanent reduction in the cash extraction potential. For A6, Experience is not a growth lever — it is a defensive necessity. The floor below which the strategy collapses.

Fatal Brake for A7 (Scale-Up Guardian): Hypergrowth destroys experience consistency. Teams grow faster than onboarding can standardise behaviour. Processes built for 50 customers break at 500. The individual attention that defined early relationships becomes structurally impossible at scale. The Scale-Up Guardian's primary Experience challenge is not improving the experience — it is preserving the experience as headcount and customer volume compound. Every month of growth without experience governance is a month of promise dilution.

Secondary Brake for A2 (Efficiency Machine): For the Efficiency Machine, Experience operates at the operational level. Magic (440) is the adjacent dimension that eliminates friction entirely; Experience sets the floor below which efficiency becomes indistinguishable from indifference. A cost-leader that delivers a genuinely frictionless experience retains customers. A cost-leader that delivers a degraded experience loses them to whichever competitor can match the price with marginally better service.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

Self-assessment — Experience (sub-questions 421–425)

The dimension score is the average of the five sub-scores, rounded to the nearest whole number.

  • 421 — For each moment, your brand answer has been adapted to your customers' identity.

  • 422 — For each moment, your brand answer has helped customers to achieve their goals.

  • 423 — For each moment, your brand answer is consistent in time and space, leaving nothing to chance.

  • 424 — For each moment, your brand answer has clear expectations and delivers them consistently.

  • 425 — For each moment, your brand answer is compatible with the concept of sustainability.

Brake verdict (dimension 420): "My Experience is a Brake — my customer experience is inconsistent across touchpoints, teams, or time. It is not helping me achieve my goals."

Accelerator verdict (dimension 420): "My Experience is an Accelerator — my customer experience is intentional, consistent, and reliably delivered at every touchpoint, leaving nothing to chance. It is helping me achieve my goals."

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."
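The Detailed Track arithmetic can be sketched in a few lines. This is an illustrative helper, not part of the method's tooling; note one labeled assumption — the text only defines the exact-zero case, so this sketch treats a near-zero average that rounds to zero the same way:

```python
def dimension_score(sub_scores):
    """Average five sub-scores (each -3..+3, no zero allowed)
    into one dimension score.

    Rounds to the nearest whole number. An average of exactly zero
    is forced to -1, per the method's split-score rule. ASSUMPTION:
    an average that merely *rounds* to zero (e.g. 0.2) is treated
    the same way, since the text defines only the exact-zero case.
    """
    if len(sub_scores) != 5:
        raise ValueError("expected five sub-scores (421-425)")
    if any(s == 0 or not -3 <= s <= 3 for s in sub_scores):
        raise ValueError("each sub-score must be -3..+3, excluding 0")
    avg = sum(sub_scores) / 5
    # Sums of five integers divided by 5 land on 0.2 steps,
    # so .5 rounding ties cannot occur here.
    score = round(avg)
    return -1 if score == 0 else score
```

For example, sub-scores of (1, 1, 1, −1, −2) average to exactly zero and therefore score −1 rather than a neutral result — the split score triggers the same investigation as a negative one.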

Interpreting your scores

Negative scores (−1 to −3): Experience varies unpredictably across touchpoints, teams, or time. The brand promise is undefended in at least some interactions. For archetypes where Experience is a Fatal Brake, this score explains why churn is rising and retention investment is not working. The leaky bucket cannot be fixed by adding more acquisition — it must be fixed at the experience level first.

Positive scores (+1 to +3): Experience is intentional, documented, trained, and measured. Every moment has a designed response. Customers describe the experience in consistent language that matches the brand's intended positioning. The baseline is reliable. Magic (440) initiatives can now be layered on top of a consistent foundation rather than compensating for an inconsistent one.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak) Green Clean's experience varies significantly by team member and visit. The two full-time cleaners operate consistently. The three part-time contractors, hired during a growth period, have had no structured onboarding and no shared standard for what a Green Clean visit should look and feel like. Some customers receive a verbal explanation of the formula used; others do not. Some receive the Family Health Report within 24 hours; others wait three days or receive it after a follow-up request. When a customer calls to ask about an ingredient, the response depends on which team member picks up. The experience is sometimes excellent and frequently adequate — but it is never reliably consistent. When the founder asks customers how the experience compares to EcoPure, the feedback is mixed: "better sometimes, comparable usually." That is a −1: experience is not reliably reflecting the positioning.

Score: +1 to +2 (Developing) Green Clean has identified the three highest-variance touchpoints from customer research: the onboarding call, the first-service visit, and the Family Health Report delivery. For each, a standard has been designed and documented. Contractors are trained on the first-service protocol. The Health Report is now automated — delivered within 6 hours of every service completion without requiring manual action. The onboarding call has a structured agenda that ensures the health-first positioning is explained consistently regardless of who conducts it. Variance has reduced but not eliminated — the support interaction (what happens when a customer reports a concern) remains undesigned and inconsistent. Positive customer descriptions of the experience are converging on consistent language: "professional," "trustworthy," "actually explains what they're doing." The experience baseline is improving. It is not yet reliable enough to score +2.

Score: +2 to +3 (Strong) Every Green Clean customer touchpoint has a designed response, documented standard, and trained delivery. The experience is consistent whether the customer is in their first month or their third year, whether they call on a Monday or a Saturday, whether their regular cleaner is available or a substitute is deployed. When a substitute is required, the customer receives a proactive message explaining the change and confirming the substitute has been briefed on the household profile. Support interactions follow a structured resolution protocol — concern acknowledged within 2 hours, resolution proposed within 24 hours, follow-up confirmed within 48 hours. Customer descriptions of the experience use consistent language unprompted: "they always explain what they've done," "I never have to chase anything," "it's the same standard every time." The NPS promoter cohort grew from 38% to 62% between 2021 and 2024 — a direct consequence of experience consistency, not product change.

Connected dimensions

Experience does not operate in isolation. Four dimensions connect most directly:

  • 410 — Moments: Experience responds to moments. Every Experience initiative traces back to a specific mapped moment where the current response is inadequate. Without a complete Moments map, Experience improvements are directional guesses — improving the wrong touchpoints while leaving the highest-variance ones unaddressed.

  • 130 — Pains & Gains: Experience design eliminates pains. The specific pains identified in journey research — the ones that accumulate into churn — are the Experience design brief. A pain at the research phase is an Experience problem in the before stage. A pain at the support interaction is an Experience problem in the after stage.

  • 440 — Magic: Magic elevates experience beyond consistency. Once the baseline is reliable, Magic creates the peaks that generate advocacy. The sequencing is fixed: fix Experience first, then invest in Magic. A +2 on Experience is the prerequisite for Magic initiatives to work as intended.

  • 630 — Lifetime: Experience quality predicts customer lifetime. The most reliable predictor of whether a customer will still be a customer in 12 months is whether their ongoing experience is consistently meeting the promise. Experience is not just a satisfaction metric — it is the leading indicator of lifetime value.

Conclusion

Experience is tied as the most frequent Fatal Brake in the Marketing Canvas Method for a straightforward reason: it is the dimension that most directly connects to churn. Customers do not leave because of a single terrible interaction. They leave because the cumulative experience does not consistently reflect the promise that acquired them.

The strategic diagnostic is not "how good is our best experience?" — teams consistently overrate on this question because they remember peaks and discount inconsistency. The question is: "what does every customer encounter, every time, regardless of team member, channel, or day of the week?"

If the honest answer is "it depends" — dimension 420 is the initiative queue.

Sources

  1. Matt Watkinson, The Ten Principles Behind Great Customer Experiences, FT Publishing, 2013

  2. Bain & Company, "Closing the Delivery Gap", 2005 — bain.com (the foundational 80/8 gap research: 80% of companies believe they deliver a superior experience; 8% of customers agree)

  3. Marketing Canvas Method, Appendix E — Dimension 420: Experience, Laurent Bouty, 2026

About this dimension

Dimension 420 — Experience is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Journey - Experience
