BLOG

A collection of articles and ideas that help Smart Marketers become Smarter

Marketing Canvas, marketingcanvas.net · Laurent Bouty

Marketing Canvas - Magic

Satisfaction keeps customers. Magic turns them into advocates. Dimension 440 of the Marketing Canvas scores four components — effortless, stress-free, sensory pleasure, and social pleasure — and explains why exceeding expectations on something the customer doesn't care about isn't magic, it's waste.

About the Marketing Canvas Method

This article covers dimension 440 — Magic, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Magic (dimension 440) scores whether your brand exceeds expectations in ways customers didn't anticipate. Not satisfaction — that is delivering what was promised. Not quality — that is consistency. Magic is the surprise that transforms a satisfied customer into an active advocate.

The most important design principle: exceeding expectations on something the customer doesn't care about isn't magic. It's waste. Magic requires knowing what the customer expects — and then strategically exceeding it at the moment that matters most.

In the Marketing Canvas, Magic sits within the Journey meta-category alongside Moments (410), Experience (420), and Channels (430). It is the peak layer — the dimension that elevates a reliable experience into one customers feel compelled to describe to others. Experience (420) sets the baseline. Magic (440) creates the highs above it.

Magic vs. Experience: the critical distinction

This is the most important conceptual clarification in dimension 440, and the one most commonly missed in workshops.

Experience (420) scores the consistent baseline — whether every customer, in every interaction, receives a response that is intentional, reliable, and meets expectations. Consistency is the standard. A strong Experience score means: nothing is left to chance, the brand's promise is defended at every touchpoint.

Magic (440) scores the peaks — the unexpected moments that exceed what the customer anticipated and produce the emotional response that generates advocacy. Magic is not consistent by definition. It is strategic and selective — designed to occur at the specific moments where the surprise will have the highest impact.

The sequencing rule: fix Experience before investing in Magic. A brand with a −1 on Experience that invests in Magic initiatives is adding peaks to an unreliable baseline. Customers who encounter magic at one touchpoint and inconsistency at another do not become advocates. They become confused — and confusion precedes churn, not advocacy.

Score negative if the customer journey is functional but unremarkable, or if it creates friction the company hasn't noticed. Score positive when specific moments are designed to exceed expectations and customers spontaneously share those moments with others.

The four components of Magic

The Marketing Canvas breaks Magic into four scored components — each addressing a different dimension of the unexpected experience:

Effortless (441) — obstacles removed. The customer expects friction; they encounter none. The booking that takes 30 seconds when they budgeted 5 minutes. The form that pre-fills from their previous interaction. The return process that requires no explanation because the system already knows why. Effortlessness is the absence of friction the customer had learned to expect. It is magical precisely because the absence is unexpected — the category has trained customers to tolerate effort, and the brand has made it disappear.

Stress-free (442) — confusion, uncertainty, and anxiety eliminated. The customer expects to worry about something; they find there is nothing to worry about. The ambiguous delivery window that turns into real-time location tracking. The ingredient claim that is accompanied by independent verification rather than asking the customer to trust. The post-service question that is answered before it was asked. Stress-free magic is the proactive removal of cognitive load — the brand doing the worrying so the customer does not have to.

Sensory pleasure (443) — delight through sight, touch, sound, taste, or smell. In consumer markets this is the Apple unboxing, the Hermès ribbon, the hotel that remembers a pillow preference. The experience engages the senses in a way that exceeds the purely functional expectation. In service contexts, sensory pleasure appears in the aesthetics of a delivered report, the warmth of an unexpected handwritten note, the packaging that communicates care before a word is read.

Social pleasure (444) — status elevation. The customer encounters the brand in a way that makes them feel recognised, celebrated, or elevated in front of others. The loyalty recognition at a hotel check-in that happens in front of other guests. The personalised annual impact report that the customer shows to friends because it makes them look like someone who has made a difference. The referral confirmation that acknowledges the customer as a trusted advisor to their network. Social pleasure magic is the brand giving the customer a story they want to tell.

B2B Magic: cognitive, not sensory

In consumer markets, Magic is often sensory — the unboxing, the ribbon, the pillow preference. In B2B, Magic is cognitive: the insight the client didn't ask for, the risk flagged before it became a problem, the deliverable completed three weeks early without explanation.

The NTT Data case illustrates the distinction. B2B Magic isn't about delight in the consumer sense. It is about demonstrating competence so completely and proactively that the client forms the belief: "this is a genuine partner, not just a vendor." That belief is the B2B equivalent of advocacy — the CTO who mentions the vendor by name at an industry conference, the COO who recommends the firm without being asked, the procurement lead who shortcuts the RFP process because they already know who they want.

The B2B Magic design question: where in this engagement does the client expect reasonable competence — and where could we deliver something so far ahead of expectation that it changes the nature of the relationship?

Spotify's Discover Weekly is the canonical example of consumer-facing Magic that operates on a cognitive principle: the algorithm's ability to surface music the user didn't know they wanted, at the moment they most want it. Not sensory delight. Cognitive surprise. The user's reaction — "how does it know?" — is the Magic response. It drove measurable retention improvement, which is the commercial test of whether Magic is working.

Magic in the Marketing Canvas

The canonical question

Where do you exceed expectations in ways customers didn't see coming?

Magic appears in the Vital 8 of four archetypes — in five distinct strategic roles:

Fatal Brake for A7 (Scale-Up Guardian): Hypergrowth tends to destroy the exceptional experiences that created growth in the first place. The early customers of a high-growth brand experienced something that felt personal, attentive, and unexpectedly good — because the team was small, the founder was involved, and every interaction was high-touch. As the company scales, processes replace people, automation replaces attention, and the magic that converted early adopters into evangelists disappears into a standardised service. For A7, Magic is a Fatal Brake because losing it is the mechanism through which growth erodes the advocacy that funded growth. It must reach ≥+2 before hypergrowth investment can be sustained.

Primary Accelerator for A2 (Efficiency Machine): For the Efficiency Machine, Magic means the customer barely notices the transaction happened. The 25-minute Ryanair turnaround. The Amazon checkout that requires one click. The banking app that reconciles the account before the customer closes the browser. In A2, operational magic is not sensory delight — it is the complete removal of the customer's effort. The customer doesn't tell a story about the experience; they tell a story about the absence of one. "I barely had to do anything" is the A2 Magic response.

Secondary Brake for A6 (Value Harvester): A Value Harvester extracting maximum cash flow from an existing base must maintain enough magic to prevent the churn that would otherwise accelerate as the product matures. Magic maintenance for A6 is defensive — enough unexpected value to remind customers why they stay, even as the brand optimises for margin rather than growth.

Secondary Accelerator for A4 (Stagnant Leader): For a stagnant leader fighting churn, Magic initiatives provide the proof of renewal that keeps the existing base engaged while Experience (420) and Features (310) are being rebuilt. A single well-designed magical moment — the AI-powered feature that anticipates the user's next action, the proactive support contact that prevents a problem before it occurs — signals that the brand is still invested in the relationship.

Growth Driver for A6: For the Value Harvester, Magic initiatives that generate advocacy are a low-cost acquisition mechanism that complements the margin extraction strategy. Existing customers who experience unexpected delight become the most credible referral source for the next customer cohort.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. You have identified obstacles across your customer journey and reduced them (effortless).

  2. You have eliminated confusion, uncertainty, and anxiety across your customer journey (stress-free).

  3. You have delighted the senses of your customers — every customer seeks sensory pleasure (sensory pleasure).

  4. You have provided a customer experience that elevates your customers' status (social pleasure).

  5. You have reduced the social and environmental impact while making sustainable moments magical.

(Dimensions 441–444 + 445 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."
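The scoring rule above can be sketched in a few lines. This is an illustrative sketch only — the function name and example scores are assumptions; the one rule taken from the text is that a mathematical zero rounds to −1.

```python
def dimension_score(sub_scores):
    """Average sub-question scores (each -3..+3).

    A mathematical zero rounds to -1, because a split score means the
    dimension is 'not clearly helping' and gets treated as a drag.
    """
    avg = sum(sub_scores) / len(sub_scores)
    if avg == 0:
        return -1  # the method's zero-rounds-down rule
    return avg

print(dimension_score([2, -2]))    # split score -> -1
print(dimension_score([1, 2, 3]))  # -> 2.0
```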

Interpreting your scores

Negative scores (−1 to −3): The customer journey is functional but unremarkable. There are no designed moments of unexpected delight. Customers are satisfied but not moved to advocate. Worse: friction and anxiety may exist that the team hasn't noticed because nobody has mapped the journey from the customer's perspective. For A7, a negative score here explains why growth is eroding the advocacy that created it.

Positive scores (+1 to +3): Specific moments are designed to exceed expectations across one or more of the four components. Customers spontaneously share those moments with others — in conversation, in reviews, in referrals. Magic is functioning as the advocacy generation mechanism: not all customers experience it, but the ones who do become the brand's most effective acquisition channel.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak) Green Clean's customer journey is functional and unremarkable. The booking works. The cleaner arrives. The cleaning is done. But nothing about the interaction exceeds what a customer would expect from a competent cleaning service. There are no designed moments of effortlessness — the booking process requires four steps that could be two. There is no stress removal — customers who want to verify what products were used have to ask, and the answer varies by team member. There is no sensory pleasure — the cleaner leaves without any communication, the invoice arrives two days later as a plain text email. There is no social pleasure — the service produces no story the customer would want to share. When existing customers describe the service, they use words like "reliable" and "good" — the language of satisfied, disengaged customers rather than active advocates.

Score: +1 to +2 (Developing) Green Clean has introduced two designed Magic moments. First: the Family Health Report arrives within 6 hours of service completion — a specific, data-rich document that no competitor provides and that customers describe as "not what I expected" when they receive it for the first time. This addresses the stress-free component: customers who would have worried about whether the claims are real now have evidence without asking for it. Second: on the third service, customers receive a personalised summary of their cumulative impact — how many service visits, how many households protected from chemical exposure, how much waste has not been generated. This addresses social pleasure: customers who care about environmental responsibility have a number they can share. These two moments are working — the referral rate has started to climb. But the effortless and sensory pleasure components remain undesigned.

Score: +2 to +3 (Strong) Green Clean has designed Magic moments across all four components. Effortless: the booking takes 90 seconds on mobile, with address pre-filled and service preferences remembered. Scheduling confirmation and reminder are automatic. Stress-free: the Family Health Report arrives within 6 hours with a plain-language explanation of what was found and eliminated. Customers never have to ask. Sensory pleasure: the cleaner leaves a handwritten note summarising what was done in this specific home, with one personalised observation (a comment on the kitchen herbs, a note about the child's artwork visible from the bathroom). The note costs 3 minutes and generates more customer responses than any other touchpoint. Social pleasure: the annual impact statement — "Your household prevented 42kg of chemical exposure in 2024" — is designed as a shareable card with Green Clean's visual identity. 23% of customers share it on social media or forward it to friends. The referral rate reached 35% by 2024. Customers do not describe the service as "good." They describe specific moments that changed how they think about what a cleaning service can be.

Connected dimensions

Magic does not operate in isolation. Four dimensions connect most directly:

  • 130 — Pains & Gains: Magic eliminates pains and creates unexpected gains. The pain map is the source material for effortless and stress-free Magic design. When a pain is eliminated so completely that the customer barely registers its absence, that is effortless Magic. When a gain exceeds what the customer expected, that is the raw material of the sensory and social pleasure components.

  • 420 — Experience: Magic elevates experience beyond consistency. Experience (420) sets the reliable baseline. Magic (440) creates the moments above it. The two dimensions work in sequence: without a consistent Experience baseline, Magic investments are undermined by the inconsistency that surrounds them.

  • 320 — Emotions: Magic creates peak emotional moments. The surprise that generates advocacy is an emotional event — the "I didn't expect that" feeling that produces the story worth telling. Magic moments are the designed delivery mechanism for peak emotional benefits.

  • 140 — Engagement: Magic drives engagement and advocacy. A customer who has experienced a designed Magic moment is more likely to be a promoter on the NPS scale, more likely to refer, and more likely to provide feedback. Magic is the upstream cause; Engagement (140) measures the downstream effect.

Conclusion

Magic is the dimension that answers the question most brands cannot: why do some customers become advocates when others merely stay?

The answer is not product quality. Quality is expected. It is not service consistency. Consistency is the baseline. It is the specific, unexpected moment that exceeds what the customer had learned to anticipate — the report that arrives before they asked, the note that references their home specifically, the status recognition that makes them feel seen.

The design principle that separates effective Magic from wasted investment: it must exceed expectations on something the customer actually cares about. The hotel that remembers a pillow preference is Magic because sleep quality matters. The hotel that provides a turndown chocolate to a customer who explicitly avoids sugar has produced an interaction, not a magic moment.

Knowing what customers expect — and where exceeding it will produce the highest advocacy response — is the work. The four components (effortless, stress-free, sensory pleasure, social pleasure) provide the framework. The Moments map (410) and the Pains & Gains research (130) provide the evidence. Together, they produce the design brief for Magic initiatives that convert satisfied customers into advocates.

Sources

  1. Chip Heath, Dan Heath, The Power of Moments: Why Certain Experiences Have Extraordinary Impact, Simon & Schuster, 2017

  2. Matt Watkinson, The Ten Principles Behind Great Customer Experiences, FT Publishing, 2013

  3. Marketing Canvas Method, Appendix E — Dimension 440: Magic, Laurent Bouty, 2026

About this dimension

Dimension 440 — Magic is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.



Marketing Canvas - Channels

Most companies have channels. Few have orchestrated channels. Dimension 430 of the Marketing Canvas scores the difference — and explains why a brand with three connected channels outperforms one with eight siloed ones.

About the Marketing Canvas Method

This article covers dimension 430 — Channels, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Channels (dimension 430) scores how customers interact with your brand — physical and digital, owned and third-party — and whether those interactions form a seamless, coherent experience across all of them.

The canonical distinction that defines this dimension: most companies have channels. Few have orchestrated channels. The score measures orchestration, not presence.

A brand with a website, a mobile app, a social media presence, a phone line, and a field team is not necessarily scoring well on dimension 430. The question is whether those channels work together without silos — whether a customer who starts research on one channel can complete the journey seamlessly on another, and whether the company can track and serve that customer across the transition.

In the Marketing Canvas, Channels sits within the Journey meta-category alongside Moments (410), Experience (420), and Magic (440). It is the delivery infrastructure — the system that ensures every moment designed in 410 is actually accessible to the customer in the format that serves them best.

Presence vs. orchestration: the canonical distinction

Every company has channels. Most companies have more channels than they have resources to maintain well. The channel list is not the dimension. The orchestration of that list is.

The test is a single customer journey across multiple channels. A customer who discovers Green Clean through a health-and-parenting blog, visits the website to research the formula, emails a question about ingredient safety, books a service via the app, receives the Family Health Report by email, and calls to ask about a recurring subscription — has touched five channels. If the experience is continuous (the phone call picks up where the booking left off; the subscription question doesn't require re-explaining the service model), the channels are orchestrated. If each channel treats the customer as a stranger, the channels exist but are not orchestrated.

The canonical four properties that define orchestrated channels:

Context (431) — can customers use the most relevant channel for their specific situation at each moment? A customer researching a service in the evening needs findable, credible content on the web. A customer mid-service with a question needs an immediate human response. A customer reviewing their health report at midnight needs a digital self-service interface. The same channel cannot serve all three moments well.

Interaction quality (432) — do channels provide clear, personalised, seamless interactions? Quality here means the interaction is adapted to the customer's identity and context — not generic, not one-size-fits-all, not a copy-paste template.

Information consistency (433) — is data consistent and real-time across channels? A customer who updates their household profile in the app should not have to re-state it on the phone. A booking made on the website should be visible to the cleaner on their route app. Inconsistency in data across channels is the most common channel orchestration failure — and the most invisible to the teams building the channels, who each see only their own system.

Orchestration (434) — are channels connected so customers can navigate seamlessly between them with no silos? This is the composite test: does the company have a joined-up view of the customer's journey, or does each channel operate as a separate interaction with no shared memory?
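The four properties converge on one structural idea: every channel reads from and writes to a single shared customer record, so no interaction starts from scratch. The sketch below illustrates that idea under stated assumptions — the class, field, and method names are invented for illustration and are not part of the Marketing Canvas Method.

```python
from dataclasses import dataclass, field

@dataclass
class CustomerRecord:
    """One shared record per customer — the 'joined-up view' of the journey."""
    customer_id: str
    history: list = field(default_factory=list)  # visible to every channel

    def touch(self, channel: str, event: str):
        """Record an interaction so the next channel inherits full context."""
        self.history.append((channel, event))

    def context_for(self, channel: str):
        """Any channel sees the whole journey, not just its own events."""
        return list(self.history)

record = CustomerRecord("cust-001")
record.touch("web", "researched formula page")
record.touch("app", "booked service")
# The phone channel picks up where the booking left off — no re-explaining:
print(record.context_for("phone"))
# -> [('web', 'researched formula page'), ('app', 'booked service')]
```

The siloed alternative is each channel keeping its own history, which is exactly the "every call starts from scratch" failure described in the weak-score case below.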

Digital, physical, and moment-driven channel design

The channel strategy question is not "should we be digital or physical?" Every customer journey involves both. The question is: which channel serves each moment best?

A purely digital company that ignores physical moments — the cleaner arriving at the door, the unboxing experience, the in-person explanation of a result — misses the touchpoints where trust is built or lost at the highest intensity. Physical moments carry emotional weight that digital channels cannot replicate.

A traditional service business that treats digital as a secondary channel — the website as an online brochure, the email as a support afterthought — loses the pre-purchase research phase entirely. Customers research digitally before they commit physically. Winning the digital research moment is often what determines whether the physical visit ever happens.

The best channel strategies design each moment to use the channel that serves the customer best:

  • The research moment needs findable, credible digital content

  • The booking moment needs a frictionless digital transaction

  • The service delivery moment needs a reliable physical interaction

  • The result delivery moment needs a clear digital report with optional human follow-up

  • The renewal moment needs a proactive, low-friction digital prompt

Designing channels from moments is the inversion of the default approach (designing moments around the channels that already exist). The default produces a channel strategy. The inversion produces an orchestrated journey.

Channels in the Marketing Canvas

The canonical question

Can customers interact with your brand through the channels they prefer, with a seamless experience across all of them?

Channels appears in the Vital 8 of two archetypes — in notably different roles:

Secondary Brake for A1 (Disruptive Newcomer): A disruptor's survival depends on being noticed and understood immediately. Features and positioning may be compelling, but if the channels through which the target customer discovers and evaluates the brand are wrong or incomplete, the disruption never reaches beyond the early-adopter bubble. Channel failure for A1 is quiet: the product is ready, the message is sharp, but the distribution infrastructure isn't present where the customers are. As a Secondary Brake, the score must be raised to ≥+1 — below that threshold, channel failure limits the reach of the disruption.

Secondary Accelerator for A5 (Pivot Pioneer): A company executing a strategic pivot may find that its existing channels were optimised for the old positioning and the old customer segment. The new direction — new JTBD, new lead segment, new positioning — may require new channels entirely. Legacy channels that served the old strategy are not neutral for the pivot; they actively signal the old identity to customers encountering the brand for the first time in the new context. For A5, channel strategy is part of the repositioning work, not a downstream execution decision.

A note on Fatal Brakes: Channels does not appear as a Fatal Brake in any archetype. But channel failure can block the dimensions that are Fatal. If Acquisition (610) is a Fatal Brake and channel orchestration failures are increasing CAC, the channel problem is a Fatal Brake problem in disguise. If Experience (420) is a Fatal Brake and channel inconsistency is producing the experience variance, the same applies. Channels is the infrastructure. Infrastructure failures propagate upward.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. Your customers can use the channel most relevant to their specific context at each moment.

  2. Your channels are physical and digital — you provide clear, personalised, and seamless interactions, anywhere, anytime.

  3. Information captured or shared in your channels is consistent, real-time, personalised, useful, and accurate.

  4. You have orchestrated all your channels — there is no silo between them, and customers can navigate seamlessly through them at each moment.

  5. You optimise the social and environmental impact of your physical and digital channels.

(Dimensions 431–434 + 435 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."

Interpreting your scores

Negative scores (−1 to −3): Channels operate in silos. Customers who cross channel boundaries encounter a brand that does not recognise them. Orchestration is absent or incomplete. The likely downstream effect: acquisition costs are higher than they need to be (research-to-booking friction), experience scores are lower than designed (channel handoff failures), and engagement data is fragmented (no joined-up view of customer behaviour).

Positive scores (+1 to +3): Channels are orchestrated. Customers move between channels without friction. Data is consistent and real-time across the full journey. Each channel is designed for the specific moment it serves. The company can track the customer journey across touchpoints and improve each channel based on measured performance.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method. Green Clean sells a residential service — cleaners visit customer homes — not packaged products. Their relevant channels are: website, booking flow, email, in-home service visit, Family Health Report (digital delivery), phone/chat support, and referral mechanics.

Score: −2 to −1 (Weak) Green Clean's channels are independent systems that do not share data or context. The website takes booking requests but is not connected to the cleaner's scheduling app — bookings are manually transferred by the founder. The Family Health Report is generated as a PDF by one team member and emailed by another, introducing a 24–72 hour delay that varies unpredictably. When a customer calls with a question about their report, the support team does not have access to the customer's service history or their specific report data — every call starts from scratch. A customer who books through the website and follows up by email is treated as two separate interactions. No channel knows what the others have communicated. The silos are invisible to the teams but immediately apparent to any customer who crosses a channel boundary.

Score: +1 to +2 (Developing) Green Clean has connected the booking system to the cleaner's route app — scheduling is now automated. The Family Health Report is generated and emailed automatically within 6 hours of service completion. A customer CRM has been introduced: all booking, service, and communication history is accessible to the support team when a customer calls. But the research channel (website) still operates independently — prospects who spend time researching on the website and then book are not identified as the same person until after the booking is made, meaning the website-to-booking conversion cannot be tracked and the research journey cannot be improved with data. The referral mechanic is manual — the team asks existing customers to refer but has no digital system to track referrals or reward them efficiently. Orchestration has improved significantly but is not yet complete.

Score: +2 to +3 (Strong) Green Clean's channels are fully orchestrated around the customer journey, not around internal team structures. The website research behaviour is tracked — customers who read the formula science page before booking convert at a higher rate, so that content is featured prominently in the booking flow. Booking, service, health report, follow-up communication, and subscription renewal are all automated and connected through a single customer record. Support staff see full service history, report data, and communication history before responding to any contact. The referral mechanic is digital — existing customers receive a referral link after every service and can track whether their referrals booked. Channel performance is measured per moment: website conversion rate, booking completion rate, Health Report open rate, support resolution time, referral conversion rate. Each metric corresponds to a specific channel at a specific journey stage. The orchestration is visible in the data: channel handoffs produce no drop-off in conversion that would indicate a silo.
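The "per moment" measurement described in the strong-score case can be sketched as a handoff check: compute the conversion rate between each consecutive pair of journey stages, and a sharp drop at a handoff suggests a silo. The stage names and counts below are invented example figures, not Green Clean data.

```python
# Illustrative funnel: (journey stage, customers reaching it).
funnel = [
    ("website visit", 1000),
    ("booking started", 420),
    ("booking completed", 380),
    ("report opened", 350),
]

def handoff_conversion(funnel):
    """Conversion rate between each consecutive pair of journey stages."""
    rates = []
    for (stage_a, n_a), (stage_b, n_b) in zip(funnel, funnel[1:]):
        rates.append((f"{stage_a} -> {stage_b}", round(n_b / n_a, 2)))
    return rates

for handoff, rate in handoff_conversion(funnel):
    print(handoff, rate)
```

A rate far below its neighbours at a channel boundary (say, booking started → booking completed) is the quantitative signature of the drop-off the text says orchestration should eliminate.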

Connected dimensions

Channels does not operate in isolation. Four dimensions connect most directly:

  • 240 — Visual Identity: Channels must carry visual identity consistently. A customer encountering the brand on Instagram, the website, the booking confirmation email, and the physical cleaner's uniform should see a coherent identity at every touchpoint. Channel proliferation without visual governance produces brand fragmentation.

  • 410 — Moments: Channels serve specific moments. The channel strategy is only as good as the moments map underneath it. Without knowing which moments require which types of interaction, channel decisions are made by habit (we've always had a phone line) rather than by design (this moment requires human contact).

  • 420 — Experience: Experience quality depends on channel execution. Channel inconsistency is one of the most common causes of experience variance — customers receive different responses from different channels because the channels are not coordinated. A +2 on Experience requires channel orchestration as a prerequisite.

  • 530 — Media: Media and channels overlap in digital contexts. Paid media, social media, email, and owned content all function as channels at the research and awareness stages. The boundary between Media (530) and Channels (430) is context: Media drives reach and awareness; Channels deliver the interaction and transaction. They share infrastructure and must be planned together.

Conclusion

Channels is the infrastructure dimension of the Journey meta-category. It does not generate the value proposition, design the experience, or create the magic. It delivers all of those things to the customer — or fails to.

The distinction that matters for scoring is not how many channels the brand has. It is whether those channels form a coherent system. A well-orchestrated system of three channels outscores a fragmented system of eight. The customer's perspective is binary: either the journey is seamless across channels, or it is not.

Channel failure is rarely dramatic. It does not produce a single terrible interaction. It produces accumulating friction — the customer who has to re-explain their situation to every channel they touch, the research that doesn't convert because the booking flow is on a different system, the report that arrives three days late because two teams aren't connected. Each incident is minor. The cumulative effect on acquisition, experience, and retention is material.

Sources

  1. Forrester Research, "The State of Omnichannel Commerce", Forrester, 2024 — forrester.com

  2. McKinsey & Company, "The value of getting personalisation right — or wrong — is multiplying", McKinsey, 2021 — mckinsey.com

  3. Marketing Canvas Method, Appendix E — Dimension 430: Channels, Laurent Bouty, 2026

About this dimension

Dimension 430 — Channels is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Journey - Channels by Laurent Bouty


Marketing Canvas - Experience

Experience is a Fatal Brake for three archetypes. In every case the mechanism is the same: experience failure is the proximate cause of churn. Dimension 420 of the Marketing Canvas scores consistency — not brilliance — and explains why "leaving nothing to chance" is a scored criterion, not an aspiration.

About the Marketing Canvas Method

This article covers dimension 420 — Experience, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Experience (dimension 420) scores the brand's answer to every moment in the customer journey. Where Moments (410) maps what the customer thinks, feels, and does, Experience scores how well the company responds. Does the response reflect the customer's identity? Does it help them achieve their objectives? Is it consistent across time and space? Does it meet the expectations it sets?

The canonical question is not "do we create exceptional experiences?" It is: what is it actually like to be your customer?

In the Marketing Canvas, Experience sits within the Journey meta-category alongside Moments (410), Channels (430), and Magic (440). It is the most frequent Fatal Brake in the method — tied with Positioning (220) and Features (310) at three archetypes each. In every case, the mechanism is the same: experience failure is the proximate cause of churn.

Consistency over brilliance: the canonical insight

The most common Experience scoring error in workshops is confusing it with Magic (440). Experience is not about peak moments or memorable impressions. It is about baseline consistency.

A single brilliant experience surrounded by mediocre ones creates more frustration than consistent adequacy. The customer remembers the gap between the peak and the norm. A hotel that provides an extraordinary check-in and then loses the luggage has not delivered a good experience — it has demonstrated that brilliance is accidental and failure is structural.

Experience design is less about creating memorable highs than about eliminating the lows and ensuring reliability. Every touchpoint should be intentional. Every response should be consistent. The design question is not "how do we create moments that wow?" — that is Magic. The design question is "how do we ensure that every single interaction reflects the promise, regardless of which team member delivers it, which channel it occurs on, or which day of the week it is?"

This is why sub-question 423 scores: "For each moment, your brand answer is consistent in time and space, leaving nothing to chance." Leaving nothing to chance is not a phrase about aspiration. It is a scored criterion. Every undesigned moment is a moment where the brand's promise is undefended — delivered differently by different people, interpreted differently by different teams, experienced differently by different customers.

Score negative if customer experience varies unpredictably across touchpoints, teams, or time. Score positive when experience design is intentional, documented, trained, and measured — and when customers describe the experience using the same words the brand intends.

Experience vs. Magic: the critical distinction

These two dimensions are adjacent and routinely conflated. The confusion produces inflated Experience scores and underinvested Magic strategies.

Experience (420) scores the consistent baseline. Does every customer, in every interaction, receive a response that reflects their identity, serves their goals, and meets the expectations that were set? Consistency is the standard. A score of +2 on Experience means: every moment has a designed response, that response is reliably delivered, and customers confirm it matches their expectations.

Magic (440) scores the peaks. Does the brand exceed expectations in ways customers didn't anticipate? Magic is the surprise that converts a satisfied customer into an advocate. It is scored separately because it requires a different design discipline — not reliability engineering but expectation mapping and strategic over-delivery.

The sequencing principle: fix Experience before investing in Magic. A brand with a −1 on Experience that invests in Magic initiatives is adding peaks to an unreliable baseline. Customers who encounter magic in one interaction and inconsistency in the next do not become advocates. They become confused — and confusion is the precursor to churn.

B2B Experience: the seams are felt

In B2C, Experience failure is visible and dramatic: the wrong product delivered, the rude support call, the website that crashes at checkout. In B2B, Experience failure is quieter and more expensive.

NTT Data's Experience challenge was not a single bad project. It was organisational inconsistency across post-merger engagement models. Different teams, acquired through different M&A paths, delivering different service standards under the same brand name. The client could feel the seams — the inconsistency between what the sales team promised and what the delivery team delivered, between what one regional office did and what another understood the engagement model to be.

B2B clients do not churn after one bad interaction. They churn after accumulating evidence that the inconsistency is structural rather than situational. The moment a client forms the belief "this isn't a bad week, this is how they operate" — the renewal conversation has already been lost. The revenue metric confirms it six months later.

For B2B service businesses, Experience design means: what does a client encounter at every stage of the engagement, regardless of which team member they are talking to? The standard is not the best delivery manager on staff. It is the minimum consistent standard that can be trained, documented, and reliably reproduced.

Experience in the Marketing Canvas

The canonical question

What is it actually like to be your customer?

Experience is a Fatal Brake for three archetypes — tied with Positioning and Features for the most Fatal Brake appearances of any single dimension:

Fatal Brake for A4 (Stagnant Leader): Experience failure is the proximate cause of stagnation. The canonical A4 pattern: churn rises, leadership reaches for Acquisition to refill the bucket. The method says fix the leak first. For Sage in 2019, fragmented UX across dozens of legacy SKUs and desktop-era screens was driving customers to Xero and QuickBooks before the retention team even knew they were at risk. No acquisition investment can compensate for an experience that is actively driving customers away. Experience must reach ≥+2 before any other A4 investment makes strategic sense.

Fatal Brake for A6 (Value Harvester): A company extracting maximum cash flow from an existing base depends entirely on retention. Every 1% of churn that Experience failure generates is a permanent reduction in the cash extraction potential. For A6, Experience is not a growth lever — it is a defensive necessity. The floor below which the strategy collapses.

Fatal Brake for A7 (Scale-Up Guardian): Hypergrowth destroys experience consistency. Teams grow faster than onboarding can standardise behaviour. Processes built for 50 customers break at 500. The individual attention that defined early relationships becomes structurally impossible at scale. The Scale-Up Guardian's primary Experience challenge is not improving the experience — it is preserving the experience as headcount and customer volume compound. Every month of growth without experience governance is a month of promise dilution.

Secondary Brake for A2 (Efficiency Machine): For the Efficiency Machine, Experience operates at the operational level. Magic (440) is the adjacent dimension that eliminates friction entirely; Experience sets the floor below which efficiency becomes indistinguishable from indifference. A cost-leader that delivers a genuinely frictionless experience retains customers. A cost-leader that delivers a degraded experience loses them to whichever competitor can match the price with marginally better service.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. For each moment, your brand answer has been adapted to your customers' identity.

  2. For each moment, your brand answer has helped customers to achieve their goals.

  3. For each moment, your brand answer is consistent in time and space, leaving nothing to chance.

  4. For each moment, your brand answer has clear expectations and delivers them consistently.

  5. For each moment, your brand answer is compatible with the concept of sustainability.

(Sub-questions 421–425 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."
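The rounding rule above can be expressed as a small function. This is an illustrative sketch, not official Marketing Canvas tooling — the function name and input validation are my own assumptions; only the −3..+3 no-zero scale and the "zero rounds to −1" rule come from the method as described.

```python
def dimension_score(sub_scores):
    """Average Detailed Track sub-question scores for one dimension.

    Each sub-score is an integer on the -3..+3 scale; zero is not a
    valid answer. If the average works out to a mathematical zero,
    the method rounds it to -1: a split score means the dimension is
    not clearly helping, and that is treated as hurting.
    """
    if not sub_scores:
        raise ValueError("at least one sub-question score is required")
    if any(s == 0 or not -3 <= s <= 3 for s in sub_scores):
        raise ValueError("scores must be non-zero integers in -3..+3")
    avg = sum(sub_scores) / len(sub_scores)
    return -1 if avg == 0 else avg

# A split panel (two +2s, two -2s) averages to zero and rounds to -1:
# dimension_score([2, 2, -2, -2]) -> -1
```

The design choice worth noting: the rounding is deliberately asymmetric. There is no case in which a zero average rounds up, which keeps every dimension directional — either it is helping the goal or it is on the investigation list.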

Interpreting your scores

Negative scores (−1 to −3): Experience varies unpredictably across touchpoints, teams, or time. The brand promise is undefended in at least some interactions. For archetypes where Experience is a Fatal Brake, this score explains why churn is rising and retention investment is not working. The leaky bucket cannot be fixed by adding more acquisition — it must be fixed at the experience level first.

Positive scores (+1 to +3): Experience is intentional, documented, trained, and measured. Every moment has a designed response. Customers describe the experience in consistent language that matches the brand's intended positioning. The baseline is reliable. Magic (440) initiatives can now be layered on top of a consistent foundation rather than compensating for an inconsistent one.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak) Green Clean's experience varies significantly by team member and visit. The two full-time cleaners operate consistently. The three part-time contractors, hired during a growth period, have had no structured onboarding and no shared standard for what a Green Clean visit should look and feel like. Some customers receive a verbal explanation of the formula used; others do not. Some receive the Family Health Report within 24 hours; others wait three days or receive it after a follow-up request. When a customer calls to ask about an ingredient, the response depends on which team member picks up. The experience is sometimes excellent and frequently adequate — but it is never reliably consistent. When the founder asks customers how the experience compares to EcoPure, the feedback is mixed: "better sometimes, comparable usually." That is a −1: experience is not reliably reflecting the positioning.

Score: +1 to +2 (Developing) Green Clean has identified the three highest-variance touchpoints from customer research: the onboarding call, the first-service visit, and the Family Health Report delivery. For each, a standard has been designed and documented. Contractors are trained on the first-service protocol. The Health Report is now automated — delivered within 6 hours of every service completion without requiring manual action. The onboarding call has a structured agenda that ensures the health-first positioning is explained consistently regardless of who conducts it. Variance has reduced but not eliminated — the support interaction (what happens when a customer reports a concern) remains undesigned and inconsistent. Positive customer descriptions of the experience are converging on consistent language: "professional," "trustworthy," "actually explains what they're doing." The experience baseline is improving. It is not yet reliable enough to score +2.

Score: +2 to +3 (Strong) Every Green Clean customer touchpoint has a designed response, documented standard, and trained delivery. The experience is consistent whether the customer is in their first month or their third year, whether they call on a Monday or a Saturday, whether their regular cleaner is available or a substitute is deployed. When a substitute is required, the customer receives a proactive message explaining the change and confirming the substitute has been briefed on the household profile. Support interactions follow a structured resolution protocol — concern acknowledged within 2 hours, resolution proposed within 24 hours, follow-up confirmed within 48 hours. Customer descriptions of the experience use consistent language unprompted: "they always explain what they've done," "I never have to chase anything," "it's the same standard every time." The NPS promoter cohort grew from 38% to 62% between 2021 and 2024 — a direct consequence of experience consistency, not product change.

Connected dimensions

Experience does not operate in isolation. Four dimensions connect most directly:

  • 410 — Moments: Experience responds to moments. Every Experience initiative traces back to a specific mapped moment where the current response is inadequate. Without a complete Moments map, Experience improvements are directional guesses — improving the wrong touchpoints while leaving the highest-variance ones unaddressed.

  • 130 — Pains & Gains: Experience design eliminates pains. The specific pains identified in journey research — the ones that accumulate into churn — are the Experience design brief. A pain at the research phase is an Experience problem in the before stage. A pain at the support interaction is an Experience problem in the after stage.

  • 440 — Magic: Magic elevates experience beyond consistency. Once the baseline is reliable, Magic creates the peaks that generate advocacy. The sequencing is fixed: fix Experience first, then invest in Magic. A +2 on Experience is the prerequisite for Magic initiatives to work as intended.

  • 630 — Lifetime: Experience quality predicts customer lifetime. The most reliable predictor of whether a customer will still be a customer in 12 months is whether their ongoing experience is consistently meeting the promise. Experience is not just a satisfaction metric — it is the leading indicator of lifetime value.

Conclusion

Experience is tied as the most frequent Fatal Brake in the Marketing Canvas Method for a straightforward reason: it is the dimension that most directly connects to churn. Customers do not leave because of a single terrible interaction. They leave because the cumulative experience does not consistently reflect the promise that acquired them.

The strategic diagnostic is not "how good is our best experience?" — teams consistently overrate on this question because they remember peaks and discount inconsistency. The question is: "what does every customer encounter, every time, regardless of team member, channel, or day of the week?"

If the honest answer is "it depends" — dimension 420 is the initiative queue.

Sources

  1. Matt Watkinson, The Ten Principles Behind Great Customer Experiences, FT Publishing, 2013

  2. Bain & Company, "Closing the Delivery Gap", 2005 — bain.com (the foundational 80/8 gap research: 80% of companies believe they deliver a superior experience; 8% of customers agree)

  3. Marketing Canvas Method, Appendix E — Dimension 420: Experience, Laurent Bouty, 2026

About this dimension

Dimension 420 — Experience is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Journey - Experience


Marketing Canvas - Moments

Most companies over-invest in the "during" phase of the customer journey and under-invest in "before" and "after" — which is precisely where both acquisition and retention are won or lost. Dimension 410 of the Marketing Canvas explains how to map moments correctly, and why the most valuable output is the seams it reveals between departments.

About the Marketing Canvas Method

This article covers dimension 410 — Moments, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Moments (dimension 410) maps the complete customer journey as a sequence of interactions seen through the customer's eyes. For each moment — before, during, and after purchase — the method asks three questions: what does the customer think? What do they feel? What do they do?

The discipline that makes this strategic rather than descriptive: moments must be built from customer observations and interviews, not from internal assumptions about how the journey should work. Every organisation believes it knows its customer journey. The map built from actual customer research almost always looks different from the one built internally.

In the Marketing Canvas, Moments sits within the Journey meta-category alongside Experience (420), Channels (430), and Magic (440). It is the discovery layer — the research input that makes every other Journey dimension scoreable with evidence rather than assumption.

The seams between departments

The most powerful diagnostic purpose of Moments mapping is one that most companies never anticipate: it reveals the seams between internal departments, and those seams are where the customer experience fails.

Marketing owns "before" — awareness, research, consideration. Sales owns "during" — the purchase conversation, onboarding, first use. Support owns "after" — ongoing use, queries, renewal, advocacy. Each team does their part reasonably well, measured on their own terms.

But the customer experiences one continuous journey.

When a customer moves from "before" to "during" — from the website to the first sales conversation — they often encounter a brand that seems to know nothing about what they read, what concerns they formed, or what decision criteria they brought to that conversation. The seam is visible to the customer; it is invisible to the organisation because no single team owns the transition.

Moments mapping forces the organisation to adopt the customer's timeline rather than its own. When the full map is laid out — every touchpoint from first awareness to advocacy, with what the customer thinks, feels, and does at each stage — the seams appear as blank spaces or contradictory experiences. Those gaps are the strategic agenda.

Score negative if the journey map was built from internal assumptions or if the "after purchase" phase is unmapped. Score positive when moments are customer-researched, granular, and actively used to design specific touchpoints.

Where companies systematically fail: the "during" trap

Most companies over-invest in the "during" phase of the journey — the purchase moment, onboarding, first use — and under-invest in "before" and "after." Yet "before" and "after" are precisely where acquisition and retention are won or lost, which makes the imbalance strategically costly.

Before purchase is where acquisition happens or fails. A customer who feels confused during research — overwhelmed by competing eco-friendly claims, unable to find independent verification, uncertain which product fits their specific situation — will not convert, regardless of how good the product is. The pre-purchase experience is entirely within the brand's control, and almost entirely unmapped by most organisations. The website, the content, the comparison experience, the social proof — these are designed by teams who know the product, not by teams who have watched confused prospects try to make a decision.

After purchase is where retention happens or fails. A customer who feels abandoned after the transaction — no structured follow-up, no proactive communication about what to expect, no mechanism to give feedback — begins the churn journey immediately. Engagement does not decline suddenly. It begins its decline at the first moment the customer feels the relationship ended at the point of purchase.

The diagnostic test: map your last twelve months of customer-facing initiatives. What percentage addressed the before phase? The during phase? The after phase? The imbalance is almost always striking — and it predicts where the strategic gaps are before a single score is calculated.

Mental Models - Moments in the Marketing Canvas

The three questions at every moment

For each moment in the journey, the Marketing Canvas requires three specific answers — all drawn from customer research, not internal assumption:

What does the customer think? The cognitive content of the moment. What information are they processing? What comparisons are they making? What questions are unanswered? What beliefs — accurate or not — are shaping their interpretation of this interaction? For Green Clean's "first service visit" moment: "I hope this is genuinely different from the eco-cleaning service I tried before. I want to see something that proves the health claim."

What does the customer feel? The emotional state at this moment. Anxiety, anticipation, confusion, trust, pride, disappointment. This is not the emotional job (what they want to feel in their lives) — it is the actual emotional state at this specific interaction. Accurately mapping current feelings is the prerequisite for designing better ones. If the customer feels sceptical at the booking stage, no amount of warm onboarding email copy will resolve it.

What does the customer do? The observable behaviour. Searches. Clicks. Calls. Compares. Reads reviews. Asks a friend. Abandons the checkout. These actions are often more revealing than stated opinions because they reflect actual behaviour under actual conditions, not hypothetical responses to survey questions.

Moments in the Marketing Canvas

The canonical question

Have you identified the critical touchpoints where customers interact with your brand, and do you understand what they think, feel, and do at each one?

Strategic role: foundational for most, existential for one

Moments has an unusual Vital 8 profile — it appears formally in only one archetype, as a Secondary Brake for A9 (Category Creator).

The reason is specific: in a new category, the customer journey doesn't exist yet. There are no established research behaviours, no familiar comparison frameworks, no prior experience of the product category that shapes customer expectations. Every moment must be designed from scratch — the customer has no mental model to bring to the first interaction. Green Clean in 2021 could not assume customers knew how to evaluate "indoor health protection" because the category had not been defined. The first-clean teaching moment — the onboarding experience that explained what health-first meant in practice — was not a nice-to-have. It was the foundational category education that made everything downstream possible.

For all other archetypes, Moments functions like Pains & Gains (130): it is a research input that feeds the scored dimensions above it, particularly Experience (420) and Channels (430). A company that cannot score Experience honestly — because it does not know what customers actually experience at each touchpoint — almost certainly has an unmapped or assumption-built Moments layer underneath. Improving the Moments map improves the reliability of every Journey dimension score.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. Your moments have been defined based on customer observations and interviews — they reflect the customer's actual identity and experience, not internal assumptions.

  2. You have identified all moments before, during, and after buying your value proposition.

  3. For each moment, you have clearly identified what your customers think, feel, and do.

  4. For each moment, you have clearly identified what the customer objectives are.

(Sub-questions 411–414 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."

Interpreting your scores

Negative scores (−1 to −3): The journey map is absent or built from internal assumptions rather than customer research. The before and/or after phases are unmapped. The seams between departments are invisible because nobody owns the transitions. Experience (420), Channels (430), and Magic (440) scores cannot be reliably set because the evidence base doesn't exist.

Positive scores (+1 to +3): The journey map is built from customer research, covers all three phases, captures think/feel/do at each moment, and actively identifies where seams between departments are creating experience failures. The map is used — it feeds Experience design, Channels decisions, and Magic moment identification — rather than filed as a project deliverable.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak) Green Clean's journey map was assembled by the founding team in a two-hour internal session. It covers the booking process (during) and a brief post-service survey (after). The before phase is entirely unmapped: no research has been done on how health-conscious parents discover cleaning services, what search terms they use, which comparison triggers they apply, or what objections form during the research phase. The after phase map stops at the thank-you email. No moment beyond the first three months of service has been researched. When the team describes the customer journey, they describe what they intended to build, not what customers actually experience. The seam between the website (marketing) and the first sales conversation (founder-led) is the most visible gap — customers arrive with questions formed during research that the founder does not know they have.

Score: +1 to +2 (Developing) Green Clean has conducted eight customer interviews specifically focused on journey mapping. The before phase now has three defined moments: the initial search ("what is the difference between eco-cleaning and health-first cleaning?"), the comparison visit (landing on the Green Clean website and trying to find independent validation), and the booking decision (the moment of commitment and what makes it happen or not). For each, the team has documented what customers think, feel, and do based on interview evidence rather than assumption. The during and early-after phases are mapped. The seam between website and onboarding call has been identified — customers arrive uncertain whether the health claim is substantiated. The team has not yet designed a solution to the seam. But the seam is now named.

Score: +2 to +3 (Strong) Green Clean's journey map covers all phases, built from twenty-two customer interviews and three observed service visits. The before phase is mapped in five moments, each with specific documented think/feel/do data. The seam between website and first contact has been designed out: a structured pre-booking sequence sends the university formula summary and B-Corp certification to every prospect before the first call, so the call begins with the health claim validated rather than questioned. The "First-Clean Teaching Moment" — a structured onboarding experience at the first service visit — explains in plain language what health-first means in practice, shows the before/after air quality data, and delivers the first Family Health Report within 24 hours. The after phase is mapped through the 12-month relationship, with specific moments designed at months 1, 3, 6, and 12 that correspond to the highest churn risk periods identified through customer research. The journey map is reviewed quarterly and updated as research produces new evidence.

Connected dimensions

Moments does not operate in isolation. Four dimensions connect directly as the downstream beneficiaries of good journey mapping:

  • 130 — Pains & Gains: Pains and gains map to specific moments. The pain of "I can't find independent verification" belongs to the before-phase research moment. The gain of "the Family Health Report made me feel like I finally know the truth" belongs to the first-service after moment. Without Moments mapping, Pains & Gains is a list. With it, it becomes a journey-anchored strategy.

  • 420 — Experience: Experience is designed moment by moment. Every Experience initiative traces back to a specific moment in the journey where the current response is inadequate. Without a complete Moments map, Experience improvements are based on internal opinion rather than evidence about where the customer actually struggles.

  • 430 — Channels: Channels serve specific moments. The question "which channels should we be present on?" cannot be answered without knowing which moments require which types of interaction. A customer in the research moment needs findable, credible content. A customer in the post-service moment needs a proactive, low-friction feedback mechanism. The channel follows the moment.

  • 440 — Magic: Magic happens at peak moments. The unexpected delight that converts a satisfied customer into an active advocate occurs at a specific moment in the journey — often one that companies hadn't designed for at all. Without a complete Moments map, Magic cannot be placed. The map reveals where the peaks and troughs are; Magic strategy addresses the peaks.

Conclusion

Moments is the dimension that makes the Journey meta-category honest. Without it, Experience is opinion, Channels is habit, and Magic is accident.

The strategic value is not the map itself — it is what the map reveals. The over-investment in "during" at the expense of "before" and "after." The seams between marketing, sales, and support that the customer feels as a fragmented experience. The moments that are assumed to be satisfactory because nobody has actually asked a customer what they think, feel, and do at that point.

For Category Creators building a journey from scratch, the Moments map is the architectural blueprint — without it, every other Journey dimension is being built without knowing the structure it needs to serve. For all other archetypes, it is the evidence base that makes every Journey dimension score credible rather than flattering.

Sources

  1. Chip Heath, Dan Heath, The Power of Moments: Why Certain Experiences Have Extraordinary Impact, Simon & Schuster, 2017

  2. Forrester Research, "Customer Journey Mapping Best Practices", Forrester, 2024 — forrester.com

  3. Marketing Canvas Method, Appendix E — Dimension 410: Moments, Laurent Bouty, 2026

About this dimension

Dimension 410 — Moments is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.
