BLOG

A collection of articles and ideas that help Smart Marketers become Smarter

Laurent Bouty

Marketing Canvas and Customers


In a nutshell

When working on the Customers part of the Marketing Canvas, you are trying to identify relevant and actionable triggers (you can also call them insights) that you will leverage through the other dimensions of the canvas. There are 4 dimensions you can work with to identify these triggers (JTBD, ASPIRATIONS, PAINS & GAINS, ENGAGEMENT). What matters at the end of this exercise is that you avoid fluffy triggers, you have built a list of triggers, you have qualified each one (functional or emotional), you have identified supporting evidence and you have rated the strength of each trigger.

In the Marketing Canvas

In the Marketing Canvas, we have identified 6 main categories for building your Marketing Strategy: Customers, Brand, Value Proposition, Journey, Conversation and Metrics. Each category has 4 dimensions, which means a total of 24 dimensions (6 × 4) define your Marketing Strategy.

Customers is one of the 6 categories of the Marketing Canvas. The category is composed of 4 dimensions: JTBD, Aspirations, Pains & Gains and Engagement.

How to use it?

What I have noticed during workshops is that people have difficulty identifying strong insights that could be used to build value propositions that rock. They usually list insights that are very broad (even fluffy), like "customers want quality" (who doesn't?), without being able to describe what sort of quality customers are looking for. One example that may help you understand my point is the following:

When designing mobile phones, we know that these phones should be robust, but what does that really mean? A glass manufacturer designed glass that could survive a single drop from 10 meters, but customers were looking for a phone that could survive multiple drops from 1 meter, because that is what they experience in real life. You see, robustness can mean very different things!

When working on the 4 dimensions of CUSTOMERS, you can identify a list of triggers that are either functional (what the customer expects to get) or emotional (what the customer expects to feel). An interesting read on benefits/triggers is the article from the Beloved Brands website (here).

I have not found a global list of all potential triggers (functional and emotional) to choose from when working on a specific case. The most elaborate list I have found so far is the one developed by Bain & Company for B2C and B2B. They have identified elements of value (30 for B2C and 40 for B2B), classified as functional, emotional, life-changing and social impact.

In the Marketing Canvas, I have only considered 2 categories (functional and emotional); therefore, if you are using the Bain B2C elements, you should treat emotional, life-changing and social-impact elements as Emotional triggers.

What I also like in the Bain proposal is the B2B mapping, which is something you don't easily find elsewhere. For the B2B mapping, treat Table Stakes and Functional value as Functional, and Ease of Doing Business value, Individual value and Inspirational value as Emotional in the Marketing Canvas method.

More on Bain can be found here: B2C elements of value and B2B elements of value.

Some Videos

Potential ideas

How to add intangible value to a product?

  1. Immediacy - priority access, immediate delivery

  2. Personalization - tailored just for you

  3. Interpretation - support and guidance

  4. Authenticity - how can you be sure it is the real thing?

  5. Accessibility - wherever, whenever

  6. Embodiment - books, live music

  7. Patronage - "paying simply because it feels good"

  8. Findability - "When there are millions of books, millions of songs, millions of films, millions of applications, millions of everything requesting our attention — and most of it free — being found is valuable."

source: Wikipedia Attention Economy

Method

What you should do is the following:

  • Take each dimension and identify triggers that are either functional or emotional;

  • List evidence supporting each trigger;

  • Rate each trigger from weak to strong according to how important it is to the customer: the more the customer has demonstrated a genuine need for it through past behavior (doing, not just saying), the stronger the trigger;

  • Take the top 10 triggers at the end of this exercise and complete the template below.
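The steps above can be sketched as a small script. This is an illustrative sketch only: the field names, the 1–5 strength scale and the example triggers are invented for the demonstration, not part of the Marketing Canvas method.

```python
# Illustrative sketch of the trigger-rating method.
# Assumptions (not from the method): triggers are dicts with invented
# fields "name", "kind", "evidence", "strength"; strength is 1 (weak)
# to 5 (strong), reflecting demonstrated need through past behavior.

def top_triggers(triggers, n=10):
    """Return the n strongest qualified, evidence-backed triggers."""
    # Drop fluffy triggers: unqualified, or with no supporting evidence
    qualified = [t for t in triggers
                 if t["kind"] in ("functional", "emotional") and t["evidence"]]
    # Strongest first (doing more than saying)
    return sorted(qualified, key=lambda t: t["strength"], reverse=True)[:n]

triggers = [
    {"name": "survives repeated 1 m drops", "kind": "functional",
     "evidence": ["warranty-claim data"], "strength": 5},
    {"name": "feels premium in hand", "kind": "emotional",
     "evidence": ["interview quotes"], "strength": 3},
    {"name": "quality", "kind": "functional",
     "evidence": [], "strength": 2},  # fluffy: no evidence, gets filtered out
]

print([t["name"] for t in top_triggers(triggers)])
# → ['survives repeated 1 m drops', 'feels premium in hand']
```

The point of the sketch is the filter step: a trigger with no evidence never reaches the template, no matter how plausible it sounds.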

Template

Marketing Canvas Method - Customer Triggers Template


Read More
marketingcanvas.net Laurent Bouty

Marketing Canvas - Listening

Most companies listen reactively — processing complaints, running annual surveys, reading reviews when they arrive. The Marketing Canvas demands proactive listening. Dimension 510 explains the difference, why it is a Fatal Brake for Pivot Pioneers, and the most expensive sentence in marketing.

About the Marketing Canvas Method

This article covers dimension 510 — Listening, part of the Conversation meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Listening (dimension 510) is the Voice of the Customer (VOC) infrastructure — not a single survey, but a system that captures everything customers say across every channel, translates it into data, and feeds it into strategic decisions.

The distinction that defines this dimension: listening without action is surveillance. Listening with action is strategy.

Most organisations believe they listen to customers. Most are listening reactively: processing complaints when they arrive, running annual satisfaction surveys, reading reviews when a notification appears. The method demands something harder: proactive listening that generates data before it is needed, feeds it into decisions before problems compound, and closes the loop between what customers say and what the company does.

In the Marketing Canvas, Listening sits within the Conversation meta-category alongside Stories (520), Media (530), and Influencers (540). It is the first of the four Conversation dimensions — and it comes first deliberately. The meta-category header says it plainly: listening comes before stories, before media, before influencers. You cannot communicate effectively with people you haven't systematically understood.

Reactive vs. proactive: the canonical distinction

This is the distinction that separates a company with VOC processes from a company with a VOC system.

Reactive listening processes information when it arrives. Customer complains — the complaint is logged. Customer writes a review — someone reads it. Annual survey goes out — results are compiled. NPS score is reported quarterly. Each of these is listening. None of them is proactive. The information arrives at the company's pace, on the company's schedule, filtered through the customers who bothered to respond.

Proactive listening generates information continuously, systematically, and before it is urgently needed. Ongoing customer interviews on a regular cadence — not just when there is a problem to investigate. Social listening infrastructure monitoring what is said about the brand, the category, and competitors across platforms. Support ticket analysis that extracts pattern data from thousands of micro-interactions. Behavioural data from digital touchpoints that reveals what customers actually do, not just what they say. Structured feedback loops at defined journey stages that close the circle between hearing a concern and confirming the fix.

The gap between reactive and proactive is the gap between responding to problems and preventing them. Between knowing what customers said last quarter and knowing what they are saying now. Between confirming assumptions and challenging them.

The canonical test: if the company stopped sending surveys tomorrow, would customer understanding continue to improve? If yes, the listening system is proactive. If no — if surveys are the primary input — the system is reactive, and dimension 510 cannot score above +1.


The most expensive sentence in marketing

"We know what customers want."

This sentence costs more than any misaligned campaign, any failed product launch, or any churned enterprise account. It is the signal that internal assumptions have been allowed to substitute for external evidence — that the listening loop has been closed not by data but by conviction.

The canonical position of the Marketing Canvas on this: if the data contradicts the assumption, the assumption must yield. Not the data. Not the interpretation. The assumption.

This sounds obvious. It is routinely violated. Teams that have operated in a category for years develop a fluency with their customers that feels like understanding but is actually pattern recognition. They know what last year's customers said about last year's product. They extrapolate. The market moves. The extrapolation drifts.

The VOC system exists to correct the drift before it becomes a strategy gap. It is the institutional mechanism that keeps the company's model of its customers honest — continuously updated, data-grounded, and resistant to the internal assumptions that are far more comfortable to rely on.

The four properties of an effective VOC system

The Marketing Canvas scores Listening against four properties. Together they describe not just whether a company has listening tools, but whether those tools form a functioning system:

Capture scope (511) — does the VOC system hear everything customers are saying? Not everything worth hearing — everything. The signal that matters is often not in the formal feedback. It is in the support ticket that uses unusual language. The social media comment that frames the category differently. The customer interview that introduces a word the team has never used. A VOC system with limited capture scope is a VOC system with systematic blind spots.

Data discipline (512) — is the VOC process entirely data-driven, with no point where assumptions substitute for evidence? The failure mode here is not fraudulent data. It is filtered data — interview questions that lead to expected answers, survey scales that cluster around mid-range because respondents are conflict-averse, analysis that confirms the hypothesis the team walked in with. Data discipline means designing the listening system to surface inconvenient truths, not just validate comfortable ones.

Journey integration (513) — does the VOC process map to the customer lifecycle? Listening at only one stage of the journey is like taking a patient's temperature once and declaring the health of their entire year. The research that matters for acquisition decisions is different from the research that matters for retention decisions. A journey-integrated VOC system has different listening mechanisms at different stages — capturing the before-purchase research experience, the onboarding moment, the ongoing use patterns, and the renewal conversation separately, because each reveals different strategic information.

Methodological breadth (514) — are multiple research techniques used together? Each technique has a different blind spot. Surveys capture stated preferences but miss revealed behaviour. Interviews surface nuance but are prone to social desirability bias. Behavioural analytics reveal what customers do but not why. Support ticket analysis captures the most frustrated customers but underweights the quietly satisfied ones. No single technique is sufficient. The system that combines four or more creates a triangulated picture that is harder to misread.

A fifth discipline sits on top of these four properties: validation discipline. Does the company run a JTBD check at the customer level before committing capital to a direction implied by a market signal? A strong market trend is not the same as a validated customer job. A company can detect a trend correctly and still deploy capital in a direction its specific customer does not need, because it never ran the validation step between signal and decision. This failure is harder to catch because the company genuinely believes it is being data-driven. The tell: VOC data is used to confirm a direction already chosen, rather than to test it before capital is committed. Volume of consumer data does not protect against this failure. Only validation discipline does.

The failure mode this guards against is the mirror of reactive listening. Reactive companies filter data through assumptions. A subtler failure, harder to detect because it is dressed in data, is mistaking market-signal intake for customer listening: tracking macro trends attentively while never validating them at the individual customer level. "The market is moving toward X" does not mean your specific customer's job has changed. Listening without validation is still surveillance, just at a more sophisticated level.

Listening in the Marketing Canvas

The canonical question

Do you systematically capture, analyse, and act on what customers are saying about your brand, products, and market?

Listening is a Fatal Brake for A5 (Pivot Pioneer) — the most strategically consequential placement of any Conversation dimension.

The rationale is direct: you cannot pivot successfully if you don't know where the market is going and whether your specific customer is moving with it. Listening is how you find out both — and the second question matters more than the first.

The Fujifilm and Kodak cases provide the sharpest possible contrast. Both companies faced the same crisis in the early 2000s: digital technology was destroying the photographic film market. Both had data. Kodak had commissioned research in 1981 predicting film's decline — and then calculated how many years they could milk film revenue before needing to act. They listened, and then filtered the listening through their assumption that they had more time. Fujifilm conducted an 18-month technology audit — described in the canonical case library as "the most sophisticated VOC exercise in the book" — mapping every capability they had against every market need they could identify. They listened, and then let the data direct the strategy. Fujifilm still exists. Kodak destroyed over €100B in value.

For A5, Listening is a Fatal Brake because the pivot direction is unknown until the market reveals it. An A5 company that is listening well will identify the new job before competitors do. An A5 company that is listening reactively will discover it in competitors' press releases.

Listening is also a Growth Driver for A9 (Category Creator) — the dimension through which category language is discovered. Green Clean's voice-of-customer language mining is the canonical example: extracting the exact phrases customers used to describe the indoor health protection job and feeding those phrases directly into marketing copy. Customers teach you the vocabulary of the category they are joining. Listening is how you learn it.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. You have set up a VOC system that captures everything that customers are saying about your brand and your value proposition.

  2. Your entire VOC process is data-driven — at no point are you making assumptions that substitute for evidence.

  3. Your VOC process is based on an in-depth knowledge of your user's journey and customer lifecycle.

  4. You are using different techniques together to ensure you are getting the most from your research.

  5. Your VOC system captures your customers' views on sustainability.

(Dimensions 511–514 + 515 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."
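As a minimal sketch of this rounding rule (assuming integer sub-scores in [−3, +3] and nearest-integer rounding of the average, neither of which is specified here beyond the zero case):

```python
def dimension_score(sub_scores):
    """Average the sub-question scores; a mathematical zero rounds to -1."""
    avg = sum(sub_scores) / len(sub_scores)
    score = round(avg)  # assumption: nearest-integer rounding of the average
    # No zero allowed: a split score means "not clearly helping", scored -1
    return -1 if score == 0 else score

print(dimension_score([2, 2, 1, 3, 2]))  # → 2
print(dimension_score([2, -2, 1, -1]))   # averages to 0, forced to -1
```

The forced −1 encodes the method's stance: a dimension that is not clearly helping the goal warrants the same investigation as one that is hurting it.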

Interpreting your scores

Negative scores (−1 to −3): Customer understanding relies on assumptions, single-source data, or reactive feedback that arrives too late to be strategic. "We know what customers want" is the operating assumption. The likely result: strategy decisions are made on the basis of internal conviction rather than external evidence. Problems compound before they are detected. For A5, this score is existential — a pivot built on assumed market direction is a rebrand, not a transformation.

Positive scores (+1 to +3): Multiple listening channels feed a structured process that visibly influences product, marketing, and service decisions. Every significant strategy decision can be traced back to a specific customer insight from a specific source. The VOC system generates evidence before it is urgently needed, corrects internal assumptions when data contradicts them, and closes the loop between what customers say and what the company does.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak)
Green Clean's listening consists of a post-service satisfaction email sent to every customer after each visit. The response rate is 19%. The four questions (overall satisfaction, cleaner performance, product quality, likelihood to recommend) produce scores the team reviews monthly. No action has been taken based on these scores in the past six months — they are tracked but not acted on. Customer interviews have never been conducted. Social media is monitored by the founder personally, approximately once a week, without a systematic process for capturing or analysing what is found. Support tickets are answered and then closed, with no aggregation or pattern analysis. "We know what our customers want" is the informal position of the team. The VOC system exists in form. It does not function as strategy.

Score: +1 to +2 (Developing)
Green Clean has introduced quarterly customer feedback sessions — 45-minute conversations with a rotating group of 8–10 customers focused on the full service journey. The sessions are structured but not scripted: customers describe specific moments rather than rate abstract attributes. Two rounds of sessions have already produced one significant insight: customers consistently describe the moment they realise the Family Health Report is personalised to their specific home as the point when they first trusted the brand. This insight was not available from the satisfaction survey. The team has started acting on it: the first Health Report for new customers is now delivered with a phone call rather than an email, specifically to confirm the personalisation in conversation. Social listening is now monitored daily using a basic tool. Support ticket language is being reviewed weekly for recurring patterns. Proactive listening is forming. It is not yet systematic.

Score: +2 to +3 (Strong)
Green Clean's VOC system operates at four levels simultaneously. Satisfaction data (post-service NPS) provides the quantitative baseline. Quarterly customer interviews provide the qualitative depth, including specific language analysis — the team has documented the exact phrases health-conscious parents use to describe the indoor health protection job and has fed those phrases directly into website copy, sales conversations, and the Family Health Report narrative. Social listening captures every mention of Green Clean and its category terms in the region, updated daily. Support ticket analysis is reviewed weekly and produces a monthly "friction report" — specific interaction patterns that indicate friction in the journey. Each of these data streams feeds into monthly strategy reviews where at least one decision is required to trace back to VOC evidence. The system has produced three product changes and two messaging updates in the past twelve months. When the team states what customers want, they can cite the specific data source, the sample size, and the date the insight was captured.

Connected dimensions

Listening does not operate in isolation. Five dimensions connect most directly:

  • 110 — JTBD: Listening enables the initial evidence base for the job definition — and, more critically, maintains its accuracy over time. A company can define the job well in year one and then watch it silently decay if no VOC system is actively testing whether the definition still holds. Without 510, a correct 110 ages in amber while the customer's actual job evolves. 510 is how you build 110. It is also how you keep it honest.

  • 130 — Pains & Gains: VOC validates pain mapping. The pains identified in journey research (dimension 130) are hypotheses until the VOC system confirms them with data across a sufficient sample. Pains that appear in one customer interview may be individual; pains that appear in twelve are systemic. Listening is how the difference is established.

  • 140 — Engagement: VOC systems feed engagement data. The promoter/detractor ratios that dimension 140 scores are produced by the listening infrastructure. Without a functioning VOC system, Engagement can only be measured by satisfaction surveys — which, as noted in dimension 140, is not the same as measuring engagement.

  • 420 — Experience: Listening reveals what the experience actually feels like from the customer side. A team that believes the onboarding experience is +2 on Experience may discover through customer interviews that the specific moment the substitute cleaner arrives without prior notice is scoring −2 in the customer's head. Without the listening system, the Experience score is a self-assessment. With it, it becomes evidence-based.

  • 520 — Stories: Listening provides the customer language that makes stories resonate. The most effective content uses the words customers use to describe their own problems — not the words the marketing team uses to describe the product. VOC language mining is the process that produces the raw material for story strategy.

Conclusion

Listening is the first Conversation dimension because it is the prerequisite for all the others. A brand cannot tell credible stories without knowing what customers actually experience. It cannot design effective media without knowing which messages resonate. It cannot identify the right influencers without knowing which voices customers trust.

The strategic test is not whether the company has feedback mechanisms. It is whether those mechanisms are proactive, multi-technique, journey-integrated, and action-connected. A company that sends satisfaction surveys and reads the results is listening. A company that conducts ongoing interviews, monitors social conversation, analyses support ticket patterns, tracks behavioural data, and ties every decision to a specific customer insight is listening strategically.

The difference between those two companies is not tools. It is discipline — the discipline of requiring data to yield when it contradicts assumption, rather than requiring assumption to explain away inconvenient data.

Sources

  1. Harvard Business Review, "Everyone Says They Listen to Their Customers — Here's How to Really Do It", October 2015 — hbr.org

  2. McKinsey & Company, "Are You Really Listening to What Your Customers Are Saying?", McKinsey Quarterly — mckinsey.com

  3. Marketing Canvas Method, Appendix E — Dimension 510: Listening (VOC), Laurent Bouty, 2026

About this dimension

Dimension 510 — Listening (VOC) is part of the Conversation meta-category (500) in the Marketing Canvas Method. The Conversation meta-category contains four dimensions: Listening (510), Stories (520), Media (530), and Influencers (540).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Conversation - Listening

Read More
marketingcanvas.net Laurent Bouty

Marketing Canvas - Magic

Satisfaction keeps customers. Magic turns them into advocates. Dimension 440 of the Marketing Canvas scores four components — effortless, stress-free, sensory pleasure, and social pleasure — and explains why exceeding expectations on something the customer doesn't care about isn't magic, it's waste.

About the Marketing Canvas Method

This article covers dimension 440 — Magic, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Magic (dimension 440) scores whether your brand exceeds expectations in ways customers didn't anticipate. Not satisfaction — that is delivering what was promised. Not quality — that is consistency. Magic is the surprise that transforms a satisfied customer into an active advocate.

The most important design principle: exceeding expectations on something the customer doesn't care about isn't magic. It's waste. Magic requires knowing what the customer expects — and then strategically exceeding it at the moment that matters most.

In the Marketing Canvas, Magic sits within the Journey meta-category alongside Moments (410), Experience (420), and Channels (430). It is the peak layer — the dimension that elevates a reliable experience into one customers feel compelled to describe to others. Experience (420) sets the baseline. Magic (440) creates the highs above it.

Magic vs. Experience: the critical distinction

This is the most important conceptual clarification in dimension 440, and the one most commonly missed in workshops.

Experience (420) scores the consistent baseline — whether every customer, in every interaction, receives a response that is intentional, reliable, and meets expectations. Consistency is the standard. A strong Experience score means: nothing is left to chance, the brand's promise is defended at every touchpoint.

Magic (440) scores the peaks — the unexpected moments that exceed what the customer anticipated and produce the emotional response that generates advocacy. Magic is not consistent by definition. It is strategic and selective — designed to occur at the specific moments where the surprise will have the highest impact.

The sequencing rule: fix Experience before investing in Magic. A brand with a −1 on Experience that invests in Magic initiatives is adding peaks to an unreliable baseline. Customers who encounter magic at one touchpoint and inconsistency at another do not become advocates. They become confused — and confusion precedes churn, not advocacy.

Score negative if the customer journey is functional but unremarkable, or if it creates friction the company hasn't noticed. Score positive when specific moments are designed to exceed expectations and customers spontaneously share those moments with others.

The four components of Magic

The Marketing Canvas breaks Magic into four scored components — each addressing a different dimension of the unexpected experience:

Effortless (441) — obstacles removed. The customer expects friction; they encounter none. The booking that takes 30 seconds when they budgeted 5 minutes. The form that pre-fills from their previous interaction. The return process that requires no explanation because the system already knows why. Effortlessness is the absence of friction the customer had learned to expect. It is magical precisely because the absence is unexpected — the category has trained customers to tolerate effort, and the brand has made it disappear.

Stress-free (442) — confusion, uncertainty, and anxiety eliminated. The customer expects to worry about something; they find there is nothing to worry about. The ambiguous delivery window that turns into real-time location tracking. The ingredient claim that is accompanied by independent verification rather than asking the customer to trust. The post-service question that is answered before it was asked. Stress-free magic is the proactive removal of cognitive load — the brand doing the worrying so the customer does not have to.

Sensory pleasure (443) — delight through sight, touch, sound, taste, or smell. In consumer markets this is the Apple unboxing, the Hermès ribbon, the hotel that remembers a pillow preference. The experience engages the senses in a way that exceeds the purely functional expectation. In service contexts, sensory pleasure appears in the aesthetics of a delivered report, the warmth of an unexpected handwritten note, the packaging that communicates care before a word is read.

Social pleasure (444) — status elevation. The customer encounters the brand in a way that makes them feel recognised, celebrated, or elevated in front of others. The loyalty recognition at a hotel check-in that happens in front of other guests. The personalised annual impact report that the customer shows to friends because it makes them look like someone who has made a difference. The referral confirmation that acknowledges the customer as a trusted advisor to their network. Social pleasure magic is the brand giving the customer a story they want to tell.

B2B Magic: cognitive, not sensory

In consumer markets, Magic is often sensory — the unboxing, the ribbon, the pillow preference. In B2B, Magic is cognitive: the insight the client didn't ask for, the risk flagged before it became a problem, the deliverable completed three weeks early without explanation.

The NTT Data case illustrates the distinction. B2B Magic isn't about delight in the consumer sense. It is about demonstrating competence so completely and proactively that the client forms the belief: "this is a genuine partner, not just a vendor." That belief is the B2B equivalent of advocacy — the CTO who mentions the vendor by name at an industry conference, the COO who recommends the firm without being asked, the procurement lead who shortcuts the RFP process because they already know who they want.

The B2B Magic design question: where in this engagement does the client expect reasonable competence — and where could we deliver something so far ahead of expectation that it changes the nature of the relationship?

Spotify's Discover Weekly is the canonical example of consumer-facing Magic that operates on a cognitive principle: the algorithm's ability to surface music the user didn't know they wanted, at the moment they most want it. Not sensory delight. Cognitive surprise. The user's reaction — "how does it know?" — is the Magic response. It drove measurable retention improvement, which is the commercial test of whether Magic is working.

Magic in the Marketing Canvas

The canonical question

Where do you exceed expectations in ways customers didn't see coming?

Magic appears in the Vital 8 of four archetypes, holding five distinct placements that span the full range of strategic roles:

Fatal Brake for A7 (Scale-Up Guardian): Hypergrowth tends to destroy the exceptional experiences that created growth in the first place. The early customers of a high-growth brand experienced something that felt personal, attentive, and unexpectedly good — because the team was small, the founder was involved, and every interaction was high-touch. As the company scales, processes replace people, automation replaces attention, and the magic that converted early adopters into evangelists disappears into a standardised service. For A7, Magic is a Fatal Brake because losing it is the mechanism through which growth erodes the advocacy that funded growth. It must reach ≥+2 before hypergrowth investment can be sustained.

Primary Accelerator for A2 (Efficiency Machine): For the Efficiency Machine, Magic means the customer barely notices the transaction happened. The 25-minute Ryanair turnaround. The Amazon checkout that requires one click. The banking app that reconciles the account before the customer closes the browser. In A2, operational magic is not sensory delight — it is the complete removal of the customer's effort. The customer doesn't tell a story about the experience; they tell a story about the absence of one. "I barely had to do anything" is the A2 Magic response.

Secondary Brake for A6 (Value Harvester): A Value Harvester extracting maximum cash flow from an existing base must maintain enough magic to prevent the churn that would otherwise accelerate as the product matures. Magic maintenance for A6 is defensive — enough unexpected value to remind customers why they stay, even as the brand optimises for margin rather than growth.

Secondary Accelerator for A4 (Stagnant Leader): For a stagnant leader fighting churn, Magic initiatives provide the proof of renewal that keeps the existing base engaged while Experience (420) and Features (310) are being rebuilt. A single well-designed magical moment — the AI-powered feature that anticipates the user's next action, the proactive support contact that prevents a problem before it occurs — signals that the brand is still invested in the relationship.

Growth Driver for A6: For the Value Harvester, Magic initiatives that generate advocacy are a low-cost acquisition mechanism that complements the margin extraction strategy. Existing customers who experience unexpected delight become the most credible referral source for the next customer cohort.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. You have identified obstacles across your customer journey and reduced them (effortless).

  2. You have eliminated confusion, uncertainty, and anxiety across your customer journey (stress-free).

  3. You have delighted your customers' senses — all customers seek sensory pleasure (sensory pleasure).

  4. You have provided a customer experience that elevates your customers' status (social pleasure).

  5. You have reduced the social and environmental impact while making sustainable moments magical.

(Dimensions 441–444 + 445 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."
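The rounding rule can be sketched as a small scoring helper. This is an illustrative function, not part of the published method's tooling; the function name and validation choices are assumptions:

```python
def dimension_score(sub_scores):
    """Average sub-question scores (each -3..+3, zero excluded) into a
    dimension score, applying the Detailed Track rule: a mathematical
    zero rounds to -1, because a split score is 'not clearly helping'."""
    if not sub_scores:
        raise ValueError("at least one sub-question score is required")
    for s in sub_scores:
        if s == 0 or not -3 <= s <= 3:
            raise ValueError("each score must be in -3..+3, zero excluded")
    avg = sum(sub_scores) / len(sub_scores)
    if avg == 0:
        return -1  # perfectly split: treated as a mild brake, never neutral
    return avg

# A perfectly split dimension rounds down, not to neutral:
dimension_score([+2, -2])              # -1
dimension_score([+1, +2, +3, -1, +1])  # 1.2
```

The design choice worth noting: the zero case is not a rounding artifact but a deliberate bias. A dimension that helps and hurts in equal measure gets investigated as a brake.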

Interpreting your scores

Negative scores (−1 to −3): The customer journey is functional but unremarkable. There are no designed moments of unexpected delight. Customers are satisfied but not moved to advocate. Worse: friction and anxiety may exist that the team hasn't noticed because nobody has mapped the journey from the customer's perspective. For A7, a negative score here explains why growth is eroding the advocacy that created it.

Positive scores (+1 to +3): Specific moments are designed to exceed expectations across one or more of the four components. Customers spontaneously share those moments with others — in conversation, in reviews, in referrals. Magic is functioning as the advocacy generation mechanism: not all customers experience it, but the ones who do become the brand's most effective acquisition channel.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak)

Green Clean's customer journey is functional and unremarkable. The booking works. The cleaner arrives. The cleaning is done. But nothing about the interaction exceeds what a customer would expect from a competent cleaning service. There are no designed moments of effortlessness — the booking process requires four steps that could be two. There is no stress removal — customers who want to verify what products were used have to ask, and the answer varies by team member. There is no sensory pleasure — the cleaner leaves without any communication, the invoice arrives two days later as a plain text email. There is no social pleasure — the service produces no story the customer would want to share. When existing customers describe the service, they use words like "reliable" and "good" — the language of satisfied, disengaged customers rather than active advocates.

Score: +1 to +2 (Developing)

Green Clean has introduced two designed Magic moments. First: the Family Health Report arrives within 6 hours of service completion — a specific, data-rich document that no competitor provides and that customers describe as "not what I expected" when they receive it for the first time. This addresses the stress-free component: customers who would have worried about whether the claims are real now have evidence without asking for it. Second: on the third service, customers receive a personalised summary of their cumulative impact — how many service visits, how many households protected from chemical exposure, how much waste has not been generated. This addresses social pleasure: customers who care about environmental responsibility have a number they can share. These two moments are working — the referral rate has started to climb. But the effortless and sensory pleasure components remain undesigned.

Score: +2 to +3 (Strong)

Green Clean has designed Magic moments across all four components. Effortless: the booking takes 90 seconds on mobile, with address pre-filled and service preferences remembered. Scheduling confirmation and reminder are automatic. Stress-free: the Family Health Report arrives within 6 hours with a plain-language explanation of what was found and eliminated. Customers never have to ask. Sensory pleasure: the cleaner leaves a handwritten note summarising what was done in this specific home, with one personalised observation (a comment on the kitchen herbs, a note about the child's artwork visible from the bathroom). The note costs 3 minutes and generates more customer responses than any other touchpoint. Social pleasure: the annual impact statement — "Your household prevented 42kg of chemical exposure in 2024" — is designed as a shareable card with Green Clean's visual identity. 23% of customers share it on social media or forward it to friends. The referral rate reached 35% by 2024. Customers do not describe the service as "good." They describe specific moments that changed how they think about what a cleaning service can be.

Connected dimensions

Magic does not operate in isolation. Four dimensions connect most directly:

  • 130 — Pains & Gains: Magic eliminates pains and creates unexpected gains. The pain map is the source material for effortless and stress-free Magic design. When a pain is eliminated so completely that the customer barely registers its absence, that is effortless Magic. When a gain exceeds what the customer expected, that is the raw material of the sensory and social pleasure components.

  • 420 — Experience: Magic elevates experience beyond consistency. Experience (420) sets the reliable baseline. Magic (440) creates the moments above it. The two dimensions work in sequence: without a consistent Experience baseline, Magic investments are undermined by the inconsistency that surrounds them.

  • 320 — Emotions: Magic creates peak emotional moments. The surprise that generates advocacy is an emotional event — the "I didn't expect that" feeling that produces the story worth telling. Magic moments are the designed delivery mechanism for peak emotional benefits.

  • 140 — Engagement: Magic drives engagement and advocacy. A customer who has experienced a designed Magic moment is more likely to be a promoter on the NPS scale, more likely to refer, and more likely to provide feedback. Magic is the upstream cause; Engagement (140) measures the downstream effect.

Conclusion

Magic is the dimension that answers the question most brands cannot: why do some customers become advocates when others merely stay?

The answer is not product quality. Quality is expected. It is not service consistency. Consistency is the baseline. It is the specific, unexpected moment that exceeds what the customer had learned to anticipate — the report that arrives before they asked, the note that references their home specifically, the status recognition that makes them feel seen.

The design principle that separates effective Magic from wasted investment: it must exceed expectations on something the customer actually cares about. The hotel that remembers a pillow preference is Magic because sleep quality matters. The hotel that provides a turndown chocolate to a customer who explicitly avoids sugar has produced an interaction, not a magic moment.

Knowing what customers expect — and where exceeding it will produce the highest advocacy response — is the work. The four components (effortless, stress-free, sensory pleasure, social pleasure) provide the framework. The Moments map (410) and the Pains & Gains research (130) provide the evidence. Together, they produce the design brief for Magic initiatives that convert satisfied customers into advocates.

Sources

  1. Chip Heath, Dan Heath, The Power of Moments: Why Certain Experiences Have Extraordinary Impact, Simon & Schuster, 2017

  2. Matt Watkinson, The Ten Principles Behind Great Customer Experiences, FT Publishing, 2013

  3. Marketing Canvas Method, Appendix E — Dimension 440: Magic, Laurent Bouty, 2026

About this dimension

Dimension 440 — Magic is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Journey - Magic


Marketing Canvas - Channels

Most companies have channels. Few have orchestrated channels. Dimension 430 of the Marketing Canvas scores the difference — and explains why a brand with three connected channels outperforms one with eight siloed ones.

About the Marketing Canvas Method

This article covers dimension 430 — Channels, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Channels (dimension 430) scores how customers interact with your brand — physical and digital, owned and third-party — and whether those interactions form a seamless, coherent experience across all of them.

The canonical distinction that defines this dimension: most companies have channels. Few have orchestrated channels. The score measures orchestration, not presence.

A brand with a website, a mobile app, a social media presence, a phone line, and a field team is not necessarily scoring well on dimension 430. The question is whether those channels work together without silos — whether a customer who starts research on one channel can complete the journey seamlessly on another, and whether the company can track and serve that customer across the transition.

In the Marketing Canvas, Channels sits within the Journey meta-category alongside Moments (410), Experience (420), and Magic (440). It is the delivery infrastructure — the system that ensures every moment designed in 410 is actually accessible to the customer in the format that serves them best.

Presence vs. orchestration: the canonical distinction

Every company has channels. Most companies have more channels than they have resources to maintain well. The channel list is not the dimension. The orchestration of that list is.

The test is a single customer journey across multiple channels. A customer who discovers Green Clean through a health parenting blog, visits the website to research the formula, emails a question about ingredient safety, books a service via the app, receives the Family Health Report by email, and calls to ask about a recurring subscription — has touched five channels. If the experience is continuous (the phone call picks up where the booking left off; the subscription question doesn't require re-explaining the service model), the channels are orchestrated. If each channel treats the customer as a stranger, the channels exist but are not orchestrated.

The canonical four properties that define orchestrated channels:

Context (431) — can customers use the most relevant channel for their specific situation at each moment? A customer researching a service in the evening needs findable, credible content on the web. A customer mid-service with a question needs an immediate human response. A customer reviewing their health report at midnight needs a digital self-service interface. The same channel cannot serve all three moments well.

Interaction quality (432) — do channels provide clear, personalised, seamless interactions? Quality here means the interaction is adapted to the customer's identity and context — not generic, not one-size-fits-all, not a copy-paste template.

Information consistency (433) — is data consistent and real-time across channels? A customer who updates their household profile in the app should not have to re-state it on the phone. A booking made on the website should be visible to the cleaner on their route app. Inconsistency in data across channels is the most common channel orchestration failure — and the most invisible to the teams building the channels, who each see only their own system.

Orchestration (434) — are channels connected so customers can navigate seamlessly between them with no silos? This is the composite test: does the company have a joined-up view of the customer's journey, or does each channel operate as a separate interaction with no shared memory?
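The orchestration test (434) can be illustrated with a minimal sketch: one customer record that every channel reads from and writes to, rather than per-channel silos with no shared memory. All class, field, and event names here are illustrative assumptions, not part of the method:

```python
from dataclasses import dataclass, field

@dataclass
class CustomerRecord:
    """A single shared record: every channel logs to the same history
    and reads the same profile, so no channel treats the customer as a
    stranger (information consistency, 433; orchestration, 434)."""
    customer_id: str
    household_profile: dict = field(default_factory=dict)
    history: list = field(default_factory=list)  # (channel, event) pairs

    def log(self, channel: str, event: str):
        self.history.append((channel, event))

# A hypothetical journey crossing three channels:
record = CustomerRecord("c-001")
record.log("web", "researched formula page")
record.log("app", "booked service")
record.household_profile["allergies"] = "none declared"  # captured in the app
record.log("phone", "asked about subscription")

# The phone agent sees the full journey, not a blank slate:
channels_touched = [channel for channel, _ in record.history]
```

The siloed alternative is the same journey with three disconnected stores — the failure mode in which each channel "operates as a separate interaction with no shared memory."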

Digital, physical, and moment-driven channel design

The channel strategy question is not "should we be digital or physical?" Every customer journey involves both. The question is: which channel serves each moment best?

A purely digital company that ignores physical moments — the cleaner arriving at the door, the unboxing experience, the in-person explanation of a result — misses the touchpoints where trust is built or lost at the highest intensity. Physical moments carry emotional weight that digital channels cannot replicate.

A traditional service business that treats digital as a secondary channel — the website as an online brochure, the email as a support afterthought — loses the pre-purchase research phase entirely. Customers research digitally before they commit physically. Winning the digital research moment is often what determines whether the physical visit ever happens.

The best channel strategies design each moment to use the channel that serves the customer best:

  • The research moment needs findable, credible digital content

  • The booking moment needs a frictionless digital transaction

  • The service delivery moment needs a reliable physical interaction

  • The result delivery moment needs a clear digital report with optional human follow-up

  • The renewal moment needs a proactive, low-friction digital prompt

Designing channels from moments is the inversion of the default approach (designing moments around the channels that already exist). The default produces a channel strategy. The inversion produces an orchestrated journey.

Channels in the Marketing Canvas

The canonical question

Can customers interact with your brand through the channels they prefer, with a seamless experience across all of them?

Channels appears in the Vital 8 of two archetypes — in notably different roles:

Secondary Brake for A1 (Disruptive Newcomer): A disruptor's survival depends on being noticed and understood immediately. Features and positioning may be compelling, but if the channels through which the target customer discovers and evaluates the brand are wrong or incomplete, the disruption never reaches beyond the early-adopter bubble. Channel failure for A1 is quiet: the product is ready, the message is sharp, but the distribution infrastructure isn't present where the customers are. As a Secondary Brake, the score must reach ≥+1; below that threshold, channel failure limits the reach of the disruption.

Secondary Accelerator for A5 (Pivot Pioneer): A company executing a strategic pivot may find that its existing channels were optimised for the old positioning and the old customer segment. The new direction — new JTBD, new lead segment, new positioning — may require new channels entirely. Legacy channels that served the old strategy are not neutral for the pivot; they actively signal the old identity to customers encountering the brand for the first time in the new context. For A5, channel strategy is part of the repositioning work, not a downstream execution decision.

A note on Fatal Brakes: Channels does not appear as a Fatal Brake in any archetype. But channel failure can block the dimensions that are Fatal. If Acquisition (610) is a Fatal Brake and channel orchestration failures are increasing CAC, the channel problem is a Fatal Brake problem in disguise. If Experience (420) is a Fatal Brake and channel inconsistency is producing the experience variance, the same applies. Channels is the infrastructure. Infrastructure failures propagate upward.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. Your customers can use the most relevant channel for their specific context at each moment.

  2. Your channels are physical and digital — you provide clear, personalised, and seamless interactions, anywhere, anytime.

  3. Information captured or shared in your channels is consistent, real-time, personalised, useful, and accurate.

  4. You have orchestrated all your channels — there is no silo between them, and customers can navigate seamlessly through them at each moment.

  5. You optimise the social and environmental impact of your physical and digital channels.

(Dimensions 431–434 + 435 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."

Interpreting your scores

Negative scores (−1 to −3): Channels operate in silos. Customers who cross channel boundaries encounter a brand that does not recognise them. Orchestration is absent or incomplete. The likely downstream effect: acquisition costs are higher than they need to be (research-to-booking friction), experience scores are lower than designed (channel handoff failures), and engagement data is fragmented (no joined-up view of customer behaviour).

Positive scores (+1 to +3): Channels are orchestrated. Customers move between channels without friction. Data is consistent and real-time across the full journey. Each channel is designed for the specific moment it serves. The company can track the customer journey across touchpoints and improve each channel based on measured performance.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method. Green Clean sells a residential service — cleaners visit customer homes — not packaged products. Their relevant channels are: website, booking flow, email, in-home service visit, Family Health Report (digital delivery), phone/chat support, and referral mechanics.

Score: −2 to −1 (Weak)

Green Clean's channels are independent systems that do not share data or context. The website takes booking requests but is not connected to the cleaner's scheduling app — bookings are manually transferred by the founder. The Family Health Report is generated as a PDF by one team member and emailed by another, introducing a 24–72 hour delay that varies unpredictably. When a customer calls with a question about their report, the support team does not have access to the customer's service history or their specific report data — every call starts from scratch. A customer who books through the website and follows up by email is treated as two separate interactions. No channel knows what the others have communicated. The silos are invisible to the teams but immediately apparent to any customer who crosses a channel boundary.

Score: +1 to +2 (Developing)

Green Clean has connected the booking system to the cleaner's route app — scheduling is now automated. The Family Health Report is generated and emailed automatically within 6 hours of service completion. A customer CRM has been introduced: all booking, service, and communication history is accessible to the support team when a customer calls. But the research channel (website) still operates independently — prospects who spend time researching on the website and then book are not identified as the same person until after the booking is made, meaning the website-to-booking conversion cannot be tracked and the research journey cannot be improved with data. The referral mechanic is manual — the team asks existing customers to refer but has no digital system to track referrals or reward them efficiently. Orchestration has improved significantly but is not yet complete.

Score: +2 to +3 (Strong)

Green Clean's channels are fully orchestrated around the customer journey, not around internal team structures. The website research behaviour is tracked — customers who read the formula science page before booking convert at a higher rate, so that content is featured prominently in the booking flow. Booking, service, health report, follow-up communication, and subscription renewal are all automated and connected through a single customer record. Support staff see full service history, report data, and communication history before responding to any contact. The referral mechanic is digital — existing customers receive a referral link after every service and can track whether their referrals booked. Channel performance is measured per moment: website conversion rate, booking completion rate, Health Report open rate, support resolution time, referral conversion rate. Each metric corresponds to a specific channel at a specific journey stage. The orchestration is visible in the data: channel handoffs produce no drop-off in conversion that would indicate a silo.
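The claim that "channel handoffs produce no drop-off" implies a concrete measurement: compare conversion across each stage transition and flag the dips. A minimal sketch, with illustrative funnel numbers that are not from the case study:

```python
# Per-moment funnel counts across the journey (illustrative numbers):
funnel = [
    ("web research",      1000),
    ("booking started",    400),
    ("booking completed",  360),
    ("report opened",      330),
    ("renewal",            250),
]

def handoff_rates(funnel):
    """Conversion rate at each stage transition. A sharp dip at a
    channel boundary flags a silo worth investigating."""
    return [
        (a[0] + " -> " + b[0], round(b[1] / a[1], 2))
        for a, b in zip(funnel, funnel[1:])
    ]

rates = handoff_rates(funnel)
# First transition: ('web research -> booking started', 0.4)
```

The diagnostic logic: a low rate at a transition that crosses a channel boundary (web to app, app to email) is the quantitative signature of the silo described in the Weak scenario.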

Connected dimensions

Channels does not operate in isolation. Four dimensions connect most directly:

  • 240 — Visual Identity: Channels must carry visual identity consistently. A customer encountering the brand on Instagram, the website, the booking confirmation email, and the physical cleaner's uniform should see a coherent identity at every touchpoint. Channel proliferation without visual governance produces brand fragmentation.

  • 410 — Moments: Channels serve specific moments. The channel strategy is only as good as the moments map underneath it. Without knowing which moments require which types of interaction, channel decisions are made by habit (we've always had a phone line) rather than by design (this moment requires human contact).

  • 420 — Experience: Experience quality depends on channel execution. Channel inconsistency is one of the most common causes of experience variance — customers receive different responses from different channels because the channels are not coordinated. A +2 on Experience requires channel orchestration as a prerequisite.

  • 530 — Media: Media and channels overlap in digital contexts. Paid media, social media, email, and owned content all function as channels at the research and awareness stages. The boundary between Media (530) and Channels (430) is context: Media drives reach and awareness; Channels deliver the interaction and transaction. They share infrastructure and must be planned together.

Conclusion

Channels is the infrastructure dimension of the Journey meta-category. It does not generate the value proposition, design the experience, or create the magic. It delivers all of those things to the customer — or fails to.

The distinction that matters for scoring is not how many channels the brand has. It is whether those channels form a coherent system. A well-orchestrated system of three channels outscores a fragmented system of eight. The customer's perspective is binary: either the journey is seamless across channels, or it is not.

Channel failure is rarely dramatic. It does not produce a single terrible interaction. It produces accumulating friction — the customer who has to re-explain their situation to every channel they touch, the research that doesn't convert because the booking flow is on a different system, the report that arrives three days late because two teams aren't connected. Each incident is minor. The cumulative effect on acquisition, experience, and retention is material.

Sources

  1. Forrester Research, "The State of Omnichannel Commerce", Forrester, 2024 — forrester.com

  2. McKinsey & Company, "The value of getting personalisation right — or wrong — is multiplying", McKinsey, 2021 — mckinsey.com

  3. Marketing Canvas Method, Appendix E — Dimension 430: Channels, Laurent Bouty, 2026

About this dimension

Dimension 430 — Channels is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.

Marketing Canvas Method - Journey - Channels by Laurent Bouty


Marketing Canvas - Experience

Experience is a Fatal Brake for three archetypes. In every case the mechanism is the same: experience failure is the proximate cause of churn. Dimension 420 of the Marketing Canvas scores consistency — not brilliance — and explains why "leaving nothing to chance" is a scored criterion, not an aspiration.

About the Marketing Canvas Method

This article covers dimension 420 — Experience, part of the Journey meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Experience (dimension 420) scores the brand's answer to every moment in the customer journey. Where Moments (410) maps what the customer thinks, feels, and does, Experience scores how well the company responds. Does the response reflect the customer's identity? Does it help them achieve their objectives? Is it consistent across time and space? Does it meet the expectations it sets?

The canonical question is not "do we create exceptional experiences?" It is: what is it actually like to be your customer?

In the Marketing Canvas, Experience sits within the Journey meta-category alongside Moments (410), Channels (430), and Magic (440). It is the most frequent Fatal Brake in the method — tied with Positioning (220) and Features (310) at three archetypes each. In every case, the mechanism is the same: experience failure is the proximate cause of churn.

Consistency over brilliance: the canonical insight

The most common Experience scoring error in workshops is confusing it with Magic (440). Experience is not about peak moments or memorable impressions. It is about baseline consistency.

A single brilliant experience surrounded by mediocre ones creates more frustration than consistent adequacy. The customer remembers the gap between the peak and the norm. A hotel that provides an extraordinary check-in and then loses the luggage has not delivered a good experience — it has demonstrated that brilliance is accidental and failure is structural.

Experience design is less about creating memorable highs than about eliminating the lows and ensuring reliability. Every touchpoint should be intentional. Every response should be consistent. The design question is not "how do we create moments that wow?" — that is Magic. The design question is "how do we ensure that every single interaction reflects the promise, regardless of which team member delivers it, which channel it occurs on, or which day of the week it is?"

This is why sub-question 423 scores: "For each moment, your brand answer is consistent in time and space, leaving nothing to chance." Leaving nothing to chance is not a phrase about aspiration. It is a scored criterion. Every undesigned moment is a moment where the brand's promise is undefended — delivered differently by different people, interpreted differently by different teams, experienced differently by different customers.

Score negative if customer experience varies unpredictably across touchpoints, teams, or time. Score positive when experience design is intentional, documented, trained, and measured — and when customers describe the experience using the same words the brand intends.

Experience vs. Magic: the critical distinction

These two dimensions are adjacent and routinely conflated. The confusion produces inflated Experience scores and underinvested Magic strategies.

Experience (420) scores the consistent baseline. Does every customer, in every interaction, receive a response that reflects their identity, serves their goals, and meets the expectations that were set? Consistency is the standard. A score of +2 on Experience means: every moment has a designed response, that response is reliably delivered, and customers confirm it matches their expectations.

Magic (440) scores the peaks. Does the brand exceed expectations in ways customers didn't anticipate? Magic is the surprise that converts a satisfied customer into an advocate. It is scored separately because it requires a different design discipline — not reliability engineering but expectation mapping and strategic over-delivery.

The sequencing principle: fix Experience before investing in Magic. A brand with a −1 on Experience that invests in Magic initiatives is adding peaks to an unreliable baseline. Customers who encounter magic in one interaction and inconsistency in the next do not become advocates. They become confused — and confusion is the precursor to churn.

B2B Experience: the seams are felt

In B2C, Experience failure is visible and dramatic: the wrong product delivered, the rude support call, the website that crashes at checkout. In B2B, Experience failure is quieter and more expensive.

NTT Data's Experience challenge was not a single bad project. It was organisational inconsistency across post-merger engagement models. Different teams, acquired through different M&A paths, delivering different service standards under the same brand name. The client could feel the seams — the inconsistency between what the sales team promised and what the delivery team delivered, between what one regional office did and what another understood the engagement model to be.

B2B clients do not churn after one bad interaction. They churn after accumulating evidence that the inconsistency is structural rather than situational. The moment a client forms the belief "this isn't a bad week, this is how they operate" — the renewal conversation has already been lost. The revenue metric confirms it six months later.

For B2B service businesses, Experience design means: what does a client encounter at every stage of the engagement, regardless of which team member they are talking to? The standard is not the best delivery manager on staff. It is the minimum consistent standard that can be trained, documented, and reliably reproduced.

Experience in the Marketing Canvas

The canonical question

What is it actually like to be your customer?

Experience is a Fatal Brake for three archetypes — tied with Positioning and Features for the most Fatal Brake appearances of any single dimension:

Fatal Brake for A4 (Stagnant Leader): Experience failure is the proximate cause of stagnation. The canonical A4 pattern: churn rises, leadership reaches for Acquisition to refill the bucket. The method says fix the leak first. For Sage in 2019, fragmented UX across dozens of legacy SKUs and desktop-era screens was driving customers to Xero and QuickBooks before the retention team even knew they were at risk. No acquisition investment can compensate for an experience that is actively driving customers away. Experience must reach ≥+2 before any other A4 investment makes strategic sense.

Fatal Brake for A6 (Value Harvester): A company extracting maximum cash flow from an existing base depends entirely on retention. Every 1% of churn that Experience failure generates is a permanent reduction in the cash extraction potential. For A6, Experience is not a growth lever — it is a defensive necessity. The floor below which the strategy collapses.

Fatal Brake for A7 (Scale-Up Guardian): Hypergrowth destroys experience consistency. Teams grow faster than onboarding can standardise behaviour. Processes built for 50 customers break at 500. The individual attention that defined early relationships becomes structurally impossible at scale. The Scale-Up Guardian's primary Experience challenge is not improving the experience — it is preserving the experience as headcount and customer volume compound. Every month of growth without experience governance is a month of promise dilution.

Secondary Brake for A2 (Efficiency Machine): For the Efficiency Machine, Experience operates at the operational level. Magic (440) is the adjacent dimension that eliminates friction entirely; Experience sets the floor below which efficiency becomes indistinguishable from indifference. A cost-leader that delivers a genuinely frictionless experience retains customers. A cost-leader that delivers a degraded experience loses them to whichever competitor can match the price with marginally better service.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. For each moment, your brand answer has been adapted to your customers' identity.

  2. For each moment, your brand answer has helped customers to achieve their goals.

  3. For each moment, your brand answer is consistent in time and space, leaving nothing to chance.

  4. For each moment, your brand answer has clear expectations and delivers them consistently.

  5. For each moment, your brand answer is compatible with the concept of sustainability.

(Dimensions 421–424 + 425 in the Marketing Canvas scoring system)

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."
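The Detailed Track rounding rule can be sketched in a few lines of Python. This is an illustrative sketch, not part of the Method's published tooling — the function name and input validation are assumptions, and note that Python's built-in round() uses banker's rounding on exact .5 averages.

```python
def dimension_score(sub_scores):
    """Average Detailed Track sub-question scores into a dimension score.

    The scale runs from -3 to +3 with no zero allowed on individual
    answers. An average that rounds to a mathematical zero is recorded
    as -1, because "not clearly helping" is investigated like "hurting".
    """
    if not all(isinstance(s, int) and -3 <= s <= 3 and s != 0 for s in sub_scores):
        raise ValueError("sub-scores must be non-zero integers between -3 and +3")
    rounded = round(sum(sub_scores) / len(sub_scores))  # nearest whole number
    return -1 if rounded == 0 else rounded
```

For example, sub-scores of +2, −2, +1, −1 average to exactly zero and are recorded as −1, while +1, +1, +2, +1 average to 1.25 and round to +1.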

Interpreting your scores

Negative scores (−1 to −3): Experience varies unpredictably across touchpoints, teams, or time. The brand promise is undefended in at least some interactions. For archetypes where Experience is a Fatal Brake, this score explains why churn is rising and retention investment is not working. The leaky bucket cannot be fixed by adding more acquisition — it must be fixed at the experience level first.

Positive scores (+1 to +3): Experience is intentional, documented, trained, and measured. Every moment has a designed response. Customers describe the experience in consistent language that matches the brand's intended positioning. The baseline is reliable. Magic (440) initiatives can now be layered on top of a consistent foundation rather than compensating for an inconsistent one.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak)

Green Clean's experience varies significantly by team member and visit. The two full-time cleaners operate consistently. The three part-time contractors, hired during a growth period, have had no structured onboarding and no shared standard for what a Green Clean visit should look and feel like. Some customers receive a verbal explanation of the formula used; others do not. Some receive the Family Health Report within 24 hours; others wait three days or receive it after a follow-up request. When a customer calls to ask about an ingredient, the response depends on which team member picks up. The experience is sometimes excellent and frequently adequate — but it is never reliably consistent. When the founder asks customers how the experience compares to EcoPure, the feedback is mixed: "better sometimes, comparable usually." That is a −1: experience is not reliably reflecting the positioning.

Score: +1 to +2 (Developing)

Green Clean has identified the three highest-variance touchpoints from customer research: the onboarding call, the first-service visit, and the Family Health Report delivery. For each, a standard has been designed and documented. Contractors are trained on the first-service protocol. The Health Report is now automated — delivered within 6 hours of every service completion without requiring manual action. The onboarding call has a structured agenda that ensures the health-first positioning is explained consistently regardless of who conducts it. Variance has reduced but not eliminated — the support interaction (what happens when a customer reports a concern) remains undesigned and inconsistent. Positive customer descriptions of the experience are converging on consistent language: "professional," "trustworthy," "actually explains what they're doing." The experience baseline is improving. It is not yet reliable enough to score +2.

Score: +2 to +3 (Strong)

Every Green Clean customer touchpoint has a designed response, documented standard, and trained delivery. The experience is consistent whether the customer is in their first month or their third year, whether they call on a Monday or a Saturday, whether their regular cleaner is available or a substitute is deployed. When a substitute is required, the customer receives a proactive message explaining the change and confirming the substitute has been briefed on the household profile. Support interactions follow a structured resolution protocol — concern acknowledged within 2 hours, resolution proposed within 24 hours, follow-up confirmed within 48 hours. Customer descriptions of the experience use consistent language unprompted: "they always explain what they've done," "I never have to chase anything," "it's the same standard every time." The NPS promoter cohort grew from 38% to 62% between 2021 and 2024 — a direct consequence of experience consistency, not product change.

Connected dimensions

Experience does not operate in isolation. Four dimensions connect most directly:

  • 410 — Moments: Experience responds to moments. Every Experience initiative traces back to a specific mapped moment where the current response is inadequate. Without a complete Moments map, Experience improvements are directional guesses — improving the wrong touchpoints while leaving the highest-variance ones unaddressed.

  • 130 — Pains & Gains: Experience design eliminates pains. The specific pains identified in journey research — the ones that accumulate into churn — are the Experience design brief. A pain at the research phase is an Experience problem in the before stage. A pain at the support interaction is an Experience problem in the after stage.

  • 440 — Magic: Magic elevates experience beyond consistency. Once the baseline is reliable, Magic creates the peaks that generate advocacy. The sequencing is fixed: fix Experience first, then invest in Magic. A +2 on Experience is the prerequisite for Magic initiatives to work as intended.

  • 630 — Lifetime: Experience quality predicts customer lifetime. The most reliable predictor of whether a customer will still be a customer in 12 months is whether their ongoing experience is consistently meeting the promise. Experience is not just a satisfaction metric — it is the leading indicator of lifetime value.

Conclusion

Experience is tied as the most frequent Fatal Brake in the Marketing Canvas Method for a straightforward reason: it is the dimension that most directly connects to churn. Customers do not leave because of a single terrible interaction. They leave because the cumulative experience does not consistently reflect the promise that acquired them.

The strategic diagnostic is not "how good is our best experience?" — teams consistently overrate on this question because they remember peaks and discount inconsistency. The question is: "what does every customer encounter, every time, regardless of team member, channel, or day of the week?"

If the honest answer is "it depends" — dimension 420 is the initiative queue.

Sources

  1. Matt Watkinson, The Ten Principles Behind Great Customer Experiences, FT Publishing, 2013

  2. Bain & Company, "Closing the Delivery Gap", 2005 — bain.com (the foundational 80/8 gap research: 80% of companies believe they deliver a superior experience; 8% of customers agree)

  3. Marketing Canvas Method, Appendix E — Dimension 420: Experience, Laurent Bouty, 2026

About this dimension

Dimension 420 — Experience is part of the Journey meta-category (400) in the Marketing Canvas Method. The Journey meta-category contains four dimensions: Moments (410), Experience (420), Channels (430), and Magic (440).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.


Marketing Canvas - Job To Be Done

Customers don't buy products — they hire them to make progress. Dimension 110 of the Marketing Canvas explains how to define the job at all three layers (functional, emotional personal, emotional social), why it is a Fatal Brake for Category Creators, and the single diagnostic sentence that exposes whether your team actually knows it.

About the Marketing Canvas Method

This article covers dimension 110 — Job To Be Done, part of the Customers meta-category. The Marketing Canvas Method structures marketing strategy across 24 dimensions and 9 strategic archetypes.
Full framework reference at marketingcanvas.net →  ·  Get the book →

In a nutshell

Job To Be Done (dimension 110) captures the ultimate objective that inspires a customer to hire your product or service. Not a description of what your product does. The reason a customer reaches for it in the first place — the progress they are trying to make in their life.

Theodore Levitt put it plainly in 1960: people don't want a quarter-inch drill. They want a quarter-inch hole. But the Marketing Canvas goes further. The hole is still only the surface. The functional job ("hang a picture") sits beneath an emotional job ("feel proud of my home") and a social job ("be seen as someone with good taste"). All three determine which product wins. Scoring only the functional layer produces a dimension score that flatters and misleads.

In the Marketing Canvas, JTBD is the first dimension in the Customers meta-category — the starting point for everything. Before positioning, before features, before pricing: who are your customers and what are they trying to accomplish?

What JTBD actually is

Customers don't buy products. They hire them to make progress.

That reframing has a sharp implication: the real competition for any product is not other products in the same category. It is every solution the customer could hire for the same job. Spotify competes with podcasts, meditation apps, and audiobooks — because all of them compete for the same job: "help me feel less anxious during my commute." Netflix competes with sleep. Understanding the job reveals the competition that a feature-based analysis never finds.

Jobs change slowly. Solutions change constantly.

This is the strategic insight that makes JTBD durable. A customer's functional job ("get from A to B without owning a car") has existed for decades. The solutions that serve it — taxis, rental cars, Uber, Lime scooters — change with technology. Brands that define themselves by the solution become obsolete when the solution changes. Brands that define themselves by the job remain relevant regardless.

Clayton Christensen, who popularised the framework in Competing Against Luck (2016), put it this way: jobs aren't just about function — they have powerful social and emotional dimensions. A brand that only understands the functional layer of its customer's job is working with a partial map.

Clayton Christensen, professor at Harvard Business School, talks about the job to be done.

The three layers of every job

The Marketing Canvas structures JTBD across three scored sub-questions — one per layer. All three must be understood to score the dimension honestly:

Functional job — the tangible, measurable task the customer needs to accomplish. "Get my home clean." "File my tax return." "Track my fitness." This is the layer most companies understand reasonably well. It is necessary but not sufficient.

Emotional personal job — how the customer wants to feel as a result of getting the job done. "Feel safe in my own home." "Feel in control of my finances." "Feel like someone who takes care of themselves." This layer is what differentiates brands in mature categories where functional performance has converged. Two cleaning services that perform identically will be separated by which one makes the customer feel more like the person they want to be.

Emotional social job — how the customer wants to be perceived by others as a result of the purchase. "Be seen as a responsible parent." "Be known as someone who makes smart financial decisions." "Be recognised as someone who takes health seriously." This layer drives premium pricing, word-of-mouth, and tribal loyalty. It is the layer most commonly undiscovered because customers rarely articulate it directly — it has to be observed or inferred.

Job To Be Done

JTBD in the Marketing Canvas

The canonical question

What job is the customer hiring your product to do?

JTBD appears in the Vital 8 of three archetypes — in the highest-stakes roles:

  • Fatal Brake for A9 (Category Creator): You cannot create a category around a job you haven't named. This is the existential challenge for any company attempting category creation — the job must be defined, named, and taught to the market before any scaling investment makes sense. Green Clean's entire strategic progression hinged on shifting from "eco-cleaning company" (a crowded, undifferentiated category) to "the company that protects your family from indoor toxins" (a job the market hadn't yet named). The 2021 JTBD score of −1 blocked all ALIGN activity until the job was defined. That gate is not a bureaucratic rule — it reflects the reality that you cannot market a job the customer doesn't yet recognise.

  • Secondary Brake for A4 (Stagnant Leader): Losing touch with the job is the first sign of strategic drift. Leaders stagnate when their product roadmap continues to answer the job their customers used to have rather than the one they have now. Kodak understood the job of "preserve memories" — but only in the film layer. When the job migrated to digital, Kodak's JTBD score quietly turned negative while revenue held. The revenue metric lagged the strategic failure by years.

  • Secondary Brake for A8 (Niche Expert): A niche expert's authority rests on understanding the customer's job at a depth generalists cannot match. When a niche expert begins to drift toward average-customer thinking — serving the mainstream version of the job rather than the specific, nuanced version their segment actually has — the authority erodes. The niche is lost before the revenue line shows it.

Marketing Canvas by Laurent Bouty - Job To Be Done

The red flag test

The Marketing Canvas applies a single diagnostic sentence to determine whether a JTBD score can reach +2 or above:

Can your team complete the sentence "Customers hire us to help them ___" without mentioning a feature?

If the answer requires a feature — "customers hire us to help them use our proprietary cleaning formula" — the job is not yet defined. The feature is the solution. The job is independent of any particular solution. A score of 0 or below is the honest result until the sentence can be completed in customer language: "customers hire us to help them know their family is safe at home."

This test consistently exposes the gap between a company that sells a product and a company that understands its job. The sentence has to be written in customer language, not marketing copy. "Enable sustainable home care solutions" fails the test. "Help me know my children aren't breathing toxins" passes it.

Statements for self-assessment

Rate your agreement on a scale from −3 (completely disagree) to +3 (completely agree). There is no zero — the Marketing Canvas forces a directional position on every dimension.

  1. You have clearly identified the functional unmet goals of your customers and feel confident in addressing them.

  2. You have clearly identified the emotional personal unmet goals of your customers and feel confident in addressing them.

  3. You have clearly identified the emotional social unmet goals of your customers and feel confident in addressing them.

  4. Your Job To Be Done is compatible with the concept of sustainability.

(Dimensions 111–113 + 115 in the Marketing Canvas scoring system. The dimension score is the average of the four sub-scores, rounded to the nearest whole number.)

Brake verdict (Dim 110): No, I have not clearly identified the functional and emotional unmet goals of my customers. My Job To Be Done is not helping me achieve my goals.

Accelerator verdict (Dim 110): Yes, I have clearly identified the functional and emotional unmet goals of my customers and feel confident addressing them. My Job To Be Done is helping me achieve my goals.

Note on Detailed Track scoring: if averaging sub-question scores produces a mathematical zero, the method rounds to −1. A split score means the dimension is not clearly helping your goal — and "not clearly helping" requires the same investigation as "hurting."

Interpreting your scores

Negative scores (−1 to −3): Your understanding of the customer's job is incomplete, product-defined, or unvalidated by research. The likely result: marketing talks about solutions customers don't recognise as theirs; innovation addresses the wrong problem; competitors who understand the job more deeply will win the customer without a price war.

Positive scores (+1 to +3): You understand what customers are hiring you to do — at all three layers — and that understanding is grounded in research, not assumption. Marketing speaks the customer's language. Product decisions trace back to the job. You can name competitors from completely different categories that serve the same job.

Case study: Green Clean

Green Clean is a fictional eco-friendly residential cleaning service used as the recurring worked example throughout the Marketing Canvas Method.

Score: −2 to −1 (Weak)

Green Clean understands the functional job superficially: "get the house clean using eco-friendly products." They have not identified the emotional personal job ("feel confident that my home is genuinely safe, not just superficially tidy") or the emotional social job ("be the kind of parent who makes responsible choices for my family"). Their marketing talks about product ingredients and eco-certifications — solution language, not job language. The team cannot complete the red flag sentence without mentioning a product feature. Customers who share the deeper job don't recognise themselves in Green Clean's messaging. The brand reaches people who already care about eco-cleaning; it doesn't reach the larger group who care about family health and haven't yet connected that job to a cleaning service.

Score: +1 to +2 (Developing)

Green Clean has begun to articulate the deeper job: "protect indoor health." The functional layer is clear. The emotional personal layer is partially mapped — customer research has identified that parents are the primary segment and that the dominant emotional driver is "not worrying about what my children are exposed to." The emotional social layer is still assumed rather than researched. Marketing has started shifting from ingredient-led to outcome-led language, but execution is uneven. Some campaigns lead with health; others still lead with eco-credentials. The team can complete the red flag sentence most of the time, though the phrasing varies between team members — a sign the job definition hasn't fully landed internally.

Score: +2 to +3 (Strong)

Green Clean's JTBD is precisely defined across all three layers and validated by customer research. Functional: "keep my home free from toxic chemical residues." Emotional personal: "feel confident that the air my children breathe at home is safe." Emotional social: "be a household my neighbours know takes health and environment seriously." The Family Health Report — a monthly transparency dashboard showing toxin load avoided per visit — was designed directly from the emotional personal layer. It addresses the job, not the service feature. Every team member completes the red flag sentence in the same language. Marketing leads with the job. The job definition has been stable for 18 months, even as the product has evolved.

Connected dimensions

JTBD does not operate in isolation. Five dimensions connect most directly:

  • 120 — Aspirations: The job feeds the aspiration. If the job is "protect my family's health," the aspiration is "be a parent who makes responsible choices." The aspiration is the identity version of the job — who the customer wants to become as a result of getting it done.

  • 130 — Pains & Gains: Pains block the job. Gains accelerate it. A precise JTBD definition is the prerequisite for mapping pains and gains usefully — without it, you're cataloguing frictions without knowing which ones matter.

  • 220 — Positioning: Positioning is how you frame the job externally. Green Clean's positioning shift from "eco-friendly cleaning" to "indoor health protection" is a direct translation of the JTBD from internal strategy to external claim. Positioning that doesn't reference the job occupies no mental real estate.

  • 310 — Features: Features must solve the job. Every feature that doesn't serve the customer's job is complexity without value. The JTBD definition is the filter that decides which features matter and which are engineering ambition.

  • 320 — Emotions: The emotional job defines the target feeling. Emotional benefits in the value proposition are the delivery mechanism for the emotional layer of the job. If you don't know the emotional job, you cannot design the right emotional benefit.

Conclusion

Job To Be Done is the first dimension in the Marketing Canvas for a reason. Everything downstream — positioning, features, pricing, experience, stories — only makes sense if it is oriented toward a job the customer actually has.

The strategic error is not failing to understand JTBD in theory. Most marketers can explain the drill-and-hole metaphor. The error is defining the job in product terms rather than customer terms, validating it with internal assumptions rather than customer research, and stopping at the functional layer without mapping the emotional dimensions that determine which brand wins when products perform comparably.

The test is simple: can your team complete the sentence without mentioning a feature? If they can, in consistent customer language, the dimension is working. If they can't, everything built on top of it is built on an assumption.

Sources

  1. Theodore Levitt, "Marketing Myopia", Harvard Business Review, 1960 — hbr.org

  2. Clayton Christensen, Taddy Hall, Karen Dillon, David S. Duncan, Competing Against Luck, Harper Business, 2016

  3. Alan Klement, When Coffee and Kale Compete, 2018 — alanklement.com

  4. Tony Ulwick, Jobs to be Done: Theory to Practice, Strategyn Press, 2016 — strategyn.com

  5. Marketing Canvas Method, Appendix E — Dimension 110: Job To Be Done, Laurent Bouty, 2026

About this dimension

Dimension 110 — Job To Be Done is part of the Customers meta-category (100) in the Marketing Canvas Method. The Customers meta-category contains four dimensions: Job To Be Done (110), Aspirations (120), Pains & Gains (130), and Engagement (140).

The Marketing Canvas Method is a complete marketing strategy framework built around 6 meta-categories, 24 dimensions, and 9 strategic archetypes. Learn more at marketingcanvas.net or in the book Marketing Strategy, Programmed by Laurent Bouty.
