There is a particular species of internal consultant who has confused activity with analysis. You can identify them by the trail of PowerPoint files they leave behind — sixty-slide decks with elaborate animations, transition pages decorated with stock photography, and appendices thick enough to stop a door. They are prolific producers of content and unreliable producers of insight.
I say this with empathy, because I was one of them for the first two years of my internal consulting career. I had come from an environment where the deck was the deliverable, where the quality of the thinking was assumed to correlate with the volume of the output. It took a CFO at one of my Fortune 50 engagements to disabuse me of this notion. She looked at my forty-page analysis, set it aside, and said: “I need you to tell me three things. What’s the problem? What should we do? What happens if we don’t?”
That was the day I stopped making decks and started solving problems. The distinction sounds semantic. It is not. It is the difference between internal consultants who are invited into strategic conversations and those who are asked to “put together some slides.”
The Internal Context Changes Everything
The core analytical frameworks taught in MBA programs and used by external firms are sound. Issue trees, MECE structuring, hypothesis-driven analysis, financial modeling — the tools themselves are not broken. What’s broken is the assumption that they can be applied identically in an internal context.
The internal environment introduces three variables that external methodologies were never designed to accommodate.
Variable 1: You Have Too Much Data
External consultants spend three weeks trying to get data. They submit requests, wait, negotiate access, receive sanitized extracts, and build models from incomplete information.
You have the opposite problem. As an internal consultant, you often have access to everything — transactional data, operational data, financial data, HR data, customer data, and the institutional memory of people who have been in their roles for twenty years. The challenge is not acquisition. It is filtration.
The most common analytical failure I’ve seen is not reaching the wrong conclusion — it is never reaching any conclusion because the analyst is drowning in data. They keep pulling one more report, running one more query, scheduling one more interview, because the data is there and the instinct is to be thorough. Thoroughness becomes a form of procrastination.
The intervention is ruthless hypothesis discipline. Before you touch a single data source, write down what you believe the answer is. Now ask: what are the three to five data points that would confirm or destroy this hypothesis? Go get those data points. Only those. If they confirm the hypothesis, stress-test it with the strongest counterargument. If they destroy it, revise and repeat.
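The discipline above is easy to state and easy to abandon, so it helps to treat the hypothesis as a literal worksheet. A minimal sketch in Python — the hypothesis text and data points are hypothetical placeholders, not a prescribed template:

```python
from dataclasses import dataclass, field

@dataclass
class Hypothesis:
    """A written-down answer plus the few data points that can confirm or destroy it."""
    statement: str
    # data point name -> True if it confirmed the hypothesis, False if it destroyed it
    tests: dict = field(default_factory=dict)

    def record(self, data_point: str, confirms: bool) -> None:
        self.tests[data_point] = confirms

    def verdict(self) -> str:
        if not self.tests:
            return "untested"
        if all(self.tests.values()):
            return "confirmed -- now stress-test with the strongest counterargument"
        return "destroyed -- revise the hypothesis and repeat"

# Hypothetical example: write the answer down BEFORE touching any data source,
# then pull only the three to five data points that can settle it.
h = Hypothesis("Southeast cost variance is driven by unintegrated acquisitions")
h.record("facility-level cost per unit", True)
h.record("platform migration status by site", True)
print(h.verdict())
```

The point of the structure is the forcing function: an empty `tests` dict is visible procrastination, and a single `False` entry forces an explicit revision rather than a quiet rationalization.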
Variable 2: You Know Too Many People
External consultants conduct stakeholder interviews with clinical detachment. They have no prior relationship with the interviewee, no organizational history, no political context.
You know Sarah in Operations has been territorial about her data since the 2022 reorganization. You know Marcus in Finance will give you the official numbers but not the real ones unless you ask the right way. You know the VP who commissioned this analysis has already decided what she wants the answer to be.
This institutional knowledge is simultaneously your greatest asset and your greatest risk. The asset: you can navigate directly to real data and real constraints, cutting weeks from discovery. The risk: you unconsciously weight information from people you trust and discount information from people you don’t. You shape your analysis to be survivable rather than accurate.
The countermeasure is what I call structured naivety — the deliberate practice of approaching a problem as if you don’t know the answer, even when you think you do. Interview stakeholders using a formal protocol rather than a casual conversation. Document disconfirming evidence with the same rigor as confirming evidence. Create a section in your analysis called “What if we’re wrong?” and populate it with genuine alternative explanations, not strawmen.
Variable 3: You Will Live With the Consequences
External consultants produce a recommendation and transition off the engagement. You will be in the building when your recommendation is implemented. You will see whether your assumptions held. You will be asked, a year later, whether the projected savings materialized.
This changes the analytical calculus fundamentally. External consultants can afford to be theoretically elegant. You must be operationally honest. Every assumption should pass the Monday morning test: will this assumption survive contact with the actual operations team on Monday morning?
The Six-Tool Internal Analytical Toolkit
Tool 1: The Bounded Issue Tree
Standard issue trees decompose a problem into MECE sub-problems. The bounded version adds a constraint layer: for each branch, specify whether it’s within scope, what data you will and won’t pursue, and what assumptions you’re accepting without further analysis. This prevents the infinite-data problem.
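The constraint layer works best when it is attached to each branch as explicit data rather than left implicit in the analyst's head. A minimal sketch in Python — the branch names, data sources, and assumptions are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class BoundedBranch:
    """An issue-tree branch carrying its own scope and data constraints."""
    question: str
    in_scope: bool
    data_to_pursue: list = field(default_factory=list)    # sources we will pull
    data_excluded: list = field(default_factory=list)     # sources we deliberately won't
    accepted_assumptions: list = field(default_factory=list)
    children: list = field(default_factory=list)

# Hypothetical decomposition of a cost-variance question
root = BoundedBranch(
    question="Why are Southeast operating costs above the national average?",
    in_scope=True,
    data_to_pursue=["facility-level P&L", "platform migration status"],
    data_excluded=["individual-employee HR records"],
    accepted_assumptions=["corporate allocations are applied uniformly"],
)
# An explicitly out-of-scope branch is still written down, so the boundary is a decision, not an omission.
root.children.append(BoundedBranch("Labor cost variance", in_scope=False))
```

Writing `data_excluded` down is the anti-drowning device: every source you decline to pull is a recorded choice you can defend, not an open loop pulling you back into the data.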
Tool 2: The 48-Hour Hypothesis Sprint
Before launching full analysis, spend 48 hours developing and stress-testing your initial hypothesis using only readily available data and conversations. If you can’t articulate a clear hypothesis within that window, the problem is not well-framed — go back to scoping.
Tool 3: The Anonymized Benchmarking Protocol
Cross-unit benchmarking inside the same company is politically explosive. The protocol includes data anonymization (performance quartiles, not named units), leadership preview (no surprises), and a “leading practices” frame rather than “laggards and leaders.” Same data, radically different reception.
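The anonymization step can be mechanically simple: report quartile membership instead of named rankings. A sketch in pure Python with made-up unit data — the unit names and cost figures are illustrative only:

```python
# Hypothetical cost-per-unit figures for eight business units
costs = {
    "Unit A": 112, "Unit B": 98, "Unit C": 141, "Unit D": 105,
    "Unit E": 87, "Unit F": 133, "Unit G": 120, "Unit H": 94,
}

def quartile_report(metrics: dict) -> dict:
    """Map each unit to a performance quartile (Q1 = best, i.e. lowest cost)."""
    ranked = sorted(metrics, key=metrics.get)          # best performers first
    size = len(ranked)
    return {unit: f"Q{(i * 4) // size + 1}" for i, unit in enumerate(ranked)}

report = quartile_report(costs)
# Publish only quartile membership and the leading practices of Q1 units --
# never a named league table of laggards.
```

The named ranking still exists for your own analysis; the protocol is about what leaves the room, not about destroying information.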
Tool 4: Dual-Track Root Cause Analysis
Standard root cause analysis identifies process failures. The dual-track version adds a parallel political-cultural analysis: not just “why did the process fail” but “why did the organization allow the process to fail.” The technical root cause gets the fix right. The political root cause determines whether the fix will be adopted.
Tool 5: CFO-Native Financial Modeling
Build your financial model using the exact same chart of accounts, cost categories, and return metrics your CFO uses. A brilliant analysis in the wrong financial framework is a rejected analysis.
Tool 6: The Pre-Mortem
Before finalizing your recommendation, assume it was implemented and failed. Why? What went wrong? This surfaces implementation risks that pure analytical frameworks miss and gives you material for the risk mitigation section that makes your recommendation adoptable.
The “So What” Discipline
Every analysis must answer one question: so what? Analysis tells you operating costs in the Southeast are 18% above the national average. Insight tells you the variance is driven by three facilities acquired in 2019 that were never integrated onto the standard platform, and migrating them would generate $4.2M in annual savings with an eight-month payback.
Analysis is a fact. Insight is an actionable recommendation with a quantified outcome. Train yourself to never present a finding without its implication. If you cannot articulate the “so what,” the finding is not ready. If the “so what” doesn’t connect to a decision the organization needs to make, the finding is irrelevant to this engagement.
Stop making decks. Start solving problems. The organization doesn’t need more slides. It needs someone who can walk into a room, frame the real problem, and tell them what to do about it.