<node id="690034">
  <nid>690034</nid>
  <type>news</type>
  <uid>
    <user id="27233"><![CDATA[27233]]></user>
  </uid>
  <created>1777463806</created>
  <changed>1777557728</changed>
  <title><![CDATA[The Blind Spot in Modern Supply Chain Analytics: Where Did Critical Thinking Go?]]></title>
<body><![CDATA[<p><em>By Chris Gaffney, Managing Director of the Georgia Tech Supply Chain and Logistics Institute, Supply Chain Advisor, and former executive at Frito‑Lay, AJC International, and Coca‑Cola.</em></p><p><strong>In this issue:</strong></p><ul><li>The real blind spot in analytics teams</li><li>Three failures where the model was “right” and the decision was wrong</li><li>A five-question checklist to run before anything goes to leadership</li></ul><h2>A Subtle but Growing Concern</h2><p>Over the past several months, I have had conversations with senior leaders at several large, well-established supply chain organizations whose strong teams are responsible for Integrated Business Planning (IBP) and supply chain network design and optimization.</p><p>These teams are technically strong. They know how to build models. They are comfortable with large data sets. Many are now incorporating AI tools into their workflows.</p><p>But the same concern keeps surfacing across those conversations:</p><blockquote><p><strong>The analytical capability is improving—but the decision-making discipline around it is not keeping pace.</strong></p></blockquote><p>Analysts move quickly to building models without fully defining the business problem. Assumptions are not always surfaced or challenged. Outputs are evaluated mathematically, not operationally. And recommendations are not always translated into real-world implications.</p><p>Leaders are concerned about this and are looking for ways to address it. I share their concern because I have been in their shoes.</p><h2>What the Experience Taught Us</h2><p>Earlier in my career, across different roles at Coca-Cola, we did not formally teach critical thinking. We learned it through experience and often through mistakes. 
Three situations shaped how I think about this today.</p><h3>Powerade: When the Model Works but the Thinking Doesn’t</h3><p>While working with optimization groups at Coca-Cola North America, we overbuilt capacity for Powerade. The model did exactly what it was supposed to do. The problem was upstream of the model.</p><p>We took the demand forecast at face value. At the time, we deferred to the brand teams without interrogating their assumptions. We never asked what was driving the projected volume—whether the competitive dynamics supported it, whether the channel assumptions were realistic, whether pricing and distribution plans were grounded, whether overall market growth would materialize as projected.</p><p>The consequence was idle capacity, production lines that were purchased and never installed, write-offs, and a fundamental change to our process. Going forward, brand and supply chain teams were both required to sign off on future business cases. The model was technically correct. The thinking around the model had not been.</p><h3>Little Rock: When Feasibility Isn’t Reality</h3><p>Later, within Coca-Cola Supply, we made a network decision to close a plant in Little Rock. On paper, the remaining system had the capacity to absorb the volume. The model said so.</p><p>What the model assessed was production capacity based on rated line speeds. What it did not account for was dock and storage capacity at peak, or the practical limitations of standing up a new shift at the receiving plants. Those constraints were real. They were also invisible in the model.</p><p>In the short term, we had to source suboptimally from other plants—which directly undermined the business case we had built to justify the closure. The math was right. The operational validation was incomplete.</p><h3>Mini Cans: When the Thinking Matches the Model</h3><p>By the time I led the National Product Support Group, we had evolved. 
Decisions like the launch of mini cans required cross-functional alignment, scenario-based thinking, and a clear understanding of how demand would actually be generated across channels and routes to market.</p><p>We got that one right, not because the model was more sophisticated, but because the discipline around the model was stronger. We had learned, the hard way, to ask the questions the model could not ask for itself.</p><h2>Most of the Work Is Outside the Model</h2><p>There is a line I first heard from Chris Janke: "Most of the work is outside the model." He may have learned it from someone else; I don’t know the original source, but it is the framing that has stayed with me. With the advances in data and machine learning we have seen over the past decade, that proportion may be closer to 75 percent today.</p><p>We are better than ever at collecting and cleansing large data sets, processing high volumes of information, and identifying mathematical errors. But the most important work still happens outside the model: defining the right business question, building meaningful scenarios, interpreting outputs in real-world terms, and stress-testing the assumptions that drive the recommendation.</p><p>Janke documented his own experience with a modeling error that illustrates the point. An analyst had validated the math on a labor cost model—everything checked out numerically. But when the output was translated into real-world terms, it implied production workers earning roughly $300,000 per year while working approximately 60 hours total annually. The math was internally consistent. The result was operationally impossible. The question that should have been asked early: does this make sense in the context of how the business actually operates? It was not asked until after the analysis was complete.</p><p>The discipline to ask that question is not a modeling skill. 
It is a critical thinking skill.</p><h2>Where the Breakdown Happens</h2><h3>Before the Model: Skipping the Hard Questions</h3><p>A common pattern today is that analysts move quickly to building the model. The harder and more important step of defining the business decision before the model is built gets compressed or skipped entirely. The questions that require that step are not complicated, but they take time and engagement to answer well:</p><ul><li>What business decision are we actually trying to make?</li><li>What scenarios matter, and why?</li><li>What does success look like—not mathematically, but operationally?</li><li>What constraints are real versus assumed?</li></ul><p>These questions are not as clean as coding a model. They require conversations with people who understand the constraints, not just the data. That is part of why they get skipped.</p><h3>After the Model: Mistaking Mathematical Accuracy for Business Validity</h3><p>This is where more serious errors occur. Model issues can usually be fixed with more time. Misinterpretation of output leads to bad decisions that are much harder to unwind.</p><p>The Powerade and Little Rock situations both illustrate this. In each case, the model was not wrong in any technical sense. What was missing was the translation layer, where someone asks, “What changes on a Tuesday night shift, at Plant B, when demand spikes 12 percent?”</p><p>That translation layer does not happen automatically. It has to be built into how teams work. And it is exactly the discipline that gets squeezed when organizations reward speed and analytical sophistication above everything else.</p><h2>What Critical Thinking Actually Means in Supply Chain</h2><p>Critical thinking in supply chain is not skepticism for its own sake, and it is not a soft skill that sits alongside the analytical work. It is a discipline applied to decisions and not just to models. 
The word itself points to what we mean: <em>kritikos</em>, the Greek root, means skilled in judging, able to discern.* That is the right definition for our purposes.</p><p>It means asking whether the right question is being answered before investing in answering it well. It means making the assumptions that drive a recommendation visible and testable. It means translating analytical output into operational consequence: what actually changes, for whom, at what cost, and under what conditions the answer flips.</p><p>That discipline shows up or breaks down at four specific moments:</p><ol><li><strong>Before the model is built</strong>: Is the business question defined precisely enough to model?</li><li><strong>While the model is running</strong>: Are the assumptions embedded in the data realistic and challenged?</li><li><strong>When the output is ready</strong>: Does this result make sense in how the business actually operates?</li><li><strong>Before the recommendation goes forward</strong>: Have we planned for how this will be received, and by whom?</li></ol><p>When these moments are skipped, whether because of time pressure, overconfidence in tools, or a culture that rewards analytical speed over decision rigor, the gap between analysis and action grows. The Powerade and Little Rock situations were both failures at these moments, not failures of the models themselves.</p><p><em>*DeCesare, M. (2009). Casting a critical glance at teaching “critical thinking.” Pedagogy and the Human Sciences, 1(1), 73–77.</em></p><h2>A Five-Question Diagnostic</h2><p>Before an analysis or recommendation moves forward, teams should be able to answer five questions clearly. 
If any of them cannot be answered, the analysis is not ready—regardless of how strong the model is.</p><p><img src="https://www.scl.gatech.edu/sites/default/files/news/2026-04/5-question-diagnostic.jpg" alt="Strategic Analysis Checklist infographic."></p><p><a href="https://hg.gatech.edu/sites/default/files/documents/2026-04/20260430_Figure1_Five-QuestionDiagnostic_SpotlightArticle.docx"><em><strong>Figure 1: A Five-Question Diagnostic (accessible version)</strong></em></a></p><p>These are questions that should have specific, grounded answers before a recommendation reaches leadership. If the team cannot answer question two (what assumption would flip the result), then the recommendation rests on unexamined ground. If question four cannot be answered, the change management work has not started yet.</p><p>In the Powerade situation, questions one and two were the misses. In Little Rock, it was question three. The models were not the problem. The diagnostic would have surfaced both gaps before the decisions were made.</p><h2>This Gap Is Well Documented</h2><p>What I am describing from my own experience is consistent with what the research shows.</p><p>A long-running finding in operations research is that many models are built and comparatively few actually drive decisions, and the breakdown is organizational, not technical. A widely cited review in the European Journal of Operational Research frames this as an implementation problem rooted in how models are connected (or not connected) to the people and processes that own the decision.</p><p>Professional credentialing bodies have recognized the same gap. The INFORMS Certified Analytics Professional blueprint explicitly lists business problem framing, stakeholder analysis, and business case development as core analytics competencies—not optional additions. 
The signal is clear: being analytically strong is necessary but not sufficient.</p><p>On the training side, a field study published in the European Journal of Operational Research tested the effects of structured decision training across roughly 1,000 decision makers and analysts. The results showed measurable improvement in proactive decision-making skills and decision satisfaction. The gap is real, and it is addressable. It is a training and design issue, not a talent issue.</p><h2>The 4 C’s: A Decision-Focused Framework</h2><p>At Georgia Tech SCL, we organize this thinking around what we call the 4 C’s. These soft skills play a key role in the decision process. Each one asks a specific question about whether the decision, not just the analysis, was made well.</p><p><img src="https://www.scl.gatech.edu/sites/default/files/news/2026-04/the-4-Cs.jpg" alt="The 4 Cs Decision Test infographic."></p><p><a href="https://hg.gatech.edu/sites/default/files/documents/2026-04/20260430_Figure2_The4Cs_SpotlightArticle.docx"><em><strong>Figure 2: The 4 C’s: A Decision-Focused Framework (accessible version)</strong></em></a></p><p>Notice what this framework does not include: model accuracy, data quality, or visualization quality. Those matter, and they are inputs to the decision. But a team can have a perfect model, a clean dataset, and a compelling dashboard and still fail all four of these tests.</p><p>The Powerade situation failed the Collaboration test: the supply chain team did not sufficiently interrogate the brand team’s assumptions. Little Rock failed the Critical Thinking test: the right question was not asked about what the model was not capturing. In both cases, the Communication and Change Management failures followed directly from those upstream gaps.</p><p>When all four are present, analysis becomes a decision. 
When one or more is missing, the analysis and translation to a solid recommendation are at risk.</p><h2>Where to Start</h2><p>This topic keeps coming up in conversations with companies, in work with practitioners, and in what we hear from students as they move into industry roles.</p><p>The tools are not the problem. AI-assisted analytics, optimization models, and advanced forecasting are real assets. But tools amplify the thinking behind them. Weak decision discipline combined with better tools is a faster path to the wrong answer.</p><p>If this shows up in your organization, try the five-question diagnostic on your next recommendation before it hits leadership. If it surfaces gaps you cannot close quickly, SCL can help. We are building workshops and courseware on decision-focused critical thinking, and we will cover this in our <a href="https://www.scl.gatech.edu/events/calendar/day/2026/06/04/13298">June Lunch and Learn</a>.</p><p>Reach out through the SCL Spotlight or directly at <a href="mailto:info@scl.gatech.edu">info@scl.gatech.edu</a>.</p>]]></body>
  <field_subtitle>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_subtitle>
  <field_dateline>
    <item>
      <value>2026-04-29T00:00:00-04:00</value>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_dateline>
  <field_summary_sentence>
    <item>
      <value><![CDATA[While modern supply chain analytics and AI are more advanced than ever, technical capability must be paired with rigorous critical thinking and operational discipline to ensure data-driven models translate into successful real-world decisions.]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>Despite the rapid advancement of AI and data modeling in supply chain management, many organizations face a growing "blind spot" where sophisticated mathematical outputs are not adequately challenged by human intuition or operational reality. Drawing on experience, author Chris Gaffney illustrates how neglecting to stress-test assumptions can lead to costly mistakes even when the data itself is accurate. To bridge this gap, the article introduces a strategic diagnostic framework designed to help leaders move beyond technical validation and toward more holistic, cross-functional decision discipline.</p>]]></value>
    </item>
  </field_summary>
  <field_media>
          <item>
        <nid>
          <node id="680113">
            <nid>680113</nid>
            <type>image</type>
            <title><![CDATA[The Blind Spot in Modern Supply Chain Analytics: Where Did Critical Thinking Go?]]></title>
            <body><![CDATA[]]></body>
                          <field_image>
                <item>
                  <fid>264353</fid>
                  <filename><![CDATA[spotlight-SC_critical_thinking_1200x1200.jpg]]></filename>
                  <filepath><![CDATA[/sites/default/files/2026/04/29/spotlight-SC_critical_thinking_1200x1200.jpg]]></filepath>
                  <file_full_path><![CDATA[http://hg.gatech.edu/sites/default/files/2026/04/29/spotlight-SC_critical_thinking_1200x1200.jpg]]></file_full_path>
                  <filemime>image/jpeg</filemime>
                  <image_740><![CDATA[]]></image_740>
                  <image_alt><![CDATA[Two data analysts, a man in a suit and a woman, are seated at a desk in a high-tech logistics control center. They monitor various displays, including a comprehensive data dashboard with charts and graphs, a US network map, and a tablet for a video conference. A massive, towering warehouse filled with stacked cardboard boxes is visible in the background.]]></image_alt>
                </item>
              </field_image>
            
                      </node>
        </nid>
      </item>
          <item>
        <nid>
          <node id="674087">
            <nid>674087</nid>
            <type>image</type>
            <title><![CDATA[Chris Gaffney]]></title>
            <body><![CDATA[<p>Chris Gaffney</p>]]></body>
                          <field_image>
                <item>
                  <fid>257557</fid>
                  <filename><![CDATA[chris-gaffney_scl.jpg]]></filename>
                  <filepath><![CDATA[/sites/default/files/2024/05/30/chris-gaffney_scl.jpg]]></filepath>
                  <file_full_path><![CDATA[http://hg.gatech.edu/sites/default/files/2024/05/30/chris-gaffney_scl.jpg]]></file_full_path>
                  <filemime>image/jpeg</filemime>
                  <image_740><![CDATA[]]></image_740>
                  <image_alt><![CDATA[Chris Gaffney, Managing Director, Georgia Tech Supply Chain and Logistics Institute]]></image_alt>
                </item>
              </field_image>
            
                      </node>
        </nid>
      </item>
      </field_media>
  <field_contact_email>
    <item>
      <email><![CDATA[info@scl.gatech.edu]]></email>
    </item>
  </field_contact_email>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <!--  TO DO: correct to not conflate categories and news room topics  -->
  <field_categories>
          <item>
        <tid>194606</tid>
        <value><![CDATA[Artificial Intelligence]]></value>
      </item>
          <item>
        <tid>42911</tid>
        <value><![CDATA[Education]]></value>
      </item>
          <item>
        <tid>145</tid>
        <value><![CDATA[Engineering]]></value>
      </item>
      </field_categories>
  <core_research_areas>
          <term tid="39461"><![CDATA[Manufacturing, Trade, and Logistics]]></term>
      </core_research_areas>
  <field_news_room_topics>
      </field_news_room_topics>
  <links_related>
          <link>
      <url>https://www.scl.gatech.edu/news-events/newsletters</url>
      <title></title>
      </link>
          <link>
      <url>https://www.scl.gatech.edu/</url>
      <title></title>
      </link>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>1250</item>
          <item>1242</item>
          <item>1243</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[Center for Health and Humanitarian Systems (CHHS)]]></item>
          <item><![CDATA[School of Industrial and Systems Engineering (ISYE)]]></item>
          <item><![CDATA[The Supply Chain and Logistics Institute (SCL)]]></item>
      </og_groups_both>
  <field_keywords>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
