Can you make a better decision by knowing less? In a high-stakes emergency room, the answer is often yes. Most of us assume that gathering more information leads to more accurate conclusions. But the story of Lee Goldman and his decision tree suggests we're often drowning in data that clouds our judgment rather than clearing it.
The Goldman algorithm for chest pain was the product of a decade-long quest to solve a specific medical problem, and it found its proving ground at Cook County Hospital. Doctors there were overwhelmed with patients who thought they were having heart attacks. They ran every test available, yet they were still guessing. The resulting chaos meant that truly sick patients waited while healthy ones occupied expensive beds.
Lee Goldman didn't rely on gut feelings or medical seniority. He used math to strip away everything that didn't matter. He realized that while a patient's age, weight, and smoking history are important for long-term health, they're often noise during a crisis. By focusing on a few specific data points, his model outperformed the most experienced cardiologists in the building.
In his book Blink, Malcolm Gladwell presents this as a classic example of thin-slicing: the ability of our unconscious to find patterns in situations based on very narrow slices of experience. Goldman, in effect, formalized that instinct into an explicit decision tree. He showed that in complex situations, we don't need to know everything to find the truth.
This matters in the business world because we face the same information overload. We think we need a forty-page report to make a hire or launch a product. In reality, the most important signals are often hidden under a mountain of irrelevant statistics. Learning to edit our information is a superpower in a world that never stops talking.
Goldman’s model works by prioritizing three specific "urgent risk factors" alongside the results of an initial electrocardiogram (ECG). If the ECG doesn't show an acute heart attack outright, the doctor checks just three things. Together with the ECG, these variables tell the physician exactly where the patient should go.
The first factor is whether the patient's pain is unstable angina. This isn't just any chest pain; it’s a specific pattern that suggests a heart is struggling for blood. When this variable is present, the risk profile changes immediately. It’s a clear, binary signal that demands attention over more vague symptoms.
The second factor is the presence of fluid in the lungs. Doctors listen for a specific crackling sound, known as rales, with a stethoscope. This physical evidence tells them the heart is failing to keep up with its pumping duties. It is a more reliable indicator of an immediate crisis than a patient’s general cholesterol level or diet.
The third factor is whether the systolic blood pressure is below 100 mm Hg. Low blood pressure in a chest pain patient is a red flag for shock or heart failure. It’s a simple number that provides a hard floor for the decision-making process. Combined with the other two factors, it creates a narrow window for diagnosis.
When a doctor focuses on these specific variables, they're no longer distracted by the patient's "story." They don't worry about whether the patient looks healthy or has a high-pressure job. The algorithm forces the doctor to ignore thirty other variables that usually lead to confusion. This results in faster, more accurate triage that saves lives and money.
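To make the shape of the rule concrete, here is a minimal sketch of a Goldman-style decision tree in Python. The ward names, flag names, and the exact branch combinations are simplified illustrations drawn from the description above; Goldman's published rule maps ECG findings and risk-factor counts onto levels of care in more detail, so treat this as a sketch rather than the clinical protocol.

```python
from dataclasses import dataclass

@dataclass
class ChestPainPatient:
    ecg_shows_acute_ischemia: bool  # initial ECG reading
    unstable_angina: bool           # factor 1: unstable-angina pain pattern
    fluid_in_lungs: bool            # factor 2: rales heard through the stethoscope
    systolic_bp: int                # factor 3: systolic blood pressure, mm Hg

def triage(patient: ChestPainPatient) -> str:
    """Toy Goldman-style tree: count the three urgent risk factors,
    then combine the count with the ECG result."""
    risk_factors = sum([
        patient.unstable_angina,
        patient.fluid_in_lungs,
        patient.systolic_bp < 100,
    ])
    if patient.ecg_shows_acute_ischemia:
        # A positive ECG escalates care; the risk factors decide how far.
        return "coronary care unit" if risk_factors >= 1 else "intermediate unit"
    # Negative ECG: disposition depends only on the risk-factor count.
    if risk_factors >= 2:
        return "intermediate unit"
    if risk_factors == 1:
        return "short-stay observation"
    return "regular bed"

# Unstable angina plus a systolic pressure of 95 -> two risk factors.
print(triage(ChestPainPatient(False, True, False, 95)))  # intermediate unit
```

The point is not the particular ward names but the structure: four inputs, a handful of branches, and no room for the thirty distracting variables.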
Brendan Reilly took over the Department of Medicine at Cook County in the late 1990s. He inherited a system that was essentially a guesswork factory. To prove the algorithm's worth, he staged a "bake-off" between his doctors' intuition and Goldman’s math. For months, they tracked every diagnosis and every outcome to see which method won.
Goldman’s rule was a whopping 70 percent better at recognizing the patients who weren't actually having a heart attack. More importantly, it was significantly safer for the high-risk patients. The doctors, left to their own devices, missed more serious cases than the simple four-variable model did. This proved that the experts were actually being blinded by their own expertise.
Another study by Stuart Oskamp showed a similar pattern with psychologists. As he gave them more information about a case, their confidence in their diagnosis skyrocketed. However, their actual accuracy barely moved, staying flat at around 30 percent. They were confusing confidence with correctness, which is exactly what happens when we over-research a business problem.
You can apply these same decision-making models to your daily work. The goal is to move from a state of information gluttony to one of analytical frugality. This requires a conscious effort to identify what actually drives your results. Use these three steps to simplify your own complex choices; a short code sketch after the steps shows what the model can look like in practice.
Identify the three variables that historically predict 80 percent of your success in a specific task. If you're hiring, this might be a work sample, a specific reference check, and a culture-fit question. Ignore the rest of the resume fluff that usually distracts you during the first interview.
Create a hard stop on data collection once those three variables are confirmed. We often keep researching because it makes us feel safer, not because it makes us smarter. Force yourself to make a preliminary decision as soon as your core criteria are met to avoid the trap of paralysis by analysis.
Evaluate your "intuition" against the results of this simplified model after six months. Keep a record of the decisions where you followed the model versus where you let your gut take over. You'll likely find that your simplified math is a more consistent partner than your shifting emotional states.
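As a rough illustration of all three steps, here is a minimal sketch in Python for the hiring example. The criteria names, the all-three-must-pass rule, and the decisions.csv file are invented for illustration; what matters is the structure: fixed criteria (step 1), a hard stop once they are known (step 2), and a log that lets you audit gut overrides after six months (step 3).

```python
import csv
from datetime import date

# Step 1: the three hypothetical criteria that historically predict success.
CRITERIA = ["strong_work_sample", "positive_reference", "culture_fit"]

def model_decision(signals: dict) -> str:
    """Step 2: decide as soon as the core criteria are confirmed.
    No further research is consulted once all three signals are in."""
    met = sum(signals[criterion] for criterion in CRITERIA)
    return "hire" if met == len(CRITERIA) else "pass"

def log_decision(path: str, candidate: str, model_says: str, you_did: str) -> None:
    """Step 3: record the model's verdict next to your actual choice
    so the two can be compared at the six-month review."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([date.today(), candidate, model_says, you_did])

signals = {"strong_work_sample": True, "positive_reference": True, "culture_fit": False}
verdict = model_decision(signals)                               # -> "pass"
log_decision("decisions.csv", "candidate_42", verdict, "hire")  # gut overrode the model
```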
Many professionals hate the idea of an algorithm because it feels like it diminishes their value. If a simple four-variable tree can diagnose a heart attack, why does the hospital need a highly paid cardiologist? This ego-driven resistance is a big part of why it took decades for Lee Goldman to see his work used in practice. Doctors felt it was beneath them to follow guidelines rather than rely on their own "brilliance."
Critics often argue that these models are oversimplified or that they miss the nuances of individual cases. They claim that every patient—or every business problem—is unique and requires a custom approach. While there's some truth to this in long-term care, it’s a dangerous distraction during a crisis. The "nuance" is often just a fancy word for the irrelevant information that leads to a wrong turn.
The most important takeaway is that more information doesn't equal better insight. Truly successful people aren't those who can process the most data; they're the ones who know what to throw away. Strip your current project down to its three most vital signs today. Make your next move based on those core truths and let the rest of the noise disappear.
What is the Goldman algorithm? It is a clinical decision rule developed by cardiologist Lee Goldman. It uses a simple decision tree based on an ECG and three urgent risk factors to diagnose potential heart attacks. By focusing on only a few high-impact variables, it provides a more accurate and efficient way to triage patients than traditional, information-heavy medical intuition.
Why is the algorithm better than a doctor's intuition? It's better because it eliminates 'noise.' Doctors often take in too much information, such as age, weight, and family history, which are important for long-term health but irrelevant in a crisis. This extra data creates a false sense of confidence while actually decreasing accuracy. The algorithm forces a focus on the few factors that truly predict an immediate heart attack.
How does this apply to business decisions? In business, we often over-analyze data. To apply this logic, you must identify the 'vital signs' of your project: the three or four metrics that actually determine success. Once you have those, stop searching for more data. For example, in sales, focusing strictly on lead quality and response time is often more effective than analyzing fifty different market variables.
Does the algorithm replace expert judgment? No, it structures it. The algorithm is a tool that frees up an expert's mind to handle the parts of the job that can't be automated, like patient empathy or complex treatment plans. It handles the initial triage so the expert doesn't waste cognitive energy on the 'guesswork' phase of a crisis. It turns a chaotic situation into an organized process.