Moneyball for Morons: How I Built an NBA Prediction Engine in Google Sheets

Before I started building directory websites like SoonerClassifieds or automating garage sales with Python, I had a much simpler, much more dangerous ambition: I wanted to predict the future.

Specifically, I wanted to predict the outcome of Oklahoma City Thunder games.

This project, which took place back in November, was effectively “Season 1” of my collaboration with the Machine. If the later web projects were about architecture, this project was about pure chaos. We weren’t building a house; we were trying to catch lightning in a bottle using nothing but Google Sheets and a free API.

The Origin Story: Why Spreadsheets?

I’m a sales and marketing guy. I live in spreadsheets. I understand rows, columns, and pivot tables. I figured that if I wanted to dip my toes into AI-assisted engineering, I should start on my home turf.

The goal was simple: Build a model that pulls live NBA stats, weighs them against my own arbitrary “homer” logic, and spits out a predicted score.

I thought, “How hard can it be? It’s just math.”

[Insert Screenshot: The early, messy version of the Google Sheet with raw data]

The Loss: The Infinite Loop of Doom

The “Loss” in this project was immediate and humiliating. It wasn’t a syntax error or a broken CSS layout; it was the realization that I didn’t know how to ask for what I wanted.

I started by pasting massive blocks of stats into the chat and saying, “Analyze this.”

The AI, dutifully, would hallucinate a response. It would invent stats, telling me Shai Gilgeous-Alexander averaged 50 points a game because it had treated a single outlier night as his season average.

Then came the formula nightmare. I asked for a formula to calculate “Momentum.” The AI gave me a nested IF statement so long that it literally crashed my browser tab.

The Low Point: I spent six hours one night trying to get an IMPORTJSON script to pull live data. I was copy-pasting code I couldn’t read into a Script Editor I didn’t understand. The debug console was just screaming red text at me. I felt like a caveman banging rocks together hoping to create a nuclear reactor.

I realized that AI is not a crystal ball. If you feed it garbage data and vague instructions (“tell me who wins”), it will lie to you to make you happy.

[Insert Screenshot: A Google Sheets cell showing a #REF! or #ERROR! message]

The Win: The “Logic First” Breakthrough

The turning point—and the “Win” that made this project viable—came when I stopped asking the AI to do the math and started asking it to build the structure.

I stopped saying “Give me a winning formula.” I started saying “Here is the logic I want to test: If a team is on the second night of a back-to-back, reduce their projected score by 5%. Write a formula for that.”
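
In Sheets terms, that request translates into something like this. A minimal sketch, assuming column B holds the raw projected score and column C holds a "Y" flag on the second night of a back-to-back (my real columns were messier):

```
=IF(C2="Y", B2*0.95, B2)
```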

This was the birth of my “Prompt Engineering” strategy, though I didn’t call it that yet. I learned to deconstruct the problem:

  1. Data Ingestion: We got the API connection working by breaking it down into tiny steps. “Step 1: Fetch the URL. Step 2: Parse the JSON.” (See the sketch just after this list.)
  2. The ‘Vibe’ Adjustment: I had the AI create variables for “Home Court Advantage” and “Rest Days.”
  3. The Dashboard: We built a clean front-end in Sheets that hid the ugly math in the back.
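
For the data ingestion piece, the working version ended up as a short Apps Script function in the Script Editor. What follows is a minimal sketch of that pattern, not my exact script: the URL is a placeholder, and the field names (date, home, homeScore, and so on) are hypothetical, since every stats API shapes its JSON differently.

```
// Runs in the Google Sheets Script Editor (Apps Script).
function pullGameStats() {
  // Step 1: Fetch the URL. (Placeholder endpoint, not a real API.)
  var response = UrlFetchApp.fetch('https://example.com/api/nba/games');

  // Step 2: Parse the JSON. (Assumes the API returns an array of games.)
  var games = JSON.parse(response.getContentText());

  // Step 3: Append one row per game to the raw-data tab.
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('RawData');
  games.forEach(function (game) {
    sheet.appendRow([game.date, game.home, game.away, game.homeScore, game.awayScore]);
  });
}
```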

[Insert Screenshot: The final ‘Dashboard’ view showing the Thunder vs. Opponent prediction]

When the model finally predicted a Thunder win within 2 points of the actual final score, I felt like a god. It wasn’t about the gambling (okay, maybe a little); it was about the fact that I had engineered a system that worked.

Deconstructing the Collaboration

Looking back, this project was the training ground for everything that came later with SoonerClassifieds and OKGarageSales.

What I Learned:

  • Google Sheets is a Fragile Database: The AI loves to write complex array formulas (ARRAYFORMULA), but they are brittle. One wrong row deletion and the whole model explodes. I learned to ask for “robust, error-proof formulas” (see the sketch after this list).
  • Context is King: In the early days, I would start a new chat and expect the AI to know my scoring model. It didn’t. I learned to create a “Master Prompt”—a text file explaining the model’s rules—that I would paste at the start of every session (condensed example after this list).
  • Man vs. Machine: The machine was great at the syntax (writing the script), but terrible at the context (knowing that a player sitting out due to injury changes the whole game). I had to be the “General Manager,” making the subjective calls, while the AI was the “Statistician” crunching the numbers.
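
To make “robust, error-proof” concrete: the pattern that finally stuck was wrapping every lookup in IFERROR, so a deleted row degrades into a blank cell instead of cascading #REF! errors through the model. A sketch with illustrative ranges (RawData!A:E standing in for my stats tab):

```
=ARRAYFORMULA(IFERROR(VLOOKUP(A2:A, RawData!A:E, 4, FALSE), ""))
```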
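
And to show what I mean by a Master Prompt, here is a condensed, paraphrased version of the kind of file I kept (not the original, which was much longer):

```
You are helping me maintain an NBA prediction model in Google Sheets.
Model rules:
- Projected scores start from season averages pulled in via the API.
- Second night of a back-to-back: reduce the projected score by 5%.
- Apply adjustments for Home Court Advantage and Rest Days.
Always answer with Google Sheets formulas or Apps Script, never Excel.
```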

The Verdict

The NBA Prediction Model sits in my Google Drive today. It’s not perfect. It still breaks sometimes when the API changes. But it was the “Alpha Test” for my life as a tech-enabled builder.

It taught me that the AI is a force multiplier, not a replacement. It can do the heavy lifting, but I have to tell it where to put the weight.

If I hadn’t struggled through those IMPORTJSON errors in November, I never would have had the confidence to touch a WordPress CSS file in December.

[Insert Screenshot: A graph or chart from the sheet showing Win/Loss predictions vs Actuals]

The Thunder are doing great. And thanks to this jagged, messy, beautiful spreadsheet, I like to think I know exactly why.
