What Actually Happens at a Show & Tell (And Why People Keep Coming Back)
It's 11:47 AM on a Monday. I'm standing in the event space at Bamboo Grand Rapids, connecting my laptop to their projector, trying to remember whether I saved the file I was going to demo or left it sitting in my drafts folder on a different machine. Chairs are filling up. Someone is asking where the coffee is. The Wi-Fi is cooperating, which is never guaranteed.
This is the glamorous life of running a monthly AI event.
I've been doing Show & Tell at Crash That Course since late 2025, and every session follows roughly the same arc -- except the content is completely different each time. What I want to do in this post is walk you through what actually happens during one of these sessions, because I think the format itself is the reason people keep showing up. And it's probably not what you'd expect from an "AI event."
Noon: The Awkward Start
We start at noon. This is intentional -- it's a lunch-hour event. People walk in from their offices downtown, grab a seat, and settle in with whatever they brought to eat. There's no formal welcome, no sponsor acknowledgments, no "before we begin, let me tell you about our premium tier." I just start talking.
Usually I open with something like: "Okay, so this month I've been obsessed with [tool/technique]. Let me show you what happened when I tried to use it for real work."
The key word is "real." This isn't a curated presentation where every demo goes perfectly. I'm showing actual work -- the campaigns I'm running, the automations I'm building, the tools I'm testing. Sometimes they work beautifully. Sometimes they don't. Both are useful to watch.
12:15: The First Demo (Where People Lean Forward)
The first 15 minutes are usually a single, focused demo. At our March session -- Episode 4: Wading into Claude Code -- I opened by showing Anthropic's Claude Code running in my terminal. For most people in the room, this was the first time they'd seen an AI write, test, and debug code in real time.
I asked Claude to build a simple feature for a website. The audience watched as it planned the approach, wrote the code, ran into an error, diagnosed the error, and fixed it -- all without me typing a single line. The room got visibly quieter during this demo. You could feel people recalibrating what they thought AI could do.
Then I deliberately asked it to do something harder -- something I knew would probably trip it up. Because watching AI fail is just as educational as watching it succeed. When Claude wrote code that didn't work on the first try and then self-corrected, someone in the back row audibly said "wait, it just... fixed itself?"
That's the kind of moment that doesn't happen in a pre-recorded webinar. That's a human being, in a room, watching something they didn't think was possible, and reacting in real time. Those moments are why I do this live instead of just posting videos.
12:30: The Second Demo (Where It Gets Practical)
After the "holy crap" moment, I usually pivot to something more practical. Something that applies to the majority of people in the room -- not just the tech-curious ones.
In the ChatGPT Marketing Sprint episode (Episode 2), the second half was all about content generation. I showed how I take a single interview transcript and turn it into a month's worth of LinkedIn posts, email copy, and social content -- live, with the audience watching every prompt.
I didn't just show the output. I showed the bad output first. The generic, obvious, "clearly written by AI" draft that comes from a lazy prompt. Then I showed the same task with a better prompt -- one that includes brand voice examples, specific constraints, and context about the audience. The difference is night and day, and seeing them side by side is worth more than reading 50 articles about prompt engineering.
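To make that lazy-vs-structured contrast concrete, here's a rough sketch in Python. The brand-voice examples, constraints, and helper function are all invented for illustration; they are not the actual prompts from the session:

```python
# Illustrative only: these prompts and voice examples are hypothetical,
# not the ones demoed at Show & Tell.

lazy_prompt = "Write 10 LinkedIn posts from this interview transcript."

structured_prompt = """\
You write LinkedIn posts for a B2B marketing consultancy.

Voice examples (match this tone):
- "We killed our best-performing campaign last week. Here's why."
- "Nobody reads your case study. They read the first sentence."

Constraints:
- 10 posts, each under 150 words
- No hashtags, no emoji, no "I'm excited to announce"
- Every post must lead with one concrete detail from the transcript

Audience: mid-career marketers who are skeptical of AI hype.

Transcript:
{transcript}
"""

def build_user_message(prompt_template: str, transcript: str) -> dict:
    """Fill in the transcript and wrap the prompt as a chat message."""
    return {"role": "user",
            "content": prompt_template.format(transcript=transcript)}

msg = build_user_message(structured_prompt, "...interview text here...")
```

The structured version isn't longer for its own sake: each extra section (voice, constraints, audience) removes one way the model can default to generic output, which is exactly the side-by-side difference the live demo showed.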
In the Marketing Automations episode, the practical demo was building an entire lead-nurturing workflow in Zapier from scratch. I started with a blank screen and had a working automation by the time we were done. A marketing manager in the audience came up afterward and said she'd been putting off building something almost identical for three months because it seemed too complicated. Watching it get built in 20 minutes changed her mind.
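For a sense of what that kind of build looks like structurally, here's a hypothetical sketch of a similar lead-nurture flow expressed as data. The step names, events, and timings are invented for illustration, not the actual Zapier automation from the session:

```python
# Hypothetical lead-nurture workflow, sketched as trigger/action/delay
# steps the way automation tools like Zapier model them.

WORKFLOW = [
    {"step": "trigger", "event": "new_form_submission"},
    {"step": "action",  "do": "add_contact_to_crm"},
    {"step": "delay",   "days": 2},
    {"step": "action",  "do": "send_welcome_email"},
    {"step": "delay",   "days": 5},
    {"step": "filter",  "keep_if": "opened_welcome_email"},
    {"step": "action",  "do": "send_case_study_email"},
]

def describe(workflow: list[dict]) -> str:
    """Render the workflow as a human-readable runbook."""
    lines = []
    for s in workflow:
        if s["step"] == "trigger":
            lines.append(f"When: {s['event']}")
        elif s["step"] == "delay":
            lines.append(f"Wait {s['days']} days")
        elif s["step"] == "filter":
            lines.append(f"Continue only if: {s['keep_if']}")
        else:
            lines.append(f"Do: {s['do']}")
    return "\n".join(lines)

print(describe(WORKFLOW))
```

The point of the live demo was that the whole thing is just this: one trigger, a handful of actions, some waits and filters. Written out as a list, it stops looking like a three-month project.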
12:50: Q&A (Where the Real Magic Happens)
I always leave 30 to 40 minutes for questions. This is the part that most event formats get wrong -- they treat Q&A as an afterthought, five minutes at the end when everyone's already mentally checked out.
At Show & Tell, the Q&A is the main event. It's the part where a real person with a real problem asks "can AI help me with this specific thing?" and I either show them how, right there on screen, or I'm honest and say "not yet, but here's what's getting close."
Some of my favorite questions from recent sessions:
- "Can I use this to write my nonprofit's annual report?" -- Yes. I showed her how to feed Claude the last three years of data and have it generate a first draft structured around her board's priorities. She emailed me the next week to say it saved her 15 hours.
- "My boss thinks AI is going to replace our whole marketing department. Is he right?" -- I pulled up a task I'd tried to fully automate with AI, showed where it worked, and showed where it fell apart spectacularly. The answer is "AI replaces tasks, not people -- but only if the people learn to use it."
- "I'm 62 and I feel like I'm too late to learn this stuff." -- This one gets asked more than you'd think, usually not this directly. I showed her the Anthropic Academy free courses, walked through the interface, and had her type a prompt into Claude right there. She laughed when it worked. That laugh is worth more than any analytics dashboard.
- "How do I know which AI tool to use? There are so many." -- This is the most common question. My standard answer: start with one. I usually recommend Claude or ChatGPT, depending on the use case. Master that before adding anything else. Most people who say they're "overwhelmed by options" are actually just procrastinating because they haven't started.
1:20: The Hallway Conversations
Officially, Show & Tell ends at 1:30. In practice, about half the room sticks around for another 15-20 minutes. This is where the best conversations happen.
People cluster in small groups. The marketing manager and the freelance designer realize they have complementary skills. The nonprofit director meets someone who's already solved the exact problem she's facing. The college student gets a LinkedIn connection from a VP who's impressed that a sophomore already knows what MCP is.
This informal networking wasn't something I planned. It emerged naturally from putting diverse people in the same room and giving them a shared experience to talk about. It's the same dynamic you see at great conferences, except it happens every month, it's free, and it's at a brewery in Grand Rapids.
Why People Come Back
We have regulars. People who've been to three, four, five sessions. They aren't coming back because they haven't "gotten it" yet. They're coming back because the content is different every time and the tools change every month.
In January, I demoed AI writing assistants. In February, it was fractional team tools. In March, AI coding. In April, it'll be something completely different because I'll have spent the previous month exploring a new tool that I think the audience needs to see.
The format also creates a low-stakes way to stay current. Instead of dedicating hours to reading about AI developments, people spend 90 minutes watching someone filter through all of it and show them only the parts that matter. It's AI education through curation -- I do the experimenting so you don't have to.
But I think the real reason people come back is simpler than all of that: it's nice to be in a room with other people who are figuring this out too. AI can feel isolating if you're the only person at your company trying to learn it. Show & Tell is a room full of people who are all in the same boat, at different stages of the same journey, and there's real comfort in that.
Mark Dalgarno wrote about the show-and-tell format on Medium, noting that it "helps stakeholders bond with teams by sharing success" and "improves collaboration." Michael Raspuzzi makes a sharper distinction: traditional demo days are performative, while show-and-tells are participatory. That's exactly what we're going for. Nobody at Show & Tell is performing. I'm sharing work in progress, warts and all, and the audience is invited to poke at it.
What Surprised Me
When I started Show & Tell, I thought the audience would be mostly tech-savvy people who wanted to go deeper. I was wrong.
The audience is overwhelmingly people who are NOT in tech. Marketing professionals, small business owners, nonprofit leaders, educators, HR managers, realtors, freelancers. People whose jobs are being reshaped by AI and who want to understand what that means for them, specifically.
The other surprise: the live failure moments are the most popular parts. When I demo something and it doesn't work, the room lights up. Not because they enjoy watching me suffer (well, maybe a little), but because seeing AI fail is deeply reassuring. It means the tool has limits. It means their jobs aren't obsolete. It means there's still a role for human judgment. People need to see both sides -- the magic and the mess -- to build an accurate mental model of what AI actually is.
Hootsuite's engineering team, writing on Medium about their internal lightning talks, documented this same phenomenon: short-format live presentations "break down knowledge barriers and provide cross-pollination opportunities." They also found something subtler: the presentations "surface talent, experience, and insight hidden within organizations." The same thing happens at Show & Tell. People discover capabilities in themselves and others that they didn't know existed. The marketing manager who asks a question about automating her email campaigns and gets a live demo? She just learned something no Coursera course would have taught her -- because no course could have anticipated her exact question.
How to Experience It
If you've gotten this far and you're curious, you have a few options:
- Come to the next one in person. Check the events page for dates and locations. Grand Rapids runs monthly, and we're expanding to Ann Arbor and Detroit in April. It's free. Just RSVP and show up.
- Watch a past episode. All recordings are free on the Show & Tell Library. Start with Episode 4: Wading into Claude Code if you want the most recent one, or Episode 2: ChatGPT Marketing Sprint if you want the most broadly applicable one.
- Subscribe for notifications. Register on the library page and you'll get an email when new episodes drop. That's it. No sales funnel. No 47-email drip campaign. Just a monthly heads-up.
If you want proof that this format scales, look at Hardware Meetup -- a demo-based community that now spans 40+ cities with 40,000 members. Their recent San Francisco event drew over 1,000 RSVPs. The model works because humans learn best by watching other humans do real things, then asking their own questions. That's been true since the first person showed someone else how to make a fire. Technology changes. The learning mechanism doesn't.
I started Show & Tell because I thought there should be a place in Michigan where working professionals can see AI tools demoed on real work, for free, without it being a sales pitch. Four episodes and a growing community later, I think the format works. The audience agrees -- they keep coming back.
The next one is April 23 at Bamboo Grand Rapids. The seat is free. The coffee is not (blame Bamboo, not me). I'll be the one at the front, trying to get the projector to cooperate before noon.
See it for yourself.
Every month, a new episode. Every episode, a new set of tools. Always free, always live, always unscripted.
Timothy Haines
Founder of Unicorn Flames and Crash That Course. Runs free monthly AI workshops in Grand Rapids, expanding to Detroit and Ann Arbor. Frequently loses arguments with projectors.