
Team Training

No slides. Laptops out. Your team builds real AI workflows using your actual data, and everyone leaves with something working on their machine that they’ll use tomorrow morning, not just notes from a presentation they’ll never open again.

Book a free call to discuss

Why most AI training doesn’t work

I’ve sat through enough “AI in the workplace” training sessions to know the pattern: someone talks through 40 slides about how important AI is, shows a few ChatGPT demos that look impressive but have nothing to do with your actual work, everyone nods along, and then they go back to their desks and carry on exactly as before. Two weeks later, nobody’s using any of it, and the budget gets written off as “awareness raising”, which is corporate-speak for “it didn’t work but we don’t want to say that”.

The problem isn’t that people don’t understand AI. It’s that the training doesn’t connect to anything they actually do on a Monday morning. So I don’t teach “AI” in the abstract. I teach workflows, using your team’s real data, on their own machines, solving problems they recognise from last week’s to-do list.

Half-day workshop (4 hours)

This is the one I’d recommend if you’re not sure where to start. Four hours, laptops open, and by the end everyone has a working setup they can use the next day.

  • AI fundamentals for your industry, not a generic overview but a focused look at what AI can and can’t do for the specific kind of work your team does. I’ll be blunt about the limits, because there’s nothing worse than someone trying to use AI for a task it’s bad at and concluding the whole technology is useless based on that one experience.
  • Claude Desktop with 3 MCP servers configured on everyone’s machine, connecting to tools they already use. Brave Search for web research, file system access for working with local documents, and one MCP server relevant to your specific workflow: maybe a database connector, maybe Google Drive, maybe something for your industry. Everyone walks out with a properly configured AI assistant, not just a chat window. There’s an example of what that configuration looks like after this list.
  • One workflow built from scratch where we pick a real task that someone on your team does regularly and build an AI workflow that handles it. Not a demo. Not a toy example. A workflow using your actual data that someone can run tomorrow morning and save an hour of their time, and that acts as a template they can adapt for other tasks once they understand the pattern.
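
For the curious, here’s roughly what that setup looks like by the end of the session. Claude Desktop reads its MCP servers from a file called claude_desktop_config.json, and one common way to wire up the first two is via the reference server packages published on npm. Treat this as an illustrative sketch: the API key and the documents path are placeholders, and the third server depends entirely on your workflow.

    {
      "mcpServers": {
        "brave-search": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-brave-search"],
          "env": { "BRAVE_API_KEY": "YOUR_BRAVE_API_KEY" }
        },
        "filesystem": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/your/documents"]
        }
      }
    }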

Full-day workshop (8 hours)

Everything from the half-day, plus the more advanced work that turns casual AI users into people who can build their own workflows without me.

  • Local LLM setup for teams handling sensitive data (legal, financial, medical, anything where you can’t send client information to the cloud). I install and configure a local language model via LM Studio or Ollama that runs entirely on your hardware. It’s slower than Claude, but the data never leaves your building, and for a lot of use cases that trade-off is worth it. There’s a short sketch of what this looks like in practice after this list.
  • Custom automation design sprint where we spend two hours mapping your team’s workflows and designing automations for the three highest-impact opportunities. This is the same process I use in my workflow audit, compressed into a focused sprint that gives you a clear picture of what to build first and why.
  • Everyone leaves with 3 or more working workflows, and not the same three for everyone. Each person builds workflows relevant to their own role, so the marketing person builds a content research pipeline, the ops person builds a data processing workflow, the finance person builds a reporting automation. Different tools, same principles, and everyone understands the pattern well enough to build more on their own.
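
To give a flavour of what “runs entirely on your hardware” means in practice: once Ollama is installed and a model has been pulled, any script on the machine can talk to it over a local HTTP endpoint. A minimal sketch, assuming Ollama’s default port, with the model name a placeholder for whatever suits your hardware:

    # Minimal sketch: ask a local Ollama model a question over its HTTP API.
    # Assumes Ollama is running locally with a model already pulled
    # ("llama3" here is a placeholder). Nothing leaves your machine.
    import json
    import urllib.request

    def ask_local_model(prompt: str, model: str = "llama3") -> str:
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # wait for one complete response
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",  # Ollama's default endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(ask_local_model("Summarise this clause in plain English: ..."))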

Ongoing programme

For teams that want to keep building capability after the initial workshop. This is the format that produces the most lasting change, because the first workshop gives people the foundations, but it’s the follow-up sessions where things really start to compound. People come back with problems they’ve tried to solve, questions they’ve hit, ideas they want to test, and the conversations get more interesting every month.

  • Monthly 2-hour deep-dive session where we tackle a new use case each month, review what’s been built since the last session, and troubleshoot anything that’s not working. These sessions tend to get more valuable over time as your team gets more ambitious with what they’re building and starts pushing into territory where the nuances actually matter.
  • Slack or Teams support between sessions for quick questions, workflow reviews, and “is this the right approach?” checks. I typically respond within a few hours during working days, and most questions are the kind of thing that takes me two minutes to answer but would cost someone an hour of trial and error to figure out on their own.
  • Workflow reviews where you send me what you’ve built and I tell you what’s good, what’s fragile, and what I’d do differently. Honest feedback, not cheerleading.

What makes this different

I don’t teach “prompt engineering.” I teach workflows. The distinction matters. Prompt engineering is about getting better outputs from a chat interface, which is useful but limited. Workflow design is about connecting AI to your actual systems so it can do real work autonomously: read documents, query databases, produce outputs, and handle errors, all without someone sitting there typing prompts. One is a party trick. The other changes how your team operates.
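
To make the distinction concrete, here’s the shape of a workflow rather than a prompt. This is an illustrative sketch, not a workshop deliverable: the folder names are placeholders, and summarise() stands in for whichever model call your setup uses, cloud or local (the Ollama sketch above would slot straight in).

    # Sketch of a document-processing workflow: read files, call a model,
    # write outputs, and keep going when an individual document fails.
    # summarise() is a stand-in for your actual model call.
    from pathlib import Path

    def summarise(text: str) -> str:
        raise NotImplementedError("plug in your model call here")

    def run(in_dir: str = "inbox", out_dir: str = "summaries") -> None:
        out = Path(out_dir)
        out.mkdir(exist_ok=True)
        for doc in sorted(Path(in_dir).glob("*.txt")):
            try:
                summary = summarise(doc.read_text(encoding="utf-8"))
            except Exception as err:  # log the failure, don't halt the batch
                print(f"skipped {doc.name}: {err}")
                continue
            (out / f"{doc.stem}-summary.txt").write_text(summary, encoding="utf-8")
            print(f"done: {doc.name}")

    run()

The point isn’t the twenty lines of Python. It’s that the loop, the file handling, and the error path exist at all, which is the difference between a workflow and a chat window.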

And I use the tools I’m teaching. The content pipeline behind houtini.com, the 16 MCP servers I’ve published on npm, the automation workflows I run daily on a multi-GPU Threadripper workstation: these aren’t things I read about in a book. I built them, I maintain them, and I know where they break and why. So when someone in your workshop asks “but what happens when the AI gets it wrong?” I have a real answer based on the times it happened to me, not a theoretical one from a slide deck.

The test I apply to every workshop: does everyone leave with something working on their laptop that they’ll actually use tomorrow? If the answer is no, the training failed. Slides and theory are worthless if nothing changes when people get back to their desks.

Common questions

Will my team see this as a threat to their jobs?

In my experience, the opposite happens. Once people build their first workflow and see it handle the tedious part of their job in seconds, they start thinking of more things to automate. The framing matters, though: I’m careful to position AI as a tool that handles the boring, repetitive work so people can spend their time on the parts of their job that actually require thinking and judgment.

What if our data is too sensitive for cloud AI?

That’s what the local LLM setup in the full-day workshop covers. I install Ollama or LM Studio on your hardware, configure it with an appropriate model, and your team learns to use AI without any data leaving your network. It’s slower than Claude or Gemini, but for sensitive data the trade-off is worth it, and I’ll be honest about which tasks work well locally and which ones really need a cloud model.

How do we know this will actually stick?

That’s exactly why the ongoing programme exists. The initial workshop gives people working tools and the confidence to use them, but the monthly sessions are where habits actually form. If budget only allows for a one-off workshop, the fact that everyone leaves with working workflows on their own machines (not a shared demo environment) makes it much more likely they’ll keep using what they built.

Want to see if this fits your team?

Book a free 30-minute call. Tell me how big your team is, what they do, and what you’ve tried so far. I’ll recommend the right format and be honest if I think training isn’t what you need.

Book a call
