
Private AI Training Cohort

Train your team on the AI tools you already pay for.

A 4-week private cohort for up to 12 operators from across your departments. Three hours a week, on-site or remote, working on your team's real day-to-day tasks. Your team leaves self-sufficient on 80% of the AI work most companies pay consultants to do for them.

Not a public cohort. Private, single-company, tailored.

Most AI training programs we have reviewed run open per-seat cohorts where your operator sits in a Zoom room with strangers from unrelated industries, working through generic case studies that have nothing to do with your workflows. That format is fine for individual upskilling. It is the wrong format for building company capability.

Our training is private by design. Your operators, trained together on your actual workflows with your own data, inside your own IT environment. Confidentiality, tailoring, and a company-level outcome are the whole point. Nothing leaves the room. The workflows built during the cohort use real inputs from your business and stay yours.

The open-cohort providers are optimizing for per-seat revenue. We are optimizing for your team's capability to run without us.

We train inside the AI tools you already own.

Most AI training programs assume you can install whatever you want on your corporate laptop. That assumption is wrong for most mid-market companies. You have real IT, real compliance requirements, and real licensing commitments you have already paid for.

Our training meets your company where it is.

  • Microsoft 365 shop. We train inside the AI capability your Microsoft environment already provides. Nothing installs. Nothing bypasses IT.
  • Google Workspace shop. We train inside the Gemini and Workspace integrations you already have.
  • Permissive IT environment. We train on Claude, ChatGPT, or whichever fits your team's use case.
  • Starting from scratch. We recommend based on your IT constraints, compliance profile, and budget. No sales incentive, no vendor bias.

The curriculum is capability-first, not tool-first. Your operators learn how to get the same outcomes with whatever they already have.

The 4 weeks

  1. Solve one problem

    The honest talk about what AI actually can and can't do, including the four LLM misconceptions that most commonly cost operators real money: arithmetic reliability, knowledge cutoffs, PDF parsing, and model routing. Then the Layer 1 to Layer 4 capability stack. Each participant picks their most painful daily task, the one they spend two or more hours on, and builds a working AI workflow for it with the facilitator in the room. Everyone leaves Session 1 with something that already saves time.

  2. Make it repeatable

    Set up persistent context so the AI stops needing to be re-onboarded every morning. Move from one-off prompts to reusable templates that work every time. This is the Layer 2 jump most teams never make on their own, and it is where output quality goes from “sometimes useful” to “actually reliable.”

  3. Work across your tools, not just one at a time

    Layer 3 is where the AI stops living inside one chat window and starts pulling from your actual systems: your CRM, your spreadsheets, your transaction exports, your feedback data. You'll learn how to use the connected-data features inside the AI tools you already have (Claude Projects, ChatGPT Connectors, Microsoft 365 Copilot) and when the problem actually needs custom engineering instead of configuration.

  4. Become the teacher

    Each participant presents their workflow to the group and commits to running a 1-hour training session for their own department within two weeks. The math: 12 trained operators each teach five to fifteen peers, and your company has 60 to 180 people with real capability within two months.

What your team learns

By the end of the 4 weeks, your operators have real capability across the first three layers of the maturity ladder, plus the judgment to know when a problem needs to jump to Layer 4 and become a Build engagement.

Layer 1: reliable prompting

How to ask for what you actually want, how to catch when the model is making things up, and why copy-pasting prompts from LinkedIn rarely survives contact with your specific workflow.

Layer 2: persistent context

How to encode your company's terminology, quality standards, and examples of good output into a saved set of instructions the AI reads every session. Onboard the AI once, not every morning.

Layer 3: connected data (the concepts)

How to connect the AI to the files, spreadsheets, and records your team already has access to, using the built-in retrieval features your vendor already ships. The custom Layer 3 systems that need real engineering live in the Build service, not here.

Judgment about Layer 4

How to recognize when a workflow has outgrown the chat window and needs to become custom software. Your operators leave knowing which problems they can solve themselves and which ones are worth escalating to a Build engagement.

Why generic AI training doesn't build company capability

The generic AI training programs on the market are built around a fixed curriculum: slides, videos, quizzes, maybe a capstone project that looks good in a portfolio. They are designed to scale across thousands of students who will never meet each other, working on problems that have nothing to do with their jobs. That format is fine for awareness. It does not build operator capability in your company.

Our cohort is the opposite. Up to twelve of your operators, in a conference room at your office (or Zoom), working on the tasks they are actually stuck on today. Session 1 surfaces the real workflows. Sessions 2 through 4 build AI around those workflows. Nothing is hypothetical. Nothing is generic. Every hour spent in the room produces an artifact your team can use the next morning.

The people running the cohort have sat in the CFO and CEO seats and have shipped production AI systems. Not academics, not career trainers, not vendor-certified enablement reps. When your operators ask "does this actually work in the real world," the answer comes with receipts.

Cohort + Build

For companies that know they need both training and one or two custom AI workflows out of the gate, we run a combined 6-week engagement. The first four weeks are the standard cohort. The next two weeks are a scoped Build engagement that ships one production workflow identified during Session 1.

The combined engagement works best when leadership has already identified a specific workflow that is clearly a Build candidate (multi-system orchestration, custom business logic, or autonomous production use) and wants the capability in place by the time the cohort ends. The Build scope is fixed during the intake call, so there is no expansion risk mid-engagement.

See the Build service for what Layer 3-4 work looks like

Pricing

Every cohort is scoped during a 30-minute intake call. Fixed price per engagement. No per-seat math. No surprise billing. The scope covers four on-site or remote sessions, pre-cohort IT discovery, participant selection guidance, a shared prompt and workflow library that stays with your team, and three months of bi-weekly office hours after the cohort ends.

The intake call is free and non-binding. We will tell you honestly whether the cohort is the right fit, what the engagement would cost, and what outcome you should expect. If the right answer is to use off-the-shelf vendor training and skip us entirely, we will tell you that too.

Tell us how your team works today.

The first conversation is a 30-minute intake call. We will walk through the workflows your operators are stuck on, the AI tools you have already licensed, and what a cohort tailored to your company would cover.