What We Learned Building SOC in a Box

Ten weeks ago, we started this development diary with a problem statement: small organisations have been told for too long that they don't qualify for a real Security Operations Centre, and we believed that was wrong. We said we were going to document the process of building SOC in a Box — and we've done that, covering the hardware, the detection engine, EmilyAI, DecoyPulse, the named analyst model, the Confidence Score, and the five-day deployment process.

This final post is different. It's not a technical overview. It's an honest account of what we got right, what we got wrong, and what we'd do differently if we were starting again. We think transparency about the development process is more useful to the people considering deploying this product than a polished retrospective that presents everything as having gone to plan.

What We Got Right

The pre-configuration model

The decision to do all configuration work before the appliance leaves our hands was the right one, and it was right in more ways than we anticipated. We expected it to reduce deployment time. What we didn't fully anticipate was how much it would reduce client anxiety. When a managing partner plugs in a box and calls us to say the green light is on, and we confirm that yes, we can see their network and 24/7 monitoring is now live, the response is consistently one of genuine relief. There's no liminal period of partial configuration, no "it's mostly working but we're still tuning" — it works, fully, immediately.

The named analyst model

We debated this internally. The efficiency argument for a pooled analyst queue is strong — it maximises analyst utilisation, reduces scheduling complexity, and is what almost every competitor does. We rejected it, and we're glad we did. The feedback from clients consistently identifies the named analyst as the most valued aspect of the service. Not the technology — the person.

The Confidence Score

We agonised over whether a single number was too reductive. It isn't. Multiple clients have told us that the Confidence Score is the first security metric they've had that they can actually use in a board meeting without having to translate it. One client used it to negotiate a reduction in their cyber liability insurance premium. That's the proof of concept we were looking for.

What We Got Wrong

The initial scoping call length

We originally designed the scoping call as a 45-minute session. Clients consistently weren't getting value from the final 15 minutes — we'd covered everything necessary and were either repeating ourselves or asking questions that could be answered more efficiently from the first 24 hours of telemetry. We cut it to 30 minutes and the result is a sharper, more focused conversation with better outcomes on both sides.

Agent deployment guidance

Our original deployment documentation assumed that clients would have central management tooling — Group Policy, Intune, JAMF — that would make rolling out endpoint agents straightforward. A meaningful proportion of our target clients don't. Deploying agents to 50 individual machines manually, or via a non-IT person following instructions written for a sysadmin, created friction that delayed full coverage in some early deployments.

We've addressed this with improved documentation, a simplified agent installer that non-technical staff can run, and a revised Concierge offering that explicitly covers endpoint agent deployment as a primary use case. But it was a gap in our initial assumptions about the technical capability of the people doing the deployment.

OT infrastructure assumptions

We advertise SOC in a Box as supporting both IT and OT infrastructure. That's true, but our initial deployment documentation was weighted heavily towards IT. Clients with operational technology — manufacturing equipment, building management systems, laboratory instrumentation — needed more specific guidance on network segmentation and passive monitoring configuration for OT environments. We've since updated the documentation significantly, but it should have been there from day one.

What Surprised Us

The sector that has responded most strongly to SOC in a Box is not the one we expected. We built the initial marketing around professional services — law firms, accountancy practices, IFAs — because these are the organisations most visibly affected by regulatory pressure. The sector that has moved fastest, and where word-of-mouth referrals have been strongest, is multi-academy trusts.

The economics work particularly well for MATs: a single box per school, standardised detection rules across the estate, a single named analyst who knows all the sites, and a Confidence Score per school plus an aggregate view for the trust. The safeguarding and data protection obligations, combined with the cost pressure that most MATs operate under, make the cost-displacement framing — replacing multiple point tools with a single service at a lower total cost — immediately compelling.

The "SOC as a Saving" Insight

The reframing that has had the most impact in sales conversations wasn't one we planned from the outset — it emerged from feedback on early client conversations. Presenting SOC in a Box as a cost is true but unhelpful. Showing a prospective client that their current piecemeal security tool spend — antivirus, email security, vulnerability scanner, password manager, security awareness training — already totals more than the SOC in a Box subscription, and that SOC in a Box replaces all of them, changes the conversation entirely.

The average SMB with 40 endpoints currently spends approximately £16,600 per year on tools that, collectively, don't provide the continuous monitoring and analyst response that SOC in a Box does. Our Medium plan costs £7,200 per year and replaces or surpasses all of them. That's a net saving of approximately £9,400 per year — and you're more secure afterwards than you were before.
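The arithmetic behind that framing can be sketched in a few lines. This is an illustrative calculation only: the two totals come from the figures quoted above, but any per-client numbers would vary, and the variable names are ours, not part of the product.

```python
# Cost-displacement sketch using the approximate figures quoted in the post.
# Only the two totals are sourced from the text; everything else is illustrative.
current_tool_spend = 16_600   # approx. annual spend on piecemeal tools (40-endpoint SMB)
medium_plan_cost = 7_200      # SOC in a Box Medium plan, annual

net_saving = current_tool_spend - medium_plan_cost
saving_pct = net_saving / current_tool_spend * 100

print(f"Net annual saving: £{net_saving:,}")        # £9,400
print(f"Reduction in spend: {saving_pct:.0f}%")     # roughly 57%
```

The point of leading with the subtraction rather than the subscription price is that the comparison baseline is the client's own existing spend, not an abstract market rate.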

We now lead every sales conversation with this framing. It's not a trick. It's an accurate representation of what the product costs relative to what it replaces. But it took feedback from real client interactions to find it.

Where the Product Goes Next

SOC in a Box is live. Clients are deployed, analysts are watching, incidents are being detected and responded to. The product works. What we're focused on now is the iteration that comes from production experience: more sectors, more deployment configurations, deeper integrations, and continuing to improve the Confidence Score model based on the data we're now accumulating across a growing client base.

If you've followed this series and have questions about any of the decisions we made, or want to talk about whether SOC in a Box is right for your organisation, we'd genuinely like to hear from you. Not a sales call — a conversation.

The Product Is Live. The Conversation Is Open.

SOC in a Box is available now, across all three plan tiers, with a 30-day rolling contract and no setup fee. If this development series has raised questions, bring them to your scoping call.

Book your scoping call
