Due Diligence

February 23, 2027 · 23 min read

Part 1 of The Big Table

GreenBox is a national produce-box company with 50,000 subscribers across eight Australian cities. They have eighty people, a platform rebuilt with CI/CD and an SRE team, a strategic partnership with Hartland Group under discussion, and a founder who is about to discover that every decision she’s ever made – good, bad, and undocumented – will be examined by people who get paid to find problems.

The call comes on a Tuesday morning in late January. Maya is in the Perth office, reviewing next week’s delivery predictions with Priya on a video call. Her phone vibrates. She glances at it: a number she doesn’t recognise, Sydney area code. She lets it ring.

It rings again. Same number.

She picks up. “Maya Chen.”

“Maya. This is Richard Ngata. I’m the head of corporate development at Hartland Group. I believe you’ve been expecting our call.”

She has. The Cerulean Ventures partner planted the seed eight months ago: Have you considered a strategic partnership with Hartland Group? The Hartland board approached through Diane in December, after the national expansion made GreenBox impossible to ignore. Fifty thousand subscribers. Eight cities. A farm network that Hartland’s acquisition of Freshly couldn’t replicate.

“We’d like to explore a strategic partnership,” Richard says. “We think there’s something here that benefits both companies. But before we get into the details, our team would need to conduct due diligence.”

Maya looks at the photo on her desk. Her parents’ farm, before the subdivision. Her father smiling with tired eyes.

“What does that involve?”

“Everything,” Richard says. “We look at everything.”

What due diligence actually means

Due diligence is the process by which a potential partner, acquirer, or investor verifies that a company is what it claims to be. It’s forensic. It covers every domain of the business: commercial, technical, financial, legal, and organisational. The goal is to identify risks, validate strengths, and determine fair value.

For a company like GreenBox, due diligence means opening every door, pulling out every drawer, and answering questions about decisions made years ago by people who may no longer remember why they made them.

Diane briefs Maya over coffee in the Fremantle office the next morning. She’s been through this before – on both sides. When she sold Sunridge to the private equity firm, the DD process took three months and surfaced things she’d forgotten existed. When she advised on two subsequent acquisitions, she was the one asking the uncomfortable questions.

“Think of it as a health check,” Diane says. “Except the doctor is looking for reasons not to treat you.”

She draws a simple framework on the whiteboard. Five domains.

Commercial DD. Revenue quality, customer concentration, contract terms, pipeline health, pricing sustainability. The question: is the revenue real, recurring, and defensible?

Technical DD. Codebase quality, architecture, deployment practices, test coverage, technical debt, security posture. The question: does the technology work, and will it scale?

Financial DD. Revenue recognition, cost structure, unit economics, working capital, tax compliance. The question: are the numbers what they appear to be?

Legal DD. Contracts, IP ownership, employment agreements, regulatory compliance, pending disputes. The question: are there hidden liabilities?

Organisational DD. Management depth, key person risk, culture, succession plans, employee retention. The question: will the company function if the deal changes things?

“In my experience,” Diane says, “the commercial and technical DD is where companies like GreenBox either shine or shatter. You’ve built something real. The question is whether you’ve documented it well enough for someone else to see that.”

Preparing the data room

Before the Hartland team arrives, GreenBox needs to assemble a data room – a structured collection of every document, metric, contract, and record that the DD team will want to examine.

Charlotte takes the lead. She’s done this before. The meal kit company she coached went through DD when it was being acquired – six months before it failed. The DD process surfaced problems that the founders hadn’t wanted to see. Charlotte swore she’d never let a company she worked with be unprepared for that moment.

“Data rooms are about telling a story,” Charlotte tells the leadership team. “Not spin. A documented, evidence-backed story of how the company got here and why it works. Every gap in that story is a risk that the DD team will find and price into their offer.”

She creates a structure.

Folder 1: Corporate. Company formation documents, shareholder agreements, board minutes, governance policies. Patricia reviews these. The formal board governance they put in place during the board room restructuring pays off immediately – every board meeting has minutes, every major decision has a record, every policy has a date and an author.

Folder 2: Financial. Three years of financial statements, monthly management accounts, unit economics by city, revenue by customer segment. The CFO they hired during the national expansion produces these in two days. Clean, audited, reconciled. This is the payoff of the valuation work: GreenBox already knows its numbers, because it’s been tracking them for investors since the Series B.

Folder 3: Commercial. Customer contracts, subscriber terms, farm partner agreements, sales pipeline, churn analysis, LTV calculations. Sam coordinates this. Most of it is clean. But there’s a gap that makes her stomach drop.

“Maya. How many of our farm partnerships have formal contracts?”

Maya closes her eyes. She knows the answer. Of fifty-two farm partners, thirty-eight have the standard partnership agreement that Sam drafted after the key person insurance work. Eight have variations – shorter notice periods, different volume commitments, special pricing. And six – including Dave Morrison – have the original handshake-style agreements. Thin documents, handwritten terms, no standard clauses.

Dave’s partnership agreement is the one he wrote himself on lined paper from the kitchen drawer. It covers minimum volumes, priority customer status, and ninety days’ notice. And one clause in Dave’s careful block letters: If GreenBox ever stops buying local, the agreement ends.

It’s beautiful in its simplicity. It’s also, from a due diligence perspective, uncomfortably informal.

Folder 4: Technology. Architecture documentation, ADRs, deployment practices, test coverage reports, security assessments, incident history, DORA metrics. Tom assembles this. And this is where GreenBox’s history of disciplined technical practice starts to pay dividends.

Folder 5: People. Organisation chart, employment contracts, key person insurance policies, retention metrics, succession plans, culture documentation.

Charlotte reviews the completed data room on a Friday afternoon. She’s quiet for a long time.

“This is better than anything I’ve seen from a company this size,” she says. “It’s not perfect. But it’s defensible.”

The team arrives

Hartland Group sends four people. A commercial analyst, a technology assessor, a financial auditor, and a legal reviewer. They arrive in Perth on a Monday morning and set up in the GreenBox meeting room – the same room where Lee facilitated that first retro, where the Event Storm wall went up, where Charlotte drew the Business Model Canvas.

Richard Ngata is not with them. This is the working team. They’re polite, professional, and methodical. They have laptops, notepads, and a shared document that grows longer every day.

The commercial analyst, a woman named Petra, starts with revenue quality.

“Walk me through your subscriber base. How many are on annual contracts versus month-to-month?”

Sam answers. “Eighty-two percent are month-to-month. Fifteen percent are on annual plans with a discount. Three percent are corporate gift subscriptions – quarterly.”

Petra writes this down. “Month-to-month means any subscriber can leave at any time. That’s a revenue concentration risk. What’s your churn rate?”

“3.2% monthly, blended across all cities. Perth is 2.8%. The newer cities are higher – Adelaide is 4.1%, Hobart is 5.3%.”

“What drives the difference?”

Sam looks at Maya. Maya looks at the wall behind Petra, where the Wardley Map used to be pinned. It’s been replaced by a national operations dashboard, but Maya can still see the ghost of it.

“Maturity of farm relationships,” Maya says. “Perth has Dave Morrison and a network of farms we’ve worked with for three years. They know our quality standards, our substitution rules, our delivery timing. The newer cities have partnerships measured in months, not years. The produce quality is less consistent. The substitution accuracy is lower. Customers feel it.”

Petra nods. “So the churn difference is correlated with supply chain maturity, not market differences?”

“Partly. There are market differences too – Hobart is a smaller market with different demographics. But the primary driver is the farm relationship maturity.”

Petra writes for a long time. Then she asks: “Can you show me the farm partner contracts?”

What saves them

The technology assessment takes three days. The assessor, a man named James who spent a decade at ThoughtWorks, is the most thorough technical reviewer Tom has ever encountered. He reads code. He reads ADRs. He reads commit histories. He runs the test suite. He deploys to staging and watches what happens.

On Tuesday afternoon, he asks Tom to walk him through the architecture.

Tom opens the system diagram. The bounded contexts from the DDD work are clearly drawn: supply matching, customer profiles, delivery logistics, payments, farm portal, substitution engine. Each context has defined interfaces. The services communicate through events and explicit contracts.

“Why these boundaries?” James asks.

Tom starts to explain. Then he stops. “Actually, let me show you something.” He opens the ADR repository. Forty-seven architecture decision records, spanning three years. Each one follows the same format: context, decision, consequences, status.

“ADR-007 documents why we separated supply matching from delivery logistics. ADR-012 explains the event-driven architecture for farm availability updates. ADR-023 covers the multi-region data strategy for the national expansion.”

James reads them. Not skimming – reading. He spends forty minutes on the ADR repository, clicking through cross-references, checking dates against the commit history.

“This is unusual,” he says.

“Unusual bad?”

“Unusual good. Most companies I assess can tell me what their architecture is. Very few can tell me why. The decision records mean I can trace the reasoning behind every major technical choice. That’s worth more than the architecture itself, because it tells me the team thinks about trade-offs, not just solutions.”

He moves to deployment practices. The SRE team – hired during the platform rebuild – has built CI/CD pipelines, automated testing, canary deployments, and observability dashboards. The DORA metrics are tracked weekly: deployment frequency (daily), lead time for changes (under four hours), change failure rate (2.1%), mean time to recovery (eighteen minutes).

“These numbers are strong,” James says. “Particularly for a company this size. Who drove this?”

“Tom laid the groundwork,” Priya says from the Melbourne video link. “But the SRE team made it real. We hired three people specifically for platform reliability during the national expansion.”

James nods. He pulls up the test coverage report. “Eighty-one percent line coverage. That’s respectable. Where are the gaps?”

Tom knows the answer and doesn’t flinch from it. “The subscription management system. It was the first thing I built – week one, before we understood the domain. It’s been rewritten three times, but each rewrite was iterative, not clean. There are code paths in there that date back to the original prototype. Test coverage on those paths is around 40%.”

“Is it a risk?”

“It’s tech debt we know about. It’s in the backlog. We’ve ring-fenced it with integration tests so it doesn’t break silently. But if you’re asking whether a rewrite would improve things – yes. We’ve been prioritising customer-facing features over internal cleanup.”

James writes this down without judgement. He’s not looking for perfection. He’s looking for awareness. A team that knows where its tech debt lives is fundamentally different from a team that doesn’t.
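The DORA metrics James checks can be derived from a simple deployment log. Here is a minimal sketch – the record format and sample numbers are invented for illustration, not GreenBox’s actual pipeline:

```python
from datetime import datetime

# Hypothetical deployment log: when each change was committed, when it
# reached production, and a recovery time if the deploy caused a failure.
deploys = [
    {"committed": datetime(2027, 1, 4, 9, 0),
     "deployed": datetime(2027, 1, 4, 12, 30),
     "recovered": None},  # clean deploy
    {"committed": datetime(2027, 1, 5, 10, 0),
     "deployed": datetime(2027, 1, 5, 13, 0),
     "recovered": datetime(2027, 1, 5, 13, 18)},  # failed, fixed in 18 min
]

def dora_metrics(deploys, window_days):
    """The four DORA metrics, computed from a list of deployment records."""
    lead_times = sorted(d["deployed"] - d["committed"] for d in deploys)
    failures = [d for d in deploys if d["recovered"] is not None]
    return {
        "deploys_per_day": len(deploys) / window_days,
        "median_lead_time_hours":
            lead_times[len(lead_times) // 2].total_seconds() / 3600,
        "change_failure_rate": len(failures) / len(deploys),
        "mttr_minutes": (
            sum((d["recovered"] - d["deployed"]).total_seconds()
                for d in failures) / len(failures) / 60
            if failures else 0.0
        ),
    }
```

Tracked weekly, numbers like these are what let a team say “deployment frequency: daily, MTTR: eighteen minutes” and back it with data rather than memory.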

The decision tables impress him more than anything else. The substitution engine – the system that decides what goes in each box when a farm can’t supply what was planned – runs on formal, auditable logic. Every rule has a source (customer preference, seasonal availability, allergen constraint, farm capacity). Every substitution is traceable.

“Who maintains these?”

“Maya wrote the original rules,” Tom says. “Now there’s a team of three – one in Perth, one in Melbourne, one in Brisbane – who update the tables weekly based on farm availability and customer feedback. The tables generate code. The ensemble reviews the generated code before it ships.”

James looks at the tables, then at the generated code, then at the test suite that validates the generated code against the tables. “This is the most auditable substitution system I’ve ever seen in a food company. Most companies I assess have a person who ‘just knows’ what to swap. You’ve turned it into a formal system.”

Tom almost smiles. He remembers Dave Morrison sitting in a workshop, watching his decades of farming knowledge get translated into rows and columns: “I don’t know whether to be flattered or offended.”
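A decision table of this kind is easiest to picture as rows of data plus a tiny evaluator. The rules and fields below are invented examples, not GreenBox’s actual tables – the point is that every swap carries its source:

```python
# Each row of the table: the planned item, its substitute, a guard
# condition, and the source of the rule - so every substitution is
# traceable back to the row that produced it.
RULES = [
    {"planned": "spinach", "substitute": "silverbeet",
     "condition": lambda box: "leafy_greens" in box["preferences"],
     "source": "customer preference"},
    {"planned": "spinach", "substitute": "kale",
     "condition": lambda box: True,  # fallback row
     "source": "seasonal availability"},
    {"planned": "strawberries", "substitute": "blueberries",
     "condition": lambda box: "blueberry" not in box["allergens"],
     "source": "farm capacity"},
]

def substitute(planned_item, box):
    """First matching row wins; the rule's source travels with the swap."""
    for rule in RULES:
        if rule["planned"] == planned_item and rule["condition"](box):
            return rule["substitute"], rule["source"]
    return None, "no rule matched - escalate to manual review"

box = {"preferences": ["leafy_greens"], "allergens": []}
print(substitute("spinach", box))  # ('silverbeet', 'customer preference')
```

Because the rules are data, they can be diffed, reviewed, and tested like the code they generate – which is what makes the system auditable.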

The JTBD research impresses the commercial analyst, too. GreenBox has three years of documented customer interviews – transcripts, insight summaries, pattern analyses. The Tuesday interview cadence means there’s a continuous stream of customer understanding, not a one-off research project that went stale eighteen months ago.

“Your customer understanding is deeper than companies twice your size,” Petra says. “That’s rare.”

What hurts them

But DD is not a celebration. It’s a search for problems. And the problems are there.

Dave’s partnership agreement. Petra reads the six informal agreements and flags them immediately. “These are not commercial contracts. There’s no standard liability clause, no IP assignment, no exclusivity terms, no dispute resolution mechanism. If one of these farm partners decided to supply a competitor tomorrow, you’d have no recourse.”

Maya pushes back. “These relationships are built on trust. Dave Morrison has been our anchor partner for three and a half years. He would never –”

“I’m not questioning the trust. I’m questioning what happens if Dave retires, or if his son Ben decides to take the farm in a different direction, or if a competitor offers double the volume at better terms. Trust is wonderful. Contracts protect the trust when circumstances change.”

The room is quiet. Sam looks at Maya. They both know Petra is right. The key person insurance work surfaced this risk, and Dave signed the lined-paper agreement. But “lined paper from the kitchen drawer” and “enforceable commercial contract” are not the same thing.

The Harvest Box integration. GreenBox acquired a smaller produce-box company in Sydney during the national expansion. The acquisition was strategically sound but operationally painful. Half the Sydney team quit in the first three months. The technology integration is still incomplete – eight months later, two systems run in parallel for Sydney subscriber management.

James finds this during the technical assessment. “You have two subscriber management systems running simultaneously for the same city. One is your primary platform. The other is the legacy Harvest Box system. They sync overnight via a batch job. What happens when they disagree?”

Tom: “We have a reconciliation process that runs daily. Discrepancies are flagged for manual review.”

“How many discrepancies per week?”

Tom checks the dashboard. “Between two and five.”

“Per week. For eight months.” James does the maths. “That’s roughly a hundred and fifty discrepancies that required manual intervention. How many were resolved incorrectly?”

The SRE team pulls the data. Seven. Seven cases where a subscriber received the wrong box, or was charged incorrectly, or had their preferences reset to Harvest Box defaults.

Seven isn’t a crisis. But it’s a system running on duct tape and vigilance, and DD assessors know that duct tape doesn’t scale.
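The nightly reconciliation job James is probing can be sketched as a field-by-field comparison keyed on subscriber ID. The record shapes here are assumptions for illustration:

```python
def reconcile(primary, legacy, fields=("plan", "address", "preferences")):
    """Compare the same subscribers across two systems and flag every
    disagreement for manual review, rather than silently picking a winner."""
    discrepancies = []
    for sub_id, p_rec in primary.items():
        l_rec = legacy.get(sub_id)
        if l_rec is None:
            discrepancies.append((sub_id, "missing_in_legacy", None, None))
            continue
        for field in fields:
            if p_rec.get(field) != l_rec.get(field):
                discrepancies.append(
                    (sub_id, field, p_rec.get(field), l_rec.get(field)))
    # Subscribers present only in the legacy system.
    for sub_id in legacy.keys() - primary.keys():
        discrepancies.append((sub_id, "missing_in_primary", None, None))
    return discrepancies

primary = {"s1": {"plan": "weekly"}, "s2": {"plan": "monthly"}}
legacy = {"s1": {"plan": "weekly"}, "s2": {"plan": "weekly"},
          "s3": {"plan": "weekly"}}
for d in reconcile(primary, legacy):
    print(d)  # each flagged row goes to a human, not an automatic overwrite
```

Two to five flagged rows a week is manageable. The DD concern isn’t the number – it’s that “manual review” is a step that depends on vigilance rather than design.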

Early code without tests. Tom was honest about the subscription system’s test coverage. James digs deeper. The week-one code – Tom’s original prototype, the one he built before the team understood the domain – has been rewritten three times, but archaeological layers remain. Database migration files that reference columns that no longer exist. Dead code paths that the test suite doesn’t exercise. An API endpoint that nobody calls but nobody has removed because “it might break something.”

James lists these in his report. Not as fatal flaws – as indicators of technical debt that accumulated faster than it was addressed.

Contract gaps. The legal reviewer finds more. Three early employees don’t have IP assignment clauses in their contracts – the standard employment agreements were updated eighteen months ago, but the original five team members are still on the founding contracts. If any of them left and claimed they owned code they’d written, GreenBox would have a problem.

Sam, when she hears this, goes very still. She’s one of the original five.

Patricia and the company’s lawyer address this within a week – updated contracts, signed and filed. But the fact that it wasn’t caught until DD is itself a finding. The legal reviewer notes it: “IP assignment gap in founding team contracts – remediated during DD, but indicates that legal hygiene was not prioritised during early scaling.”

Charlotte’s moment

Charlotte has been watching the DD process from a half-step back. She’s not presenting – Maya, Tom, and Sam are the faces of GreenBox. Charlotte is in the room, available for questions about the scaling methodology, but mostly she’s observing.

On Thursday evening, after the DD team has gone back to their hotel, the GreenBox leadership gathers in the meeting room. The Event Storm wall is long gone, replaced by a national operations map. But the room still feels like the place where everything started.

Maya looks drained. Three days of answering questions about every decision she’s ever made. Three days of watching strangers evaluate the thing she built with her bare hands.

“How do you think it’s going?” she asks.

Charlotte is quiet for a moment. Then she says something she hasn’t said before.

“Last time I went through this, we had nothing.”

The room stills. Charlotte doesn’t talk about the meal kit company often. When she does, it’s clinical – lessons learned, patterns to avoid. This is different.

“The meal kit company. Three years before GreenBox. I scaled it from twelve to sixty people. Good product, good team, good market. We got acquisition interest from a national grocery chain. Their DD team came in.” She pauses. “We had no ADRs. No documented decisions. No customer research beyond a survey we ran once. No deployment metrics. No decision tables. No formal architecture documentation. Everything lived in three people’s heads.”

She looks at the wall where the ADR list used to be pinned.

“The DD team shredded us. Not because the company was bad – because we couldn’t prove it was good. Every question they asked, the answer was ‘we just know’ or ‘ask Sarah, she handles that.’ The technology assessment took two weeks instead of three days because nobody could explain why anything was built the way it was. The commercial assessment flagged customer concentration risk that we’d never measured. The legal review found contract gaps that we’d been living with for two years.”

“What happened?” Sam asks.

“The acquirer reduced their offer by 40%. The founders rejected it. Six months later, the company folded.”

Charlotte looks at Maya. “This time is different. You have ADRs that trace the reasoning behind every architecture decision. You have decision tables that make your substitution logic auditable. You have three years of customer research, documented and synthesised. You have DORA metrics that prove your platform works. You have a board that governs, a management layer that functions, and a team that can answer questions without the founder in the room.”

She gestures at the data room folders on the table.

“It’s not perfect. Dave’s contracts are thin. The Harvest Box integration is messy. The early code has gaps. But you have a story you can defend. Last time, I didn’t. That’s the difference between a company that invested in documentation and institutional knowledge and a company that thought it could keep everything in people’s heads.”

Tom, who has been listening, says quietly: “The ADRs. I almost didn’t write them. I thought they were process overhead.”

Charlotte almost smiles. “I know. You told me. Multiple times.”

What DD assessors actually look for

For anyone preparing a company for due diligence – whether for an acquisition, a strategic partnership, or a major funding round – here’s what the assessors are evaluating in each domain.

Technical DD looks for:

  • Architecture documentation – not just diagrams, but reasoning. Why these boundaries? Why these technology choices? ADRs are the gold standard.
  • Deployment practices – how code gets from a developer’s machine to production. Manual? Automated? How often? DORA metrics (deployment frequency, lead time, change failure rate, mean time to recovery) tell the story.
  • Test coverage – not as a single percentage, but as a map. Where is coverage strong? Where is it weak? Does the team know where the gaps are? Awareness matters more than a number.
  • Technical debt inventory – every company has tech debt. The question is whether they’ve identified it, prioritised it, and ring-fenced the highest-risk areas. A team that says “we have no tech debt” is a team that isn’t looking.
  • Security posture – incident history, threat models, access controls, credential management. The assessor isn’t looking for zero incidents. They’re looking for a team that responds well to incidents and learns from them.

Commercial DD looks for:

  • Revenue quality – is the revenue recurring, diversified, and growing? What percentage comes from the top ten customers? (Customer concentration is a discount.)
  • Contract quality – are customer and supplier agreements formal, enforceable, and standardised? Handshake deals are a red flag.
  • Churn analysis – not just the headline number, but the cohort analysis. Are newer customers churning faster or slower than older ones? What drives churn? Is the team actively managing it?
  • Customer understanding – does the company know why customers buy, stay, and leave? Documented JTBD research, ongoing customer interviews, and insight synthesis are evidence of a company that understands its market, not just its product.
  • Competitive positioning – how is the company differentiated? Is the moat real and defensible, or is it a temporary advantage that a better-funded competitor could erode?

Organisational DD looks for:

  • Management depth – is there a functioning leadership team, or does everything depend on the founder? Can the company operate if the founder takes a month off?
  • Key person risk – who holds critical knowledge, relationships, or authority? What happens if they leave? Has the risk been identified and mitigated?
  • Succession plans – for the CEO, the CTO, the head of operations, and any other critical role.
  • Culture indicators – employee retention, engagement survey results (if they exist), Glassdoor reviews, exit interview themes. A company that loses 30% of its people in a year has a problem that no deal structure can fix.
  • Governance – board composition, meeting cadence, decision authority, financial reporting. Professional governance signals a company that takes accountability seriously.

Financial and legal DD follow the same logic: auditable records beat verbal assurances. The theme across all five domains is the same: can this company explain itself? Not just what it does, but why. Not just its current state, but how it got here and where the risks are. Due diligence rewards self-awareness and punishes blind spots.

The DD report

Two weeks after the assessment team leaves, Hartland Group sends the DD report. Forty-seven pages.

Maya reads it alone in the office, after hours. Nadia texted earlier: Dinner’s in the fridge. Read the thing. I’ll be here.

The report is organised by domain. Each section lists findings (neutral observations), strengths (things that increase confidence), and concerns (things that create risk or reduce valuation).

Technology strengths: Architecture Decision Records (“exemplary institutional memory”), decision table engine (“industry-leading auditable substitution logic”), DORA metrics (“top quartile for companies of this size”), bounded context architecture (“well-designed for multi-region scaling”).

Technology concerns: Legacy subscription system code (“elevated tech debt, ring-fenced but not resolved”), Harvest Box integration (“parallel systems introduce reconciliation risk and operational overhead”), deployment access (“production deployment credentials concentrated in three team members – an operational bottleneck and key-person risk; recommend documented break-glass procedures”).

Commercial strengths: Revenue quality (“strong retention, improving unit economics across all cities”), customer understanding (“continuous discovery practices are embedded and producing actionable insights”), farm network (“fifty-two direct farm relationships represent a genuine and difficult-to-replicate competitive moat”).

Commercial concerns: Farm contract quality (“six partnerships lack standard commercial terms – remediation recommended before any transaction”), customer concentration by city (“Perth represents 35% of revenue – geographic diversification ongoing but incomplete”), month-to-month subscriber dominance (“82% of revenue is cancellable without notice”).

Organisational strengths: Management layer (“functional and tested under scaling pressure”), board governance (“independent director, regular board packs, KPI tracking”), key person mitigation (“insurance, documentation, cross-training – significantly above average for company stage”).

Organisational concerns: Founding team contracts (“IP assignment gap remediated during DD – indicates legal hygiene was reactive”), founder dependency (“reduced through mitigation work but Maya Chen remains the primary holder of strategic vision and external relationships”).

Maya reads the summary paragraph three times.

GreenBox presents as a well-run, well-documented company with genuine competitive advantages in farm relationships, customer understanding, and technical architecture. The discovery and documentation practices – particularly ADRs, decision tables, and continuous customer research – provide a level of institutional knowledge that is rare in companies of this stage. Key concerns centre on contract formalisation, integration completion, and geographic revenue concentration. Overall, Hartland Group’s assessment team recommends proceeding to the next stage of discussion.

She sits with it for a while. The office is dark except for her desk lamp. Outside, a delivery van – not a GreenBox one, someone else’s – rumbles down the street.

Three and a half years ago, five people sat in this room and wrote sticky notes about what was going wrong. Tom had rebuilt the subscription model twice. Jas had thrown away a full set of designs. Priya was blocked. Sam had forty-seven unread emails. Maya stood in front of a wall of pink sticky notes and said: “Everyone is frustrated with me.”

Lee had said: “When was the last time you all stopped and talked about how the work is going?”

Everything since then – the Event Storms, the decision tables, the ADRs, the JTBD interviews, the cadence, the governance, the board, the team, the fifty-two farms and fifty thousand subscribers and eighty people across eight cities – all of it started with that question.

And now a forty-seven-page report says the company can explain itself. Can defend its decisions. Can trace its reasoning. Can prove that the thing it built is real.

Maya takes a photo of the report’s summary paragraph and sends it to Charlotte. No caption.

Charlotte replies at 11pm: I wish I’d had this the first time.

Maya sends one more message. To Dave. She knows he won’t read it until morning, because Dave doesn’t look at his phone after dinner.

DD report came back. They said the farm network is a “genuine and difficult-to-replicate competitive moat.” Thought you should know.

Dave’s reply arrives at 6:14am the next morning, while Maya is on the coastal track.

Tell them it took three generations to build. They can put that in their report.

Maya laughs, alone, running along the ocean. The sound startles a seagull off the path ahead of her.

The due diligence is done. What comes next is the question Maya has been avoiding since the Cerulean partner first asked it, in a Surry Hills meeting room that feels like a lifetime ago: what does GreenBox want to be?

Hartland Group has answered one question. GreenBox passes. The company is real, defensible, and worth partnering with.

Now Maya has to answer a different question. And this one isn’t in any report.

Questions or thoughts? Get in touch.