Code Coherence Analysis: Catching AI-Pasted Submissions
What "coherent" code looks like
Real engineers have habits. They name booleans the same way across functions (isReady vs is_ready vs ready, but consistently). They prefer one error-handling pattern. They reach for the same standard library tools repeatedly. Their TODOs and comments share a voice.
Even good engineers are messy — but they are messy in a consistent way. That consistency is what we mean by coherence.
What incoherent code looks like
LLM-pasted submissions, especially those stitched together from multiple prompts, break consistency in predictable ways:
- One function uses try/except, the next uses optional chaining, the third silently swallows errors.
- Variable naming style flips: userId, user_id, uid, all in the same file.
- Comments alternate between "explains the obvious" (LLM tell) and "missing entirely" (human tell).
- Idiom level swings: textbook generic-typed solutions next to copy-pasted Stack Overflow snippets.
Any one of these alone could be a tired candidate. All four together tell a different story.
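One of these signals, naming-style drift, is simple enough to sketch mechanically. The snippet below is a minimal illustration of the idea, not ClarityHire's implementation: it counts camelCase versus snake_case identifiers in a file and flags a file where the minority style makes up a large share.

```python
import re
from collections import Counter

# Hypothetical sketch of ONE coherence signal: identifier naming style.
# The regexes and threshold are illustrative assumptions, not a product spec.
CAMEL = re.compile(r"\b[a-z]+(?:[A-Z][a-z0-9]*)+\b")   # e.g. userId, maxRetries
SNAKE = re.compile(r"\b[a-z]+(?:_[a-z0-9]+)+\b")        # e.g. user_id, retry_count

def naming_style_mix(source: str) -> Counter:
    """Count camelCase vs snake_case identifiers in a source string."""
    return Counter({
        "camel": len(CAMEL.findall(source)),
        "snake": len(SNAKE.findall(source)),
    })

def is_style_incoherent(source: str, threshold: float = 0.25) -> bool:
    """Flag files where the minority style exceeds `threshold` of matches."""
    counts = naming_style_mix(source)
    total = counts["camel"] + counts["snake"]
    if total == 0:
        return False
    return min(counts.values()) / total > threshold

mixed = "userId = 1\nuser_id = 2\nmaxRetries = 5\nretry_count = 0\n"
print(is_style_incoherent(mixed))  # an even camel/snake split flags as incoherent
```

A real pass would look at many signals at once (error handling, comment voice, idiom level), which is why the document leans on an LLM judge rather than regexes; this sketch only shows why the naming signal is cheap to detect.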
How the analysis runs
ClarityHire's coherence pass reviews the candidate's final submission with one prompt to an LLM judge: does this look like the work of a single author, or stitched together? The judge returns a score, the specific inconsistencies it noticed, and a confidence level.
Crucially, the judge never sees the candidate's identity. It only sees the code.
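The shape of that judge call can be sketched as follows. This is an assumption-laden illustration: the actual ClarityHire prompt wording, response schema, and field names are not public, so everything below (`JUDGE_PROMPT`, `CoherenceVerdict`, the JSON keys) is hypothetical.

```python
import json
from dataclasses import dataclass

# Hypothetical prompt template -- only the code is interpolated, never the
# candidate's identity, mirroring the blinding described above.
JUDGE_PROMPT = """You are reviewing a coding submission. Does this look like
the work of a single author, or stitched together from multiple sources?
Respond with JSON: {{"score": 0-100, "inconsistencies": [...], "confidence": "low|medium|high"}}

Submission (author identity withheld):
{code}
"""

@dataclass
class CoherenceVerdict:
    score: int                  # 0 = clearly stitched, 100 = clearly single-author
    inconsistencies: list[str]  # specific mismatches the judge noticed
    confidence: str             # the judge's own confidence level

def build_prompt(code: str) -> str:
    """Build the judge prompt from the submission alone."""
    return JUDGE_PROMPT.format(code=code)

def parse_verdict(raw_response: str) -> CoherenceVerdict:
    """Parse the judge's JSON reply into a structured verdict."""
    data = json.loads(raw_response)
    return CoherenceVerdict(
        score=int(data["score"]),
        inconsistencies=list(data["inconsistencies"]),
        confidence=str(data["confidence"]),
    )

# Canned response for illustration; a real call would go to an LLM API.
sample = '{"score": 35, "inconsistencies": ["naming style flips"], "confidence": "high"}'
verdict = parse_verdict(sample)
print(verdict.score)  # 35
```

Returning a score alongside the specific inconsistencies matters: the list of mismatches is what makes the flag actionable in the follow-up conversation, rather than an opaque number.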
Why this works better than "AI-detector" tools
Most "is this AI" detectors are unreliable, especially on code (LLMs and humans write similar Python). Coherence analysis sidesteps the question entirely: we do not care whether AI wrote it; we care whether it was written as a unified solution by one mind. That framing is much more answerable, and much more aligned with what hiring managers actually want to know.
What to do with a coherence flag
Treat it as a prompt to ask the candidate to walk through their code in a 20-minute live follow-up. Honest candidates explain it easily. Dishonest ones cannot. Either way, you learned something.