Silent Data: What We Don’t Capture
22 Aug
What if our tools are filtering out the most human parts?
The tension emerged in a familiar form: a scroll heatmap with a flat line, no visible clicks, and a near-total bounce rate. To the team, the conclusion seemed obvious. “No one’s engaging with this page,” someone said. “We should move the CTA up.” But that verdict felt strangely hollow. I remembered the session, or more precisely the user: they had lingered, breathed, scrolled down slowly, then paused for a long time. No interaction, no click. But not nothing, either.
What was happening in that pause? The tools offered no answer. Just absence, rendered as failure.
A case where the screen recording failed to explain behaviour
This particular session was a composite, drawn from multiple rounds of moderated testing for a content-heavy landing page. The metrics were bleak. No clicks below the fold. High exit rate. But one participant’s recording showed a curiously slow interaction. They read every line. They scrolled with care, almost hesitantly, as if searching for something unnameable. Then they left. No questions asked. No indication of confusion.
In a follow-up call, they described the experience as “a bit too much at once… but I didn’t want to rush it.” There was a kind of respect in their slowness. They weren’t bouncing, they were processing.
Yet to the analytics layer, it looked like disinterest.
This gap, between presence and interaction, between what is sensed and what is tracked, became the lens through which we revisited other sessions.
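To make that gap concrete: the same session that a click metric writes off as a bounce can carry a rich pause structure in its timestamps. A minimal sketch, using an entirely invented event log (the event names, values, and schema are assumptions for illustration, not any real tool’s output):

```python
# A hypothetical session log: (seconds_from_start, event_type).
# Everything here is illustrative, not drawn from a real analytics tool.
session = [
    (0.0, "page_load"),
    (2.1, "scroll"),
    (4.8, "scroll"),
    (15.2, "scroll"),   # roughly a ten-second pause precedes this
    (41.0, "page_exit"),
]

def longest_pause(events):
    """Longest gap in seconds between consecutive logged events."""
    times = [t for t, _ in events]
    return max(b - a for a, b in zip(times, times[1:]))

def is_click_bounce(events):
    """True if the session contains no click events at all."""
    return not any(kind == "click" for _, kind in events)

# A click-based metric calls this session a bounce...
print(is_click_bounce(session))            # True
# ...yet the gap structure shows a long, silent dwell.
print(round(longest_pause(session), 1))    # 25.8
```

The point is not that a pause metric would have saved us; it is that the raw timestamps already contained the silence, and our default reports simply never asked about it.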
What was missing: pauses, gesture, emotional tone
This is where language fails us: we say “user behaviour,” but record only motion. We track taps and scrolls, not silences or furrowed brows. A participant’s pause, sometimes a full ten seconds, often reveals more than any quote. Their hand hovering over a button. A slight shift in posture. A sigh.
None of it captured. None of it categorised.
Quantitative tools can mask this with granularity. Qualitative tools can distort it with narrative. But it’s in the unscripted space between them, between what’s said and what’s stored, that some of the most meaningful signals live.
We began to notice these moments more deliberately:
• When someone hesitated to criticise.
• When eyes flicked sideways toward an unclicked element.
• When a scroll stopped, not due to confusion, but consideration.
Noticing required us to slow down too.
This observation sits in close kinship with the approach described in Ethnographic Methods in UX, where presence and pacing guide what becomes legible.
Mixed methods, speculative prompts, and analogue note-taking
To make room for this slower noticing, we tried something modest: writing by hand during sessions. Not transcriptions or timestamped events, just impressions. Mood. Pacing. Fragmented phrases like “leaning back, smile faint.” It changed how we listened. The act of writing slowed our response time and made us more porous.
Later, in synthesis, these notes became a soft frame, not the final word, but a cue to revisit recordings with new eyes. They led us to ask different questions:
• Not “What went wrong here?” but “What might have passed unspoken?”
• Not “How many dropped off?” but “When did engagement shift?”
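The second question, when engagement shifted rather than how many dropped off, can at least be roughed out from timestamps. A crude sketch, with invented scroll samples and an arbitrary slowdown threshold (none of this reflects a real analytics API):

```python
# Hypothetical (seconds, scroll_depth_in_px) samples from one session.
samples = [(0, 0), (2, 400), (4, 800), (14, 850), (24, 900)]

def engagement_shifts(samples, slowdown_ratio=0.25):
    """Yield times where scroll speed drops below a fraction of its
    previous value: a crude proxy for a shift from skimming to reading."""
    speeds = [
        (t1, (y1 - y0) / (t1 - t0))
        for (t0, y0), (t1, y1) in zip(samples, samples[1:])
    ]
    for (_, v0), (t1, v1) in zip(speeds, speeds[1:]):
        if v0 > 0 and v1 < v0 * slowdown_ratio:
            yield t1

print(list(engagement_shifts(samples)))   # [14]
```

A detector like this flags a moment, not a meaning; it earns its place only as a prompt to go back to the recording and watch what the slowdown actually was.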
We also ran a short trial with speculative methods, lightly inspired by cultural probes. Participants were invited to mark, title, or sketch moments on the interface where they felt something: hesitation, tension, clarity, delight. Some drew lines around white space. One labelled a content block “a pause I needed.” Another crossed out a CTA with the word “too soon.”
The results were imprecise, and not designed to be codified. What mattered was the permission: to surface inner tempo, to let emotional tone enter the frame. These activities revealed structure not by precision, but by association. They made visible what the tools had made mute.
This reframing of method is explored further in The Method is the Medium, where tools are treated as ethical filters, not neutral instruments.
Personal reflections
What changed most was how I read a session. I used to begin with tasks and outputs: what got done, what didn’t. But now I attend first to tempo. Did something slow down? Speed up? Did the participant’s voice falter? Did the air change?
These shifts often point to something just outside articulation, not yet named, but present. I don’t claim to capture it fully. But I no longer assume absence is absence. Some things don’t show up in the tools because they aren’t meant to. They are felt, not logged.
That doesn’t mean we abandon structure. It means we expand our sensitivity. A good tool helps us measure. A good researcher learns to feel what the tool leaves out.
Note: This echoes some of the tensions raised in What Research Forgets, where silence isn’t failure but unacknowledged form.