Learning to Use Mixed Methods: A Researcher’s Journey from Numbers to Meaning
By Dr Tracy Zhandire
Why Methodological Choices Matter in Research
Most early-career researchers are trained to associate methodological rigour with numbers. Large samples, statistical significance, and clean datasets are often treated as the gold standard of credible research. Quantitative approaches are powerful, and in many disciplines they are prioritised, sometimes implicitly and sometimes explicitly, as the safest route to legitimacy. Yet while methodological guidance is widely available, far less is written about the moment a researcher realises that a single method, however rigorous, cannot answer their research question.
However, real-world research rarely unfolds in neat, predictable ways. Many research questions, particularly those involving people, institutions, and complex social systems, produce findings that are technically correct but difficult to interpret or apply. Researchers may identify clear patterns while still struggling to explain why those patterns exist or why they do not align with participants’ lived realities.
This gap between what the data show and what people experience is not a weakness of research; it is a methodological challenge. How researchers respond to this challenge determines whether their work remains descriptive or becomes genuinely meaningful and useful. It was within this space between numerical clarity and contextual uncertainty that mixed methods became relevant to my own research. Drawing on this experience, this blog post introduces mixed methods research as a practical response to complexity in real-world research. It reflects on my journey from relying solely on quantitative approaches to recognising when and why mixed methods became necessary. Through concrete examples, I outline the situations that signalled this shift, the methodological choices I made, the challenges I encountered, and the lessons learned along the way. The aim is to offer early-career researchers a grounded account of implementing mixed methods that demonstrates how methodological rigour can be maintained while remaining responsive to lived experience.
The Mixed Methods Journey: Dealing with Unanswered Questions
I did not begin my research career intending to use mixed methods. Like many junior researchers, I relied primarily on quantitative approaches because they felt structured, defensible, and familiar.
In one of my early studies on healthcare access, I set out to assess whether an intervention had improved access by measuring standard quantitative indicators, including clinic proximity, service availability, and utilisation rates. These measures were selected because they are commonly used to evaluate access and appeared sufficient for capturing improvement. The quantitative findings were clear: clinics were closer to communities, service availability had increased, and utilisation indicators suggested progress. From a numerical perspective, the intervention appeared successful.
Initially, I considered the study complete. The data answered the research questions as designed, and the findings aligned with policy expectations. It was only through late follow-up engagement with community members that I realised these indicators did not accurately capture how access was experienced in practice. During informal conversations and community meetings, participants raised concerns that had not appeared in the survey data: frustrations about how they were treated at clinics, anxieties about confidentiality, and a growing reluctance to seek care despite improved proximity. When I conducted more deliberate follow-up interviews to explore these observations, participants described a decline in the quality of care, a lack of dignity, and reduced trust in the health system: dimensions of access that had not previously been measured.
While the data showed improved physical access, they failed to address questions about the relational, experiential, and ethical aspects of care. The survey had told me what had changed and by how much, but it could not explain why communities felt worse off despite improved access, or how these changes were being experienced in daily life, questions I had not anticipated needing to ask. This gap between what was measured and what mattered to participants prompted a critical reassessment of my methodological approach.
At that point, I was confronted with a situation that many early-career researchers encounter but rarely name: the data were not wrong, but they were incomplete. I had begun with a purely quantitative approach, using surveys to assess measurable changes in access following the intervention. These data answered what had changed and how much. However, they could not explain how those changes were experienced or why they mattered to the people affected. It was only through subsequent qualitative engagement that these gaps became visible. This was the moment I realised that some research questions cannot be answered responsibly using a single method, and that a sequential explanatory approach, beginning with quantitative data and following with qualitative inquiry, was necessary to produce meaningful understanding.
Mixed methods became relevant not as a theoretical preference, but as a practical response to questions such as:
- Why do outcomes look positive while participants report negative experiences?
- Why does an intervention work in one context but fail in another?
- Why do stakeholders disagree with what the data appear to show?
- Why are patterns visible without clear explanations or mechanisms?
These scenarios are common in health, education, and social research—particularly for junior researchers working in complex, resource-constrained, or culturally diverse settings. Mixed methods emerge precisely at these points of tension, where measurement alone is insufficient.
The Actual Journey: What I Selected and Why
Here's what I actually did, step by step. Having completed the quantitative survey on healthcare access and identified patterns I could not fully explain, I did not turn to a mixed methods textbook to select a design. Instead, I asked a simpler question: What evidence do I need to answer this research question honestly and responsibly?
In the healthcare access study, I needed to explain existing quantitative patterns. This led me, retrospectively, to what is known as a sequential explanatory approach. In practice, this means: Phase 1, quantitative data collection and analysis; Phase 2, qualitative interviews informed by the findings from Phase 1; Phase 3, integration of both datasets. I analysed the survey data first, identified patterns that raised questions, and then designed interviews specifically to explore the reasons behind those findings. The qualitative data neither confirmed nor contradicted the quantitative results; instead, they addressed questions that the survey data could not answer.
After analysing the quantitative findings, I returned to the field with questions that the survey data could not answer. While the quantitative phase established what had changed and the extent of those changes, I could not explain from these data alone how changes were experienced, why certain indicators mattered more than others, or how community members interpreted these shifts in their everyday interactions with the health system.
I designed the qualitative phase to address these gaps. The quantitative analysis had revealed age-differentiated patterns of clinic utilisation: younger women had increased their use of the service, while use among older adults had declined. I could measure this pattern, but I could not explain it. I therefore developed a semi-structured interview guide focused on participants’ experiences of seeking care, their perceptions of service quality, their relationships with providers, and the factors influencing their decisions to use or avoid the newly accessible clinic. I purposively selected participants to represent the different patterns observed in the survey data.
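For readers who like to see the mechanics, here is a minimal sketch, in Python with pandas, of how phase-one survey results might flag such a pattern and guide purposive selection. The dataset, column names, and thresholds are invented for illustration; they are not drawn from the study itself.

```python
# Hypothetical sketch (not the study's actual analysis): flag an
# age-differentiated utilisation pattern in phase-one survey data, then
# purposively select interviewees covering each observed trajectory.
import pandas as pd

# Illustrative extract: one respondent per row, clinic visits before and
# after the intervention. Real variables would differ.
survey = pd.DataFrame({
    "participant_id": [1, 2, 3, 4, 5, 6],
    "age_group": ["18-34", "18-34", "35-54", "55+", "55+", "55+"],
    "visits_before": [1, 0, 2, 4, 3, 5],
    "visits_after": [3, 2, 2, 1, 0, 2],
})
survey["change"] = survey["visits_after"] - survey["visits_before"]

# Pattern check: mean change in utilisation by age group.
print(survey.groupby("age_group")["change"].mean())

# Label each respondent's trajectory, then sample at least one interviewee
# per trajectory so the qualitative phase covers every observed pattern.
survey["trajectory"] = pd.cut(
    survey["change"], bins=[-10, -1, 0, 10],
    labels=["declined", "stable", "increased"],
)
interviewees = survey.groupby("trajectory", observed=True).sample(n=1, random_state=1)
print(interviewees[["participant_id", "age_group", "change"]])
```

The point is not the code but the linkage: the sampling frame for the interviews is derived directly from the quantitative findings, which is what makes the design sequential and explanatory.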
The most significant outcome of the quantitative phase was not the improvement or decline captured in the numbers, but the questions those numbers raised and could not answer: why did some groups increase their clinic use while others declined? What was shaping these different responses? These explanatory gaps necessitated returning to the field with a qualitative inquiry designed to explore the meanings, experiences, and relational dynamics behind the numerical patterns. Through these in-depth interviews, I explored dimensions of care such as dignity, trust, and perceived quality: issues my original survey instrument had not captured. What I discovered completely reframed the quantitative findings. Reduced distance did not translate into meaningful access when quality, trust, and dignity were compromised; the real story lay in the disconnect between indicators and experience.
In later work, I encountered the opposite challenge. While developing a documentation tool for traditional health practitioners, I initially relied on structured instruments, assuming that categories used in conventional healthcare settings could be adapted to traditional practice. The tool was technically sound but socially ineffective. Practitioners did not recognise their own practices in it. What I had designed as a documentation system reflected my own frameworks and assumptions about how knowledge should be organised, not how practitioners themselves understood, valued, or structured their healing knowledge. This time, I reversed the sequence. Rather than starting with measurement and discovering its inadequacy later, I needed to first understand the indigenous logics, categories, and relational structures that organised traditional healing practice. This led me to what is known as a sequential exploratory approach. In practice, this meant: Phase 1, qualitative data collection through focus groups and open discussions; Phase 2, analysis to identify indigenous knowledge frameworks and organisational principles; Phase 3, development of structured, quantitative documentation tools informed by the qualitative findings.
I began with qualitative engagement, conducting focus groups and open discussions to understand meanings, priorities, and local logics. These conversations revealed that practitioners organised their knowledge around spiritual relationships, ancestral connections, and contextual factors that had no equivalent in the standardised categories I had initially proposed. Only once I understood these frameworks from practitioners’ perspectives could I begin developing documentation approaches they found meaningful and practically useful. Only after that did I move toward measurement.
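To make the exploratory sequence concrete, here is a small hypothetical sketch of how coded focus-group material might be tallied into candidate fields for a documentation tool. The codes, threshold, and derivation logic are illustrative assumptions, not the project's actual analysis.

```python
# Hypothetical sketch: derive candidate fields for a structured
# documentation tool from codes applied during qualitative analysis.
from collections import Counter

# Each focus-group excerpt was tagged with one or more codes (illustrative).
coded_excerpts = [
    ["ancestral connection", "diagnosis"],
    ["spiritual relationship", "remedy preparation"],
    ["ancestral connection", "referral"],
    ["contextual factors", "diagnosis"],
    ["ancestral connection", "remedy preparation"],
]
code_counts = Counter(code for excerpt in coded_excerpts for code in excerpt)

# Recurring codes become candidate fields, so the instrument reflects
# practitioners' own organising categories rather than imposed ones.
candidate_fields = [code for code, n in code_counts.most_common() if n >= 2]
print(candidate_fields)
```

The direction of derivation is what matters here: the instrument's categories come out of the qualitative analysis rather than being imposed on it.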
Across these projects, the lesson was consistent: the choice of mixed methods design followed the research problem, not methodological preference. Different questions required different pathways, but all required more than one way of knowing. When I needed to explain existing quantitative patterns, a sequential explanatory design was the most suitable approach. When I needed to understand a phenomenon before attempting to document or measure it, a sequential exploratory design was necessary. While I have primarily used sequential designs in my work, other mixed methods approaches, such as convergent designs where both data types are collected simultaneously, may be more appropriate for different research questions. What matters is not exhaustive use of all designs, but thoughtful selection based on what the research problem genuinely requires.
Lessons Learnt: What Mixed Methods Demands of You
Bringing qualitative inquiry into the study fundamentally changed the completeness of the findings. The initial quantitative phase established clear patterns of change in service access, while the subsequent qualitative phase explained how those changes were experienced and why they mattered to community members. The quantitative data showed what had changed and by how much; qualitative data revealed the meanings, relationships, and lived realities behind those patterns. Together, the two strands of data provided coherent and complementary answers to the research questions, thereby resolving the gaps that had remained when relying on a single survey alone.
This integration produced a more complete, contextually grounded understanding than either method could achieve independently. It was at this point, when the data were no longer fragmented but integrated, that I recognised both the value and the demands of mixed methods research. The success, however, came with lessons that are rarely discussed openly with early-career researchers about what mixed methods actually require.
First, mixed methods take time. Designing, collecting, analysing, and integrating different forms of data requires careful planning and realistic timelines. It is not something to take on lightly or without support.
Second, integration does not happen automatically. Collecting both quantitative and qualitative data does not, in itself, constitute mixed methods research. Mixing requires deliberate decisions about when and how different data strands are brought together to inform interpretation. In practice, this can occur at multiple points in a study, but it becomes most critical during the analysis phase.
In my work, mixing occurred at two key points: first, when quantitative findings were used to shape the focus of the qualitative study, determining which questions to pursue and whom to interview; and second, when qualitative insights were used to explain, contextualise, and sometimes challenge numerical patterns. During analysis, this involved moving beyond parallel reporting of results to actively comparing findings across data types, examining where they converged, diverged, or addressed different aspects of the same phenomenon. For example, the quantitative data revealed age-differentiated patterns in clinic utilisation, with younger women increasing their use while older adults declined, but the numbers alone could not explain why. Qualitative interviews revealed that younger women valued the privacy of the new facilities, while older participants missed the relational trust of previous care sites. The pattern became interpretable only when both datasets were integrated: survey data quantified changes in access, while interview data illuminated how those changes were experienced, revealing dimensions that were invisible in the numbers alone.
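One simple integration device is a joint display that lines up each group's quantitative pattern against the qualitative theme that explains it. The sketch below, again a hypothetical illustration rather than the study's actual workflow, shows how such a display might be assembled with pandas; the group labels and themes paraphrase the example above.

```python
# Hypothetical sketch of a simple "joint display": merge the quantitative
# pattern for each group with the qualitative theme that explains it,
# then annotate how the two strands relate. Names are illustrative.
import pandas as pd

quant = pd.DataFrame({
    "group": ["younger women", "older adults"],
    "utilisation_change": ["increased", "declined"],  # from survey analysis
})
qual = pd.DataFrame({
    "group": ["younger women", "older adults"],
    "dominant_theme": [
        "privacy of new facilities valued",
        "relational trust of previous care sites missed",
    ],
})

# Merge the strands on the shared unit of comparison, then record whether
# each row converges with, diverges from, or explains the numeric pattern.
joint = quant.merge(qual, on="group")
joint["integration_note"] = ["theme explains increase", "theme explains decline"]
print(joint.to_string(index=False))
```

Even in tabular form, this forces the analytical move that matters: the two strands are read against each other row by row, rather than reported in parallel.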
Integration also shaped how I presented findings. Rather than separating results by method, I organised the analysis thematically, with each theme drawing on both quantitative and qualitative evidence to build a coherent narrative.
Third, divergence is not failure. When qualitative and quantitative findings do not align, this often signals the most important insight. Some of my most meaningful contributions emerged from these moments of tension between datasets.
Finally, mixed methods require reflexivity: continually examining assumptions, positionality, and methodological limits throughout the research process. Assumptions should be made explicit during the design process. Shaped by my training in public health evaluation, I initially assumed that standard quantitative indicators sufficiently measured access. Making this assumption explicit early allowed me to recognise when the data challenged it.
Reflexivity becomes critical during analysis. When quantitative findings showed improvement while participants described deterioration, I confronted a deeper assumption: that numbers were more ‘objective’ than lived experience. This prompted productive questions: What does each dataset reveal? Whose definitions am I privileging? How does my positionality shape what counts as evidence? Practically, I kept reflexive memos documenting interpretive tensions. When I noticed that a draft foregrounded the quantitative findings while treating the qualitative data as mere illustration, the memos prompted a restructuring that gave both strands equal analytical weight. Reflexivity thus connects design, analysis, and interpretation. While uncomfortable, particularly for early-career researchers, it ensures that research remains accountable to lived realities, not just predefined questions, producing work that is both methodologically credible and contextually responsive.
Practical Recommendations for Early-Career Researchers
Based on my experience, several considerations may guide early-career researchers contemplating mixed methods:
Start with the research question, not the method. Ask what evidence is genuinely required to answer your question, rather than defaulting to familiar approaches. If a single method suffices, there is no need to add complexity. Mixed methods emerge from the inquiry’s demands, not from methodological preference.
Be clear about why you are mixing methods. Different purposes, whether explanation, exploration, or contextualisation, imply different designs and sequences. Being explicit about your rationale ensures coherence across design, collection, and analysis. Without this clarity, integration becomes difficult, and the study risks becoming methodologically plural but conceptually disconnected.
Discuss methodological choices with your supervisor early on. This is critical, especially where mixed methods are unfamiliar or unevenly supported. Supervisors need to understand not only what you plan to do but why and how you will manage the additional demands. Early alignment prevents confusion and ensures necessary support.
Avoid overextending yourself. Mixed methods need not be large or complex to be effective. A small, well-integrated qualitative component often adds more value than an ambitious but poorly connected design. Focus on doing integration well rather than attempting everything at once.
Respect both methodological traditions. Quantitative and qualitative approaches have distinct standards of rigour; neither should be treated as secondary. Mixed methods require competence in both and a willingness to engage seriously with each tradition’s strengths and limitations.
Closing Reflection
Mixed methods did not make my research easier. It made it more demanding, but also more meaningful. If you have ever found your findings to be technically correct yet contextually incomplete, you are not alone; many early-career researchers encounter this tension without the language to name it or a clear pathway to address it. Mixed methods offer a way forward, not as a formula or guarantee, but as a reflective approach to research that takes complexity, context, and lived experience seriously. The approach requires time, intentional integration, and reflexivity, but it also allows researchers to ask better questions and produce more credible and useful knowledge. Like any methodological skill, mixed methods competence develops through practice. Start where you are, learn from each project, and let your research questions, rather than methodological loyalty, guide your choices.
Dr Tracy Zhandire
First image: TSD Studio on Unsplash
Second image: Rifky Nur Setyadi on Unsplash
