Why Research on DPI? Building Blocks of Evidence Generation on DPI Rollout

Explores the necessity and spectrum of research for evaluating Digital Public Infrastructure (DPI) impact, focusing on inclusion and evidence generation.

Updated: Mar 28, 2025
Video by Jordan Kraemer, Christian Meyer, Abdoulaye Ndiaye, Jean-Louis Arcand, Peter K., Elizabeth T., and Kunal Sen

This panel session from the Global DPI Summit 2024 addresses the critical role of research in understanding and maximizing the impact of Digital Public Infrastructure (DPI). It explores the spectrum of research methodologies needed, from rapid policy insights to rigorous academic evaluations, emphasizing their complementarity and the importance of collaboration across the ecosystem. The discussion highlights the need to move beyond simply deploying technology to actively measuring its effects, particularly concerning inclusion and real-world outcomes for individuals, governments, and the private sector.

Synthesized Summary

The session underscores that while DPI holds immense potential, its positive impact is not automatic and requires deliberate measurement and evidence generation (“showing our work”). Referencing the Solow Paradox, the moderator notes that technological advancements don’t always translate directly into measurable productivity or societal gains without proper understanding and adaptation. The panel presents a spectrum of research approaches necessary for DPI rollout, ranging from descriptive, policy-focused work (useful for rapid feedback and implementation adjustments) to rigorous, long-term academic research like Randomized Controlled Trials (RCTs) needed for causal impact attribution.

Various organizations presented their roles within this research ecosystem. Oxford’s OxDPI Lab focuses on causal impact evaluation, collaborating closely with governments (like Ethiopia’s NIDP on the Fayda ID system) and acting as an independent convenor. J-PAL Africa’s DigiFI initiative funds and co-generates experimental research on digital ID and finance reforms across Sub-Saharan Africa, emphasizing RCTs and matchmaking researchers with implementers. The Global Development Network (GDN) champions demand-driven research led by Global South researchers, providing support and training to ensure local relevance and policy influence. The World Bank employs a comprehensive Monitoring, Evaluation, and Learning (MEL) framework, using global datasets (ID4D, Findex, ASPIRE) and country-level studies to track impact across stakeholders, focusing on inclusion, data protection, and user empowerment. ISER Uganda brings a civil society perspective, stressing the need for community-driven research and using qualitative methods (like photo essays) to highlight lived experiences and advocate for policy changes based on ground-level evidence. Dalberg discussed a methodology being developed with Co-Develop to track inclusion systematically across DPI implementations, focusing on access to DPI, access/use of enabled services, and demographics, particularly for marginalized groups.

A common thread is the necessity of multi-stakeholder collaboration—involving researchers, governments, CSOs, and implementers—to generate robust, relevant, and actionable evidence for building effective and inclusive DPI.

Key Learnings & Recommendations

  • Evidence is Non-Negotiable: Simply deploying DPI is insufficient; rigorous research and evidence are required to demonstrate and understand its actual impact, moving beyond assumptions. [00:31], [00:48]
  • Spectrum of Research: Different research needs require different methodologies. A mix is essential, from rapid descriptive/policy analysis for timely feedback during rollout to long-term, rigorous academic studies (including RCTs and quasi-experimental designs) for causal impact evaluation. [01:11], [01:35], [15:45]
  • Complementarity of Methods: Policy-focused research and academic research are complementary, not mutually exclusive. Both provide valuable insights for different audiences and purposes. [02:00], [03:01]
  • Focus on Inclusion: Measuring DPI’s impact must explicitly track inclusion and exclusion, particularly for marginalized and vulnerable populations (women, poor, minorities, refugees, persons with disabilities). [38:30], [40:05], [22:13]
  • Understand Exclusion Drivers: Research should identify the reasons for exclusion from DPI or related services (e.g., lack of awareness, complex processes, cost, lack of foundational ID). [39:41]
  • Local Ownership & Relevance: Research questions and methodologies should be demand-driven and co-created with local stakeholders (governments, local researchers, CSOs) to ensure relevance and facilitate policy uptake. [15:07], [17:14], [30:45]
  • Collaboration is Key: Effective evidence generation requires collaboration across the entire ecosystem – researchers (local and international), governments, implementers, CSOs, and development partners. [13:45], [35:48], [07:03]
  • Timeliness Matters: While rigor is important, mechanisms for generating timely evidence are needed to inform policy adjustments during DPI implementation, not just years later. [01:39], [08:31]
  • Go Beyond Numbers: Quantitative data should be complemented by qualitative insights and understanding lived experiences (e.g., through community engagement, photo essays) to grasp the full picture of DPI’s impact. [31:01], [33:50]
  • Data as Public Good: Research data and designs should ideally be treated as public goods to enable broader analysis and learning. [07:30]

Key Visual Information

  • Building Blocks of Evidence Generation Slide: [01:05] - [02:41], [09:00], [13:39], [18:07], [20:18] This slide depicts a spectrum of research on DPI rollout, adapted from World Bank ID4D.
    • Horizontal Axis: Shows a continuum from “Descriptive policy focused” research on the left to “Academic research” on the right.
    • Building Blocks (Above Axis): Illustrates key components supporting evidence generation:
      • ID system(s) (meta)data: Foundation for monitoring, studies, evaluations.
      • Dashboards, portals: Interfaces for staff, leadership, and public access to data/reports.
      • Studies and Evaluations: Periodic data collection and analysis on design, pilots, implementation, and impact.
    • Organizational Mapping: The moderator overlays logos/names of participating organizations (ISER, Dalberg, GDN, World Bank, Oxford, J-PAL) roughly corresponding to their position on the research spectrum. This visually reinforces the different but complementary roles these actors play.
  • Spotlight on Ethiopia: Fayda.Lab Slide: [05:45] Shows logos of NIDP, World Bank, ID4D, and University of Oxford, alongside bullet points detailing the context (MOSIP-based ID rollout) and the research focus (inclusivity, impact evaluations in social protection, finance, refugee integration). Includes a photo from the stakeholder kick-off meeting.
  • DigiFI Initiative Slide: [09:11] Displays logos of J-PAL Africa, Bill & Melinda Gates Foundation, and the French Treasury, outlining DigiFI’s mission to co-generate evidence on digital ID and finance reforms via experimental research.
  • Emerging Insights Slide (J-PAL): [10:29] Shows titles/abstracts of research papers, illustrating the types of studies DigiFI supports (e.g., mobile banking adoption in Ghana, targeting humanitarian aid using ML/phone data).
  • Research Funding Slide (J-PAL): [11:35] Outlines the stages of research funding offered by DigiFI: Proposal Development, Pilots & Monitoring Systems, Randomized Evaluation, Policy Outreach, with associated funding caps. Specifies eligibility criteria (J-PAL Affiliates, invited researchers, PhD students, African Scholars Program).
  • GDN Introduction Slide: [14:15] Provides background on GDN’s founding, mission (strengthen capacity via research in social sciences), funding sources (ODA, MDBs, Philanthropies), and scope (supporting >4000 researchers in >150 countries).
  • GDN DPI Pilot Slide: [16:57] Details the aims (research framework, network building), methodology (local researchers, secondary data, quasi-experimental designs), and country focus (Bangladesh, Benin, Ethiopia) of the Co-Develop funded pilot.
  • World Bank Research Capabilities Slide: [37:03] (Appears briefly, seems related to Peter K.’s points on MEL) Shows components like MEL Strategy, System Data, User Satisfaction, Geospatial Mapping, Implementation Studies (Pilot Evaluation, Gender Gap Study, Exit Survey, Inclusion Study, Time and Motion).
  • ISER Introduction Slide: [29:29] Shows ISER logo, Uganda focus, and the cover of their report “Chased Away and Left to Die” (in partnership with NYU CHR&GJ), highlighting their community-focused research approach. Includes a QR code.
  • ISER Photo Essay Slide: [33:34] Displays a photo essay image with the text “YOU DO NOT QUALIFY!”, illustrating the lived experience of exclusion from social protection due to ID issues for older persons.
  • Dalberg Inclusion Tracking Slides: [37:21], [39:00], [39:41], [42:25] These slides outline the rationale (gap in inclusion measurement), objectives (establish standard toolkit, build evidence), methodology (tracking DPI access, service use, demographics via mixed-methods like surveys and interviews), and the key role of CSOs in ensuring local relevance.
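To make the Dalberg-style inclusion-tracking idea concrete, here is a minimal, purely illustrative sketch (not from the session) of disaggregating ID coverage by demographic group from survey records and computing each group’s gap relative to the overall rate. The field names (`group`, `has_id`) and all data are hypothetical.

```python
# Illustrative sketch of disaggregated inclusion tracking: compute ID coverage
# per demographic group and each group's gap vs. the overall coverage rate.
# Field names and survey data are hypothetical, for demonstration only.
from collections import defaultdict

def coverage_by_group(records, group_key="group", flag_key="has_id"):
    """Return {group: share of records with the inclusion flag set}."""
    totals, covered = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[group_key]] += 1
        covered[r[group_key]] += int(r[flag_key])
    return {g: covered[g] / totals[g] for g in totals}

# Hypothetical micro-survey: does the respondent hold a foundational ID?
survey = [
    {"group": "women", "has_id": True},
    {"group": "women", "has_id": False},
    {"group": "women", "has_id": False},
    {"group": "men", "has_id": True},
    {"group": "men", "has_id": True},
    {"group": "refugees", "has_id": False},
]

rates = coverage_by_group(survey)
overall = sum(int(r["has_id"]) for r in survey) / len(survey)
# Positive gap means the group lags behind the population-wide coverage rate.
gaps = {g: overall - rate for g, rate in rates.items()}
```

In practice such a toolkit would also track use of DPI-enabled services and combine this quantitative view with qualitative follow-up (interviews, photo essays) to explain why the gaps exist.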

Key Consensus Points & Methodological Spectrum

  • Consensus Points:
    • There is a critical need for robust evidence to understand DPI’s true impact beyond deployment metrics. [00:48], [02:53]
    • Measuring inclusion and understanding the experiences of marginalized groups is paramount. [38:30], [22:13]
    • Different research methodologies are valuable and complementary, serving distinct needs and audiences. [01:35], [02:00], [03:01]
    • Collaboration among researchers, governments, implementers, and civil society is essential for effective and relevant evidence generation. [13:45], [35:48]
    • Research should be demand-driven and context-specific to maximize policy relevance and uptake. [15:07], [17:14]
  • Methodological Spectrum/Approaches Presented:
    • Descriptive / Policy Focused: Rapid analysis, policy reports, monitoring dashboards, process evaluations, time/cost savings studies (ISER, Dalberg, World Bank). [01:16], [27:31]
    • Quasi-Experimental: Utilizing secondary data and specific research designs to infer impact where RCTs aren’t feasible (GDN). [17:24]
    • Experimental / RCTs: Rigorous randomized controlled trials for causal impact attribution (J-PAL, Oxford). [05:00], [09:39], [11:51]
    • Qualitative Research: Understanding lived experiences, reasons for exclusion, and context through methods like interviews and photo essays (ISER, Dalberg). [31:01], [33:50], [41:00]
    • Longitudinal Studies: Tracking individuals/groups over time to measure changes and impact (Dalberg’s proposed methodology). [41:41]
    • Mixed Methods: Combining quantitative and qualitative approaches for a comprehensive understanding (Dalberg). [39:41]
    • Data Sources: Utilizing ID system (meta)data, administrative data, surveys (in-person, telephonic), geospatial data. [01:13], [17:24], [39:41], [37:03]
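As a toy illustration of the quasi-experimental end of this spectrum (my own sketch, not an example shown in the session), a difference-in-differences comparison estimates a rollout’s effect by contrasting the before/after change in a rollout region with the change in a comparison region. All numbers below are hypothetical.

```python
# Minimal difference-in-differences sketch: estimate the effect of a DPI
# rollout on an outcome (e.g., account ownership) from group means.
# Hypothetical data; real designs need parallel-trends checks and inference.

def mean(xs):
    return sum(xs) / len(xs)

def did_estimate(treat_pre, treat_post, control_pre, control_post):
    """(change in rollout region) minus (change in comparison region)."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# 0/1 per respondent: has a bank account?
rollout_pre = [0, 1, 0, 0, 1, 0]      # rollout region, before launch
rollout_post = [1, 1, 0, 1, 1, 1]     # rollout region, after launch
comparison_pre = [0, 1, 0, 1, 0, 0]   # comparison region, before
comparison_post = [0, 1, 1, 1, 0, 0]  # comparison region, after

effect = did_estimate(mean(rollout_pre), mean(rollout_post),
                      mean(comparison_pre), mean(comparison_post))
```

The estimate nets out the secular trend common to both regions, which is why quasi-experimental designs like this can infer impact from secondary data where an RCT is not feasible.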

Key Questions Addressed or Raised

  • Addressed:
    • Why is research on DPI necessary? (To demonstrate impact beyond assumptions, ensure accountability, inform policy). [00:01] - [00:49]
    • What types of research are relevant for DPI? (A spectrum from descriptive policy analysis to rigorous academic evaluations). [01:05] - [02:08]
    • How can different research approaches complement each other? (Meeting needs for both timely feedback and rigorous impact proof). [01:35] - [02:03]
    • What role do different organizations play in the DPI research ecosystem? (Convening, funding, conducting research, capacity building, policy engagement). [02:09] - [03:20] (Implicit throughout)
  • Raised:
    • How can we effectively measure the inclusion impact of DPI, especially for marginalized groups? [38:30]
    • What are the specific drivers and barriers to accessing DPI and DPI-enabled services? [39:41]
    • How can research findings be translated into timely and actionable policy insights, particularly during rollout phases? [08:31]
    • How can collaboration between diverse stakeholders (researchers, government, CSOs) be effectively fostered for DPI evidence generation? [07:03], [42:26]
    • What are the best ways to present research findings beyond traditional reports to influence policy (e.g., photo essays)? [33:50]

Stated or Implied Applications

  • Policy Design & Improvement: Using evidence to design better DPI systems and policies, and to make adjustments during implementation based on timely feedback. [08:31], [16:02], [35:26]
  • Accountability: Holding governments and implementers accountable for the impact (or lack thereof) of the DPI systems they deploy.

Key Points

  • Measuring the impact of digitization (like DPI) is crucial, as technological presence doesn't automatically equate to expected productivity or societal gains (Solow Paradox relevance). [00:16]
  • A spectrum of research approaches exists for DPI, ranging from descriptive policy analysis to rigorous academic evaluations (e.g., RCTs), each serving different, complementary needs. [01:11]
  • Research must be demand-driven and responsive, providing timely evidence for implementation adjustments and long-term, rigorous proof of impact. [01:35]
  • Collaboration between local and international researchers, governments, implementers, and CSOs is vital for generating relevant and impactful evidence. [13:45], [35:48]
  • Measuring inclusion is a key challenge; methodologies are being developed to track DPI access, usage of enabled services, and demographics, focusing on marginalized groups. [38:30], [39:32]
  • The World Bank utilizes a Monitoring, Evaluation, and Learning (MEL) framework, tracking core indicators and conducting various evaluation types (process, impact) globally and at country level. [25:36]
  • Initiatives like Oxford's Fayda.Lab and J-PAL's DigiFI Africa facilitate research by connecting researchers with governments/implementers and providing funding/support. [05:15], [09:43]
  • Community-level evidence and lived experiences (e.g., via photo essays) are crucial for understanding the real-world impact of DPI beyond quantitative data. [31:01], [33:34]