Digital Principles Monitoring, Evaluation, & Learning Part 4: Digital Principles-Focused Evaluation

This is the final installment of a 3-part series on the Digital Principles’ new suite of monitoring and evaluation resources.

Catch up on Part 2 and Part 3.

In 2021, DIAL introduced the Organizational Self-Assessment, which measures overall Digital Principles engagement through activities associated with promotion, adoption, and institutionalization. Earlier this week, DIAL introduced the Indicator Library, a collection of process indicators for activities and outputs associated with measuring adherence to specific Digital Principles.

Today, in the final installment of our 3-part blog series, we are introducing the last tool in our new suite of monitoring and evaluation resources: Digital Principles-Focused Evaluation (DP-FE), a prescriptive model of evaluation adapted from Principles-Focused Evaluation (P-FE), developed by Michael Quinn Patton.

Digital Principles-Focused Evaluation centers the Principles themselves as the evaluation’s focus and examines the following questions: 

1. Are the Principles clear, meaningful, and actionable?

2. Are the Principles actually being followed?

3. Are the Principles leading to desired results?

Just as theory-based evaluation assesses the merit of a theory of change by evaluating that theory as embodied in a program, P-FE evaluates the merit of principles by evaluating principle-driven initiatives. P-FE recognizes that many organizations and initiatives are reorienting their work to be guided by principles instead of rigid rules or policies when navigating complex real-world challenges. P-FE therefore provides feedback on how, and how well, that navigation process is working.

Furthermore, just as principles like the Digital Principles must be interpreted and applied according to the unique circumstances of organizations and the contexts in which they work, the application of P-FE and DP-FE is itself guided by the following principles:

Matching: Conduct principles-focused evaluations on principles-driven initiatives with principles-committed people. Principles are the focus of the evaluation.

Distinctions Matter: Distinguish types of principles (natural, moral, effectiveness) and distinguish principles from values, beliefs, lessons, rules, and proverbs.

Quality: Ensure that principles adhere to the GUIDE criteria (Guiding, Useful, Inspiring, Developmental, Evaluable).

Evaluation Rigor: Inquire into and evaluate the effectiveness of principles based on whether they were followed and what difference they made.

Utilization Focus: Focus on intended use by intended users of the principles from beginning to end.

Beyond Rhetoric: Support using principles comprehensively; use them or lose them; don't let them become a mere list; apply them across functions like staff development, strategic planning, and MEL work.

Interconnections: Interconnect principles. The eight concepts of P-FE are an interdependent, interconnected whole (not a pick-and-choose list).

Learning: Reflect on the strengths and weaknesses of the P-FE process and results to learn and improve; engage in principles-focused reflective practice.

Again, at the heart of Digital Principles-Focused Evaluation are three criteria and their corresponding evaluation questions:

1. Meaningfulness: To what extent and in what ways are the Principles for Digital Development meaningful to our organization or initiative?

2. Adherence: If meaningful, to what extent and in what ways are the Principles for Digital Development adhered to in practice at our organization or initiative?

3. Results: If adhered to, to what extent and in what ways are the Principles for Digital Development leading to desired results?   

Where the Organizational Self-Assessment is associated with principle meaningfulness and adherence generally, and the Indicator Library is associated with principle adherence specifically, DP-FE provides a more holistic view of the merit of the Digital Principles and principle-driven initiatives by investigating all three criteria. 
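To make the three criteria concrete, here is a minimal, purely illustrative sketch of how an evaluation team might tally rubric scores across the criteria. The principle names and the 1-4 scale are hypothetical assumptions for illustration; the actual rubrics are in the DP-FE guidance document.

```python
# Hypothetical sketch: averaging DP-FE rubric scores per criterion.
# The 1-4 scale and the principles rated below are illustrative only,
# not taken from the official guidance document.

CRITERIA = ("meaningfulness", "adherence", "results")

def summarize(scores: dict[str, dict[str, int]]) -> dict[str, float]:
    """Average each criterion across all rated principles (1 = low, 4 = high)."""
    return {
        criterion: round(sum(p[criterion] for p in scores.values()) / len(scores), 2)
        for criterion in CRITERIA
    }

ratings = {
    "Design with the User": {"meaningfulness": 4, "adherence": 3, "results": 3},
    "Use Open Standards": {"meaningfulness": 3, "adherence": 2, "results": 2},
}

print(summarize(ratings))  # prints {'meaningfulness': 3.5, 'adherence': 2.5, 'results': 2.5}
```

A summary like this could flag, for example, that principles are seen as meaningful but are not yet consistently adhered to, pointing evaluators to the second criterion's question.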

In concert, these criteria and evaluation questions give endorsers of the Digital Principles insights they can use to strengthen their digital development initiatives.

A final component of DP-FE is that evaluations should be designed and conducted to promote intended use by intended users, thus adhering to the Design with the User principle. Accordingly, organizations can use DP-FE in two ways: first, through findings use (delivering evaluation findings, and possibly recommendations, to decision-makers at the organization) and second, through process use (the ways in which being involved in an evaluation affects the individuals and organizations taking part).

With this in mind, and applied thoughtfully, Digital Principles-Focused Evaluation can provide insights on how and how well the Digital Principles aid organizations and initiatives in navigating the complexity of the digital development ecosystem.  

Access the new guidance document for greater detail on the methodology, example rubrics, survey protocols, and complementary resources, and, most importantly, for guidance on how to implement a Digital Principles-Focused Evaluation at your organization!

DIAL wants to see how the Digital Principles are coming to life and is seeking partners to pilot Digital Principles-Focused Evaluation. If interested, please contact Claudine Lim at [email protected].

Claudine Lim

Program Manager, the Principles for Digital Development

Claudine first joined the Digital Impact Alliance in October 2017, shortly after receiving a dual master's degree in international relations and public relations from the Maxwell School and S.I. Newhouse School at Syracuse University. After working as a Program Coordinator and Researcher for DIAL's Business Operations, she currently works with the Principles for Digital Development.

Scott Neilitz

Manager, Monitoring, Evaluation and Learning at DIAL

Scott believes that creative innovation and technology have the potential to improve the lives of people in low and middle-income countries. He also believes that through constant and iterative research and learning, we can improve programs and, ultimately, impact. Scott joined DIAL in 2018 as a Senior Monitoring and Evaluation Associate.

Zach Tilton

Doctoral Research Associate at Western Michigan University

Zach Tilton joined DIAL in 2018 as a Monitoring and Evaluation Fellow and currently works as an Evaluation Consultant with DIAL. He is a Doctoral Research Associate at the Interdisciplinary Doctoral Program in Evaluation at Western Michigan University specializing in peacebuilding evaluation, an Associate at Everyday Peace Indicators, and a member-at-large with the EvalYouth Global Management Group.