Digital Principles Maturity Matrix for Program Design and Proposal Evaluation

Type:
Tool

Executive Summary

The Digital Principles Maturity Matrix is an interactive tool for better aligning proposal evaluation with the Principles for Digital Development throughout all phases of the project lifecycle. By creating a living, dynamic rubric of detailed statements tied to specific activities, donors can better realize the full potential of the Digital Principles and provide clear guidance to potential partners on how priorities are identified and assessed. The release of this Matrix is just the beginning: we will continue to refine and adapt questions, release and incorporate supplementary resources, and optimize visualizations to support decision making. However, the need to align donors and practitioners around the best practices of the Digital Principles through a shared, measurable standard will persist across all technology-enabled programs in digital development.

The current version is Maturity Matrix – v1.0, published under CC BY-SA v4.0.

Introduction

The Digital Principles Maturity Matrix is a dynamic spreadsheet that donors and practitioners use to score proposals against a series of statements measuring alignment with the Digital Principles. Statements are grouped by Digital Principle, project lifecycle phase, and specific activity (e.g., “Design With the User,” “Analyze & Plan,” and “User Advisory Group (UAG)”) to create a consistent approach across sectors, technologies, and context-specific use cases. The scoring guide is presented in table format on a 0-to-3 scale, showing how current best practices can be applied in an increasingly comprehensive manner to match the specific activities relevant to the donor request. The output is a summary table, along with a radar chart visualizing the donor request, the proposal response, and the differences between the two, as well as a clear path to better alignment.
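As a rough illustration of the gap calculation described above, the sketch below averages donor priorities and practitioner self-scores per Principle and computes the difference between them. The data structure and function name are hypothetical; the actual tool implements this logic as spreadsheet formulas feeding the summary table and radar chart.

```python
from collections import defaultdict

# Each statement is scored 0-3 against a Digital Principle.
# (Hypothetical sample data; the real Matrix holds the statements.)
statements = [
    # (principle, donor_priority, practitioner_score)
    ("Design With the User", 3, 2),
    ("Design With the User", 3, 3),
    ("Understand the Existing Ecosystem", 2, 1),
    ("Be Data Driven", 0, 2),  # 0 = "N/A": not factored into scoring
]

def principle_gaps(rows):
    """Average donor priority, practitioner score, and gap per Principle."""
    totals = defaultdict(lambda: [0, 0, 0])  # donor sum, practitioner sum, count
    for principle, donor, practitioner in rows:
        if donor == 0:  # priorities marked N/A are excluded
            continue
        t = totals[principle]
        t[0] += donor
        t[1] += practitioner
        t[2] += 1
    return {
        p: {"donor": d / n, "practitioner": s / n, "gap": (d - s) / n}
        for p, (d, s, n) in totals.items()
    }

gaps = principle_gaps(statements)
```

Plotting each Principle's `donor` and `practitioner` averages on the same radar chart makes the per-Principle gaps visible at a glance, which is the conversation starter the Matrix is designed to produce.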

The goal of the Maturity Matrix is not to retroactively score proposals, but rather to enable a conversation between donors and practitioners to better align processes with the promise of the Digital Principles. Only then can a shared standard embraced by donors serve as a change agent — provided that such a standard is as flexible as the Digital Principles themselves. To get there, it is first necessary to clearly define what the Matrix is, and what it is not.


These considerations of what the Matrix is, and what it is not, guided the decisions we made throughout its development, including whom we initially invited to review the tool and how we balanced competing priorities.

Designing a flexible, self-driven guide that anyone can complete may seem counterintuitive for a tool intended to be used by a small subset of donor staff: those who review proposals that include digital components. However, the wide range of applications of the Principles across sectors, technologies, languages, and internal processes necessitates such an approach if the tool is to be adapted to relevant contexts and be of use to a single reviewer. For these reasons, the tool was created in a widely used format (a spreadsheet) that prioritizes simplicity and usability, so that nearly anyone can pick it up and reach approximately the same understanding of how it applies to a proposal.

The length and breadth of the Maturity Matrix are fully intentional. Technology-enabled projects are, by their nature, diverse and continually evolving. Any tool that aims to generate meaningful insights into what could be improved in a given project must itself be sufficiently detailed. In this way, the tool can help identify next steps and guide a reviewer toward them without including extraneous or incorrect information. To achieve this, the two main statement criteria we propose pertain to matching donor requests and to aligning practitioner solutions with current best practices. For example, a donor could clearly state that one Principle or project lifecycle phase is critical for proposal consideration, while another is only of passing importance. Based on that information, practitioners can make the best possible use of the limited space in a proposal to highlight their strengths and alignment with donor priorities, as well as leverage insights into which activities and processes they should consider to improve that alignment.

Proposals and proposal-review processes are not standardized across donors or implementers, which makes integration into existing institutional processes problematic. Instead, the Maturity Matrix seeks to connect all parties with the latest best practices and resources. If the community feels that an activity or resource is important for alignment with the Principles, there should be a matching series of statements to assess whether a proposal is aware of its existence, has integrated it into the appropriate phase, or has fully embraced it to the extent that the donor and community value. In this way, increasingly comprehensive scoring, through regular reference, can incentivize a learning journey from simple acknowledgement to attainable optimization.

While the statements in the Maturity Matrix can be incorporated into a static, paper-based assessment, the creation of a digital-first, interactive tool enables instant insights for decision making and shared prioritization. However, this flexibility also creates challenges for usability. Initial testing found the number of statements required for a 1-to-5 assessment (as is more common for maturity models) to be prohibitive, especially when analyzing all nine Principles across all four project lifecycle phases. While there are no paper constraints on what can be included, attention and time are finite resources. As a result, the Matrix was optimized for usability (a 0-to-3 scale) and simplicity (fewer than 100 activities in total), even though there was no cost to including more.

Lastly, the Matrix can only achieve its goals if it is used as a tool for collaborative engagement rather than as a means of provoking competition or rendering judgment. This means assessing proposals against donor expectations and the current state of best practice, rather than against one another. Ultimately, this formative assessment approach is intended to drive a cycle of continuous learning by which the community's understanding collectively advances, rather than a summative assessment by which any single proposal is judged and returned.

Suggestions on Using the Maturity Matrix

1: Get the Latest Version

● Download the latest version at the top of this page

2: (Donor Only) Input Priorities

● All cells are protected except for columns A and B on the “Matrix” tab.

● Enter priorities, or mark a row “N/A” so it will not be factored into scoring.

● (Optional) Protect column A before sharing the file publicly with others

3: (Practitioner Only) Assess Self

● Enter your own alignment assessments according to donor priorities

● Look for the boxes filled in blue to see the target goal for each statement

4: Review Dashboard

● How far apart are donor and practitioner on each Principle?

5: Review Chart

● Are there areas where the practitioner has demonstrated process maturity that should be considered?

6: Create Notes, Next Steps

● The rightmost column is where users can identify resources and make notes to share

● Add links, forum posts, or pull directly from forum.digitalprinciples.org to crowdsource as needed

7: Update the Matrix!

● The tool is published under Creative Commons (CC BY-SA 4.0). Check the license for specifics on how you can update and improve this tool.

Contributions and Insights

The development of this tool, as well as the core considerations that shaped it, owes entirely to the insights, interviews, and feedback of key experts and authors. These include:

1. Mitch Hulse, TechChange (Consultant)

2. Linda Raftree

3. Adele Waugaman, USAID

4. Laura Walker MacDonald, DIAL

5. Jaclyn Carlson, USAID

6. Steve Ollis, JSI

7. Wayan Vota, IntraHealth

8. William Lester, NPOKI

9. M K Cope, NPOKI

10. Charlotte Schumann, GIZ

11. Mariette McCampbell, GIZ

12. James Ghaffari, Plan UK

13. Jayne Crow, Plan UK

14. Martine Koopman, SMART Resultancy

15. Craig Jolley, USAID

16. Magda Berhe Johnson, SPIDER

17. Liz Nerad, Palladium

18. Jesus Melendez Vicente, Mercy Corps

19. Michael Dawson, Mercy Corps

20. Josh Mandell, IBM

21. Stuardo Herrera, Palladium

22. Channé Suy Lan, InSTEDD

23. Nora Lindström, Plan International