Tag Archives: complexity

Linking research to policy – my new project

I have recently joined an interesting project to establish a South Asia Urban Knowledge Hub (funded by the Asian Development Bank).  I will have the opportunity to work with research institutes in Nepal, India, Bangladesh and Sri Lanka (to start) on sustainable development issues in the urban sector.  My role as knowledge management specialist is to build the capacity of the researchers to undertake outcome-focused research for policy influencing.  I am tasked with creating a template to help the researchers develop policy influencing strategies, improving their technical writing skills for policy briefs, providing a 2-day training workshop, and acting as a mentor on an ongoing basis for two years.  For anyone who knows me, this sounds like a dream!

In some ways, it is a dream and in other ways, I have gone down a rabbit hole of jargon that is giving me nightmares.

As I consider the assessment I need to undertake as a first step, I cannot imagine asking people about their familiarity or confidence level in using tools like: problem trees, objective trees, force field analysis, stakeholder maps, SWOT, theory of change, the RAPID framework, alignment/interest/influence matrix, outcome mapping, advocacy strategy, communications strategy, etc.  Don’t you feel overwhelmed just reading the list of available and suggested tools? I do.

This is forcing me to really take a step back and figure out what I need to know from the participants beforehand in order to design a good training and what is the best way to get the information from them (this is before I even start doing the real work!).

While originally I thought a simple online survey would work, I have decided that this is not the best tool, given that many people may not use the same terms for the same analytical tasks.  My simple assessment could get wordy and complicated.  Moreover, if I leave the online survey open-ended, I won’t have the opportunity to really understand the responses provided without proper follow-up.  This leads me to the idea of conducting group interviews with each centre.  This way, I can probe into the questions I ask with the group from each centre and build on the conversation as it evolves. I believe I will gain a better understanding of the types of methods they have used (or not used) previously.

I have already had one introductory Skype meeting with each centre so this is also a nice way to build our relationship given I am going to be working with them for the next two years.

Sooooo…given I want to learn about what experience (or lack of experience) the centres have with undertaking outcome-oriented research for policy influencing, these are the questions I’m considering using to guide my interviews.  The results need to inform the design of a template for policy influencing as well as a 2-day training for the centres.  I would love your feedback as it’s still in draft form.  Or maybe you work in this area and have other bits of advice for me?  If so, thanks for sharing!

1.    Please share an example or two of research you undertook in the past specifically with the purpose of changing policy:

  • How was the research topic decided?
  • Did the research lead to a change? If yes, what were the key factors?
  • Did you document your assumptions for creating change? If yes, how?
  • Were other actors/partners involved? How so?
  • Did you have a strategy in place that you followed?
  • Did you use specific tools or techniques to understand the different actors/stakeholders involved, including those who would support or challenge your research?
  • How did you monitor the change that took place?
  • How do you know the change resulted from your influence, as opposed to other outside forces?
  • If not, have you used a strategy for other advocacy work? If yes, please describe what this looked like.
  • If not, how do you feel about embarking on this type of work in the future? What are you most excited about and most concerned about?


2.    Please describe how you typically disseminate the knowledge generated from research.

  • What methods do you use? Can you share an example strategy?
  • What has been most successful? What has been least successful?
  • How do you define your audiences?
  • Do you typically write different messages for different audiences?
  • Do you work with communication professionals?
  • What type of communications products do you think policy makers find most useful? For example:
    • Policy briefs
    • Opinion articles, News items
    • Media, Community radio
    • Working groups
    • High profile events
    • Public pressure

3.    To what extent are policies in your field evidence-informed?

  • What are some of the factors that determine whether, and to what extent, evidence informs or even influences policy decisions?
  • Is demand a necessary condition for the uptake of research?
  • Do you think well-conceived and compellingly packaged research findings stimulate the interest of policy makers?


Filed under Uncategorized

Monitoring, Evaluation, Holidays and Poetry

I recently received a holiday greeting from a colleague working in monitoring and evaluation (Kylie Hutchinson of Community Solutions). The poem she wrote is brilliant and I want to share it with you!

It Came From Using a Systems Lens
It came from using a systems lens,
Such insights so glorious – behold!
This program in its entirety,
Is really a part of a whole.
So often when we evaluate,
We think only outputs, outcomes,
And fail to see the complexity,
The system from which it comes.

Consider the interactions,
Within and between system parts,
They help or hinder the outcomes,
Ignoring them isn’t smart.
Look at the interrelationships,
Of policies, patterns, and norms,
These interact with the program,
And impact how it performs.

Three interconnected elements,
Can help show you where to begin,
They’re Boundaries and Relationships,
And Perspectives within.
A deeper look at these elements,
Will open up more leverage points,
If you are looking for system change,
These actions won’t disappoint.

But how to choose what to include?
What’s evaluated or not?
Your framing is quite critical,
Requiring some thought.
Be conscious of what is left out,
Think of the three factors above,
And how they impact your outcomes,
The influences thereof.

Why do we bother with systems?
When logic models are so nice?
Well life is not really linear,
They clearly can’t suffice.
We’re living in a complex world,
Simple answers rarely exist,
It’s only through complexity,
That we can learn what we missed.

And this is a great opportunity to wish friends, family and colleagues a very happy holiday and New Year.  I have many blog posts turning in my mind and will work on getting them posted in the near future.  Namaste from Kathmandu!


Filed under Uncategorized

Prepping for a Think Tank Presentation

I am preparing for the Canadian Evaluation Society Conference June 11 in Toronto, Canada.  I am presenting a paper I co-authored with folks from the International Institute for Sustainable Development (IISD) on Performance Improvement and Assessment of Collaboration: Starting points for networks and communities of practice.  I chose a Think Tank, rather than a standard presentation, because we are only at the start of our understanding…so why not use the knowledge in the room to help advance our collective thinking?


When researching the paper, one of the first challenges we came across was distinguishing between all the types of collaboration.  In general, it’s a tangle of terminology.

However, the first lesson to share from our research and consulting is that the fine distinctions between these terms are of limited value in determining how to improve performance and how to help organizers and participants account for the time and resources invested in the collaboration.  Rather, as a manager or evaluator, one should focus on key attributes that are critical to designing for and assessing performance.

I won’t share the whole paper here but in summary, the paper focuses specifically on collaborations of individuals seeking knowledge and support for purposeful individual or collective action (CoPs, knowledge networks, campaigns and so forth):

Performance improvement of these collaborations focuses on determining:

  • Whether there is sufficient social capital for participants to exchange information, learn from each other and work together;
  • Whether individual participants believe and can demonstrate that their knowledge and skills have benefitted from the time invested; and
  • Whether there has been progress in advancing solutions toward a shared challenge.

We suggest that four areas to explore in strengthening performance assessment and improvement of networks are:

  1. Focus and Extensiveness;
  2. Understanding of Structure and the Evolution of That Structure over Time;
  3. Social Capital; and
  4. Activities, Outcomes and the Concept of Value Creation.

We also suggest a few tools that might be relevant for assessing networks.  However, this is really my question for the group of evaluators, among a few others:

  • What tools are you using to assess networks of individuals collaborating?  
  • What are the strengths and weaknesses of the tools?
  • What new ideas do you have to strengthen this area of practice?  
  • What are the potential pitfalls that we as evaluators should be aware of?

What might be a good question or two for a Think Tank on this subject?  Do you have an idea to share?  I would love to hear your thoughts and welcome your advice as I prepare the session.  Once again the paper can be found here.

Thank you in advance!  I will be sure to share the outcome with you after June 11th!


Filed under knowledge management

Aha moments…thanks to the metaphor!

In the field of knowledge management, complexity and changing organizations, it’s easy to have conversations where one person is talking past another.  Have you ever stood there wondering what a colleague is trying to explain while they seem really sure of what they are saying?

One way to help clarify concepts is to use metaphors.  Yesterday on Twitter (via @NancyWhite) I saw this example that visually illustrates the difference between data, information, presentation and knowledge.  A picture is worth a thousand words in this case, particularly for people working with knowledge!

Another great metaphor for understanding tacit knowledge is the iceberg metaphor from Anecdote.com (blogged in 2007, fantastic description).  They visualize knowledge as above and below the waterline.  Most of the mass of an iceberg lies below.

Lastly, a simple way to explain complexity is provided on page 9 of the highly recommended book Getting to Maybe.  They use metaphors such as baking cakes, launching rockets and raising children.  Thanks to Gary Ockenden for sharing that one with me a few years ago.

Do you have metaphors you use to explain concepts related to knowledge or complexity?  Please share!


Filed under knowledge management, Uncategorized

Quick keys for understanding complexity

Dave Snowden’s most recent video provides a quick briefing on the Cynefin framework he developed.

Some key points to note:

  • Cynefin framework – data precedes the framework rather than the other way around
  • The framework helps one identify different systems – ideally you use the appropriate method to deal with different situations
  • The complexity domain involves: probe, sense, respond.  This results in emergent practice rather than something predefined.
  • Ideally, most situations are dealt with using complex or complicated methods.  This means lots of people and lots of diversity, and is noted as a pretty good strategy.

Snowden remarks that bureaucrats tend to work in the simple box, academics/researchers in the complicated, and politicians generally in complexity.  This is instructive to keep in mind when working with groups of people that come from these different sectors.  From my own work, I know it’s not easy convincing people to work within the complex domain.  Often this requires more facilitation, monitoring and adaptive capacity than the other domains.

I recommend this as a good introduction to Snowden’s Cynefin Framework for those starting down the complex path.


Filed under knowledge management, Uncategorized