Piloting plan

From Responsible Data Wiki

Below are rough notes of a group conversation. We are working on cleaning this up so that we can share some clear thoughts and ideas that came from this session.


What are we talking about? Pilots, or put another way, closed-loop iteration. Sometimes "pilots" are where projects go to die. What is the difference between feedback and a pilot? A pilot includes the audience for which the resource is created.

What are the criteria for needing more iteration?

  • a variety of users using and testing it together would be very useful and would build community
  • who is the audience? a niche community or the general public?

Find where there is energy: groups that are ready to go and want help. Have a contract (a "learning contract") with them: here are the tools; give us input back on how they were used, how they worked, etc. Explain the context and the reason for moderation.

"Learning Contract" - teacher and student agrees to commitments - wha tdo they want, objectives, and what will they give back?

  • closed-loop
  • time-bound
  • set roles - owner, user, support
  • share back versions (recognition)
  • set goals

Incentive structure - would it make sense to identify support organizations and involve them in the process of piloting/testing these resources? These support orgs are invested in the general resource and want to see how it can be customized and used.

Minimum requirements for the pilot:

Who are the ideal people to pilot with?

What are some of the channels and mechanisms that a group like this could use for the pilot process? For each Buda output, we need a description, an audience, and a sign-up list (a commitment to pilot). Here are some steps to think about. How can there be support around these piloting processes, and for those who are trying to use the resource? From this event, the process is something like: participants here agree to take the resources and contribute a first test and iteration.

How do you know if you are ready for a "pilot"?

  • there is a minimal viable product
  • you have a point person ready to take the baton, to manage iterations
  • you have a user in mind

The pilot pad: a relay

Thing (What it is) --> Resource Point Person --> Facilitator (not required) --> User

And the arrows go the other way, too.

What is the "learning contract"

  • how did you use it?
  • how did it go, and why?

Each output needs:

  • what it is
  • how it should be piloted and what kind of input you need
  • who is the support group

Example: atomized digital security for orgs - each iteration (phone calls, etc.) produces different outcomes; we're not sure how to bring it all together.

Can we debug the piloting process? Pilot the pilot?


Other thoughts: can we define a process that everyone follows for their resources?

development --> draft exists --> iteration (get input and feedback) --> test it out / pilot --> iteration / version-control -->