Transparency Experiment - Analysis

Hi everyone

In August we started a small experiment to increase transparency in the Reps Program. First, we measured a baseline with a survey. At the beginning of October we posted a first mid-way analysis, and a few weeks ago we followed up with a second survey.

Now it’s time to publish the analysis and our conclusions.

First survey

Here are a few things we’ve heard in the first survey:

  • I’m not spending enough time to be properly involved
  • I don’t see any connection with Mozilla anymore. It is more or less only aligned with the Open Innovation Team which is more or less driving away from Firefox.
  • I feel like the program is not active anymore.
  • Complaints about Resource Reps not responding
  • I think I trust the Council enough to represent our community.
  • No participation of Reps in decisions and discussions on Discourse like - as example - at the time of Reps.Next

Second survey

Here are a few things we’ve heard in the second survey:

  • I love the Mission Driven Mozilla Reps program
  • I really can’t remember the last time I participated in the decisions.
  • The Reps program is kind of closed to a certain group of people, you will find the names that are recognised by the staff, I am not seeing any new names on the floor.

Metrics

When we started we had these metrics defined:

  • Number of Reps being engaged in discussions on Discourse (SUCCESS: 30 Reps)
  • Reps report feeling more involved in the program (SUCCESS: increase of 20% - baseline is 4.26/7 with 70 responses -> target: 5.11)
  • Reps post their own ideas and initiate discussions on Discourse (SUCCESS: 3 posts)

Here are the results after 3 months of running this experiment:

  • Number of Reps being engaged in discussions on Discourse - 23/30 - missed
  • Reps report feeling more involved in the program - based on 35 responses: 4.14/7 - missed
    • This is most probably within the margin of error
  • Reps post their own ideas and initiate discussions on Discourse - 1 post - missed

Conclusions

  • The metrics were intentionally set high
  • There was some uncertainty about the phrasing in the survey, which we deemed helpful, as not everyone defines involvement the same way
  • Given that we didn’t manage to run a second part of the experiment, it’s hard to say whether that would have increased the feeling of being involved.
  • Over the last 3 months there was no visible increase in how involved Reps feel in the program
  • Reps Peers and Council should keep posting openly on Discourse about things they are working on, to get early feedback from Reps and involve them in the process
  • Peers should keep experimenting with input from the community

On behalf of the Reps Peers
Michael


I think these 3 points are very important. Are there any plans or OKRs for them?