Comparing peer review processes

I recently found out about the Moodle Peer reviewing checklist when its results were added to issues that I had submitted for integration. There is also the Integration review documentation, but most of this is more about principles than a checklist that might guide a developer on what is expected of them.

Since the OU has a similar checklist, and there was some discussion at the recent developer meeting in Perth about review processes, I thought it would be a good idea to compare the two lists and see if we can learn anything from each other.

In the OU process, peer review and integration are usually done by the same person, a lead developer. Though peer review is sometimes done by any developer, integration is always done by a lead. We expect the handover checklist to be completed by the developer and checked by the reviewer, so the list acts as an aide-mémoire for developers that they have done everything they need to. Our checklist is a series of statements with sample answers, e.g.


Sample answers:

  • Y – Tested.
  • N/A – no course-related data in database or filesystem.
  • N/A – data is only stored in existing backed-up locations.
  • N/A – data does not need to be backed up.

Our system of sample answers allows the integrator a little more flexibility to accept things which are not perfect, using their judgement on how important compliance is in each situation.

There are a number of checks which only appear in the OU checklist:

  • support for groups needed
  • backup/restore
  • basic accessibility check
  • cross browser support
  • consideration of mobile devices
  • install & upgrade checked

I wonder where in HQ’s processes these things are being picked up?  Presumably by the integrators, though that isn’t stated clearly anywhere. Should it be? Does it make sense for extra checks to take place at integration rather than at peer review? Personally I think not.

Some of the steps in the HQ checklist are covered by our single code checker entry. I wonder if it would benefit the entire developer community if the code checker were committed into core?

Perhaps of more interest to me, the following are only in the HQ checklist:

  • provision of testing instructions
  • provision of new unit tests
  • sanity check that similar problems throughout the code have been fixed
  • git commits
  • security – sesskeys, login
  • cross db engines
  • AMOS commands when moving/copying/deleting lang strings
  • renderers, inline style, buffered output, use of appropriate css files
  • _GET, _POST etc not used, moodleisms such as correct api calls used
  • deprecated functions check, deprecation process
  • PHP docblocks present

We’ve talked about testing instructions before at the OU, and generally think it’s a good idea. That said, for new features, where our handover checklist is most used, this should be in the specification document.

The OU does not require new unit tests; we only award brownie points if they are present. That’s a reflection of how busy we often are – which is a poor excuse for not having tests, really!

Git commits are the big difference if you work on both core and OU Moodle. We flatten work-in-progress branches into a single commit on integration. The HQ way seems more time-consuming and complex to me, and if I haven’t got my head round it yet, will some of our less able developers ever manage it? I’m not clear what the benefit of all the extra work is, either.
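For readers unfamiliar with the OU approach, the flattening step can be sketched with plain git commands. This is only an illustration – the repository, branch name and commit message (including the made-up MDL-XXXXX issue number) are invented for the example:

```shell
# Sketch of the OU-style flatten: squash a work-in-progress branch
# into a single commit on the integration branch.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.com
git config user.name Dev

git commit -q --allow-empty -m "base"
git checkout -q -b wip
echo one > f.txt; git add f.txt; git commit -q -m "WIP: first try"
echo two > f.txt; git commit -q -am "WIP: fix typo"

# Return to the integration branch and flatten the WIP history
# into one commit with a clean message.
git checkout -q -
git merge --squash wip >/dev/null
git commit -q -m "MDL-XXXXX improve the frog widget"

git log --oneline | wc -l   # two commits: base + the flattened change
```

HQ, by contrast, keeps a curated commit history on the branch itself (the “history which tells a story” that Dan describes in the comments), rather than squashing it away.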

I think adding a security check to our list would be a good idea. It is easy for our developers to assume that the entire system is login-protected, but when the code is also used on OpenLearn, this is not always the case.

Similarly, I think we could add some checks about renderers and CSS styling, and also about not using $_ variables or deprecated functions when proper Moodle APIs are available.
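As a sketch of what such a check might look like – a hypothetical script, not part of either checklist – raw superglobal access is easy to catch mechanically before the human review starts:

```shell
# Hypothetical pre-review check: flag raw PHP superglobals, which in
# Moodle code should normally go through required_param()/optional_param().
cat > sample.php <<'PHP'
<?php
$id = $_GET['id'];                            // flag: use required_param('id', PARAM_INT)
$page = optional_param('page', 0, PARAM_INT); // fine
PHP

# Print line numbers of any raw superglobal access.
grep -nE '\$_(GET|POST|REQUEST|COOKIE)' sample.php
```

A real version would run over the diff rather than whole files (the limitation Dan notes about code checker below applies here too), but even this crude form would catch the most common slip.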

There are a couple of entries that I think are less applicable to us unless we are offering our modules as contrib – particularly cross-DB engine support, AMOS support and the deprecation process.

Changes to the OU checklist are agreed by consensus of lead developers. I’m hoping this will spark some discussion about things we could usefully add without overloading people.

  1. tjhunt said:

    Note that the Moodle HQ integrators have a tool which runs code checker and all the unit tests for all commits that are due to be integrated, so HQ is using codechecker. Also, codechecker is now officially maintained by HQ.

    However, I think their decision to leave it as a contrib plugin is reasonable.

  2. Dan Poltawski said:

    The peer review checklist is just a checklist, and I think it’s a tool which mustn’t be abused; there is a ‘human’ element too. Regarding the checks appearing only on the OU checklist, I think that you should propose adding some of those – things like backup and restore are often missed.

    Regarding code checker, I don’t think including it in core would be more beneficial than its current place in contrib. Notably, it doesn’t have the facility to check a diff rather than an entire area so for a lot of the work which goes on in Moodle (not creating brand new files) it’d not be useful.

    We do have code checker and phpdoc checker running diff comparisons on the Moodle continuous integration server, the plan is that we will be automatically putting any patches through these (and other checks, such as phpunit) when submitted for integration.

    Regarding git commits, I don’t really think we have our guidance on this as clear as it could be, and there is much room to improve. But on “I’m not clear what the benefit of all the extra work is either?”: the extra work I’d assume is mostly to do with the process of getting 60 different patches integrated into 4 different branches each week. Really I’d like to see patches with a ‘history which tells a story’, not necessarily the raw commit history of the branch. The benefit is implicit documentation, concurrent work and attribution of credit. I am continually reviewing old history of Moodle, and preserving this history is an incredibly useful part of our documentation.

  3. Thanks for doing that comparison, Jenny. It was useful to read.

    Feel free to add elements to the Peer Review checklist on the Dev Docs wiki. I like the elements of groups, backup/restore, browser and mobile checks. I think accessibility is already covered (somewhat) and installation/upgrade is tested during integration testing.

