A Different Trial Process for Recruitment

This sounds like a great idea! What I am worried about, though, is whether we can reasonably stress the same factors we do right now. For instance, this gives us an idea of how the developer would work in relative isolation while following a very different process than we do regularly. However, it might not give us as good an idea of:

  • how well this person does as part of the larger team
  • how well they can handle our sprint process
  • how they handle DevOps work
  • how good they are at code reviews
  • how good they are at smaller tasks

I feel like we might still benefit from a shorter trial period of 1-2 sprints after candidates pass this step so we know for sure.

1 Like

@kshitij I know, but there is no perfect process, and I suspect that something like this will be “good enough” for an evaluation. Plus, it’s unlikely that someone would demonstrate strong communication on Mattermost, ask good questions about a task, implement it well, work well with the reviewer, get it done before the deadline, and then turn out to be bad at code reviews or terrible at smaller tasks, or something like that. And if there is room for improvement, it’s likely that they’ll be able to improve with mentoring, as long as they’re strong on all the fundamentals like communication, time management, and technical skills, which this trial project should test.

2 Likes

Such a great initiative! Thanks!

The best way to do that is to actually see them work.

Agreed, this is the best way to evaluate someone.

But I need to challenge the quote to some extent, because I don’t think we should do this on the strength of some of the arguments it makes. In particular:

Hire them for a miniproject, even if it’s for just twenty or forty hours. You’ll see how they make decisions. You’ll see if you get along. You’ll see what kind of questions they ask. You’ll get to judge them by their actions instead of just their words.

Generally speaking, yes. But I want to highlight that this is a generalization, and not all mini-projects will deliver on the promises above. In practice, this depends heavily on what the mini-project is, and it will also vary greatly from candidate to candidate. In other words, not all mini-projects can surface the data points we want.

For example, suppose the mini-project is about adding a feature to the Open edX platform, and the candidate has extensive experience with Python/Django. In that case, there is a chance we won’t see how the person deals with ambiguity, finds answers, asks questions and communicates, learns new skills, etc. On the other hand, we could evaluate the same candidate on those criteria if they were working on a project outside of their comfort zone.

You can even make up a fake project.

From what I understand of fake projects, this is even worse. Mock projects are designed from an idealized, abstract notion of what the ideal candidate should do. In the long run, these fake projects become unrealistic scenarios that test unrealistic candidates. In practice, I think the data points we would get from fake projects are the same as if we simply hammered the candidates with coding problems; that is, we would be measuring how well they prepared themselves for the interview. That is a valid data point, but I am not sure it is the one we are looking for.

Finally, the idea of comparing the knowledge-work interview process with the car industry’s interview process is flawed, because the level of ambiguity and the skills required are different. For example, I would like to know how they would apply the same strategy to hiring car engineers.

With all that said, I am not against trying this. I think it is an excellent initiative because of the drawbacks of the trial period that Braden and others already mentioned. But let us compensate for the risk it brings, in particular:

  1. A more robust interview process, one that doesn’t rely on the benefit of the doubt, to compensate for losing the safety net of the trial period. And,
  2. A constant feedback loop: whenever something goes south in the mini-project or afterwards, we revisit (1) to find what could have caught those issues.

TL;DR:

I am in favor of trying this. But I don’t entirely agree with the quote, mainly because we will lose the safety net of the trial period: a mini-project offers a lower confidence level than the trial period in terms of the data points it provides. It is also hard to find great mini-projects from the real world, and even harder to design fake ones. But if we compensate for the risks, I think it will significantly improve our hiring strategy.

1 Like

Yes, I didn’t mention this in my original post but I definitely think the initial interview screenings would have to be a bit harder to pass, before people get to the trial project. But at the same time we have to be careful because this tends to select for people who are good at doing interviews, which is not a skill that will come up in the job at all.

Trial period or not, if someone is really not working out, and not listening to feedback, we can stop working with them.

BTW, previous jobs I’ve worked at didn’t have trial periods, but they did have a “probationary period” when you start. In my home jurisdiction, employees can be fired “without cause” (basically for any reason) and with no notice (or further pay) at any time in their first three months. OpenCraft’s contract is actually currently much more generous than this and gives a significant notice period from day one. If we proceed with this, I would be in favor of such a probationary period model (you can be terminated with no notice in the first X month[s]), which provides us more of a safety net.

Well, as mentioned in this thread, we used to do exactly that. And that’s also why I suggest that we have a good project lined up before even posting the job ad.

1 Like

The only issue I see with this is that offboarding might become too painful. Currently, newcomers don’t get access to ansible-secrets, configuration-secure, and other hard-to-rotate credentials, which makes offboarding easier.

We can either limit access during the probation period (which might severely limit the work available, especially in DevOps-heavy sprints) or come up with managed credentials using something like Vault + Boundary (I’m talking in the long run here - not trying to push for more internal projects now :wink:).
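To make the “managed credentials” idea more concrete, here is a minimal sketch using Vault’s Python client (hvac). The server URL, secrets path, and policy name are hypothetical placeholders rather than our actual setup, and Boundary would play a similar role for server access; the point is just that a trial member’s credentials can expire on their own instead of requiring rotation at offboarding:

```python
# Minimal sketch: short-lived, scoped Vault credentials for a trial member.
# Assumptions: hvac installed, a Vault server at vault.example.com, an admin
# token, and a hypothetical "trial-projects" KV v2 path - placeholders only.
import hvac

client = hvac.Client(url="https://vault.example.com:8200", token="ADMIN_TOKEN")

# Policy restricting trial members to a dedicated, low-sensitivity path,
# keeping ansible-secrets / configuration-secure out of reach.
trial_policy = """
path "secret/data/trial-projects/*" {
  capabilities = ["read", "list"]
}
"""
client.sys.create_or_update_policy(name="trial-member", policy=trial_policy)

# Issue a short-lived, non-renewable token for the candidate; it expires by
# itself at the end of the trial, so offboarding needs no credential rotation.
token = client.auth.token.create(
    policies=["trial-member"],
    ttl="336h",  # roughly two weeks, i.e. about one sprint
    renewable=False,
)
print(token["auth"]["client_token"])
```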

:+1: for having well-specified projects (but not so much so that the uncertainty aspect vanishes).

To better evaluate how candidates will match with us, we can structure the test project like a mini-sprint: planning and requirements discovery, execution and implementation, and delivery - while evaluating time logging, communication, etc.

How about picking from this list? I think every single one of them would fit the bill, and Natalia would be very happy with us. I asked her at a meeting today, and she has already gotten the go-ahead for us to use core-contributor time on them. I think there’s enough there to keep our CCs busy and for us to hire 10 people by August 31st, thus solving all the company’s problems. ;)

1 Like

+1 on not picking fake projects – imho the best would be a project for a client, but one for which we have at least 1 sprint of margin, to allow us to redo the work ourselves if it doesn’t work out. That would be the closest to real conditions. If we can’t find one, then contributions to Open edX would definitely be much better for judging candidates, and also more useful – especially now that we have core committers on the team who can review in a timely manner.

Btw for the contract signature, @gabriel is covering for my part of the recruitment process – so remember to include him if you test this before I come back (which would definitely be good!).

Two things about the contract part:

  • Before getting a candidate to work on a project, they should sign the contract
  • A clear timebox should be provided for the task/project before, or at the same time as, assigning the work – and it should be stated explicitly that this is the maximum number of hours we would compensate for the task, that they should stop working once the timebox is reached, and push what they have at that point. (Feel free to extend it if it looks worth it then, but also be clear about what the new timebox is.)

Also, another related idea we haven’t moved forward with – could it be worth sending a list of bounties to the candidates who look promising, but don’t have any experience contributing to open source? That could increase our pool of potential candidates.

3 Likes

The best in terms of evaluating the developer, yes. But it is a high risk, and thus a heavy responsibility, for an epic owner. I had to do this with SE-3741 and @jvdm, and it turned out great because JV is awesome, but if he hadn’t been, I’d have been up the creek.

This is to say: it must be up to epic owners to take on this responsibility or not. If nobody wants the risk, we can, and should, fall back to community contributions.

1 Like

@adolfo If there is work whose deadline is more than one sprint ahead, there should be zero risk for the epic manager? If it doesn’t work out, the PR can be discarded, the time counted as purely recruitment time, and the task rescheduled for the following sprint?

Ok, I see what you’re saying. But as an epic owner that will likely have to do this many times, allow me to be explicit about how I’d be comfortable doing it:

  • The trial member is not counted into any capacity calculations. (I didn’t think they would, but we should probably be explicit.)
  • The task in question is due at least one sprint ahead of the one during which the trial member will be executing it. (Again, to be explicit.)
  • The task must have been scoped and estimated to fit into one sprint, and the trial member must have one full sprint to do it. (To avoid giving tasks in the middle of the sprint.)
  • If the task spills over without a good reason, the trial member fails the trial and the reviewer automatically becomes the assignee for the following sprint. By the second Thursday of the first sprint, it should be clear what will happen, and the reviewer should plan ahead accordingly. (This is to avoid making the epic owner scramble to find a replacement at the last minute.)
  • In case of failure, the time spent by the trial member is not billed to the client. (As you suggest above.)
3 Likes

:+1: ok to try something new.

A few consequences:

  • this creates a new division: where we now have 2 types of developers (newcomer vs. core team), we’d have 3. So I guess the intention is to remove the newcomer level
  • people at the new level (working on an external project) will be less onboarded than the current newcomers: less access to tools (and to servers), and less knowledge of our handbook and processes. They may not see the real core-team-member experience (including picking tasks, dealing with budget issues, estimating tasks, deploying servers, discussing processes, planning a full sprint, dealing with spill-overs, …). They may see a lighter (and easier) version of OpenCraft. If they pass the test and like it, they may still not like the real version
  • time logging requires access to JIRA. Granting access to JIRA gives access to a lot of private information (internal infrastructure, clients, …)

And questions:

  • after successfully finishing the small project, do they become newcomers for a short time? (onboarded to most systems, but not having core team member responsibilities yet)

My idea would be to replace the screening review process (i.e. the first two weeks of a newcomer) with the process suggested by Braden. After they pass it, they become newcomers. After 6 more weeks (i.e. already 2 months in OpenCraft) there’s a review and they become core team.

The difficult part seems to be finding suitable tasks, with the right budget, access level, timing, difficulty level, …

Yes, but that’s always a risk with any new job. They can read the handbook (and maybe watch some videos?) to get a good idea of how we work, and the trial project will really help too. But if they pass, join the team, and find they don’t like it, they can always quit, as long as they give sufficient notice. The trial period doesn’t really change their options in that regard, just how much onboarding/offboarding is needed.

That’s up for discussion, but I was thinking more of considering them essentially a full team member right away. Some of the responsibilities should be added in after a few weeks to make their onboarding less overwhelming and some accesses should be granted only as-needed for the first few weeks. What you are suggesting sounds very similar to the current process, with many of the same drawbacks.

1 Like

@jvdm @paulo @raul I see we have some new people starting, but I haven’t heard anything about using this new process. Are they all on the “normal” trial period? Would one of you like to try this out with me for the next hire?

Sorry for the late answer @braden, I completely ignored the forums to handle fires all sprint long.

@shimulch is the current recruitment manager for Bebop, so he might be in a better position to answer this.

@braden Newcomers that were selected for Bebop have already been onboarded.

If we want to try this, we need to change our interview script. We mention the 2-month trial period at the end of the interview, so it would be misleading to tell candidates about a 2-month trial and then give them a project instead. It might confuse them.

@braden Thank you for working on this - it would be a much more efficient way to go about this. We just need to make sure we test it, so +1 for trying it for the next hire.

@shimulch @paulo @jvdm Would one of you be willing to take care of a ticket to draft the PR of the changes this would make to our recruitment process, with @braden as reviewer? And then keep the PR open while testing it on the next recruit, merging it if we are happy with the results?

Thanks again for the work on this – it’s actions like these that will allow us to adjust to the market changes we are facing, and relieve the strain these changes have put on our production line and on the team.

3 Likes

As mentioned in Capacity & process issues - Update - #6 by antoviaque, I have created the Jira ticket for this.

@antoviaque I would be interested in conducting the experiment. I am assigning myself to the ticket. :slight_smile:

3 Likes

@team The MRs for simplifying the trial process are ready for team-wide review.

Handbook MR - [BB-4885] Process spring clean - Simplifying trial process (!355) · Merge requests · opencraft / documentation / public · GitLab
Private Docs MR - https://gitlab.com/opencraft/documentation/private/-/merge_requests/600

5 Likes

As per @giovannicimolin’s comment here, some members are actually not familiar with the recruitment process, so here is a summary of the new process. :slight_smile:

Stage | Activity
0 | We prepare a list of trial projects for a recruitment sprint.
1 | Recruitment Managers preselect candidates and invite them for interviews.
2 | Recruitment Managers conduct interviews with the preselected candidates.
3 | A Recruitment Manager evaluates the interview, and another Recruitment Manager reviews the evaluation.
4 | If the candidate is accepted, Admin Specialists schedule a 2nd interview with Xavier.
5 | If the candidate is accepted in the 2nd interview, the contract is signed. Admin Specialists create an onboarding and offboarding checklist.
6 | The Recruitment Manager picks a trial project from the list and assigns it to the candidate. A team member is assigned as the reviewer of that task. The candidate is given limited access to Jira and Mattermost.
7 | The whole cell reviews the trial project and decides whether to accept the candidate or not.
8 | If accepted, the candidate joins as a core team member. All access is granted (except the dev mailing list), and all kinds of tasks are available to the new member, except being a reviewer of trial projects.
9 | A developer review (full cell) is held after 2 months of onboarding. There is no extension. With this review, we check for red flags and try to maintain our quality of work. If there are no such issues, we add the member to the dev mailing list as well.
8 Likes