What’s an alternative to the annual review?

Context: Wholesale Distribution – 380 employees

Jennifer Laurie, CPO at StartOut:

I love a continuous feedback model with regular growth conversations.

Continuous feedback should still have some structure to it. You can either set a cadence of feedback conversations (weekly, biweekly, monthly, etc.) or create a feedback process that individuals or managers can initiate on an occasional, as-needed basis or for a specific project. The process can also incorporate peer or 360 feedback. Ongoing training for the full team on giving and receiving feedback can also support this model.

For growth conversations, have folks schedule dedicated sessions, separate from performance feedback, focused solely on career development, skills enhancement, and future goals.

One thing to keep in mind with switching from an annual review to this model is whether and how you tie performance to compensation. If you do, you’ll either want to use measurable criteria like KPIs or keep some structured review – whether it’s on an annual basis, or more frequently but also more lightweight.

Stephanie Lemek, Founder & CEO at The Wounded Workforce:

I really love the work being done with Organizational Network Analysis in lieu of the traditional performance review. David Murray and the team at Confirm are doing amazing work in this space!

Ilona Jurkiewicz, President, Americas at Cappfinity:

The alternative to an “annual” anything is embedding a mindset shift to “daily.” I sometimes question why we ever defaulted to annual performance reviews, and I suspect it’s purely driven by the impact of financial reporting and external reporting deadlines that create an “annual” culture in a company. Once we break that mental connection and essentially disassociate, it becomes easier to first start building a mindset of performance as an everyday conversation, and then change the systems and habits that enable that.

A few quick implementation ideas:

  • Bi-monthly quick review of relevance of goals, with adjustments made on the spot
  • Rewards linked to delivery of individual goals when they are completed vs. an annualized cycle
  • Business cadence that sets/reviews goals on a non-annual cycle, particularly if the stage of growth of the business necessitates it
  • Feedback, Focus, and Future Chat: Focused conversation that creates alignment between how delivery is happening, what you are trying to achieve, and where you are heading (goal + career focus)
  • Employees submitting review requests at the end of each goal, with goal/project based reviews as the norm

We are in the thick of annual reviews and are preparing for calibration sessions next week! What are some best practices, or better yet, ground rules, HR can communicate out to ensure these sessions are efficient, productive, and equitable? How do you recommend structuring this meeting? Thoughts on a timer?

Context: We’ve organized the sessions by department and IC vs. People Manager. For example, 12 IC Engineers are being calibrated by 5 Engineering managers in a 2-hr meeting.

Sondra Norris, OD/OE Consulting:

Not sure if this is the first time, but if it is, I would say first that 2 hours won’t be enough. My experience with Engineering teams specifically:

  • They will want to “spiritedly discuss” the way this is set up. Who said this process is the best way? They’ll have many suggestions about alternative ways to do this process. That will take the first 1 hour and 59 minutes.
  • They will be unclear on what “behavioral evidence” is.
  • They will be unclear on what “standards of performance” are.
  • They will want to evaluate quality of code, and speed and accuracy in producing it.
  • It will be difficult to introduce and agree on concepts like collaboration, innovation, teamwork, creativity, initiative: what they are, how they should happen in any particular role, why they should happen, what they look like, and what increasing levels of sophistication around these skills should be expected.
  • They will be highly invested in how they can pay, reward, and promote their engineers the most.
  • They will compare apples to oranges.
  • They won’t read anything before they show up.
  • Some will come with 6″ binders to defend their positions, others will come with absolutely no preparation.

The first one will be rough, largely because it’s difficult to defend the efficacy of the process and because so much is typically so undefined.

And.

All of that is good news, because they care one heck of a lot about taking care of their people – but in the thick of it, it won’t feel like that.

How I’ve seen this be successful (with kind of the lowest-common-denominator team in mind):

  • Expect it to take multiple rounds to be effective as people start to buy into the concept and see it work.
  • Get their input as much as possible. What’s expected at each level? What do those expectations look like in action? Why are they important – what results are achieved (or not) because of those expectations?
  • Be ready to do iterations on job descriptions, job level differentiation, performance expectations.
  • Have as clear a process as possible. What are the rating metrics? Use numbers:
      ◦ 1 = little to no evidence; developing.
      ◦ 2 = some evidence; still needs guidance and clear instructions.
      ◦ 3 = clearly moving to independence; consistent evidence on target expectations; can give instructions and check back.
      ◦ 4 = fully independent; thinks ahead; proactively knows how to adapt behavior; consistent evidence on stretch expectations; need to think about what’s next.
    [Note this is an even-numbered scale to prevent fence-sitting, and it demands behavioral evidence.] KEEP THIS RATING SCALE VISIBLE AT ALL TIMES.
  • Use a spreadsheet with names down the rows and performance expectations across the columns. Have a space for ratings and for notes. Display this on a screen so that everyone can see all the data going in (see the example sketch after this list).
  • Have all data available: self-reviews, past reviews, job descriptions, performance expectations.
  • Be ready to go back to people discussed earlier who you thought were done and dusted, as the process and discussion evolve and managers remember things about their people’s performance.
  • Resist comparing people to people. Only compare people to performance expectations.
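
For illustration only, here’s a minimal sketch of what that calibration spreadsheet might look like. The expectation columns, names, and notes are hypothetical placeholders; the ratings use the 1–4 scale above:

| Name | Technical delivery | Collaboration | Initiative | Notes |
| --- | --- | --- | --- | --- |
| Engineer A | 3 | 2 | 3 | Still needs guidance on cross-team work |
| Engineer B | 4 | 3 | 4 | Consistent evidence on stretch expectations; discuss what’s next |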

Good luck!

Rebecca Dobrzynski, Senior HR Business Partner at Klaviyo:

Agreed with many of Sondra’s points, and that 2 hours likely won’t be long enough, especially if folks aren’t used to a consistent process and format. There are many, many things to consider in calibration meetings, but the most top of mind for me are:

Format:

  • Typically I run sessions (within a single function) by discussing all employees in a job level at once, often most senior to most junior. Within each job level, discuss high and low performers (i.e., “outliers”) before anyone who receives your middle/average rating. You want to be sure folks are aligned on what constitutes exceptional or under-performing, and it’s normal for 80% or more of employees to be in the “achieving expectations” bucket or equivalent… generally, you won’t need to spend as much time discussing every employee there. Many groups will be tempted to review every employee in detail, but unless the team is tiny, it’s just not possible. This is where a timekeeper can come in handy, as long as they are not afraid to interrupt people to keep the conversation moving.
  • If promotions are part of the same process, discuss them here too. Those folks are likely receiving top ratings if they are ready for promotion, and should be discussed within the context of their current job level as well as preparedness to ramp into the next level up. I often progress from most senior to most junior as a way to be sure that leaders have already calibrated on acceptable and outstanding performance for the level someone is being promoted into. So if an IC is being nominated to be a senior IC, you’ve already calibrated what’s expected of seniors before deciding whether to promote an outstanding IC.

Discussion parameters:

  • Apply the same standards to each person based on role expectations/level and any company values that are included. This is where job descriptions, leveling guides, and clearly defined goals are your friend!
  • Share specific examples of what was accomplished and what the impact was/why it matters
  • Focus on performance and behaviors relevant to the job, not on personality or style
  • Participants should keep their input limited to questions/clarifications and new information in order to keep the conversation moving (no tangents of “I agree and here’s another long-winded example…”)… you can and should be pretty ruthless about cutting off any well-meaning repeat comments and stories.

I have a whole slide deck that I send as a pre-read and use as an overview, with info on process/agenda, reminders of unconscious bias and how to mitigate it, detailed performance rating definitions to reference during the discussion, etc. Please share resources like Bias Interrupters early and often during your performance cycle, and remind leaders of how important it is to be mindful of the many ways bias can creep into their thinking and evaluating. Even if they are not receiving full-blown training, resources like this in the moment can really help.

But when we are about to dive in, I end with just two overarching instructions for the calibration discussion:

  • Speak neutrally and succinctly, as though you’re producing a court transcript; this is a situation where fewer adjectives and more verbs are better!
  • Expect (and ask!) lots of questions to test your rationale; it’s not personal, but rather an opportunity for leaders to learn from one another and to ensure transparency and fairness in how individuals are evaluated.