I am fortunate to work for a company that respects and appreciates Support. We work 40-hour weeks, we get to say no to things that don't make sense (like supporting ancient legacy versions of the software that the Engineering org is no longer willing to work on), and Sales mostly listens when we ask them to change their tune.
However, we're a startup still finding our footing as a company, and our finance department runs a pretty tight ship - so there isn't much money to hire more people or to send the whole team to this conference (in fact, I sent myself!). So you can imagine that we don't have money for professional trainers or consultants either, and you'd be right if you guessed that for some of my team, this is their first job in tech or in support. Nonetheless, my team is really smart and good at what they do, and it would be great to keep them - even if we can't pay them as much as I think they're worth.
Constrained resources are a fairy godmother of invention, and I know that if we had too many support engineers, or we were spoon-fed a series of professional powerpoints, we wouldn't learn as much as fast as we are by doing it ourselves. People on my team may be annoyed by the lack of motorized standing desks, but they aren't bored - they're engaged, and ensuring that when annual reviews come around, they're in a great position to ask for that merit-based raise.
What's our secret? I like to think of it as fertilizer; the kind that helps the crops grow. But to take this metaphor too far, it also takes some planning to succeed at gardening: you have to till the earth before planting, do some weeding to make sure we're growing the crop we want, and don't forget watering - encouragement, guidance, and investment. And we have to avoid, well, going off into the weeds.
So, I said we can do this for free, right? Yes! But...time isn't free. And that's the investment you'll be making in your team: giving them the time to be thoughtful about their work, to be purposeful and thorough, and to practice. I'm going to talk about two categories of growth that we've been cultivating at Jama, and I'd like to point out that we aren't spending days and days on this - we're spending hours. And, some of those hours are on whatever cycle you can afford.
For instance, we do monthly peer reviews. This is team-wide, and takes probably 3 hours per employee. You could do it every 2 months, or focus on the new people, or those who need the most improvement, or even stagger it across the team. But I think you'll see once I explain the methodology that there is nobody who is too good at their work to receive value from the process, and nobody too junior to impart value with some thoughtful opinion sharing.
Similarly, while intra-team training may be mostly led by your most experienced support engineers, the point of the training is to turn your new hires into future instructors - instructors of customers, instructors of peers, and someday, even creators of their own training. And by someday I don't mean years from now. More like months from now!
All of that is happening at my job: in less than a year with Jama, while I will admit to coming into a really fertile team without OMGTOOMANYTICKETS, we've really moved the needle from good to better, en route to totally freakin' awesome.
The high-level guidelines mostly boil down to:
And it's likely that the things you're asking can change over time - we used to have a question about tagging: did the support engineer tag a ticket appropriately? We're getting close to having gone over that enough times that we don't need to keep belaboring that point, so we might decide to focus on ticket title for a few rounds, to train people to get better about re-titling anything that comes in with a subject that isn't obvious and germane to what the ticket ends up being about. Does this make a difference to the customer? Hell no. Does it make a difference to the team, to be able to find similar cases from other engineers when they get stumped? Hell yeah!
[this section was added in response to a frequently asked question after the talk] How can we pick the most meaningful tickets to review? Randomness is not the answer! Our review only covers 3-4 tickets (very thoroughly), so picking the right ones matters. I tend to look at the tickets assigned to an engineer since the last review (fortunately we practice single-engineer ownership as much as possible) and pick the best of the 10 or so with the most responses. Note that the best are not necessarily the ones with the most responses overall - so I review all of the top 10 or so. These are generally the most "crunchy" and interesting. If there was some stellar customer feedback, or negative customer feedback about engineer style (rather than about product, timeliness, or process), I'll try to include those tickets as well. We also let engineers select a ticket of their choosing if they'd like peer advice on it - it could be a great ticket, or a ticket they struggled with.
Constructive criticism needs reinforcement, and hopefully you have someone (maybe you are the someone?) who will be able to look for the patterns to tease out in a couple of sentences from a review, and pass those on to the next reviewer so they have some context to work with - for instance, "Great customer service, but could really stand to review the docs". This is also a pretty nice way to find team-wide patterns like "We really need more information from our DevOps team about outages".
I find that I tend to have less advice for reviewees than I do for reviewers - after all, the reviewer already advised the reviewee! (In our system, you talk through your review with the reviewee before submitting it for me to go over. This is to make sure that the two of you are fairly in sync as to the results of the review.) Typical advice for a reviewer: "Good job noticing all of the problems, but did you make sure to call out the wins too?" or "I could really use more detail in some categories about each ticket - you don't need to write me a novel in each spreadsheet cell, but you do need to go a bit deeper than 'yes' for every answer."
In the end, you should have provided 3 things:
On selecting topics: we let support engineers vote for their most-desired topics (this might start with one engineer noting they need to know more about, say, SSL). But also factor in ease of delivery versus the time cost to develop when picking what to work on next. We also prioritize based on importance (is the feature under discussion being released soon, or already released?).
Teachers need to be qualified and willing to teach a class. There is often more than one person interested in teaching, but we put some effort into encouraging different folks to teach different classes because, as you are likely aware, one learns differently when teaching than when being taught, and I want to see my whole team share those benefits. To ensure qualification, I find a second subject matter expert and ask them to chat with the proposed presenter and verify that they either have the needed knowledge or a good plan for getting it quickly (e.g., an outline that covers all the relevant points and a good data source to research from).
Class is taught, brains are nourished, and the teacher has eaten their apple. You've put trainer time and trainee time into this, but there are still more returns to be had!
We start with a two-factor approach: first, require students to give actionable feedback. Then, taking this feedback into account, require the trainer to provide some training artifacts that encapsulate the subject matter as well as possible. This might be links to docs, it might include a recording of the class with some text annotation (so you can "search for" that content), it might be an outline, or a set of examples shown during class. It might be all 4 of those things! This hopefully isn't a lot of "extra work" - the trainer knows they have to provide it when developing the class, so it's easy enough to make your notes comprehensible to others.
Finally, though, there is a bit of extra work: periodic review of the material to make sure it hasn't fallen out of date. This is a good segue into the next topic, because that review need not happen by the trainer - it can come from an attendee!
All plants need water and deserve to thrive. So, find a way to get everyone to participate more deeply than just "attending class". Even junior/new team members can help - they can be the ones to document the training and maintain the documentation - plus they've been enabled to teach the class next time! Or maybe they can summarize that new feature for Sales based on some of the artifacts they've contributed to? Junior folks are also frequently the best at pointing out under-explained concepts!
Additionally, I personally attend as many training sessions as I can, even when it's not something I need to know - and I'm the guy who generally doesn't work tickets and doesn't claim to be a product expert. I judge the delivery and materials (artifacts, as it were) and work with the teacher to make sure there is something easier to review later than just a recording of the class, plus a summary so that people can remind themselves of the subject matter quickly. Once I've worked with a trainer a few times, they know what I'm looking for in terms of artifacts, but even so it's not a bad idea to give the trainer meta-feedback too: "I think you'd be more effective with more examples next time" or "your class was a bit dry - could you spice it up with some interpretive dance?".
Take some of this home and try it with your team. Talk about what worked and didn't work - with your team, with your community, with me. Let's see what we can build that's so exciting that you want to come back and talk about it here next year :)
Feel free to get in touch with me to talk more about any of this - I'd love for us to learn something from one another!