[Ryan Weber] Welcome to 10-Minute Tech Comm. This is Ryan Weber at the University of Alabama in Huntsville and ever since I’ve started this show I’ve wanted to do an episode about writing effective software documentation. And I finally found a great guest who can guide us through the process.
[Andrew Etter] My name is Andrew Etter, I’m the author of Modern Technical Writing, which is available on Amazon and only Amazon and also on that front, I work for Amazon as a Senior Technical Writer under AWS.
[Weber] Andrew has a very interesting and informative perspective on the challenges and strategies for writing software documentation in contemporary software companies and I hope you enjoy the interview. Thanks.
[Weber] Welcome to the podcast Andrew. Thanks for joining us. I guess I want to get started by asking about how we can know if our documentation is effective. What are some ways that we can know if software documentation is good?
[Andrew Etter] Yeah, so good might not be the right word here, but I really like the word effective in this case because good so much takes us in the direction of qualitative or subjective considerations, where we say things like, “Oh it’s written professionally or the design is really nice,” or project credibility, and those things are important, but I think it’s like anything in this world where your argument for effectiveness or goodness, whatever you want to call it, becomes much, much stronger when you add in numbers, when you add in quantitative metrics. So, in this case you really should have documentation analytics that are pointing to which pages people are reading, how often they’re reading them, and with things like Google Analytics you even get their flow through the site a little bit. So, you can see, “Okay they arrived at this landing page, and they immediately went here, and that’s kind of the flow I was looking for the user to take.” And then you want to bounce those off of product analytics. The nice thing about software is we have so many different metrics that we’re measuring at any given time to see what users are up to, and so you’ll know very concretely this software application has fifteen thousand users, maybe. And then you can look at the documentation and go, “Oh but I only got four hundred hits last month. That’s horrible, what’s going wrong here? How can I fix this?” Versus saying, “Oh we have three thousand users, the docs got fifteen thousand hits. That’s fantastic. People are reading it, I’m getting very few bugs on the documentation, this is really doing an effective or good job.”
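[Editor's note: the docs-versus-product comparison Etter describes can be sketched as a simple ratio check. The function, threshold, and numbers below are illustrative assumptions, not tooling from the interview; in practice the inputs would come from your analytics and product metrics.]

```python
# Hypothetical sketch: compare monthly documentation page views against the
# product's user count and flag when readership looks disproportionately low.
# The "healthy" threshold here is an assumption you would tune for your product.

def docs_engagement(doc_hits_per_month: int, product_users: int,
                    healthy_ratio: float = 1.0) -> str:
    """Return a rough verdict on documentation readership."""
    if product_users == 0:
        return "no users yet"
    ratio = doc_hits_per_month / product_users
    return "healthy" if ratio >= healthy_ratio else "investigate"

# Fifteen thousand users but only four hundred doc hits last month:
print(docs_engagement(400, 15_000))    # investigate
# Three thousand users and fifteen thousand doc hits:
print(docs_engagement(15_000, 3_000))  # healthy
```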
[Weber] So really what you’re looking for is some kind of quantitative way of saying, “People are using this documentation and they’re using it to solve their problems.”
[Etter] Absolutely, and ideally you have some sort of feedback loop baked in there, where there’s a means for readers to either email you or report a product bug, or maybe they need to reach out to your customer support team or something like that. But in the end, you want to be able to make those comparisons between the product and the docs, and say, “The product got four hundred bugs last month, I only got eight documentation bugs, I’m probably in a pretty good spot.” Versus, if those numbers are flip-flopped, that’s pretty disastrous for the documentation, and no matter how professional the language is, you can’t really argue that you’re doing a great job. And then a few other things are there, right: the design and professional-language things I spoke about a little bit. I think one that is really often neglected is search. Looking at the search that you’ve applied to the documentation: when I enter the ten most popular queries, which you should also get from metrics, do those take me to the right pages? Making sure you’ve got reasonable search engine optimization and that whatever app you’re using to perform search works properly.
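[Editor's note: the search check Etter mentions can be automated as a small smoke test: take your most popular queries from metrics and verify each one's top result is the page you expect. The query-to-page mapping and the stand-in search function below are hypothetical, for illustration only.]

```python
# Illustrative sketch: verify the top result for each popular query.
# EXPECTED maps a query (from your search metrics) to the page it should hit.

EXPECTED = {
    "install": "/docs/installation",
    "api key": "/docs/authentication",
    "rate limits": "/docs/limits",
}

def top_result(query: str) -> str:
    # Stand-in for whatever search backend the docs site actually uses.
    index = {
        "install": "/docs/installation",
        "api key": "/docs/authentication",
        "rate limits": "/docs/changelog",  # a mis-ranked result worth catching
    }
    return index.get(query, "/docs/404")

failures = {q: top_result(q) for q, page in EXPECTED.items()
            if top_result(q) != page}
print(failures)  # {'rate limits': '/docs/changelog'}
```

Running a check like this against the ten most popular queries turns "does search work?" into a repeatable test rather than a one-time manual pass.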
[Weber] Great, so searchable and users can use the search to find what they need using the terms that they want to use.
[Etter] Yeah, absolutely. I think so many, whatever you want to call them, old-school technical writers think about a set of documentation as just a book that’s been broken into webpages, and that’s really not how users consume it. They consume it in little bite-sized chunks, probably after running a search, especially if you have a good search, and then they can move on with their day, as opposed to reading cover to cover and finding the detail that they need halfway through.
[Weber] Great, and getting to the actual writing of software documentation, one of the pieces of advice that you give in your book is actually, “Don’t write.” Why do you give that piece of advice?
[Etter] Sure, so this is probably an easier question to answer than the previous one. Essentially when I say don’t write, what I’m trying to tell prospective technical writers is, don’t be an editor, or I guess don’t be a scribe. Coworkers are great resources and they’re going to tell you many, many useful things, but you should trust them and yet go back to your desk and verify what they’ve said. Do your own testing. Learn the concepts yourself. Learn the product inside and out, and do your best to, even if you never become a subject matter expert, you want to be subject matter competent. I had a colleague a few years ago kind of ask me, you know, “Is technical writing dying?” And I said, “No, no, it’s quite the contrary. Technical writing is growing, but it’s also getting more technical.” And so, if people don’t have subject matter competence, suddenly you’re really just relaying information that other people have told you into the documentation, which isn’t the right way to go about it and doesn’t lead to the best docs.
[Weber] Great, so you’re really encouraging writers to sit down with their product, play around with it, learn it, break it, make mistakes, understand how it works, so that when they sit down to do the writing part, the putting fingers to keys, they really know how it works and can provide a little bit of their own subject matter knowledge.
[Etter] Absolutely, and kind of the second piece of that, right. The first one is becoming subject matter competent or having experts at your disposal. The other big one is knowing the audience really well, and that’s something that most of these developers you’d be working with in the software industry really have very little interest in doing. (chuckle) And so on the tech writer side, you know, you get to know the audience through looking again at product metrics, or maybe you go out and actually interview some customers. You talk to the product managers, figure out what people want to do and then figure out how to do it, and other people are helpful in that, but you really need to be able to do it all yourself in order to have some credibility.
[Weber] Well that’s something that technical writers should be good at, is understanding who their users are and what they need, so I’m glad you give us some specific strategies for doing that. Beyond that, what kinds of strategies do you use for writing effective documentation?
[Etter] In the software industry, there’s a concept of the minimum viable product. You might’ve heard it called an MVP, and it’s the idea that this is the smallest, quickest thing to produce that we can put out into the world, and it’s still good and useful and meets the needs of our customers. And I like to take very much the same approach to documentation, where I write down, you know, maybe make an outline or something like that, of these are the things that you just have to have to put out a set of docs. Get those done, get those completed, and get them out for review from coworkers, and if everything looks kosher, get them out into the world as well. There’s a concept as well, that I really like at least in terms of documentation, called desire lines, where the example is you build a park and you don’t add any paths, and then after six months or a year, however long, you look at where humans have actually formed paths, and then you pave those areas. And so, you take this minimum viable product and you get it out into the world really quickly, you’re able to get feedback on it from your readers, and you can start paving those paths as requests come in and you sort of collate them into trends. A colleague of mine actually used a basketball analogy for this and called it the rebound-and-layup approach, which is you just get yourself in position to grab the rebound and it’s probably pretty easy to start putting in buckets.
[Weber] So you’re really letting the users kind of guide what kinds of documents get produced and how those documents get navigated, is that right?
[Etter] Yeah absolutely. One of the ways in which you produce great documentation is to have a really tight feedback loop, where it’s not particularly hard for you to push updates to the documentation. There are days at work where I push fifteen updates to the documentation, and if you can’t do that, you’re in a position where suddenly you’re waiting weeks or months to get something useful out into the world. And so, in this way if you have that tight feedback loop, there’s nothing stopping you from relying on users and bugs and all those things to continue making improvements.
[Weber] So don’t be married to this perfect version of the documentation that you have to wait until it’s perfect to send it out into the world, but instead sort of send them out good enough so that the users can use them, give you feedback, and you can steadily improve them?
[Etter] Absolutely. I think if you’re not looking at a set of documentation as kind of a living thing that you have to tend to, you’re probably going about it wrong. If you’re thinking, “Great, they’re done, I can move on to the next product,” it doesn’t work that way. (chuckle)
[Weber] You already kind of touched on this, but what kinds of mistakes should writers avoid, or maybe things we’ve done in the past that we don’t need to do anymore?
[Etter] Absolutely. I’ve got a lot of experience in this area because I’ve made a lot of mistakes and I’ve written my fair share of bad documentation, and really I think the big issue that happens here is kind of having an ego about the documentation. And not really understanding what you’re talking about, not having done sufficient testing, not having spoken to the right people, and then writing anyway because maybe you have a tight deadline, or you get frustrated, or you’re just kind of feeling lazy on that particular day. Another one, I guess, would be not getting the right people to review new content, and this can be really challenging because it’s a soft skill. If you have a lot of content that you’ve written, and you say to yourself, “Okay, I think that this is right. It squares with my experience, maybe I actually have done the research here, and I’m not getting frustrated or having an ego about this,” but you just don’t get the right people to look at it, it could go out into the world with some really serious issues because of some subtle fundamental misunderstanding that you had. And then as well, to go back to kind of the old-school technical writers I was speaking about earlier, I think there is a real, rigid, inflexible adherence to style guides and templates and things like that. Really, we should be prioritizing usefulness over consistency, I would say, and I think old-school technical writers look at consistency as the thing to strive for. I would rather something be short, a little bit more casual, and highly useful, as opposed to fifty percent longer than it needs to be and potentially something that would turn off a large chunk of the audience.
[Weber] When you say put your docs in front of the right people, generally who are the right people to review documentation before it goes out?
[Etter] Sure, you need, I would say, at least in the software industry, something of a cross-section, where maybe you get one or two developers to look at it, but you also need to have a tester, a quality engineer, look at it. Because often the opinions of those two groups will vary dramatically, where you know a developer will say, “Yes, that’s exactly how it worked,” and then you talk to the tester and they go, “Well, actually it’s a little different in the real world.” And hopefully you’re doing your own testing so that you’re already sort of in with that QE group, and then as well you definitely want to talk to a product manager, if there is one on the team, because these tend to be the people who are most in touch with audience needs and customer needs.
[Weber] You mentioned using user feedback extensively. Do you ever test your stuff with users before it gets published, or do you just generally publish it and then get user feedback?
[Etter] It’s pretty rare that I’ll get it out to actual users and have them take a peek at it, but if there is some sort of product beta or alpha that we do make public to customers, that’s a great opportunity for me to jump in and say, “Here are the beta or alpha docs. Could you please take a look at them and use them in this case,” but so often you know the docs don’t go out until a feature launches and I really don’t have a chance to do that kind of testing.
[Weber] Well thanks so much Andrew. I really appreciate you sharing all this insight with us.
[Etter] Appreciate you having me. Thanks so much.