This third discussion of Author Experience, with Max Johns, focused on the challenges of broaching the subject of AX within a larger corporation: what we can do to change stakeholders’ perceptions and engage authors, and the benefits that will ensue.
Rick: This is one of those discussions that almost wants to start with “Isn’t it obvious?” We know the benefits, and we know that the stakeholders don’t understand, so they will resist.
Max: Perhaps. But we need to think about it from their perspective; understand what’s important to the enterprise and frame both the challenges and the solutions in their language.
I think we have two main issues here:
- First, we need to educate people about the difference between writing and managing content.
- Second, we need to present author experience to management with a cost-benefit analysis.
Rick: That first point, I cover partially in the book (p21). Everyone thinks they can communicate – that they can write. But it takes training to understand how to use structure; how to use semantics consistently. It’s those without the training…
Max: Yeah, everyone thinks they can write, then they show their ignorance with apostrophes. But this is more than that.
Our authors come from various places: marketing comms, advertising, PR, etc. They have specific biases in why they write. They see words as entertainment, or space-filler, or as something to produce to stay employed.
Rick: Or as a way to manipulate people.
Max: Yes. And some of these people can write fantastic copy. But few of our authors are pure believers in words as a way to communicate information.
This issue goes beyond the writing. Once we get past the value of initial content creation, most authors think they can bang out new content today and it will be dead in a week. They forget – or were never aware – that digital content accumulates.
Those are the authors we need to sell author experience to.
I’m not yet sure how we do that without making them stick around until 2017 to see what a mess they’ve created.
Rick: We’re not helped on that front by Google’s promotion of content freshness as a dominant measure of relevance.
Max: Yes. The way some people chase every last advantage in Google’s algorithm means that if something’s near the top of search results, it’s likely a Wikipedia article or less than ten days old. Though the balance is likely to shift again; Google is constantly reinventing its measure of relevance. And we can probably count on them sticking to the principle that relevance is core; it’s just unfortunate that for now, new equates to relevant.
The other thing to remember is that if your content is really relevant, age doesn’t matter. When I worked for a bank, we had old product pages we thought weren’t online anymore – because we had discontinued the product – but they always came up in searches for credit cards; they had authority.
Rick: That’s a good place for AX to help the enterprise: finding and identifying the content.
Max: There is another side to this. Even if – or perhaps especially if – authors are aware of how messy their content set is, why would they care about making the next piece better, cleaner and more reusable? Having one content gem in a cesspool of 100,000 pieces of crap isn’t going to make a difference when it comes to moving everything over to a new system (which they assume will happen). They might be tempted to encourage everything to be thrown out when that time comes.
Rick: Except that when that time comes, the powers that be aren’t prepared to bulk-delete. They invested good money in that content… it might still have some value.
Max: There is one way to get authors to understand the cost of their content: make them do a content audit. That is often the first step from saying you’re a writer to saying you’re a content strategist. That’s where you learn the pain. That’s where you start to appreciate the cost of bad content, and understand that a bad author experience contributes to it.
Once you’ve done an audit, you really feel like you’re more than just a writer. That’s when you start to overcome the first issue.
Rick: That’s a really good point. Let’s come back to it in a bit. First, I want to tackle the second issue you identified: how to present a cost-benefit analysis of good author experience to stakeholders.
Max: Right. We have to identify the cost of bad AX. It’s a price every company that hasn’t put specific focus on their author experience has been paying since day one. They just haven’t realised it.
There is a perception that authors don’t have needs; if they have a keyboard and Microsoft Word, what more could they ask for? This is why they are often treated as the lowest of the low. Anything we do for authors is seen as a cost. And in large enterprises, keeping costs down is often seen as a good way to improve your own situation.
Rick: Except when it comes to CMS procurement; then cost concerns seem to go out the window.
Max: When’s the last time you saw someone buy the most appropriate CMS, rather than the one that cost a third as much?
Rick: I’ve seen too many organisations buy the one with the best salesman. Which also happens to be the most expensive. And the least capable. It’s a capital expense versus operational expense issue. Operational expenses – including authors – are under constant threat.
Perhaps part of this second issue is just another take on the first. Managers see authors as salespeople whose job it is to pump out words. They overlook the real purpose of all this content, which is to communicate effectively. They don’t understand the persistent nature of the medium.
The output of an audit could well be just what they need to understand that it is about content management, not just content creation. They need the usage and revenue conversion metrics from good analytics to go with it, so they can see the associations.
Max: But getting a stakeholder to understand a content audit spreadsheet is a near-impossible undertaking.
Rick: True. It’s too much detail. And a summary fails to show the complexity of the issue.
Max: If you could do it in a more visual and obvious way…
Rick: I think the answer is rather easy. I’ve never seen it in classic content management, but I have seen it used to analyse information: relationship map visualisation. (The example image is from LinkedIn’s discontinued InMaps tool. [source])
Applied to the semantics of a content pool, it can show a lot: subject affinities and other relationships. Combine it with usage and conversion statistics, and see the relationship between dense content clusters and actual value. Find orphaned content clusters. Track sequence paths through your content. Spot the confusion that out-of-sync content duplication causes. Compare value with age. Compare the results of multivariate testing.
It will show relationships between your content that you may not have even been aware of. And it will do an instant visual gap analysis. Quality and relevance could be easily identified. It would help identify content you don’t want to promote that is getting high traffic.
It needs someone to explain what the visualisations mean, but even with a basic understanding, insights will be really easy to extract. And by virtue of being dynamic – you can zoom in, switch between representations easily – there is a huge amount of meaningful information that can be gleaned.
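As a rough illustration of how little machinery the idea needs, here is a minimal sketch of a content relationship map, built from an entirely invented audit inventory where shared tags stand in for semantic relationships. The content IDs, tags, and page-view numbers are all hypothetical; a real tool would feed a graph like this into a force-layout visualiser rather than just printing from it.

```python
# Hypothetical audit data: content id -> (tags, monthly page views)
inventory = {
    "credit-cards":     ({"products", "cards"}, 12000),
    "platinum-card":    ({"products", "cards"}, 300),
    "mortgage-guide":   ({"products", "loans"}, 4500),
    "press-release-07": ({"news"}, 12),
}

# Build an adjacency map: link any two pieces that share at least one tag.
edges = {cid: set() for cid in inventory}
ids = list(inventory)
for i, a in enumerate(ids):
    for b in ids[i + 1:]:
        if inventory[a][0] & inventory[b][0]:
            edges[a].add(b)
            edges[b].add(a)

# Orphaned content shows up as nodes with no relationships at all.
orphans = sorted(cid for cid, nbrs in edges.items() if not nbrs)
print(orphans)  # ['press-release-07']
```

Even this toy version surfaces the orphaned cluster instantly; layering page views onto node size and conversions onto colour is what turns it into the gap-analysis view described above.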
Max: I think if you put that in front of a manager, you might be able to start talking sensibly about the costs of poorly managed content. And you can trust most managers to at least understand inferences about what’s good for people.
We could be onto something, here.
Rick: And the technology is available.
Max: I wonder why they aren’t more common. Maybe they resemble word clouds too much, and we all love to ignore those. But with guidance of how it relates to your content, to its value and performance…
Rick: Thinking about it, this would likely solve the other issue too: getting writers to understand more than just what a good page looks like, but what good long-standing content looks like.
Max: It would be fantastic to see the difference between a well-structured content set, and a poorly structured content set, rendered in that way.
Rick: That would be interesting to do, too. Now, I just need to find the software. Oh, and a well-structured content set. (That would be the New York Times, see p149.)
Max: If we assume that a good visualisation tool, combined with good analytics data, will sway the managers, how do we get the authors themselves to embrace structured, semantic content? They may appreciate the mess we can show them, but why would they care enough to do a better job? How do we get them to buy in to doing it right?
You sell the system on the idea that it will give you perfect structure. The structure provides power and flexibility. You can manage content with ease. But only insofar as authors use the system responsibly. In a wonderfully structured system, even the smallest bit of unreliable data can muddy the entire set.
It’s a problem with every system ever invented. But there’s a definite risk, especially if – as you say in the book – so many writers feel unappreciated and underpaid.
Rick: Those questions need to be answered separately.
First, we need to address the reasons authors might be sloppy and not care. Some factors, we can’t do much about: they are overworked, stressed outside the job, etc. But within the work environment, we can do something to encourage and engage them: trust them to do their jobs, respect them for the skills they bring to the table, treat them as valuable team members.
Max: There is nothing better than a trustworthy person who takes their job seriously.
But it’s still the people problem: how do we stop authors gaming the system? What can we do about malicious authors?
Rick: There’s not a lot we can do about malicious authors.
But for everyone else, we need the tools to do what they are supposed to – facilitate content management. It should be easier to do it right than to be sloppy. There are three stages to this.
The first is providing tools that help them do a better job. The second is to explain how and why those tools are valuable. The third is to show them the consequences of not using the tools properly. I’m not talking about threats of violence, but about the additional workload that being sloppy will create for them.
The classic example is when content references people, or other identified entities. If the system unobtrusively helps the author semantically identify the person, then when that person’s name or position changes, every reference can be updated automatically. Failure to embed those references means the author has to trawl through all the content, updating each. And if there are two people with the same name, find-and-replace won’t work. It becomes laborious.
This is at the heart of the WYSISMUC editor (p129). It looks a lot like WYSIWYG, but it embeds semantics into the content, rather than visual styling.
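The principle behind that kind of editor can be sketched in a few lines: content stores a stable entity ID rather than the name itself, and the name is resolved at render time, so one change propagates everywhere at once. This is not how any particular editor implements it; the entity IDs, the `{{…}}` placeholder syntax, and the registry are all invented for illustration.

```python
import re

# Hypothetical entity registry: stable IDs, mutable display data.
people = {
    "person:1042": {"name": "Jane Doe", "title": "CFO"},
}

# Content embeds the ID, not the name, via a placeholder token.
article = "Our results were presented by {{person:1042}} at the AGM."

def render(text):
    """Resolve semantic references to the entity's current name."""
    return re.sub(r"\{\{(person:\d+)\}\}",
                  lambda m: people[m.group(1)]["name"], text)

print(render(article))  # Our results were presented by Jane Doe at the AGM.

# Jane changes her name; every piece of content updates on next render.
people["person:1042"]["name"] = "Jane Doe-Smith"
print(render(article))  # ... presented by Jane Doe-Smith at the AGM.
```

Compare that to find-and-replace across a 100,000-piece content set, with two Jane Does on staff, and the cost of skipping the semantic step becomes obvious.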
Max: Let’s assume AX takes off. People love it. Could we accidentally get to the point where organisations assume the system will take care of everything, and stop hiring the people because it’s a cost they can avoid?
Rick: That’s a question of artificial intelligence. Is the system good enough at what it does to be trusted?
Max: Countered by the question of whether the people can be trusted? It’s the same problem, really.
What if people started trusting the structured content to look after their taxonomy? Can a system handle the subject affinities?
Rick: That’s one of the areas where smart agents are already fairly advanced. Latent semantic indexing may not be perfect, but it is pretty good. If you have a good agent in place, you’re fine. If not, you’re screwed.
Max: That, I think, is where the rub may lie in a few years. Of course, this would be a fantastic problem to have: too many people are using this system.
It’s the eternal simplicity versus complexity issue: we tend away from the middle. We love WordPress because it’s simple, but we end up with a bunch of pages that all relate equally to each other, because every post has all the same tags. So we flip to automatically structuring everything; we shirk responsibility. And we lose the skills – we are no longer able to judge quality…