Process: Proactive maintenance of your knowledge base
Dec 10, 2021
Standards of KB maintenance
Articles on the KB should:
be technically accurate.
be written in line with our style guide.
fit the structure of the KB.
not duplicate existing content.
These standards provide a good experience for our users. To consistently improve and maintain these standards, the following processes should be in place:
A system to process feedback from both external and internal users.
A process to periodically review articles, both to keep them aligned with our style guide as it evolves and to catch inaccuracies introduced by small changes in the UI.
A system to proactively source new content.
A way to determine the fit of a suggested new article for the KB.
Feedback
CURRENT PROCESSES
Feedback from internal users, through our feedback form.
Community posts about KB articles by our external users.
YES/NO survey at the bottom of each article.
#Knowledge-Team-Public
Speaking to individual tech writers.
Team Lead QA
Let us know if you have any feedback on these channels!
NEW PROCESSES
Surveys
As stated in the Great Knowledge Base Audit, we are hoping to survey more internal users this year and have a reporting dashboard for our KB. This feedback will inform many of our decisions in the other processes we are putting into place.
Knowledge team feedback - collaboration
While our team gives each other feedback casually through Slack (in the group and privately), we are starting a formal process of reviewing each other's new articles.
When a product update ships, we update existing articles and create new ones. Once this work is done, we place the related group of articles in a queue, and another tech writer reviews one of the articles as a spot check. This review happens after the article has gone live, so that new product updates can still be addressed in a timely manner. Feedback from another writer's eye will help tighten our articles and sharpen each other's technical writing skills.
Currently, we are aiming to address a card within a week.
Periodic review
CURRENT PROCESSES
Between November 2017 and March 2019, we reviewed and updated all KB articles as part of the Great Knowledge Base Audit. Now that the audit is finished, we want to set up new periodic review processes so that work doesn't become out-of-date.
NEW PROCESSES
Least recently updated articles
During the audit, we noticed certain articles were very outdated (e.g. updated in 2016 when looking at it in 2018). We reviewed this content to determine if it was still relevant, and if so, how we could update it. To avoid this happening in the future, we are putting in place a process that surfaces articles that have not been updated within the last year. This is intended to keep our content relevant and give users confidence that our articles are being consistently maintained.
When reviewing least recently updated articles, use the quick checklist below, along with the process map for existing content:
Is the article, including screenshots, still accurate?
Does the article still address the correct user persona?
Does the article conform to the most updated Style Guide?
Is this article unique, or is there duplicate content elsewhere that is more up-to-date?
Should the article be combined with another?
Casper Ong will own this check monthly, and will rope in other tech writers when necessary. However, now that each article is about to be owned by a specific TW moving forward, this will no longer be a one-person check: Casper Ong will roll out a standardized process for TWs to do this check.
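The "not updated within the last year" check above can be sketched in a few lines. This is a minimal illustration, not our actual tooling: the record shape (a list of dicts with "title" and "last_updated" fields, as might be exported from our AirTable base) and the field names are assumptions.

```python
from datetime import datetime, timedelta

# Hypothetical article records, e.g. exported from AirTable.
# Field names ("title", "last_updated") are illustrative assumptions.
articles = [
    {"title": "Resetting your password", "last_updated": "2016-03-01"},
    {"title": "Setting up SSO", "last_updated": "2021-06-15"},
]

def stale_articles(records, as_of, max_age_days=365):
    """Return records whose last update is older than max_age_days."""
    cutoff = as_of - timedelta(days=max_age_days)
    return [
        r for r in records
        if datetime.strptime(r["last_updated"], "%Y-%m-%d") < cutoff
    ]

# Surface candidates for the monthly review.
for article in stale_articles(articles, as_of=datetime(2021, 12, 10)):
    print(article["title"])
```

Any article this surfaces would then go through the quick checklist and the process map for existing content.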
Monthly tune-ups
We've gotten feedback on some articles being difficult to understand or too lengthy for users. We will have a monthly tune-up of particularly lengthy or technically difficult articles to see if we can restructure them.
This will be based on professional judgement and on feedback, especially the YES/NO data at the bottom of each article.
We will determine the articles to be tuned up in the first week of the month and delegate this among the team.
The technical writer may or may not receive an article related to their product areas.
Pilot completed - proposed process steps:
Select an article based on the YES/NO data and other feedback channels.
Check for support specialists who have included the article in their cases.
Reach out to 3 to 5 of them for feedback. If there aren't enough such specialists, ask other support specialists. (Criteria for selecting these specialists have not been decided yet; for now, reach out to specialists who have sent good feedback via the form.)
Consolidate that feedback and write an action plan for improving the article.
Share the action plan with the team for critique and other suggestions.
Carry out the finalized action plan.
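Selecting tune-up candidates from the YES/NO data could look roughly like the sketch below. The tallies, field names, and thresholds (minimum response count, maximum YES ratio) are all assumptions for illustration; the real cut-offs would come from professional judgement.

```python
# Hypothetical YES/NO survey tallies per article; the thresholds
# below (min_responses, max_yes_ratio) are illustrative assumptions.
survey = {
    "Exporting reports": {"yes": 12, "no": 30},
    "Inviting teammates": {"yes": 80, "no": 5},
    "Configuring webhooks": {"yes": 4, "no": 4},
}

def tune_up_candidates(tallies, min_responses=8, max_yes_ratio=0.5):
    """Articles with enough responses and a low YES ratio."""
    out = []
    for title, t in tallies.items():
        total = t["yes"] + t["no"]
        if total >= min_responses and t["yes"] / total <= max_yes_ratio:
            out.append(title)
    return sorted(out)
```

The minimum-response filter keeps an article with one stray NO vote from jumping the queue ahead of articles with a consistent pattern of negative feedback.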
Further audits
While we have just finished a large audit, future audits will be much less of a behemoth to tackle now that the content is curated. Using data, we can determine when another audit will be required.
Metadata checks
Because our database management does have a human element, consider periodically comparing the number of published articles from 53 against the current number in AirTable to check whether any articles were missed. If there is a difference, dig deeper to find the missing articles. This will also work hand in hand with the Updating old articles process, as the articles that aren't in our AirTable database are usually old legacy articles.
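The metadata check can go one step beyond comparing counts: with a list of titles from each source, a set difference points straight at the missing articles. The snapshots below are hypothetical, and fetching real data from the published KB and from AirTable is left out of this sketch.

```python
# Hypothetical snapshots of article titles from each source.
# In practice these would be pulled from the live KB and from AirTable.
published_titles = {"Login basics", "API keys", "Legacy importer"}
airtable_titles = {"Login basics", "API keys"}

# Articles that exist on the KB but are absent from AirTable.
missing = published_titles - airtable_titles
if missing:
    print(f"{len(missing)} article(s) missing from AirTable:")
    for title in sorted(missing):
        print(" -", title)
```

If the counts differ but the title sets match, that is a hint the discrepancy is elsewhere (e.g. duplicates or renamed articles) rather than a missing record.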
New content
CURRENT PROCESSES
Product updates: How does the Knowledge team handle product updates?
Feedback received from the channels above.
POTENTIAL PROCESSES
New content should encompass more than just new functionalities or product features. There could be aspects of an existing function that we are missing. Some of this can be surfaced through our different modes of feedback, but the best way forward should be data-driven.
While feedback is important, it doesn't tell the full story. Because of the many different user personas we serve, we have to consider users as a whole. An article may be useful for one specific user, but creating more and more articles will only make the KB cluttered and difficult to search for the average user. Hence, we need hard data on what users need, not merely what they want. Some of this data can come from:
Support tickets filed as "Non-Documented Solution."
Google Search terms.
Community posts.
Help widget (what parts of the app users click the widget most).
We are looking to find an automated way to access and sift through this data.