Categories: artificial intelligence, Technical Communication, Training, UX Design

Enhancing Critical Thinking in UX Design Through AI Coaching

In the spring I started a UX Design course on Coursera (and will complete the certificate this month!).

And I’ve come to discover that the use of AI within the training has been captivating!

Going into this course, I had extensive experience as a Technical Writer and wanted to expand my knowledge to cover all aspects of working in user experience, from research to surveys to documenting the process.

It has been a minute since I was part of academia – online or in person – so I needed to relearn some habits. One of them was studying, and I was intrigued to see that studying for a test on the platform included help from an AI coach. The coach would ask detailed questions and expect detailed responses.

AI Coach from Coursera Course

As an example:

AI Coach: How would you connect the features and benefits of a product with the needs of its users?

Me: Connecting the features and benefits of a product with the needs of its users involves understanding the specific needs and pain points of the target audience and then demonstrating how the product’s features directly address those needs. It’s about showing the value that the product brings to the users.

AI Coach: Interesting response! While creating a value proposition is indeed important, let’s dive a bit deeper. Product features are the specific characteristics or functionalities of a product. They describe what the product can do or what it includes. On the other hand, product benefits are the positive outcomes or advantages that users gain from using the product. They answer the question, “What’s in it for me?” or “How does this product improve my life or solve my problem?”

Me: I’m ready for my test!

The AI Coach helps in so many ways, both expanding on what’s in the lesson and supporting test preparation. It can assist with:

  1. An in-depth review of the module
  2. Encouragement to think critically
  3. Prompts for detailed responses
  4. Practice questions harder than the actual test

In what unexpected ways has AI helped you think more critically or opened up more options for learning?

Categories: artificial intelligence, Cybersecurity, Technical Communication

Optimizing Remote SME Interviews with Top Collaboration Platforms

When I was starting out in software documentation, the entire team would sit around a table, talk with the PM, and divvy up sections of the project. Then we’d break off and scatter to do the work. My part entailed jumping around to the people who held all the legacy knowledge and institutional wisdom – the subject matter experts (SMEs). Reaching this group, with their full schedules and back-to-back meetings, required a lot of flexibility in scheduling and note taking. Though I am an extrovert and interviewing was one of my favorite parts of the job, I did wonder whether the note taking, the outlining, and the details I inevitably missed while writing and listening at the same time could have been streamlined.

My wishes were granted. Though artificial intelligence has been around for a while, it’s only within the last year that it has really taken off. Advanced transcription tools and platforms designed for remote interviews have become available on a large scale. Along with these tools and platforms, changing attitudes about including varied perspectives and multiple voices have made it easier to create products that serve everyone.

New Tools

AI-driven transcription and analysis tools have become more sophisticated, allowing for real-time transcription with higher accuracy. Tools like Dialpad and Otter.ai provide either in-platform or add-in services that offer advanced features like sentiment analysis and keyword extraction. These tools provide deeper insights from interviews, and greater detail in the transcripts helps technical writers add more specific information to the documentation.
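To give a rough sense of what keyword extraction means in practice, here is a minimal Python sketch that simply counts the most frequent non-stopword terms in a transcript. It is a toy illustration, not how Dialpad or Otter.ai actually work, and the stopword list and transcript snippet are invented for the example.

```python
from collections import Counter
import re

# A tiny, made-up stopword list -- for illustration only.
STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "is", "it",
             "if", "we", "that", "for", "on", "this", "with", "every", "out"}

def extract_keywords(transcript: str, top_n: int = 5):
    """Return the most frequent non-stopword terms in a transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

# Hypothetical snippet of an SME interview transcript.
transcript = (
    "The login service validates the token, and the token is refreshed "
    "every hour. If the token expires, the service logs the user out."
)
print(extract_keywords(transcript))
# -> [('token', 3), ('service', 2), ('login', 1), ...]
```

Commercial tools do far more than frequency counts, but even this much can flag the terms an SME keeps circling back to.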

Reaching Out…Remotely

During most of my freelance work, I became used to video conference platforms (though I know everyone got a crash course in 2020!). Being familiar with Zoom or Microsoft Teams allowed me to easily interview SMEs from anywhere. Now the shift toward remote and hybrid work environments has accelerated, making virtual SME interviews more common. Platforms and tools have been specifically designed or updated for remote interviews, offering better video and audio quality, seamless scheduling, and integrated collaboration features. Google Meet and Zoom both have collaboration and scheduling options.

Chat-based interviews with SMEs have also become an option worth considering.

Listening to Everyone

Within the last few years, there has been an increased emphasis on ensuring diverse and inclusive representation in SME interviews. Organizations are more conscious about involving a wider range of experts from different backgrounds, ensuring that a broader spectrum of perspectives and experiences is documented. Interviews can also be extended to employees outside the usual user group who test the software and review the documentation. User acceptance testing of the software may include an evaluation of the documentation and provide a different perspective on written assumptions.

Categories: Metrics, Technical Communication

The Role of Metrics in Continuous Improvement for Technical Writers

Let me just cut to the chase. I hate metrics. Or hated, I should say, but now I think I may be starting to appreciate them. Most of my career has been built around the written word, not numbers. So, there was a learning curve, even knowing Excel as well as I do!

My first foray into the world of metrics came when I started as a freelancer. I wanted to plan out my month and see how much work I needed to maintain my income. As long as I had an anchor client, and a few additional clients each month, I was reaching month’s end in the clear. This is when I learned that “what gets measured, gets managed.”

This is true not only for adjusting my workload but also for figuring out, once the documentation is written, how it’s being used and how it can be improved.

Metrics provide the basis for continuous improvement.

The areas to be measured include:

  • User Feedback (comments, ratings)
  • Productivity (documents, pages within defined timeframe)
  • User Engagement (page views, time on page)
  • Search Topics (keywords, phrases used)
  • Usability (task completion rates)
  • Improvement Tracking (monitoring, document usage, trend reports)

Technical writers are constantly getting feedback on documentation from users, peers, and subject matter experts. Users rate and comment, and if the documentation is delivered through a system like Flare or RoboHelp, they can provide feedback directly in the application. Measurements that can be used to gauge user engagement include search phrases, topics viewed, and demographic statistics such as the browsers and operating systems used.

The documentation process itself can also be measured.

With documentation, the time between the initial draft and publication can show the level of productivity. Writers can identify how long it takes to produce a document and, from that, spot inefficiencies and places where the process can be streamlined. Another metric to track is the number of documents or pages that can be produced within a certain timeframe.
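As a minimal sketch of how those productivity numbers might be pulled together, here is a short Python example that computes the average draft-to-publish time and the total page count from a handful of records; the document titles, dates, and page counts are invented for illustration.

```python
from datetime import date

# Hypothetical records: draft date, publish date, and page count per document.
docs = [
    {"title": "Install Guide", "draft": date(2024, 3, 1),  "published": date(2024, 3, 18), "pages": 12},
    {"title": "API Reference", "draft": date(2024, 3, 5),  "published": date(2024, 4, 2),  "pages": 30},
    {"title": "Release Notes", "draft": date(2024, 4, 10), "published": date(2024, 4, 14), "pages": 4},
]

# Average time from initial draft to publication, in days.
avg_days = sum((d["published"] - d["draft"]).days for d in docs) / len(docs)

# Total pages produced across these records.
total_pages = sum(d["pages"] for d in docs)

print(f"Average draft-to-publish time: {avg_days:.1f} days")
print(f"Pages produced: {total_pages}")
```

Even a simple spreadsheet of the same fields works; the point is to record the dates consistently so the comparison means something month to month.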

Improvement tracking includes monitoring, which ensures that documentation remains accurate, up to date, and effective. Additionally, obtaining information on how often the documentation is used can provide insight into how well the processes are written. Again, with specific software systems (MadCap Flare) or platforms (ServiceNow), trend reports for detailed searches, click-throughs, and most-viewed procedures can be reviewed. By analyzing trends over time, technical writers can identify long-term improvements and recurring issues, make data-driven decisions to enhance their documentation, and enable continuous improvement in quality, efficiency, and user satisfaction.
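As a small illustration of that kind of trend analysis, the sketch below computes month-over-month changes in view counts for a single procedure. The monthly numbers are invented; in practice, tools like MadCap Flare or ServiceNow generate their own trend reports, and this is just the arithmetic behind them.

```python
# Hypothetical monthly view counts for one procedure, as exported from a usage report.
monthly_views = {
    "2024-01": 340,
    "2024-02": 410,
    "2024-03": 395,
    "2024-04": 520,
}

# Compare each month with the previous one to spot rising or falling usage.
months = sorted(monthly_views)
for prev, curr in zip(months, months[1:]):
    change = monthly_views[curr] - monthly_views[prev]
    pct = change / monthly_views[prev] * 100
    print(f"{curr}: {monthly_views[curr]} views ({pct:+.1f}% vs {prev})")
```

A sustained climb might mean a procedure deserves more depth; a sudden drop can flag a broken link, a renamed feature, or content that users have stopped trusting.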