DITA

Getting to know Wikidata

bobdc.blog - Sun, 2017-02-26 15:23
First (SPARQL-oriented) steps. Bob DuCharme http://www.snee.com/bobdc.blog
Categories: DITA

Five Tips to Succeeding as an Agile Technical Writer

The Content Wrangler - Fri, 2017-02-24 23:39

Creating documentation within an Agile team is both challenging and rewarding. Agile moves fast and emphasizes team collaboration. Many Agile evangelists think that being Agile means no documentation, but Agile requires the same amount of documentation as any other approach. The difference is when and how that documentation is created.

Consider the following tips.

 

1. Join Daily Stand-Ups

A stand-up is a short meeting where the team reports on what they’ve done, what they plan to do, and anything preventing them from doing their work. If you’re not attending, you’ll quickly fall out of sync.

When attending stand-ups:

  • Focus on team activities and priorities.
  • Even if there are no changes to user documentation, listen and learn.

If you are not considered part of the team or are not invited to the stand-ups:

  • Expect your work on the project to be much more difficult.
  • Inform the Scrum Master and Product Owner.
  • Explain your role, and how your absence negatively affects documentation.

 

2. Write Documentation User Stories

User stories are the format used for defining software requirements in Agile. Stories take the form of:

“As a (role) I want to (do something) so I can (get something).”

User stories:

  • Should not cover every detail.
  • Should not include implementation details.
  • Should be used to start team conversations, where details are discussed and explored.

Documentation user stories highlight your work and how that work fits into the rest of the project. Software delivery is incomplete without documentation. Good Agile teams will appreciate your contribution.

Developers often loathe working on documentation. By politely insisting that documentation user stories are an integral part of the definition of done, you’ll gain respect and cooperation.

To write documentation user stories, define your audience, and consider WHY they need to read the documentation.

Is the audience:

  • End users who are confused?
  • Developers who need to integrate and extend functionality?
  • Administrators who need reference information?

What is the desired result of reading the documentation? What problem does the documentation solve?

“As an end user, I want to read user help to understand a procedure.”

“As a developer, I want to read API documentation to avoid wasting time when integrating the system.”

“As a buyer, I want to see the documentation to make sure the system is complete.”

“As a technical support person, I want documentation to troubleshoot customer issues.”

 

3. Get Synchronized

Many Agile teams delay documentation until after delivery. This is a mistake. Any user-interaction flaws the technical writer detects surface late, and may never get addressed. You end up scrambling to document features you cannot understand because you weren’t in the stand-ups.

If you are not a full team member, working on the same features at the same time as the rest of the team, you’ll always be behind.

You’ll be the first “smoke tester” for new features. You can make sure that user stories are consistent and don’t stray into “how to” territory.

A good team appreciates your non-technical perspective. Your team will want to know as early as possible if something is confusing or inconsistent.

Related reading: Mastering Technical Communication Leadership

4. Work On Documentation Debt

A technical writer’s work is never done. No product is ever 100% documented. Similarly, the development team always has a certain amount of technical debt tasks in their backlog.

Keep track of your documentation debt. When the development team is resolving technical tasks in the backlog, you can fill documentation gaps, write documentation user stories, or write that missing help for advanced features.

 

5. Use Intelligent Guessing

Because Agile moves so quickly, you may not have time to wait for the team to finish the UI before writing.

What can you do?

User stories are a great help to writers. With a good grasp of the UI, you can anticipate how the new features will work. Draft the steps and descriptions yourself, making reasonable assumptions.

Then share your work with the team. Tell your team in the daily stand up that:

  • You’ve guessed how the new features will work.
  • They can use this to guide their decisions, or
  • They can tell you what you’ve got wrong so that you can fix it.

At first, this can be daunting. “How can I know beforehand?”

Take advantage of inflated engineer egos. They love to point out errors. Encourage them to correct your guesses. Everyone saves time. They can focus on clarifying what is wrong.

It takes a big dose of both confidence and humility, but you’ll quickly earn the team’s respect. Yes, you’ll get things wrong. But you were going to make mistakes anyway. At first, the team will just focus on your errors. Eventually, they’ll use your guesses to guide them in developing features.

Conclusion

Agile is here to stay. The Agile Manifesto puts working products before extensive up-front documentation, but it never says that Agile is documentation-free.

In fact, the same amount of documentation must be done. If there’s no technical writer, the engineers must write it themselves (which they usually hate), or it doesn’t get done until a customer complains. At that point, it becomes a real challenge to figure out after the fact why the feature ended up as it was delivered.

Joining an Agile team as a technical writer isn’t an option. It’s a necessity. Using these five tips should help.

 

Fred Williams photo taken by Andrea Vuongova, 2015

Fred Williams is a featured speaker in The Content Wrangler webinar, Managing Technical Writing Outsourcing in an Agile Environment. Find out more here. Fred is available to train your team. You can learn more about him and watch some free videos here.

The post Five Tips to Succeeding as an Agile Technical Writer appeared first on The Content Wrangler.

Categories: DITA

Artificial Intelligence: How To Avoid Being Automated

The Content Wrangler - Thu, 2017-02-23 18:21

I’ve been reading up on artificial intelligence lately. AI, as it’s referred to, is a big topic with lots of exciting possibilities. Like many technology-inspired changes, the adoption of AI brings with it incredible promises for improvement. But, it also conjures up fear of impending disaster.

“No, robots will not organize a digital insurrection and take over the planet—at least, not anytime soon,” says Michael Rosinski, CEO of Astoria Software. “But, AI will usher in dramatic and significant changes to the way we live, work, and play.”

Rosinski’s right. Big change is coming. AI is already part of our daily lives. Fast food joints, banks, credit card companies, and customer service departments of all types are using the power of machine learning to automate numerous tasks historically performed by humans.

Artificial Intelligence Will Eliminate Jobs, Not Work

Some researchers believe an interconnected, cognitive world of artificial intelligence will lead to the displacement of millions of workers. Others say it will lead to new artisan-type jobs that require a unique mix of technical know-how, interpersonal interaction, flexibility, adaptability, and problem-solving.

Just before President Barack Obama left office in January 2017, The White House published a report, “Preparing for the Future of Artificial Intelligence,” that details where job losses are expected to occur. Made for the White House by the National Science and Technology Council’s Subcommittee on Machine Learning and Artificial Intelligence, the report (removed from The White House website after Donald Trump took office) predicts:

  • 83% of US jobs paying less than $20 per hour will be subject to automation or replacement
  • Up to 47% of all US jobs are in danger of being made irrelevant due to technological advancements, with most job losses due to occur amongst the undereducated
  • As many as 3.1 million car, bus, and truck driving jobs will be eliminated in the US due to the adoption of autonomous vehicles
Avoid Being Automated

Artificial Intelligence Isn’t Just A Blue Collar Threat

Disruption isn’t limited to service work and other blue-collar labor. White collar knowledge work will change, too. McKinsey & Company estimates that “as much as 45 percent of the activities individuals currently perform in the workplace can be automated using already demonstrated technologies.” Automating those tasks will save US industry an estimated $2 trillion in annual wages.

That’s not good news for some white-collar workers, whose jobs will be impacted by AI. Whether those occupations will be redefined—or eliminated—remains to be seen. What is certain is that some activities performed by knowledge workers in the digital labor pool will be automated, requiring a realignment of job roles and responsibilities.

Pretty much any job you can think of is riddled with activities that could be made more efficient if automated. Even mundane legal work—like appealing parking tickets—can be successfully automated.

How To Avoid Being Automated

In his farewell speech, outgoing President Barack Obama warned US citizens that “the next wave of economic dislocation won’t come from overseas. It will come from the relentless pace of automation that makes many good, middle-class jobs obsolete.”

Labor researchers say widespread adoption of AI can help US companies overcome less-than-impressive productivity gains. The benefits of AI—increased output, faster time-to-value, higher quality, improved consistency, and reliability—far outweigh the financial costs. A 2013 article in MIT Technology Review claims warehouses equipped with robots process four times as many orders as those that have yet to adopt automation.

Automation of routine, repetitive tasks, analysts say, allows workers to spend more time focused on creative tasks that provide value to the company and its customers. And, it allows companies to invest in highly-skilled, knowledgeable and experienced workers who will use artificial intelligence to augment their work.

Human-computer collaborations, some argue, will create an entirely new class of work. A quick search of the online jobs database Indeed provides a peek at the AI-related roles that companies are looking to fill today.

Thomas Hayes Davenport and Julia Kirby wrote Only Humans Need Apply, a book loaded with advice for workers who want to ensure they won’t be replaced by technology. The authors believe viewing AI as competition—taking jobs away from humans—is a mistake.

”Instead of viewing these machines as competitive interlopers,” the authors say, “we see them as partners and collaborators in creative problem solving” that help us work better, faster, and smarter.

Earlier this year, Jonas Prising, CEO of ManpowerGroup, speaking at the World Economic Forum Annual Meeting in Davos-Klosters, Switzerland, offered actionable advice to those fearful of automation.

“In an environment where new skills emerge as fast as others become extinct,” Prising said, “employability is less about what you already know and more about your capacity to learn.”

Be Part of the Solution, Not Part of the Problem

Learning everything you can about AI—and how it can help you work more efficiently and effectively—is a good first step toward avoiding displacement. Develop an area of expertise, but don’t limit yourself.

  • Seek out learning opportunities online (there are a wide variety of free, web-based classes covering artificial intelligence-related topics, and lessons shared by professionals, like this one on machine learning from the folks at R2D3).
  • Stay abreast of best practices being developed now by Partnership on AI and pay attention to what’s happening at the Association for the Advancement of Artificial Intelligence.
  • Keep tabs on the great work from San Francisco-based OpenAI, a non-profit research firm dedicated to helping us build safe AI systems that are available to everyone.
  • Attend a conference—like Information Development World—to determine the best route forward, learn best practices, and to gain insight from others who have made the mistakes you’ll want to avoid.

The Future of Work Looks Bright

There’s good reason to be concerned about the future of work—and your place in the labor pool. But, let’s not assume the worst. Many tasks currently performed by humans will be automated. But that’s no reason to believe that the future of work is dim.

“Historically, productivity growth has been associated with rising living standards for the bulk of the working population. There is no technological reason that this will not be the case in the future,” says Dean Baker of the Center for Economic and Policy Research. “There is no obvious basis for thinking that future technologies will be more harmful to ordinary workers than the technological developments of the prior seventy years.”

The future of work looks bright. Make sure you’re ready.

The post Artificial Intelligence: How To Avoid Being Automated appeared first on The Content Wrangler.

Categories: DITA

Gilbane Advisor 2-15-17 — Apple and Web Standards, Gen Z, AMP links, Cognitive Overhead

Next-generation 3D Graphics on the Web Thanks to Benedict Evans for noticing this. From his newsletter: Apple proposed web standards that give web pages access to the smartphone (or PC) GPU to run ‘general purpose computation’ (i.e. machine learning) as well as graphics. Very surprising – I’d have expected this from Google or Facebook rather than […]

This post originally published on https://gilbane.com

Categories: DITA

How to add a DITA specialization to oXygen Editor or oXygen Author

Accelerated Authoring - Mon, 2017-02-06 13:22

Congratulations. You just got your brand new DITA specialization, and you want to use it. Follow these instructions to integrate your DITA specialization into oXygen Editor or oXygen Author.

These instructions apply when the DITA specialization is available as a DITA Open Toolkit plugin.

  • Copy the plugin to the location of the DITA OT you are using (by default DITA_OT_DIR\plugins).
    • On my Windows device, the full path was:
      C:\Program Files (x86)\Oxygen XML Editor 18\frameworks\dita\DITA-OT\plugins\com.myspecialization.doctypes\dtd
  • Start oXygen as an administrator.
  • Run the DITA OT integrator to integrate the plugin.
    • In the Transformation Scenarios view there is a predefined scenario called Run DITA OT Integrator that can be used for this.
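The copy step above can also be scripted. Here is a minimal Python sketch; the plugin name and DITA-OT location in the example call are assumptions you should adjust to your own oXygen installation:

```python
import shutil
from pathlib import Path

def install_plugin(plugin_src: Path, dita_ot_dir: Path) -> Path:
    """Copy a DITA-OT plugin directory into the toolkit's plugins folder."""
    dest = dita_ot_dir / "plugins" / plugin_src.name
    shutil.copytree(plugin_src, dest)
    return dest

# Hypothetical example -- point these at your own plugin and oXygen install:
# install_plugin(
#     Path("com.myspecialization.doctypes"),
#     Path(r"C:\Program Files (x86)\Oxygen XML Editor 18\frameworks\dita\DITA-OT"),
# )
```

After copying, you still need to run the DITA OT integrator (step 3 above) so oXygen picks up the new document types.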

Additional information is available on the oXygen help page.

 

The post How to add a DITA specialization to oXygen Editor or oXygen Author appeared first on Method M - The Method is in the Metrics.

Categories: DITA

RSuite in Action!

Really Strategies - Thu, 2017-02-02 17:22

Interested in RSuite?

If you’re curious about RSuite but have never seen it in action, we invite you to join one of our bi-weekly webinars happening every other Tuesday at 10AM ET.  Our Director of Sales Engineering, Rob Smilowitz, provides an engaging 30-minute demo to highlight RSuite’s features and capabilities.

Categories: DITA

Content Migration Planning: Unpacking The Boxes

The Content Wrangler - Tue, 2017-01-31 23:11

After ten years of accumulating stuff, I am in the middle of packing up our belongings and moving them out before we begin a major house renovation. It’s a pain that I am experiencing as I write this article. But, while moving a home can be a hassle, it isn’t as complicated as a digital content migration. That said, both types of moves can be made easier while also minimizing unexpected surprises—with careful content migration planning.

Step 1: Start With A Discussion About Quality

Two important facts about content migration planning:

  1. Each migration is unique. Size is an obvious difference—the bigger the project, the greater the effort required. From a migration perspective, the variance between content types dramatically impacts the amount of effort required.
  2. There are two ways to quickly—and radically—decrease the amount of effort required during a content migration: move some of the content (deleting unneeded content or deciding not to migrate some existing content) or move it all (then clean it up later). 

Of course, ease of migration shouldn’t be our only concern. Content migration planning should focus on producing results that help us meet our business objectives. Starting with the quality discussion prevents us from simply doing whatever is easiest.

One of the reasons to start content migration planning early is that doing so allows us to determine the appropriate amount of work (and rework) the migration will require. By thinking through the effort—with our desired quality goal in mind—we may decide on an approach that requires less effort, but helps us achieve our aims. It’s important to recognize that the decisions we make about quality will impact other aspects of our content migration project.

Moving Content Isn’t The Same As Moving Your Personal Belongings

Boxing up—and moving—personal belongings to a new house can be frustrating, but it isn’t all that difficult. Consider what’s involved in moving a box of books from the living room in one house to the living room of another.

Put your books in boxes. Label them (living room books). Seal them up. Arrange to have them delivered to the new house. Once the box is delivered, put it in the living room. Open and unpack it. Place the books where they look best. It’s pretty straightforward.

Content is different. It requires an additional level of effort. We have to determine if the legacy content we want to move from our old website will fit nicely into our new one.

Let’s say we have a set of product descriptions on our legacy website, and we want to use them on our new site. To make them work in their new home, we may need to rewrite and restructure them. To make that happen, we’ll need to unpack the boxes (make changes to the content) during migration.

 What to unpack during migration

Step 2: Boxing Up Your Content

Boxing up content helps you organize and plan a migration. The goal is to group content that will be handled similarly into the same box. To do so, we need to know:

  1. What content goes into each box (preferably with rules)
  2. What we are going to do with it (the disposition)
  3. Who will move it

For example:

  1. The biographies (a discrete content type in our current content management system)
  2. Will be rewritten (the action that will be taken)
  3. By “us” (the owners of the content displayed in that section of the site)
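For planning purposes, a box can be captured as a tiny record with exactly those three pieces of information. Here is a hypothetical sketch in Python; the field names and example values are illustrative, not from any particular CMS:

```python
from dataclasses import dataclass

@dataclass
class ContentBox:
    rule: str         # what content goes in the box (preferably a testable rule)
    disposition: str  # what will be done with it
    owner: str        # who will move it

# The biographies example from above, expressed as one box:
biographies = ContentBox(
    rule="content_type == 'biography'",
    disposition="rewrite",
    owner="section owners",
)
```

A spreadsheet row with the same three columns works just as well; the point is that every box records its rule, its disposition, and its owner.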

 Labeling the boxes

We box up content during a migration primarily to help us with planning. But, boxing provides additional value during a content migration. For instance, if we decide not to migrate some content to the new system, we can put that content in a virtual box and forget about it until later. By putting it into a box, we can examine the implications of our decisions on the new site in an organized fashion. Are there hyperlinks in the content we are migrating that point to legacy content we don’t plan to migrate? How will we handle those issues?

Organizing content into virtual boxes also helps us assign and prioritize tasks, and streamline overarching migration efforts. We can assign virtual boxes of content to teams dedicated to unpacking that content and acting upon it.

Step 3: Plan To Unpack Content During Migration

How much content we can box up and tackle at once depends on a variety of factors. Sometimes, we can handle an entire box of content without much difficulty. For instance, content that supports a product or service we no longer offer for sale. When we don’t plan to migrate content, we can put it in a box and toss it out. Or when we know that a type of content will remain in the legacy system (and therefore, is not part of the migration effort), we can box it up and ignore it.

More often than not, however, content migration projects involve making changes to legacy content, so it will fit nicely into the new environment. In these cases, we need to open the boxes to make needed changes.

The two common types of changes we make to legacy content during content migration are:

  • Technical changes
  • Editorial changes

Technical changes and editorial changes often overlap (e.g., will we migrate all content automatically, or will some manual intervention be required?), while some changes belong to only one of those categories.

Moving content to the new system is a technical issue. Rewriting product content is an editorial concern.

See also: What can be automated and what must be manual?

Other types of changes include:

  • structuring unstructured content
  • breaking content into granular, semantically rich chunks
  • mapping discrete pieces of content to fields in a database
  • applying metadata schemes
  • resizing images
  • turning unmanaged assets into managed assets
  • changing URL patterns and links
  • stripping out extraneous information
  • cleaning up HTML code
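Several of these technical changes lend themselves to automation. As a rough illustration (the URL pattern and markup below are invented, not from any real site), a script might rewrite legacy link patterns and strip presentational tags in one pass:

```python
import re

def migrate_html(html: str) -> str:
    """Apply two automatable technical changes to a chunk of legacy HTML."""
    # Change URL patterns: /old-site/pages/foo.html -> /products/foo
    html = re.sub(r'href="/old-site/pages/(\w+)\.html"',
                  r'href="/products/\1"', html)
    # Strip extraneous presentational markup left behind by the old editor.
    html = re.sub(r"</?font[^>]*>", "", html)
    return html

print(migrate_html('<a href="/old-site/pages/widget.html">'
                   '<font color="red">Widget</font></a>'))
# -> <a href="/products/widget">Widget</a>
```

Editorial changes, by contrast, rarely reduce to a regular expression; that distinction is a good first cut when deciding what must be handled manually.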

Step 4: Estimate, Not For Perfection, But For Higher Impact

Migration projects are never perfect. Neither are their plans, despite the well-intentioned efforts of planners. Migrations are complicated. They involve moving parts and a host of dependencies. That said, there are things we can do to minimize surprises and increase quality.

Assume we started our content migration project with a discussion about the quality levels we need to achieve our business goals. Once we have our quality goals in mind, we need to estimate the level of effort—and resources—required.

To create an estimate, consider the six steps of handling content:

  1. Sort (how much effort is needed to put content into appropriate boxes)
  2. Place (where will that content be represented in the sitemap)
  3. Edit (what editorial changes are required of legacy content)
  4. Move/Transform (actually moving into the new system)
  5. Enhance (applying any tags or metadata)
  6. Quality Assurance (QA testing)

 

 Content Handling Process
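A back-of-the-envelope estimate can simply sum the hours each box needs across the six handling steps. This Python sketch uses made-up numbers; substitute your own per-box, per-step figures:

```python
STEPS = ["sort", "place", "edit", "move", "enhance", "qa"]

def estimate(box_hours):
    """Total estimated hours per box, summed across the six handling steps."""
    return {box: sum(hours.get(step, 0) for step in STEPS)
            for box, hours in box_hours.items()}

# Placeholder figures for two hypothetical boxes of content:
plan = {
    "product descriptions": {"sort": 2, "place": 1, "edit": 40,
                             "move": 4, "enhance": 8, "qa": 6},
    "blog posts":           {"sort": 4, "place": 2, "edit": 0,
                             "move": 8, "enhance": 2, "qa": 4},
}
print(estimate(plan))  # {'product descriptions': 61, 'blog posts': 20}
```

Even a crude table like this makes the trade-offs visible: the edit-heavy box dominates the budget, which is exactly the conversation about quality the next paragraph describes.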

Because quality is a continuum rather than a binary decision, estimating this way helps drive useful conversations about an acceptable level of quality. These conversations sometimes result in acknowledging that we’ll need more resources. Other times, these discussions will help us determine if our quality requirements are set too high.

Dealing With Imperfection

Content migrations are imperfect by nature. Careful planning is required to migrate content efficiently and effectively. Planning helps us prioritize our efforts and focus on the most important parts of the migration. For example, it may be more important to rewrite 50 product descriptions than to optimize 1,000 existing blog posts. Prioritizing allows us to focus on the most important tasks—the ones with the biggest return on investment—first.

Many content migration projects focus on tackling the low-hanging fruit—the easy stuff—first, but that’s a mistake. It’s better to focus on how we will achieve our overall vision, making planning and prioritization (the harder items) the focus of our work; doing so allows us to prepare for unexpected surprises. When obstacles present themselves, we’ll focus on solving high-value content challenges instead of wasting time addressing problems with less-valuable content.

Another way of dealing with imperfection is to start migrating—using actual content—as early as possible. Conduct a formal content inventory designed to capture details about our content—what we have, what we don’t, where it lives—in order to help us make sound decisions. Although content creators and managers have deep insight into the content they produce, they are unlikely to be able to document content dependencies and relationships without the details provided in an inventory.

Get Boxing (And Be Ready To Unbox)!

Don’t fall into the trap of assuming it’s necessary to process and migrate each piece of content one-by-one. And, don’t assume there’s an ‘automagic’ way to migrate content without some manual intervention. Think about the goal of the effort. Then figure out the best way to get there.

And finally, consider how to “box” content, grouping it into batches that can be treated similarly during the move. When planning, look at how much effort is needed to move each box—and determine whether the effort is worth the benefit. Then determine the best way to “unbox” content when it needs improvement or alteration before migration.

Learn More About Content Migration

Plan to attend this free, one-hour webinar entitled Boxing and Unpacking Content for a Digital Migration, March 7, 2017, at 12 Noon PT as part of The Content Wrangler’s Virtual Summit on Advanced Practices in Content Management. Come prepared to learn — and to ask questions.

The post Content Migration Planning: Unpacking The Boxes appeared first on The Content Wrangler.

Categories: DITA

Gilbane Advisor 1-27-17 — Apple Facebook dance, platform battles

The Great Unbundling We’ve seen the different ways the internet unbundled print and music. TV is evolving, or at least unbundling, more slowly. Ben Thompson has been tracking this for some time. In his latest look he focuses on TV and how Facebook, Snapchat are contributing to its unbundling. This is not just about commercial […]

This post originally published on https://gilbane.com

Categories: DITA

Brand-name companies using SPARQL: the sparql.club

bobdc.blog - Sun, 2017-01-22 14:37
Disney! Apple! Amazon! MasterCard! Bob DuCharme http://www.snee.com/bobdc.blog
Categories: DITA

Server automation for documentation deployment

JustWriteClick - Sat, 2017-01-14 13:12

When you treat docs like code, you want to deploy that “code,” such as doc source files, so that you can see how the doc looks on a web site. I have been practicing these deployment techniques while working on OpenStack, which offers open source cloud computing services. I needed to practice to get better, and practicing was the best way to learn this type of technical problem-solving—by doing.

The way I approached the practice effort was to:

  1. Find credentials for a cloud (or two).
  2. Determine which web services to install on the cloud servers I launch there.
  3. Find deployment orchestration templates that launch the right combination of web services to make the site I wanted, deploying Ruby, Jekyll, and NGINX, using Ansible.
  4. Test, test, test. Test some more.
  5. Try out Docker locally, then finally get the cloud server working. This step took a while as I worked out the Linux user permissions needed to install a compatible version of Ruby.
  6. Set up the cloud server as a git remote, then push the website to the git remote, building the HTML and copying the files to the web server.
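Step 6 typically relies on a server-side git hook: pushing to the remote triggers a build and a copy into the web root. The sketch below writes a minimal post-receive hook; the paths, branch name, and web root are assumptions to adapt to your own server layout:

```python
import os

# A hypothetical post-receive hook: check out the pushed source into a
# work tree, build it with Jekyll, and publish the HTML to the NGINX root.
POST_RECEIVE = """#!/bin/sh
WORKTREE=/srv/site-src
WEBROOT=/usr/share/nginx/html
git --work-tree="$WORKTREE" --git-dir=/srv/site.git checkout -f master
cd "$WORKTREE" && jekyll build --destination "$WEBROOT"
"""

def write_hook(repo_dir):
    """Write the hook into a bare repo's hooks/ directory and make it executable."""
    hook = os.path.join(repo_dir, "hooks", "post-receive")
    with open(hook, "w") as f:
        f.write(POST_RECEIVE)
    os.chmod(hook, 0o755)
    return hook
```

With the hook in place, `git push server master` from your laptop rebuilds and publishes the site in one step.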


Hear me talk about my excitement trying out docs deployment in this video clip from the original on thenewstack.io. In it, I talk to Alex Williams, founder of TheNewStack.io, about my adventures at the OpenStack Summit in Barcelona.  Thanks to Alex for the permission to re-post and for asking about my latest work.

Resources

The deck is available on Slideshare.

Deploying Apps on OpenStack from Anne Gentle

The Ansible code is on GitHub.

The Jekyll theme, so-simple, is on GitHub.

The content repo is on GitHub.

This demo shows pushing the site to the git remote to update the content.

Categories: DITA

Attending DBW Next Week?

Really Strategies - Fri, 2017-01-13 16:22
Categories: DITA

Gilbane Advisor 1-5-17 — Bots, Deep Learning, Mobile Marketing

Happy New year Dear Reader! We have chosen a small number of the superabundance of end-of-year reviews and predictions to recommend, each focused on rapidly developing areas that are important for you to keep up with, even if at a high level. Topics include bots, deep learning, mobile, marketing technology, software development, and design. Bot Check-In: A […]

This post originally published on https://gilbane.com

Categories: DITA

Diversifying Content Strategy To Improve Customer Engagement

The Content Wrangler - Thu, 2017-01-05 11:59

Guest post by Laurel Nicholes and Niki Vecsei Harrold

Prior to presenting Content Potluck: Bringing Everyone to the Community Table during the Virtual Summit on Advanced Practices in Technical Communication, we authored a quick-read blog post in which we defined content potluck and outlined how to find champions in your organization to move a project forward. Today, we follow up on that effort. We provide tips for diversifying content strategy by supporting a content potluck. We also provide advice on organizing your content production and distribution efforts using a collaborative editorial calendar.

During the Summit, we asked attendees to answer questions designed to capture current practices and to spot opportunities for improvement. Tips designed to help keep both participants and stakeholders informed and engaged are included below.

Tip #1: Build An Editorial Calendar

There are many software tools available (free and paid) that you can use to build editorial calendars. Look for calendaring tools that allow you to set alerts, send reminders, and share content with others. Here are a few ideas:

  1. Calendar functionalities (Google Calendar, Atlassian wiki calendar, Outlook Shared Calendar)
  2. Spreadsheets (Excel, Google Sheets, Smartsheet)
  3. Intranet (Jive, Atlassian, Sharepoint)

Regardless of the software tools you choose, how you use your calendar makes all the difference. These best practices can be leveraged to help your team recognize instantly what content types are missing, or where content gaps exist:

  • Assign content an owner
  • Set alerts for specific content tasks, including deliverable dates
  • Code content types
    • blogs
    • tweets
    • videos
    • articles
    • interviews (like “Ask Me Anything” question and answer sessions)
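Whatever tool holds the calendar, each entry carries the same few fields. This hypothetical Python sketch (the field names and type codes are invented) also shows how coded content types make gaps easy to spot:

```python
from datetime import date

# Illustrative content-type codes for a shared editorial calendar.
CONTENT_TYPES = {"B": "blog", "T": "tweet", "V": "video",
                 "A": "article", "I": "interview"}

entry = {
    "title": "Single-question survey results",
    "type": "B",                  # coded content type
    "owner": "Laurel",            # every item gets an owner
    "due": date(2017, 2, 14),     # deliverable date, drives the alert
    "channel": "company blog",
}

def gaps(entries):
    """Content types with nothing scheduled -- missing content at a glance."""
    scheduled = {e["type"] for e in entries}
    return sorted(CONTENT_TYPES[c] for c in CONTENT_TYPES if c not in scheduled)

print(gaps([entry]))  # ['article', 'interview', 'tweet', 'video']
```

The same columns map directly onto a spreadsheet or wiki calendar if that is where your team already works.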

Tip #2: Plan Topical Campaigns

Use your editorial calendar to orchestrate the distribution of topic-based content. Make sure to target a mix of channels. This is a good chance to re-use and re-distribute existing content assets. Think beyond product launches—customers want to engage consistently with your brand. Be creative. For example, you could provide content to help customers solve a common problem. Wherever you decide to publish, drive traffic consistently to the same content asset. Doing so will help you grow your audience and attract repeat visitors.

Tip #3: Engage With Your Audience

Asking questions of prospects and customers—and listening to their answers—across multiple feedback channels is an excellent way to identify content ideas. Documenting your findings helps your champions understand the impact of content on those you serve. If everyone on your team commits to the potluck, they will want to share feedback and metrics with their peers and managers.

Advice: Make sure to share community and content performance reports in an easy-to-understand format at regular intervals. Consider having direct feedback sessions between content creators and your audiences. Low-tech feedback loops can include:

  1. One-on-one calls
  2. “Ask Me Anything” question and answer sessions
  3. Polling and surveying through social and community channels

During the Virtual Summit on Advanced Practices in Technical Communication, we asked our audience how they gather feedback on their content today.

Surveys are the most inexpensive and most-used tools for collecting feedback. Surveys don’t have to be long and involved to provide utility. Single-question surveys can yield significant insight, deliver great response rates, and help customers provide feedback quickly with minimal effort.

Tip #4: Reward Your Champions

Recognizing your champions keeps them engaged and loyal to the potluck. Some of the easiest ways to recognize people for their hard work:

  1. Provide bonuses or management-by-objectives rewards
  2. Send thank-you notes to champions (and their bosses)
  3. Share analytics reports (demonstrating the impact of their work)
  4. Ask them to lead conversations in the community (at both live events and virtual events like webinars)
  5. Give them a special virtual badge—or ranking in your community—to make sure their profile stands out and the audience recognizes their expertise

Below we answer the questions submitted by attendees during the live webinar.

Questions From The Audience:

Q1: Do you have any thoughts for companies that are extremely limiting with what goes out to customers? We don’t have access to social media and only publish behind a firewall.

A1: We understand. When we started our project, we had to break down barriers. We too were publishing behind a firewall, and had to lobby executive leadership for permission to build our Twitter and LinkedIn presence. Your first step is to effect change.

We achieved significant impact—and broke down barriers—by doing some basic market research designed to capture where our prospects and customers were talking about our brand on the internet. Being a technology company, we found that a lot of conversations were happening on Quora, Spiceworks, and Github. Our prospects and customers were exchanging information and giving advice without us. Much of the information about our firm was incorrect or out of date.

We also did some social media research and queried Twitter conversations to capture how many non-marketing conversations were happening around our brand or product name. We found a staggering ratio between neutral, positive, and negative conversations. Presenting our findings to executives caused them to realize that we could no longer afford to ignore these outside-the-firewall discussions.

We also became active contributors to our company website. We got permission to write a weekly blog that contained links to content behind the firewall. Blog topics varied; they could promote an upcoming launch, address customer feedback, or answer a common question. By blogging links to our content, search crawlers could pick up the links and our SEO improved.

Q2: Great presentation! I love the Content Potluck model. Can you share how you were able to get executive sponsorship that led to even representation and participation across the organization?

A2: We started the project by listening to the pain points our executives were having. Product management wanted to grow their thought leadership position and gather customers' ideas for new features. Training wanted to market their classes as well as provide free training videos and needed a platform. Customer service wanted a home for their forums and wanted to reduce the cost of support. Engineering wanted more people to read the technical content prior to escalating problems to the support staff.

We did some grassroots work, too. We recruited people interested in creating content for the community, including podcasts, videos, and blogs, as well as volunteers to monitor the forum. These weren't always managers; frequently they were knowledge workers starting out in their careers, people with a passion for the technology who wanted to grow their professional reputation as subject matter experts.

We matched volunteers with problems to solve, and created a presentation to explain the content potluck idea. Then we took our show on the road. We tailored the presentation to executive problems, rather than asking them all to one general presentation. We believe that this is what made us successful.

Once we got buy-in for our effort, we had to ensure we could deliver results. To demonstrate our successes, we shared metrics showing membership growth and increased content consumption. Ultimately, that’s what kept executive support growing.

Q3: When you ask so many different types of people to contribute content, how do you maintain the quality? How do you prevent it from feeling like a cacophony of inconsistent voices?

A3: The audience who engages with you in a community space craves multiple voices. Our experience shows that having different voices and styles of writing resonates better and lends authenticity in the customer’s eye over the one (sterile) tone of voice approach of “official documentation.” The variety of voices does not significantly impact quality. And, only subject matter experts get to craft content. Anything that is long format (not just an answer to a question) gets vetted for the first few publications by a central figure such as a content strategist or community manager. Once that contributor shows they understand the guardrails around quality content, they are allowed to post without moderation.

It’s important to note that champions won’t immediately start writing content for your community. Most content potlucks start with one or two contributors. Over time, contributions and contributors will increase. Eventually, your champions will master the base requirements of content creation—authenticity and quality—and will coach other team members with similar expertise.

Q4: “Ask Me Anything” discussions—we call them Office Hours. This is one hour per month where clients can connect with the team and ask anything.

A4: Office Hours or “Ask Me Anything” sessions are a great way to uncover the content needs of your prospects, customers, and other stakeholders. Try them out in different channels and formats. Conference calls, Google Hangouts, web discussions, Facebook Live feeds, Twitter chats, and question and answer sessions focused on the needs of your community not only provide excellent ideas for content potluck projects, but they also demonstrate that you value your audience. Hosting such events—and acting on the knowledge gained—tells your audience that you are looking for ways to improve their satisfaction.

But don’t stop there. Make sure to involve internal teams in these discussions. Doing so attracts new and different voices to the conversation. Always provide a recording or transcript, and hold the subject matter experts publicly accountable for delivering on their promises. It should go without saying, but we’ll say it here: don’t focus only on the good ideas. When you listen, hear what is being said. Always follow up on complaints.

If you have additional questions, leave a comment on this blog post (or find us on Twitter under Niki Vecsei and Laurel Nicholes). We would love to hear from you.

The post Diversifying Content Strategy To Improve Customer Engagement appeared first on The Content Wrangler.

Categories: DITA

Managing Enterprise Content: 12 Lessons From 2016

The Content Wrangler - Tue, 2017-01-03 17:57

Drawing from 20+ years of experience wrangling content, Robert Norris presents us with a twelve-part series on managing enterprise content. His articles (some of the most popular posts of 2016) take on the complex topic of enterprise content strategy from a heuristic angle, bypassing academic approaches for more pragmatic solutions designed for immediacy and ease.

This article serves as a summary of Norris’ outstanding series, inviting you to explore the rich and unique insights contained in his work.

Managing Enterprise Content: 12 Lessons

1. Think Like a Librarian


Norris’ first post, Think Like a Librarian, elaborates on the theme of agility in design to create an easily accessible and user-responsive knowledge-base. Summoning the stereotypical figure of the “librarian” as a stand-in for “knowledge manager” or “content troubleshooter,” Norris elaborates on five principles of librarianship that can be used to enhance knowledge base experiences.

2. Building Problem-Solving Toolkits


Norris’ second installment tackles the issue of quality control for content curators and content technologists. For the content curator, user advocacy is a critical aspect of making knowledge-base collections more intuitive and streamlined for the end user. The key to achieving this is to consider not just audience needs but, more importantly, audience competency levels. As audience expectations will differ across a broad spectrum of competency levels, it is important to meet those expectations by making content appropriate to the user’s knowledge level.

The second part of the article, aimed at technologists, provides a few tips for decluttering and recombining content resources to create valuable collections. Not every archived document will have stand-alone value to users. And a document’s internal (company) value will erode when it exists among duplicates, particularly if the duplicates contain slight variations. By alleviating duplication problems and finding new ways to recombine documents in response to user needs, technologists may be able to envision new product possibilities from existing archives.

3. The Curse of Elegance


Norris’ third installment addresses the “paradoxical challenges” facing every content designer: the more elegant the design, the less noticeable it becomes; almost like a great film score that is “felt” but never “heard.” Designing for functionality can either make you a target for complaints, if your design is faulty, or an “unsung hero” if your design is elegantly crafted. Yet developments in content design cannot be cultivated without appropriate feedback, one primary component being “appreciation.” Customer feedback is essential for assessing the “effect” of a given design–the UX of functionality. Feedback from colleagues is also an essential factor that helps inform and shape the mechanics behind UX.

Ultimately, and to add yet another paradox, content design requires heightened noticeability to achieve its optimal state of “invisible” functionality.

4. (Im)-Proper Care and Feeding of Subject Matter Experts


Norris’ humorous title—(Im)-Proper Care and Feeding of Subject Matter Experts——refers to the notion that subject matter experts (SME’s) are a very different kind of animal, proverbially speaking, in the operational realm of content and communications. Though most people are aware that brilliant subject matter experts don’t always make great teachers or communicators (and vice versa), not everyone has the skills to effectively approach and collaborate with SME’s when their input is needed. Norris lays out the main challenges in this relationship, providing practical solutions for each one.

5. Hey! Where’s Our Content?


If the word “strategy” denotes a comprehensive plan of action, entailing a wide range and long-term perspective, then the notions of myopic focus and recency bias seem antithetical to it. Yet, this is what many content managers often face in companies where C-level executives view content as “deliverables” rather than as active informational networks and relays.

As Norris points out, many professionals outside of the content field do not fully understand the scope and principles comprising content strategy as a discipline. Contrary to what many organizational managers may think, content strategy goes beyond the production of marketing and sales content. Content strategy also cannot be restricted to the limits of content production (deliverables). This fifth installment discusses the (not so) unforeseen negative consequences of this misunderstanding.

6. Devising a Content Strategy to Serve Every Audience


At the opening of his sixth installment, Norris presents us with a definition of strategy that is in alignment with most current notions surrounding the concept. The term “strategy” makes for an interesting comparative distinction when viewed etymologically from the military context in which it had originated.

In a military context, a direct offensive is only as strong as its means of support, the latter posing as a critical vulnerability to be targeted by an opposing force. With the aim of reducing vulnerability to one’s side, a campaign leader cannot afford to be so myopic as to focus solely on the tip of the spear. Norris points out, based on his experience, that many organizations have a tendency to work contrary to this basic principle.

According to Norris, many organizations place lopsided emphases on marketing and sales efforts with regard to recognition and resource allocation. Such biases affect the quality of content operations, as focus shifts from the “enterprise” level to its subsets (i.e. marketing and sales content). Contrary to this common tendency, Norris reinforces the notion that “enterprise content” encompasses, obviously, the entire “enterprise,” what Norris calls “every audience,” or every internal and external user. To this end, Norris provides an exceptionally clear framework for constructing a content strategy that is gap-proof and all-encompassing.

7. Your Content Strategy: Is It Feasible?


Conducting a feasibility study is an effective way to assess the practicality of a method or plan. When developing an enterprise content strategy involving multiple individuals and departments—all of whom have different perspectives, work methods and goals—a feasibility study is necessary to see how the workflow puzzle can be collaboratively assembled.

One effective way to conduct such a study is to simulate a real-life scenario. Simulations can help teams collaboratively construct project roles and expectations, and shape contingency responses based on individual capabilities and expertise. Norris’ seventh installment proposes a few tips for conducting an organization-wide feasibility study to help test and shape the real-world implementation of a content strategy.

8. Best Practices for Fostering Support from Stakeholders


Fostering stakeholder support is critical to any organizational undertaking. Without “buy-in” from the managerial and executive levels, a project may not get the opportunity to leave the runway. Similar to the previous installment, Best Practices discusses the diverse and potentially dissimilar interests, goals, and personalities among stakeholders.

It’s a complicated scenario: sharp differences can exist among stakeholders’ interests despite their general alignment with larger organizational goals. A key solution, which Norris explores in detail, is to study the stakeholders themselves, an approach similar to that of a feasibility study, before selling ideas upstream.

9. A Swing and a Miss: Faulty Customer Support Metrics


Norris’ ninth installment puts the spotlight on support center operations and the role they play in shaping the overall quality of products and services. He advances two general propositions. First, an individual user’s experience is a correlated stand-in for mass-user experience. Second, the support center should be viewed not only as the spear-tip of customer engagement but also as a critical player in quality evaluation.

While leadership tends to focus on the big-picture metrics surveying conditions on a larger scale, the key to quality control is in the discrete metrics that are often overlooked.

10. Building a Robust Content Quality System


Managing product and service quality is standard procedure for most businesses. But the importance of content quality management is a matter that’s often siloed, if not neglected altogether. In the absence of more integrated procedures for content quality control, unmonitored databases can leave businesses vulnerable to several unforeseen risks. Norris’ tenth installment discusses the risks posed by “orphaned” documents—ownerless and often outdated documents floating in a database. The result of customers accessing such documents can range, depending on context, from minor errors to severe harm.

Active ownership is key to content quality control. It is also the most effective way to mitigate risks posed by inactive content. Norris presents a quality control framework designed to help you ensure content quality and prevent content mismanagement snafus.

11. Developing a Unified Content Strategy: Learning From the Masters


Although marketing content is designed to drive growth, support content plays a critical role in maintaining customer engagement and satisfaction. Marketing content promises a particular product/service experience, while support content enhances the delivery of that experience. If the goal in “marketing” is to communicate the value of a product/service, then all organizational content can be considered an extended form of “marketing content” addressing various touch points of customer experience over time.

As company executives are lopsidedly biased toward growth initiatives, the marketing side of content operations tends to receive more attention and resources than non-marketing counterparts. But given such organizational focus, marketing teams tend to be better equipped, experienced in collaborative settings, and adept in operating across multiple channels of communication. In his eleventh installment, Norris explores ways to tap marketing’s capacity to create a well-balanced content strategy across the organization.

12. Managing Counterproductive Organizational Expectations


Most of the articles up to this point discuss the challenges of content managers operating at the periphery of executive focus. It’s what Norris calls the curse of elegance. But what happens when a successful content operation attracts the full attention and scrutiny of executive management?

In this final article in the series, Norris discusses the burden of success—misguided expectations, executive micromanagement, etc.—along with a few contingency ideas to help mitigate the problems that come with being successful.

 

The post Managing Enterprise Content: 12 Lessons From 2016 appeared first on The Content Wrangler.

Categories: DITA

A modern neural network in 11 lines of Python

bobdc.blog - Thu, 2016-12-22 12:52
And a great learning tool for understanding neural nets. Bob DuCharme http://www.snee.com/bobdc.blog
Categories: DITA

Overcoming Objections to Intelligent Content

The Content Wrangler - Sat, 2016-12-10 21:33

We get a lot of questions about intelligent content. Of course, sometimes, rather than getting questions, we get “told things”—stories taken out of context or statements repeated by others without factual support. In this post, we examine and debunk some of the most common objections to intelligent content.

Some of these objections arise from common misunderstandings about factors such as cost (We need how much money?), purpose (Why do we need to change how we work, anyway?), perceived limitations (That’s cool, but it will never work for us…), or technology (Why can’t we just use the tools we have already?). Other objections are understandable concerns related to change: worries about losing writers, uneasiness about adjusting job descriptions, and so forth.

I’m in marketing; it’s just for technical content

I’m in marketing—there’s no way this could possibly work for us. This is one of the most common objections and it’s fairly easy to dispel.

Marketing content is different from technical or explanatory content in some ways, but at its heart it shares the same need for accuracy, verifiability, and quality. It must be created quickly to meet the needs of the market. It must be interesting, relevant, and engaging. And we must be able to update it rapidly and inexpensively.

Traditionally, marketing content was agile, while technical content was not. Marketing content was highly visual, while technical content was not. Marketing content was engaging and interesting, while technical content was not. Technical content needed sophisticated technology, marketing content did not. See a trend emerging?

In most cases, it is beneficial for technical content to be aligned with marketing content because all content affects the way prospects and existing customers feel about our brand. Therefore, all content is marketing content, regardless of who creates it.

We need intelligent content to unify our content, making it possible to provide a consistent experience across all touch points with our content.

There’s no good business reason for creating inconsistent content that damages brand.

In today’s global, hyper-connected world, there’s no reason to create inconsistent content that damages brand, ruins the customer experience, and wastes finite corporate resources. Marketing must learn to leverage the techniques some forward-thinking technical communication departments have already mastered, techniques that can reduce or eliminate unnecessarily cumbersome, time-consuming, and expensive manual processes; automate content creation, formatting, and publishing tasks; systematically reuse content to meet customer needs; and publish content to multiple channels, simultaneously.

Intelligent content allows us to do all of these things — and everything else we’ve always done — more efficiently and effectively, affording us the luxury of using the resources saved to innovate.

For instance, intelligent content allows us to create content, and, with the push of a button, republish it in different forms. We can easily re-skin content and know that we’re still using the correct/approved content. With intelligent content, we can repurpose marketing content intended for one medium in a totally different medium, depending on the metadata associated with the content and our output requirements. With intelligent content we can reuse, repurpose, and customize our information for different outputs and markets faster and more accurately than ever before.
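The repurposing described above hinges on metadata. Here is a minimal sketch, with entirely hypothetical component names and fields, of how publishing might select the same source components for different output channels and audiences based on their metadata:

```python
# Intelligent content as components carrying metadata; assembly filters
# components whose metadata matches the target channel and audience.
# Field names and values are illustrative assumptions.

components = [
    {"id": "c1", "body": "Feature overview", "channels": {"web", "pdf"}, "audience": "customer"},
    {"id": "c2", "body": "Install steps", "channels": {"pdf"}, "audience": "customer"},
    {"id": "c3", "body": "Internal pricing", "channels": {"intranet"}, "audience": "internal"},
]

def assemble(components, channel, audience):
    """Reuse the same source components for any output that matches."""
    return [c["id"] for c in components
            if channel in c["channels"] and c["audience"] == audience]

print(assemble(components, "pdf", "customer"))  # components for the PDF manual
print(assemble(components, "web", "customer"))  # a different subset for the web
```

The point of the sketch is that the content itself is written once; only the metadata query changes per output, which is what makes push-button republishing possible.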

Just because content is well-written, displayed in an innovative format, and published in some new and super-cool way, does not make it intelligent content. These things may be attractive, interactive, or amazingly different, but those characteristics don’t make content intelligent. Intelligent content relies on repeatable methods, content standards, automated processes, and software technology designed to help us create, manage, and deliver relevant, personalized content in more efficient and effective ways than traditional publishing approaches allow.

We can’t review content in modules; we need content in context

When first encountering the concept of intelligent content, many reviewers, editors, and proofreaders are concerned about the difficulty of properly reviewing content that’s broken into reusable components. While they agree that creating modular content for reuse is interesting (and likely advantageous), they argue that reviewing content in sections any smaller than a document is simply impossible.

The actual content review method employed depends on the software tools selected and the implementation details, but in general, during a review cycle, the content under review is displayed in context. Comments made by a reviewer are attached to the content component being reviewed. This same technique is used to provide context to translators.

That said, intelligent content presented for in-context review may not be fully formatted — layout will be minimal. That’s not a problem, because we want to focus on improving our content, not the appearance of our content (important though that will be in the final output).

All content is marketing content, no matter who creates it.

Instead, an approximation of the final content design (images, charts, graphics, text, fonts, colors) will be presented to reviewers during in-context review. For instance, if the text is supposed to be wrapped around an image in the final output, that’s unlikely to be visible to the reviewer. For reviewers used to working in desktop publishing environments, not being able to see the final design as they edit and review may introduce some challenges at first, but they usually adjust to this difference as they grow familiar with the approach.

Using visual design as a basis for intelligent content review is a bad practice. It’s better to review the content with light styling. That way the content is less likely to be tweaked for a specific appearance by well-intentioned reviewers and more likely to be reusable in multiple contexts.

It’s only about reuse

Although intelligent content supports efficient reuse, that’s not the only reason to implement the approach. There are other benefits as well, but even if intelligent content were only about reuse, which it isn’t, it would still offer value.

How much content is reusable? We find the average organization that adopts intelligent content enjoys at least 25% content reuse. Depending on the organization and the content, that percentage can be much higher. It can be lower, but it’s rarely zero. Managing even small levels of content reuse intelligently is much easier than copy-and-paste.

Content reuse can pay even higher dividends in organizations that translate content into multiple languages. When we reuse content components and their translations, our return on investment (ROI) skyrockets. Some organizations report that their biggest savings come from reduced translation costs attributable to content reuse.
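The translation savings can be roughed out with simple arithmetic. The sketch below uses the 25% reuse figure from the text; the word count, per-word rate, and language count are hypothetical assumptions, not data from any real project:

```python
# Back-of-the-envelope estimate of translation savings from content reuse.
total_words = 100_000   # words in the corpus (assumed)
reuse_rate = 0.25       # the average reuse level cited above
rate_per_word = 0.20    # translation cost per word, USD (assumed)
languages = 10          # target languages (assumed)

# Without reuse, every word is translated into every language.
cost_without_reuse = total_words * rate_per_word * languages

# With reuse, reused components and their translations are shared,
# so only the unique words incur translation cost per language.
unique_words = total_words * (1 - reuse_rate)
cost_with_reuse = unique_words * rate_per_word * languages

savings = cost_without_reuse - cost_with_reuse
print(f"savings: ${savings:,.0f}")
```

Under these assumptions the savings track the reuse rate directly: 25% reuse cuts roughly 25% of the translation bill, and the effect compounds as more languages are added.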

It takes too long to implement

It does take extra time to use intelligent content correctly in the early phases, but that’s because we have to reengineer our old, outdated processes; adopt new roles, responsibilities, and tools; and learn how to work differently. Moving to intelligent content transforms the way we produce our content, and transformation means big changes involving pushback, uncertainty, fear, rumors, and temporary setbacks the first time through.

The key to success is to correctly identify the scope of the project and clearly understand our objectives. As an intelligent content project rolls out, we will identify new participants and departments who will want to get in on the action. While it’s important to encourage widespread participation from all divisions of the organization, we must also avoid expanding project scope. We must address immediate needs first and then, once our systems are up and running, invite other departments to participate.

Take small steps, such as structuring content, first before tackling all the automation, workflow, writing style, and technology changes. Adopting structured, semantically rich content alone will make our content much more effective.

It’s too hard to do

Introducing anything new can be a challenge. When creating intelligent content we have to change from a page-based way of thinking about content, to a component-based content paradigm. This is often the biggest challenge, especially for experienced content contributors.

Authors take pride in their work. Traditionally, this has meant that an individual author was responsible for a specific set of deliverables. With intelligent content, authors exchange individual control of deliverables for the flexibility of creating shareable, reusable content products that they develop collaboratively. Rather than being responsible for one content product (or suite of content products) they become responsible for a much broader range of content.

Interestingly, some technical illustrators and graphic artists have adopted component-based content approaches. They’ve learned to easily create modular components and reuse them where needed.

Software developers have been using intelligent content principles, especially component content reuse, since the 1990s. They overcame the “It’s too hard to do!” objection by recognizing that the benefits of their approach (they called it object-oriented programming) far outweighed the up-front work required.
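The object-oriented analogy above can be made concrete. In this illustrative sketch (hypothetical names only), a single approved component is written once and reused by every deliverable that needs it, just as a shared class is reused by the code that depends on it:

```python
class SafetyNotice:
    """A reusable component: written and approved once, used everywhere."""
    def render(self):
        return "WARNING: Disconnect power before servicing."

class WebHelp:
    """A deliverable assembled from shared components."""
    def __init__(self, components):
        self.components = components

    def publish(self):
        return "\n".join(c.render() for c in self.components)

class PdfManual(WebHelp):
    """Inherits the assembly logic; reuses the very same components."""
    pass

notice = SafetyNotice()
# Both outputs carry the identical approved text with no copy-and-paste.
print(WebHelp([notice]).publish() == PdfManual([notice]).publish())
```

As with object-oriented code, the up-front discipline of defining shared components pays off every time a deliverable reuses one instead of duplicating it.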

It’s not about content, we just need new technology

Actually, no. We need some new technology (software) to implement all but the smallest intelligent content project, but that’s not our biggest issue. Not by a long shot.

When we move to intelligent content, the biggest challenge isn’t software. After all, software doesn’t produce intelligent content; we produce it. Software helps us plan, create, manage, and deliver our content, and it helps us do these things efficiently and effectively. Software supports our efforts, but it shouldn’t be our focus.

Instead, we must focus on moving to a modular, component-content way of thinking. With intelligent content, we create individual components of content that can be mixed and matched in different ways, for different audiences, which consume content on different output channels. To succeed, we need to collaborate and share information with more people than ever before. And, the way we work (our actual tasks) will change, as will our workflow. We may even need to change the way we define success.

Technology is the easy part; people are the challenge.

Technology is the easy part. People, and the changes we expect them to make, are often the most difficult obstacles.

Despite the people challenges, moving to intelligent content can be done! The key is to understand what we want to do, when we want to do it, and clearly communicate our goals and expectations to everyone involved in the process, including the naysayers.

Recognizing that this is a cultural change—more than a technological change—and managing expectations is critical to success.

It costs too much

There’s no question that implementing intelligent content comes with costs. For some organizations it can be cost prohibitive. Software isn’t free, and there are associated installation, implementation, and configuration costs. Oh, and there’s training. But these costs are common expenses in nearly any transformational project.

Moving to intelligent content is both a cultural and technological transformation. Cultural changes are often harder to make than technological ones, but the cost of technological change can’t be dismissed.

The best way to control these costs is to create a well-defined intelligent content strategy aimed squarely at helping us accomplish clearly-defined, achievable goals. The best method of keeping costs down is to stick to the project plan. Scope creep increases cost.

Some organizations seem to handle change better when it’s introduced in phases. When we break our intelligent content project down into smaller, more manageable chunks, not only is it easier to implement, but cost becomes less of an issue. Expenditures are smaller and the total project budget is spread out over time, making funding such initiatives easier for some, more palatable to others.

Regardless of the approach, it’s critical to have a well-researched business case that spells out the potential return on investment from each phase of an intelligent content project. Maintaining momentum is easier when we can prove that the benefits of our efforts far outweigh the costs.

Intelligent content is only for regulated industries

Intelligent content is a particularly good fit for regulated industries. But, intelligent content can provide benefits for nearly every content-heavy organization, regardless of whether it is regulated or not.

When considering a move to intelligent content, organizations in regulated industries are attracted to a variety of benefits, but the primary benefit is improved control over content. Intelligent content is far less likely to be incorrect or outdated when published.

Why? Traditional content production methods rely on outdated content review approaches in an attempt to ensure quality. The traditional approach involves completing a document and then sending it around for review by others. This approach is slow, error-prone, and doesn’t align with the agile methodologies many companies are using to drive product development. Quality is often achieved (at least in part) by ensuring that many people get a chance to review the content.

Because we’re accustomed to working on a document/page basis, many people feel that the most important review is the final review, after the document/page has been completely written and styled. So until they have seen a publication-ready document, they won’t sign off on it.

The manufacturing sector rejected this method decades ago as too costly and not very effective. What manufacturers do today is design quality in, and use fewer, better-targeted quality checks early in the development process. That doesn’t mean reviewers don’t look at the final version; they do. However, this final check should require few, if any, changes.

When we adopt intelligent content, we streamline our review process because reviewers can focus on the content without worrying about the fonts, colors, style, and look-and-feel of the final deliverables. This improved focus lowers costs and improves quality.

While regulated industries realize especially important benefits from intelligent content (quality and content control), any content-producing organization can benefit from it.

We’re too small to use intelligent content

This objection has a kernel of truth, but even so, it’s not entirely valid.

Not all intelligent content implementations are huge, expensive efforts. We can use the principles of intelligent content—small modular pieces of structured information, early reviews, separating the content from the way it looks—to improve most manual content creation processes.

Many successful projects take place in sizable writing departments in large (often global) companies. These success stories often detail the expansive—and expensive—technology solutions selected by the featured company to make their project a reality. But, intelligent content projects come in all shapes and sizes. Small organizations can also adopt the approach and see sizable benefits.

Success depends on choosing the approach, technology, and training that meet the needs of the organization. Not every company that implements intelligent content will require the same plan of attack or the same software tools. There are a variety of tools and technologies, at various price points, that can help us achieve our goals.

Intelligent content can provide even a single writer with benefits worth the effort. We don’t need an expensive, multi-seat license to a component content management system to create intelligent content. Some organizations find that small cloud- or server-based systems can empower a small department to create intelligent content. A single writer or small writing team can benefit from being able to easily locate content and know they have the latest version.

The size of the organization is not a determining factor. Don’t discount the benefits of intelligent content because of the size of the company.

We’re too big to use intelligent content

When we move to intelligent content, we don’t have to make all the changes needed at once. In fact, there are many good reasons for adopting a phased approach.

To succeed in a large organization, one of the first steps is to identify influential people in other departments who would benefit from adopting intelligent content. Design an approach that meets their needs. Start small. Identify an individual group to target. Ensure that the type of content they produce can be leveraged by other groups, and make sure the approach works for the initial group before rolling it out to others.

Also, keep in mind that not every group we target will want to join the effort. Some groups won’t adopt the new approach, even though it could prove to be useful to them. Other groups might seem like great candidates, but if their current processes are too far removed from ours, including them would be counter-productive. Don’t force it!

If there’s no business reason to use intelligent content within a particular group, department or organization—don’t use it!

Intelligent content requires new technology

This is half true. We can accomplish many of the goals of intelligent content without adopting new technology—at least in the early stages. However, to benefit from all the bells and whistles—automation, sophisticated content reuse, and multi-channel publishing—we need tools to help us create, manage, and deliver intelligent content.

But, there are things we can do to prepare ourselves for the move to intelligent content before we invest money acquiring new technology. The biggest value comes from structuring our content. Analyzing our content and designing a repeatable structure yields more consistent, coherent, streamlined, and effective content.

We don’t need special tools to support structure. We can create structured web forms for authoring, or even set up structured content templates in Microsoft Word, before we purchase new technology. As long as authors adhere to the structure of the content and can quickly and easily create content, we can realize significant rewards.
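The "adhere to the structure" part can itself be checked without special tools. A minimal sketch in Python (the content type and field names here are hypothetical, not part of any standard) shows the idea: define the required pieces of a content module once, then flag drafts that are missing any of them.

```python
# Required fields for one hypothetical "task" content type.
REQUIRED_FIELDS = ("title", "short_description", "steps")

def check_structure(module):
    """Return a list of the required fields missing (or empty) in a content module."""
    return [field for field in REQUIRED_FIELDS if not module.get(field)]

# A draft that skipped its short description:
draft = {"title": "Install the widget", "steps": ["Unpack", "Plug in"]}
missing = check_structure(draft)  # → ["short_description"]
```

A check like this can run over content exported from web forms or Word templates long before any component content management system is in place.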

Structured content is not only a best practice, it’s required for intelligent content solutions. We need to create, manage, and deliver structured content to realize the full benefits of intelligent content.

I’ll lose control/creativity

Control is a myth. On most devices, the reader has ultimate control of fonts, colors, point size, etc. In print, corporate style guides control the look. Intelligent content practices allow us to spend less time on issues, such as look and feel, that we never had much control over anyway. By spending less time on those issues, we have more time for the aspects of our job where we can be creative. It takes just as much creativity to write structured content as it does to write in stream-of-consciousness mode, and the result will serve the customer better.

The post Overcoming Objections to Intelligent Content appeared first on The Content Wrangler.

Categories: DITA

Remodeling documentation

JustWriteClick - Sat, 2016-12-10 15:19

An article I wrote on the docslikecode.com website:

A few years ago we went house-hunting in Austin, Texas. One house was so popular during the first showing, there were six back-to-back appointments. We waited in the driveway while another couple toured it. Once they left, we could quickly go through it while another prospective buyer waited on the front walkway.

This house was awful. Every single surface was ugly, outdated, and circa 1973. There was a giant hole in the dirt by the front porch, likely dug by an animal. But you know what? I loved it. I wanted to bring it back to a vibrant family home, taking it back from the rogue porch-dwelling raccoons — or was it dirt-digging armadillos? We may never know.

Raccoon visiting

Let’s look at your code base and your doc base as a great house with a good layout and foundational “bones.” You still need that “punch list” to hand to your contractor. When you move towards more docs like code techniques, make sure you treat your doc base like a code base, and track defects. Get that “punch list” done.

With a code base, you know how much remodeling you need to do. The same thinking can work well for docs. How dated have your docs become? How accurate are the docs compared to the rest of the code base? How can you make the site livable and vibrant again?

Let’s give your readers the chance to do those quality checks for you as easily as possible: by reporting the bug on the page where they found it.

This technique works well when:

  • You have more readers than contributors. (I generally hope this always happens.)
  • Your readers are super busy. Still, they do want to make the docs better and help others.
  • You want to know how far your docs have “drifted” from the truth.
  • You want your docs to be more trustworthy by chipping away at a bug backlog.
  • You have a private GitHub repo for documentation, but you want to enable public bug reports with tracking back to your docs repo.

Your quick win is to look at your current docs site, any given page. Is there a way to report a bug publicly, to add to the “punch list”?

  1. Bare minimum starter level would be an email address link from every page.
  2. Level up by adding a link to your GitHub source repo Issues page so readers can report bugs.
  3. Better yet, write a quick bit of code to embed on every output doc page so that the issue is pre-filled with relevant information.
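The third level is less work than it sounds: GitHub’s “new issue” page accepts `title` and `body` query parameters, so pre-filling a report is mostly URL construction. Here is a sketch in Python of building that link at publish time (the repo name and URLs below are made up for illustration; the OpenStack theme does the equivalent in browser-side JavaScript):

```python
from urllib.parse import urlencode

def prefilled_issue_url(repo, page_url, source_url, git_sha, version):
    """Build a GitHub "new issue" link whose title and body carry
    everything a triager needs to find the exact page and revision."""
    title = "Doc bug in {}".format(page_url)
    body = (
        "Page: {}\n"
        "Source: {}\n"
        "Version: {}\n"
        "Commit: {}\n\n"
        "Describe the problem:\n"
    ).format(page_url, source_url, version, git_sha)
    query = urlencode({"title": title, "body": body})
    return "https://github.com/{}/issues/new?{}".format(repo, query)

link = prefilled_issue_url(
    repo="example-org/docs",  # hypothetical repo
    page_url="https://docs.example.com/install.html",
    source_url="https://github.com/example-org/docs/blob/master/install.rst",
    git_sha="abc1234",
    version="2.1",
)
```

Drop the resulting link into your page template and every “Report a bug” click arrives with the page URL, source file, release, and commit already filled in.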

Here are some resources to get your first punch in that punch list:

  • Using Python Sphinx? The OpenStack docs theme has some JavaScript you could re-use to pre-populate an Issue template so the reporter sends the page URL, the source URL, the git SHA of the commit for that version of the page, and the release version value. See this docs.js snippet.
  • Using a private repo for docs, but want to track bugs publicly? Use Issues-only Access Permissions.
  • Want to add a bit of code to pre-create Issues to use as comments on every page? Free yourself from Disqus comments. Try this set of tips and sample code in a blog post.
Categories: DITA

Mastering Technical Communication Leadership

The Content Wrangler - Mon, 2016-12-05 22:20

Effective leadership is measured by customer satisfaction. Caring about your customers is crucial. Everyone is a customer: those who purchase and use your products, and your colleagues in every department.

Leadership requires a mixture of confidence and humility. Change doesn’t always happen quickly. Success may require all of the creativity, resourcefulness, and diplomacy you can muster.

Use these 7 Habits of Highly Effective Technical Communication Leaders to learn how to effectively put the customer first in all of your work. You can be an effective leader, regardless of your position.

1. BE A CUSTOMER ADVOCATE

While you may have less product knowledge than an engineer, you can view the product from a user’s perspective. Share your input and advocate for product and process improvements. Support similar efforts initiated by others. Focus on quality.

2. ADVOCATE FOR PROCESS IMPROVEMENTS

Process improvements can benefit everyone on the team, and improve product quality. Can the quality assurance team test the documentation? Can technical writers edit the user interface text and error messages? Can written documentation reviews become a factor in the performance evaluations of all product team members? Would documentation review meetings improve quality?

3. CARE ABOUT ALL CUSTOMER-VISIBLE CONTENT

If you’re only writing the customer documentation, there is a chance that no one with your skills is editing other user-visible content. Even if you cannot edit this content, you can educate others about key technical writing practices. For example: “one concept, one term” – each word should be used to mean only one thing. This avoids user confusion and saves translation funds. Another key technical writing practice: short sentences.

4. BE AN EFFECTIVE INTRAPRENEUR

Think about how to create change – look before you leap. Understand the who, what, and how of change. Who are the key stakeholders and decision-makers, and what motivates them? What do they care about? How can you best bring them on board? What are their backgrounds – cultural, professional, educational? Start with curiosity. Listen. Discuss. Ask questions.

5. KNOW YOUR AUDIENCE

What do you know about the users – education levels, roles? How do they use the product and access the documentation? What percentage read the documentation in English? Collaborative efforts with other teams can aid your research. One way to learn more is to conduct a user survey. While gaining approval can be an uphill battle, the insights gained from a well-designed survey can make the effort worthwhile.

6. PROMOTE APPROPRIATE TECHNICAL COMMUNICATION METHODOLOGIES, GENTLY

Not every organization needs to use the latest methodologies. What worked elsewhere may not work for your team. Bring others along with you – educate and involve engineers, quality assurance personnel, marketers, and product managers in assessing and exploring new methodologies. For example, some organizations can benefit from adopting topic-based authoring, without XML or DITA.

7. KEEP PERSPECTIVE, AND DE-STRESS

Mistakes provide tremendous opportunities for personal and organizational growth, including improved processes, communications, and skills. Accept these moments and make the most of them. Learn from successes and failures. Remember that what is truly irreplaceable is human life. Learn what you need, and what your coworkers need, to reduce and relieve stress.

Consistently practicing these 7 habits will support your development as an effective technical communications leader, continue process improvement in your organization, create harmonious work relationships, and improve content and product quality. Be the change you wish to see.

Want to learn about the key steps to success in technical writing outsourcing, how outsourced writers integrate into Agile teams, and how to manage these approaches to improve quality, save time, and reduce costs? Then check out this December 8, 2016 webinar held during The Content Wrangler’s Virtual Summit on Advanced Practices in Technical Communication.

The post Mastering Technical Communication Leadership appeared first on The Content Wrangler.

Categories: DITA