Server automation for documentation deployment

JustWriteClick - Sat, 2017-01-14 13:12

When you treat docs like code, you want to deploy that “code,” such as doc source files, so that you can see how the docs look on a web site. I have been practicing these deployment techniques while working on OpenStack, which offers open source cloud computing software. I needed practice to get better, and practice was also the best way to learn this type of technical problem-solving: by doing.

The way I approached the practice effort was to:

  1. Find credentials for a cloud (or two).
  2. Determine which web services to install on the cloud servers I launch there.
  3. Find deployment orchestration templates that launch the right combination of web services to make the site I wanted, deploying Ruby, Jekyll, and NGINX, using Ansible.
  4. Test, test, test. Test some more.
  5. Try out Docker locally, then finally get the cloud server working. This step took a while as I worked out the Linux user permissions needed to install a compatible version of Ruby.
  6. Set up the cloud server as a git remote, then push the website to the git remote, building the HTML and copying the files to the web server.
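The git-remote deployment in step 6 can be sketched in Python. This is a minimal sketch with hypothetical paths, and the `run` parameter stands in for the shell so the sequence can be tested; on a real server this logic would typically live in a `post-receive` hook script:

```python
import subprocess

def deploy(repo_dir, work_tree, web_root, run=subprocess.check_call):
    """Sketch of a git post-receive deploy: check out the pushed source,
    build the site with Jekyll, then sync the HTML to the web server root."""
    commands = [
        # Check out the newly pushed source into a working tree
        ["git", "--git-dir", repo_dir, "--work-tree", work_tree, "checkout", "-f"],
        # Build the static HTML with Jekyll
        ["jekyll", "build", "--source", work_tree,
         "--destination", work_tree + "/_site"],
        # Copy the generated files to where NGINX serves them
        ["rsync", "-a", "--delete", work_tree + "/_site/", web_root],
    ]
    for cmd in commands:
        run(cmd)
    return commands
```

On the laptop side, something like `git remote add deploy user@server:/var/repos/site.git` followed by `git push deploy master` would trigger the server-side build and copy.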

Hear me talk about my excitement about docs deployment in this video clip, re-posted from the original on thenewstack.io. In it, I talk with Alex Williams, founder of TheNewStack.io, about my adventures at the OpenStack Summit in Barcelona. Thanks to Alex for permission to re-post and for asking about my latest work.


The deck is available on Slideshare.

Deploying Apps on OpenStack from Anne Gentle

The Ansible code is on GitHub.

The Jekyll theme, so-simple, is on GitHub.

The content repo is on GitHub.

This demo shows pushing the site to the git remote to update the content.

Categories: DITA

Attending DBW Next Week?

Really Strategies - Fri, 2017-01-13 16:22


I always look forward to attending the Digital Book World Conference in New York each year.  Unlike some other events, DBW’s sessions are the focus – sessions filled with valuable insights on significant topics from a “who’s who” of industry speakers.  DBW also attracts an impressive group of attendees, so there’s no shortage of dynamic conversations to be had with some of the best minds in publishing.

This year’s program is all about “Smarter Book Publishing for a Digital World,” which is exactly what we do for our book publishing clients at RSI.  So I’m excited to get there and get started! 

Contact me at jwood@rsicms.com if you want to connect at the event and we can arrange a time to meet.


Categories: DITA

Gilbane Advisor 1-5-17 — Bots, Deep Learning, Mobile Marketing

Happy New Year, Dear Reader! We have chosen a small number of the superabundance of end-of-year reviews and predictions to recommend, each focused on rapidly developing areas that are important for you to keep up with, even if only at a high level. Topics include bots, deep learning, mobile, marketing technology, software development, and design. Bot Check-In: A […]
This post originally published on https://gilbane.com

Categories: DITA

Diversifying Content Strategy To Improve Customer Engagement

The Content Wrangler - Thu, 2017-01-05 11:59

Guest post by Laurel Nicholes and Niki Vecsei Harrold

Prior to presenting, Content Potluck: Bringing Everyone to the Community Table (during the Virtual Summit on Advanced Practices in Technical Communication) we authored a quick-read blog post in which we defined content potluck and outlined how to find champions in your organization to move a project forward. Today, we follow up on that effort. We provide tips for diversifying content strategy by supporting a content potluck. We also provide some advice on organizing your content production and distribution efforts using a collaborative editorial calendar.

During the Summit, we asked attendees to answer questions designed to capture current practices and to spot opportunities for improvement. Tips designed to help keep both participants and stakeholders informed and engaged are included below.

Tip #1: Build An Editorial Calendar

There are many software tools available (free and paid) that you can use to build editorial calendars. Look for calendaring tools that allow you to set alerts, send reminders, and share content with others. Here are a few ideas:

  1. Calendar tools (Google Calendar, Atlassian wiki calendars, Outlook shared calendars)
  2. Spreadsheets (Excel, Google Sheets, Smartsheet)
  3. Intranet platforms (Jive, Atlassian Confluence, SharePoint)

Regardless of the software tools you choose, how you use your calendar makes all the difference. These best practices can be leveraged to help your team recognize instantly what content types are missing, or where content gaps exist:

  • Assign content an owner
  • Set alerts for specific content tasks, including deliverable dates
  • Code content types
    • blogs
    • tweets
    • videos
    • articles
    • interviews (like “Ask Me Anything” question and answer sessions)
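As a concrete illustration of these practices, an editorial calendar entry can be modeled with an owner, a deliverable date, and a coded content type, which makes spotting content gaps a simple query. The field names and entries below are our own invention, not taken from any particular tool:

```python
from datetime import date

# Hypothetical calendar entries: each piece of content gets an owner,
# a deliverable date, and a coded content type.
CONTENT_TYPES = {"blog", "tweet", "video", "article", "interview"}

calendar = [
    {"title": "Launch recap", "owner": "Niki", "due": date(2017, 1, 20), "type": "blog"},
    {"title": "AMA with support", "owner": "Laurel", "due": date(2017, 1, 25), "type": "interview"},
    {"title": "Feature teaser", "owner": "Niki", "due": date(2017, 1, 27), "type": "tweet"},
]

def missing_types(entries):
    """Return the coded content types with no scheduled entry,
    so the team can recognize content gaps instantly."""
    scheduled = {entry["type"] for entry in entries}
    return sorted(CONTENT_TYPES - scheduled)
```

With the sample entries above, `missing_types(calendar)` reveals that no videos or articles are scheduled yet.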

Tip #2: Plan Topical Campaigns

Use your editorial calendar to orchestrate the distribution of topic-based content. Make sure to target a mix of channels. This is a good chance to re-use and re-distribute existing content assets. Think beyond product launches—customers want to engage consistently with your brand. Be creative. For example, you could provide content to help customers solve a common problem. Wherever you decide to publish, drive traffic consistently to the same content asset. Doing so will help you grow your audience and attract repeat visitors.

Tip #3: Engage With Your Audience

Asking questions of prospects and customers—and listening to their answers—across multiple feedback channels is an excellent way to identify content ideas. Documenting your findings helps your champions understand the impact of content on those you serve. If everyone on your team commits to the potluck, they will want to share feedback and metrics with their peers and managers.

Advice: Make sure to share community and content performance reports in an easy-to-understand format at regular intervals. Consider having direct feedback sessions between content creators and your audiences. Low-tech feedback loops can include:

  1. One-on-one calls
  2. “Ask Me Anything” question and answer sessions
  3. Polling and surveying through social and community channels

During the Virtual Summit on Advanced Practices in Technical Communication, we asked our audience how they gather feedback on their content today.

Surveys are the most inexpensive and most-used tools for collecting feedback. Surveys don’t have to be long and involved to provide utility. Single-question surveys can yield significant insight, deliver great response rates, and help customers provide feedback quickly with minimal effort.

Tip #4: Reward Your Champions

Recognizing your champions keeps them engaged and loyal to the potluck. Some of the easiest ways to recognize people for their hard work:

  1. Provide bonuses or management-by-objectives rewards
  2. Send thank-you notes to champions (and their bosses)
  3. Share analytics reports (demonstrating the impact of their work)
  4. Ask them to lead conversations in the community (at both live events and virtual events like webinars)
  5. Give them a special virtual badge—or ranking in your community—to make sure their profile stands out and the audience recognizes their expertise

Below we answer the questions submitted by attendees during the live webinar.

Questions From The Audience:

Q1: Do you have any thoughts for companies that are extremely limiting with what goes out to customers? We don’t have access to social media and only publish behind a firewall.

A1: We understand. When we started our project, we had to break down barriers. We, too, were publishing behind a firewall, and had to lobby executive leadership for permission to build our Twitter and LinkedIn presence. Your first step is to effect change.

We achieved significant impact—and broke down barriers—by doing some basic market research designed to capture where our prospects and customers were talking about our brand on the internet. Being a technology company, we found that a lot of conversations were happening on Quora, Spiceworks, and GitHub. Our prospects and customers were exchanging and giving advice without us. Much of the information about our firm was incorrect or out of date.

We also did some social media research and queried Twitter conversations to capture how many non-marketing conversations were happening around our brand and product names. We found a staggering imbalance among neutral, positive, and negative conversations. Presenting our findings to executives made them realize that we could no longer afford to ignore these outside-the-firewall discussions.

We also became active contributors to our company website. We got permission to write a weekly blog that contained links to content behind the firewall. Blog topics varied; they could promote an upcoming launch, address customer feedback, or answer a common question. By blogging links to our content, search crawlers could pick up the links and our SEO improved.

Q2: Great presentation! I love the Content Potluck model. Can you share how you were able to get executive sponsorship that led to even representation and participation across the organization?

A2: We started the project by listening to the pain-points our executives were having. Product management wanted to grow their thought leadership position and gather customer’s ideas for new features. Training wanted to market their classes as well as provide free training videos and needed a platform. Customer service wanted a home for their forums and wanted to reduce the cost of support. Engineering wanted more people to read the technical content prior to escalating problems to the support staff.

We did some grassroots work, too. We recruited people interested in creating content for the community, including podcasts, videos, and blogs, as well as volunteers to monitor the forum. These weren’t always managers; frequently they were knowledge workers starting out in their careers, folks with a passion for the technology who wanted to grow their professional reputation as subject matter experts.

We matched volunteers with problems to solve, and created a presentation to explain the content potluck idea. Then we took our show on the road. We tailored the presentation to executive problems, rather than asking them all to one general presentation. We believe that this is what made us successful.

Once we got buy-in for our effort, we had to ensure we could deliver results. To demonstrate our successes, we shared metrics showing membership growth and increased content consumption. Ultimately, that’s what kept executive support growing.

Q3: When you ask so many different types of people to contribute content, how do you maintain the quality? How do you prevent it from feeling like a cacophony of inconsistent voices?

A3: The audience who engages with you in a community space craves multiple voices. Our experience shows that having different voices and styles of writing resonates better and lends authenticity in customers’ eyes, compared with the single (sterile) tone of voice of “official documentation.” The variety of voices does not significantly impact quality. Also, only subject matter experts get to craft content. Anything long format (not just an answer to a question) gets vetted for the first few publications by a central figure such as a content strategist or community manager. Once a contributor shows they understand the guardrails around quality content, they are allowed to post without moderation.

It’s important to note that champions won’t immediately start writing content for your community. Most content potlucks start with one or two contributors. Over time, contributions and contributors will increase. Eventually, your champions will master the base requirements of content creation—authenticity and quality—and will coach other team members with similar expertise.

Q4: “Ask Me Anything” discussions—we call them Office Hours. This is one hour per month where clients can connect with the team and ask anything.

A4: Office Hours or “Ask Me Anything” sessions are a great way to uncover the content needs of your prospects, customers, and other stakeholders. Try them out in different channels and formats. Conference calls, Google Hangouts, web discussions, Facebook Live feeds, Twitter chats, and question and answer sessions focused on the needs of your community not only provide excellent ideas for content potluck projects, but they also demonstrate that you value your audience. Hosting such events—and acting on the knowledge gained—tells your audience that you are looking for ways to improve their satisfaction.

But don’t stop there. Make sure to involve internal teams in these discussions. Doing so attracts new and different voices to the conversation. Always provide a recording or transcript, and hold the subject matter experts publicly accountable for delivering on their promises. It should go without saying, but we’ll say it here: Don’t focus only on the good ideas. When you listen, hear what is being said. Always follow up on complaints.

If you have additional questions, leave a comment on this blog post (or find us on Twitter under Niki Vecsei and Laurel Nicholes). We would love to hear from you.

The post Diversifying Content Strategy To Improve Customer Engagement appeared first on The Content Wrangler.

Categories: DITA

Managing Enterprise Content: 12 Lessons From 2016

The Content Wrangler - Tue, 2017-01-03 17:57

Drawing from 20+ years of experience wrangling content, Robert Norris presents us with a twelve-part series on managing enterprise content. His articles (some of the most popular posts of 2016) take on the complex topic of enterprise content strategy from a heuristic angle, bypassing academic approaches for more pragmatic solutions designed for immediacy and ease.

This article serves as a summary of Norris’ outstanding series, inviting you to explore the rich and unique insights contained in his work.

Managing Enterprise Content: 12 Lessons

1. Think Like a Librarian

Norris’ first post, Think Like a Librarian, elaborates on the theme of agility in design to create an easily accessible and user-responsive knowledge-base. Summoning the stereotypical figure of the “librarian” as a stand-in for “knowledge manager” or “content troubleshooter,” Norris elaborates on five principles of librarianship that can be used to enhance knowledge base experiences.

2. Building Problem-Solving Toolkits

Norris’ second installment tackles the issue of quality control for content curators and content technologists. For the content curator, user advocacy is a critical aspect of making knowledge-base collections more intuitive and streamlined for the end user. The key to achieving this is to consider not just audience needs but, more importantly, audience competency levels. As audience expectations will differ across a broad spectrum of competency levels, it is important to meet those expectations by making content appropriate to the user’s knowledge level.

The second part of the article, aimed at technologists, provides a few tips for decluttering and recombining content resources to create valuable collections. Not every archived document will have stand-alone value to users. And a document’s internal (company) value will erode when it exists among duplicates, particularly if the duplicates contain slight variations. By alleviating duplication problems and finding new ways to recombine documents in response to user needs, technologists may be able to envision new product possibilities from existing archives.

3. The Curse of Elegance


Norris’ third installment addresses the “paradoxical challenges” facing every content designer: the more elegant the design, the less noticeable it becomes; almost like a great film score that is “felt” but never “heard.” Designing for functionality can either make you a target for complaints, if your design is faulty, or an “unsung hero” if your design is elegantly crafted. Yet developments in content design cannot be cultivated without appropriate feedback, one primary component being “appreciation.” Customer feedback is essential for assessing the “effect” of a given design–the UX of functionality. Feedback from colleagues is also an essential factor that helps inform and shape the mechanics behind UX.

Ultimately, and to add yet another paradox, content design requires heightened noticeability to achieve its optimal state of “invisible” functionality.

4. (Im)-Proper Care and Feeding of Subject Matter Experts

Norris’ humorous title—(Im)-Proper Care and Feeding of Subject Matter Experts—refers to the notion that subject matter experts (SMEs) are a very different kind of animal, proverbially speaking, in the operational realm of content and communications. Though most people are aware that brilliant subject matter experts don’t always make great teachers or communicators (and vice versa), not everyone has the skills to effectively approach and collaborate with SMEs when their input is needed. Norris lays out the main challenges in this relationship, providing practical solutions for each one.

5. Hey! Where’s Our Content?

If the word “strategy” denotes a comprehensive plan of action, entailing a wide range and long-term perspective, then the notions of myopic focus and recency bias seem antithetical to it. Yet, this is what many content managers often face in companies where C-level executives view content as “deliverables” rather than as active informational networks and relays.

As Norris points out, many professionals outside of the content field do not fully understand the scope and principles comprising content strategy as a discipline. Contrary to what many organizational managers may think, content strategy goes beyond the production of marketing and sales content. Content strategy also cannot be restricted to the limits of content production (deliverables). This fifth installment discusses the (not so) unforeseen negative consequences of this misunderstanding.

6. Devising a Content Strategy to Serve Every Audience

At the opening of his sixth installment, Norris presents us with a definition of strategy that aligns with most current notions surrounding the concept. The term “strategy” makes for an interesting comparison when viewed etymologically, from the military context in which it originated.

In a military context, a direct offensive is only as strong as its means of support, the latter posing as a critical vulnerability to be targeted by an opposing force. With the aim of reducing vulnerability to one’s side, a campaign leader cannot afford to be so myopic as to focus solely on the tip of the spear. Norris points out, based on his experience, that many organizations have a tendency to work contrary to this basic principle.

According to Norris, many organizations place lopsided emphases on marketing and sales efforts with regard to recognition and resource allocation. Such biases affect the quality of content operations, as focus shifts from the “enterprise” level to its subsets (i.e. marketing and sales content). Contrary to this common tendency, Norris reinforces the notion that “enterprise content” encompasses, obviously, the entire “enterprise,” what Norris calls “every audience,” or every internal and external user. To this end, Norris provides an exceptionally clear framework for constructing a content strategy that is gap-proof and all-encompassing.

7. Your Content Strategy: Is It Feasible?

Conducting a feasibility study is an effective way to assess the practicality of a method or plan. When developing an enterprise content strategy involving multiple individuals and departments—all of whom have different perspectives, work methods and goals—a feasibility study is necessary to see how the workflow puzzle can be collaboratively assembled.

One effective way to conduct such a study is to simulate a real-life scenario. Simulations can help teams collaboratively construct project roles and expectations, and shape contingency responses based on individual capabilities and expertise. Norris’ seventh installment proposes a few tips for conducting an organization-wide feasibility study to help test and shape the real-world implementation of a content strategy.

8. Best Practices for Fostering Support from Stakeholders

Fostering stakeholder support is critical to any organizational undertaking. Without “buy-in” from the managerial and executive levels, a project may not get the opportunity to leave the runway. Similar to the previous installment, Best Practices discusses the diverse and potentially dissimilar interests, goals, and personalities among stakeholders.

It’s a complicated scenario: sharp differences can exist among stakeholders’ interests despite their general alignment with larger organizational goals. A key solution, which Norris explores in detail, is to study the stakeholders themselves, an approach similar to that of a feasibility study, before selling your ideas upstream.

9. A Swing and a Miss: Faulty Customer Support Metrics

Norris’ ninth installment puts the spotlight on support center operations and the role they play in shaping the overall quality of products and services. He advances two general propositions. First, an individual user’s experience is a correlated stand-in for mass-user experience. Second, the support center should be viewed not only as the spear-tip of customer engagement but also as a critical player in quality evaluation.

While leadership tends to focus on the big-picture metrics surveying conditions on a larger scale, the key to quality control is in the discrete metrics that are often overlooked.

10. Building a Robust Content Quality System

Managing product and service quality is standard procedure for most businesses. But the importance of content quality management is a matter that’s often siloed, if not neglected altogether. In the absence of more integrated procedures for content quality control, unmonitored databases can leave businesses vulnerable to several unforeseen risks. Norris’ tenth installment discusses the risks posed by “orphaned” documents—ownerless and often outdated documents floating in a database. The result of customers accessing such documents can range, depending on context, from minor errors to severe harm.

Active ownership is key to content quality control. It is also the most effective way to mitigate risks posed by inactive content. Norris presents a quality control framework designed to help you ensure content quality and prevent content mismanagement snafus.

11. Developing a Unified Content Strategy: Learning From the Masters

Although marketing content is designed to drive growth, support content plays a critical role in maintaining customer engagement and satisfaction. Marketing content promises a particular product/service experience, while support content enhances the delivery of that experience. If the goal in “marketing” is to communicate the value of a product/service, then all organizational content can be considered an extended form of “marketing content” addressing various touch points of customer experience over time.

As company executives are lopsidedly biased toward growth initiatives, the marketing side of content operations tends to receive more attention and resources than non-marketing counterparts. But given such organizational focus, marketing teams tend to be better equipped, experienced in collaborative settings, and adept in operating across multiple channels of communication. In his eleventh installment, Norris explores ways to tap marketing’s capacity to create a well-balanced content strategy across the organization.

12. Managing Counterproductive Organizational Expectations

Most of the articles up to this point discuss the challenges of content managers operating at the periphery of executive focus. It’s what Norris calls the curse of elegance. But what happens when a successful content operation attracts the full attention and scrutiny of executive management?

In this final article in the series, Norris discusses the burden of success—misguided expectations, executive micromanagement, etc.—along with a few contingency ideas to help mitigate the problems that come with being successful.


The post Managing Enterprise Content: 12 Lessons From 2016 appeared first on The Content Wrangler.

Categories: DITA

A modern neural network in 11 lines of Python

bobdc.blog - Thu, 2016-12-22 12:52
And a great learning tool for understanding neural nets. Bob DuCharme http://www.snee.com/bobdc.blog
Categories: DITA

Overcoming Objections to Intelligent Content

The Content Wrangler - Sat, 2016-12-10 21:33

We get a lot of questions about intelligent content. Of course, sometimes, rather than getting questions, we get “told things”—stories taken out of context or statements repeated by others without factual support. In this post, we examine and debunk some of the most common objections to intelligent content.

Some of these objections arise from common misunderstandings about factors such as cost (We need how much money?), purpose (Why do we need to change how we work, anyway?), perceived limitations (That’s cool, but it will never work for us…), or technology (Why can’t we just use the tools we have already?). Other objections are understandable concerns related to change: worries about losing writers, uneasiness about adjusting job descriptions, and so forth.

I’m in marketing; intelligent content is just for technical content

I’m in marketing—there’s no way this could possibly work for us. This is one of the most common objections, and it’s fairly easy to dispel. Marketing content is different from technical or explanatory content in some ways, but at its heart it shares the same need for accuracy, verifiability, and quality. It must be created quickly to meet the needs of the market. It must be interesting, relevant, and engaging. And we must be able to update it rapidly and inexpensively.

Traditionally, marketing content was agile, while technical content was not. Marketing content was highly visual, while technical content was not. Marketing content was engaging and interesting, while technical content was not. Technical content needed sophisticated technology, while marketing content did not. See a trend emerging?

In most cases, it is beneficial for technical content to be aligned with marketing content because all content affects the way prospects and existing customers feel about our brand. Therefore, all content is marketing content, regardless of who creates it.

We need intelligent content to unify our content, making it possible to provide a consistent experience across all touch points with our content.

There’s no good business reason for creating inconsistent content that damages brand.

In today’s global, hyper-connected world, there’s no reason to create inconsistent content that damages brand, ruins the customer experience, and wastes finite corporate resources. Marketing must learn to leverage the techniques some forward-thinking technical communication departments have already mastered, techniques that can reduce or eliminate unnecessarily cumbersome, time-consuming, and expensive manual processes; automate content creation, formatting, and publishing tasks; systematically reuse content to meet customer needs; and publish content to multiple channels, simultaneously.

Intelligent content allows us to do all of these things — and everything else we’ve always done — more efficiently and effectively, affording us the luxury of using the resources saved to innovate.

For instance, intelligent content allows us to create content, and, with the push of a button, republish it in different forms. We can easily re-skin content and know that we’re still using the correct/approved content. With intelligent content, we can repurpose marketing content intended for one medium in a totally different medium, depending on the metadata associated with the content and our output requirements. With intelligent content we can reuse, repurpose, and customize our information for different outputs and markets faster and more accurately than ever before.

Just because content is well-written, displayed in an innovative format, and published in some new and super-cool way, does not make it intelligent content. These things may be attractive, interactive, or amazingly different, but those characteristics don’t make content intelligent. Intelligent content relies on repeatable methods, content standards, automated processes, and software technology designed to help us create, manage, and deliver relevant, personalized content in more efficient and effective ways than traditional publishing approaches allow.
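To make metadata-driven repurposing concrete, here is a minimal sketch of one content component being republished for two different outputs. The component structure and field names are invented for illustration, not taken from any particular CCMS:

```python
# A hypothetical intelligent-content component: the approved content is
# kept separate from metadata describing its audience and channels.
component = {
    "id": "benefit-overview",
    "text": "Intelligent content lets you publish once and deliver everywhere.",
    "metadata": {"audience": "prospect", "channels": ["web", "email"]},
}

def render(comp, channel):
    """Republish the same approved component in channel-specific markup."""
    if channel not in comp["metadata"]["channels"]:
        raise ValueError(comp["id"] + " is not approved for " + channel)
    if channel == "web":
        return '<p id="' + comp["id"] + '">' + comp["text"] + "</p>"
    return comp["text"]   # plain text for email
```

The same vetted text is re-skinned per channel, and the metadata prevents publishing it somewhere it was never approved to appear.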

We can’t review content in modules; we need content in context

When first encountering the concept of intelligent content, many reviewers, editors, and proofreaders are concerned about the difficulty of properly reviewing content that’s broken into reusable components. While they agree that creating modular content for reuse is interesting (and likely advantageous), they argue that reviewing content in sections any smaller than a document is simply impossible.

The actual content review method employed depends on the software tools selected and the implementation details, but in general, during a review cycle, the content under review is displayed in context. Comments made by a reviewer are attached to the content component being reviewed. This same technique is used to provide context to translators.

That said, intelligent content presented for in-context review may not be fully formatted — layout will be minimal. That’s not a problem, because we want to focus on improving our content, not the appearance of our content (important though that will be in the final output).

Instead, an approximation of the final content design (images, charts, graphics, text, fonts, colors) will be presented to reviewers during in-context review. For instance, if the text is supposed to be wrapped around an image in the final output, that’s unlikely to be visible to the reviewer. For reviewers used to working in desktop publishing environments, not being able to see the final design as they edit and review may introduce some challenges at first, but they usually adjust to this difference as they grow familiar with the approach.

Using visual design as a basis for intelligent content review is a bad practice. It’s better to review the content with light styling. That way the content is less likely to be tweaked for a specific appearance by well-intentioned reviewers and more likely to be reusable in multiple contexts.

It’s only about reuse

Although intelligent content supports efficient reuse, that’s not the only reason to implement the approach. There are other benefits, but even if intelligent content were only about reuse, which it isn’t, it would still offer value.

How much content is reusable? We find the average organization that adopts intelligent content enjoys at least 25% content reuse. Depending on the organization and the content, that percentage can be much higher. It can be lower, but it’s rarely zero. Managing even small levels of content reuse intelligently is much easier than copy-and-paste.

Content reuse can pay even higher dividends in organizations that translate content into multiple languages. When we reuse content components and their translations, our return on investment (ROI) skyrockets. Some organizations report that their biggest savings come from reduced translation costs attributable to content reuse.
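As a rough back-of-the-envelope sketch, the translation savings can be estimated like this. All figures below except the 25% reuse rate mentioned above are hypothetical, made up purely for illustration:

```python
# Hypothetical illustration of translation savings from component reuse.
# Only the 25% reuse rate comes from the article; every other number
# is an assumption you would replace with your own data.
words_total = 100_000          # words in the content set
reuse_rate = 0.25              # 25% of content reused
languages = 5                  # number of target languages
cost_per_word = 0.20           # per-language translation cost, USD

reused_words = words_total * reuse_rate
# Reused components are translated once, then reused everywhere they
# appear, instead of being retranslated in each deliverable.
savings = reused_words * languages * cost_per_word
print(f"Estimated translation savings: ${savings:,.0f}")
```

Even with modest assumptions, avoiding retranslation of a quarter of the content across several languages adds up quickly.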

It takes too long to implement

It does take extra time to use intelligent content correctly in the early phases, but that’s because we have to reengineer our old, outdated processes; adopt new roles, responsibilities, and tools; and learn how to work differently. Moving to intelligent content transforms the way we produce our content, and transformation means big changes involving pushback, uncertainty, fear, rumors, and temporary setbacks the first time through.

The key to success is to correctly identify the scope of the project and clearly understand our objectives. As an intelligent content project rolls out, we will identify new participants and departments who will want to get in on the action. While it’s important to encourage widespread participation from all divisions of the organization, we must also avoid expanding project scope. We must address immediate needs first and then, once our systems are up and running, invite other departments to participate.

Take small steps, such as structuring content, before tackling all the automation, workflow, writing style, and technology changes. Adopting structured, semantically rich content alone will make our content much more effective.

It’s too hard to do

Introducing anything new can be a challenge. When creating intelligent content we have to change from a page-based way of thinking about content to a component-based content paradigm. This is often the biggest challenge, especially for experienced content contributors.

Authors take pride in their work. Traditionally, this has meant that an individual author was responsible for a specific set of deliverables. With intelligent content, authors exchange individual control of deliverables for the flexibility of creating shareable, reusable content products that they develop collaboratively. Rather than being responsible for one content product (or suite of content products) they become responsible for a much broader range of content.

Interestingly, some technical illustrators and graphic artists have adopted component-based content approaches. They’ve learned to easily create modular components and reuse them where needed.

Software developers have been using intelligent content principles, especially component content reuse, since the 1990s. They overcame the “It’s too hard to do!” objection by recognizing that the benefits of their approach (they called it object-oriented programming) far outweighed the up-front work required.

It’s not about content, we just need new technology

Actually, no. We need some new technology (software) to implement all but the smallest intelligent content project, but that’s not our biggest issue. Not by a long shot.

When we move to intelligent content, the biggest challenge isn’t software. After all, software doesn’t produce intelligent content; we produce it. Software helps us plan, create, manage, and deliver our content, and it helps us do these things efficiently and effectively. Software supports our efforts, but it shouldn’t be our focus.

Instead, we must focus on moving to a modular, component-content way of thinking. With intelligent content, we create individual components of content that can be mixed and matched in different ways, for different audiences, which consume content on different output channels. To succeed, we need to collaborate and share information with more people than ever before. And, the way we work (our actual tasks) will change, as will our workflow. We may even need to change the way we define success.


Technology is the easy part. People, and the changes we expect them to make, are often the most difficult obstacles.

Despite the people challenges, moving to intelligent content can be done! The key is to understand what we want to do, when we want to do it, and clearly communicate our goals and expectations to everyone involved in the process, including the naysayers.

Recognizing that this is a cultural change—more than a technological change—and managing expectations is critical to success.

It costs too much

There’s no question that implementing intelligent content comes with costs. For some organizations it can be cost prohibitive. Software isn’t free, and there are associated installation, implementation, and configuration costs. Oh, and there’s training. But these costs are common expenses in nearly any transformational project.

Moving to intelligent content is both a cultural and technological transformation. Cultural changes are often harder to make than technological ones, but the cost of technological change can’t be dismissed.

The best way to control these costs is to create a well-defined intelligent content strategy aimed squarely at helping us accomplish clearly defined, achievable goals, and then to stick to the project plan. Scope creep increases cost.

Some organizations seem to handle change better when it’s introduced in phases. When we break our intelligent content project down into smaller, more manageable chunks, not only is it easier to implement, but cost becomes less of an issue. Expenditures are smaller and the total project budget is spread out over time, making funding such initiatives easier for some, more palatable to others.

Regardless of the approach, it’s critical to have a well-researched business case that spells out the potential return on investment from each phase of an intelligent content project. Maintaining momentum is easier when we can prove that the benefits of our efforts far outweigh the costs.

Intelligent content is only for regulated industries

Intelligent content is a particularly good fit for regulated industries. But it can provide benefits for nearly every content-heavy organization, regulated or not.

When considering a move to intelligent content, organizations in regulated industries are attracted to a variety of benefits, but the primary benefit is improved control over content. Intelligent content is far less likely to be incorrect or outdated when published.

Why? Traditional content production methods rely on outdated content review approaches in an attempt to ensure quality. The traditional approach involves completing a document and then sending it around for review by others, with quality achieved (at least in part) by ensuring that many people get a chance to review the content. This approach is slow, error-prone, and doesn't align with the agile methodologies many companies are using to drive product development.

Because we're accustomed to working on a document/page basis, many people feel that the most important review is the final review, when the document/page has been completely written and styled. So until they have seen a publication-ready document, they won't sign off on it.

The manufacturing sector rejected this method decades ago as too costly and not very effective. Today, manufacturers design quality in and use fewer, better-targeted quality checks early in the development process. That doesn't mean that reviewers don't look at the final version; they do. However, this final check should require few, if any, changes.

When we adopt intelligent content, we streamline our review process because reviewers can focus on the content without worrying about the fonts, colors, style, and look-and-feel of the final deliverables. This improved focus lowers costs and improves quality.

While regulated industries realize important benefits from intelligent content (quality and content control), all content-producing organizations can benefit from it.

We’re too small to use intelligent content

This objection has a kernel of truth, but even so, it’s not entirely valid.

Not all intelligent content implementations are huge, expensive efforts. We can use the principles of intelligent content—small modular pieces of structured information, early reviews, separating the content from the way it looks—to improve most manual content creation processes.

Many successful projects take place in sizable writing departments in large (often global) companies. These success stories often detail the expansive—and expensive—technology solutions selected by the featured company to make their project a reality. But, intelligent content projects come in all shapes and sizes. Small organizations can also adopt the approach and see sizable benefits.

Success depends on choosing the approach, technology, and training that meet the needs of the organization. Not every company that implements intelligent content will require the same plan of attack or the same software tools. There are a variety of tools and technologies, at various price points, that can help us achieve our goals.

Intelligent content can provide even a single writer with benefits worth the effort. We don’t need an expensive, multi-seat license to a component content management system to create intelligent content. Some organizations find that small cloud- or server-based systems can empower a small department to create intelligent content. A single writer or small writing team can benefit from being able to easily locate content and know they have the latest version.

The size of the organization is not a determining factor. Don’t discount the benefits of intelligent content because of the size of the company.

We’re too big to use intelligent content

When we move to intelligent content, we don’t have to make all the changes needed at once. In fact, there are many good reasons for adopting a phased approach.

To succeed in a large organization, one of the first steps is to identify influential people in other departments who would benefit from adopting intelligent content. Design an approach that meets their needs. Start small. Identify an individual group to target. Ensure that the type of content they produce can be leveraged by other groups, and make sure the approach works for the initial group before rolling it out to others.

Also, keep in mind that not every group we target will want to join the effort. Some groups won't adopt the new approach, even though it could prove useful to them. Other groups might seem like great candidates, but if their current processes are too far removed from ours, including them would be counterproductive. Don't force it!

If there’s no business reason to use intelligent content within a particular group, department or organization—don’t use it!

Intelligent content requires new technology

This is half true. We can accomplish many of the goals of intelligent content without adopting new technology—at least in the early stages. However, to benefit from all the bells and whistles—automation, sophisticated content reuse, and multi-channel publishing—we need tools to help us create, manage, and deliver intelligent content.

But, there are things we can do to prepare ourselves for the move to intelligent content before we invest money acquiring new technology. The biggest value comes from structuring our content. Analyzing our content and designing a repeatable structure yields more consistent, coherent, streamlined, and effective content.

We don’t need special tools to support structure. We can create structured web forms for authoring, or even set up structured content templates in Microsoft Word, before we purchase new technology. As long as authors adhere to the structure of the content and can quickly and easily create content, we can realize significant rewards.
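As an illustration (not from the article), a structured topic template can be as small as a handful of required elements. This sketch uses a DITA-style concept topic, but the same structure could just as easily be expressed as a Word template or a web form:

```xml
<concept id="password-reset">
  <title>Resetting your password</title>
  <shortdesc>One-sentence summary, reusable in links and search results.</shortdesc>
  <conbody>
    <p>Context the reader needs before acting.</p>
    <section>
      <title>Before you begin</title>
      <p>Prerequisites, stated consistently in every topic.</p>
    </section>
  </conbody>
</concept>
```

The value comes from the repeatable structure itself, not from any particular tool that enforces it.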

Structured content is not only a best practice, it’s required for intelligent content solutions. We need to create, manage, and deliver structured content to realize the full benefits of intelligent content.

I’ll lose control/creativity

Control is a myth. On most devices, the reader has ultimate control of fonts, colors, point size, and so on. In print, corporate style guides control the look. Intelligent content practices allow us to spend less time on issues, such as look and feel, that we have never had much control over anyway. By spending less time on those issues, we have more time for the aspects of our job where we can be creative. It takes just as much creativity to write structured content as it does to write in stream-of-consciousness mode, and the result will serve the customer better.

The post Overcoming Objections to Intelligent Content appeared first on The Content Wrangler.

Categories: DITA

Remodeling documentation

JustWriteClick - Sat, 2016-12-10 15:19

An article I wrote for the docslikecode.com website:

A few years ago we went house-hunting in Austin, Texas. One house was so popular during the first showing that there were six back-to-back appointments. We waited in the driveway while another couple toured it. Once they left, we could quickly go through it while another prospective buyer waited on the front walkway.

This house was awful. Every single surface was ugly, outdated, and circa 1973. There was a giant hole in the dirt by the front porch, likely dug by an animal. But you know what? I loved it. I wanted to bring it back to life as a vibrant family home, taking it back from the rogue porch-dwelling raccoons — or was it dirt-digging armadillos? We may never know.

Raccoon visiting

Let’s look at your code base and your doc base as a great house with a good layout and foundational “bones.” You still need that “punch list” to hand to your contractor. When you move towards more docs like code techniques, make sure you treat your doc base like a code base, and track defects. Get that “punch list” done.

With a code base, you know how much remodeling you need to do. The same thinking can work well for docs. How dated have your docs become? How accurate are the docs compared to the rest of the code base? How can you make the site livable and vibrant again?

Let’s give your readers the chance to do those quality checks for you as easily as possible: by reporting the bug on the page where they found it.

This technique works well when:

  • You have more readers than contributors. (I generally hope this always happens.)
  • Your readers are super busy. Still, they do want to make the docs better and help others.
  • You want to know how far your docs have “drifted” from the truth.
  • You want your docs to be more trustworthy by chipping away at a bug backlog.
  • You have a private GitHub repo for documentation, but you want to enable public bug reports with tracking back to your docs repo.

Your quick win is to look at your current docs site, any given page. Is there a way to report a bug publicly, to add to the "punch list"?

  1. Bare minimum starter level would be an email address link from every page.
  2. Level up by adding a link to your GitHub source repo Issues page so readers can report bugs.
  3. Better yet, write a quick bit of code to embed on every output doc page so that the issue is pre-filled with relevant information.
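As a sketch of the third option, a build script might generate a pre-filled issue link for every output page. The repository name, URLs, and values below are hypothetical placeholders; GitHub's `/issues/new` endpoint does accept `title` and `body` query parameters:

```python
from urllib.parse import urlencode

def issue_link(repo, page_url, source_url, sha, version):
    """Build a GitHub 'new issue' URL pre-filled with the page's metadata.

    All argument values are hypothetical; your build tool would supply
    the real page URL, source URL, commit SHA, and release version.
    """
    body = (
        f"Page: {page_url}\n"
        f"Source: {source_url}\n"
        f"Commit: {sha}\n"
        f"Release: {version}\n\n"
        "Describe the problem:\n"
    )
    params = urlencode({"title": f"Doc bug on {page_url}", "body": body})
    return f"https://github.com/{repo}/issues/new?{params}"

link = issue_link(
    repo="example-org/docs",
    page_url="https://docs.example.com/install/",
    source_url="https://github.com/example-org/docs/blob/main/install.md",
    sha="abc1234",
    version="2.1",
)
print(link)
```

Embed the resulting link in the page template, and every bug report arrives already tagged with the page, source file, and exact version it came from.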

Here are some resources to get your first punch in that punch list:

  • Using Python Sphinx? The OpenStack docs theme has some JavaScript you could re-use to pre-populate an Issue template so the reporter sends the page URL, the source URL, the git SHA of the commit for that version of the page, and the release version value. See this docs.js snippet.
  • Using a private repo for docs, but want to track bugs publicly? Use Issues-only Access Permissions.
  • Want to add a bit of code to pre-create Issues to use as comments on every page? Free yourself from Disqus comments. Try this set of tips and sample code in a blog post.
Categories: DITA

Mastering Technical Communication Leadership

The Content Wrangler - Mon, 2016-12-05 22:20

Effective leadership is measured by customer satisfaction. Caring about your customers is crucial. Everyone is a customer: those who purchase and use your products, and your colleagues in every department.

Leadership requires a mixture of confidence and humility. Change doesn’t always happen quickly. Success may require all of the creativity, resourcefulness, and diplomacy you can muster.

Use these 7 Habits of Highly Effective Technical Communication Leaders to learn how to effectively put the customer first in all of your work. You can be an effective leader, regardless of your position.


While you may have less product knowledge than an engineer, you can view the product from a user’s perspective. Share your input and advocate for product and process improvements. Support similar efforts initiated by others. Focus on quality.


Process improvements can benefit everyone on the team, and improve product quality. Can the quality assurance team test the documentation? Can technical writers edit the user interface text and error messages? Can written documentation reviews become a factor in the performance evaluations of all product team members? Would documentation review meetings improve quality?


If you're only writing the customer documentation, there is a chance that no one with your skills is editing other user-visible content. Even if you cannot edit this content, you can educate others about key technical writing practices. For example, "one concept, one term": each word should be used to mean only one thing. This avoids user confusion and saves translation costs. Another key technical writing practice: short sentences.


Think about how to create change – look before you leap. Understand the who, what, and how of change. Who are the key stakeholders and decision-makers, and what motivates them? What do they care about? How can you best bring them on board? What are their backgrounds – cultural, professional, educational? Start with curiosity. Listen. Discuss. Ask questions.


What do you know about the users – education levels, roles? How do they use the product and access the documentation? What percentage read the documentation in English? Collaborative efforts with other teams can aid your research. One way to learn more is to conduct a user survey. While gaining approval can be an uphill battle, the insights gained from a well-designed survey can make the effort worthwhile.


Not every organization needs to use the latest methodologies. What worked elsewhere may not work for your team. Bring others along with you – educate and involve engineers, quality assurance personnel, marketers, and product managers in assessing and exploring new methodologies. For example, some organizations can benefit from adopting topic-based authoring, without XML or DITA.


Mistakes provide tremendous opportunities for personal and organizational growth, including improved processes, communications, and skills. Accept these moments and make the most of them. Learn from successes and failures. Remember that what is truly irreplaceable is human life. Learn what you need, and what your coworkers need, to reduce and relieve stress.

Consistently practicing these 7 habits will support your development as an effective technical communications leader, continue process improvement in your organization, create harmonious work relationships, and improve content and product quality. Be the change you wish to see.

Want to learn about the key steps to success in technical writing outsourcing, how outsourced writers integrate into Agile teams, and how to manage these approaches to improve quality, save time, and reduce costs? Then check out this December 8, 2016 webinar held during The Content Wrangler’s Virtual Summit on Advanced Practices in Technical Communication.

The post Mastering Technical Communication Leadership appeared first on The Content Wrangler.

Categories: DITA

Content Potluck: Bringing Everyone to the Community Table

The Content Wrangler - Wed, 2016-11-30 21:25

Guest post by Niki Vecsei Harrold and Laurel Nicholes

An increasing number of consumers no longer want to hear from one authoritative voice. They don't trust, or feel compelled to rely upon, your "official" communication channels. That's because an increasing share of your audience grew up online. They're digital natives accustomed to digital content experiences. They want to feel connected to the content they consume and the brands with which they interact. They want content experiences that resonate with them; that adapt to the way they learn, live, and work.

In order to provide exceptional content experiences today, you’ll need to learn to share content in a social, community setting. To do so, you will likely need to expand the way you think about content and those who contribute it. This article explores the need for a new methodology we call Content Potluck—an approach we believe can help you craft engaging and valuable content experiences.

But, before we tackle the methodology, let’s answer this question: “How do you find the right content contributors?”


Finding The Right Content Contributors

The best content is content that addresses the needs of your audience; content that helps them solve a problem or answers a question. This content is often difficult to locate. Far too often, it’s buried deep within organizational silos. Finding it—and using it to improve customer experiences—is the goal.

Today, nearly everyone in your organization is a potential content contributor. If your firm is like most, chances are your co-workers are creating:

  • social media posts
  • blogs and articles on LinkedIn
  • answers to questions asked in community forums
  • instructions in email and text messages
  • how-to videos and other user assistance materials

Leveraging existing content assets creates a win-win situation for all. You gain access to valuable content. Your teammates get the chance to broaden their sphere of influence and grow new skills. And, your prospects and customers benefit from improved content experiences.

How do you find the right type of content contributor? Start by identifying the profile of an ideal content contributor (regardless of their function, title, or seniority in the company).

The Right Profile

Communities benefit from a wide variety of content creators. While diversity is important, content contributors should share some essential characteristics:

  • A passion for connecting with customers in a social forum, whether it’s your platform or online community, or public channels like Twitter, Facebook or GitHub.
  • A deep interest in improving how your customers experience your products and services. These folks want to learn from others as much as they want to share information.
  • Empathy for customer problems and a desire to provide solutions. Contributors who publish content—but don’t care to respond to queries—aren’t as valuable as those who interact with prospects and customers and engage them in conversation.
  • An understanding of the importance of social community and its potential impact on customer experience. Helping one person online can lead to the deflection of hundreds—perhaps thousands—of similar or identical support cases.
  • A desire for career advancement and personal development. Managing and participating in community content development efforts can lead to additional career opportunities. Frequently, junior team members see the importance of such projects. They are often eager to dedicate time to such initiatives, in addition to performing their regular duties. They understand they are building a personal brand; an online reputation as a subject matter expert. They realize that their influence grows as their network expands.


Getting Them to the Table

So, how do you get others to contribute? How do you manage the potential chaos that may occur when you involve so many different voices? This is where Content Potluck comes into play.

Content Potluck is an approach designed to help you organize—and bring together—representatives from various content creation teams. Potlucks require an organizer (or two) to lead the initiative. Ideal organizers include community managers (who should already have relationships in place with other content producing departments) and technical writers and other content creators with deep product knowledge and an interest in broadening their skill set.

The next step is to schedule a recurring potluck—a gathering (preferably over lunch or breakfast) at which you will discuss:

  • Current content projects (by team)
  • Existing content types (being produced today)
  • New content types (to be produced in the future; derived from discussions with support staff, as well as from interactions in community forums)
  • Publication dates (to ensure the production of a steady stream of content)
  • Customer feedback from social media, communities, and content portals

Productive potlucks are facilitated sessions led by an organizer. To help your team focus on actionable outcomes, create an agenda and follow it. Nominate someone whose role will be to take notes for the group. Capturing tasks, deadlines, goals, and other meeting details—and sharing them with the team—is critical to success.

Invite members from all customer-facing content-producing teams. Interested participants can be found in nearly every department. Make sure to include content creators in marketing, documentation, support, training, engineering, product management, sales, professional services and beyond.

Next Steps

What do you do once you have your participants at the table, the discussion flowing, and notes captured? The next steps include identifying common tools, creating a universal editorial calendar, assigning action items, creating social media marketing campaigns to promote your content, and sustaining the group for long-term growth. Each of these subjects deserves its own blog post.

You can learn more about Content Potlucks—and find sample templates—here. Plan to attend our upcoming webinar during The Content Wrangler’s Virtual Summit on Advanced Practices in Technical Communication, December 7, 2016 at 3pm PT. You’ll hear real-life case studies and be able to ask questions of the presenters.


The post Content Potluck: Bringing Everyone to the Community Table appeared first on The Content Wrangler.

Categories: DITA

Pulling RDF out of MySQL

bobdc.blog - Sun, 2016-11-13 15:09
With a command line option and a very short stylesheet. Bob DuCharme http://www.snee.com/bobdc.blog
Categories: DITA

Learning the Vocabulary of GitHub for Docs

JustWriteClick - Sat, 2016-11-12 16:36

An article I wrote on GitHub for the docslikecode.com website:

What if you could use GitHub, static site generators, and Continuous Integration and Deployment (CI/CD) for your documentation? I imagine you can track your backlog and get some metrics on quality with GitHub's nice contributor graphs. I bet you could measure "docs drift" to figure out just how far behind the docs have gotten. Hey, let's also get access to the developer playground and fun equipment! Let's play on the slides and swings while we make cool and beautiful documentation, side by side as collaborators.

I hope you’re wondering, “What would happen if we treated docs like code?”

Believe me, your fellow software builders are wondering, experimenting, or already starting down this road. I’ve seen this vision come to life and want to share my experiences so more people can learn these techniques.


I've found that the principles inherent to the social web for coding work extremely well for documentation. The social web leads to social coding, which leads to social documentation.

What is GitHub?


Like many tools, git and GitHub were created by fire — through a pressing need for performant and efficient source control management for the Linux kernel. Read the history in the excellent Pro Git book.

GitHub is the web interface for git, the command-line tool, which works well on Linux, Mac, or Windows. To work with others on a project (code or docs), you merge files. This model is the opposite of a "lock and checkout" model, where no one else can work on the piece at the same time as you. With GitHub, you can work separately and bring it all together later. Git has a non-linear branching model that can take some time to get used to. That said, I've found git and GitHub for docs quite practical and even inspirational.

You can keep docs in a source code repository, and the developers will review all your changes prior to merging them in. Unlike in traditional source code management, branches are not full copies of the entire code base, so they are "cheap" and "fast." The more Agile techniques are applied to documentation, the more treating docs like code makes sense.

GitHub definitions and parallels for information

I hope I’m talking to people who care a lot about words. Let’s start with some vocabulary and definitions to build upon.

  • Branch: Indicator of divergence from base without changing the main line (or “trunk” if you like to visualize organic tree-structures to remember this term).
  • Commit: Point in time snapshot of repository with changes.
  • Fork (noun): Copy of the repository that is entirely yours in your namespace. In GitHub-land, forking does not have a negative connotation that it can in other contexts (such as taking an open source project in a new direction in a huff to get different contributors). Rather, it is a way to contribute openly and publicly with your account attributed.
  • Fork (verb): Making a copy of the repository.
  • Issue: Defects, tasks, or feature requests.
  • Organization: Collection of group-owned repositories.
  • Pull Request: Comparison of edits to see if team wants to accept changes.
  • Push: Move changes branch-to-branch. The man page says “Update remote refs along with associated objects” but that’s more technical than we need here.
  • Repository: Collection of stored code or documentation that is written and built like code.
  • Review: Do a line-by-line comparison of a change, much like an editor would for documentation, and comment on improvements or changes.
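These terms map onto a day-to-day workflow. Here's a minimal sketch using the git command line, with hypothetical repository, branch, and file names; in practice you would clone your fork rather than initialize a scratch repository:

```shell
# A scratch repository standing in for a fork of a docs repo.
# Repo, branch, and file names are hypothetical (git 2.28+ for -b).
git init -q -b main docs-sandbox && cd docs-sandbox
git config user.name "Doc Writer"
git config user.email "writer@example.com"

# Commit: a point-in-time snapshot of the repository with changes.
echo "# Install guide" > install.md
git add install.md
git commit -qm "Add install guide"

# Branch: diverge from the main line without changing it.
git checkout -qb fix-install-typo
echo "Run the installer with admin rights." >> install.md
git add install.md
git commit -qm "Fix install instructions"

# Review: compare the branch to main, line by line, before merging.
git diff main fix-install-typo -- install.md

# On GitHub you would now push the branch and open a pull request:
# git push origin fix-install-typo
```

The branch costs almost nothing to create, and the diff is the same line-by-line comparison a reviewer sees in a pull request.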

These definitions can give you decision points to make about information architecture, so think about which deliverables you’ll make, who should review and collaborate on those deliverables, and how you can automate publishing with the chunks of a repository or an organization as overarching collections.

Take a look at this article’s source on GitHub to get a sense of the “source” for a document. We’ll look at the source aspects in a future article. To stay in touch, subscribe to get relevant emails in your inbox.

Categories: DITA

Fostering Innovation in Media and Publishing

The election is over—it’s time to look forward. In that spirit, I wanted to invite you to participate in a forum running right after Thanksgiving at the Gilbane Digital Content Conference this year—a town hall focused on innovation. Send suggestions via Twitter using #gilbane. Driven to change It’s no secret that publishers have been grappling […]
Categories: DITA

Gilbane Conference featured speakers

We are thrilled to have over 100 expert speakers for you to learn from and network with. Join us and your content and digital experience professional peers in Boston in three weeks. Below is a sample of who you’ll meet. Look forward to seeing you. Register today to save your seat – use code F16G for a […]
This post originally published on https://gilbane.com

Categories: DITA

Gilbane Advisor 11-4-16 – mobile / desktop evolution, enterprise software, attribution

In the spirit of right-tool-for-the-job, our first two articles relate to the evolution of mobile and desktop platforms. There is a lot of, mostly rational, exuberance around the speed with which smartphones are taking over the world. But that is only possible because they are not limited to content in native apps and walled gardens. According to […]
This post originally published on https://gilbane.com

Categories: DITA

The Economist and Pennwell – Innovating through Transformation

Join us in Boston in November for these featured case studies and our other 32 conference sessions. Innovating through Transformation How are media companies transforming their business from one reliant on content consumption to one in which content mixes with tools and / or community for greater engagement and new revenue? This session’s case studies […]
Categories: DITA